Gated Orthogonal Recurrent Units: On Learning to Forget
Author(s)
Jing, Li; Gulcehre, Caglar; Peurifoy, John; Shen, Yichen; Tegmark, Max; Soljacic, Marin; Bengio, Yoshua
Abstract
© 2019 Massachusetts Institute of Technology. We present a novel recurrent neural network (RNN)-based model that combines the remembering ability of unitary-evolution RNNs with the ability of gated RNNs to effectively forget redundant or irrelevant information in their memory. We achieve this by extending restricted orthogonal-evolution RNNs with a gating mechanism similar to that of gated recurrent unit RNNs, with a reset gate and an update gate. Our model outperforms long short-term memory, gated recurrent units, and vanilla unitary or orthogonal RNNs on several long-term-dependency benchmark tasks. We empirically show that both orthogonal and unitary RNNs lack the ability to forget, an ability that plays an important role in RNNs. We provide competitive results along with an analysis of our model on many natural sequential tasks, including question answering, speech spectrum prediction, character-level language modeling, and synthetic tasks that involve long-term dependencies, such as algorithmic, denoising, and copying tasks.
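The mechanism the abstract describes — a GRU-like reset and update gate wrapped around an orthogonal recurrent transition — can be sketched roughly as follows. This is a minimal illustrative sketch, not the authors' implementation: the parameter names are invented here, a `tanh` stands in for the modReLU-style nonlinearity typically used in unitary/orthogonal RNNs, and the orthogonal matrix is simply drawn at random via QR decomposition rather than parametrized and learned as in the paper.

```python
import numpy as np

def random_orthogonal(n, rng):
    # A random orthogonal matrix from the QR decomposition of a Gaussian
    # matrix; it preserves the norm of the hidden state it multiplies.
    q, _ = np.linalg.qr(rng.standard_normal((n, n)))
    return q

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def goru_step(x, h, params):
    """One step of a GORU-style cell (illustrative sketch).

    r, z : reset and update gates, as in a GRU.
    U    : orthogonal recurrent matrix (norm-preserving "remembering" path).
    The gates let the cell forget; the orthogonal recurrence lets it remember.
    """
    Wr, Ur, br = params["Wr"], params["Ur"], params["br"]
    Wz, Uz, bz = params["Wz"], params["Uz"], params["bz"]
    Wh, U,  bh = params["Wh"], params["U"],  params["bh"]

    r = sigmoid(Wr @ x + Ur @ h + br)               # reset gate
    z = sigmoid(Wz @ x + Uz @ h + bz)               # update gate
    h_tilde = np.tanh(Wh @ x + r * (U @ h) + bh)    # candidate state over the
                                                    # orthogonal recurrence
    return z * h + (1.0 - z) * h_tilde              # gated interpolation
```

With `z` near 1 the old state passes through untouched (long-term memory); with `z` near 0 and `r` near 0 the cell discards its history, which is exactly the forgetting ability the abstract argues plain orthogonal and unitary RNNs lack.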
Date issued
2019
Department
Sloan School of Management; Massachusetts Institute of Technology. Department of Physics
Journal
Neural Computation
Publisher
MIT Press - Journals