We present a novel recurrent neural network (RNN) architecture that combines
the remembering ability of unitary RNNs with the ability of gated RNNs to
effectively forget redundant information in the input sequence. We achieve this
by extending unitary RNNs with a gating mechanism. Our model outperforms LSTMs,
GRUs and unitary RNNs on several benchmark tasks, as the ability to
simultaneously remember long-term dependencies and forget irrelevant
information helps with many natural long-term sequential tasks such as language
modeling and question answering. We provide competitive results along with an
analysis of our model on the bAbI question-answering task and Penn Treebank, as
well as on synthetic tasks that involve long-term dependencies, such as the
parenthesis, denoising and copying tasks.
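The abstract does not give the model equations, but as a rough illustration of the idea, the sketch below shows a gated, norm-preserving recurrent cell. The orthogonal recurrence matrix (parametrized here, as an assumption, via the matrix exponential of a skew-symmetric parameter) keeps the hidden-state norm stable, while a sigmoid gate in the spirit of GRUs lets the model discard irrelevant inputs. The class name GatedUnitaryCell, the ReLU nonlinearity and the single-gate update are illustrative choices, not the paper's exact formulation.

```python
import torch
import torch.nn as nn


class GatedUnitaryCell(nn.Module):
    """Illustrative gated cell with an orthogonal (norm-preserving) recurrence."""

    def __init__(self, input_size: int, hidden_size: int):
        super().__init__()
        # Skew-symmetric parameter: exp(A - A^T) is orthogonal, so the recurrence
        # neither shrinks nor blows up the hidden state (the "remembering" part).
        self.A = nn.Parameter(torch.zeros(hidden_size, hidden_size))
        self.W_in = nn.Linear(input_size, hidden_size)
        # Update gate borrowed from gated RNNs (the "forgetting" part).
        self.W_gate = nn.Linear(input_size + hidden_size, hidden_size)

    def forward(self, x: torch.Tensor, h: torch.Tensor) -> torch.Tensor:
        U = torch.matrix_exp(self.A - self.A.t())         # orthogonal recurrence matrix
        # ReLU is an illustrative nonlinearity; the recurrence itself is norm-preserving.
        candidate = torch.relu(h @ U.t() + self.W_in(x))
        g = torch.sigmoid(self.W_gate(torch.cat([x, h], dim=-1)))
        return g * candidate + (1.0 - g) * h              # gate decides what to overwrite


# Tiny usage example: unroll the cell over a random sequence.
cell = GatedUnitaryCell(input_size=8, hidden_size=32)
h = torch.zeros(4, 32)                                    # batch of 4 hidden states
for x_t in torch.randn(10, 4, 8):                         # sequence length 10
    h = cell(x_t, h)
```

The gate interpolates between the norm-preserving candidate and the previous state, which is where the GRU-like ability to drop stale information enters this sketch.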

Source: http://bppro.link/?c=Gkv
