A Review of Recurrent Neural Networks: LSTM Cells and Network Architectures

Volume: 31, Issue: 7, Pages: 1235 - 1270
Published: Jul 1, 2019
Abstract
Recurrent neural networks (RNNs) have been widely adopted in research areas concerned with sequential data, such as text, audio, and video. However, RNNs consisting of sigma cells or tanh cells are unable to learn relevant information from the input data when the gap between inputs is large. By introducing gate functions into the cell structure, the long short-term memory (LSTM) can handle the problem of long-term dependencies well. Since its...
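The gating mechanism the abstract refers to can be illustrated with a single forward step of a standard LSTM cell. The sketch below is a generic NumPy implementation of the usual input/forget/output-gate equations, not code taken from the paper; all parameter names and shapes are illustrative assumptions.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_cell(x, h_prev, c_prev, W, U, b):
    """One forward step of a standard LSTM cell.

    W, U, b stack the parameters for the input gate (i), forget gate (f),
    output gate (o), and candidate cell state (g), each of size `hidden`.
    """
    hidden = h_prev.shape[0]
    z = W @ x + U @ h_prev + b            # pre-activations, shape (4*hidden,)
    i = sigmoid(z[0 * hidden:1 * hidden]) # input gate: how much new info to admit
    f = sigmoid(z[1 * hidden:2 * hidden]) # forget gate: how much old state to keep
    o = sigmoid(z[2 * hidden:3 * hidden]) # output gate: how much state to expose
    g = np.tanh(z[3 * hidden:4 * hidden]) # candidate cell state
    c = f * c_prev + i * g                # new cell state (long-term memory path)
    h = o * np.tanh(c)                    # new hidden state
    return h, c

# Tiny usage example with random parameters (shapes only, no trained weights)
rng = np.random.default_rng(0)
n_in, n_hid = 3, 4
W = rng.standard_normal((4 * n_hid, n_in))
U = rng.standard_normal((4 * n_hid, n_hid))
b = np.zeros(4 * n_hid)
h, c = lstm_cell(rng.standard_normal(n_in), np.zeros(n_hid), np.zeros(n_hid), W, U, b)
```

The additive update `c = f * c_prev + i * g` is what lets gradients flow across long input gaps: when the forget gate stays near 1, the cell state is carried forward almost unchanged, avoiding the vanishing gradients of plain sigma/tanh cells.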