
Recurrent Neural Networks and Their Memory Behavior: A Survey

Yuanhang Su, University of Southern California, USA, suyuanhang@hotmail.com, and C.-C. Jay Kuo, University of Southern California, USA
 
Suggested Citation
Yuanhang Su and C.-C. Jay Kuo (2022), "Recurrent Neural Networks and Their Memory Behavior: A Survey", APSIPA Transactions on Signal and Information Processing: Vol. 11: No. 1, e26. http://dx.doi.org/10.1561/116.00000123

Publication Date: 16 Aug 2022
© 2022 Y. Su and C.-C. J. Kuo
 
Keywords
Recurrent neural networks, long short-term memory, gated recurrent unit, bidirectional recurrent neural networks, natural language processing
 

Open Access

This article is published under the terms of the CC BY-NC license.


In this article:
Introduction 
Inception: Simple Recurrent Neural Networks 
Time Unrolling and Back Propagation Through Time 
LSTM and Further Extensions 
Theory of Memory Decay and Enhancement 
Conclusion and Future Work 
References 

Abstract

Since their inception in the late 1980s, recurrent neural networks (RNNs) as a sequence computing model have attracted rapidly growing interest in the natural language processing, speech recognition, and computer vision communities, among others. Recent RNN variants have made breakthroughs in fields such as machine translation, where machines can achieve human-level quality, and RNN-assisted speech recognition now provides subtitles for live streaming videos. In this survey, we offer a historical perspective by walking through RNNs from their early years to their modern forms, detailing their most popular architectural designs and, perhaps more importantly, demystifying the mathematics behind their memory behaviors.

DOI: 10.1561/116.00000123