Title: Forgetting is Necessary for Recurrent Networks to Recover Sequences Longer than Network Size
Authors: Ciftci, Koray; Akdeniz, Rafet
Type: Conference Object
Conference: 43rd International Conference on Telecommunications and Signal Processing (TSP 2020), 7-9 July 2020, Milan (conference code: 162353)
Date issued: 2020
Date accessioned: 2024-10-29
Date available: 2024-10-29
ISBN: 978-1-7281-6376-5
DOI: https://doi.org/10.1109/TSP49548.2020.9163439
Handle: https://hdl.handle.net/20.500.11776/12312
Pages: 668-671
Scopus ID: 2-s2.0-85090599647
Language: en
Access rights: info:eu-repo/semantics/closedAccess
Keywords: Compressed sensing; memory; neural network; recurrent; sparsity

Abstract: Dynamic interactions among recurrently connected network elements provide a key mechanism for understanding short-term memory. Besides network structure, input statistics play a fundamental role in determining the memory capacity of a network. Recurrent neural networks can store temporal sequences of input stimuli longer than the size of the network when the input is sparse. In this work, the interplay between short-term memory capacity and forgetting is investigated with an orthogonal recurrent neural network. Forgetting is parametrized as a decay element in a linear model of short-term memory, and it is observed that a certain amount of forgetting, or memory fading, is necessary for successful recovery of longer stimulus sequences. © 2020 IEEE.
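As a rough illustration of the setup the abstract describes (not the authors' code), the sketch below simulates a linear recurrent network with an orthogonal weight matrix and a decay factor lam acting as the forgetting element, then tries to recover a sparse stimulus sequence longer than the network size from the final state using an l1 (lasso) decoder, in the spirit of the compressed-sensing keyword. The specific dynamics x_t = lam * (W x_{t-1} + v s_t), the lasso-based recovery, and all parameter values are assumptions introduced here for illustration.

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)

N = 50       # network size
T = 100      # stimulus sequence length (longer than the network)
k = 5        # number of nonzero (sparse) stimulus entries
lam = 0.98   # forgetting / decay factor (hypothetical value)

# Random orthogonal recurrent weight matrix (QR of a Gaussian matrix)
W, _ = np.linalg.qr(rng.standard_normal((N, N)))
v = rng.standard_normal(N) / np.sqrt(N)  # input feed-in vector

# Sparse stimulus sequence: k nonzero entries at random positions
s = np.zeros(T)
s[rng.choice(T, size=k, replace=False)] = rng.standard_normal(k)

# Linear recurrent dynamics with decay: x_t = lam * (W x_{t-1} + v s_t)
x = np.zeros(N)
for t in range(T):
    x = lam * (W @ x + v * s[t])

# Unrolling the dynamics gives x_T = A @ s, where column t of A is
# lam^(T-t) * W^(T-1-t) @ v; A is the effective measurement matrix.
A = np.column_stack(
    [lam ** (T - t) * np.linalg.matrix_power(W, T - 1 - t) @ v
     for t in range(T)]
)
assert np.allclose(x, A @ s)  # sanity check on the unrolled form

# Sparse recovery of the T-long stimulus from the N-dimensional state
s_hat = Lasso(alpha=1e-4, fit_intercept=False, max_iter=100_000).fit(A, x).coef_
print("relative recovery error:", np.linalg.norm(s_hat - s) / np.linalg.norm(s))
```

Sweeping lam in such a sketch makes the abstract's point concrete: with lam = 1 (no forgetting) old and new inputs are weighted equally and the decoding problem becomes poorly conditioned, while too small a lam attenuates early stimuli below recoverability, so an intermediate amount of decay is needed.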