Forgetting is Necessary for Recurrent Networks to Recover Sequences Longer than Network Size

Date

2020

Publisher

Institute of Electrical and Electronics Engineers Inc.

Access Rights

info:eu-repo/semantics/closedAccess

Abstract

Dynamic interactions among recurrently connected network elements provide a key mechanism for understanding short-term memory. Besides network structure, input statistics play a fundamental role in determining the memory capacity of a network. When the input is sparse, recurrent neural networks can store temporal sequences of input stimuli longer than the size of the network. In this work, the interplay between short-term memory capacity and forgetting is investigated with an orthogonal recurrent neural network. Forgetting is parametrized as a decay element in a linear model of short-term memory, and it is observed that a certain amount of forgetting, or memory fading, is necessary for successful recovery of longer stimulus sequences. © 2020 IEEE.
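The abstract's setup can be sketched as a toy experiment. The sketch below is an illustrative reconstruction, not the paper's exact model or parameters: it assumes a linear network with state update x_t = λ·Q·x_{t-1} + w·s_t, where Q is a random orthogonal matrix, λ < 1 is the forgetting (decay) factor, and s is a sparse input sequence longer than the network size. The final state is then a compressed-sensing measurement of the whole sequence, decoded here with orthogonal matching pursuit (a standard sparse decoder, not necessarily the one used in the paper); the dimensions, seed, and λ are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy parameters (assumptions, not taken from the paper):
N, T, k = 64, 80, 2          # network size, sequence length (T > N), input sparsity
lam = 0.98                   # forgetting (decay) factor, slightly below 1

# Random orthogonal recurrent matrix Q and a unit-norm input vector w.
Q, _ = np.linalg.qr(rng.standard_normal((N, N)))
w = rng.standard_normal(N)
w /= np.linalg.norm(w)

# k-sparse scalar input sequence s of length T.
support = rng.choice(T, size=k, replace=False)
s = np.zeros(T)
s[support] = np.array([3.0, -2.0])

# Linear short-term memory with decay: x_t = lam * Q @ x_{t-1} + w * s_t.
x = np.zeros(N)
for t in range(T):
    x = lam * (Q @ x) + w * s[t]

# The final state is a linear measurement of the whole sequence, x = A @ s,
# where column t of A is lam^(T-1-t) * Q^(T-1-t) @ w.
A = np.zeros((N, T))
col = w.copy()
for t in range(T - 1, -1, -1):
    A[:, t] = col
    col = lam * (Q @ col)

# Sparse recovery by orthogonal matching pursuit.
def omp(A, y, n_iter):
    residual, chosen = y.copy(), []
    col_norms = np.linalg.norm(A, axis=0)
    coef = np.zeros(0)
    for _ in range(n_iter):
        if np.linalg.norm(residual) < 1e-10 * np.linalg.norm(y):
            break  # measurement already explained
        # Pick the column most correlated with the residual (norm-corrected).
        j = int(np.argmax(np.abs(A.T @ residual) / col_norms))
        chosen.append(j)
        coef, *_ = np.linalg.lstsq(A[:, chosen], y, rcond=None)
        residual = y - A[:, chosen] @ coef
    s_hat = np.zeros(A.shape[1])
    s_hat[chosen] = coef
    return s_hat

s_hat = omp(A, x, n_iter=2 * k)  # a few extra iterations as a safety margin
print("recovery error:", np.linalg.norm(s_hat - s))
```

Although the sequence is longer than the network (T > N), its sparsity lets the decoder recover it from the final state; repeating the experiment while sweeping λ toward 1 is one way to probe the paper's claim that some forgetting is needed for recovery.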

Description

43rd International Conference on Telecommunications and Signal Processing (TSP 2020), 7–9 July 2020, Milan (conference code 162353)

Keywords

Compressed sensing, memory, neural network, recurrent, sparsity

Source

2020 43rd International Conference on Telecommunications and Signal Processing, TSP 2020
