Forgetting is Necessary for Recurrent Networks to Recover Sequences Longer than Network Size
dc.contributor.author | Ciftci, Koray | |
dc.contributor.author | Akdeniz, Rafet | |
dc.date.accessioned | 2024-10-29T17:43:23Z | |
dc.date.available | 2024-10-29T17:43:23Z | |
dc.date.issued | 2020 | |
dc.department | Tekirdağ Namık Kemal Üniversitesi | |
dc.description | 43rd International Conference on Telecommunications and Signal Processing, TSP 2020 -- 7 July 2020 through 9 July 2020 -- Milan -- 162353 | |
dc.description.abstract | Dynamic interactions among recurrently connected network elements provide a key mechanism for understanding short-term memory. Besides network structure, input statistics play a fundamental role in determining the memory capacity of a network. Recurrent neural networks can store temporal sequences of input stimuli longer than the size of the network when the input is sparse. In this work, the interplay between short-term memory capacity and forgetting is investigated with an orthogonal recurrent neural network. Forgetting is parametrized as a decay element in a linear model of short-term memory, and it is observed that a certain amount of forgetting, or memory fading, is necessary for successful recovery of longer stimulus sequences. © 2020 IEEE. | |
dc.identifier.doi | 10.1109/TSP49548.2020.9163439 | |
dc.identifier.endpage | 671 | |
dc.identifier.isbn | 978-1-7281-6376-5 | |
dc.identifier.scopus | 2-s2.0-85090599647 | |
dc.identifier.startpage | 668 | |
dc.identifier.uri | https://doi.org/10.1109/TSP49548.2020.9163439 | |
dc.identifier.uri | https://hdl.handle.net/20.500.11776/12312 | |
dc.indekslendigikaynak | Scopus | |
dc.language.iso | en | |
dc.publisher | Institute of Electrical and Electronics Engineers Inc. | |
dc.relation.ispartof | 2020 43rd International Conference on Telecommunications and Signal Processing, TSP 2020 | |
dc.relation.publicationcategory | Conference Item - International - Institutional Faculty Member | en_US |
dc.rights | info:eu-repo/semantics/closedAccess | |
dc.subject | Compressed sensing | |
dc.subject | memory | |
dc.subject | neural network | |
dc.subject | recurrent | |
dc.subject | sparsity | |
dc.title | Forgetting is Necessary for Recurrent Networks to Recover Sequences Longer than Network Size | |
dc.type | Conference Object |