Distributed Sequence Memory of Multidimensional Inputs in Recurrent Networks
by A.S. Charles, D. Yin and C.J. Rozell
Abstract:
Recurrent neural networks (RNNs) have drawn interest from machine learning researchers because of their effectiveness at preserving past inputs for time-varying data processing tasks. To understand the success and limitations of RNNs, it is critical that we advance our analysis of their fundamental memory properties. We focus on echo state networks (ESNs), which are RNNs with simple memoryless nodes and random connectivity. In most existing analyses, the short-term memory (STM) capacity results conclude that the ESN network size must scale linearly with the input size for unstructured inputs. The main contribution of this paper is to provide general results characterizing the STM capacity for linear ESNs with multidimensional input streams when the inputs have common low-dimensional structure: sparsity in a basis or significant statistical dependence between inputs. In both cases, we show that the number of nodes in the network must scale linearly with the information rate and poly-logarithmically with the ambient input dimension. The analysis relies on advanced applications of random matrix theory and results in explicit non-asymptotic bounds on the recovery error. Taken together, this analysis provides a significant step forward in our understanding of the STM properties in RNNs.
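As a companion to the abstract, the following is a minimal sketch of the linear ESN setting it describes: memoryless linear nodes, random recurrent connectivity, and a state that is a linear function of the input history, so that recovering past inputs reduces to inverting a linear map. All variable names and parameter choices here are illustrative assumptions, not the paper's notation or construction.

```python
import numpy as np

rng = np.random.default_rng(0)

M = 30   # number of network nodes (illustrative)
T = 20   # length of the scalar input stream (illustrative)

# Random recurrent connectivity, rescaled so the spectral radius is
# below 1 (a standard sufficient condition for the echo state property).
W = rng.standard_normal((M, M))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))
z = rng.standard_normal(M)   # random input weights

u = rng.standard_normal(T)   # unstructured scalar input stream
x = np.zeros(M)              # network state
for t in range(T):
    x = W @ x + z * u[t]     # linear state update: x[t] = W x[t-1] + z u[t]

# The final state is a linear function of the input history:
# x = A u_hist, where column k of A is W^k z and u_hist lists the
# inputs most-recent-first.
A = np.column_stack([np.linalg.matrix_power(W, k) @ z for k in range(T)])
u_hist = u[::-1]
assert np.allclose(x, A @ u_hist)

# With more nodes than inputs (M >= T), least squares recovers the
# full history; sparse recovery replaces this step when the inputs
# are sparse in a basis, which is where the STM capacity gains arise.
u_rec, *_ = np.linalg.lstsq(A, x, rcond=None)
assert np.allclose(u_rec, u_hist, atol=1e-6)
```

Here recovery succeeds because M ≥ T makes the map invertible; the paper's results concern the regime where the network is much smaller than the ambient input history but the inputs are low-dimensional.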
Reference:
Distributed Sequence Memory of Multidimensional Inputs in Recurrent Networks. A.S. Charles, D. Yin and C.J. Rozell. Journal of Machine Learning Research, 18(7), pp. 1-37, 2017.
Bibtex Entry:
@Article{charles.16,
  author   = {Charles, A.S. and Yin, D. and Rozell, C.J.},
  title    = {Distributed Sequence Memory of Multidimensional Inputs in Recurrent Networks},
  abstract = {Recurrent neural networks (RNNs) have drawn interest from machine learning researchers because of their effectiveness at preserving past inputs for time-varying data processing tasks. To understand the success and limitations of RNNs, it is critical that we advance our analysis of their fundamental memory properties. We focus on echo state networks (ESNs), which are RNNs with simple memoryless nodes and random connectivity. In most existing analyses, the short-term memory (STM) capacity results conclude that the ESN network size must scale linearly with the input size for unstructured inputs. The main contribution of this paper is to provide general results characterizing the STM capacity for linear ESNs with multidimensional input streams when the inputs have common low-dimensional structure: sparsity in a basis or significant statistical dependence between inputs. In both cases, we show that the number of nodes in the network must scale linearly with the information rate and poly-logarithmically with the ambient input dimension. The analysis relies on advanced applications of random matrix theory and results in explicit non-asymptotic bounds on the recovery error. Taken together, this analysis provides a significant step forward in our understanding of the STM properties in RNNs.},
  url      = {http://jmlr.org/papers/volume18/16-270/16-270.pdf},
  journal  = {Journal of Machine Learning Research},
  year     = {2017},
  volume   = {18},
  number   = {7},
  pages    = {1--37}
}