Sparse coding via thresholding and local competition in neural circuits
by C.J. Rozell, D.H. Johnson, R.G. Baraniuk and B.A. Olshausen
Abstract:
While evidence indicates that neural systems may be employing sparse approximations to represent sensed stimuli, the mechanisms underlying this ability are not understood. We describe a locally competitive algorithm (LCA) that solves a collection of sparse coding principles minimizing a weighted combination of mean-squared error (MSE) and a coefficient cost function. LCAs are designed to be implemented in a dynamical system composed of many neuron-like elements operating in parallel. These algorithms use thresholding functions to induce local (usually one-way) inhibitory competitions between nodes to produce sparse representations. LCAs produce coefficients with sparsity levels comparable to the most popular centralized sparse coding algorithms while being readily suited for neural implementation. Additionally, LCA coefficients for video sequences demonstrate inertial properties that are both qualitatively and quantitatively more regular (i.e., smoother and more predictable) than the coefficients produced by greedy algorithms.
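For context on the dynamics summarized above, the following is a minimal sketch (not the authors' code) of LCA with a soft-thresholding activation, which targets the cost 0.5*||x - Phi a||^2 + lam*||a||_1. The dictionary Phi is assumed to have unit-norm columns, and the names lca, soft_threshold, and the parameter values are illustrative.

    import numpy as np

    def soft_threshold(u, lam):
        # Thresholding activation: maps internal states u to sparse coefficients a
        return np.sign(u) * np.maximum(np.abs(u) - lam, 0.0)

    def lca(x, Phi, lam=0.1, tau=10.0, dt=1.0, n_steps=200):
        # Sketch of LCA dynamics (soft-threshold case); Phi assumed column-normalized
        n = Phi.shape[1]
        b = Phi.T @ x                     # feed-forward driving input
        G = Phi.T @ Phi - np.eye(n)       # lateral inhibition (competition) weights
        u = np.zeros(n)                   # internal membrane-like states
        for _ in range(n_steps):
            a = soft_threshold(u, lam)    # only thresholded (active) nodes inhibit others
            u += (dt / tau) * (b - u - G @ a)   # leaky integration with local competition
        return soft_threshold(u, lam)

    # Illustrative usage with a random dictionary and a synthetic sparse signal:
    rng = np.random.default_rng(0)
    Phi = rng.standard_normal((64, 256))
    Phi /= np.linalg.norm(Phi, axis=0)    # unit-norm dictionary columns
    x = Phi[:, :5] @ rng.standard_normal(5)
    a = lca(x, Phi)
    print(np.count_nonzero(a), "active coefficients")

In this sketch, each node's inhibition of its neighbors is gated by its thresholded output, so only active nodes compete, which is what keeps the resulting code sparse.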
Reference:
Sparse coding via thresholding and local competition in neural circuits. C.J. Rozell, D.H. Johnson, R.G. Baraniuk and B.A. Olshausen. Neural Computation, 20(10), pp. 2526–2563, October 2008. Selected for Faculty of 1000 Biology (now F1000Prime).
Bibtex Entry:
@Article{rozell.06c,
  author = 	 {Rozell, C.J. and Johnson, D.H. and Baraniuk, R.G. and Olshausen, B.A.},
  title = 	 {Sparse coding via thresholding and local competition in neural circuits},
  journal = 	 {Neural Computation},
  year = 	 {2008},
  volume = {20},
  number = {10},
  pages = {2526--2563},
  month = {October},
  abstract =     {While evidence indicates that neural systems may be
                  employing sparse approximations to represent sensed
                  stimuli, the mechanisms underlying this ability are
                  not understood. We describe a locally competitive
                  algorithm (LCA) that solves a collection of sparse
                  coding principles minimizing a weighted combination
                  of mean-squared error (MSE) and a coefficient cost
                  function. LCAs are designed to be implemented in a
                  dynamical system composed of many neuron-like
                  elements operating in parallel. These algorithms use
                  thresholding functions to induce local (usually
                  one-way) inhibitory competitions between nodes to
                  produce sparse representations. LCAs produce
                  coefficients with sparsity levels comparable to the
                  most popular centralized sparse coding algorithms
                  while being readily suited for neural
                  implementation. Additionally, LCA coefficients for
                  video sequences demonstrate inertial properties that
                  are both qualitatively and quantitatively more
                  regular (i.e., smoother and more predictable) than
                  the coefficients produced by greedy algorithms.},
  url =          {http://siplab.gatech.edu/pubs/rozellNeuralComp2008.pdf},
  note = {\textbf{Selected for Faculty of 1000 Biology (now F1000Prime).}}
}