Information Theory and Neuroscience (bibtex)
by D.H. Johnson and C.J. Rozell
Abstract:
When Shannon developed information theory, he envisioned a systematic way to determine how much "information" could be transmitted over an arbitrary communications channel. While this classic work embraces many of the key aspects of neural communication (e.g., stochastic stimuli and communication signals, multiple-neuron populations, etc.), there are difficulties in applying his concepts meaningfully in neuroscience. We describe the classic information theoretic quantities—entropy, mutual information, and capacity—and how they can be used to assess the ultimate fidelity of the neural stimulus representation. We discuss some of the problems that accompany using and interpreting these quantities in a neuroscience context. Finally, we present an overview of post-Shannon research areas that leverage his work in rate-distortion theory and are highly relevant to neuroscientists looking to understand the neural code. The presentation is meant to be mostly tutorial in nature, setting the stage for other workshop presentations.
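As an illustrative sketch (not from the paper itself), the quantities named above can be computed for a hypothetical discrete stimulus–response channel. Here a binary stimulus is passed through a noisy channel that flips the stimulus identity with probability 0.1, a binary symmetric channel; all numbers are assumptions for demonstration only.

```python
import math

def entropy(p):
    """Shannon entropy H(X) in bits of a discrete distribution p."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

# Hypothetical uniform binary stimulus and a noisy "neural" channel
# that reports the wrong stimulus with probability 0.1.
p_x = [0.5, 0.5]
flip = 0.1
p_y_given_x = [[1 - flip, flip], [flip, 1 - flip]]

# Joint distribution P(x, y) and marginal response distribution P(y).
p_xy = [[p_x[i] * p_y_given_x[i][j] for j in range(2)] for i in range(2)]
p_y = [sum(p_xy[i][j] for i in range(2)) for j in range(2)]

# Mutual information I(X;Y) = H(Y) - H(Y|X).
h_y = entropy(p_y)
h_y_given_x = sum(p_x[i] * entropy(p_y_given_x[i]) for i in range(2))
mi = h_y - h_y_given_x

# For a binary symmetric channel with uniform input, I(X;Y) attains
# the channel capacity 1 - H(flip), roughly 0.53 bits here.
print(f"I(X;Y) = {mi:.3f} bits")
```

The same three quantities (entropy of the response, conditional entropy given the stimulus, and their difference) underlie most information-theoretic analyses of neural data, though estimating them from spike trains raises the difficulties the abstract alludes to.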
Reference:
Information Theory and Neuroscience. D.H. Johnson and C.J. Rozell. In Computational Neuroscience Meeting Workshop on Methods of Information Theory in Computational Neuroscience, July 2006.
Bibtex Entry:
@InProceedings{johnson.06,
  author = 	 {Johnson, D.H. and Rozell, C.J.},
  title = 	 {Information Theory and Neuroscience},
  booktitle =	 {Computational Neuroscience Meeting Workshop on Methods
                  of Information Theory in Computational Neuroscience},
  year =	 2006,
  address =	 {Edinburgh, UK},
  month =	 {July},
  abstract = {When Shannon developed information theory, he envisioned
a systematic way to determine how much "information" could be
transmitted over an arbitrary communications channel. While this
classic work embraces many of the key aspects of neural communication
(e.g., stochastic stimuli and communication signals, multiple-neuron
populations, etc.), there are difficulties in applying his concepts
meaningfully in neuroscience. We describe the classic
information theoretic quantities---entropy, mutual information, and
capacity---and how they can be used to assess the ultimate fidelity of
the neural stimulus representation. We also discuss some of the
problems that accompany using and interpreting these quantities in a
neuroscience context. Finally, we present an overview of post-Shannon
research areas that leverage his work in rate-distortion theory and
are highly relevant to neuroscientists looking to understand the
neural code. The presentation is meant to be mostly tutorial in
nature, setting the stage for other workshop presentations.}
}