Information Theory and Neuroscience: A Tutorial
by D.H. Johnson, C.J. Rozell and I.N. Goodman
Abstract:
When Shannon developed information theory, he envisioned a systematic way to determine how much "information" could be transmitted over an arbitrary communications channel. While this classic work embraces many of the key aspects of neural communication (e.g., stochastic stimuli and communication signals, multiple-neuron populations, etc.), there are difficulties in applying his concepts meaningfully in neuroscience. We describe the classic information theoretic quantities—entropy, mutual information, and capacity—and how they can be used to assess the ultimate fidelity of the neural stimulus representation. We also discuss some of the problems that accompany using and interpreting these quantities in a neuroscience context. Finally, we present an overview of post-Shannon research areas, building on his rate-distortion theory, that are highly relevant to neuroscientists seeking to understand the neural code. The presentation is meant to be mostly tutorial in nature, setting the stage for succeeding presentations.
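For orientation only (these standard definitions are not part of the original abstract), the quantities named above can be written for a discrete stimulus X with distribution p(x) and a neural response Y with joint distribution p(x,y) as:

    H(X)   = -\sum_x p(x) \log_2 p(x)                                    % entropy, in bits
    I(X;Y) = \sum_{x,y} p(x,y) \log_2 \frac{p(x,y)}{p(x)\,p(y)}          % mutual information
    C      = \max_{p(x)} I(X;Y)                                          % channel capacity
    R(D)   = \min_{p(\hat{x}\mid x):\, E[d(X,\hat{X})] \le D} I(X;\hat{X})  % rate-distortion function

In the neural-coding setting, X is typically the stimulus and Y the recorded spike response; capacity bounds the mutual information achievable over any stimulus ensemble, while the rate-distortion function gives the minimum information rate needed to represent the stimulus within distortion D under a chosen distortion measure d.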
Reference:
Information Theory and Neuroscience: A Tutorial. D.H. Johnson, C.J. Rozell and I.N. Goodman. Gulf Coast Consortium Conference on Theoretical & Computational Neuroscience, Houston, TX, November 2006.
Bibtex Entry:
@Conference{johnson.06b,
  author = 	 {Johnson, D.H. and Rozell, C.J. and Goodman, I.N.},
  title = 	 {Information Theory and Neuroscience: {A} Tutorial},
  booktitle =	 {Gulf Coast Consortium Conference
on Theoretical \& Computational Neuroscience},
  year =	 2006,
  address =	 {Houston, TX},
  month =	 {November},
  abstract = {When Shannon developed information theory, he envisioned
  a systematic way to determine how much "information" could be
  transmitted over an arbitrary communications channel. While this
  classic work embraces many of the key aspects of neural
  communication (e.g., stochastic stimuli and communication signals,
  multiple-neuron populations, etc.), there are difficulties in
  applying his concepts meaningfully to neuroscience applications. We
  describe the classic information theoretic quantities---entropy,
  mutual information, and capacity---and how they can be used to
  assess the ultimate fidelity of the neural stimulus
  representation. We also discuss some of the problems that accompany
  using and interpreting these quantities in a neuroscience
  context. We also present an overview of post-Shannon research areas
  that leverage his work in rate-distortion theory that are extremely
  relevant to neuroscientists looking to understand the neural
  code. The presentation is meant to be mostly tutorial in nature,
  setting the stage for succeeding presentations.}
}