Principles of Neural Information Theory

Computational neuroscience and metabolic efficiency

Book cover with neural network pattern

About this book

"How the brain works, and (more importantly) why it works that way."

Dr Robin Ince

The brain is the most complex computational machine known to science, even though its components (neurons) are slow and unreliable compared with those of a laptop computer.

In this richly illustrated book, Shannon's mathematical theory of information is used to explore the metabolic efficiency of neurons, with special reference to visual perception. Evidence from a diverse range of research papers is used to show how information theory defines absolute limits on neural efficiency; limits which ultimately determine the neuroanatomical microstructure of the eye and brain.

Written in an informal style, with a comprehensive glossary, tutorial appendices, explainer boxes, and an annotated list of further readings, this book is an ideal introduction to cutting-edge research in neural information theory.

Contains 220 pages and 126 figures.

ISBN: 978-0993367922

Published 8 June 2018, Francis Crick's birthday.

Download Chapter 1 (PDF, 6.9MB)

Lecture slides on neural information theory (PDF, 1.3MB)

Corrections

Available in print and ebook from:

Amazon.com
Amazon.co.uk

Contents

1. In the light of evolution

1.1 Introduction
1.2 All that we see
1.3 In the light of evolution
1.4 In search of general principles
1.5 Information theory and biology
1.6 An overview of chapters

2. Information theory

2.1 Introduction
2.2 Finding a route, bit by bit
2.3 Information and entropy
2.4 Maximum entropy distributions
2.5 Channel capacity
2.6 Mutual information
2.7 The Gaussian channel
2.8 Fourier analysis
2.9 Summary

3. Measuring neural information

3.1 Introduction
3.2 The neuron
3.3 Why spikes?
3.4 Neural information
3.5 Gaussian firing rates
3.6 Information about what?
3.7 Does timing precision matter?
3.8 Rate codes and timing codes
3.9 Summary

4. Pricing neural information

4.1 Introduction
4.2 The efficiency-rate trade-off
4.3 Paying with spikes
4.4 Paying with hardware
4.5 Paying with power
4.6 Optimal axon diameter
4.7 Optimal distribution of axon diameters
4.8 Axon diameter and spike speed
4.9 Optimal mean firing rate
4.10 Optimal distribution of firing rates
4.11 Optimal synaptic conductance
4.12 Summary

5. Encoding colour

5.1 Introduction
5.2 The eye
5.3 How aftereffects occur
5.4 The problem with colour
5.5 A neural encoding strategy
5.6 Encoding colour
5.7 Why aftereffects occur
5.8 Measuring mutual information
5.9 Maximising mutual information
5.10 Principal component analysis
5.11 PCA and mutual information
5.12 Evidence for efficiency
5.13 Summary

6. Encoding time

6.1 Introduction
6.2 Linear models
6.3 Neurons and wine glasses
6.4 The LNP model
6.5 Estimating LNP parameters
6.6 The predictive coding model
6.7 Estimating predictive parameters
6.8 Evidence for predictive coding
6.9 Summary

7. Encoding space

7.1 Introduction
7.2 Spatial frequency
7.3 Do ganglion cells decorrelate images?
7.4 Optimal receptive fields: overview
7.5 Receptive fields and information
7.6 Measuring mutual information
7.7 Maximising mutual information
7.8 van Hateren's model
7.9 Predictive coding of images
7.10 Evidence for predictive coding
7.11 Is receptive field spacing optimal?
7.12 Summary

8. Encoding visual contrast

8.1 Introduction
8.2 The compound eye
8.3 Not wasting entropy
8.4 Measuring the eye's response
8.5 Maximum entropy encoding
8.6 Efficiency of maximum entropy encoding
8.7 Summary

9. The neural Rubicon

9.1 Introduction
9.2 The Darwinian cost of efficiency
9.3 Crossing the neural Rubicon

Further reading

Appendices

A. Glossary
B. Mathematical symbols
C. Correlation and independence
D. A vector-matrix tutorial
E. Neural information methods
F. Key equations

References

Index

Book figures

Coloured discs optical illusion
Cones and cells diagram

The coloured discs image above flips between colour and grey discs every three seconds. The grey discs should appear to have a light colour wash, which is the colour aftereffect.

Note that these aftereffects are not exactly red-green and blue-yellow unless the computer screen has been calibrated precisely.

Most of the figures in this book (119 of the 126) are licensed for use as specified below (eg for teaching and for non-commercial attributed use). Only these licensed figures are contained in the ZIP file below.

Download the figures (ZIP, 15.6MB)

The download contains a single folder in which each of the 119 licensed figures is a separate PDF file.


Creative Commons License

Only figures from Principles of Neural Information Theory by James V Stone are licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.

Reviews

"This is a terrific book, which cannot fail to help any student who wants to understand precisely how energy and information constrain neural design. The tutorial approach adopted makes it more like a novel than a textbook. Consequently, both mathematically sophisticated readers and readers who prefer verbal explanations should be able to understand the material.

"Overall, Stone has managed to weave the disparate strands of neuroscience, psychophysics, and Shannon’s theory of communication into a coherent account of neural information theory. I only wish I'd had this text as a student!"

Peter Sterling, Professor of Neuroscience, University of Pennsylvania, co-author of Principles of Neural Design (2015)

"Essential reading for any student of the 'why' of neural coding: why do neurons send signals they way they do? Stone’s insightful, clear, and eminently readable synthesis of classic studies is a gateway to a rich, glorious literature on the brain. Student and professor alike will find much to spark their minds within. I shall be keeping this wonderful book close by, as a sterling reminder to ask not just how brains work, but why."

Mark Humphries, Professor of Computational Neuroscience, University of Nottingham, UK

"This excellent book provides an accessible introduction to an information theoretic perspective on how the brain works, and (more importantly) why it works that way. Using a wide range of examples, including both structural and functional aspects of brain organisation, Stone describes how simple optimisation principles derived from Shannon's information theory predict physiological parameters (eg axon diameter) with remarkable accuracy.

"These principles are distilled from original research papers, and the informal presentation style means that the book can be appreciated as an overview; but full mathematical details are also provided for dedicated readers. Stone has integrated results from a diverse range of experiments, and in so doing has produced an invaluable introduction to the nascent field of neural information theory."

Dr Robin Ince, Centre for Cognitive Neuroimaging, Institute of Neuroscience and Psychology, University of Glasgow, UK