
Product Review

The brain is the most complex computational machine known to science, even though its components (neurons) are slow and unreliable compared to those of a laptop computer. In this richly illustrated book, Shannon's mathematical theory of information is used to explore the computational efficiency of neurons, with special reference to visual perception and the efficient coding hypothesis. Evidence from a diverse range of research papers is used to show how information theory defines absolute limits on neural processing, limits that ultimately determine the neuroanatomical microstructure of the eye and brain. Written in an informal style, with a comprehensive glossary, tutorial appendices, and a list of annotated Further Readings, this book is an ideal introduction to the principles of neural information theory.

Similar Products

Artificial Intelligence Engines: A Tutorial Introduction to the Mathematics of Deep Learning
Information Theory: A Tutorial Introduction
An Introductory Course in Computational Neuroscience (Computational Neuroscience Series)
Bayes' Rule: A Tutorial Introduction to Bayesian Analysis
The Book of Why: The New Science of Cause and Effect
Machine Learning: An Applied Mathematics Introduction
Reinforcement Learning: An Introduction (Adaptive Computation and Machine Learning series)
The Deep Learning Revolution (The MIT Press)