Stone, J.V. (2016) Principles of Neural Information Theory: A Tutorial Introduction. Tutorial Introductions. Sebtel Press, Sheffield. ISBN 978-0993367922
Abstract
The human brain is the most complex computational machine known to science, even though its components (neurons) are slow and unreliable compared to a laptop computer. In this richly illustrated book, Shannon's mathematical theory of information is used to explore the computational efficiency of neurons, with special reference to visual perception. A diverse range of examples is used to show how information theory effectively defines fundamental and unbreachable limits on neural efficiency; limits which ultimately determine the neuroanatomical microstructure of the eye and brain. Written in an informal style, with a comprehensive glossary and tutorial appendices, this book is ideal for novices who wish to understand the essential principles of neural information theory.
Metadata
| Item Type: | Book |
| --- | --- |
| Authors/Creators: | Stone, J.V. |
| Copyright, Publisher and Additional Information: | © 2017 Sebtel Press. All rights reserved. No part of this book may be reproduced or transmitted in any form without written permission from the author. The author asserts his moral right to be identified as the author of this work in accordance with the Copyright, Designs and Patents Act 1988. |
| Dates: | 2016 |
| Institution: | The University of Sheffield |
| Academic Units: | The University of Sheffield > Faculty of Science (Sheffield) > Department of Psychology (Sheffield) |
| Depositing User: | Symplectic Sheffield |
| Date Deposited: | 01 Dec 2016 16:00 |
| Last Modified: | 01 Dec 2016 16:00 |
| Status: | Published |
| Publisher: | Sebtel Press |
| Series Name: | Tutorial Introductions |
| Open Archives Initiative ID (OAI ID): | oai:eprints.whiterose.ac.uk:105940 |