Computation and Neural Systems


Book Description

Computational neuroscience is best defined by its focus on understanding the nervous system as a computational device rather than by a particular experimental technique. Accordingly, while the majority of the papers in this book describe analysis and modeling efforts, other papers describe the results of new biological experiments explicitly placed in the context of computational issues. The distribution of subjects in Computation and Neural Systems reflects the current state of the field. In addition to the scientific results presented here, numerous papers also describe the ongoing technical developments that are critical for the continued growth of computational neuroscience. Computation and Neural Systems includes papers presented at the First Annual Computation and Neural Systems meeting held in San Francisco, CA, July 26--29, 1992.




Single Neuron Computation


Book Description

This book contains twenty-two original contributions that provide a comprehensive overview of computational approaches to understanding single-neuron structure. The focus on cellular-level processes is twofold. From a computational neuroscience perspective, a thorough understanding of the information processing performed by single neurons leads to an understanding of circuit- and systems-level activity. From the standpoint of artificial neural networks (ANNs), a single real neuron is as complex an operational unit as an entire ANN, and formalizing the complex computations performed by real neurons is essential to the design of enhanced processor elements for use in the next generation of ANNs. The book covers computation in dendrites and spines, computational aspects of ion channels, synapses, patterned discharge and multistate neurons, and stochastic models of neuron dynamics. It is the most up-to-date presentation of biophysical and computational methods.
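
As a taste of the cellular-level material the contributions cover, the sketch below simulates a single two-state (closed/open) ion channel as a discrete-time Markov chain and compares the empirical open probability with its steady-state value. This is a minimal Python sketch; the rate constants, time step, and simulation length are illustrative placeholders, not values taken from the book.

import numpy as np

# Minimal sketch: one two-state (closed <-> open) ion channel simulated as a
# discrete-time Markov chain. All parameter values are illustrative placeholders.
rng = np.random.default_rng(0)

dt = 0.1          # time step (ms)
t_max = 5000.0    # total simulated time (ms)
alpha = 0.05      # opening rate (1/ms)
beta = 0.10       # closing rate (1/ms)

n_steps = int(t_max / dt)
state = 0                        # 0 = closed, 1 = open
open_trace = np.empty(n_steps)

for i in range(n_steps):
    if state == 0 and rng.random() < alpha * dt:
        state = 1                # channel opens
    elif state == 1 and rng.random() < beta * dt:
        state = 0                # channel closes
    open_trace[i] = state

# A single channel gives a noisy estimate; compare it with the steady-state
# open probability alpha / (alpha + beta).
print("simulated open probability:  ", open_trace.mean())
print("theoretical open probability:", alpha / (alpha + beta))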




From Neuron to Cognition via Computational Neuroscience


Book Description

A comprehensive, integrated, and accessible textbook presenting core neuroscientific topics from a computational perspective, tracing a path from cells and circuits to behavior and cognition. This textbook presents a wide range of subjects in neuroscience from a computational perspective, offering a comprehensive, integrated introduction to core topics and using computational tools to trace a path from neurons and circuits to behavior and cognition. The chapters show how computational neuroscience—methods for modeling the causal interactions underlying neural systems—complements empirical research in advancing the understanding of brain and behavior. The chapters—all by leaders in the field, and carefully integrated by the editors—cover such subjects as action and motor control; neuroplasticity, neuromodulation, and reinforcement learning; vision; and language—the core of human cognition. The book can be used for advanced undergraduate or graduate level courses. It presents all necessary background in neuroscience beyond basic facts about neurons and synapses and general ideas about the structure and function of the human brain. Students should be familiar with differential equations and probability theory, and be able to pick up the basics of programming in MATLAB and/or Python. Slides, exercises, and other ancillary materials are freely available online, and many of the models described in the chapters are documented in the brain operation database, BODB (which is also described in a book chapter).

Contributors: Michael A. Arbib, Joseph Ayers, James Bednar, Andrej Bicanski, James J. Bonaiuto, Nicolas Brunel, Jean-Marie Cabelguen, Carmen Canavier, Angelo Cangelosi, Richard P. Cooper, Carlos R. Cortes, Nathaniel Daw, Paul Dean, Peter Ford Dominey, Pierre Enel, Jean-Marc Fellous, Stefano Fusi, Wulfram Gerstner, Frank Grasso, Jacqueline A. Griego, Ziad M. Hafed, Michael E. Hasselmo, Auke Ijspeert, Stephanie Jones, Daniel Kersten, Jeremie Knuesel, Owen Lewis, William W. Lytton, Tomaso Poggio, John Porrill, Tony J. Prescott, John Rinzel, Edmund Rolls, Jonathan Rubin, Nicolas Schweighofer, Mohamed A. Sherif, Malle A. Tagamets, Paul F. M. J. Verschure, Nathan Vierling-Claasen, Xiao-Jing Wang, Christopher Williams, Ransom Winder, Alan L. Yuille
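
As a small illustration of one of the topics listed above, the following Python sketch implements temporal-difference (TD(0)) value learning on a toy three-state chain; reinforcement learning of this kind is among the subjects the chapters relate to neuromodulation. The chain, rewards, and learning parameters are invented for illustration and are not drawn from any chapter.

import numpy as np

# Minimal TD(0) sketch on a deterministic three-state chain (0 -> 1 -> 2).
# All parameter values are illustrative placeholders.
n_states = 3
V = np.zeros(n_states)                # value estimates, V(terminal) stays 0
alpha = 0.1                           # learning rate
gamma = 0.9                           # discount factor
reward = np.array([0.0, 0.0, 1.0])    # reward received on entering each state

for episode in range(500):
    s = 0
    while s < n_states - 1:
        s_next = s + 1                            # deterministic transition
        r = reward[s_next]
        delta = r + gamma * V[s_next] - V[s]      # TD error
        V[s] += alpha * delta
        s = s_next

# Expect roughly V = [0.9, 1.0, 0.0] for this chain.
print("learned state values:", V)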




Introduction to the Theory of Neural Computation


Book Description

A comprehensive introduction to the neural network models currently under intensive study for computational applications. The book also covers neural network applications in a variety of problems of both theoretical and practical interest.




Biophysics of Computation


Book Description

Neural network research often builds on the fiction that neurons are simple linear threshold units, completely neglecting the highly dynamic and complex nature of synapses, dendrites, and voltage-dependent ionic currents. Biophysics of Computation: Information Processing in Single Neurons challenges this notion, using richly detailed experimental and theoretical findings from cellular biophysics to explain the repertoire of computational functions available to single neurons. The author shows how individual nerve cells can multiply, integrate, or delay synaptic inputs and how information can be encoded in the voltage across the membrane, in the intracellular calcium concentration, or in the timing of individual spikes.

Key topics covered include the linear cable equation; cable theory as applied to passive dendritic trees and dendritic spines; chemical and electrical synapses and how to treat them from a computational point of view; nonlinear interactions of synaptic input in passive and active dendritic trees; the Hodgkin-Huxley model of action potential generation and propagation; phase space analysis; linking stochastic ionic channels to membrane-dependent currents; calcium and potassium currents and their role in information processing; the role of diffusion, buffering and binding of calcium, and other messenger systems in information processing and storage; short- and long-term models of synaptic plasticity; simplified models of single cells; stochastic aspects of neuronal firing; the nature of the neuronal code; and unconventional models of sub-cellular computation.

Biophysics of Computation: Information Processing in Single Neurons serves as an ideal text for advanced undergraduate and graduate courses in cellular biophysics, computational neuroscience, and neural networks, and will appeal to students and professionals in neuroscience, electrical and computer engineering, and physics.
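
To give a concrete flavor of the material, the sketch below integrates a single-compartment Hodgkin-Huxley model, one of the key topics listed above, using standard squid-axon parameters and a simple forward-Euler scheme. It is a minimal illustration rather than code from the book; the injected current, integration step, and spike-detection threshold are arbitrary choices.

import numpy as np

# Minimal single-compartment Hodgkin-Huxley sketch with standard squid-axon
# parameters, integrated by forward Euler. Illustrative only.
C_m = 1.0                              # membrane capacitance (uF/cm^2)
g_Na, g_K, g_L = 120.0, 36.0, 0.3      # maximal conductances (mS/cm^2)
E_Na, E_K, E_L = 50.0, -77.0, -54.4    # reversal potentials (mV)

def alpha_m(V): return 0.1 * (V + 40.0) / (1.0 - np.exp(-(V + 40.0) / 10.0))
def beta_m(V):  return 4.0 * np.exp(-(V + 65.0) / 18.0)
def alpha_h(V): return 0.07 * np.exp(-(V + 65.0) / 20.0)
def beta_h(V):  return 1.0 / (1.0 + np.exp(-(V + 35.0) / 10.0))
def alpha_n(V): return 0.01 * (V + 55.0) / (1.0 - np.exp(-(V + 55.0) / 10.0))
def beta_n(V):  return 0.125 * np.exp(-(V + 65.0) / 80.0)

dt, t_max = 0.01, 50.0    # integration step and duration (ms)
I_ext = 10.0              # injected current (uA/cm^2)

V = -65.0
m = alpha_m(V) / (alpha_m(V) + beta_m(V))    # gating variables at rest
h = alpha_h(V) / (alpha_h(V) + beta_h(V))
n = alpha_n(V) / (alpha_n(V) + beta_n(V))

spike_count, above = 0, False
for step in range(int(t_max / dt)):
    I_Na = g_Na * m**3 * h * (V - E_Na)
    I_K  = g_K * n**4 * (V - E_K)
    I_L  = g_L * (V - E_L)
    V += dt * (I_ext - I_Na - I_K - I_L) / C_m
    m += dt * (alpha_m(V) * (1.0 - m) - beta_m(V) * m)
    h += dt * (alpha_h(V) * (1.0 - h) - beta_h(V) * h)
    n += dt * (alpha_n(V) * (1.0 - n) - beta_n(V) * n)
    if V > 0.0 and not above:    # count upward crossings of 0 mV as spikes
        spike_count += 1
    above = V > 0.0

print("spikes in", t_max, "ms:", spike_count)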




Computational Systems Neurobiology


Book Description

Computational neuroscience and systems biology are among the main domains of life science research where mathematical modeling has made a difference. This book introduces the many different types of computational studies one can develop to study neuronal systems. It is aimed at undergraduate students starting their research in computational neurobiology, as well as more senior researchers who would like, or need, to move toward computational approaches. Depending on their specific project, readers can then move on to one of the excellent, more specialized textbooks available in the field. The first part of the book deals with molecular systems biology. Functional genomics is introduced through examples of transcriptomics and proteomics studies of neurobiological interest. Quantitative modelling of biochemical systems is presented both in homogeneous compartments and using spatial descriptions. A second part deals with the various approaches to modelling single-neuron physiology and moves naturally on to neuronal networks. A further part focuses on the development of neurons and neuronal systems, and the book closes with a series of methodological chapters. From molecules to the organ, thinking at the level of systems is transforming biology and its impact on society. This book aims to help the reader hop on that train right in the engine cab.
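
As an example of the kind of quantitative biochemical model the first part introduces, the sketch below integrates mass-action kinetics for reversible binding, A + B <-> AB, in a single well-mixed (homogeneous) compartment and checks the result against the analytical equilibrium. The rate constants, concentrations, and time step are illustrative placeholders, not values from the book.

# Minimal mass-action sketch: reversible binding A + B <-> AB in one
# well-mixed compartment, integrated by forward Euler. Illustrative values.
kf = 1.0     # forward (binding) rate constant (1/(uM*s))
kr = 0.5     # reverse (unbinding) rate constant (1/s)

A, B, AB = 1.0, 1.0, 0.0     # initial concentrations (uM)
dt, t_max = 0.001, 10.0      # time step and duration (s)

for _ in range(int(t_max / dt)):
    flux = kf * A * B - kr * AB    # net binding flux (uM/s)
    A -= dt * flux
    B -= dt * flux
    AB += dt * flux

# At equilibrium kf*A*B = kr*AB; for these values [AB] settles near 0.5 uM.
print("simulated equilibrium [AB]:", AB)
print("Kd = kr/kf =", kr / kf, "uM")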




Neural Engineering


Book Description

A synthesis of current approaches to adapting engineering tools to the study of neurobiological systems.




Neural Networks and Analog Computation


Book Description

The theoretical foundations of Neural Networks and Analog Computation conceptualize neural networks as a particular type of computer consisting of multiple assemblies of basic processors interconnected in an intricate structure. Examining these networks under various resource constraints reveals a continuum of computational devices, several of which coincide with well-known classical models. On a mathematical level, the treatment of neural computation not only enriches the theory of computation but also explicates the computational complexity associated with biological networks, adaptive engineering tools, and related models from the fields of control theory and nonlinear dynamics. The material in this book will be of interest to researchers in a variety of engineering and applied sciences disciplines. In addition, the work may provide the basis of a graduate-level seminar in neural networks for computer science students.




Neuronal Dynamics


Book Description

This solid introduction uses the principles of physics and the tools of mathematics to approach fundamental questions of neuroscience.




An Introductory Course in Computational Neuroscience


Book Description

A textbook for students with limited background in mathematics and computer coding, emphasizing computer tutorials that guide readers in producing models of neural behavior. This introductory text teaches students to understand, simulate, and analyze the complex behaviors of individual neurons and brain circuits. It is built around computer tutorials that guide students in producing models of neural behavior, with the associated Matlab code freely available online. From these models students learn how individual neurons function and how, when connected, neurons cooperate in a circuit. The book demonstrates through simulated models how oscillations, multistability, post-stimulus rebounds, and chaos can arise within either single neurons or circuits, and it explores their roles in the brain. The book first presents essential background in neuroscience, physics, mathematics, and Matlab, with explanations illustrated by many example problems. Subsequent chapters cover the neuron and spike production; single spike trains and the underlying cognitive processes; conductance-based models; the simulation of synaptic connections; firing-rate models of large-scale circuit operation; dynamical systems and their components; synaptic plasticity; and techniques for analysis of neuron population datasets, including principal components analysis, hidden Markov modeling, and Bayesian decoding. Accessible to undergraduates in life sciences with limited background in mathematics and computer coding, the book can be used in a “flipped” or “inverted” teaching approach, with class time devoted to hands-on work on the computer tutorials. It can also be a resource for graduate students in the life sciences who wish to gain computing skills and a deeper knowledge of neural function and neural circuits.
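
The book's tutorials are written in Matlab; as a flavor of the kind of model they build, here is a minimal leaky integrate-and-fire neuron in Python. The parameter values and injected current are illustrative assumptions, not taken from the book.

# Minimal leaky integrate-and-fire sketch with illustrative parameters,
# integrated by forward Euler.
tau_m   = 10.0     # membrane time constant (ms)
E_L     = -70.0    # leak (resting) potential (mV)
V_th    = -50.0    # spike threshold (mV)
V_reset = -75.0    # reset potential after a spike (mV)
R_m     = 10.0     # membrane resistance (MOhm)
I_ext   = 2.5      # injected current (nA)

dt, t_max = 0.1, 500.0    # time step and duration (ms)
V = E_L
spike_times = []

for step in range(int(t_max / dt)):
    t = step * dt
    # dV/dt = (E_L - V + R_m * I_ext) / tau_m
    V += dt * (E_L - V + R_m * I_ext) / tau_m
    if V >= V_th:              # threshold crossing: record a spike and reset
        spike_times.append(t)
        V = V_reset

print("number of spikes:", len(spike_times))
print("mean firing rate (Hz):", 1000.0 * len(spike_times) / t_max)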