Advances in Neural Information Processing Systems 10


Book Description

The annual conference on Neural Information Processing Systems (NIPS) is the flagship conference on neural computation. These proceedings contain all of the papers that were presented.




Neuronal Dynamics


Book Description

This solid introduction uses the principles of physics and the tools of mathematics to approach fundamental questions of neuroscience.




Neural Information Processing


Book Description

The four-volume set LNCS 9947, LNCS 9948, LNCS 9949, and LNCS 9950 constitutes the proceedings of the 23rd International Conference on Neural Information Processing, ICONIP 2016, held in Kyoto, Japan, in October 2016. The 296 full papers presented were carefully reviewed and selected from 431 submissions. The four volumes are organized in topical sections on deep and reinforcement learning; big data analysis; neural data analysis; robotics and control; bio-inspired/energy-efficient information processing; whole brain architecture; neurodynamics; bioinformatics; biomedical engineering; data mining and cybersecurity workshop; machine learning; neuromorphic hardware; sensory perception; pattern recognition; social networks; brain-machine interface; computer vision; time series analysis; data-driven approaches for extracting latent features; topological and graph-based clustering methods; computational intelligence; data mining; deep neural networks; computational and cognitive neurosciences; and theory and algorithms.




An Introduction to Neural Information Processing


Book Description

This book provides an overview of neural information processing research, which is one of the most important branches of neuroscience today. Neural information processing is an interdisciplinary subject, and the emerging interaction between neuroscience and mathematics, physics, and information science plays a key role in the development of this field. This book begins with the anatomy of the central nervous system, followed by an introduction to various information processing models at different levels. The authors all have extensive experience in mathematics, physics and biomedical engineering, and have worked in this multidisciplinary area for a number of years. They present classical examples of how the pioneers in this field used theoretical analysis, mathematical modeling and computer simulation to solve neurobiological problems, and share their experiences and lessons learned. The book is intended for researchers and students with a mathematics, physics or informatics background who are interested in brain research and keen to understand the necessary neurobiology and how they can use their specialties to address neurobiological problems. It also provides inspiration for neuroscience students who are interested in learning how to use mathematics, physics or informatics approaches to solve problems in their field.





Advances in Neural Information Processing Systems 13


Book Description

The annual conference on Neural Information Processing Systems (NIPS) is the flagship conference on neural computation. The conference is interdisciplinary, with contributions in algorithms, learning theory, cognitive science, neuroscience, vision, speech and signal processing, reinforcement learning and control, implementations, and diverse applications. Only about 30 percent of the papers submitted are accepted for presentation at NIPS, so the quality is exceptionally high. These proceedings contain all of the papers that were presented at the 2000 conference.




Biophysics of Computation


Book Description

Neural network research often builds on the fiction that neurons are simple linear threshold units, completely neglecting the highly dynamic and complex nature of synapses, dendrites, and voltage-dependent ionic currents. Biophysics of Computation: Information Processing in Single Neurons challenges this notion, using richly detailed experimental and theoretical findings from cellular biophysics to explain the repertoire of computational functions available to single neurons. The author shows how individual nerve cells can multiply, integrate, or delay synaptic inputs and how information can be encoded in the voltage across the membrane, in the intracellular calcium concentration, or in the timing of individual spikes.

Key topics covered include the linear cable equation; cable theory as applied to passive dendritic trees and dendritic spines; chemical and electrical synapses and how to treat them from a computational point of view; nonlinear interactions of synaptic input in passive and active dendritic trees; the Hodgkin-Huxley model of action potential generation and propagation; phase space analysis; linking stochastic ionic channels to membrane-dependent currents; calcium and potassium currents and their role in information processing; the role of diffusion, buffering and binding of calcium, and other messenger systems in information processing and storage; short- and long-term models of synaptic plasticity; simplified models of single cells; stochastic aspects of neuronal firing; the nature of the neuronal code; and unconventional models of sub-cellular computation.

Biophysics of Computation: Information Processing in Single Neurons serves as an ideal text for advanced undergraduate and graduate courses in cellular biophysics, computational neuroscience, and neural networks, and will appeal to students and professionals in neuroscience, electrical and computer engineering, and physics.




Neural Information Processing and VLSI


Book Description

Neural Information Processing and VLSI provides a unified treatment of this important subject for use in classrooms, industry, and research laboratories, in order to develop advanced artificial and biologically inspired neural networks using compact analog and digital VLSI parallel processing techniques. It systematically presents various neural network paradigms, computing architectures, and the associated electronic/optical implementations using efficient VLSI design methodologies. Conventional digital machines cannot perform computationally intensive tasks with satisfactory performance in areas such as intelligent perception, including visual and auditory signal processing, recognition, understanding, and logical reasoning, where humans and even small animals do a superb job. Recent research advances in artificial and biological neural networks have established an important foundation for high-performance information processing with more efficient use of computing resources. The secret lies in design optimization at various levels of computing and communication in intelligent machines. Each neural network system consists of massively parallel, distributed signal processors, with every processor performing very simple operations and thus consuming little power. The large computational capabilities of these systems, ranging from hundreds of giga-operations to several tera-operations per second, derive from collectively parallel processing and efficient data routing through well-structured interconnection networks. Deep-submicron very large-scale integration (VLSI) technologies can integrate tens of millions of transistors in a single silicon chip for complex signal processing and information manipulation. The book is suitable for those interested in efficient neurocomputing as well as those curious about neural network system applications. It has been especially prepared for use as a text for advanced undergraduate and first-year graduate students, and is an excellent reference book for researchers and scientists working in the fields covered.




Computational Neuroscience


Book Description

This volume includes papers originally presented at the 7th annual Computational Neuroscience Meeting (CNS'98), held in July 1998 at the Fess Parker Doubletree Inn in Santa Barbara, California. The CNS meetings bring together computational neuroscientists representing many different fields and backgrounds, as well as many different experimental preparations and theoretical approaches. The papers published here range from pure experimental neurobiology to neuro-ethology, mathematics, physics, and engineering. In all cases the research described is focused on understanding how nervous systems compute. The subjects of the research span a highly diverse set of preparations, modeling approaches, and analysis techniques. Accordingly, this volume reflects the breadth and depth of current research in computational neuroscience taking place throughout the world.