The Topological and Dynamical Properties of Neural Networks and Their Functional Implications


Book Description

Understanding the inner workings of neural networks is a paramount scientific challenge, rooted in our pursuit to unravel the human mind and reproduce its intelligence in machines of our own making. This task transcends any single discipline, drawing knowledge and expertise from the brain sciences, physics, mathematics, network science, machine learning, and complex systems. Neural networks, both biological and artificial, are built on the same underlying principles: they are systems of nonlinear elements, connected via a network through which they communicate, that perform complex computations beyond the capacity of any single element. Neural networks display fantastically rich dynamical properties and computational abilities, and it is crucial to understand how the structural organization of a network shapes its dynamics and functional capabilities. In this thesis, I explore modularity, a prominent topological feature of many brain networks that holds a crucial key to unraveling the mystery of neural systems.
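
As a concrete illustration of the kind of topological analysis involved (a hypothetical sketch using the networkx library, not a method taken from the thesis), modules in a network can be detected and scored by their modularity:

    # Hypothetical sketch: detect modules and score modularity Q with networkx.
    import networkx as nx
    from networkx.algorithms import community

    G = nx.connected_caveman_graph(4, 6)   # toy network of 4 densely connected modules
    modules = community.greedy_modularity_communities(G)
    Q = community.modularity(G, modules)

    print(len(modules), "modules found, modularity Q =", round(Q, 3))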




Lectures in Supercomputational Neuroscience


Book Description

Written from the physicist's perspective, this book introduces computational neuroscience with in-depth contributions by systems neuroscientists. The authors set forth a conceptual model for complex networks of neurons that incorporates important features of the brain. The computational implementation on supercomputers, discussed in detail, enables you to adapt the algorithm for your own research. Worked-out examples of applications are provided.







Information-Theoretic Aspects of Neural Networks


Book Description

Information theoretics vis-à-vis neural networks generally embodies parametric entities and conceptual bases pertinent to memory considerations and information storage, information-theoretic cost functions, and neurocybernetics and self-organization. Existing studies only sparsely cover the entropy and/or cybernetic aspects of neural information. Information-Theoretic Aspects of Neural Networks cohesively explores this burgeoning discipline, covering topics such as:

- Shannon information and information dynamics
- neural complexity as an information-processing system
- memory and information storage in the interconnected neural web
- extremum (maximum and minimum) information-entropy neural network training
- non-conventional, statistical distance measures for neural network optimization
- symmetric and asymmetric characteristics of information-theoretic error metrics
- algorithmic-complexity-based representation of neural information-theoretic parameters
- genetic algorithms versus neural information
- dynamics of neurocybernetics viewed in the information-theoretic plane
- nonlinear, information-theoretic transfer functions of the neural cellular units
- statistical mechanics, neural networks, and information theory
- the semiotic framework of neural information processing and neural information flow
- fuzzy information and neural networks
- neural dynamics conceived through fuzzy information parameters
- neural information flow dynamics
- informatics of neural stochastic resonance

Information-Theoretic Aspects of Neural Networks is an exceptional resource for engineers, scientists, and computer scientists working in the field of artificial neural networks, as well as for biologists applying the concepts of communication theory and protocols to the functioning of the brain. The book explores new avenues in the field and creates a common platform for analyzing both the biological neural complex and artificial neural networks.
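
As a small, generic illustration of the entropy-based quantities treated here (a sketch of standard definitions, not code from the book), the Shannon entropy of a discrete activity distribution and a cross-entropy training cost can be computed directly:

    # Generic sketch: Shannon entropy of a discrete distribution, and
    # cross-entropy as an information-theoretic training cost.
    import numpy as np

    def shannon_entropy(p):
        """H(p) = -sum p*log2(p) in bits; zero-probability states are skipped."""
        p = np.asarray(p, dtype=float)
        p = p[p > 0]
        return -np.sum(p * np.log2(p))

    def cross_entropy_cost(targets, outputs, eps=1e-12):
        """Mean cross-entropy between binary targets and outputs in (0, 1)."""
        t = np.asarray(targets, dtype=float)
        o = np.clip(np.asarray(outputs, dtype=float), eps, 1 - eps)
        return -np.mean(t * np.log(o) + (1 - t) * np.log(1 - o))

    print(shannon_entropy([0.5, 0.25, 0.25]))             # 1.5 bits
    print(cross_entropy_cost([1, 0, 1], [0.9, 0.2, 0.8]))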




Functional Networks with Applications


Book Description

Artificial neural networks have been recognized as a powerful tool for learning and reproducing systems in various fields of application. Neural networks are inspired by brain behavior and consist of one or several layers of neurons, or computing units, connected by links. Each artificial neuron receives an input value from the input layer or from the neurons in the previous layer. It then computes a scalar output from a linear combination of the received inputs using a given scalar function (the activation function), which is assumed to be the same for all neurons. One of the main properties of neural networks is their ability to learn from data. There are two types of learning: structural and parametric. Structural learning consists of learning the topology of the network, that is, the number of layers, the number of neurons in each layer, and which neurons are connected. This process is done by trial and error until a good fit to the data is obtained. Parametric learning consists of learning the weight values for a given topology of the network. Since the neural functions are given, this learning process is achieved by estimating the connection weights based on the given information. To this end, an error function is minimized using well-known learning methods such as the backpropagation algorithm. Unfortunately, for these methods: (a) The function resulting from the learning process has no physical or engineering interpretation. Thus, neural networks are seen as black boxes.
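
The neuron model and parametric learning process described above can be sketched in a few lines (an illustrative example under standard assumptions, not code from the book): a single sigmoidal unit computes its output from a linear combination of inputs, and its connection weights are estimated by gradient descent on a squared-error function.

    # Illustrative sketch: one sigmoidal neuron, weights learned by gradient descent.
    import numpy as np

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    rng = np.random.default_rng(0)
    X = rng.normal(size=(100, 3))           # 100 samples, 3 input features
    true_w = np.array([1.5, -2.0, 0.5])
    y = sigmoid(X @ true_w)                 # targets generated by a known neuron

    w = np.zeros(3)                         # connection weights to be learned
    lr = 0.5
    for _ in range(2000):
        out = sigmoid(X @ w)                                  # activation(linear combination)
        grad = X.T @ ((out - y) * out * (1 - out)) / len(X)   # gradient of mean squared error
        w -= lr * grad                                        # gradient-descent weight update

    print(np.round(w, 2))                   # converges toward true_w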




Introduction to Neural Dynamics and Signal Transmission Delay


Book Description

In the design of a neural network, whether for biological modeling, cognitive simulation, numerical computation, or engineering applications, it is important to investigate the network's computational performance, which is usually described by the long-term behaviors, called dynamics, of the model equations. The purpose of this book is to give an introduction to the mathematical modeling and analysis of networks of neurons from the viewpoint of dynamical systems.
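
A representative example of such model equations (an illustrative delayed Hopfield-type network, not a formula quoted from the book) is the delay differential system

    % Additive network with signal transmission delays \tau_{ij}:
    \dot{x}_i(t) = -x_i(t) + \sum_{j=1}^{n} w_{ij}\, f\bigl(x_j(t - \tau_{ij})\bigr) + I_i,
    \qquad i = 1, \dots, n,

whose long-term behaviors, such as equilibria, periodic orbits, and their stability, are the dynamics in question.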




Neural Networks: Computational Models and Applications


Book Description

Neural Networks: Computational Models and Applications presents important theoretical and practical issues in neural networks, including the learning algorithms of feed-forward neural networks, various dynamical properties of recurrent neural networks, and winner-take-all networks and their applications across broad areas of computational intelligence: pattern recognition, uniform approximation, constrained optimization, NP-hard problems, and image segmentation. The book offers a compact, insightful understanding of the broad and rapidly growing domain of neural networks.
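
As an illustration of one of these models (a MAXNET-style sketch under common textbook assumptions, not the book's own formulation), a winner-take-all network lets mutually inhibiting units compete until only the most strongly driven unit remains active:

    # MAXNET-style winner-take-all: units repeatedly inhibit each other
    # until only the unit with the largest initial input stays nonzero.
    import numpy as np

    def winner_take_all(inputs, eps=0.2, max_iter=1000):
        """Competitive iteration; eps must stay below 1/(n-1) for n units."""
        x = np.asarray(inputs, dtype=float).copy()
        for _ in range(max_iter):
            x = np.maximum(x - eps * (x.sum() - x), 0.0)  # inhibition from all other units
            if np.count_nonzero(x) <= 1:
                break
        return x

    print(winner_take_all([0.8, 1.0, 0.6]))  # only the unit with the largest input survives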




Neural Networks Theory


Book Description

This book, written by a leader in neural network theory in Russia, uses mathematical methods in combination with complexity theory, nonlinear dynamics, and optimization. It details more than 40 years of Soviet and Russian neural network research and presents a systematized methodology for neural network synthesis. The theory is expansive, covering not only traditional topics such as network architecture but also neural continua in function spaces.




AY's Neuroanatomy of C. elegans for Computation


Book Description

AY's Neuroanatomy of C. elegans for Computation provides the neural circuitry database of the nematode Caenorhabditis elegans, both in printed form and in ASCII files on 5.25-inch diskettes (for use on IBM® and compatible personal computers, Macintosh® computers, and higher-level machines). Tables of connections among neuron classes, synapses among individual neurons, gap junctions among neurons, worm cells and their embryonic origin, and synthetically derived neuromuscular connections are presented together with the references from which the data were compiled and edited. Sample data files and the source code of FORTRAN and BASIC programs are provided to illustrate the use of mathematical tools for any researcher or student interested in examining a natural neural network and discovering what makes it tick.
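
By way of illustration (the file name and format here are hypothetical, not the book's actual ASCII layout), such a connectivity table might be examined with a few lines of Python, tallying the synapses each neuron makes and receives:

    # Hypothetical sketch: read an edge list of synaptic connections and
    # rank neurons by their total number of outgoing synapses.
    from collections import defaultdict

    out_synapses = defaultdict(int)
    in_synapses = defaultdict(int)

    with open("synapses.txt") as f:          # hypothetical edge list: "pre post count"
        for line in f:
            pre, post, count = line.split()
            out_synapses[pre] += int(count)
            in_synapses[post] += int(count)

    for neuron, n in sorted(out_synapses.items(), key=lambda kv: -kv[1])[:10]:
        print(neuron, n)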




An Introduction to the Modeling of Neural Networks


Book Description

This book, divided into four parts, is a beginning graduate-level introduction to neural networks.