Analogical Connections


Book Description

Presenting research on the computational abilities of connectionist, neural, and neurally inspired systems, this series emphasizes the question of how connectionist or neural network models can be made to perform rapid, short-term types of computation that are useful in higher-level cognitive processes. The most recent volumes are directed mainly at researchers in connectionism, analogy, metaphor, and case-based reasoning, but are also suitable for graduate courses in those areas.










High-level Connectionist Models


Book Description

Presenting research on the computational abilities of connectionist, neural, and neurally inspired systems, this series emphasizes the question of how connectionist or neural network models can be made to perform rapid, short-term types of computation useful in higher cognitive processes. The most recent volumes are directed mainly at researchers in connectionism, analogy, metaphor, and case-based reasoning, but are also suitable for graduate courses in those areas.




Mathematical Perspectives on Neural Networks


Book Description

Recent years have seen an explosion of new mathematical results on learning and processing in neural networks. This body of results rests on a breadth of mathematical background that few specialists fully possess. In a format intermediate between a textbook and a collection of research articles, this book presents a sample of these results and fills in the necessary background in areas such as computability theory, computational complexity theory, the theory of analog computation, stochastic processes, dynamical systems, control theory, time-series analysis, Bayesian analysis, regularization theory, information theory, computational learning theory, and mathematical statistics.

Mathematical models of neural networks display an amazing richness and diversity. Neural networks can be formally modeled as computational systems, as physical or dynamical systems, and as statistical analyzers. Within each of these three broad perspectives, there are a number of particular approaches. For each of 16 particular mathematical perspectives on neural networks, the contributing authors provide introductions to the background mathematics and address questions such as:

* Exactly what mathematical systems are used to model neural networks from the given perspective?
* What formal questions about neural networks can then be addressed?
* What are typical results that can be obtained?
* What are the outstanding open problems?

A distinctive feature of this volume is that for each perspective presented in one of the contributed chapters, the first editor has provided a moderately detailed summary of the formal results and the requisite mathematical concepts. These summaries are presented in four chapters that tie together the 16 contributed chapters: three develop a coherent view of the three general perspectives (computational, dynamical, and statistical), and the fourth assembles these three perspectives into a unified overview of the neural networks field.




Connectionist Models of Behaviour and Cognition II


Book Description

The neural computational approach to cognitive and psychological processes is relatively new. However, the Neural Computation and Psychology Workshop (NCPW) series, first held 16 years ago, lies at the heart of this fast-moving discipline thanks to its interdisciplinary nature, bringing together researchers from disciplines such as artificial intelligence, cognitive science, computer science, neurobiology, philosophy, and psychology to discuss their work on models of cognitive processes. Once again, the Eleventh Neural Computation and Psychology Workshop (NCPW11), held in 2008 at the University of Oxford (England), reflected the interdisciplinary nature and wide range of backgrounds of this field. This volume collects peer-reviewed contributions covering most of the papers presented at NCPW11 by researchers from four continents and 15 countries.




Advanced Intelligent Computing Theories and Applications


Book Description

This volume, together with the companion volumes LNCS 4681 and LNAI 4682, constitutes the refereed proceedings of the Third International Conference on Intelligent Computing, held in Qingdao, China, in August 2007. The conference sought to unify contemporary intelligent computing techniques into an integral framework, highlighting trends in advanced computational intelligence and linking theoretical research with applications.







Theoretical Advances in Neural Computation and Learning


Book Description

For any research field to have a lasting impact, there must be a firm theoretical foundation. Neural networks research is no exception. Some of the foundational concepts, established several decades ago, led to the early promise of developing machines exhibiting intelligence. The motivation for studying such machines comes from the fact that the brain is far more efficient at visual processing and speech recognition than existing computers. Undoubtedly, neurobiological systems employ very different computational principles. The study of artificial neural networks aims at understanding these computational principles and applying them to the solution of engineering problems. Thanks to recent advances in both device technology and computational science, we are currently witnessing explosive growth in the study of neural networks and their applications. It may take many years before we have a complete understanding of the mechanisms of neural systems. Before this ultimate goal can be achieved, answers are needed to important fundamental questions such as: (a) what can neural networks do that traditional computing techniques cannot; (b) how does the complexity of the network required for an application relate to the complexity of the problem; and (c) how much training data are required for the resulting network to learn properly? Everyone working in the field has attempted to answer these questions, but general solutions remain elusive. However, encouraging progress in the study of specific neural models has been made by researchers from various disciplines.