Artificial Neural Networks - ICANN 2008


Book Description

This two-volume set, LNCS 5163 and LNCS 5164, constitutes the refereed proceedings of the 18th International Conference on Artificial Neural Networks, ICANN 2008, held in Prague, Czech Republic, in September 2008. The 200 revised full papers presented were carefully reviewed and selected from more than 300 submissions. The first volume contains papers on the mathematical theory of neurocomputing, learning algorithms, kernel methods, statistical learning and ensemble techniques, support vector machines, reinforcement learning, evolutionary computing, hybrid systems, self-organization, control and robotics, signal and time series processing, and image processing.




Artificial Neural Networks - ICANN 2008


Book Description

This two-volume set, LNCS 5163 and LNCS 5164, constitutes the refereed proceedings of the 18th International Conference on Artificial Neural Networks, ICANN 2008, held in Prague, Czech Republic, in September 2008. The 200 revised full papers presented were carefully reviewed and selected from more than 300 submissions. The second volume is devoted to pattern recognition and data analysis, hardware and embedded systems, computational neuroscience, connectionistic cognitive science, neuroinformatics and neural dynamics. It also contains papers from two special sessions, "Coupling, Synchronies, and Firing Patterns: From Cognition to Disease" and "Constructive Neural Networks", and from two workshops, "New Trends in Self-Organization and Optimization of Artificial Neural Networks" and "Adaptive Mechanisms of the Perception-Action Cycle".




Artificial Neural Networks – ICANN 2009


Book Description

This volume is part of the two-volume proceedings of the 19th International Conference on Artificial Neural Networks (ICANN 2009), which was held in Cyprus during September 14–17, 2009. The ICANN conference is an annual meeting sponsored by the European Neural Network Society (ENNS), in cooperation with the International Neural Network Society (INNS) and the Japanese Neural Network Society (JNNS). ICANN 2009 was technically sponsored by the IEEE Computational Intelligence Society. This series of conferences has been held annually since 1991 in various European countries and covers the field of neurocomputing, learning systems and related areas. Artificial neural networks provide an information-processing structure inspired by biological nervous systems. They consist of a large number of highly interconnected processing elements, with the capability of learning by example. The field of artificial neural networks has evolved significantly in the last two decades, with active participation from diverse fields, such as engineering, computer science, mathematics, artificial intelligence, system theory, biology, operations research, and neuroscience. Artificial neural networks have been widely applied for pattern recognition, control, optimization, image processing, classification, signal processing, etc.




Artificial Neural Networks - ICANN 2010


Book Description

This volume is part of the three-volume proceedings of the 20th International Conference on Artificial Neural Networks (ICANN 2010) that was held in Thessaloniki, Greece, during September 15–18, 2010. ICANN is an annual meeting sponsored by the European Neural Network Society (ENNS) in cooperation with the International Neural Network Society (INNS) and the Japanese Neural Network Society (JNNS). This series of conferences has been held annually since 1991 in Europe, covering the field of neurocomputing, learning systems and other related areas. As in the past 19 events, ICANN 2010 provided a distinguished, lively and interdisciplinary discussion forum for researchers and scientists from around the globe. It offered a good chance to discuss the latest advances of research and also all the developments and applications in the area of Artificial Neural Networks (ANNs). ANNs provide an information processing structure inspired by biological nervous systems and they consist of a large number of highly interconnected processing elements (neurons). Each neuron is a simple processor with a limited computing capacity, typically restricted to a rule for combining input signals (utilizing an activation function) in order to calculate the output one. Output signals may be sent to other units along connections known as weights that excite or inhibit the signal being communicated. ANNs have the ability "to learn" by example (a large volume of cases) through several iterations without requiring a priori fixed knowledge of the relationships between process parameters.
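
To make the neuron model described above concrete, here is a minimal sketch of a single processing element that combines weighted input signals and applies an activation function. The function name, the weights and the choice of tanh are illustrative assumptions, not taken from the proceedings.

    import math

    def neuron_output(inputs, weights, bias, activation=math.tanh):
        """A single processing element: combine input signals with their
        connection weights and pass the result through an activation function."""
        # Positive weights excite and negative weights inhibit the incoming signal.
        net = sum(x * w for x, w in zip(inputs, weights)) + bias
        return activation(net)

    # Example: one neuron with two excitatory connections and one inhibitory one.
    print(neuron_output([0.5, 1.0, 0.2], [0.8, 0.4, -1.5], bias=0.1))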




Constructive Neural Networks


Book Description

This book presents a collection of invited works that consider constructive methods for neural networks, taken primarily from papers presented at a special session held during the 18th International Conference on Artificial Neural Networks (ICANN 2008) in September 2008 in Prague, Czech Republic. The book is devoted to constructive neural networks and other incremental learning algorithms that constitute an alternative to the standard method of finding a correct neural architecture by trial-and-error. These algorithms provide an incremental way of building neural networks with reduced topologies for classification problems. Furthermore, these techniques produce not only the multilayer topologies but also the values of the connecting synaptic weights, which are determined automatically by the constructing algorithm, avoiding the risk of becoming trapped in local minima as might occur when using gradient descent algorithms such as the popular back-propagation. In most cases the convergence of the constructing algorithms is guaranteed by the method used. Constructive methods for building neural networks can potentially create more compact and robust models which are easily implemented in hardware and used for embedded systems. Thus a growing amount of current research in neural networks is oriented towards this important topic. The purpose of this book is to gather together some of the leading investigators and research groups in this growing area, and to provide an overview of the most recent advances in the techniques being developed for constructive neural networks and their applications.
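
As a rough illustration of the incremental construction idea described above, and not any specific algorithm from the book, the sketch below grows a small network one hidden unit at a time and re-solves the linear output weights in closed form instead of using gradient descent. All names, the random candidate units and the stopping rule are assumptions made for the example.

    import numpy as np

    def constructive_fit(X, y, max_units=20, tol=1e-3, seed=0):
        """Toy constructive network: grow tanh hidden units one at a time and
        re-solve the linear output weights in closed form after each addition,
        stopping when a new unit no longer reduces the training error by `tol`."""
        rng = np.random.default_rng(seed)
        n, d = X.shape
        hidden_w, hidden_b = [], []
        out_w, best_err = None, np.inf
        for _ in range(max_units):
            # Candidate hidden unit with randomly drawn input weights and bias.
            w, b = rng.normal(size=d), rng.normal()
            H = np.tanh(X @ np.array(hidden_w + [w]).T + np.array(hidden_b + [b]))
            beta, *_ = np.linalg.lstsq(H, y, rcond=None)
            err = np.mean((H @ beta - y) ** 2)
            if best_err - err < tol:
                break  # the candidate does not help enough; stop growing the topology
            hidden_w.append(w); hidden_b.append(b)
            out_w, best_err = beta, err
        return np.array(hidden_w), np.array(hidden_b), out_w

    # Example: grow a network for XOR-like data; the topology is found automatically.
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    y = np.array([0.0, 1.0, 1.0, 0.0])
    W, b, beta = constructive_fit(X, y)
    print("hidden units grown:", len(W))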




Support Vector Machines for Pattern Classification


Book Description

A guide on the use of SVMs in pattern classification, including a rigorous performance comparison of classifiers and regressors. The book presents architectures for multiclass classification and function approximation problems, as well as evaluation criteria for classifiers and regressors. Features: Clarifies the characteristics of two-class SVMs; Discusses kernel methods for improving the generalization ability of neural networks and fuzzy systems; Contains ample illustrations and examples; Includes performance evaluation using publicly available data sets; Examines Mahalanobis kernels, empirical feature space, and the effect of model selection by cross-validation; Covers sparse SVMs, learning using privileged information, semi-supervised learning, multiple classifier systems, and multiple kernel learning; Explores incremental training based on batch training and active-set training methods, and decomposition techniques for linear programming SVMs; Discusses variable selection for support vector regressors.
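
For orientation, the central object behind much of this material is the kernel decision function of a two-class SVM, f(x) = sum_i alpha_i * y_i * K(x_i, x) + b, with the predicted class given by the sign of f(x). The following small sketch evaluates this function with an RBF kernel; the support vectors, multipliers and bias are made up purely for illustration and are not taken from the book.

    import numpy as np

    def rbf_kernel(x, z, gamma=0.5):
        """Gaussian (RBF) kernel K(x, z) = exp(-gamma * ||x - z||^2)."""
        return np.exp(-gamma * np.sum((x - z) ** 2))

    def svm_decision(x, support_vectors, labels, alphas, bias, gamma=0.5):
        """Two-class SVM decision value f(x) = sum_i alpha_i * y_i * K(s_i, x) + b."""
        return sum(a * y * rbf_kernel(s, x, gamma)
                   for a, y, s in zip(alphas, labels, support_vectors)) + bias

    # Made-up support vectors, labels in {-1, +1}, multipliers and bias.
    S = np.array([[1.0, 1.0], [2.0, 0.5], [0.0, 0.0]])
    y = np.array([+1, +1, -1])
    alpha = np.array([0.7, 0.3, 1.0])
    b = -0.1
    x_new = np.array([1.5, 1.0])
    print("predicted class:", int(np.sign(svm_decision(x_new, S, y, alpha, b))))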




The Relevance of the Time Domain to Neural Network Models


Book Description

A significant amount of effort in neural modeling is directed towards understanding the representation of information in various parts of the brain, such as cortical maps [6], and the paths along which sensory information is processed. Though the time domain is an integral aspect of the functioning of biological systems, it has proven very challenging to incorporate effectively in neural network models. A promising path that is being explored is to study the importance of synchronization in biological systems. Synchronization plays a critical role in the interactions between neurons in the brain, giving rise to perceptual phenomena and explaining multiple effects such as visual contour integration and the separation of superposed inputs. The purpose of this book is to provide a unified view of how the time domain can be effectively employed in neural network models. A first direction to consider is to deploy oscillators that model temporal firing patterns of a neuron or a group of neurons. There is a growing body of research on the use of oscillatory neural networks, and their ability to synchronize under the right conditions. Such networks of synchronizing elements have been shown to be effective in image processing and segmentation tasks, and also in solving the binding problem, which is of great significance in the field of neuroscience. The oscillatory neural models can be employed at multiple scales of abstraction, ranging from individual neurons, to groups of neurons using Wilson-Cowan modeling techniques, and eventually to the behavior of entire brain regions as revealed in oscillations observed in EEG recordings. A second interesting direction to consider is to understand the effect of different neural network topologies on their ability to create the desired synchronization. A third direction of interest is the extraction of temporal signaling patterns from brain imaging data such as EEG and fMRI. Hence this topic is of emerging interest in the brain sciences, as imaging techniques are able to resolve sufficient temporal detail to provide an insight into how the time domain is deployed in cognitive function. The following broad topics are covered in the book: synchronization, phase-locking behavior, image processing, image segmentation, temporal pattern analysis, EEG analysis, fMRI analysis, network topology and synchronizability, cortical interactions involving synchronization, and oscillatory neural networks. This book will benefit readers interested in the topics of computational neuroscience, applying neural network models to understand brain function, extracting temporal information from brain imaging data, and emerging techniques for image segmentation using oscillatory networks.
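
To give a concrete feel for the synchronizing oscillator networks discussed above, the sketch below simulates the Kuramoto model, a standard abstraction of phase-coupled oscillators. The choice of this particular model and all parameter values are illustrative assumptions, not taken from the book; the order parameter approaches 1 as the oscillators phase-lock.

    import numpy as np

    def kuramoto_step(theta, omega, K, dt=0.01):
        """One Euler step of the Kuramoto model of coupled phase oscillators:
        d(theta_i)/dt = omega_i + (K / N) * sum_j sin(theta_j - theta_i)."""
        coupling = np.sin(theta[None, :] - theta[:, None]).sum(axis=1)
        return theta + dt * (omega + (K / len(theta)) * coupling)

    def order_parameter(theta):
        """Magnitude of the mean phase vector: close to 1 means the population
        has phase-locked (synchronized), close to 0 means incoherent firing."""
        return np.abs(np.exp(1j * theta).mean())

    rng = np.random.default_rng(0)
    theta = rng.uniform(0, 2 * np.pi, size=50)   # initial phases
    omega = rng.normal(1.0, 0.1, size=50)        # natural firing frequencies
    print("before coupling:", round(order_parameter(theta), 3))
    for _ in range(5000):
        theta = kuramoto_step(theta, omega, K=2.0)
    print("after coupling: ", round(order_parameter(theta), 3))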




Rough Sets and Current Trends in Computing


Book Description

This book constitutes the refereed proceedings of the 7th International Conference on Rough Sets and Current Trends in Computing, RSCTC 2010, held in Warsaw, Poland, in June 2010.




Advances in Intelligent Systems and Computing II


Book Description

This book reports on new theories and applications in the field of intelligent systems and computing. It covers computational and artificial intelligence methods, as well as advances in computer vision, current issues in big data and cloud computing, computational linguistics, and cyber-physical systems. It also reports on data mining and knowledge extraction technologies, as well as central issues in intelligent information management. Written by active researchers, the respective chapters are based on papers presented at the International Conference on Computer Science and Information Technologies (CSIT 2017), held on September 5–8, 2017, in Lviv, Ukraine; and at two workshops accompanying the conference: one on inductive modeling, jointly organized by the Lviv Polytechnic National University and the National Academy of Science of Ukraine; and another on project management, which was jointly organized by the Lviv Polytechnic National University, the International Project Management Association, the Ukrainian Project Management Association, the Kazakhstan Project Management Association, and Nazarbayev University. Given its breadth of coverage, the book provides academics and professionals with extensive information and a timely snapshot of the field of intelligent systems, and is sure to foster new discussions and collaborations among different groups.