Computational Learning Theory and Natural Learning Systems: Making learning systems practical


Book Description

This is the fourth and final volume of papers from a series of workshops called "Computational Learning Theory and 'Natural' Learning Systems." The purpose of the workshops was to explore the emerging intersection of theoretical learning research and natural learning systems. The workshops drew researchers from three historically distinct styles of learning research: computational learning theory, neural networks, and machine learning (a subfield of AI). Volume I of the series introduces the general focus of the workshops. Volume II looks at specific areas of interaction between theory and experiment. Volumes III and IV focus on key areas of learning systems that have developed recently. Volume III looks at the problem of "Selecting Good Models." The present volume, Volume IV, looks at ways of "Making Learning Systems Practical." The editors divide the twenty-one contributions into four sections. The first three cover critical problem areas: 1) scaling up from small problems to realistic ones with large input dimensions, 2) increasing efficiency and robustness of learning methods, and 3) developing strategies to obtain good generalization from limited or small data samples. The fourth section discusses examples of real-world learning systems. Contributors: Klaus Abraham-Fuchs, Yasuhiro Akiba, Hussein Almuallim, Arunava Banerjee, Sanjay Bhansali, Alvis Brazma, Gustavo Deco, David Garvin, Zoubin Ghahramani, Mostefa Golea, Russell Greiner, Mehdi T. Harandi, John G. Harris, Haym Hirsh, Michael I. Jordan, Shigeo Kaneda, Marjorie Klenin, Pat Langley, Yong Liu, Patrick M. Murphy, Ralph Neuneier, E.M. Oblow, Dragan Obradovic, Michael J. Pazzani, Barak A. Pearlmutter, Nageswara S.V. Rao, Peter Rayner, Stephanie Sage, Martin F. Schlang, Bernd Schurmann, Dale Schuurmans, Leon Shklar, V. Sundareswaran, Geoffrey Towell, Johann Uebler, Lucia M. Vaina, Takefumi Yamazaki, Anthony M. Zador.




Computational Learning Theory and Natural Learning Systems: Intersections between theory and experiment


Book Description

These original contributions converge on an exciting and fruitful intersection of three historically distinct areas of learning research: computational learning theory, neural networks, and symbolic machine learning. Bridging theory and practice, computer science and psychology, they consider general issues in learning systems that could provide constraints for theory and at the same time interpret theoretical results in the context of experiments with actual learning systems. In all, nineteen chapters address questions such as, What is a natural system? How should learning systems gain from prior knowledge? If prior knowledge is important, how can we quantify how important? What makes a learning problem hard? How are neural networks and symbolic machine learning approaches similar? Is there a fundamental difference between the kinds of tasks a neural network can easily solve and those a symbolic algorithm can easily solve? Stephen J. Hanson heads the Learning Systems Department at Siemens Corporate Research and is a Visiting Member of the Research Staff and Research Collaborator at the Cognitive Science Laboratory at Princeton University. George A. Drastal is Senior Research Scientist at Siemens Corporate Research. Ronald J. Rivest is Professor of Computer Science and Associate Director of the Laboratory for Computer Science at the Massachusetts Institute of Technology.




ECAI 2000


Book Description




Boosting


Book Description

An accessible introduction and essential reference for an approach to machine learning that creates highly accurate prediction rules by combining many weak and inaccurate ones. Boosting is an approach to machine learning based on the idea of creating a highly accurate predictor by combining many weak and inaccurate “rules of thumb.” A remarkably rich theory has evolved around boosting, with connections to a range of topics, including statistics, game theory, convex optimization, and information geometry. Boosting algorithms have also enjoyed practical success in such fields as biology, vision, and speech processing. At various times in its history, boosting has been perceived as mysterious, controversial, even paradoxical. This book, written by the inventors of the method, brings together, organizes, simplifies, and substantially extends two decades of research on boosting, presenting both theory and applications in a way that is accessible to readers from diverse backgrounds while also providing an authoritative reference for advanced researchers. With its introductory treatment of all material and its inclusion of exercises in every chapter, the book is appropriate for course use as well. The book begins with a general introduction to machine learning algorithms and their analysis; then explores the core theory of boosting, especially its ability to generalize; examines some of the myriad other theoretical viewpoints that help to explain and understand boosting; provides practical extensions of boosting for more complex learning problems; and finally presents a number of advanced theoretical topics. Numerous applications and practical illustrations are offered throughout.
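To make the core idea concrete, here is a minimal AdaBoost-style sketch in Python. It is an illustrative assumption rather than material from the book: decision stumps play the role of the weak "rules of thumb," and each round reweights the training examples so that later rules concentrate on the mistakes of the earlier ones.

```python
# Minimal AdaBoost sketch (illustrative; not the book's own code).
# Weak learners are single-feature decision stumps; labels are +1/-1.
import numpy as np

def train_stump(X, y, w):
    """Pick the feature/threshold/polarity with lowest weighted error."""
    best = None
    for j in range(X.shape[1]):
        for thr in np.unique(X[:, j]):
            for pol in (1, -1):
                pred = np.where(pol * (X[:, j] - thr) >= 0, 1, -1)
                err = np.sum(w[pred != y])
                if best is None or err < best[0]:
                    best = (err, j, thr, pol)
    return best

def adaboost(X, y, rounds=20):
    n = len(y)
    w = np.full(n, 1.0 / n)                      # uniform example weights
    ensemble = []
    for _ in range(rounds):
        err, j, thr, pol = train_stump(X, y, w)
        err = max(err, 1e-12)                    # guard against division by zero
        alpha = 0.5 * np.log((1 - err) / err)    # vote of this weak rule
        pred = np.where(pol * (X[:, j] - thr) >= 0, 1, -1)
        w *= np.exp(-alpha * y * pred)           # emphasize the mistakes
        w /= w.sum()
        ensemble.append((alpha, j, thr, pol))
    return ensemble

def predict(ensemble, X):
    score = sum(a * np.where(p * (X[:, j] - t) >= 0, 1, -1)
                for a, j, t, p in ensemble)
    return np.sign(score)

# Toy usage: a linearly separable problem the combined stumps learn easily.
X = np.random.randn(200, 2)
y = np.where(X[:, 0] + X[:, 1] > 0, 1, -1)
model = adaboost(X, y, rounds=30)
print("training accuracy:", (predict(model, X) == y).mean())
```

Even this stripped-down version shows the combining step the book analyzes: individually weak stumps, each barely better than chance, are weighted and summed into a far more accurate predictor.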




Abduction and Induction


Book Description

From the very beginning of their investigation of human reasoning, philosophers have identified two other forms of reasoning, besides deduction, which we now call abduction and induction. Deduction is now fairly well understood, but abduction and induction have eluded a similar level of understanding. The papers collected here address the relationship between abduction and induction and their possible integration. The approach is sometimes philosophical, sometimes that of pure logic, and some papers adopt the more task-oriented approach of AI. The book will command the attention of philosophers, logicians, AI researchers and computer scientists in general.





Genetic Programming III


Book Description

Genetic programming (GP) is a method for getting a computer to solve a problem by telling it what needs to be done instead of how to do it. Koza, Bennett, Andre, and Keane present genetically evolved solutions to dozens of problems of design, control, classification, system identification, and computational molecular biology. Among the solutions are 14 results competitive with human-produced results, including 10 rediscoveries of previously patented inventions.
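As a rough illustration of the "what, not how" idea (a sketch under assumed simplifications, not the authors' system), the following toy genetic program evolves arithmetic expression trees against a fitness measure alone; it uses mutation and truncation selection only, omitting the subtree crossover that is central to full GP.

```python
# Toy genetic-programming sketch (illustrative assumption, heavily simplified).
import random, operator

OPS = {'+': operator.add, '-': operator.sub, '*': operator.mul}
TERMINALS = ['x', -1.0, 1.0, 2.0]

def random_tree(depth=3):
    if depth == 0 or random.random() < 0.3:
        return random.choice(TERMINALS)
    return (random.choice(list(OPS)), random_tree(depth - 1), random_tree(depth - 1))

def evaluate(tree, x):
    if tree == 'x':
        return x
    if isinstance(tree, float):
        return tree
    op, left, right = tree
    return OPS[op](evaluate(left, x), evaluate(right, x))

def fitness(tree, cases):
    # The "what": total error against the target input/output examples.
    return sum(abs(evaluate(tree, x) - y) for x, y in cases)

def mutate(tree):
    if not isinstance(tree, tuple) or random.random() < 0.2:
        return random_tree(2)                      # replace a subtree at random
    return (tree[0], mutate(tree[1]), mutate(tree[2]))

def evolve(cases, pop_size=200, generations=40):
    pop = [random_tree() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda t: fitness(t, cases))
        survivors = pop[:pop_size // 4]            # truncation selection
        pop = survivors + [mutate(random.choice(survivors))
                           for _ in range(pop_size - len(survivors))]
    return min(pop, key=lambda t: fitness(t, cases))

# Target behavior y = x*x + x is given only as examples, never as a formula.
cases = [(x, x * x + x) for x in range(-5, 6)]
best = evolve(cases)
print(best, fitness(best, cases))
```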




Computational Learning Theory


Book Description

Includes bibliographical references and index.




Foundations of Intelligent Systems


Book Description

This book constitutes the refereed proceedings of the 11th International Symposium on Methodologies for Intelligent Systems, ISMIS '99, held in Warsaw, Poland, in June 1999. The 66 revised full papers presented together with five invited papers were carefully reviewed and selected from a total of 115 submissions. The volume is divided into topical sections on logics for AI, intelligent information retrieval, intelligent information systems, learning and knowledge discovery, computer vision, knowledge representation, and evolutionary computation.




Handbook of Neural Computation


Book Description

In recent years, neural computation has developed from a specialized research discipline into a broadly based and dynamic activity with applications in an astonishing variety of fields. Many scientists, engineers and other practitioners are now using neural networks to tackle problems that are either intractable or unrealistically time consuming to solve through traditional computational strategies. The inaugural volume in the Computational Intelligence Library provides speedy dissemination of new ideas to a broad spectrum of neural network users, designers and implementers. Devoted to network fundamentals, models, algorithms and applications, the work is intended to become the standard reference resource for the neural network community. As the field expands and develops, leading researchers will report on and analyze promising new approaches. In this way, the Handbook will become an evolving compendium on the state of the art of neural computation. Available in loose-leaf print form as well as in an electronic edition that combines both CD-ROM and on-line (World Wide Web) access to its contents, the Handbook of Neural Computation is available on a subscription basis, with regularly published supplements keeping readers abreast of late-breaking developments and new advances in this rapidly developing field.