AISB91


Book Description

AISB91 is the eighth conference organized by the Society for the Study of Artificial Intelligence and Simulation of Behaviour. It is not only the oldest regular conference on AI in Europe - which spawned the ECAI conferences in 1982 - but also the conference with a tradition of focusing on research as opposed to applications. The 1991 edition of the conference was no different in this respect. On the contrary, research, and particularly newly emerging research directions such as knowledge-level expert systems research, neural networks and emergent functionality in autonomous agents, was strongly emphasised. The conference was organized around the following sessions: distributed intelligent agents, situatedness and emergence in autonomous agents, new modes of reasoning, the knowledge level perspective, and theorem proving and machine learning. Each of these sessions is discussed below in more detail.

DISTRIBUTED INTELLIGENT AGENTS

Research in distributed AI is concerned with the problem of how multiple agents and societies of agents can be organized to co-operate and collectively solve a problem. The first paper, by Chakravarty (MIT), focuses on the problem of evolving agents in the context of Minsky's society of mind theory. It addresses the question of how new agents can be formed by transforming existing ones and illustrates the theory with an example from game playing. Smieja (GMD, Germany) focuses on the problem of organizing networks of agents which consist internally of neural networks.




TRAC: Trends in Analytical Chemistry


Book Description

TRAC: Trends in Analytical Chemistry, Volume 9 provides information pertinent to trends in the field of analytical chemistry. This book discusses a variety of topics related to analytical chemistry, including flow chemography, condensation polymers, sedimentary organic matter, nucleosides, and fuzzy expert systems. Organized into 43 parts encompassing 87 chapters, this volume begins with an overview of particle-induced X-ray emission and its analytical applications. This text then discusses direct memory access data acquisition, which is an efficient method of collecting data from analytical instrumentation. Other chapters consider the application of flow injection analysis in an industrial research laboratory. This book also discusses the utilization of the time-of-flight mass spectrometry method. The final chapter deals with brassinosteroids, a group of steroidal plant growth substances that possess a B-ring lactone and two vicinal diols. This book is a valuable resource for analytical chemists, biochemists, molecular biologists, physicists, engineers, scientists, and research workers.




Intelligent Tutoring Systems


Book Description

This volume of the Encyclopaedia offers a systematic introduction and a comprehensive survey of the theory of complex spaces. It covers topics like semi-normal complex spaces, cohomology, the Levi problem, q-convexity and q-concavity. It is the first survey of this kind. The authors are internationally known outstanding experts who developed substantial parts of the field. The book contains seven chapters and an introduction written by Remmert, describing the history of the subject. The book will be very useful to graduate students and researchers in complex analysis, algebraic geometry and differential geometry. Another group of readers will consist of mathematical physicists who apply results from these fields.




Database and Expert Systems Applications


Book Description

The Database and Expert Systems Applications (DEXA) conferences have established themselves as a platform for bringing together researchers and practitioners from various backgrounds and all regions of the world to exchange ideas, experiences and opinions in a friendly and stimulating environment. The papers presented at the conference represent recent developments in the field and important steps towards shaping the future of applied computer science and information systems. DEXA covers a broad field: all aspects of databases, knowledge based systems, knowledge management, web-based systems, information systems, related technologies and their applications. Once again there were a good number of submissions: out of 183 papers that were submitted, the program committee selected 92 to be presented. In the first year of this new millennium DEXA has come back to the United Kingdom, following events in Vienna, Berlin, Valencia, Prague, Athens, London, Zurich, Toulouse, Vienna and Florence. The past decade has seen several revolutionary developments, one of which was the explosion of Internet-related applications in the areas covered by DEXA, developments in which DEXA has played a role and in which DEXA will continue to play a role in its second decade, starting with this conference.










Correct System Design


Book Description

Computers are gaining more and more control over systems that we use or rely on in our daily lives, privately as well as professionally. In safety-critical applications, as well as in others, it is of paramount importance that systems controlled by a computer, or computing systems themselves, reliably behave in accordance with the specification and requirements; in other words, correctness of the system, of its software and hardware, is crucial. In order to cope with this challenge, software engineers and computer scientists need to understand the foundations of programming, how different formal theories are linked together, how compilers correctly translate high-level programs into machine code, and why the transformations performed are justifiable. This book presents 17 mutually reviewed invited papers organized in sections on methodology, programming, automation, compilation, and application.




Intelligent Decision Support


Book Description

Intelligent decision support is based on human knowledge related to a specific part of a real or abstract world. When knowledge is gained by experience, it is induced from empirical data. The data structure, called an information system, is a record of objects described by a set of attributes. Knowledge is understood here as an ability to classify objects. Objects belonging to the same class are indiscernible by means of the attributes and form elementary building blocks (granules, atoms). In particular, the granularity of knowledge means that some notions cannot be expressed precisely within the available knowledge and can be defined only vaguely. In rough set theory, created by Z. Pawlak, each imprecise concept is replaced by a pair of precise concepts called its lower and upper approximation. These approximations are fundamental tools for reasoning about knowledge. The rough set philosophy has turned out to be a very effective new tool with many successful real-life applications to its credit. It is worth stressing that no auxiliary assumptions about the data, such as probability or membership function values, are needed, which is its great advantage. The present book reveals a wide spectrum of applications of the rough set concept, giving the reader the flavor of, and insight into, the methodology of the newly developed discipline. Although the book emphasizes applications, comparison with other related methods and further developments receive due attention.
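The core construction described above - indiscernibility granules and the lower/upper approximation of an imprecise concept - can be sketched in a few lines of Python. This is a minimal illustration under our own assumptions; the data, function names, and the toy "flu" concept are invented for the example and do not come from the book.

```python
from collections import defaultdict

def indiscernibility_classes(objects, attributes):
    """Group objects that agree on every listed attribute (the granules)."""
    classes = defaultdict(set)
    for name, values in objects.items():
        key = tuple(values[a] for a in attributes)
        classes[key].add(name)
    return list(classes.values())

def lower_approximation(granules, target):
    """Union of granules wholly contained in the target concept."""
    result = set()
    for g in granules:
        if g <= target:
            result |= g
    return result

def upper_approximation(granules, target):
    """Union of granules that overlap the target concept at all."""
    result = set()
    for g in granules:
        if g & target:
            result |= g
    return result

# Toy information system: four objects described by two attributes.
objects = {
    "o1": {"temp": "high", "cough": "yes"},
    "o2": {"temp": "high", "cough": "yes"},   # indiscernible from o1
    "o3": {"temp": "normal", "cough": "no"},
    "o4": {"temp": "high", "cough": "no"},
}
flu = {"o1", "o4"}  # the (imprecise) concept to approximate

granules = indiscernibility_classes(objects, ["temp", "cough"])
lower = lower_approximation(granules, flu)   # {'o4'}
upper = upper_approximation(granules, flu)   # {'o1', 'o2', 'o4'}
```

Because o1 and o2 share the same attribute values, the available knowledge cannot separate them, so o1 appears only in the upper approximation: the concept is vague exactly where a granule straddles its boundary.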