Knowledge Seeker - Ontology Modelling for Information Search and Management


Book Description

The Knowledge Seeker is a useful system for developing various intelligent applications such as ontology-based search engines, ontology-based text classification systems, ontological agent systems, and semantic web systems. The Knowledge Seeker contains four different ontological components. First, it defines the knowledge representation model, the Ontology Graph. Second, an ontology learning process based on chi-square statistics is proposed for automatically learning an Ontology Graph from texts in different domains. Third, it defines an ontology generation method that transforms the learning outcome into the Ontology Graph format, which can be processed by machines and also visualized for human validation. Fourth, it defines different ontological operations (such as similarity measurement and text classification) that can be carried out with the generated Ontology Graphs. The final goal of the KnowledgeSeeker system framework is to improve traditional information systems with higher efficiency. In particular, it can increase the accuracy of a text classification system and enhance the search intelligence of a search engine. This is achieved by equipping the system with machine-processable ontology.
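The chi-square statistic mentioned above is commonly used to score how strongly a term is associated with a domain from document counts. As a minimal, language-neutral sketch (illustrative only, not KnowledgeSeeker's actual code; function and variable names are hypothetical):

```python
def chi_square(a, b, c, d):
    """Chi-square score for a term-domain 2x2 contingency table.
    a: in-domain docs containing the term, b: in-domain docs without it,
    c: out-of-domain docs containing it,  d: out-of-domain docs without it."""
    n = a + b + c + d
    denom = (a + b) * (c + d) * (a + c) * (b + d)
    if denom == 0:
        return 0.0
    return n * (a * d - b * c) ** 2 / denom

# A term appearing in 40 of 50 in-domain docs but only 5 of 50 others
# scores high, i.e. it is a strong candidate for the domain's ontology:
score = chi_square(40, 10, 5, 45)
```

Terms ranked by such a score can then supply the candidate concept nodes of a domain's Ontology Graph.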




Advances in Robotics and Virtual Reality


Book Description

Reaching beyond human knowledge and capability, robotics is strongly involved in tackling the challenges of newly emerging multidisciplinary fields. Together with humans, robots are busy exploring and working on a new generation of ideas and problems whose solutions would otherwise be impossible to find. The future is near when robots will sense, smell and touch people and their lives. Behind this practical aspect of human-robotics lies half a century of robotics research, which has transformed robotics into a modern science. Advances in Robotics and Virtual Reality is a compilation of emerging application areas of robotics. The book covers the role of robotics in medicine and space exploration, and also explains the role of virtual reality as a non-destructive test bed, which constitutes a premise for further advances towards new challenges in robotics. This book, edited by two renowned scientists with the support of an outstanding team of fifteen authors, is a well-suited reference for robotics researchers and for scholars from related disciplines such as computer graphics, virtual simulation, surgery, biomechanics and neuroscience.




From Curve Fitting to Machine Learning


Book Description

The analysis of experimental data has been at the heart of science from its beginnings. But it was the advent of digital computers that allowed the execution of highly non-linear and increasingly complex data analysis procedures - methods that were completely unfeasible before. Non-linear curve fitting, clustering and machine learning belong to these modern techniques, which are a further step towards computational intelligence. The goal of this book is to provide an interactive and illustrative guide to these topics. It concentrates on the road from two-dimensional curve fitting to multidimensional clustering and machine learning with neural networks or support vector machines. Along the way, topics like mathematical optimization and evolutionary algorithms are touched upon. All concepts and ideas are outlined in a clear-cut manner with graphically depicted plausibility arguments and a little elementary mathematics. The major topics are extensively outlined with exploratory examples and applications. The primary goal is to be as illustrative as possible without hiding problems and pitfalls, but to address them. The character of an illustrative cookbook is complemented by specific sections that address more fundamental questions, like the relation between machine learning and human intelligence. These sections may be skipped without affecting the main road, but they will open up possibly interesting insights beyond the mere data massage. All topics are demonstrated with the aid of the commercial computing platform Mathematica and the Computational Intelligence Packages (CIP), a high-level function library developed in Mathematica's programming language on top of Mathematica's algorithms. CIP is open source, so the detailed code of every method is freely accessible. All examples and applications shown throughout the book may be used and customized by the reader without any restrictions.
The target readerships are students of (computer) science and engineering as well as scientific practitioners in industry and academia who desire an illustrative introduction to these topics. Readers with programming skills may easily port and customize the provided code.
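The book itself works in Mathematica with CIP; as a language-neutral taste of the starting point of its road, non-linear curve fitting, here is a sketch that fits the model y = a * exp(b * x) by linearizing it (ln y = ln a + b x) and solving the resulting least-squares problem in closed form (hypothetical example, not taken from the book):

```python
import math

def fit_exponential(xs, ys):
    """Return (a, b) for the model y = a * exp(b * x); ys must be positive."""
    n = len(xs)
    ls = [math.log(y) for y in ys]          # linearize: ln y = ln a + b x
    sx, sl = sum(xs), sum(ls)
    sxx = sum(x * x for x in xs)
    sxl = sum(x * l for x, l in zip(xs, ls))
    b = (n * sxl - sx * sl) / (n * sxx - sx * sx)   # least-squares slope
    ln_a = (sl - b * sx) / n                        # least-squares intercept
    return math.exp(ln_a), b

# Noise-free data generated from y = 2 * exp(0.5 * x) is recovered exactly:
xs = [0.0, 1.0, 2.0, 3.0]
ys = [2.0 * math.exp(0.5 * x) for x in xs]
a, b = fit_exponential(xs, ys)
```

With noisy data the linearization distorts the error weighting, which is exactly the kind of pitfall the book addresses rather than hides.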




Data Mining


Book Description

The knowledge discovery process is as old as Homo sapiens. Until some time ago, this process was solely based on the 'natural personal computer' provided by Mother Nature. Fortunately, in recent decades the problem has begun to be solved with the development of data mining technology, aided by the huge computational power of 'artificial' computers. Digging intelligently in different large databases, data mining aims to extract implicit, previously unknown and potentially useful information from data - since "knowledge is power". The goal of this book is to provide, in a friendly way, both the theoretical concepts and, especially, the practical techniques of this exciting field, ready to be applied in real-world situations. Accordingly, it is meant for all those who wish to learn how to explore and analyze large quantities of data in order to discover hidden nuggets of information.




Intelligent Systems: Approximation by Artificial Neural Networks


Book Description

This brief monograph is the first to deal exclusively with the quantitative approximation of the identity-unit operator by artificial neural networks. Here we study, with rates, the approximation properties of the "right" sigmoidal and hyperbolic tangent artificial neural network positive linear operators. In particular, we study the degree of approximation of these operators to the unit operator in the univariate and multivariate cases, over bounded or unbounded domains. This is given via inequalities and with the use of the modulus of continuity of the involved function or its higher-order derivative. We examine the real and complex cases. For the convenience of the reader, the chapters of this book are written in a self-contained style. This treatise relies on the author's related research work of the last two years. Advanced courses and seminars can be taught out of this brief book. All necessary background and motivation are given per chapter, and a related list of references is also given per chapter. The exposed results are expected to find applications in many areas of computer science and applied mathematics, such as neural networks, intelligent systems, complexity theory, learning theory, vision and approximation theory. As such, this monograph is suitable for researchers, graduate students and seminars on the above subjects, as well as for all science libraries.




Approximate Reasoning by Parts


Book Description

The monograph offers a view on Rough Mereology, a tool for reasoning under uncertainty, which goes back to Mereology, formulated in terms of parts by Lesniewski, and borrows from Fuzzy Set Theory and Rough Set Theory the idea of containment to a degree. The result is a theory based on the notion of a part to a degree. One can invoke here the proportion: Rough Mereology : Mereology = Fuzzy Set Theory : Set Theory. As with Mereology, Rough Mereology finds important applications in problems of Spatial Reasoning, illustrated in this monograph with examples from Behavioral Robotics. Due to its involvement with concepts, Rough Mereology offers new approaches to Granular Computing, Classifier and Decision Synthesis, Logics for Information Systems, and a re-formulation of well-known ideas of Neural Networks and Many-Agent Systems. All these approaches are discussed in this monograph. To make the exposition self-contained, the underlying notions of Set Theory, Topology, and Deductive and Reductive Reasoning, with emphasis on Rough and Fuzzy Set Theories, along with a thorough exposition of Mereology in both the Lesniewski and Whitehead-Leonard-Goodman-Clarke versions, are discussed at length. It is hoped that the monograph offers researchers in various areas of Artificial Intelligence a new tool for the analysis of relations among concepts.




Intelligent Systems


Book Description

Computational intelligence is a well-established paradigm, where new theories with a sound biological understanding have been evolving. The current experimental systems have many of the characteristics of biological computers (brains in other words) and are beginning to be built to perform a variety of tasks that are difficult or impossible to do with conventional computers. As evident, the ultimate achievement in this field would be to mimic or exceed human cognitive capabilities including reasoning, recognition, creativity, emotions, understanding, learning and so on. This book comprising of 17 chapters offers a step-by-step introduction (in a chronological order) to the various modern computational intelligence tools used in practical problem solving. Staring with different search techniques including informed and uninformed search, heuristic search, minmax, alpha-beta pruning methods, evolutionary algorithms and swarm intelligent techniques; the authors illustrate the design of knowledge-based systems and advanced expert systems, which incorporate uncertainty and fuzziness. Machine learning algorithms including decision trees and artificial neural networks are presented and finally the fundamentals of hybrid intelligent systems are also depicted. Academics, scientists as well as engineers engaged in research, development and application of computational intelligence techniques, machine learning and data mining would find the comprehensive coverage of this book invaluable.




Towards Intelligent Modeling: Statistical Approximation Theory


Book Description

The main idea of statistical convergence is to demand convergence only for a majority of the elements of a sequence. This method of convergence has been investigated in many fundamental areas of mathematics, such as measure theory, approximation theory, fuzzy logic theory and summability theory. In this monograph we consider this concept in approximating a function by linear operators, especially when the classical limit fails. The results of this book not only cover classical and statistical approximation theory, but are also applied in fuzzy logic via fuzzy-valued operators. The authors in particular treat the important Korovkin approximation theory of positive linear operators in the statistical and fuzzy senses. They also present various statistical approximation theorems for some specific real- and complex-valued linear operators that are not positive. This is the first monograph on statistical approximation theory and fuzziness. The chapters are self-contained, and several advanced courses can be taught from them. The research findings will be useful in various applications, including applied and computational mathematics, stochastics, engineering, artificial intelligence, vision and machine learning. This monograph is directed at graduate students, researchers, practitioners and professors of all disciplines.
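For the reader's orientation, the standard definition behind "convergence for a majority of elements" (not quoted from the book) is that the indices where the sequence strays from the limit must have natural density zero:

```latex
% A sequence (x_k) converges statistically to L if, for every \varepsilon > 0,
\lim_{n \to \infty} \frac{1}{n}
  \bigl|\{\, k \le n : |x_k - L| \ge \varepsilon \,\}\bigr| = 0 .
```

Thus a sequence may fail to converge classically (e.g. if it misbehaves on a sparse set of indices) yet still converge statistically, which is exactly the situation "when the classical limit fails".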




Artificial Intelligence in Daily Life


Book Description

Given the exponential growth of Artificial Intelligence (AI) over the past few decades, AI and its related applications have become part of daily life in ways that we could never have dreamt of only a century ago. Our routines have been changed beyond measure by robotics and AI, which are now used in a vast array of services. Though AI is still in its infancy, we have already benefited immensely. This book introduces readers to basic Artificial Intelligence concepts, and helps them understand the relationship between AI and daily life. In the interest of clarity, the content is divided into four major parts. Part I (AI Concepts) presents fundamental concepts of and information on AI; while Part II (AI Technology) introduces readers to the five core AI technologies that provide the building blocks for various AI applications, namely: Machine Learning (ML), Data Mining (DM), Computer Vision (CV), Natural Language Processing (NLP), and Ontology-based Search Engine (OSE). In turn, Part III (AI Applications) reviews major contemporary applications that are impacting our ways of life, working styles and environment, ranging from intelligent agents and robotics to smart campus and smart city projects. Lastly, Part IV (Beyond AI) addresses related topics that are vital to the future development of AI. It also discusses a number of critical issues, such as AI ethics and privacy, the development of a conscious mind, and autonomous robotics in our daily lives.




Recommender Systems for the Social Web


Book Description

The recommendation of products, content and services cannot be considered newly born, although its widespread application is still in full swing. Alongside its growing success in numerous sectors, the progress of the Social Web has revolutionized the architecture of participation and relationships on the Web, making it necessary to restate recommendation and reconcile it with Collaborative Tagging, as the popularization of authoring on the Web, and Social Networking, as the translation of personal relationships to the Web. Precisely this convergence of recommendation with the above Social Web pillars is what motivates this book, which has collected contributions from well-known experts in academia and industry to provide a broader view of the problems that social recommenders might face. If recommender systems have proven their key role in facilitating user access to resources on the Web, then, now that sharing resources has become social, it is natural for recommendation strategies in the Social Web era to take into account the users' points of view and the relationships among users when calculating predictions. This book aims to help readers discover and understand the interplay among legal issues, such as privacy; technical aspects, such as interoperability and scalability; and social aspects, such as the influence of affinity, trust, reputation and likeness, when the goal is to offer recommendations that are truly useful to both the user and the provider.
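The baseline that the Social Web extends is collaborative filtering: predicting a user's rating for an item as a similarity-weighted average of other users' ratings. A minimal sketch (hypothetical data and names, not from the book; in a social recommender, trust or social ties could replace or weight the similarity term):

```python
import math

def cosine(u, v):
    """Cosine similarity between two sparse rating dicts over co-rated items."""
    common = set(u) & set(v)
    if not common:
        return 0.0
    dot = sum(u[i] * v[i] for i in common)
    nu = math.sqrt(sum(u[i] ** 2 for i in common))
    nv = math.sqrt(sum(v[i] ** 2 for i in common))
    return dot / (nu * nv)

def predict(ratings, user, item):
    """ratings: {user: {item: score}}; predict `user`'s score for `item`."""
    num = den = 0.0
    for other, scores in ratings.items():
        if other == user or item not in scores:
            continue
        w = cosine(ratings[user], scores)
        num += w * scores[item]
        den += abs(w)
    return num / den if den else None

ratings = {
    "ann": {"a": 5, "b": 3},
    "bob": {"a": 5, "b": 3, "c": 4},   # tastes like ann's
    "eve": {"a": 1, "b": 5, "c": 2},   # tastes unlike ann's
}
guess = predict(ratings, "ann", "c")   # pulled towards bob's rating of 4
```

Replacing `cosine` with a trust score derived from the social graph is one way such strategies "take into account the relationships among users".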