Non-Additive Measure and Integral


Book Description

Non-Additive Measure and Integral is the first systematic approach to the subject. Much of the additive theory (convergence theorems, Lebesgue spaces, representation theorems) is generalized, at least for submodular measures, which are characterized by having a subadditive integral. The theory is of interest for applications to economic decision theory (decisions under risk and uncertainty), to statistics (including belief functions and fuzzy measures), to cooperative game theory, artificial intelligence, insurance, etc. Non-Additive Measure and Integral collects the results of scattered and often isolated approaches to non-additive measures and their integrals which originate in pure mathematics, potential theory, statistics, game theory, economic decision theory and other fields of application. It unifies, simplifies and generalizes known results and supplements the theory with new results, thus providing a sound basis for applications and further research in this growing field. It also contains fundamental results of sigma-additive and finitely additive measure and integration theory and sheds new light on the additive theory. Non-Additive Measure and Integral employs distribution functions and quantile functions as basic tools, thus remaining close to the familiar language of probability theory. In addition to serving as an important reference, the book can be used as a mathematics textbook for graduate courses or seminars; it contains many exercises to support and supplement the text.
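
For orientation (a standard definition, not a quotation from the book): for a monotone set function \nu with \nu(\emptyset) = 0 and a nonnegative function f, the non-additive integral of Choquet type is built from the decreasing distribution function t \mapsto \nu(\{f \ge t\}), and submodularity of \nu is exactly the condition that makes this integral subadditive:

    \int f \, d\nu \;=\; \int_0^{\infty} \nu(\{\omega : f(\omega) \ge t\}) \, dt,
    \qquad
    \nu(A \cup B) + \nu(A \cap B) \le \nu(A) + \nu(B)
    \;\Longrightarrow\;
    \int (f + g) \, d\nu \le \int f \, d\nu + \int g \, d\nu.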




Non-Additive Measures


Book Description

This book provides a comprehensive and timely report in the area of non-additive measures and integrals. It is based on a panel session on fuzzy measures, fuzzy integrals and aggregation operators held during the 9th International Conference on Modeling Decisions for Artificial Intelligence (MDAI 2012) in Girona, Spain, November 21-23, 2012. The book complements the MDAI 2012 proceedings book, published in Lecture Notes in Computer Science (LNCS) in 2012. The individual chapters, written by key researchers in the field, cover fundamental concepts and important definitions (e.g. the Sugeno integral, the definition of entropy for non-additive measures) as well as some important applications (e.g. to economics and game theory) of non-additive measures and integrals. The book addresses students, researchers and practitioners working at the forefront of their field.
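
For convenience, here is the standard definition of one of the fundamental concepts mentioned above (not excerpted from the book): for a fuzzy measure \mu on a set X and a function f : X \to [0,1], the Sugeno integral is

    \mathrm{Su}_{\mu}(f) \;=\; \sup_{\alpha \in [0,1]} \min\bigl(\alpha,\; \mu(\{x : f(x) \ge \alpha\})\bigr),

which replaces the sum and product of the ordinary integral with maximum and minimum.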




An Introduction to Measure Theory


Book Description

This is a graduate text introducing the fundamentals of measure theory and integration theory, which is the foundation of modern real analysis. The text focuses first on the concrete setting of Lebesgue measure and the Lebesgue integral (which in turn is motivated by the more classical concepts of Jordan measure and the Riemann integral), before moving on to abstract measure and integration theory, including the standard convergence theorems, Fubini's theorem, and the Carathéodory extension theorem. Classical differentiation theorems, such as the Lebesgue and Rademacher differentiation theorems, are also covered, as are connections with probability theory. The material is intended to cover a quarter or semester's worth of material for a first graduate course in real analysis. There is an emphasis in the text on tying together the abstract and the concrete sides of the subject, using the latter to illustrate and motivate the former. The central role of key principles (such as Littlewood's three principles) as providing guiding intuition to the subject is also emphasized. There are a large number of exercises throughout that develop key aspects of the theory, and are thus an integral component of the text. As a supplementary section, a discussion of general problem-solving strategies in analysis is also given. The last three sections discuss optional topics related to the main matter of the book.
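
For readers scanning the topics above, the Carathéodory extension theorem rests on the following measurability criterion (standard statement, included here for reference): given an outer measure \mu^{*} on a set X, a set E \subseteq X is Carathéodory measurable if

    \mu^{*}(A) \;=\; \mu^{*}(A \cap E) + \mu^{*}(A \setminus E) \quad \text{for every } A \subseteq X,

and the measurable sets form a \sigma-algebra on which \mu^{*} restricts to a countably additive measure.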




Measure, Integral and Probability


Book Description

This very well written and accessible book emphasizes the reasons for studying measure theory, which is the foundation of much of probability. Its focus on measure opens up many illustrative examples and applications, including a thorough discussion of standard probability distributions and densities. The book also includes many problems and their fully worked solutions.
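
Since the description stresses the link between measure and probability, a one-line reminder of that link (a standard fact, not quoted from the book): a probability distribution with density f assigns to an event A the Lebesgue integral

    P(X \in A) \;=\; \int_{A} f(x) \, dx,

so statements about random variables become statements about measures and integrals.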




Optimization Based Data Mining: Theory and Applications


Book Description

Optimization techniques have been widely adopted to implement various data mining algorithms. In addition to the well-known Support Vector Machines (SVMs), which are based on quadratic programming, different versions of Multiple Criteria Programming (MCP) have been extensively used for data separation. Since optimization based data mining methods differ from statistics, decision tree induction, and neural networks, their theoretical underpinnings have attracted many researchers interested in algorithm development for data mining. Optimization Based Data Mining: Theory and Applications focuses mainly on MCP and SVM, especially their recent theoretical progress and real-life applications in various fields. These include finance, web services, bio-informatics and petroleum engineering, and they have triggered the interest of practitioners who look for new methods to improve the results of data mining for knowledge discovery. Most of the material in this book comes directly from the research and application activities that the authors’ research group has conducted over the last ten years. Aimed at practitioners and graduate students with a fundamental knowledge of data mining, it demonstrates the basic concepts and foundations of how to use optimization techniques to deal with data mining problems.
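
To make the quadratic-programming connection mentioned above concrete, here is the standard soft-margin SVM formulation (not a formulation specific to this book): given training pairs (x_i, y_i) with y_i \in \{-1, +1\}, one solves the convex quadratic program

    \min_{w,\, b,\, \xi} \;\; \tfrac{1}{2} \lVert w \rVert^{2} + C \sum_{i=1}^{m} \xi_i
    \quad \text{s.t.} \quad y_i (w^{\top} x_i + b) \ge 1 - \xi_i, \;\; \xi_i \ge 0, \;\; i = 1, \dots, m,

whereas the MCP models discussed in the book approach data separation through several such criteria optimized jointly.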




Theory of Random Sets


Book Description

This is the first systematic exposition of random sets theory since Matheron (1975), with full proofs, exhaustive bibliographies and literature notes. Interdisciplinary connections and applications of random sets are emphasized throughout the book. An extensive bibliography is available on the Web at http://liinwww.ira.uka.de/bibliography/math/random.closed.sets.html, accompanied by a search engine.
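
As a brief orientation to the subject matter (standard definitions, not excerpted from the book): a random closed set X is described by its capacity functional

    T_X(K) \;=\; P(X \cap K \ne \emptyset), \qquad K \text{ compact},

which is non-additive in general; the Choquet-Kendall-Matheron theorem asserts that T_X determines the distribution of X.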




Information Processing and Management of Uncertainty in Knowledge-Based Systems


Book Description

The International Conference on Information Processing and Management of Uncertainty in Knowledge-Based Systems, IPMU, is organized every two years with the aim of bringing together scientists working on methods for the management of uncertainty and aggregation of information in intelligent systems. Since 1986, this conference has been providing a forum for the exchange of ideas between theoreticians and practitioners working in these areas and related fields. The 13th IPMU conference took place in Dortmund, Germany, June 28–July 2, 2010. This volume contains 79 papers selected through a rigorous reviewing process. The contributions reflect the richness of research on topics within the scope of the conference and represent several important developments, specifically focused on theoretical foundations and methods for information processing and management of uncertainty in knowledge-based systems. We were delighted that Melanie Mitchell (Portland State University, USA), Nikhil R. Pal (Indian Statistical Institute), Bernhard Schölkopf (Max Planck Institute for Biological Cybernetics, Tübingen, Germany) and Wolfgang Wahlster (German Research Center for Artificial Intelligence, Saarbrücken) accepted our invitations to present keynote lectures. Jim Bezdek received the Kampé de Fériet Award, granted every two years on the occasion of the IPMU conference, in view of his eminent research contributions to the handling of uncertainty in clustering, data analysis and pattern recognition.




Integrated Uncertainty Management and Applications


Book Description

Solving practical problems often requires the integration of information and knowledge from many different sources, taking into account uncertainty and imprecision. The 2010 International Symposium on Integrated Uncertainty Management and Applications (IUM’2010), held at the Japan Advanced Institute of Science and Technology (JAIST), Ishikawa, Japan, on April 9-11, 2010, is therefore conceived as a forum for the discussion and exchange of research results, ideas and application experience among researchers and practitioners involved with all aspects of uncertainty modelling and management.




Ordinary and Fractional Approximation by Non-additive Integrals: Choquet, Shilkret and Sugeno Integral Approximators


Book Description

Ordinary and fractional approximations by non-additive integrals, especially by integral approximators of Choquet, Shilkret and Sugeno types, are a new trend in approximation theory. These integrals are only subadditive and only the first two are positive linear, and they produce very fast and flexible approximations based on limited data. The author presents both the univariate and multivariate cases. The involved set functions are much weaker forms of the Lebesgue measure, and they were conceived to fulfill the needs of economic theory and other applied sciences. The approaches presented here are original, and all chapters are self-contained and can be read independently. Moreover, the book’s findings are sure to find application in many areas of pure and applied mathematics, especially in approximation theory, numerical analysis and mathematical economics (both ordinary and fractional). Accordingly, it offers a unique resource for researchers, graduate students, and for coursework in the above-mentioned fields, and belongs in all science and engineering libraries.
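
For reference, the Shilkret integral named in the title has the following standard definition (stated here for convenience, not quoted from the book): for a monotone set function \mu and a nonnegative function f,

    \int^{(\mathrm{Sh})} f \, d\mu \;=\; \sup_{t \ge 0} \; t \cdot \mu(\{f \ge t\});

the Choquet and Sugeno integrals used alongside it are defined earlier on this page.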




Nonlinear Integrals And Their Applications In Data Mining


Book Description

Regarding the set of all feature attributes in a given database as the universal set, this monograph discusses various nonadditive set functions that describe the interaction among the contributions from feature attributes towards a considered target attribute. Then, the relevant nonlinear integrals are investigated. These integrals can be applied as aggregation tools in information fusion and data mining, such as synthetic evaluation, nonlinear multiregression, and nonlinear classification. Some methods of fuzzification are also introduced for nonlinear integrals such that fuzzy data can be treated and fuzzy information is retrievable. The book is suitable as a text for graduate courses in mathematics, computer science, and information science. It is also useful to researchers in the relevant areas.
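
As a small, self-contained illustration of the kind of aggregation these nonlinear integrals provide, here is a sketch in Python of the discrete Choquet integral (illustrative code, not taken from the book; the toy set function below is made up):

    def choquet_integral(values, mu):
        """Discrete Choquet integral of non-negative feature values with
        respect to a set function mu, given as a dict that maps frozensets
        of attribute indices to numbers (with mu[frozenset()] == 0)."""
        n = len(values)
        # attribute indices sorted by descending feature value
        order = sorted(range(n), key=lambda i: values[i], reverse=True)
        total = 0.0
        for k, idx in enumerate(order, start=1):
            top_k = frozenset(order[:k])                   # the k attributes with the largest values
            next_val = values[order[k]] if k < n else 0.0  # next smaller value (0 after the last one)
            total += (values[idx] - next_val) * mu[top_k]
        return total

    # Toy example with two interacting attributes: the pair is worth more than
    # the sum of its parts (a super-additive set function), an interaction that
    # an ordinary weighted sum cannot encode.
    mu = {frozenset(): 0.0,
          frozenset({0}): 0.3,
          frozenset({1}): 0.4,
          frozenset({0, 1}): 1.0}
    print(choquet_integral([0.8, 0.5], mu))  # (0.8 - 0.5) * 0.3 + 0.5 * 1.0 = 0.59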