Soft Methods in Probability, Statistics and Data Analysis


Book Description

Classical probability theory and mathematical statistics sometimes appear too rigid for real-life problems, especially when dealing with vague data or imprecise requirements. These problems have motivated many researchers to "soften" the classical theory. Some "softening" approaches utilize concepts and techniques developed in theories such as fuzzy set theory, rough sets, possibility theory, the theory of belief functions, and imprecise probabilities. Since interesting mathematical models and methods have been proposed within the frameworks of various theories, this text brings together experts representing the different approaches used in soft probability, statistics and data analysis.




Fuzzy Probability and Statistics


Book Description

This book combines material from our previous books FP (Fuzzy Probabilities: New Approach and Applications, Physica-Verlag, 2003) and FS (Fuzzy Statistics, Springer, 2004), plus about one third new results. From FP we have material on basic fuzzy probability, discrete (fuzzy Poisson, binomial) and continuous (uniform, normal, exponential) fuzzy random variables. From FS we include chapters on fuzzy estimation and fuzzy hypothesis testing related to means, variances, proportions, correlation and regression. New material includes fuzzy estimators for arrival and service rates, and for the uniform distribution, with applications in fuzzy queuing theory. Also new to this book are three chapters on fuzzy maximum entropy (imprecise side conditions) estimators producing fuzzy distributions and crisp discrete/continuous distributions. Other new results are: (1) two chapters on fuzzy ANOVA (one-way and two-way); (2) random fuzzy numbers with applications to fuzzy Monte Carlo studies; and (3) a fuzzy nonparametric estimator for the median.




Soft Methods in Probability, Statistics and Data Analysis


Book Description

This volume collects the papers presented at the First International Workshop on Soft Methods in Probability and Statistics, SMPS'2002, held in Warsaw in September 2002.




Encyclopedia of Data Warehousing and Mining


Book Description

Data Warehousing and Mining (DWM) is the science of managing and analyzing large datasets and discovering novel patterns; in recent years it has emerged as a particularly exciting and industrially relevant area of research. Prodigious amounts of data are now being generated in domains as diverse as market research, functional genomics and pharmaceuticals; intelligently analyzing these data, with the aim of answering crucial questions and helping make informed decisions, is the challenge that lies ahead. The Encyclopedia of Data Warehousing and Mining provides a comprehensive, critical and descriptive examination of concepts, issues, trends, and challenges in this rapidly expanding field. The encyclopedia brings together more than 350 contributors from 32 countries, with 1,800 terms and definitions and more than 4,400 references. This authoritative publication offers in-depth coverage of the evolution, theories, methodologies, functionalities, and applications of DWM in such interdisciplinary industries as healthcare informatics, artificial intelligence, financial modeling, and applied statistics, making it a single source for established knowledge and the latest discoveries in the field of DWM.




Soft Methodology and Random Information Systems


Book Description

The analysis of experimental data resulting from some underlying random process is a fundamental part of most scientific research. Probability theory and statistics have been developed as flexible tools for this analysis, and have been applied successfully in fields as varied as biology, economics, engineering, medicine and psychology. However, traditional techniques in probability and statistics were devised to model only a single source of uncertainty, namely randomness. In many real-life problems randomness arises in conjunction with other sources, making the development of additional "softening" approaches essential. This book is a collection of papers presented at the 2nd International Conference on Soft Methods in Probability and Statistics (SMPS'2004), held in Oviedo, providing a comprehensive overview of the innovative research taking place within this emerging field.




Granular, Fuzzy, and Soft Computing


Book Description

The first edition of the Encyclopedia of Complexity and Systems Science (ECSS, 2009) presented a comprehensive overview of granular computing (GrC), broadly divided into several categories: granular computing from rough set theory, granular computing in database theory, granular computing in social networks, granular computing and fuzzy set theory, grid/cloud computing, as well as general issues in granular computing. In 2011, the formal theory of GrC was established, providing an adequate infrastructure to support revolutionary new approaches to computer/data science, including the challenges presented by so-called big data. For this volume of ECSS, Second Edition, many entries have been updated to capture these new developments, together with new chapters on such topics as data clustering, outliers in data mining, qualitative fuzzy sets, and information flow analysis for security applications. Granulation can be seen as a natural and ancient methodology deeply rooted in the human mind. Many daily "things" are routinely granulated into sub-"things": the topography of the earth is granulated into hills, plateaus, etc.; space and time are granulated into infinitesimal granules; and a circle is granulated into polygons of infinitesimal sides. Such granules led to the invention of calculus, topology and non-standard analysis. Formalization of general granulation was difficult but, as shown in this volume, great progress has been made in combining discrete and continuous mathematics under one roof for a broad range of applications in data science.




Soft Computing Methods in Human Sciences


Book Description

An in-depth look at soft computing methods and their applications in the human sciences, such as the social and behavioral sciences. Soft computing methods - including fuzzy systems, neural networks, evolutionary computing and probabilistic reasoning - are state-of-the-art methods in theory formation and model construction. The powerful application areas of these methods in the human sciences are demonstrated, including the replacement of statistical models by simpler numerical or linguistic soft computing models and the use of computer simulations with approximate and linguistic constituents. "Dr. Niskanen's work opens new vistas in the application of soft computing, fuzzy logic and fuzzy set theory to the human sciences. This book is likely to be viewed in retrospect as a landmark in its field" (Lotfi A. Zadeh, Berkeley).




Statistical Methods for Fuzzy Data


Book Description

Statistical data are not always precise numbers, vectors, or categories. Real data are frequently what is called fuzzy. Examples where this fuzziness is obvious include quality-of-life data and environmental, biological, medical, sociological and economics data. Moreover, the results of measurements can often best be described by fuzzy numbers and fuzzy vectors, respectively. Statistical analysis methods therefore have to be adapted for the analysis of fuzzy data. In this book, the foundations of the description of fuzzy data are explained, including methods for obtaining the characterizing function of fuzzy measurement results. Furthermore, statistical methods are generalized to the analysis of fuzzy data and fuzzy a priori information. Key features:

- Provides basic methods for the mathematical description of fuzzy data, as well as statistical methods that can be used to analyze fuzzy data.
- Describes methods of increasing importance with applications in areas such as environmental statistics and social science.
- Complements the theory with exercises and solutions, and is illustrated throughout with diagrams and examples.
- Explores areas such as the quantitative description of data uncertainty and the mathematical description of fuzzy data.

This work is aimed at statisticians working with fuzzy logic, engineering statisticians, finance researchers, and environmental statisticians. It is written for readers who are familiar with elementary stochastic models and basic statistical methods.
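To make the idea of a characterizing function concrete, here is a minimal sketch of a triangular membership function, one common way to describe a fuzzy measurement result. The function name, parameters, and the example reading "approximately 5" are illustrative assumptions, not taken from the book.

```python
# Hypothetical sketch: characterizing (membership) function of a
# triangular fuzzy number defined by (left, peak, right).
def triangular_membership(x, left, peak, right):
    """Degree in [0, 1] to which the value x belongs to the
    fuzzy number with support [left, right] and mode at peak."""
    if x <= left or x >= right:
        return 0.0
    if x <= peak:
        return (x - left) / (peak - left)   # rising edge
    return (right - x) / (right - peak)     # falling edge

# A fuzzy measurement "approximately 5" spread over [4, 6]:
def reading(x):
    return triangular_membership(x, 4.0, 5.0, 6.0)
```

Here the measurement's imprecision is encoded directly in the shape of the function: values near the peak have membership close to 1, and membership falls off linearly toward the edges of the support.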




Coping with Uncertainty


Book Description

Addressing ongoing global changes requires solutions to new scientific problems, which in turn require new concepts and tools. A key issue concerns a vast variety of irreducible uncertainties, including extreme events with high-dimensional consequences, e.g., climate change. The dilemma lies in enormous costs versus massive uncertainties about extreme impacts. Traditional scientific approaches rely on real observations and experiments. Yet sufficient observations do not exist for new problems, and "pure" experiments and learning by doing may be expensive, dangerous, or impossible. In addition, the available historical observations are often contaminated by past actions and policies. This book therefore presents tools for the explicit treatment of uncertainties using "synthetic" information composed of available "hard" data from historical observations, the results of possible experiments, and scientific facts, as well as "soft" data from expert opinions and scenarios.




Strengthening Links Between Data Analysis and Soft Computing


Book Description

This book gathers contributions presented at the 7th International Conference on Soft Methods in Probability and Statistics (SMPS 2014), held in Warsaw, Poland, on September 22-24, 2014. Its aim is to present recent results illustrating new trends in intelligent data analysis, and it gives a comprehensive overview of current research into the fusion of soft computing methods with probability and statistics. Synergies between the two fields may improve intelligent data analysis methods in terms of robustness to noise and applicability to larger datasets, while still efficiently yielding understandable solutions to real-world problems.