
On the Epistemology of Data Science


Book Description

This book addresses controversies concerning the epistemological foundations of data science: Is it a genuine science? Or is data science merely some inferior practice that can at best contribute to the scientific enterprise, but cannot stand on its own? The author proposes a coherent conceptual framework with which these questions can be rigorously addressed. Readers will discover a defense of inductivism and consideration of the arguments against it: an epistemology of data science more or less by definition has to be inductivist, given that data science starts with the data. As an alternative to enumerative approaches, the author endorses Federica Russo’s recent call for a variational rationale in inductive methodology. Chapters then address some of the key concepts of an inductivist methodology, including causation, probability and analogy, before outlining an inductivist framework. The inductivist framework is shown to be adequate and useful for an analysis of the epistemological foundations of data science. The author points out that many aspects of the variational rationale are present in algorithms commonly used in data science. Introductions to algorithms and brief case studies of successful data science applications, such as machine translation, are included. Data science is located with reference to several crucial distinctions regarding different kinds of scientific practices, including those between exploratory and theory-driven experimentation, and between phenomenological and theoretical science. Computer scientists, philosophers and data scientists of various disciplines will find this philosophical perspective and conceptual framework of great interest, especially as a starting point for further in-depth analysis of algorithms used in data science.




The Epistemology of Statistical Science


Book Description

"In the usage of present-day statistics 'statistical inference' is a profoundly ambiguous expression. In some literature a statistical inference is a 'decision made under risk', in other literature it is 'a conclusion drawn from given data', and most of the literature displays no awareness that the two meanings might be different. This book concerns the problem of drawing conclusions from given data, in which respect we have to ask: Does there exist a need for the term 'statistical inference'? If so, does there also exist a corresponding need for every other science? If so, how does, for example, agronomy then manage to reason in terms of botanical inference, soil scientific inference, meteorological inference, biochemical inference, molecular biological inference, entomological inference, plant pathological inference, etc. without incoherence or self-contradiction? Consider the possibility that agronomy does not reason in terms of such a motley of special kinds of inference. Consider the possibility that, apart from subject matter, botany, soil science, entomology, etc. all employ the same kind of reasoning. If so, must we then believe that statistics, alone among all the sciences, is the only one that requires its own special kind of inference?"--P. i.




Data Science and Social Research


Book Description

This edited volume lays the groundwork for Social Data Science, addressing epistemological issues, methods, technologies, software and applications of data science in the social sciences. It presents data science techniques for the collection, analysis and use of both online and offline new (big) data in social research and related applications. Among others, the individual contributions cover topics like social media, learning analytics, clustering, statistical literacy, recurrence analysis and network analysis. Data science is a multidisciplinary approach based mainly on the methods of statistics and computer science, and its aim is to develop appropriate methodologies for forecasting and decision-making in response to an increasingly complex reality often characterized by large amounts of data (big data) of various types (numeric, ordinal and nominal variables, symbolic data, texts, images, data streams, multi-way data, social networks etc.) and from diverse sources. This book presents selected papers from the international conference on Data Science & Social Research, held in Naples, Italy in February 2016, and will appeal to researchers in the social sciences working in academia as well as in statistical institutes and offices.




Bayesian Philosophy of Science


Book Description

How should we reason in science? Jan Sprenger and Stephan Hartmann offer a refreshing take on classical topics in philosophy of science, using a single key concept to explain and to elucidate manifold aspects of scientific reasoning. They present good arguments and good inferences as being characterized by their effect on our rational degrees of belief. Refuting the view that there is no place for subjective attitudes in 'objective science', Sprenger and Hartmann explain the value of convincing evidence in terms of a cycle of variations on the theme of representing rational degrees of belief by means of subjective probabilities (and changing them by Bayesian conditionalization). In doing so, they integrate Bayesian inference—the leading theory of rationality in social science—with the practice of 21st century science. Bayesian Philosophy of Science thereby shows how modeling such attitudes improves our understanding of causes, explanations, confirming evidence, and scientific models in general. It combines a scientifically minded and mathematically sophisticated approach with conceptual analysis and attention to methodological problems of modern science, especially in statistical inference, and is therefore a valuable resource for philosophers and scientific practitioners.




Philosophy of Statistics


Book Description

Statisticians and philosophers of science have many common interests but restricted communication with each other. This volume aims to remedy this shortcoming. It provides state-of-the-art research in the area of philosophy of statistics by encouraging numerous experts to communicate with one another without feeling "restricted" by their disciplines or thinking "piecemeal" in their treatment of issues. A second goal of this book is to present work in the field without bias toward any particular statistical paradigm. Broadly speaking, the essays in this Handbook are concerned with problems of induction, statistics and probability. For centuries, foundational problems like induction have been among philosophers' favorite topics; recently, however, non-philosophers have increasingly taken a keen interest in these issues. This volume accordingly contains papers by both philosophers and non-philosophers, including scholars from nine academic disciplines.

- Provides a bridge between philosophy and current scientific findings
- Covers theory and applications
- Encourages multi-disciplinary dialogue




Statistical Inference as Severe Testing


Book Description

Mounting failures of replication in social and biological sciences give a new urgency to critically appraising proposed reforms. This book pulls back the cover on disagreements between experts charged with restoring integrity to science. It denies two pervasive views of the role of probability in inference: to assign degrees of belief, and to control error rates in a long run. If statistical consumers are unaware of assumptions behind rival evidence reforms, they can't scrutinize the consequences that affect them (in personalized medicine, psychology, etc.). The book sets sail with a simple tool: if little has been done to rule out flaws in inferring a claim, then it has not passed a severe test. Many methods advocated by data experts do not stand up to severe scrutiny and are in tension with successful strategies for blocking or accounting for cherry picking and selective reporting. Through a series of excursions and exhibits, the philosophy and history of inductive inference come alive. Philosophical tools are put to work to solve problems about science and pseudoscience, induction and falsification.




Error and Inference


Book Description

Although both philosophers and scientists are interested in how to obtain reliable knowledge in the face of error, there is a gap between their perspectives that has been an obstacle to progress. By means of a series of exchanges between the editors and leaders from the philosophy of science, statistics and economics, this volume offers a cumulative introduction connecting problems of traditional philosophy of science to problems of inference in statistical and empirical modelling practice. Philosophers of science and scientific practitioners are challenged to reevaluate the assumptions of their own theories - philosophical or methodological. Practitioners may better appreciate the foundational issues around which their questions revolve and thereby become better 'applied philosophers'. Conversely, new avenues emerge for finally solving recalcitrant philosophical problems of induction, explanation and theory testing.




Algorithms and Complexity in Mathematics, Epistemology, and Science


Book Description

ACMES (Algorithms and Complexity in Mathematics, Epistemology, and Science) is a multidisciplinary conference series that focuses on epistemological and mathematical issues relating to computation in modern science. This volume includes a selection of papers presented at the 2015 and 2016 conferences held at Western University that provide an interdisciplinary outlook on modern applied mathematics that draws from theory and practice, and situates it in proper context. These papers come from leading mathematicians, computational scientists, and philosophers of science, and cover a broad collection of mathematical and philosophical topics, including numerical analysis and its underlying philosophy, computer algebra, reliability and uncertainty quantification, computation and complexity theory, combinatorics, error analysis, perturbation theory, experimental mathematics, scientific epistemology, and foundations of mathematics. By bringing together contributions from researchers who approach the mathematical sciences from different perspectives, the volume will further readers' understanding of the multifaceted role of mathematics in modern science, informed by the state of the art in mathematics, scientific computing, and current modeling techniques.




Resonance: From Probability To Epistemology And Back


Book Description

Resonance examines some building blocks of epistemology as a prelude to the careful analysis of the foundations of probability. The concept of resonance is introduced to shed light on the philosophical problems of induction, consciousness, intelligence and free will. The same concept is later applied to provide support for a new philosophical theory of probability. Although based on existing ideas and theories, the epistemological concept of resonance is investigated for the first time in this book. The best-known philosophical theories of probability, frequency and subjective, are shown to be unrealistic and dissociated from the two main branches of statistics: frequency statistics and Bayesian statistics. Written in an accessible style, this book can be enjoyed by philosophers, statisticians and mathematicians, and also by anyone looking to expand their understanding of the disciplines of epistemology and probability.