Inconsistency Robustness


Book Description

Inconsistency robustness is information-system performance in the face of continually pervasive inconsistencies---a shift from the previously dominant paradigms of inconsistency denial and inconsistency elimination, which attempt to sweep inconsistencies under the rug. Inconsistency robustness is both an observed phenomenon and a desired feature: it is an observed phenomenon because large information systems are required to operate in an environment of pervasive inconsistency, and it is a desired feature because we need to improve the performance of large information systems. This volume contains revised versions of refereed articles and panel summaries from the first two International Symposia on Inconsistency Robustness, conducted under the auspices of the International Society for Inconsistency Robustness (iRobust, http://irobust.org). The articles are broadly based on theory and practice, addressing fundamental issues in inconsistency robustness. The field of Inconsistency Robustness aims to provide practical, rigorous foundations for computer information systems dealing with pervasively inconsistent information.




Truth in Fiction


Book Description

This monograph examines truth in fiction by applying the techniques of a naturalized logic of human cognitive practices. The author structures his project around two focal questions. What would it take to write a book about truth in literary discourse with reasonable promise of getting it right? What would it take to write a book about truth in fiction as true to the facts of lived literary experience as objectivity allows? It is argued that the most semantically distinctive feature of the sentences of fiction is that they are unambiguously true and false together. It is true that Sherlock Holmes lived at 221B Baker Street and also concurrently false that he did. A second distinctive feature of fiction is that the reader at large knows of this inconsistency and isn’t in the least cognitively molested by it. Why, it is asked, would this be so? What would explain it? Two answers are developed. According to the no-contradiction thesis, the semantically tangled sentences of fiction are indeed logically inconsistent but not logically contradictory. According to the no-bother thesis, if the inconsistencies of fiction were contradictory, a properly contrived logic for the rational management of inconsistency would explain why readers at large are not thrown off cognitive stride by their embrace of those contradictions. As developed here, the account of fiction suggests the presence of an underlying three- or four-valued dialethic logic. The author shows this to be a mistaken impression. There are only two truth-values in his logic of fiction. The naturalized logic of Truth in Fiction jettisons some of the standard assumptions and analytical tools of contemporary philosophy, chiefly because the neurotypical linguistic and cognitive behaviour of humanity at large is at variance with them.
Using the resources of a causal response epistemology in tandem with the naturalized logic, the theory produced here is data-driven, empirically sensitive, and open to a circumspect collaboration with the empirical sciences of language and cognition.




Quality, Reliability, Security and Robustness in Heterogeneous Systems


Book Description

This book constitutes the refereed post-conference proceedings of the 14th EAI International Conference on Quality, Reliability, Security and Robustness in Heterogeneous Networks, QShine 2018, held in Ho Chi Minh City, Vietnam, in December 2018. The 13 revised full papers were carefully reviewed and selected from 28 submissions. The papers are organized thematically in tracks, starting with security and privacy, telecommunication systems and networks, networks and applications.




Stating the Obvious, and Other Database Writings


Book Description

Some things seem so obvious that they don’t need to be spelled out in detail. Or do they? In computing, at least (and probably in any discipline where accuracy and precision are important), it can be quite dangerous just to assume that some given concept is “obvious,” and indeed universally understood. Serious mistakes can happen that way! The first part of this book discusses features of the database field—equality, assignment, naming—where just such an assumption seems to have been made, and it describes some of the unfortunate mistakes that have occurred as a consequence. It also explains how and why the features in question aren’t quite as obvious as they might seem, and it offers some advice on how to work around the problems caused by assumptions to the contrary. Other parts of the book also deal with database issues where devoting some preliminary effort to spelling out exactly what the issues in question entailed could have led to much better interfaces and much more carefully designed languages. The issues discussed include redundancy and indeterminacy; persistence, encapsulation, and decapsulation; the ACID properties of transactions; and types vs. units of measure. Finally, the book also contains a detailed deconstruction of, and response to, various recent pronouncements from the database literature, all of them having to do with relational technology. Once again, the opinions expressed in those pronouncements might seem “obvious” to some people (to the writers at least, presumably), but the fact remains that they’re misleading at best, and in most cases just flat out wrong.




Characterizing the Robustness of Science


Book Description

Mature sciences have long been characterized in terms of the “successfulness”, “reliability” or “trustworthiness” of their theoretical, experimental or technical accomplishments. Today many philosophers of science talk of “robustness”, often without specifying in a precise way the meaning of this term. This lack of clarity is the cause of frequent misunderstandings, since all these notions, and that of robustness in particular, are connected to fundamental issues, which concern nothing less than the very nature of science and its specificity with respect to other human practices, the nature of rationality and of scientific progress, and science’s claim to be a truth-conducive activity. This book offers for the first time a comprehensive analysis of the problem of robustness and, more generally, that of the reliability of science, based on several detailed case studies and on philosophical essays inspired by the so-called practical turn in philosophy of science.




A Computable Universe


Book Description

This volume discusses the foundations of computation in relation to nature. It focuses on two main questions: What is computation? and How does nature compute?




Graham Priest on Dialetheism and Paraconsistency


Book Description

This book presents the state of the art in the fields of formal logic pioneered by Graham Priest. It includes advanced technical work on the model and proof theories of paraconsistent logic, in contributions from top scholars in the field. Graham Priest’s research has had a considerable influence on the field of philosophical logic, especially with respect to the themes of dialetheism—the thesis that there exist true but inconsistent sentences—and paraconsistency—an account of deduction in which contradictory premises do not entail the truth of arbitrary sentences. Priest’s work has regularly challenged researchers to reappraise many assumptions about rationality, ontology, and truth. This book collects original research by some of the most esteemed scholars working in philosophical logic, whose contributions explore and appraise Priest’s work on logical approaches to problems in philosophy, linguistics, computation, and mathematics. They provide fresh analyses, critiques, and applications of Priest’s work and attest to its continued relevance and topicality. The book also includes Priest’s responses to the contributors, providing a further layer to the development of these themes.




NASA Formal Methods


Book Description

This book constitutes the proceedings of the 15th International Symposium on NASA Formal Methods, NFM 2023, held in Houston, Texas, USA, during May 16-18, 2023. The 26 full and 3 short papers presented in this volume were carefully reviewed and selected from 75 submissions. The papers deal with advances in formal methods, formal methods techniques, and formal methods in practice.




Theory Of Knowledge: Structures And Processes


Book Description

This book aims to synthesize different directions in knowledge studies into a unified theory of knowledge and knowledge processes. It explicates important relations between knowledge and information. It provides readers with an understanding of the essence and structure of knowledge, explicating operations and processes that are based on knowledge and vital for society. The book also highlights how the theory of knowledge paves the way for more advanced design and utilization of computers and networks.




Intelligent Distributed Computing XI


Book Description

This book presents a collection of contributions addressing recent advances and research in synergistic combinations of topics in the joint fields of intelligent computing and distributed computing. It focuses on the following specific topics: distributed data mining and machine learning, reasoning and decision-making in distributed environments, distributed evolutionary algorithms, trust and reputation models for distributed systems, scheduling and resource allocation in distributed systems, intelligent multi-agent systems, advanced agent-based and service-based architectures, and Smart Cloud and Internet of Things (IoT) environments. The book represents the combined peer-reviewed proceedings of the 11th International Symposium on Intelligent Distributed Computing (IDC 2017) and the 7th International Workshop on Applications of Software Agents (WASA 2017), both of which were held in Belgrade, Serbia, from October 11 to 13, 2017.