Computing with Quantum Cats


Book Description

A mind-blowing glimpse into the near future, where quantum computing will have world-transforming effects. The quantum computer is no longer the stuff of science fiction. Pioneering physicists are on the brink of unlocking a new quantum universe which provides a better representation of reality than our everyday experiences and common sense ever could. The birth of quantum computers - which, like Schrödinger's famous "dead and alive" cat, rely on entities like electrons, photons, or atoms existing in two states at the same time - is set to turn the computing world on its head. In his fascinating study of this cutting-edge technology, John Gribbin updates his previous views on the nature of quantum reality, arguing for a universe of many parallel worlds where "everything is real." Looking back to Alan Turing's work on the Enigma machine and the first electronic computer, Gribbin explains how quantum theory developed to make quantum computers work in practice as well as in principle. He takes us beyond the arena of theoretical physics to explore their practical applications - from machines which learn through "intuition" and trial and error to unhackable laptops and smartphones. And he investigates the potential for this extraordinary science to create a world where communication occurs faster than light and teleportation is possible. This is an exciting insider's look at the new frontier of computer science and its revolutionary implications.




Computing with Quantum Cats


Book Description

Looking back to Alan Turing's work on the Enigma machine and the first electronic computer, John Gribbin explains how quantum theory developed to make quantum computers work in practice as well as in principle. Featuring a new introduction on the recent evolution of quantum computing, this edition takes us beyond the arena of theoretical physics to explore the potential of this extraordinary science.




Quantum Computing for Everyone


Book Description

An accessible introduction to an exciting new area in computation, explaining such topics as qubits, entanglement, and quantum teleportation for the general reader. Quantum computing is a beautiful fusion of quantum physics and computer science, incorporating some of the most stunning ideas from twentieth-century physics into an entirely new way of thinking about computation. In this book, Chris Bernhardt offers an introduction to quantum computing that is accessible to anyone who is comfortable with high school mathematics. He explains qubits, entanglement, quantum teleportation, quantum algorithms, and other quantum-related topics as clearly as possible for the general reader. Bernhardt, a mathematician himself, simplifies the mathematics as much as he can and provides elementary examples that illustrate both how the math works and what it means. Bernhardt introduces the basic unit of quantum computing, the qubit, and explains how the qubit can be measured; discusses entanglement—which, he says, is easier to describe mathematically than verbally—and what it means when two qubits are entangled (citing Einstein's characterization of what happens when the measurement of one entangled qubit affects the second as “spooky action at a distance”); and introduces quantum cryptography. He recaps standard topics in classical computing—bits, gates, and logic—and describes Edward Fredkin's ingenious billiard ball computer. He defines quantum gates, considers the speed of quantum algorithms, and describes the building of quantum computers. By the end of the book, readers understand that quantum computing and classical computing are not two distinct disciplines, and that quantum computing is the fundamental form of computing. The basic unit of computation is the qubit, not the bit.




Quantum Computing from Colossus to Qubits


Book Description

The revolution is here. In breakthrough after breakthrough, pioneering physicists are unlocking a new quantum universe which provides a better representation of reality than our everyday experiences and common sense ever could. The birth of quantum computers - which, like Schrödinger's famous dead-and-alive cat, rely on entities like electrons existing in a mixture of states - is starting to turn the computing world on its head. In his fascinating study of this cutting-edge technology (first published as Computing with Quantum Cats and now featuring a new foreword), John Gribbin updates his previous views on the nature of quantum reality, arguing for a universe of many parallel worlds where 'everything is real'. Looking back to Alan Turing's work on the Enigma machine and the first electronic computer, Gribbin explains how quantum theory developed to make quantum computers work in practice as well as in principle. He takes us beyond the arena of theoretical physics to explore their practical applications - from machines which learn through 'intuition' and trial and error to unhackable laptops and smartphones. And he investigates the potential for this extraordinary science to allow communication faster than light and even teleportation, as we step into a world of infinite possibility.




Quantum Computing


Book Description

The ultimate non-technical guide to the fast-developing world of quantum computing. Computer technology has improved exponentially over the last fifty years, but the headroom for bigger and better electronic solutions is running out. Our best hope is to harness the power of quantum physics. 'Quantum algorithms' were written long before any hardware existed to run them. These would enable a quantum computer, for example, to speed up an information search exponentially, or to crack the mathematical trick behind internet security. However, building a quantum computer is incredibly difficult. Despite hundreds of laboratories around the world working on them, we are only just seeing machines come close to 'supremacy', where they can outperform a traditional computer. In this approachable introduction, Brian Clegg explains algorithms and their quantum counterparts, explores the physical building blocks and quantum weirdness necessary to make a quantum computer, and uncovers the capabilities of the current generation of machines.




A Computable Universe


Book Description

This volume, with a foreword by Sir Roger Penrose, discusses the foundations of computation in relation to nature. It focuses on two main questions: What is computation? How does nature compute? The contributors are world-renowned experts who have helped shape a cutting-edge computational understanding of the universe. They discuss computation in the world from a variety of perspectives, ranging from foundational concepts to pragmatic models to ontological conceptions and philosophical implications. The volume provides a state-of-the-art collection of technical papers and non-technical essays, representing a field that assumes information and computation to be key in understanding and explaining the basic structure underpinning physical reality. It also includes a new edition of Konrad Zuse's “Calculating Space” (the MIT translation), and a panel discussion transcription on the topic, featuring worldwide experts in quantum mechanics, physics, cognition, computation and algorithmic complexity. The volume is dedicated to the memory of Alan M. Turing, the inventor of universal computation, on the 100th anniversary of his birth, and is part of the Turing Centenary celebrations.




Six Impossible Things


Book Description

“An elegant and accessible” investigation of quantum mechanics for non-specialists, “highly recommended” for students of the sciences, sci-fi fans, and anyone interested in the strange world of quantum physics (Forbes). The rules of the quantum world seem to say that a cat can be both alive and dead at the same time and that a particle can be in two places at once. That particle is also a wave; everything in the quantum world can be described entirely in terms of waves, or entirely in terms of particles. These rules were all established by the end of the 1920s by Erwin Schrödinger, Werner Heisenberg, Paul Dirac, and others. But no one has yet come up with a common-sense explanation of what is going on. In this concise and engaging book, astrophysicist John Gribbin offers an overview of six of the leading interpretations of quantum mechanics. Gribbin calls his account “agnostic,” explaining that none of these interpretations is any better, or any worse, than any of the others. Gribbin presents the Copenhagen Interpretation, promoted by Niels Bohr and named by Heisenberg; the Pilot-Wave Interpretation, developed by Louis de Broglie; the Many Worlds Interpretation (termed “excess baggage” by Gribbin); the Decoherence Interpretation (“incoherent”); the Ensemble “Non-Interpretation”; and the Timeless Transactional Interpretation (which involves waves traveling both forward and backward in time). All of these interpretations are crazy, Gribbin warns, and some are more crazy than others; but in the quantum world, being more crazy does not necessarily mean more wrong.




Computing


Book Description

Discover the history of computing through four major threads of development in this compact, accessible history covering punch cards, Silicon Valley, smartphones, and much more. Computer historian Paul Ceruzzi offers a broad yet detailed history of computing, from the first use of the word “digital” in 1942, to the development of punch cards and the first general-purpose computer, to the internet, Silicon Valley, smartphones, and social networking. Ceruzzi identifies four major threads that run throughout all of computing's technological development:

• Digitization: the coding of information, computation, and control in binary form
• The convergence of multiple streams of techniques, devices, and machines
• The steady advance of electronic technology, as characterized famously by “Moore's Law”
• The human-machine interface

The history of computing could be told as the story of hardware and software, or the story of the Internet, or the story of “smart” hand-held devices. In this concise and accessible account of the invention and development of digital technology, Ceruzzi offers a general and more useful perspective for students of computer science and history.




The Chip


Book Description

Barely fifty years ago a computer was a gargantuan, vastly expensive thing that only a handful of scientists had ever seen. The world’s brightest engineers were stymied in their quest to make these machines small and affordable until the solution finally came from two ingenious young Americans. Jack Kilby and Robert Noyce hit upon the stunning discovery that would make possible the silicon microchip, a work that would ultimately earn Kilby the Nobel Prize for physics in 2000. In this completely revised and updated edition of The Chip, T.R. Reid tells the gripping adventure story of their invention and of its growth into a global information industry. This is the story of how the digital age began.




The Universal Machine


Book Description

The computer, unlike other inventions, is universal; you can use a computer for many tasks: writing, composing music, designing buildings, creating movies, inhabiting virtual worlds, communicating... This popular science history isn't just about technology; it introduces the pioneers: Babbage, Turing, Apple's Wozniak and Jobs, Bill Gates, Tim Berners-Lee, and Mark Zuckerberg. This is a story about people and the changes computers have caused. In the future, ubiquitous computing, AI, and quantum and molecular computing could even make us immortal. The computer has been a radical invention: in less than a single human lifetime, it has transformed economies and societies like no invention before.