Reconstructing physics: The universe is information
WHEN we consider some of the most striking phenomena permitted by the laws of physics – from human reasoning to computer technologies and the replication of genes – we find that information plays a central role. But, on the face of it, information is profoundly different from the basic entities that the physical sciences use to describe reality. Neither quantum mechanics nor general relativity, the most fundamental theories in physics, provides a meaning for information or even a way of measuring it. And information has a "counterfactual" character: a message cannot carry information unless a different message is also possible.
Statements about information were therefore long regarded in physics as second-class, non-fundamental approximations. Information itself was considered an a priori abstraction, like Euclid's perfect triangles and circles, whose physical instantiations are inevitably approximate.
Yet there have long been clues that information is a fundamental physical quantity, obeying exact laws. Consider statistical mechanics, pioneered by Ludwig Boltzmann at the end of the 19th century, which reformulates the laws of thermodynamics in information-like terms. For example, these laws define heat and entropy – loosely speaking, a measure of disorder in a system – in terms of the number of ways in which atoms of a given total energy could possibly be distributed, which is also the information content of the system. The laws of thermodynamics therefore link information with fundamental forms of energy such as work.
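Boltzmann's link between entropy and counting can be stated exactly. The following is the standard formula, set down here for concreteness rather than quoted from the argument above:

```latex
% Boltzmann's entropy formula: W is the number of microscopic
% arrangements (microstates) of the atoms consistent with the system's
% macroscopic state, and k_B is Boltzmann's constant.
S = k_B \ln W
% Up to a choice of units (k_B \ln 2 per bit), \log_2 W is the number
% of bits needed to single out one microstate -- the "information
% content of the system" in the sense used above.
```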
Even more strikingly, in the 1970s Jacob Bekenstein and Stephen Hawking discovered that a black hole's surface area is also its entropy (in suitable units). Hence information, too, must be an exact quantity, like area.
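The Bekenstein–Hawking result can likewise be written down precisely (standard form, added for concreteness):

```latex
% Black-hole entropy is proportional to the horizon area A, measured
% in units of the Planck area l_P^2 = G \hbar / c^3:
S_{BH} = \frac{k_B\, A}{4\, l_P^2} = \frac{k_B\, c^3\, A}{4\, G \hbar}
```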
In the theory of computation, information is referred to in the same way as the laws of thermodynamics refer to energy: without ever mentioning the details of the physical systems that instantiate it. Yet we now know that different laws of physics can give rise to fundamentally different modes of computing: quantum computers can solve problems qualitatively different from anything classical computers are capable of. Thus laws about computation must be laws of physics, and so must laws about information.
But what are those laws? How can abstractions be physical, and counterfactual properties factual?
We think we have solved this riddle. Our solution begins like this: The laws of physics have certain regularities which have never been expressed precisely, but only through a vague concept of "information". For instance, information is informally characterised as something that can be copied from one physical system to another – a property we call interoperability. A physical theory of information must express those regularities explicitly, in the form of fundamental laws. In this respect information is like energy, and its laws are like the principle of conservation of energy.
But unlike energy, the idea of information clashes with the prevailing conception of fundamental physics. Ever since Galileo and Newton, this has been that the physical world is explained in terms of its state (describing everything that is there) and deterministic laws of motion (describing how the state changes with time). Only one outcome can result from a given initial state, so there is no room for anything else to be possible. Information cannot be expressed that way, because of its counterfactual character. It requires a new mode of explanation, one provided by our constructor theory. Its basic claim is that all laws of physics can be expressed entirely in terms of statements of which tasks – ie physical transformations – are possible and which impossible, and why.
A task is possible if the laws of physics permit the existence of a constructor for it: something that can both cause the transformation and retain the ability to cause it again. A heat engine, for instance, is a thermodynamic constructor: it causes energy to change from one form to another, while operating in a cycle. A catalyst is a chemical constructor: it causes chemical reactions but is not itself chemically changed.
Just as is often done with catalysts, in constructor theory one abstracts away the constructor and expresses everything in the form of statements about tasks. Our constructor theory of information takes the informally known properties of information and expresses them entirely in terms of the distinction between possible and impossible tasks. This makes all the difference. In constructor theory, counterfactuals are first-class, fundamental statements, and transformations such as copying are naturally expressed as tasks. So in constructor theory the properties associated with information appear as elegant, exact laws of physics.
How is this achieved? First, one must express what it takes for a physical system to perform computations. All computations on a set of attributes of a system can be expressed as tasks – the permutations of that set. A computation medium is a system with a set of attributes whose permutations are all possible tasks. We call that a computation set. If copying the attributes in the computation set is also a possible task, we call the computation medium an information medium.
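As a rough illustration of these definitions, here is a toy model of ours (the names and representation are invented for the sketch, not taken from the authors' formalism): the computations on a two-attribute system are enumerated as permutations, and the copy task is checked alongside them.

```python
from itertools import permutations

# Toy computation set: two attributes of some physical system.
attributes = ("0", "1")

# Every computation on this set is a permutation of it, expressed here
# as a task mapping each attribute to its image. For two attributes
# that is just the identity and the swap (logical "NOT").
computations = [dict(zip(attributes, p)) for p in permutations(attributes)]
assert len(computations) == 2

# For an information medium, copying must also be a possible task:
# (x, blank) -> (x, x) for every attribute x in the computation set.
def copy_task(x):
    """The copy task on a pair of media, the target starting blank."""
    return (x, x)

assert all(copy_task(x) == (x, x) for x in attributes)
print("computations:", computations)
```

The point of the sketch is only that "all permutations possible" and "copying possible" are the two separate conditions: the first makes the system a computation medium, the second upgrades it to an information medium.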
All the other laws about information can then be expressed in beautifully simple ways. For example, the interoperability of information is expressed as the principle that the combination of two information media is also an information medium.
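One way to picture the interoperability principle (again a sketch of ours, modelling composite attributes as ordered pairs):

```python
from itertools import product

# Two information media, each with its own computation set.
medium_a = ("0", "1")
medium_b = ("x", "y", "z")

# The combined system's attributes are ordered pairs, one attribute
# from each medium. Interoperability says the composite is itself an
# information medium: all permutations of these pairs, and copying
# them, remain possible tasks.
combined = tuple(product(medium_a, medium_b))
assert len(combined) == len(medium_a) * len(medium_b)  # 6 composite attributes
print(combined)
```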
Without any modification, this theory also expresses the properties of media capable of carrying out quantum computations. These properties turn out to define a species of information media which we call superinformation media. Given the counter-intuitive properties of quantum information (which make possible, for instance, quantum cryptography that is secure even against an eavesdropper who tries every possible key), one might guess that such media would be possible only by allowing some additional, weird class of tasks.
Remarkably, the opposite is the case. Superinformation media satisfy only a simple requirement: roughly speaking, that certain copying tasks on their states are impossible. This requirement gives rise to all the disparate features that distinguish quantum information from classical. So constructor theory explains the relationship between classical and quantum information, which has been poorly understood until now. It also reveals the single property underlying the most distinctive phenomena associated with quantum information, such as the unpredictability of measurement outcomes despite the equations of motion being deterministic. This is a promising development in the quest to reveal exactly what is responsible for the power of a universal quantum computer.
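The impossible copying task at the heart of this is the familiar quantum no-cloning result. Here is a minimal sketch of the standard inner-product argument (not the constructor-theoretic derivation itself), using two states written as real amplitude vectors:

```python
import math

# Two quantum states as real amplitude vectors:
psi = (1.0, 0.0)                            # |0>
phi = (1 / math.sqrt(2), 1 / math.sqrt(2))  # |+>, non-orthogonal to |0>

def overlap(a, b):
    """Inner product <a|b> of two real state vectors."""
    return sum(x * y for x, y in zip(a, b))

# A single machine that cloned every state, s |blank> -> s s, would
# have to preserve inner products (quantum evolution is unitary).
# Cloning both states would force <psi|phi> = <psi|phi>**2, which
# holds only when the overlap is 0 or 1. Here it is ~0.707, so no
# such machine can exist:
ov = overlap(psi, phi)
assert not math.isclose(ov, ov ** 2)
print(f"<psi|phi> = {ov:.3f}, squared = {ov ** 2:.3f}")
```

The contradiction arises only for non-orthogonal states, which is why classical information (distinguishable, effectively orthogonal attributes) can be copied freely while quantum information cannot.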
Constructor theory has other far-reaching implications. One of the most fundamental is that the notion of knowledge can be expressed in objective terms, as information that can act as a constructor – such as, say, the program running on a computer that controls an automated car factory. In the prevailing conception, that idea is impossible to express, because one can only say what does or does not happen. In constructor theory, one must talk in terms of what is possible. And for almost any task that is possible under the laws of physics, the explanation of why it is possible is an account of how knowledge might be created and applied to build a constructor for that task.
That makes knowledge creators, such as people, central to fundamental physics for the first time since Copernicus debunked the geocentric model of the solar system.
This article appeared in print under the headline "Why we need to reconstruct the universe"
David Deutsch and Chiara Marletto, both at the University of Oxford, are researching fundamental aspects of quantum information. Their ideas are described in more detail at constructortheory.org. This research is supported by the Templeton World Charity Foundation.