In physics and cosmology, digital physics is a collection of theoretical perspectives that start by assuming that the universe is, at heart, describable by information, and is therefore computable. Given such assumptions, the universe can be conceived as either the output of some computer program or as being some sort of vast digital computation device (or, at least, mathematically isomorphic to such a device).
Digital physics is grounded in one or more of the following hypotheses, listed in order of increasing boldness: that the universe, or reality, is essentially informational; is essentially computable; can be described digitally; is in essence digital; is itself a computer; or is the output of a simulated-reality exercise.
Every computer must obviously be compatible with the principles of information theory, statistical thermodynamics, and quantum mechanics. A fundamental link among these fields was proposed by Edwin Jaynes in two seminal 1957 papers.^{[1]} Moreover, Jaynes elaborated an interpretation of probability theory as generalized Aristotelian logic, a view very convenient for linking fundamental physics with digital computers, because these are designed to implement the operations of classical logic and, equivalently, of Boolean algebra.^{[2]}
The hypothesis that the universe is a digital computer was pioneered by Konrad Zuse in his book Rechnender Raum (translated into English as Calculating Space). The term digital physics was first employed by Edward Fredkin, who later came to prefer the term digital philosophy.^{[3]} Others who have modeled the universe as a giant computer include Stephen Wolfram,^{[4]} Jürgen Schmidhuber,^{[5]} and Nobel laureate Gerard 't Hooft.^{[6]} These authors hold that the apparently probabilistic nature of quantum physics is not necessarily incompatible with the notion of computability. Quantum versions of digital physics have recently been proposed by Seth Lloyd,^{[7]} David Deutsch, and Paola Zizzi.^{[8]}
Related ideas include Carl Friedrich von Weizsäcker's binary theory of ur-alternatives, pancomputationalism, computational universe theory, John Archibald Wheeler's "It from bit", and Max Tegmark's ultimate ensemble.
Digital physics suggests that there exists, at least in principle, a program for a universal computer which computes the evolution of the universe in real time. The computer could be, for example, a huge cellular automaton (Zuse 1967), or a universal Turing machine, as suggested by Schmidhuber (1997), who pointed out that there exists a very short program that can compute all possible computable universes in an asymptotically optimal way.
Some try to identify single physical particles with simple bits. For example, if one particle, such as an electron, is switching from one quantum state to another, it may be the same as if a bit is changed from one value (0, say) to the other (1). A single bit suffices to describe a single quantum switch of a given particle. As the universe appears to be composed of elementary particles whose behavior can be completely described by the quantum switches they undergo, that implies that the universe as a whole can be described by bits. Every state is information, and every change of state is a change in information (requiring the manipulation of one or more bits). Setting aside dark matter and dark energy, which are poorly understood at present, the known universe consists of about 10^{80} protons and the same number of electrons. Hence, the universe could be simulated by a computer capable of storing and manipulating about 10^{90} bits. If such a simulation is indeed the case, then hypercomputation would be impossible.
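The order-of-magnitude arithmetic above can be checked in a few lines. In the sketch below, the per-particle bit budget is a hypothetical figure, chosen only to bridge the particle count (~10^{80}) and the storage estimate (~10^{90} bits) quoted in the text:

```python
# Back-of-envelope check of the figures quoted above.
protons = 10 ** 80           # approximate number of protons in the known universe
electrons = 10 ** 80         # and roughly the same number of electrons
particles = protons + electrons

# Hypothetical per-particle state budget (an assumption, chosen only to
# bridge the particle count and the ~10^90-bit storage estimate).
bits_per_particle = 10 ** 10
total_bits = particles * bits_per_particle

print(f"particles ~ {particles:.0e}, bits ~ {total_bits:.0e}")
```

Exact integer arithmetic is used throughout, so the estimate does not depend on floating-point range.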
Loop quantum gravity could lend support to digital physics, in that it assumes spacetime is quantized. Paola Zizzi has formulated a realization of this concept in what has come to be called "computational loop quantum gravity", or CLQG^{[9]}^{[10]}. Other theories that combine aspects of digital physics with loop quantum gravity are those of Marzuoli and Rasetti^{[11]}^{[12]} and Girelli and Livine^{[13]}.
Physicist Carl Friedrich von Weizsäcker's theory of ur-alternatives was first proposed in his book Die Einheit der Natur (1971) (translated into English in 1980 as The Unity of Nature) and further developed in his Zeit und Wissen (1992). This theory is a kind of digital physics, as it axiomatically constructs quantum physics from the distinction between empirically observable, binary alternatives. Weizsäcker used his theory to derive the three-dimensionality of space and to estimate the entropy of a proton falling into a black hole.
Pancomputationalism (also known as pan-computationalism or naturalist computationalism) is the view that the universe is a huge computational machine, or rather a network of computational processes which, following fundamental physical laws, computes (dynamically develops) its own next state from the current one.^{[14]}
For instance, in his book Programming the Universe, Seth Lloyd contends that the universe itself is one big quantum computer producing what we see around us, and ourselves, as it runs a cosmic program. According to Lloyd, once we understand the laws of physics completely, we will be able to use small-scale quantum computing to understand the universe completely as well.
A computational universe is also proposed by Jürgen Schmidhuber in a paper based on Konrad Zuse's assumption (1967) that the history of the universe is computable. He pointed out that the simplest explanation of the universe would be a very simple Turing machine programmed to systematically execute all possible programs computing all possible histories for all types of computable physical laws. He also pointed out that there is an optimally efficient way of computing all computable universes based on Leonid Levin's universal search algorithm (1973). In 2000 he expanded this work by combining Ray Solomonoff's theory of inductive inference with the assumption that quickly computable universes are more likely than others. This work on digital physics also led to limit-computable generalizations of algorithmic information or Kolmogorov complexity and the concept of Super Omegas, which are limit-computable numbers that are even more random (in a certain sense) than Gregory Chaitin's number of wisdom Omega.
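The scheduling idea behind Levin search can be illustrated with a toy dovetailer: in phase i, every candidate program of length l ≤ i is given 2^{i−l} steps, so shorter programs get exponentially more time and every halting program eventually finishes with at most a constant-factor slowdown. The sketch below is illustrative only; the "programs" are stand-in Python generators rather than real Turing-machine codes, and the names `dovetail` and `halts_after` are inventions for this example:

```python
# Toy dovetailer in the spirit of Levin's universal search: in phase i,
# each "program" of length l <= i receives a budget of 2**(i - l) steps.
def dovetail(programs, phases):
    """programs: list of (length, generator_factory) pairs.
    Returns a dict marking which programs halted within the given phases."""
    finished = {}
    running = {}
    for i in range(1, phases + 1):
        for idx, (length, factory) in enumerate(programs):
            if length > i or idx in finished:
                continue
            if idx not in running:
                running[idx] = factory()     # start the program lazily
            budget = 2 ** (i - length)
            try:
                for _ in range(budget):
                    next(running[idx])       # one unit of simulated work
            except StopIteration:
                finished[idx] = True         # the program halted
    return finished

# Example: three stand-in programs that halt after different numbers of steps.
def halts_after(n):
    def run():
        for _ in range(n):
            yield
    return run

progs = [(1, halts_after(3)), (2, halts_after(10)), (3, halts_after(100))]
print(dovetail(progs, phases=10))
```

Because each phase doubles every program's cumulative budget, the total time spent before any particular program halts is within a constant factor of running that program alone.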
Following Jaynes and Weizsäcker, the physicist John Archibald Wheeler wrote the following:
It is not unreasonable to imagine that information sits at the core of physics, just as it sits at the core of a computer.
It from bit. Otherwise put, every 'it'—every particle, every field of force, even the spacetime continuum itself—derives its function, its meaning, its very existence entirely—even if in some contexts indirectly—from the apparatus-elicited answers to yes-or-no questions, binary choices, bits. 'It from bit' symbolizes the idea that every item of the physical world has at bottom—a very deep bottom, in most instances—an immaterial source and explanation; that which we call reality arises in the last analysis from the posing of yes–no questions and the registering of equipment-evoked responses; in short, that all things physical are information-theoretic in origin and that this is a participatory universe. (John Archibald Wheeler 1990: 5)
David Chalmers of the Australian National University summarised Wheeler's views as follows:
Wheeler (1990) has suggested that information is fundamental to the physics of the universe. According to this 'it from bit' doctrine, the laws of physics can be cast in terms of information, postulating different states that give rise to different effects without actually saying what those states are. It is only their position in an information space that counts. If so, then information is a natural candidate to also play a role in a fundamental theory of consciousness. We are led to a conception of the world on which information is truly fundamental, and on which it has two basic aspects, corresponding to the physical and the phenomenal features of the world.^{[15]}
Chris Langan also builds upon Wheeler's views in his epistemological metatheory:
The Future of Reality Theory According to John Wheeler: In 1979, the celebrated physicist John Wheeler, having coined the phrase “black hole”, put it to good philosophical use in the title of an exploratory paper, Beyond the Black Hole, in which he describes the universe as a self-excited circuit. The paper includes an illustration in which one side of an uppercase U, ostensibly standing for Universe, is endowed with a large and rather intelligent-looking eye intently regarding the other side, which it ostensibly acquires through observation as sensory information. By dint of placement, the eye stands for the sensory or cognitive aspect of reality, perhaps even a human spectator within the universe, while the eye’s perceptual target represents the informational aspect of reality. By virtue of these complementary aspects, it seems that the universe can in some sense, but not necessarily that of common usage, be described as “conscious” and “introspective”…perhaps even “infocognitive”.^{[16]}
The first formal presentation of the idea that information might be the fundamental quantity at the core of physics seems to be due to Frederick W. Kantor (a physicist from Columbia University). Kantor's book Information Mechanics (Wiley-Interscience, 1977) developed this idea in detail, but without mathematical rigor.
Not every informational approach to physics (or ontology) is necessarily digital. According to Luciano Floridi,^{[17]} "informational structural realism" is a variant of structural realism that supports an ontological commitment to a world consisting of the totality of informational objects dynamically interacting with each other. Such informational objects are to be understood as constraining affordances.
Digital ontology and pancomputationalism are also independent positions. In particular, John Wheeler advocated the former but was silent about the latter; see the quote in the preceding section.
On the other hand, pancomputationalists like Lloyd (2006), who models the universe as a quantum computer, can still maintain an analogue or hybrid ontology; and informational ontologists like Sayre and Floridi embrace neither a digital ontology nor a pancomputationalist position.^{[18]}
Theoretical computer science is founded on the Turing machine, an imaginary computing machine first described by Alan Turing in 1936. While the machine is mechanically simple, the Church–Turing thesis implies that it can solve any "reasonable" problem. (In theoretical computer science, a problem is considered "solvable" if it can be solved in principle, namely in finite time, which is not necessarily a finite time that is of any value to humans.) A Turing machine therefore sets the practical "upper bound" on computational power, apart from the possibilities afforded by hypothetical hypercomputers.
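The mechanical simplicity of a Turing machine can be made concrete: a simulator needs only a transition table, a head position, and an unbounded tape. The sketch below is illustrative (the function `run_tm` and the bit-flipping example machine are inventions for this example, not from the source):

```python
# Minimal Turing machine simulator.
# The transition table maps (state, symbol) -> (new_state, write_symbol, move),
# where move is "R" or "L"; "_" is the blank symbol on an unbounded tape.
from collections import defaultdict

def run_tm(table, tape, state="start", halt="halt", max_steps=10_000):
    cells = defaultdict(lambda: "_", enumerate(tape))  # sparse, unbounded tape
    head = 0
    for _ in range(max_steps):
        if state == halt:
            break
        state, cells[head], move = table[(state, cells[head])]
        head += 1 if move == "R" else -1
    return "".join(cells[i] for i in sorted(cells)).strip("_")

# Example machine: walk right, flipping 0 <-> 1, and halt at the first blank.
flip = {
    ("start", "0"): ("start", "1", "R"),
    ("start", "1"): ("start", "0", "R"),
    ("start", "_"): ("halt", "_", "R"),
}
print(run_tm(flip, "1011"))  # prints "0100"
```

The `max_steps` cap exists only so the simulator itself always terminates; it stands in for the fact that "solvable in principle" places no useful bound on the time required.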
Wolfram's principle of computational equivalence powerfully motivates the digital approach. This principle, if correct, means that everything can be computed by one essentially simple machine, the realization of a cellular automaton. This is one way of fulfilling a traditional goal of physics: finding simple laws and mechanisms for all of nature.
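A cellular automaton of the kind Wolfram studies is easy to realize. The sketch below implements an elementary (one-dimensional, two-state, nearest-neighbor) automaton; the example uses Rule 110, which Matthew Cook proved to be Turing-complete. The function names are inventions for this illustration:

```python
# Elementary cellular automaton: each cell's next state depends on itself and
# its two neighbors; the 8 possible neighborhoods index bits of the rule number.
def step(cells, rule):
    n = len(cells)
    return [
        (rule >> (cells[(i - 1) % n] * 4 + cells[i] * 2 + cells[(i + 1) % n])) & 1
        for i in range(n)  # wrap-around (periodic) boundary
    ]

def evolve(width, steps, rule=110):
    row = [0] * width
    row[width // 2] = 1            # start from a single live cell
    history = [row]
    for _ in range(steps):
        row = step(row, rule)
        history.append(row)
    return history

for row in evolve(31, 8):
    print("".join("#" if c else "." for c in row))
```

Despite the triviality of the update rule, Rule 110 supports universal computation, which is the sense in which "one essentially simple machine" can compute everything computable.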
Digital physics is falsifiable in that a less powerful class of computers cannot simulate a more powerful class. Therefore, if our universe is a gigantic simulation, that simulation is being run on a computer at least as powerful as a Turing machine. If humans succeed in building a hypercomputer, then a Turing machine cannot have the power required to simulate the universe. Hence, progress in quantum computation may have implications for all of physical theory, including cosmology.
The classic Church–Turing thesis claims that any computer as powerful as a Turing machine can, in principle, calculate anything that a human can calculate, given enough time. A stronger version, not attributable to Church or Turing,^{[19]} claims that a universal Turing machine can compute anything whatsoever, so that it is not possible to build a "super-Turing computer" called a hypercomputer. But the limits of practical computation are set by physics, not by theoretical computer science:
"Turing did not show that his machines can solve any problem that can be solved 'by instructions, explicitly stated rules, or procedures', nor did he prove that the universal Turing machine 'can compute any function that any computer, with any architecture, can compute'. He proved that his universal machine can compute any function that any Turing machine can compute; and he put forward, and advanced philosophical arguments in support of, the thesis here called Turing's thesis. But a thesis concerning the extent of effective methods—which is to say, concerning the extent of procedures of a certain sort that a human being unaided by machinery is capable of carrying out—carries no implication concerning the extent of the procedures that machines are capable of carrying out, even machines acting in accordance with 'explicitly stated rules.' For among a machine's repertoire of atomic operations there may be those that no human being unaided by machinery can perform." ^{[20]}
On the other hand, if two further conjectures along these lines are made, the resulting compound principle does bring practical computation within Turing's limits.
As David Deutsch puts it:
"I can now state the physical version of the Church–Turing principle: 'Every finitely realizable physical system can be perfectly simulated by a universal model computing machine operating by finite means.' This formulation is both better defined and more physical than Turing's own way of expressing it."^{[21]} (Emphasis added)
This compound conjecture is sometimes called the "strong Church–Turing thesis" or the Church–Turing–Deutsch principle.
The critics of digital physics—including physicists^{[citation needed]} who work in quantum mechanics—object to it on several grounds.
One objection is that extant models of digital physics are incompatible^{[citation needed]} with the continuous character of several physical symmetries, e.g., rotational symmetry, translational symmetry, time-translation symmetry, Lorentz symmetry, and electroweak symmetry, all central to current physical theory.
Proponents of digital physics claim that such continuous symmetries are only convenient (and very good) approximations of a discrete reality. For example, the reasoning leading to systems of natural units and the conclusion that the Planck length is a minimum meaningful unit of distance suggests that at some level space itself is quantized^{[22]}.
Some argue^{[citation needed]} that extant models of digital physics violate various postulates of quantum physics. For example, if these models are not grounded in Hilbert spaces and probabilities, they belong to the class of theories with local hidden variables that many deem to have been ruled out experimentally by tests of Bell's theorem. This criticism admits two possible answers. First, any notion of locality in the digital model does not necessarily correspond to locality as formulated in the usual way in the emergent spacetime; a concrete example of this case was recently given by Lee Smolin.^{[23]} Another possibility is a well-known loophole in Bell's theorem known as superdeterminism (sometimes referred to as predeterminism).^{[24]} In a completely deterministic model, the experimenter's decision to measure certain components of the spins is predetermined. Thus, the assumption that the experimenter could have decided to measure different components of the spins than he actually did is, strictly speaking, not true.
It has been argued^{[citation needed]} that digital physics, grounded in the theory of finite state machines and hence discrete mathematics, cannot do justice to a physical theory whose mathematics requires the real numbers, which is the case for all physical theories having any credibility.
But computers can manipulate and solve formulas describing real numbers using symbolic computation, thus avoiding the need to approximate real numbers by truncating their digits. A number (in particular a real number, one with infinitely many digits) was defined to be computable if a Turing machine can continue to spit out its digits endlessly; in other words, there is no "last digit". But this sits uncomfortably with any proposal that the universe is the output of a virtual-reality exercise carried out in real time (or any plausible kind of time). Known physical laws (including quantum mechanics and its continuous spectra) are very much infused with real numbers and the mathematics of the continuum.
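The notion of a computable real can itself be made concrete: a real number is computable when some program emits its digits on demand, forever. As an illustration (not from the source), the decimal digits of √2 can be streamed using exact integer arithmetic:

```python
# A computable real as an endless digit stream: floor(sqrt(2) * 10**k),
# computed exactly with integers, has the k-th decimal digit of sqrt(2)
# as its last digit.
from itertools import islice
from math import isqrt

def sqrt2_digits():
    k = 0
    while True:                  # no "last digit": the stream never terminates
        yield isqrt(2 * 10 ** (2 * k)) % 10
        k += 1

print(list(islice(sqrt2_digits(), 8)))  # -> [1, 4, 1, 4, 2, 1, 3, 5]
```

Any finite prefix is produced in finite time, but the number itself is only ever approached, which is exactly the tension with real-time simulation noted above.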
"So ordinary computational descriptions do not have a cardinality of states and state space trajectories that is sufficient for them to map onto ordinary mathematical descriptions of natural systems. Thus, from the point of view of strict mathematical description, the thesis that everything is a computing system in this second sense cannot be supported".^{[25]}
Moreover, the universe seems able to decide on the values of these real-valued quantities in real time, moment by moment. As Richard Feynman put it:
"It always bothers me that, according to the laws as we understand them today, it takes a computing machine an infinite number of logical operations to figure out what goes on in no matter how tiny a region of space, and no matter how tiny a region of time. How can all that be going on in that tiny space? Why should it take an infinite amount of logic to figure out what one tiny piece of space/time is going to do?"^{[26]}
He then answered his own question as follows:
"So I have often made the hypothesis that ultimately physics will not require a mathematical statement, that in the end the machinery will be revealed, and the laws will turn out to be simple, like the checker board with all its apparent complexities. But this speculation is of the same nature as those other people make—'I like it,' 'I don't like it'—and it is not good to be prejudiced about these things".^{[26]}
