The relational approach to quantum physics is an alternative approach to, and interpretation of, quantum mechanics. It asserts that the physical world can be studied accurately only in terms of relationships between systems, since all experimentally verifiable facts about the world result explicitly from interactions (such as the interaction between a light field and a detector). According to the relational approach, the assumption that objects possess absolute properties (such as an absolute particle, independent of any detection frame) inevitably leads to ambiguities and paradoxes when these objects are studied closely. The approach was developed between 1992 and 1996 by Q. Zheng, S. Hughes, and T. Kobayashi at the University of Tokyo.^{[1]} As early as 1985, S. Kochen suggested that the paradoxes of quantum physics could be overcome by developing a relational approach, such as the one once needed to resolve the paradoxes of the relativistic physics of space and time.^{[2]}^{[3]} It is also hoped that this entry will serve as a complement to Rovelli's relational quantum mechanics (RQM).
Historically, the theory of relativity and quantum mechanics were intertwined with each other, and the compatibility of the two theories was a main theme throughout the Bohr–Einstein debate.^{[4]} In both theories, physicists emphasized that only measurable quantities, that is, observables, belong in a theory. Bohr compared his approach to Einstein's theory of relativity and asserted that in the treatment of quantum processes the complementarity of measuring results cannot be ignored, just as in high-speed phenomena the relativity of observation cannot be neglected when simultaneity comes into question. But Einstein replied: "A good joke should not be repeated too often."^{[5]} The debate continued in connection with the Einstein–Podolsky–Rosen (EPR) paradox, and Bohr proposed the relational conception of quantum states.^{[6]} Through their analysis, Bohm and Schumacher concluded that the characteristic feature of this debate was a failure to communicate, due to the absence of full harmony between quantum mechanics and relativity.^{[7]}
Modern attempts to embrace a relational approach within interpretations of quantum mechanics have been made many times, ranging from Everett's relative-state interpretation (Everett, 1957), the sigma algebra of interactive properties (Kochen, 1979), quantum reference systems (Bene, 1992), and the quantum theory of the universe (Smolin, 1995), to relational quantum mechanics (Rovelli, 1996). Each of these, to a greater or lesser degree, emphasizes the relational nature of quantum states. For more information, please refer to the further reading list.
David Bohm was a great physicist, a good teacher, and an independent thinker. He not only influenced those near him, but also continues to enlighten many others around the world through his books. Unlike many textbooks that develop mathematical formulae and techniques for manipulating them, his book The Special Theory of Relativity (Benjamin, New York, 1965) emphasizes an act of understanding beyond the mathematical formalism (e.g., Einstein's relational approach to physics as the principal implication of the Lorentz transformation): physics is a continuation of the process of perception, the task of which is to find what is invariant within some domain. As the domain under investigation is broadened, the older sets of invariant properties come to be viewed as only approximations and limiting cases. In this sense, the concepts of Newtonian time and space are the relatively invariant features that conform with the Galilean transformation, rather than absolutes; and the habitual tendency to carry preconceived ideas into new territory (an effect called "mental set", or the Einstellung effect, in psychology), just as the Lorentz ether theory did, ultimately leads to ambiguity and confusion.
Quantum theory, the other major physics theory of the twentieth century, whose mathematical structure contains Newtonian mechanics as a limiting case, had similar encounters. The Copenhagen interpretation, from its very beginning, was still basically founded on the absolute Newtonian concepts of particle and wave, existing in themselves as permanent substances or entities. In his famous 1927 paper, Heisenberg wrote: "All concepts which can be used in classical theory for the description of a mechanical system can also be defined exactly for atomic processes in analogy to the classical concepts." His basic new step, however, was to study how the measurement of position and momentum depends on the physicality of the apparatus and its irreducible participation in the measurement. To do so, he constructed the famous gedanken microscope experiment to measure very accurately the position of an electron. Heisenberg showed that, when the indivisible quanta of action are taken into account in the measurement process, the uncontrollable disturbance to the electron makes it impossible to assign simultaneously precise values of position and momentum, as regulated by an uncertainty relation. Thus, by considering that the apparatus is part of the physical world and must undergo the same irreducible interaction in order to observe, which in effect disturbs what is to be measured, Heisenberg's interpretation preserved the particle notion within the new quantum framework; that is, it led to a reconciliation. (At this point, compare this with Lorentz's way of trying to reconcile the ether hypothesis with the result of the Michelson–Morley experiment, as discussed in Bohm's book. By considering that the arms of the interferometer were composed of atoms and should undergo the same shift, now called the Lorentz contraction, Lorentz actually proved that no fringe shift could ever be detected by the apparatus of Michelson and Morley.)
The Heisenberg uncertainty principle and his interpretation of the microscope experiment are formulated in terms of the position and momentum of an electron, measured by apparatus that is supposed to exert an irreducible disturbance on the electron. Therefore, the measured values ought to be corrected, to take the effect of this participation into account, before we can know what they really mean. But if the Heisenberg interpretation is right, there can be no way thus to give exact simultaneous values of position and momentum. The simultaneous position and momentum that define a particle in classical dynamics are therefore inherently ambiguous, because they drop out of all observable relationships that can be found in actual measurements and experiments.
The Heisenberg interpretation has therefore also brought about a novel kind of problem, one which goes to the root of the basic notions at the foundation of physics. Just as with the Lorentz ether theory of space and time, the difficulty of the Copenhagen interpretation is not disagreement with experiment; on the contrary, it is in accord with all that has been observed since. The problem is rather that the fundamental concepts entering into the interpretation, such as the notion of particle, are in fact completely ambiguous. For, as we have seen, it was deduced from Heisenberg's uncertainty relation itself that no means at all could ever be found to assign to a particle precise simultaneous values of position and momentum. Indeed, since the complete classical description of a particle cancels out of all observable results, it makes no difference whether such a classical concept of particle is needed in quantum mechanics or not.
According to Einstein's relational approach to physics, however, the resolution of this fundamental ambiguity requires a radical change in thinking: basing ourselves as far as possible on the facts and on hypotheses that are in principle testable. What are these facts? After a thorough analysis of the observational foundations of quantum physics, it can be concluded that all our actual knowledge of physical objects is based on observable relationships established by interaction. To avoid ambiguity in our fundamental notions of physical objects, it is therefore necessary to express the whole content of physical law in terms of such relationships, and not in terms of a particle with intrinsically untestable properties (e.g., simultaneous values of position and momentum) that are inherently ambiguous.
In doing this, one is led to regard a detection event as expressing only an elemental relationship established between a quantized field and a detector in the actual detection process. With such a relational approach to quantum physics, new concepts are then needed to describe physical phenomena, and the mathematical structure of the quantum theory of radiation is viewed as a conceptual map, in the same way as the Minkowski space diagram in Einstein's special relativity, which already has the perspective of the observer implicit in it.
As is well known, Einstein's theory of relativity, which involves a profound analysis of time and space, introduced radical changes, not only in our basic concepts, but also in our modes of physical reasoning. The essence of Einstein's theory was to adopt a relational approach to the notions of time and space,^{[8]} which mathematically can be expressed through the Lorentz spacetime transformations.
Although the mathematical structure of the Lorentz ether theory, which leaves the speed of light in vacuo, c, a universal constant, is equivalent to that of Einstein's theory, there is nevertheless a drastic difference in how the two are conceived. Lorentz, on the one hand, began by retaining the customary concepts of absolute time and space of the older Newtonian mechanics and by considering changes in the observing instruments. The invariant nature of c, as measured in the Michelson–Morley experiment, was successfully explained by the so-called 'Lorentz contraction' of bodies moving through the hypothetical ether. However, this theory led to the difficulty that the exact values of the 'true' distances and times, defined with respect to a detection scheme at rest in the ether, became ambiguous and unknowable. Einstein, on the other hand, commencing with the observed facts, regarded time and space a priori as a certain class of 'coordinates' merely expressing relationships of an event to the measuring instruments. On the basis of a constant speed of light, both time and space become relative concepts, fundamentally dependent on the observer.
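The relational character of the Lorentz transformation, with coordinates changing from frame to frame while the spacetime interval stays invariant, can be sketched numerically. The sample event and units with c = 1 are illustrative assumptions, not values from the text:

```python
import math

def lorentz_boost(t, x, v, c=1.0):
    """Transform event coordinates (t, x) into a frame moving at velocity v."""
    gamma = 1.0 / math.sqrt(1.0 - (v / c) ** 2)
    t_prime = gamma * (t - v * x / c**2)
    x_prime = gamma * (x - v * t)
    return t_prime, x_prime

def interval(t, x, c=1.0):
    """Invariant spacetime interval s^2 = (ct)^2 - x^2."""
    return (c * t) ** 2 - x**2

# One event, described from the rest frame and from a frame boosted to 0.6c
t, x = 3.0, 2.0
tp, xp = lorentz_boost(t, x, v=0.6)

# The coordinates are relational (frame-dependent); the interval is invariant
print((t, x), (tp, xp))
print(interval(t, x), interval(tp, xp))
```

The coordinates differ between the two frames, but the interval agrees, which is precisely the kind of frame-independent relationship Einstein's analysis isolates.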
The development of the quantum formulation early in the twentieth century also led physicists to question the Newtonian concepts of physical objects, such as 'particle' and 'wave', which are basic ideas in all of classical physics. Heisenberg, in his pioneering paper,^{[9]} developed a conceptual framework that in a way retained all the classical concepts, and it plays a great role in the Copenhagen interpretation. His basic new step was to study the disturbance caused by the observing instruments, and for this purpose Heisenberg constructed the famous gedanken microscope experiment to measure very accurately the position of an electron. It was found that, since the indivisible quanta of action must be taken into account in the measurement process, the irreducible disturbance rendered it impossible to assign simultaneously precise values of position and momentum. Consequently, by considering the uncontrollable influence of the observation itself, the notion of particle was preserved within quantum mechanics, and the uncertainty principle was born.
In spite of its successes, however, the Heisenberg theory has also brought about a problem, in a manner similar to the Lorentz theory:^{[10]} the fundamental concepts of the interpretation, e.g., the notion of particle, are in fact completely ambiguous. For it is deduced from the Heisenberg uncertainty principle itself that no means could ever assign to a 'true' particle precise simultaneous values of position and momentum. This has been the object of severe criticism from other famous physicists, such as Einstein, who always believed that even in quantum theory there must exist precisely definable elements or dynamical variables determining the actual behavior of each individual system.^{[11]} In view of this fundamental ambiguity, it seems evident that a careful analysis of the notion of particle based on the actually measured facts is required, in parallel with Einstein's analysis of time.
In a paper published in 1996,^{[12]} Zheng et al. developed a relational approach to wave–particle duality which avoids the ambiguity associated with the Heisenberg theory. They emphasize, in parallel with Einstein's theory of special relativity, that for the proper analysis of quantum optics measurements with different frames of detection, one must consult a conceptual map of events which implicitly takes into account the perspective of the observer. The importance of events in quantum theory has been emphasized recently,^{[13]} and for quantum optics they can be described mathematically in terms of the theory of light detection pioneered by Roy J. Glauber.^{[14]}
The presence of a physical object can be established by interaction, in which detection events serve as relationships between the object and a given class of measuring instruments. In other words, all our actual knowledge of a physical object is based, at least in principle, on experimentally detected relationships between the object and a suitable detector.
In the quantum theory of radiation, the electric field operator in the Coulomb gauge may be written as the sum of positive- and negative-frequency parts, Eq. (1):

E(r, t) = E^{(+)}(r, t) + E^{(-)}(r, t),

where

E^{(-)}(r, t) = [E^{(+)}(r, t)]^{\dagger}.

One may expand E^{(+)}(r, t) in terms of the normal modes as follows:

E^{(+)}(r, t) = i \sum_{k} (\hbar\omega_k / 2\epsilon_0 V)^{1/2} \hat{e}_k a_k e^{i(k \cdot r - \omega_k t)},

where the \hat{e}_k are the unit vectors of polarization and the a_k are annihilation operators; this expansion has the same form as the classical expansion, except that the field amplitudes are now operators.
Glauber has studied the way in which light is detected, and showed that, for an ideal photodetector situated at a point r in a radiation field, the probability of observing a photoionization event in this detector between time t and t + dt is proportional to w(r, t) dt, where, Eq. (2):

w(r, t) = \langle\psi| E^{(-)}(r, t) \cdot E^{(+)}(r, t) |\psi\rangle,

and |\psi\rangle specifies the state of the field. One may consider the one-dimensional propagation problem for the one-photon states constructed by Claude Cohen-Tannoudji et al.:^{[15]}

|\psi\rangle = \sum_{k} c_k a_k^{\dagger} |0\rangle,

and

\sum_{k} |c_k|^2 = 1.

The detection probability propagating along the x direction then becomes:

w(x, t) \propto \left| \sum_{k} c_k e^{i(kx - \omega_k t)} \right|^2.
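This one-dimensional detection probability can be evaluated numerically. A minimal sketch, in which the Gaussian spectrum, the mode grid, and natural units with c = 1 are all illustrative assumptions:

```python
import numpy as np

# Detection probability of a one-photon wave packet in one dimension:
# w(x, t) ∝ |sum_k c_k exp(i(k x - w_k t))|^2, with w_k = c k in vacuum.
c = 1.0                                        # speed of light (natural units)
k = np.linspace(50.0, 150.0, 1001)             # mode wavenumbers
k0, dk = 100.0, 5.0
c_k = np.exp(-(k - k0) ** 2 / (4.0 * dk**2))   # Gaussian spectral amplitudes

x = np.linspace(-5.0, 15.0, 2001)              # photodetector positions

def detection_probability(t):
    # Coherent sum over modes at each detector position, then modulus squared
    phase = np.exp(1j * (np.outer(x, k) - c * k * t))
    return np.abs(phase @ c_k) ** 2

# The peak of the detection probability travels at the signal speed c
for t in (0.0, 5.0, 10.0):
    peak = x[np.argmax(detection_probability(t))]
    print(f"t = {t:4.1f}, peak at x = {peak:.2f}")
```

With the linear vacuum dispersion, the packet of detection probability translates rigidly at speed c, which is the behavior the particle frame of detection registers.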
This probability of observing photoionization in detectors also reproduces the probabilistic wave of quantum phenomena. The Glauber detection theory differs from the Born probabilistic interpretation^{[16]} in that it expresses the meaning of physical law in terms of relationships (counting signals in the detection process), without assuming the particle model of matter. These concepts quite naturally lead to a relational approach to the notion of physical object, and one can say that, in terms of actually measurable counting signals, detection events follow laws of probability.
Here, one does not regard the above result as a deduction from the Heisenberg theory, but as a basic hypothesis which is well established experimentally. It requires no further explanation, e.g., in terms of the disturbance of instruments, but is merely our starting point for further analysis, just as in Einstein's theory of special relativity we start from the fact that the speed of light is a constant.
One can continue by considering the position measurement of an object, in order to see more clearly what this hypothesis implies for the notion of localizability in physics, in a way similar to the discussion of simultaneity in Einstein's theory of special relativity.^{[17]}
In Newtonian mechanics, one can of course mark the position of an object with the aid of a detector. The outcome of a detection in the system (between a detector and an object), i.e., the occurrence of a detection event at a point in space, indicates the position of the object. As far as Newtonian mechanics is concerned, it is assumed that there is in essence only one position corresponding to an object. This implies that, given any detection event at a position, as registered by an accurate detector, all further detection outcomes obtained by the same procedure will be co-located at the same point in space as the first event, for the ensemble of such position measurements. As a result, no detector that carries out the proper detection for the position measurement will ever find that any one of this set of events differs from the others in its location in space. If this is the case, then it makes sense to ascribe a definite position to the object, and to say that the object is localized at a point in space.
In quantum theory, however, the analogous situation, for instance the detection of light, is described by the one-photon states shown above. From a general property of Fourier transforms, a wave packet at a given time t, with a spectral width Δk, implies that a detection event can no longer be localized to a specific point in space (which one would assign as a definite position of an object) but covers a range specified by Δx, where

Δx · Δk ≥ 1/2.
This is a major break with the older ideas, because different detections in the ensemble do not agree on what is the same position for an object. It must be emphasized, however, that whether localizability can be established rests only on an indirect deduction, the result of a statistical ensemble, which expresses the deviation of the detections. Localizability is therefore no longer an immediate fact by which an object can be simplified, as in everyday experience, to a point mass condensed at a spot in space. For it is now seen to depend, to a large extent, on a purely conventional means of taking into account the deviation of the detected signals. This convention seems natural and inevitable to our common sense, but it leads to unambiguous results, i.e., a definite position for a physical object, only under conditions in which Newtonian mechanics is a good approximation. When the characteristic widths Δx and Δk can no longer be neglected, the experimental facts of physics make it clear that the results will depend on the characteristic widths of the problem in question.
It follows from the above discussion that localizability is not an absolute quality of objects; rather, its significance depends on the characteristic widths of the problem under discussion.
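The Fourier reciprocity between the packet width Δx and the spectral width Δk can be checked directly. A minimal sketch, assuming a Gaussian spectrum (which saturates the bound Δx · Δk = 1/2 when widths are taken as standard deviations) and an arbitrarily chosen spectral width:

```python
import numpy as np

def rms_width(u, weight):
    """Standard deviation of coordinate u under (unnormalized) weights."""
    p = weight / weight.sum()
    mean = (u * p).sum()
    return np.sqrt(((u - mean) ** 2 * p).sum())

dk_in = 2.0                                   # chosen spectral width (arbitrary)
k = np.linspace(-20.0, 20.0, 2001)
c_k = np.exp(-k**2 / (4.0 * dk_in**2))        # Gaussian mode amplitudes

x = np.linspace(-5.0, 5.0, 1001)
psi = np.exp(1j * np.outer(x, k)) @ c_k       # wave packet at t = 0

dk = rms_width(k, np.abs(c_k) ** 2)           # spectral width, ~2.0
dx = rms_width(x, np.abs(psi) ** 2)           # packet width,   ~0.25
print(dx * dk)                                # ~0.5, the minimum product
```

Narrowing the spectrum (smaller dk_in) broadens the region over which detection events scatter, and vice versa, which is exactly the trade-off the localizability discussion describes.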
Consequently, although the mathematical structure of the above relational approach is equivalent to that of the Heisenberg theory, which leads to the uncertainty principle, the underlying conceptual framework is vastly different. In the Heisenberg theory one deduces the uncertainty relation as a consequence of the disturbance caused by the observing instruments, irreducibly participating in the observation, and subsequently infers that a causal description is impossible in quantum theory; Δx is then interpreted as the uncertainty of position. By contrast, the relational approach begins with the experimentally well-confirmed hypothesis of the probability of detection events, as actually observed. With this starting point, the above inequality implies that the concept of absolute position is no longer meaningful in quantum theory, where Δx specifies the deviation of detection. Indeed, once it is clear that the absolute position underlying localizability is not valid in quantum mechanics, it immediately follows that new concepts are needed to describe quantum processes, concepts which contain the particle as a limiting case.^{[18]}
The above discussion shows that an outcome of detection (an event) specifies only a relationship between an object and a certain detection; it is not sufficient, however, to consider only the result of an individual detection. The real significance of our detections arises from the fact that the properties of physical objects can be regularized and ordered in terms of frames of detection. For example, in a particle frame of detection for light, one arranges a series of photodetectors along the propagation direction, by which one can define invariant quantities such as the velocity c of light-signal propagation (emission and absorption). This allows one not only to establish a 'trajectory' but also to relate it to a portion of energy, E, and momentum, p (a photon), transferred from the light field to a detector, forming a particle picture (p = E/c).
There also exists a wave frame of detection, in which, for example, light is divided into two paths so as to interfere with itself. To measure and analyze such an effect, one again places an array of detectors on the interference plane, from which one can infer an additional set of quantities such as the frequency, the wavelength, and the phase velocity from the interference fringes; thus one constructs a wave picture. As far as Newtonian mechanics is concerned, however, such a wave frame of detection seems unnecessary, and with the localizability discussed above, it makes sense to ascribe only the concept of particle to the cases investigated in the Newtonian domain.
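A wave frame of detection of this kind can be sketched numerically: an array of detectors records the fringes of a two-path arrangement, and the fringe spacing recovers the wavelength. The two-slit geometry, the 633 nm source, and the small-angle limit are illustrative assumptions:

```python
import numpy as np

# Two interfering paths observed by a detector array on the screen.
wavelength = 633e-9        # m, assumed source wavelength
slit_sep = 0.25e-3         # m, separation of the two paths
screen_dist = 1.0          # m, distance to the detector array

x = np.linspace(-5e-3, 5e-3, 10001)                   # detector positions
delta = slit_sep * x / screen_dist                    # path difference (small angles)
intensity = np.cos(np.pi * delta / wavelength) ** 2   # normalized fringe pattern

# Locate the fringe maxima registered by the detector array
interior = intensity[1:-1]
is_max = (interior > intensity[:-2]) & (interior > intensity[2:])
maxima = x[1:-1][is_max]

# The fringe spacing recovers the wavelength via spacing = λL/d
spacing = np.diff(maxima).mean()
print(spacing, wavelength * screen_dist / slit_sep)
```

The quantities read off here (fringe spacing, hence wavelength and frequency) are exactly the wave-frame observables that have no counterpart in the particle frame of detection.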
Of course, all this experience depends on the condition that the de Broglie wavelength is so small that, on the ordinary scale of distance and time, the wave modulation in the detection can be neglected; this is equivalent to assuming an infinitely small wavelength of matter. When a finite de Broglie wavelength is taken into account, new problems of 'wave–particle duality' do in fact arise, problems which ran through the famous Bohr–Einstein debate and are still a key issue in recent discussions.^{[19]}
In terms of detection frames, the relational approach implies that there is, in fact, no absolute significance to the particle and wave pictures; rather, their meaning is fundamentally dependent on how a frame of detection is constructed, i.e., on the observer. However, this concept of 'relativity' can be expressed in precise quantitative form only by Glauber's theory of light detection, which logically unifies the two pictures of particle and wave.
From the relational viewpoint, physical phenomena in the quantum theory of light detection are described in terms of fields [Eq. (1)] and their detection [Eq. (2)], which are organized, ordered, and structured so as to correspond to the characteristics of the radiation systems being studied. In this theory, de Broglie's concepts are manifested by E^{(\pm)}(r, t), in terms of the annihilation operator a_k (and creation operator a_k^{\dagger}) as field amplitudes modulated by the phase factors e^{i(k \cdot r - \omega_k t)} (and the conjugate e^{-i(k \cdot r - \omega_k t)}). The key point that we wish to establish is that Eq. (1) contains information concerning the propagation properties of light in both the particle and wave frames of detection. On the one hand, the propagation characteristics of the operators a_k and a_k^{\dagger}, which physically describe the absorption and emission of light, indicate a particle frame of detection in which the light signal travels at the speed c. On the other hand, the phase factor e^{i(k \cdot r - \omega_k t)} implies a wave frame of detection, regulated by interference effects in the detection. It seems clear, then, that in the quantum theory of light detection the particle and wave pictures are united as two sets of relative features of the same field in different frames of detection; thus they can be related to each other in such a way that Eq. (1) is left invariant (the principle of relativity). This unification can be characterized by the term particle-wave, rather than 'particle or/and wave', the hyphen emphasizing the new kind of unification.
It should be noted that, in spite of the above-described unification of the particle and wave pictures brought about by the quantum theory of detection, there remains a rather important and peculiar distinction between them, resulting from the fact that a_k and a_k^{\dagger} are operators while the phase factors e^{\pm i(k \cdot r - \omega_k t)} are c-numbers. On the basis of this distinction, it is also clear that modulation by the phase factors in the probability expression of Eq. (2) at a velocity greater than c (the phase velocity), for example in de Broglie matter systems, in no way confuses the maximum speed of propagation of signals, provided that signal propagation is physically described by the annihilation and creation operators a_k and a_k^{\dagger}.
One can conclude that the Newtonian analysis of the world into constituent objects has been replaced by a kind of interactive pattern between the fields and their detection by the observer. This approach avoids much of the confusion surrounding wave–particle duality, if we regard the quantum theory of light detection as a kind of conceptual map of the events of the world, in a manner similar to the Minkowski diagram in Einstein's theory of special relativity.^{[20]}
Because of the relativistic unification of the particle and wave pictures in the single expression of Eq. (1), there appears an illusion that the two pictures coexist. A little reflection shows, however, that this view of the quantum theory of light detection is very far from the truth. Consider, for example, an observer who wants to measure the speed of a light signal: they must construct a particle frame of detection that registers where, and when, a light signal is emitted and then absorbed (we note that the propagation of a light signal is in fact what Einstein studied in developing his theory of special relativity). Such an observer cannot survey the whole of Eq. (1); they can obtain only the propagation details of the operators a_k and a_k^{\dagger}. The exact information of the phase factor e^{i(k \cdot r - \omega_k t)} is therefore unknown to this observer; for that, an interference experiment is required.
Thus, the quantum theory of light detection can be envisioned as a conceptual map, having an invariant structure, containing the 'real' set of fields and their detection which can be observed experimentally. "In all maps (conceptual or otherwise) there arises the need for the user to locate and orient himself by seeing which point on the map represents his position and which line represents the direction in which he is looking".^{[21]} In doing this, one recognizes that every act of actualization yields a unique perspective on the world. But with the aid of the quantum theory of light detection, one can relate what is seen from one perspective (the particle frame) to what is seen from another (the wave frame). In this way one can abstract out what is invariant under a change of perspective, which leads to an ever-improving knowledge and understanding of the actual character of the radiation system under investigation. Therefore, when an observer, performing experiments with different frames of detection, is to understand what is observed, he need not puzzle over which view is 'correct' and which is 'wrong' (wave or particle). Rather, he consults the map provided by Eq. (1), and tries to come to a common understanding of why each way of detecting the same field yields a different perspective. Different frames may be related to one another, for example, by employing the de Broglie relation, p = h / λ.
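The de Broglie relation that links the two frames can be made concrete with a short numerical example. The 633 nm wavelength is an illustrative choice, not a value from the text:

```python
# Relating a wave-frame observable (wavelength λ) to particle-frame
# observables (momentum p, energy E) for light, via p = h/λ and E = pc.
h = 6.62607015e-34       # Planck constant, J·s (exact SI value)
c = 2.99792458e8         # speed of light, m/s (exact SI value)

wavelength = 633e-9              # m, inferred from interference fringes
p = h / wavelength               # kg·m/s, the particle-frame momentum
E = p * c                        # J, energy transferred per detection event
nu = c / wavelength              # Hz, the corresponding frequency

# Consistency of the two frames: E = pc and E = hν describe the same quanta
print(f"p = {p:.4e} kg·m/s")
print(f"E = {E:.4e} J, hν = {h * nu:.4e} J")
```

The same detection events are thus ordered by wavelength in the wave frame and by momentum and energy in the particle frame, with p = h/λ providing the dictionary between the two.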
In summary, according to Einstein, no 'absolute' notion is necessary in physics; rather, the task of physics is the study of relationships that are in principle observable. Zheng et al. have developed a similar relational approach to wave–particle duality, whereby the notion of localizability, which applies to the particle as simultaneity applies to time, is re-examined critically on the basis of the detection facts. On the basis of the observed probabilistic law of detection events, one realizes that the concept of absolute position is no longer valid in quantum theory. Consequently, by an analysis of the particle and wave notions in terms of frames of detection (cf. the space and time concepts in terms of frames of reference), the quantum theory of light detection is shown to express, in precise quantitative form, the relativity implied between the particle and wave pictures. The significance of this formalism in quantum physics is as a conceptual map of the events of the world, similar to the Minkowski diagram in the theory of special relativity, which contains the perspective of the observer implicitly. Thus, whether we consider what is seen by different observers or by the same observer in different frames, it is always necessary to relate the results of all these observations by referring them to a particle-wave map with the correct structure. In this way one can understand what is invariant and therefore not dependent on the special perspective of each observer.
