MANY WORLDS FAQ
Q0 Why this FAQ?
Q1 Who believes in many-worlds?
Q2 What is many-worlds?
Q3 What are the alternatives to many-worlds?
Q4 What is a "world"?
Q5 What is a measurement?
Q6 Why do worlds split? What is decoherence?
Q7 When do worlds split?
Q8 When does Schrodinger's cat split?
Q9 What is sum-over-histories?
Q10 What is many-histories? What is the environment basis?
Q11 How many worlds are there?
Q12 Is many-worlds a local theory?
Q13 Is many-worlds a deterministic theory?
Q14 Is many-worlds a relativistic theory? What about quantum field theory? What about quantum gravity?
Q15 Where are the other worlds?
Q16 Is many-worlds (just) an interpretation?
Q17 Why don't worlds fuse, as well as split? Do splitting worlds imply irreversible physics?
Q18 What retrodictions does many-worlds make?
Q19 Do worlds differentiate or split?
Q20 What is many-minds?
Q21 Does many-worlds violate Ockham's razor?
Q22 Does many-worlds violate conservation of energy?
Q23 How do probabilities emerge within many-worlds?
Q24 Does many-worlds allow free-will?
Q25 Why am I in this world and not another? Why does the universe appear random?
Q26 Can wavefunctions collapse?
Q27 Is physics linear? Could we ever communicate with the other worlds? Why do I only ever experience one world? Why am I not aware of the world (and myself) splitting?
Q28 Can we determine what other worlds there are? Is the form of the Universal Wavefunction knowable?
Q29 Who was Everett?
Q30 What are the problems with quantum theory?
Q31 What is the Copenhagen interpretation?
Q32 Does the EPR experiment prohibit locality? What about Bell's Inequality?
Q33 Is Everett's relative state formulation the same as many-worlds?
Q34 What is a relative state?
Q35 Was Everett a "splitter"?
Q36 What unique predictions does many-worlds make?
Q37 Could we detect other Everett-worlds?
Q38 Why quantum gravity?
Q39 Is linearity exact?
Q41 Why can't the boundary conditions be updated to reflect my observations in this one world?
A1 References and further reading
A2 Quantum mechanics and Dirac notation
This FAQ shows how quantum paradoxes are resolved by the "many-worlds" interpretation or metatheory of quantum mechanics. This FAQ does not seek to prove that the many-worlds interpretation is the "correct" quantum metatheory, merely to correct some of the common errors and misinformation on the subject floating around.
As a physics undergraduate I was struck by the misconceptions of my tutors about many-worlds, despite the fact that it seemed to resolve all the paradoxes of quantum theory [A]. The objections raised to many-worlds were either patently misguided [B] or beyond my ability to assess at the time [C], which made me suspect (confirmed during my graduate QFT studies) that the more sophisticated rebuttals were also invalid. I hope this FAQ will save other investigators from being led astray by authoritative statements from mentors.
I have attempted, in the answers, to translate the precise mathematics of quantum theory into woolly and ambiguous English - I would appreciate any corrections. In one or two instances I couldn't avoid using some mathematical (Dirac) notation, in particular in describing the Einstein-Podolsky-Rosen (EPR) experiment and Bell's Inequality and in showing how probabilities are derived, so I've included an appendix on the Dirac notation.
[A] See"Does the EPR experiment prohibit locality?", "What about Bell's Inequality?" and "When does Schrodinger's cat split?" for how many- worlds handles the most quoted paradoxes.
[B] Sample objection: "Creation of parallel universes violates energy conservation/Ockham's razor". (See "Does many-worlds violate conservation of energy?" and "What about quantum field theory?")
"Political scientist" L David Raub reports a poll of 72 of the "leading cosmologists and other quantum field theorists" about the "Many-Worlds Interpretation" and gives the following response breakdown [T].
1) "Yes, I think MWI is true" 58% 2) "No, I don't accept MWI" 18% 3) "Maybe it's true but I'm not yet convinced" 13% 4) "I have no opinion one way or the other" 11%
Amongst the "Yes, I think MWI is true" crowd listed are Stephen Hawking and Nobel Laureates Murray Gell-Mann and Richard Feynman. Gell-Mann and Hawking recorded reservations with the name "many-worlds", but not with the theory's content. Nobel Laureate Steven Weinberg is also mentioned as a many-worlder. It is not stated when the poll was conducted, but presumably it was before 1988 (when Feynman died). The only "No, I don't accept MWI" named is Penrose.
The findings of this poll are in accord with other polls: many-worlds is most popular amongst scientists who may, rather loosely, be described as string theorists or quantum gravitists/cosmologists. It is less popular amongst the wider scientific community, who mostly remain in ignorance of it.
More detail on Weinberg's views can be found in _Dreams of a Final Theory_ or in "Life in the Universe", Scientific American (October 1994); in the latter Weinberg says about quantum theory:
"The final approach is to take the Schrodinger equation seriously [..description of the measurement process..] In this way, a measurement causes the history of the universe for practical purposes to diverge into different non-interfering tracks, one for each possible value of the measured quantity. [...] I prefer this last approach"
In The Quark and the Jaguar and Quantum Mechanics in the Light of Quantum Cosmology [10] Gell-Mann describes himself as an adherent to the (post-)Everett interpretation, although his exact meaning is sometimes left ambiguous.
Stephen Hawking is well known as a many-worlds fan and says, in an article on quantum gravity [H], that measurement of the gravitational metric tells you which branch of the wavefunction you're in, and references Everett.
Feynman, apart from the evidence of the Raub poll directly favouring the Everett interpretation, always emphasized to his lecture students [F] that the "collapse" process could only be modelled by the Schrodinger wave equation (Everett's approach).
[F] Jagdish Mehra The Beat of a Different Drum: The Life and Science of Richard Feynman
[H] Stephen W Hawking Black Holes and Thermodynamics Physical Review D Vol 13 #2 191-197 (1976)
[T] Frank J Tipler The Physics of Immortality 170-171
AKA the Everett, relative-state, many-histories or many-universes interpretation or metatheory of quantum theory. Dr Hugh Everett, III, its originator, called it the "relative-state metatheory" or the "theory of the universal wavefunction" [1], but it is generally called "many-worlds" nowadays, after DeWitt [4a],[5].
Many-worlds comprises two assumptions and some consequences. The assumptions are quite modest:
1) The metaphysical assumption: That the wavefunction does not merely encode all the information about an object, but has an observer-independent objective existence and actually is the object. For a non-relativistic N-particle system the wavefunction is a complex-valued field in a 3N-dimensional space.
2) The physical assumption: The wavefunction obeys the empirically derived standard linear deterministic wave equations at all times. The observer plays no special role in the theory and, consequently, there is no collapse of the wavefunction. For non-relativistic systems the Schrodinger wave equation is a good approximation to reality. (See "Is many-worlds a relativistic theory?" for how the more general case is handled with quantum field theory or third quantisation.)
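For concreteness, this is the textbook N-particle Schrodinger equation referred to in assumption 2) (a standard display, added here for reference; it is not an extra assumption):

    i\hbar \frac{\partial}{\partial t}\psi(\mathbf{r}_1,\ldots,\mathbf{r}_N,t)
      = \Big[ -\sum_{j=1}^{N}\frac{\hbar^2}{2m_j}\nabla_j^2
              + V(\mathbf{r}_1,\ldots,\mathbf{r}_N) \Big]
        \psi(\mathbf{r}_1,\ldots,\mathbf{r}_N,t)

Note that psi is a single complex field on the 3N-dimensional configuration space, not N separate fields on ordinary 3-space; this is the field that assumption 1) asserts has objective existence.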
The rest of the theory is just working out consequences of the above assumptions. Measurements and observations by a subject on an object are modelled by applying the wave equation to the joint subject-object system. Some consequences are:

1) That each measurement causes a decomposition or decoherence of the universal wavefunction into non-interacting and mostly non-interfering branches, histories or worlds. (See "What is decoherence?") The histories form a branching tree which encompasses all the possible outcomes of each interaction. (See "Why do worlds split?" and "When do worlds split?") Every historical what-if compatible with the initial conditions and physical law is realised.

2) That the conventional statistical Born interpretation of the amplitudes in quantum theory is derived from within the theory rather than having to be assumed as an additional axiom. (See "How do probabilities emerge within many-worlds?")
Many-worlds is a re-formulation of quantum theory [1], published in 1957 by Dr Hugh Everett III [2], which treats the process of observation or measurement entirely within the wave-mechanics of quantum theory, rather than as an additional input assumption, as in the Copenhagen interpretation. Everett considered the wavefunction a real object. Many-worlds is a return to the classical, pre-quantum view of the universe in which all the mathematical entities of a physical theory are real. For example the electromagnetic fields of James Clerk Maxwell or the atoms of Dalton were considered as real objects in classical physics. Everett treats the wavefunction in a similar fashion. Everett also assumed that the wavefunction obeyed the same wave equation during observation or measurement as at all other times. This is the central assumption of many-worlds: that the wave equation is obeyed universally and at all times.
Everett discovered that the new, simpler theory - which he named the "relative state" formulation - predicts that interactions between two (or more) macrosystems typically split the joint system into a superposition of products of relative states. The states of the macrosystems are, after the subsystems have jointly interacted, henceforth correlated with, or dependent upon, each other. Each element of the superposition - each a product of subsystem states - evolves independently of the other elements in the superposition. The states of the macrosystems, by becoming correlated or entangled with each other, are impossible to understand in isolation from each other and must be viewed as one composite system. It is no longer possible to speak of the state of one (sub)system in isolation from the other (sub)systems. Instead we are forced to deal with the states of subsystems relative to each other. Specifying the state of one subsystem leads to a unique specification of the state (the "relative state") of the other subsystems. (See "What is a relative state?")
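In Dirac notation (a standard textbook illustration, not a quotation from Everett): if an object system S interacts with a measuring system M then the joint state after the interaction typically has the entangled form

    |\Psi\rangle = \sum_i c_i \, |i\rangle_S \, |A_i\rangle_M

where |A_i\rangle is the apparatus state that has registered outcome i. Neither S nor M has a state of its own; but specifying that S is in state |i\rangle picks out the unique relative state |A_i\rangle of M, and vice versa.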
If one of the systems is an observer and the interaction an observation then the effect of the observation is to split the observer into a number of copies, each copy observing just one of the possible results of a measurement and unaware of the other results and all its observer-copies. Interactions between systems and their environments, including communication between different observers in the same world, transmit the correlations that induce local splitting or decoherence into non-interfering branches of the universal wavefunction. Thus the entire world is split, quite rapidly, into a host of mutually unobservable but equally real worlds.
According to many-worlds all the possible outcomes of a quantum interaction are realised. The wavefunction, instead of collapsing at the moment of observation, carries on evolving in a deterministic fashion, embracing all possibilities embedded within it. All outcomes exist simultaneously but do not interfere further with each other, each single prior world having split into mutually unobservable but equally real worlds.
I am aware of no other quantum theory, besides many-worlds, that is scientific, in the sense of providing a reductionist model of reality, and free of internal inconsistencies. Briefly, here are the defects of the most popular alternatives:
1) Copenhagen Interpretation. Postulates that the observer obeys different physical laws than the non-observer, which is a return to vitalism. The definition of an observer varies from one adherent to another, if present at all. The status of the wavefunction is also ambiguous. If the wavefunction is real the theory is non-local (not fatal, but unpleasant). If the wavefunction is not real then the theory supplies no model of reality. (See "What are the problems with quantum theory?")
2) Hidden Variables [B]. Explicitly non-local. Bohm accepts that all the branches of the universal wavefunction exist. Like Everett, Bohm held that the wavefunction is a real, complex-valued field which never collapses. In addition Bohm postulated that there were particles that move under the influence of a non-local "quantum-potential" derived from the wavefunction (in addition to the classical potentials which are already incorporated into the structure of the wavefunction). The action of the quantum-potential is such that the particles are affected by only one of the branches of the wavefunction. (Bohm derives what is essentially a decoherence argument to show this, see section 7, #I [B].)
The implicit, unstated assumption made by Bohm is that only the single branch of the wavefunction associated with particles can contain self-aware observers, whereas Everett makes no such assumption. Most of Bohm's adherents do not seem to understand (or even be aware of) Everett's criticism, section VI [1], that the hidden-variable particles are not observable, since the wavefunction alone is sufficient to account for all observations and hence is a model of reality. The hidden-variable particles can be discarded, along with the guiding quantum-potential, yielding a theory isomorphic to many-worlds, without affecting any experimental results.
[B] David J Bohm A suggested interpretation of the quantum theory in terms of "hidden variables" I and II Physical Review Vol 85 #2 166-193 (1952)
3) Quantum Logic. Undoubtedly the most extreme of all attempts to solve the QM measurement problem. Apart from abandoning one or other of the classical tenets of logic these theories are all unfinished (presumably because of internal inconsistencies). Also it is unclear how and why different types of logic apply on different scales.
4) Extended Probability [M]. A bold theory in which the concept of probability is "extended" to include complex values [Y]. Whilst quite daring, I am not sure if this is logically permissible, being in conflict with the relative frequency notion of probability, in which case it suffers from the same criticism as quantum logic. Also it is unclear, to me anyway, how the resultant notion of "complex probability" differs from the quantum "probability amplitude" and thus why we are justified in collapsing the complex-valued probability as if it were a classical, real-valued probability.
[M] W Muckenheim A review of extended probabilities Physics Reports Vol 133 339- (1986)
[Y] Saul Youssef Quantum Mechanics as Complex Probability Theory hep-th/9307019
5) Transactional model [C]. Explicitly non-local. An imaginative theory, based on the Feynman-Wheeler absorber-emitter model of EM, in which advanced and retarded probability amplitudes combine into an atemporal "transaction" to form the Born probability density. It requires that the input and output states, as defined by an observer, act as emitters and absorbers respectively, but not any internal states (inside the "black box"), and, consequently, suffers from the familiar measurement problem of the Copenhagen interpretation.
If the internal states did act as emitters/absorbers then the wavefunction would collapse, for example, around one of the double slits (an internal state) in the double slit experiment, destroying the observed interference fringes. In transaction terminology a transaction would form between the first single slit and one of the double slits and another transaction would form between the same double slit and the point on the screen where the photon lands. This is never observed.
[C] John G Cramer The transactional interpretation of quantum mechanics Reviews of Modern Physics Vol 58 #3 647-687 (1986)
6) Many-minds. Despite its superficial similarities with many-worlds this is actually a very unphysical, non-operational theory. (See "What is many-minds?")
7) Non-linear theories in general. So far no non-linear theory has any accepted experimental support, whereas many have failed experiment. (See "Is physics linear?") Many-worlds predicts that non-linear theories will always fail experiment. (See "Is linearity exact?")
Loosely speaking a "world" is a complex, causally connected, partially or completely closed set of interacting sub-systems which don't significantly interfere with other, more remote, elements in the superposition. Any complex system and its coupled environment, with a large number of internal degrees of freedom, qualifies as a world. An observer, with internal irreversible processes, counts as a complex system. In terms of the wavefunction, a world is a decohered branch of the universal wavefunction, which represents a single macrostate. (See "What is decoherence?") The worlds all exist simultaneously in a non-interacting linear superposition.
Sometimes "worlds" are called "universes", but more usually the latter is reserved the totality of worlds implied by the universal wavefunction. Sometimes the term "history" is used instead of "world". (Gell-Mann/Hartle's phrase, see"What is many-histories?").
A measurement is an interaction, usually irreversible, between subsystems that correlates the value of a quantity in one subsystem with the value of a quantity in the other subsystem. The interaction may trigger an amplification process within one object or subsystem with many internal degrees of freedom, leading to an irreversible high-level change in the same object. If the course of the amplification is sensitive to the initial interaction then we can designate the system containing the amplified process as the "measuring apparatus", since the trigger is sensitive to some (often microphysical) quantity or parameter of one of the other subsystems, which we designate the "object" system. E.g. the detection of a charged particle (the object) by a Geiger counter (the measuring apparatus) leads to the generation of a "click" (high-level change). The absence of a charged particle does not generate a click. The interaction is with those elements of the charged particle's wavefunction that pass between the charged detector plates, triggering the amplification process (an irreversible electron cascade or avalanche), which is ultimately converted to a click.
A measurement, by this definition, does not require the presence of a conscious observer, only of irreversible processes.
Worlds, or branches of the universal wavefunction, split when different components of a quantum superposition "decohere" from each other [7a], [7b], [10]. Decoherence refers to the loss of coherency or absence of interference effects between the elements of the superposition. For two branches or worlds to interfere with each other all the atoms, subatomic particles, photons and other degrees of freedom in each world have to be in the same state, which usually means they all must be in the same place or significantly overlap in both worlds, simultaneously.
For small microscopic systems it is quite possible for all their atomic components to overlap at some future point. In the double slit experiment, for instance, it only requires that the divergent paths of the diffracted particle overlap again at some space-time point for an interference pattern to form, because only the single particle has been split.
Such future coincidence of positions in all the components is virtually impossible in more complex, macroscopic systems because all the constituent particles have to overlap with their counterparts simultaneously. Any system complex enough to be described by thermodynamics and exhibit irreversible behaviour is a system complex enough to exclude, for all practical purposes, any possibility of future interference between its decoherent branches. An irreversible process is one in, or linked to, a system with a large number of internal, unconstrained degrees of freedom. Once the irreversible process has started, alterations of the values of the many degrees of freedom leave an imprint which can't be removed. If we try to intervene to restore the original status quo the intervention causes more disruption elsewhere.
In QM jargon we say that the components (or vectors in the underlying Hilbert state space) have become permanently orthogonal due to the complexity of the systems increasing the dimensionality of the vector space, where each unconstrained degree of freedom contributes a dimension to the state vector space. In a high dimension space almost all vectors are orthogonal, without any significant degree of overlap. Thus vectors for complex systems, with a large number of degrees of freedom, naturally decompose into mutually orthogonal components which, because they can never significantly interfere again, are unaware of each other. The complex system, or world, has split into different, mutually unobservable worlds.
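Both halves of this argument are easy to illustrate numerically. The sketch below (my own illustration, with made-up parameters; not part of the FAQ's argument) shows (a) that the total overlap between two branches is the product of the per-particle overlaps, which vanishes for macroscopic particle numbers, and (b) that independently chosen unit vectors in a d-dimensional Hilbert space have a mean squared overlap of only 1/d:

  import math
  import numpy as np

  # (a) Total branch overlap = product of per-particle overlaps.
  eps = 1e-6                       # hypothetical per-particle overlap deficit
  for n in (10, 10**3, 10**6, 10**9):
      log10_total = n * math.log10(1 - eps)
      print(f"{n:>10} particles: total overlap ~ 10^{log10_total:.1f}")

  # (b) Random unit vectors in d dimensions are nearly orthogonal:
  # the mean squared overlap is 1/d.
  rng = np.random.default_rng(0)

  def random_state(d):
      v = rng.normal(size=d) + 1j * rng.normal(size=d)
      return v / np.linalg.norm(v)

  for d in (2, 100, 10**4):
      mean_sq = np.mean([abs(np.vdot(random_state(d), random_state(d))) ** 2
                         for _ in range(100)])
      print(f"d = {d:>5}: mean |<a|b>|^2 ~ {mean_sq:.1e}   (1/d = {1 / d:.1e})")

A per-particle overlap of 1 - 10^-6 drops to about 10^-434 for a billion particles, and the random-vector overlap shrinks as 1/d: macroscopic branches are orthogonal for all practical purposes.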
According to thermodynamics each activated degree of freedom acquires kT of energy. This works the other way around as well: the release of approximately kT of energy increases the state-space dimensionality. Even the small amounts of energy released by an irreversible frictive process are large on this scale, increasing the size of the associated Hilbert space.
Contact between a system and a heat sink is equivalent to increasing the dimensionality of the state space, because the description of the system has to be extended to include all parts of the environment in causal contact with it. Contact with the external environment is a very effective destroyer of coherency. (See "What is the environment basis?")
Worlds irrevocably "split" at the sites of measurement-like interactions associated with thermodynamically irreversible processes. (See "What is a measurement?") An irreversible process will always produce decoherence, which splits worlds. (See "Why do worlds split?", "What is decoherence?" and "When does Schrodinger's cat split?" for a concrete example.)
In the example of a Geiger counter and a charged particle, after the particle has passed the counter one world contains the clicked counter and that portion of the particle's wavefunction which passed through the detector. The other world contains the unclicked counter and the particle's wavefunction with a "shadow", cast by the counter, taken out of it.
The Geiger counter splits when the amplification process becomes irreversible, before the click is emitted. (See "What is a measurement?") The splitting is local (originally in the region of the Geiger counter in our example) and is transmitted causally to more distant systems. (See "Is many-worlds a local theory?" and "Does the EPR experiment prohibit locality?") The precise moment/location of the split is not sharply defined, due to the subjective nature of irreversibility, but can be considered complete when much more than kT of energy has been released in an uncontrolled fashion into the environment. At this stage the event has become irreversible.
In the language of thermodynamics the amplification of the charged particle's presence by the Geiger counter is an irreversible event. These events cause the decoherence of the different branches of the wavefunction. (See "What is decoherence?" and "Why do worlds split?") Decoherence occurs when irreversible macro-level events take place and the object no longer admits a single macrostate description. (A macrostate, in brief, is the description of an object in terms of accessible external characteristics.)
The advantage of linking the definition of worlds and the splitting process with thermodynamics is that the splitting process becomes irreversible and only permits forward-time branching, following the increase of entropy. (See "Why don't worlds fuse, as well as split?") Like all irreversible processes, though, there are exceptions even at the coarse-grained level and worlds will occasionally fuse. A necessary, although not sufficient, precondition for fusing is that all records, memories etc. that discriminate between the pre-fused worlds or histories be lost. This is not a common occurrence.
Consider Schrodinger's cat. A cat is placed in a sealed box with a device that releases a lethal dose of cyanide if a certain radioactive decay is detected. For simplicity we'll imagine that the box, whilst closed, completely isolates the cat from its environment. After a while an investigator opens the box to see if the cat is alive or dead. According to the Copenhagen Interpretation the cat was neither alive nor dead until the box was opened, whereupon the wavefunction of the cat collapsed into one of the two alternatives (alive or dead cat). The paradox, according to Schrodinger, is that the cat presumably knew if it was alive *before* the box was opened. According to many-worlds the device was split into two states (cyanide released or not) by the radioactive decay, which is a thermodynamically irreversible process. (See "When do worlds split?" and "Why do worlds split?") As the cyanide/no-cyanide interacts with the cat, the cat is split into two states (dead or alive). From the surviving cat's point of view it occupies a different world from its deceased copy. The onlooker is split into two copies only when the box is opened and they are affected by the state of the cat.
The cat splits when the device is triggered, irreversibly. The investigator splits when they open the box. The alive cat has no idea that the investigator has split, any more than it is aware that there is a dead cat in the neighbouring split-off world. The investigator can deduce, after the event, by examining the cyanide mechanism, or the cat's memory, that the cat split prior to opening the box.
The sum-over-histories or path-integral formalism of quantum mechanics was developed by Richard Feynman in the 1940s [F] as a third interpretation of quantum mechanics, alongside Schrodinger's wave picture and Heisenberg's matrix mechanics, for calculating transition amplitudes. All three approaches are mathematically equivalent, but the path-integral formalism offers some interesting additional insights into many-worlds.
In the path-integral picture the wavefunction of a single particle at (x',t') is built up of contributions of all possible paths from (x,t), where each path's contribution is weighted by a (phase) factor of exp(i*Action[path]/hbar) * wavefunction at (x,t), summed, in turn, over all values of x. The Action[path] is the time-integral of the Lagrangian (roughly: the Lagrangian equals the kinetic minus the potential energy) along the path from (x,t) to (x',t'). The final expression is thus the sum or integral over all paths, irrespective of any classical dynamical constraints. For N-particle systems the principle is the same, except that the paths run through a 3N-dimensional space.
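In conventional notation the rule just described is the standard propagator formula (a textbook display, added for reference):

    \psi(x',t') = \int dx\, K(x',t';x,t)\,\psi(x,t), \qquad
    K(x',t';x,t) = \int \mathcal{D}[x(\tau)]\, e^{\,i S[x(\tau)]/\hbar}

where the action S[x(\tau)] = \int_t^{t'} L(x,\dot{x})\,d\tau is evaluated along each path and the functional integral runs over all paths from (x,t) to (x',t').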
In the path-integral approach every possible path through configuration space makes a contribution to the transition amplitude. From this point of view the particle explores every possible intermediate configuration between the specified start and end states. For this reason the path-integral technique is often referred to as "sum-over-histories". Since we do not occupy a privileged moment in history it is natural to wonder if alternative histories are contributing equally to transition amplitudes in the future, and whether each possible history has an equal reality. Perhaps we shouldn't be surprised that Feynman is on record as believing in many-worlds. (See "Who believes in many-worlds?") What is surprising is that Everett developed his many-worlds theory entirely from the Schrodinger viewpoint without any detectable influence from Feynman's work, despite Feynman and Everett sharing the same Princeton thesis supervisor, John A Wheeler.
Feynman developed his path-integral formalism further during his work on quantum electrodynamics, QED, in parallel with Schwinger and Tomonaga, who had developed a less visualisable form of QED. Dyson showed that these approaches were all equivalent. Feynman, Schwinger and Tomonaga were awarded the 1965 Physics Nobel Prize for this work. Feynman's approach was to show how any process, with defined in (initial) and out (final) states, can be represented by a series of (Feynman) diagrams, which allow for the creation, exchange and annihilation of particles. Each Feynman diagram represents a different contribution to the complete transition amplitude, provided that the external lines map onto the required initial and final boundary conditions (the defined in and out states). QED became the prototype for all the other, later, field theories like electro-weak and quantum chromodynamics.
[F] Richard P Feynman Space-time approach to non-relativistic quantum mechanics Reviews of Modern Physics, Vol 20: 267-287 (1948)
There is considerable linkage between thermodynamics and many-worlds, explored in the "decoherence" views of Zurek [7a], [7b], Gell-Mann and Hartle [10], Everett [1], [2] and others [4b]. (See "What is decoherence?")
Gell-Mann and Hartle, in particular, have extended the role of decoherence in defining the Everett worlds, or "histories" in their nomenclature. They call their approach the "many-histories" approach, where each "coarse-grained or classical history" is associated with a unique time-ordered sequence of sets of irreversible events, including measurements, records, observations and the like. (See "What is a measurement?") Fine-grained histories effectively relax the irreversibility criterion. Mathematically the many-histories approach is isomorphic to Everett's many-worlds.
The worlds split or "decohere" from each other when irreversible events occur. (See "Why do worlds split?" and "When do worlds split?") Correspondingly many-histories defines a multiply-connected hierarchy of classical histories, where each classical history is a "child" of any parent history which has only a subset of the child's defining irreversible events, and a parent of any history which has a superset of such events. Climbing up the tree from child to parent moves to progressively coarser grained consistent histories until eventually the top is reached, where the history has no defining events (and is thus consistent with everything!). This is Everett's universal wavefunction. The bottom of the coarse-grained tree terminates with the maximally refined set of decohering histories. The classical histories each have a probability assigned to them, and probabilities are additive in the sense that the sum of the probabilities associated with a set of classical histories is equal to the probability associated with the unique parent history defined by the set. (Below the maximally refined classical histories are the fine-grained or quantum histories, where probabilities are no longer additive and different histories significantly interfere with each other. The bottom level consists of complete microstates, which are fully specified states.)
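For reference, this additivity condition has a compact expression in Gell-Mann and Hartle's formalism (stated here from the general consistent-histories literature [10], not taken verbatim from this FAQ): histories decohere, and their probabilities add correctly, when the off-diagonal elements of the decoherence functional vanish:

    D(\alpha,\alpha') = \mathrm{Tr}\big(C_\alpha\,\rho\,C_{\alpha'}^{\dagger}\big),
    \qquad D(\alpha,\alpha') \approx 0 \;\;(\alpha\neq\alpha'),
    \qquad p(\alpha) = D(\alpha,\alpha)

where C_\alpha is the time-ordered chain of projections defining history \alpha and \rho is the initial state. The failure of this condition at the fine-grained level is exactly the interference described above.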
The decoherence approach is useful in considering the effect of the environment on a system. In many ways the environment, acting as a heat sink, can be regarded as performing a succession of measurement-like interactions upon any system, inducing associated system splits. The environment basis is a basis chosen so as to minimise the cross-basis interference terms. It makes any real-world calculation easy, since the cross terms are so small, but it does not uniquely select a basis, it just eliminates a large number of them.
The thermodynamic Planck-Boltzmann relationship, S = k*log(W), counts the branches of the wavefunction at each splitting, at the lowest, maximally refined level of Gell-Mann's many-histories tree. (See "What is many-histories?") The bottom or maximally divided level consists of microstates which can be counted by the formula W = exp(S/k), where S = entropy, k = Boltzmann's constant (approx 10^-23 Joules/Kelvin) and W = number of worlds or macrostates. The number of coarser grained worlds is lower, but still increasing with entropy by the same ratio, i.e. the number of worlds a single world splits into, at the site of an irreversible event of entropy dS, is exp(dS/k). Because k is very small a great many worlds split off at each macroscopic event.
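To get a feel for the numbers, here is an illustrative calculation (my own, with an arbitrarily chosen energy release):

  import math

  k = 1.38e-23   # Boltzmann's constant, J/K
  Q = 1e-6       # hypothetical heat released by a small frictive event, J
  T = 300.0      # room temperature, K

  dS = Q / T     # entropy released into the environment, J/K
  x = dS / k     # the exponent in exp(dS/k)
  print(f"dS/k ~ {x:.2e}")                          # ~ 2.4e+14
  print(f"branching factor ~ 10^({x / math.log(10):.2e})")

So even a microjoule of waste heat multiplies the number of maximally refined worlds by roughly 10^(10^14).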
The simplest way to see that the many-worlds metatheory is a local theory is to note that it requires that the wavefunction obey some relativistic wave equation, the exact form of which is currently unknown, but which is presumed to be locally Lorentz invariant at all times and everywhere. This is equivalent to imposing the requirement that locality is enforced at all times and everywhere. Ergo many-worlds is a local theory.
Another way of seeing this is to examine how macrostates evolve. Macrostate descriptions of objects evolve in a local fashion. Worlds split as the macrostate description divides inside the light cone of the triggering event. Thus the splitting is a local process, transmitted causally at light or sub-light speeds. (See "Does the EPR experiment prohibit locality?" and "When do worlds split?")
Yes, many-worlds is a deterministic theory, since the wavefunction obeys a deterministic wave equation at all times. All possible outcomes of a measurement or interaction (See "What is a measurement?") are embedded within the universal wavefunction although each observer, split by each observation, is only aware of single outcomes due to the linearity of the wave equation. The world appears indeterministic, with the usual probabilistic collapse of the wavefunction, but at the objective level, which includes all outcomes, determinism is restored.
Some people are under the impression that the only motivation for many-worlds is a desire to return to a deterministic theory of physics. This is not true. As Everett pointed out, the objection to the standard Copenhagen interpretation is not the indeterminism per se, but that indeterminism occurs only with the intervention of an observer, when the wavefunction collapses. (See "What is the Copenhagen interpretation?")
It is trivial to relativise many-worlds, at least to the level of special relativity. All relativistic theories of physics are quantum theories with linear wave equations. There are three or more stages to developing a fully relativised quantum field theory:
First quantisation: the wavefunction of an N particle system is a complex field which evolves in 3N dimensions as the solution to either the many-particle Schrodinger, Dirac or Klein-Gordon or some other wave equation. External forces applied to the particles are represented or modelled via a potential, which appears in the wave equation as a classical, background field.
Second quantisation: AKA (relativistic) quantum field theory (QFT) handles the creation and destruction of particles by quantising the classical fields and potentials as well as the particles. Each particle corresponds to a field, in QFT, and becomes an operator. E.g. the electromagnetic field's particle is the photon. The wavefunction of a collection of particles/fields exists in a Fock space, where the number of dimensions varies from component to component, corresponding to the indeterminacy in the particle number. Many-worlds has no problems incorporating QFT, since a theory (QFT) is not altered by a metatheory (many-worlds), which makes statements about the theory.
Third quantisation: AKA quantum gravity. The gravitational metric is quantised, along with (perhaps) the topology of the space-time manifold. The role of time plays a less central role, as might be expected, but the first and second quantisation models are as applicable as ever for modelling low-energy events. The physics of this is incomplete, including some thorny, unresolved conceptual issues, with a number of proposals (strings, supersymmetry, supergravity...) for ways forward, but the extension required by many-worlds is quite trivial since the mathematics would be unchanged.
One of the original motivations of Everett's scheme was to provide a system for quantising the gravitational field to yield a quantum cosmology, permitting a complete, self-contained description of the universe. Indeed many-worlds actually requires that gravity be quantised, in contrast to other interpretations, which are silent about the role of gravity. (See "Why quantum gravity?")
Non-relativistic quantum mechanics and quantum field theory are quite unambiguous: the other Everett-worlds occupy the same space and time as we do.
The implicit question is really, why aren't we aware of these other worlds, unless they exist "somewhere" else? To see why we aren't aware of the other worlds, despite occupying the same space-time, see "Why do I only ever experience one world?" Some popular accounts describe the other worlds as splitting off into other, orthogonal, dimensions. These dimensions are the dimensions of Hilbert space, not the more familiar space-time dimensions.
The situation is more complicated, as we might expect, in theories of quantum gravity (See "What about quantum gravity?"), because gravity can be viewed as perturbations in the space-time metric. If we take a geometric interpretation of gravity then we can regard differently curved space-times, each with their own distinct thermodynamic history, as non-coeval. In that sense we only share the same space-time manifold with other worlds with a (macroscopically) similar mass distribution. Whenever the amplification of a quantum-scale interaction affects the mass distribution, and hence space-time curvature, the resultant decoherence can be regarded as splitting the local space-time manifold into discrete sheets.
No, for four reasons:
First, many-worlds makes predictions that differ from the other so-called interpretations of quantum theory. Interpretations do not make predictions that differ. (See "What unique predictions does many-worlds make?") In addition many-worlds retrodicts a lot of data that has no other easy interpretation. (See "What retrodictions does many-worlds make?")
Second, the mathematical structure of many-worlds is not isomorphic to other formulations of quantum mechanics like the Copenhagen interpretation or Bohm's hidden variables. The Copenhagen interpretation does not contain those elements of the wavefunction that correspond to the other worlds. Bohm's hidden variables contain particles, in addition to the wavefunction. Neither theory is isomorphic to the other, or to many-worlds, and they are not, therefore, merely rival "interpretations".
Third, there is no scientific, reductionistic alternative to many-worlds. All the other theories fail for logical reasons. (See "Is there any alternative theory?")
Fourth, the interpretative side of many-worlds, such as the subjective probabilistic elements, is derived from within the theory, rather than added to it by assumption, as in the conventional approach. (See "How do probabilities emerge within many-worlds?")
Many-worlds should really be described as a theory or, more precisely, a metatheory, since it makes statements that are applicable to a range of theories. Many-worlds is the unavoidable implication of any quantum theory which obeys some type of linear wave equation. (See "Is physics linear?")
This is really a question about why thermodynamics works and what is the origin of the "arrow of time", rather than about many-worlds.
First, worlds almost never fuse, in the forward time direction, but often divide, because of the way we have defined them. (See "What is decoherence?", "Why do worlds split?" and "When do worlds split?") The Planck-Boltzmann formula for the number of worlds (See "How many worlds are there?") implies that were worlds to fuse together then entropy would decrease, violating the second law of thermodynamics.
Second, this does not imply that irreversible thermodynamics is incompatible with reversible (or nearly so) microphysics. The laws of physics are reversible (or CPT invariant, more precisely) and fully compatible with the irreversibility of thermodynamics, which is solely due to the boundary conditions (the state of the universe at some chosen moment) imposed by the Big Bang or whatever we choose to regard as the initial conditions. (See "Why can't the boundary conditions be updated to reflect my observations in this one world?")
A retrodiction occurs when already-gathered data is accounted for by a later theoretical advance in a more convincing fashion. The advantage of a retrodiction over a prediction is that the already-gathered data is more likely to be free of experimenter bias. An example of a retrodiction is the perihelion shift of Mercury, which Newtonian mechanics plus gravity was unable to fully account for, whilst Einstein's general relativity made short work of it.
Many-worlds retrodicts all the peculiar properties of the (apparent) wavefunction collapse in terms of decoherence. (See "What is decoherence?", "Can wavefunctions collapse?", "When do worlds split?" and "Why do worlds split?") No other quantum theory has yet accounted for this behaviour scientifically. (See "What are the alternatives to many-worlds?")
Can we regard the separate worlds that result from a measurement-like interaction (See "What is a measurement?") as having previously existed distinctly, so that the interaction merely differentiated them, rather than as having been produced by the splitting of one world into many? This is definitely not permissible in many-worlds or any version of quantum theory consistent with experiment. Worlds do not exist in a quantum superposition independently of each other before they decohere or split. The splitting is a physical process, grounded in the dynamical evolution of the wave vector, not a matter of philosophical, linguistic or mental convenience. (See "Why do worlds split?" and "When do worlds split?") If you try to treat the worlds as pre-existing and separate then the maths and probabilistic behaviour all comes out wrong. Also the differentiation theory isn't deterministic, in contradiction to the wave equations, which are deterministic, since many-minds says that:
         AAAAAAAAAAAAAAABBBBBBBBBBBBBBB
         AAAAAAAAAAAAAAACCCCCCCCCCCCCCC      (Worlds differentiate)

occurs, rather than:

                        BBBBBBBBBBBBBBB
                       B
         AAAAAAAAAAAAAA
                       C
                        CCCCCCCCCCCCCCC      (Worlds split)

         --------------> time

according to many-worlds.
This false differentiation model, at the mental level, seems favoured by adherents of many-minds. (See "What is many-minds?")
Many-minds proposes, as an extra fundamental axiom, that an infinity of separate minds or mental states be associated with each single brain state. When the single physical brain state is split into a quantum superposition by a measurement (See "What is a measurement?") the associated infinity of minds are thought of as differentiating rather than splitting. The motivation for this brain-mind dichotomy seems purely to avoid talk of minds splitting and to talk instead about the differentiation of pre-existing separate mental states. There is no physical basis for this interpretation, which is incapable of an operational definition. Indeed the differentiation model for physical systems is specifically not permitted in many-worlds. Many-minds seems to be proposing that minds follow different rules than matter. (See "Do worlds differentiate or split?")
In many-minds the role of the conscious observer is accorded special status, with its fundamental axiom about infinities of pre-existing minds, and as such is philosophically opposed to many-worlds, which seeks to remove the observer from any privileged role in physics. (Many-minds was co-invented by David Albert, who has, apparently, since abandoned it. See Scientific American July 1992 page 80 and contrast with Albert's April '94 Scientific American article.)
The two theories must not be confused.
William of Ockham, 1285-1349(?) English philosopher and one of the founders of logic, proposed a maxim for judging theories which says that hypotheses should not be multiplied beyond necessity. This is known as Ockham's razor and is interpreted, today, as meaning that to account for any set of facts the simplest theories are to be preferred over more complex ones. Many-worlds is viewed as unnecessarily complex, by some, by requiring the existence of a multiplicity of worlds to explain what we see, at any time, in just one world.
This is to mistake what is meant by "complex". Here's an example. Analysis of starlight reveals that starlight is very similar to faint sunlight, both with spectroscopic absorption and emission lines. Assuming the universality of physical law, we are led to conclude that other stars and worlds are scattered, in great numbers, across the cosmos. The theory that "the stars are distant suns" is the simplest theory and so is to be preferred, by Ockham's Razor, to geocentric theories.
Similarly many-worlds is the simplest and most economical quantum theory because it proposes that the same laws of physics apply to animate observers as have been observed for inanimate objects. The multiplicity of worlds predicted by the theory is no more a weakness of many-worlds than the multiplicity of stars is for astronomers, since the non-interacting worlds emerge from a simpler theory.
(As an historical aside it is worth noting that Ockham's razor was also falsely used to argue in favour of the older geocentric theories against Galileo's notion of the vastness of the cosmos. The notion of vast empty interstellar spaces was too uneconomical to be believable to the Medieval mind. Again they were confusing the notion of vastness with complexity [15].)
First, the law of conservation of energy is based on observations within each world. All observations within each world are consistent with conservation of energy, therefore energy is conserved.
Second, and more precisely, conservation of energy, in QM, is formulated in terms of weighted averages or expectation values. Conservation of energy is expressed by saying that the time derivative of the expected energy of a closed system vanishes. This statement can be scaled up to include the whole universe. Each world has an approximate energy, but the energy of the total wavefunction, or of any subset of the worlds, involves summing over each world, weighted with its probability measure. This weighted sum is a constant. So energy is conserved within each world and also across the totality of worlds.
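In symbols (a standard derivation, added here for completeness): for a closed system with time-independent Hamiltonian H and i\hbar\,\partial_t|\psi\rangle = H|\psi\rangle,

    \frac{d}{dt}\langle E\rangle = \frac{d}{dt}\langle\psi|H|\psi\rangle
      = \frac{1}{i\hbar}\langle\psi|[H,H]|\psi\rangle = 0

and if |\psi\rangle = \sum_i c_i\,|\mathrm{world}_i\rangle is expanded over decohered (orthogonal, non-interfering) branches, the cross terms drop out, leaving \langle E\rangle \approx \sum_i w_i E_i with w_i = |c_i|^2: the weighted sum over worlds described above.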
One way of viewing this result - that observed conserved quantities are conserved across the totality of worlds - is to note that new worlds are not created by the action of the wave equation, rather existing worlds are split into successively "thinner" and "thinner" slices, if we view the probability densities as "thickness".
Everett demonstrated [1], [2] that observations in each world obey all the usual conventional statistical laws predicted by the probabilistic Born interpretation, by showing that the Hilbert space's inner product or norm has a special property which allows us to make statements about the worlds where quantum statistics break down. The norm of the vector of the set of worlds where experiments contradict the Born interpretation ("non-random" or "maverick" worlds) vanishes in the limit as the number of probabilistic trials goes to infinity, as is required by the frequentist definition of probability. Hilbert space vectors with zero norm don't exist (see below), thus we, as observers, only observe the familiar, probabilistic predictions of quantum theory. Everett-worlds where probability breaks down are never realised.
Strictly speaking Everett did not prove that the usual statistical laws of the Born interpretation would hold true for all observers in all worlds. He merely showed that no other statistical laws could hold true and asserted the vanishing of the Hilbert space "volume" or norm of the set of "maverick" worlds. DeWitt later published a longer derivation of Everett's assertion [4a], [4b], closely based on an earlier, independent demonstration by Hartle [H]. What Everett asserted, and DeWitt/Hartle derived, is that the collective norm of all the maverick worlds, as the number of trials goes to infinity, vanishes. Since the only vector in a Hilbert space with vanishing norm is the null vector (a defining axiom of Hilbert spaces) this is equivalent to saying that non-randomness is never realised. All the worlds obey the usual Born predictions of quantum theory. That's why we never observe the consistent violation of the usual quantum statistics, with, say, heat flowing from a colder to a hotter macroscopic object. Zero-probability events never happen.
Of course we have to assume that the wavefunction is a Hilbert space vector in the first place but, since this assumption is also made in the standard formulation, this is not a weakness of many-worlds since we are not trying to justify all the axioms of the conventional formulation of QM, merely those that relate to probabilities and collapse of the wavefunction.
In more detail the steps are:
1) Construct the tensor product of N identical systems in state |psi>, according to the usual rules for Hilbert space composition (repeated indices summed):
   |PSI_N> = |psi_1>*|psi_2>* ...... *|psi_N>

where

   |psi_j> = jth system prepared in state |psi> = |i_j><i_j|psi>

(ie the amplitude of the ith eigenstate is independent of which system it is in) so that

   |PSI_N> = |i_1>|i_2>...|i_N> <i_1|psi><i_2|psi>...<i_N|psi>

2) Quantify the deviation from the "expected" Born-mean for each component of |PSI_N> with respect to the above |i_1>|i_2>...|i_N> basis by counting the number of occurrences of the ith eigenstate and dividing by N. Call this number RF(i). Define the Born-deviation as

   D = sum(i)( (RF(i) - |<i|psi>|^2)^2 )

Thus D, loosely speaking, quantifies for each length-N sequence by how much the particular sequence differs from the Born-expectation.

3) Sort out terms in the expansion of |PSI_N> according to whether D is less than or equal to (.LE.) or greater than (.GT.) E, where E is a real, positive constant. Collecting terms together we get:

   |PSI_N> = |N,"D.GT.E"> + |N,"D.LE.E">

where the first term collects the worlds for which D > E and the second the worlds for which D <= E.

4) What DeWitt showed was that:

   <N,"D.GT.E"|N,"D.GT.E"> < 1/(N*E)     (proof in appendix of [4b])

Thus as N goes to infinity the right-hand side vanishes for all positive values of E. (This mirrors the classical "frequentist" position on probability, which states that if event i occurs with probability p(i) then the proportion of N trials with outcome i approaches p(i) as N goes to infinity [H]. This has the immediate benefit that sum(i) p(i) = 1.) The norm of |N,"D.LE.E">, by contrast, approaches 1 as N goes to infinity.

Note: this property of D is not shared by other definitions, which is why we haven't investigated them. If, say, we had defined, in step 2),

   A = sum(i)( (RF(i) - |<i|psi>|)^2 )

so that A measures the deviation from |<i|psi>|, rather than |<i|psi>|^2, then we find that <N,"A.GT.E"|N,"A.GT.E"> does not have the desired property of vanishing as N goes to infinity.

5) The norm of the collection of non-random worlds vanishes and the collection must therefore be identified with some complex multiple of the null vector.

6) Since (by assumption) the state vector faithfully models reality, the null vector cannot represent any element of reality, since it can be added to (or subtracted from) any other state vector without altering the other state vector.

7) Ergo the non-random worlds are not realised, without making any additional physical assumptions, such as the imposition of a measure.

Note: no finite sequence of outcomes is excluded from happening, since the concepts of probability and randomness become precise only as N goes to infinity [H]. Thus heat could be observed to flow from a colder to a hotter object, but we might have to wait a very long time before observing it. What is excluded is the possibility of this process going on forever.
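Step 4) is easy to check numerically in the simplest case. The sketch below (my own construction, with arbitrary example values for p and E; an illustration, not DeWitt's proof) takes a two-outcome measurement repeated N times: the squared norm carried by the sequences containing k first-outcomes is binomial(N,k)*p^k*(1-p)^(N-k), and for such a sequence D = 2*(k/N - p)^2:

  from math import exp, lgamma, log

  def seq_weight(N, k, p):
      """Squared norm of the length-N outcome sequences with k first-outcomes."""
      log_w = (lgamma(N + 1) - lgamma(k + 1) - lgamma(N - k + 1)
               + k * log(p) + (N - k) * log(1 - p))
      return exp(log_w)

  p, E = 0.3, 0.01    # hypothetical Born weight and deviation threshold
  for N in (10, 100, 1000, 10000):
      maverick = sum(seq_weight(N, k, p) for k in range(N + 1)
                     if 2 * (k / N - p) ** 2 > E)
      print(f"N = {N:>6}: maverick norm^2 ~ {maverick:.2e}"
            f"   (bound 1/(N*E) = {1 / (N * E):.2e})")

The collective squared norm of the maverick worlds falls towards zero as N grows, comfortably inside DeWitt's 1/(N*E) bound.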
The emergence of Born-style probabilities as a consequence of the mathematical formalism of the theory, without any extra interpretative assumptions, is another reason why the Everett metatheory should not be regarded as just an interpretation. (See "Is many-worlds (just) an interpretation?") The interpretative elements are forced by the mathematical structure of the axioms of Hilbert space.
[H] JB Hartle Quantum Mechanics of Individual Systems American Journal of Physics Vol 36 #8 704-712 (1968)

Hartle has investigated the N goes to infinity limit in more detail and more generally. He shows that the relative frequency operator, RF, obeys

   RF(i) |psi_1>|psi_2>.... = |<i|psi>|^2 |psi_1>|psi_2>....

for a normed state. Hartle regarded his derivation as essentially the same as Everett's, despite being derived independently.
Many-worlds, whilst deterministic on the objective universal level, is indeterministic on the subjective level, so the situation is certainly no better or worse for free-will than in the Copenhagen view. Traditional Copenhagen indeterministic quantum mechanics only slightly weakens the case for free-will. In quantum terms each neuron is an essentially classical object. Consequently quantum noise in the brain is at such a low level that it probably doesn't, except very rarely, alter the critical mechanistic behaviour of sufficient neurons to cause a decision to be different than we might otherwise expect. The consensus view amongst experts is that free-will is the consequence of the mechanistic operation of our brains, the firing of neurons, discharging across synapses etc. and is fully compatible with the determinism of classical physics. Free-will is the inability of an intelligent, self-aware mechanism to predict its own future actions, due to the logical impossibility of any mechanism containing a complete internal model of itself, rather than any inherent indeterminism in the mechanism's operation.
Nevertheless, some people find that, with all possible decisions being realised in different worlds, the prima facie situation for free-will looks quite difficult. Does this multiplicity of outcomes destroy free-will? If both sides of a choice are selected in different worlds why bother to spend time weighing the evidence before selecting? The answer is that whilst all decisions are realised, some are realised more often than others - or to put it more precisely each branch of a decision has its own weighting or measure which enforces the usual laws of quantum statistics.
This measure is supplied by the mathematical structure of the Hilbert spaces. Every Hilbert space has a norm, constructed from the inner product - which we can think of as analogous to a volume - which weights each world or collection of worlds. A world of zero volume is never realised. Worlds in which the conventional statistical predictions consistently break down have zero volume and so are never realised. (See "How do probabilities emerge within many-worlds?")
Thus our actions, as expressions of our will, correlate with the weights associated with worlds. This, of course, matches our subjective experience of being able to exercise our will, form moral judgements and be held responsible for our actions.
These are really the same questions. Consider, for a moment, this analogy:
Suppose Fred has his brain divided in two and transplanted into two different cloned bodies (this is a gedanken operation! [*]). Let's further suppose that each half-brain regenerates to full functionality and call the resultant individuals Fred-Left and Fred-Right. Fred-Left can ask, why did I end up as Fred-Left? Similarly Fred-Right can ask, why did I end up as Fred-Right? The only answer possible is that there was no reason. From Fred's point of view it is a subjectively random choice which individual "Fred" ends up as. To the surgeon the whole process is deterministic. To both the Freds it seems random.
Same with many-worlds. There was no reason "why" you ended up in this world, rather than another - you end up in all the quantum worlds. It is a subjectively random choice, an artefact of your brain and consciousness being split, along with the rest of the world, that makes our experiences seem random. The universe is, in effect, performing umpteen split-brain operations on us all the time. The randomness apparent in nature is a consequence of the continual splitting into mutually unobservable worlds.
(See"How do probabilities emerge within many-worlds?" for how the subjective randomness is moderated by the usual probabilistic laws of QM.)
[*] Split brain experiments were performed on epileptic patients (severing the corpus callosum, one of the pathways connecting the cerebral hemispheres, moderated epileptic attacks). Complete hemispherical separation was discontinued when testing of the patients revealed the presence of two distinct consciousnesses in the same skull. So this analogy is only partly imaginary.
Many-worlds predicts/retrodicts that wavefunctions appear to collapse (See "Does the EPR experiment prohibit locality?"), when measurement-like interactions (See "What is a measurement?") and processes occur via a process called decoherence (See "What is decoherence?"), but claims that the wavefunction does not actually collapse but continues to evolve according to the usual wave-equation. If a mechanism for collapse could be found then there would be no need for many-worlds. The reason why we doubt that collapse takes place is because no one has ever been able to devise a physical mechanism that could trigger it.
The Copenhagen interpretation posits that observers collapse wavefunctions, but is unable to define "observer". (See "What is the Copenhagen interpretation?" and "What are the alternatives to many-worlds?") Without a definition of the observer there can be no mechanism triggered by their presence.
Another popular view is that irreversible processes trigger collapse. Certainly wavefunctions appear to collapse whenever irreversible processes are involved, and most macroscopic, day-to-day events are irreversible. The problem, as with positing observers as a cause of collapse, is that any irreversible process is composed of a large number of sub-processes that are each individually reversible. To invoke irreversibility as a mechanism for collapse we would have to show that new fundamental physics comes into play for complex systems, physics which is quite absent at the reversible atomic/molecular level. Atoms and molecules are empirically observed to obey some type of wave equation. We have no evidence for an extra mechanism operating on more complex systems. As far as we can determine, complex systems are described by the quantum mechanics of their simpler components interacting together. (Note: chaos, complexity theory, etc., do not introduce new fundamental physics. They still operate within the reductionistic paradigm - despite what many popularisers say.)
Other people have attempted to construct non-linear theories in which microscopic systems are approximately linear and obey the wave equation, whilst macroscopic systems are grossly non-linear and generate collapse. Unfortunately all these efforts have made additional predictions which have failed when tested. (See "Is physics linear?")
(Another reason for doubting that any collapse actually takes place is that the collapse would have to propagate instantaneously, or in some space-like fashion, otherwise the same particle could be observed more than once at different locations. Not fatal, but unpleasant and difficult to reconcile with special relativity and some conservation laws.)
The simplest conclusion, which is to be preferred by Ockham's razor, is that wavefunctions just don't collapse and that all branches of the wavefunction exist.
According to our present knowledge of physics, whilst it is possible to detect the presence of other nearby worlds, through the existence of interference effects, it is impossible to travel to or communicate with them. Mathematically this corresponds to an empirically verified property of all quantum theories called linearity. Linearity implies that the worlds can interfere with each other with respect to an external, unsplit, observer or system, but the interfering worlds can't influence each other in the sense that an experimenter in one of the worlds can arrange to communicate with their own, already split-off, quantum copies in other worlds.
Specifically, the wave equation is linear with respect to the wavefunction or state vector, which means that given any two solutions of the wave equation, with identical boundary conditions, any linear combination of the solutions is another solution. Since each component of a linear solution evolves with complete indifference as to the presence or absence of the other terms/solutions, we can conclude that no experiment in one world can have any effect on another experiment in another world. Hence no communication is possible between quantum worlds. (This type of linearity mustn't be confused with the evident non-linearity of the equations with respect to the fields.)
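Here is a small sketch in Python of what this linearity means in practice; the two-state system and its unitary evolution operator are hypothetical stand-ins for solutions of the wave equation:

    import numpy as np

    # A hypothetical unitary evolution operator for a two-state system
    # (unitary evolution is what the wave equation generates).
    theta = 0.7
    U = np.array([[np.cos(theta), -np.sin(theta)],
                  [np.sin(theta),  np.cos(theta)]])

    psi1 = np.array([1.0, 0.0])  # one solution's initial state
    psi2 = np.array([0.0, 1.0])  # another solution's initial state
    a, b = 0.6, 0.8j             # arbitrary complex coefficients

    # Evolving the superposition equals superposing the separate
    # evolutions: each branch evolves indifferent to the other.
    lhs = U @ (a * psi1 + b * psi2)
    rhs = a * (U @ psi1) + b * (U @ psi2)
    print(np.allclose(lhs, rhs))  # True - linearity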
Non-communication between the splitting Everett-worlds also explains why we are not aware of any splitting process, since such awareness needs communication between worlds. To be aware of the world splitting you would have to be receiving sensory information from, and thereby affect by the reverse process, more than one world. This would enable communication between worlds, which is forbidden by linearity. Ergo we are not aware of any splitting, precisely because we are split into non-interfering copies along with the rest of the world.
See also"Is linearity exact?"
To calculate the form of the universal wavefunction, and hence make inferences about all the embedded worlds, requires a knowledge not only of its dynamics (to which we currently have a good approximation) but also of the boundary conditions. We are presently restricted to making inferences about those worlds with which we have shared a common history up to some point, and which have left traces (records, fossils, etc.) still discernible today. This restricts us to a subset of the extant worlds: those which have shared the same boundary conditions with us. The further we probe back in time the less we know of the boundary conditions and the less we can know of the universal wavefunction.
This limits us to drawing conclusions about a restricted subset of the worlds - all the worlds which are consistent with our known history up to some common moment, before we diverged. The flow of historical events is, according to chaos/complexity theory/thermodynamics, very sensitive to amplification of quantum-scale uncertainty, and this sensitivity is a future-directed, one-way process. We can make very reliable deductions about the past from knowledge of the future/present, but we can't predict the future from knowledge of the past/present. Thermodynamics implies that the future is harder to predict than the past is to retrodict. Books get written about this "arrow of time" problem but, for the purposes of this discussion, we'll accept the thermodynamic origin of time's arrow as given. The fossil and historical records say that dinosaurs and Adolf Hitler once existed but have less to say about the future.
Consider the effects of that most quantum of activities, Brownian motion, on the conception of individuals, and the knock-on effects on the course of history. Mutation itself, one of the sources of evolutionary diversity, is a quantum event. For the biological/evolutionary implications see Stephen Jay Gould's book Wonderful Life, a popular exploration of the thesis that the path of evolution is driven by chance. According to Gould evolutionary history forms an enormously diverse tree of possible histories - all very improbable - with our path being selected by chance. According to many-worlds all these other possibilities are realised. Thus there are worlds in which Hitler won WW-II and other worlds in which the dinosaurs never died out. We can be as certain of this as we are that Hitler and the dinosaurs once existed in our own past.
Whether or not we can ever determine the totality of the universal wavefunction is an open question. If Stephen Hawking's work on the no-boundary condition is ultimately successful, or it emerges from some theory of everything - and many think it will - then the actual form of the total wavefunction could, in principle, be determined from a complete knowledge of physical law itself.
Hugh Everett III (1930-1982) did his undergraduate study in chemical engineering at the Catholic University of America. Studying von Neumann's and Bohm's textbooks as part of his graduate studies, under Wheeler, in mathematical physics at Princeton University in the 1950s, he became dissatisfied (like many others before and since) with the collapse of the wavefunction. He developed, during discussions with Charles Misner and Aage Petersen (Bohr's assistant, then visiting Princeton), his "relative state" formulation. Wheeler encouraged his work and preprints were circulated in January 1956 to a number of physicists. A condensed version of his thesis was presented as a paper at The Role of Gravity in Physics conference held at the University of North Carolina, Chapel Hill, in January 1957.
Everett was discouraged by the lack of response from others, particularly Bohr, whom he flew to Copenhagen to meet, only to receive the complete brush-off. Leaving physics after completing his PhD, Everett worked as a defense analyst at the Weapons Systems Evaluation Group at the Pentagon, and later became a private contractor, apparently quite successfully, for he became a multimillionaire. In 1968 Everett worked for the Lambda Corp. His published papers during this period cover topics like optimising resource allocation and, in particular, maximising kill rates during nuclear-weapon campaigns.
From 1968 onwards Bryce S DeWitt, one of the 1957 Chapel Hill conference organisers, but better known as one of the founders of quantum gravity, successfully popularised Everett's relative state formulation as the "many-worlds interpretation" in a series of articles [4a],[4b],[5].
Sometime in 1976-9 Everett visited Austin, Texas, at Wheeler's or DeWitt's invitation, to give some lectures on QM. The strict no-smoking rule in the auditorium was relaxed for Everett (a chain smoker) - the only exception ever. Everett, apparently, had a very intense manner, speaking acutely and anticipating questions after a few words. Oh yes, a bit of trivia: he drove a Cadillac with horns.
With the steady growth of interest in many-worlds in the late 1970s, Everett planned a return to physics to do more work on measurement in quantum theory, but he died of a heart attack in 1982. He was survived by his wife.
Quantum theory is the most successful description of microscopic systems like atoms and molecules ever devised, yet often it is not applied to larger, classical systems, like observers or the entire universe. Many scientists and philosophers are unhappy with the theory because it seems to require a fundamental quantum-classical divide. Einstein, for example, despite his early contributions to the subject, was never reconciled to assigning the act of observation a physical significance, which most interpretations of QM require. This contradicts the reductionist ethos that, amongst other things, observations should emerge only as a consequence of an underlying physical theory and not be present at the axiomatic level, as they are in the Copenhagen interpretation. Yet the Copenhagen interpretation remains the most popular interpretation of quantum mechanics amongst the broad scientific community. (See "What is the Copenhagen interpretation?")
An unobserved system, according to the Copenhagen interpretation of quantum theory, evolves in a deterministic way determined by a wave equation. An observed system changes in a random fashion, at the moment of observation, instantaneously, with the probability of any particular outcome given by the Born formula. This is known as the "collapse" or "reduction" of the wavefunction. The problems with this approach are:
(1) The collapse is an instantaneous process across an extended region ("non-local") and is thus non-relativistic.
(2) The idea of an observer having an effect on microphysics is repugnant to reductionism and smacks of a return to pre-scientific, vitalist notions - that life is somehow different from inanimate matter, operating by different laws. The collapse is triggered by an observer, yet no definition of what an "observer" is, in terms of an atomic-scale description, is available, even in principle.
For these reasons the view has generally been adopted that the wavefunction associated with an object is not a real "thing", but merely represents our knowledge of the object. This approach was developed by Bohr and others, mainly at Copenhagen, in the late 1920s. When we perform a measurement or observation of an object we acquire new information and so adjust the wavefunction, as we would boundary conditions in classical physics, to reflect this new information. This stance means that we can't answer questions about what's actually happening; all we can answer is what the probability of a particular result will be if we perform a measurement. This makes a lot of people very unhappy, since it provides no model for the object.
It should be added that there are other, less popular, interpretations of quantum theory, but they all have their own drawbacks, which are widely reckoned more severe. Generally speaking they try to find a mechanism that describes the collapse process or add extra physical objects to the theory, in addition to the wavefunction. In this sense they are more complex. (See "What are the alternatives to many-worlds?")
The EPR experiment is widely regarded as the definitive gedanken experiment for demonstrating that quantum mechanics is non-local (requires faster-than-light communication) or incomplete. We shall see that it implies neither.
The EPR experiment was devised, in 1935, by Einstein, Podolsky and Rosen to demonstrate that quantum mechanics was incomplete [E]. Bell, in 1964, demonstrated that any hidden variables theory, to replicate the predictions of QM, must be non-local [B]. QM predicts strong correlations between separated systems, stronger than any local hidden variables theory can offer. Bell encoded this statistical prediction in the form of some famous inequalities that apply to any type of EPR experiment. Eberhard, in the late 1970s, extended Bell's inequalities to cover any local theory, with or without hidden variables. Thus the EPR experiment plays a central role in sorting and testing variants of QM. All the experiments attempting to test EPR/Bell's inequality to date (including Aspect's in the 1980s [As]) are in line with the predictions of standard QM - local hidden variables are ruled out. Here is the paradox of the EPR experiment: it seems to imply that any physical theory must involve faster-than-light "things" going on to maintain these "spooky" action-at-a-distance correlations, and yet still be compatible with relativity, which seems to forbid FTL.
Let's examine the EPR experiment in more detail.
So what did EPR propose? The original proposal was formulated in terms of correlations between the positions and momenta of two once-coupled particles. Here I shall describe it in terms of the spin (a type of angular momentum intrinsic to the particle) of two electrons. [In this treatment I shall ignore the fact that electrons always form antisymmetric combinations. This does not alter the results but does simplify the maths.] Two initially coupled electrons, with opposed spins that sum to zero, move apart from each other across a distance of perhaps many light years, before being separately detected, say, by me on Earth and you on Alpha Centauri with our respective measuring apparatuses. The EPR paradox results from noting that if we choose the same (parallel) spin axes to measure along then we will observe the two electrons' spins to be anti-parallel (i.e. when we communicate we find that the spins of our electrons are correlated and opposed). However if we choose measurement spin axes that are perpendicular to each other then there is no correlation between electron spins. Last-minute alterations in a detector's alignment can create or destroy correlations across great distances. This implies, according to some theorists, that faster-than-light influences maintain correlations between separated systems in some circumstances and not others.
Now let's see how many-worlds escapes from this dilemma.
The initial state of the wavefunction of you, me and the electrons and the rest of the universe may be written:
    |psi> = |me>|electrons>|you>|rest of universe>

where |me> is on Earth, the electrons are in deep space and |you> are on Alpha Centauri. Or more compactly, ignoring the rest of the universe:

    |psi> = |me, electrons, you>

Here |me> represents me on Earth with my detection apparatus, and

    |electrons> = (|+,-> - |-,+>)/sqrt(2)

represents a pair of electrons, with the first electron travelling towards Earth and the second electron travelling towards Alpha Centauri, where

    |+> represents an electron with spin in the +z direction
    |-> represents an electron with spin in the -z direction
It is an empirically established fact, which we just have to accept, that we can relate spin states in one direction to spin states in other directions like so (where "i" is the sqrt(-1)):
    |left>  = (|+> - |->)/sqrt(2)    (electron with spin in -x direction)
    |right> = (|+> + |->)/sqrt(2)    (electron with spin in +x direction)
    |up>    = (|+> + |->i)/sqrt(2)   (electron with spin in +y direction)
    |down>  = (|+> - |->i)/sqrt(2)   (electron with spin in -y direction)

and inverting:

    |+> = (|right> + |left>)/sqrt(2) = (|up> + |down>)/sqrt(2)
    |-> = (|right> - |left>)/sqrt(2) = (|down> - |up>)i/sqrt(2)
(In fancy jargon we say that the spin operators in different directions form non-commuting observables. I shall eschew such obfuscations.)
Working through the algebra we find that for pairs of electrons:
    |+,-> - |-,+> = |left,right> - |right,left> = (|up,down> - |down,up>)i
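(For the sceptical, here is a quick numerical check of these identities; the sketch below, in Python, represents the kets as complex vectors:)

    import numpy as np

    s2 = np.sqrt(2)
    plus, minus = np.array([1, 0]), np.array([0, 1])   # |+>, |-> in the z-basis
    left  = (plus - minus) / s2
    right = (plus + minus) / s2
    up    = (plus + 1j * minus) / s2
    down  = (plus - 1j * minus) / s2

    pair = lambda a, b: np.kron(a, b)                  # two-electron kets |a,b>

    zz = pair(plus, minus) - pair(minus, plus)         # |+,-> - |-,+>
    xx = pair(left, right) - pair(right, left)         # |left,right> - |right,left>
    yy = (pair(up, down) - pair(down, up)) * 1j        # (|up,down> - |down,up>)i

    print(np.allclose(zz, xx), np.allclose(zz, yy))    # True True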
I shall assume that we are capable of measuring spin in either the x or y direction, both of which are perpendicular to the line of flight of the electrons. After having measured the state of the electron my state is described as one of either:
    |me[l]> represents me + apparatus + records having measured and recorded the x-axis spin as "left"
    |me[r]> ditto with the x-axis spin as "right"
    |me[u]> ditto with the y-axis spin as "up"
    |me[d]> ditto with the y-axis spin as "down"
Similarly for |you> on Alpha Centauri. Notice that it is irrelevant how we have measured the electron's spin; the details of the measurement process are irrelevant. (See "What is a measurement?" if you're not convinced.) To model the process it is sufficient to assume that there is some way of measuring the spin, which we have further assumed does not disturb the electron. (The latter assumption may be relaxed without altering the results.)
To establish familiarity with the notation let's take the state of the initial wavefunction as:
    |psi_1> = |me, left, up, you>

where the first electron (in the left state) is heading towards me on Earth and the second electron (in the up state) is heading towards you on Alpha Centauri.
After the electrons arrive at their detectors, I measure the spin along the x-axis and you along the y-axis. The wavefunction evolves into |psi_2:
                 local
    |psi_1> =============> |psi_2> = |me[l], left, up, you[u]>
              observation
which represents me having recorded my electron on Earth with spin left and you having recorded your electron on Alpha Centauri with spin up. The index in []s indicates the value of the record. This may be held in the observer's memory, notebooks or elsewhere in the local environment (not necessarily in a readable form). If we communicate our readings to each other the wavefunction evolves into |psi_3>:
                  remote
    |psi_2> ===============> |psi_3> = |me[l,u], left, up, you[u,l]>
              communication
where the second index in []s represents the remote reading communicated to the other observer and being recorded locally. Notice that the results both agree with each other, in the sense that my record of your result agrees with your record of your result. And vice versa. Our records are consistent.
That's the notation established. Now let's see what happens in the more general case where, again:
    |electrons> = (|+,-> - |-,+>)/sqrt(2)
First we'll consider the case where you and I have previously arranged to measure our respective electron spins along the same x-axis.
Initially the wavefunction of the system of electrons and two experimenters is:
    |psi_1> = |me, electrons, you>
            = |me>(|left,right> - |right,left>)|you>/sqrt(2)
            = |me,left,right,you>/sqrt(2) - |me,right,left,you>/sqrt(2)
Neither you nor I are yet unambiguously split.
Suppose I perform my measurement first (in some time frame). We get
    |psi_2> = (|me[l],left,right> - |me[r],right,left>)|you>/sqrt(2)
            = |me[l],left,right,you>/sqrt(2) - |me[r],right,left,you>/sqrt(2)
My measurement has split me, although you, having made no measurement, remain unsplit. In the full expansion the terms that correspond to you are identical.
After we have each performed our measurements we get:
    |psi_3> = |me[l],left,right,you[r]>/sqrt(2) - |me[r],right,left,you[l]>/sqrt(2)
The observers (you and me) have been split (on Earth and Alpha Centauri) into relative states (or local worlds) which correlate with the state of the electron. If we now communicate over an interstellar modem (this will take a few years, since you and I are separated by light years, but no matter), we get:
    |psi_4> = |me[l,r],left,right,you[r,l]>/sqrt(2) - |me[r,l],right,left,you[l,r]>/sqrt(2)
The world corresponding to the 2nd term in the above expansion, for example, contains me having seen my electron with spin right and knowing that you have seen your electron with spin left. So we jointly agree, in both worlds, that spin has been conserved.
Now suppose that we had prearranged to measure the spins along different axes. Suppose I measure the x-direction spin and you the y-direction spin. Things get a bit more complex. To analyse what happens we need to decompose the two electrons along their respective spin axes.
    |psi_1> = |me, electrons, you>
            = |me>(|+,-> - |-,+>)|you>/sqrt(2)
            = |me>( (|right> + |left>)(|down> - |up>)i
                  - (|right> - |left>)(|down> + |up>) )|you>/(2*sqrt(2))
            = |me>(   |right>(|down> - |up>)i + |left>(|down> - |up>)i
                    - |right>(|down> + |up>)  + |left>(|down> + |up>) )|you>/(2*sqrt(2))
            = |me>(   |right,down>(i-1) - |right,up>(i+1)
                    + |left,up>(1-i)    + |left,down>(1+i) )|you>/(2*sqrt(2))
            = (   |me,right,down,you>(i-1) - |me,right,up,you>(i+1)
                + |me,left,up,you>(1-i)    + |me,left,down,you>(1+i) )/(2*sqrt(2))
So after you and I make our local observations we get:
    |psi_2> = (   |me[r],right,down,you[d]>(i-1) - |me[r],right,up,you[u]>(i+1)
                + |me[l],left,up,you[u]>(1-i)    + |me[l],left,down,you[d]>(1+i) )/(2*sqrt(2))
Each term realises a possible outcome of the joint measurements. The interesting thing is that whilst we can decompose it into four terms there are only two states for each observer. Looking at myself, for instance, we can rewrite this in terms of states relative to *my* records/memories.
    |psi_2> = (   |me[r],right>( |down,you[d]>(i-1) - |up,you[u]>(i+1) )
                + |me[l],left>(  |up,you[u]>(1-i)   + |down,you[d]>(1+i) ) )/(2*sqrt(2))
And we see that there are only two copies of me. Equally we can rewrite the expression in terms of states relative to your records/memory.
    |psi_2> = ( ( |me[l],left>(1-i)  - |me[r],right>(i+1) )|up,you[u]>
              + ( |me[r],right>(i-1) + |me[l],left>(1+i) )|down,you[d]> )/(2*sqrt(2))
And we see that there are only two copies of you. We have each been split into two copies, each perceiving a different outcome for our electron's spin, but we have not been split by the measurement of the remote electron's spin.
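This grouping can be checked numerically. The sketch below (Python, with the coefficients read off from the expansion of |psi_2> above) computes the relative state of the remote electron-plus-observer for each of my two records, and shows that each carries weight 1/2 - two copies of me, not four:

    import numpy as np

    norm = 2 * np.sqrt(2)
    # Branch coefficients, keyed by (my record, your record):
    c = {('r', 'd'): (1j - 1), ('r', 'u'): -(1j + 1),
         ('l', 'u'): (1 - 1j), ('l', 'd'): (1 + 1j)}

    for mine in ('r', 'l'):
        # Relative state of (electron 2 + you), given my record "mine",
        # in the (down, up) basis:
        rel = np.array([c[(mine, 'd')], c[(mine, 'u')]]) / norm
        weight = np.sum(np.abs(rel) ** 2)
        print(mine, weight, rel / np.sqrt(weight))  # weight 0.5 for each copy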
After you and I communicate our readings to each other, more than four years later, we get:
    |psi_3> = (   |me[r,d],right,down,you[d,r]>(i-1) - |me[r,u],right,up,you[u,r]>(i+1)
                + |me[l,u],left,up,you[u,l]>(1-i)    + |me[l,d],left,down,you[d,l]>(1+i) )/(2*sqrt(2))
The decomposition into four worlds is forced and unambiguous after communication with the remote system. Until the two observers communicated their results to each other they were each unsplit by each other's measurements, although their own local measurements had split themselves. The splitting is a local process that is causally transmitted from system to system at light or sub-light speeds. (This is a point Everett stressed in connection with Einstein's remark that, in the Copenhagen interpretation, the observations of a mouse could collapse the wavefunction of the universe. Everett observed that it is the mouse that's split by its observation of the rest of the universe; the rest of the universe is unaffected and unsplit.)
When all communication is complete the worlds have finally decomposed or decohered from each other. Each world contains a consistent set of observers, records and electrons, in perfect agreement with the predictions of standard QM. Further observations of the electrons will agree with the earlier ones, and so each observer, in each world, can henceforth regard the electron's wavefunction as having collapsed to match the historically recorded, locally observed values. This justifies our operational adoption of the collapse of the wavefunction upon measurement, without having to strain our credulity by believing that it actually happens.
To recap. Many-worlds is local and deterministic. Local measurements split local systems (including observers) in a subjectively random fashion; distant systems are only split when the causally transmitted effects of the local interactions reach them. We have not assumed any non-local FTL effects, yet we have reproduced the standard predictions of QM.
So where did Bell and Eberhard go wrong? They thought that all theories that reproduced the standard predictions must be non-local. It has been pointed out by both Albert [A] and Cramer [C] (who both support different interpretations of QM) that Bell and Eberhard had implicitly assumed that every possible measurement - even if not performed - would have yielded a single definite result. This assumption is called contra-factual definiteness or CFD [S]. What Bell and Eberhard really proved was that every quantum theory must either violate locality or CFD. Many-worlds with its multiplicity of results in different worlds violates CFD, of course, and thus can be local.
Thus many-worlds is the only local quantum theory in accord with the standard predictions of QM and, so far, with experiment.
[A] David Z Albert, Bohm's Alternative to Quantum Mechanics Scientific American (May 1994)
[As] Alain Aspect, J Dalibard, G Roger Experimental test of Bell's inequalities using time-varying analyzers Physical Review Letters Vol 49 #25 1804 (1982).
[C] John G Cramer The transactional interpretation of quantum mechanics Reviews of Modern Physics Vol 58 #3 647-687 (1986)
[B] John S Bell: On the Einstein Podolsky Rosen paradox Physics 1 #3 195-200 (1964).
[E] Albert Einstein, Boris Podolsky, Nathan Rosen: Can quantum-mechanical description of physical reality be considered complete? Physical Review Vol 47 777-780 (15 May 1935).
[S] Henry P Stapp S-matrix interpretation of quantum theory Physical Review D Vol 3 #6 1303 (1971)
Yes, Everett's formulation of the relative state metatheory is the same as many-worlds, but the language has evolved a lot from Everett's original article [2] and some of his work has been extended, especially in the area of decoherence. (See "What is decoherence?") This has confused some people into thinking that Everett's "relative state metatheory" and DeWitt's "many-worlds interpretation" are different theories.
Everett [2] talked about the observer's memory sequences splitting to form a "branching tree" structure or the state of the observer being split by a measurement. (See "What is a measurement?") DeWitt introduced the term "world" for describing the split states of an observer, so that we now speak of the observer's world splitting during the measuring process. The maths is the same, but the terminology is different. (See "What is a world?")
Everett tended to speak in terms of the measuring apparatus being split by the measurement, into non-interfering states, without presenting a detailed analysis of *why* a measuring apparatus was so effective at destroying interference effects after a measurement, although the topics of orthogonality, amplification and irreversibility were covered. (See "What is a measurement?", "Why do worlds split?" and "When do worlds split?") DeWitt [4b], Gell-Mann and Hartle [10], Zurek [7a] and others have introduced the terminology of "decoherence" (See "What is decoherence?") to describe the role of amplification and irreversibility within the framework of thermodynamics.
The relative state of something is the state that something is in, conditional upon, or relative to, the state of something else. What the heck does that mean? It means, amongst other things, that states in the same Everett-world are all states relative to each other. (See "Quantum mechanics and Dirac notation" for more precise details.)
Let's take the example of Schrodinger's cat and ask what is the relative state of the observer, after looking inside the box? The relative state of the observer (either "saw cat dead" or "saw cat alive") is conditional upon the state of the cat (either "dead" or "alive").
Another example: the relative state of the last name of the President of the United States, in 1995, is "Clinton". Relative to what? Relative to you and me, in this world. In some other worlds it will be "Bush", "Smith", etc. Each possibility is realised in some world and it is the relative state of the President's name, relative to the occupants of that world.
According to Everett almost all states are relative states. Only the state of the universal wavefunction is not relative but absolute.
Some people believe that Everett eschewed all talk of splitting or branching observers in his original relative state formulation [2]. This is contradicted by the following quote from [2]:
[...] Thus with each succeeding observation (or interaction), the observer state "branches" into a number of different states. Each branch represents a different outcome of the measurement and the corresponding eigenstate for the object-system state. All branches exist simultaneously in the superposition after any given sequence of observations.[#] The "trajectory" of the memory configuration of an observer performing a sequence of measurements is thus not a linear sequence of memory configurations, but a branching tree, with all possible outcomes existing simultaneously in a final superposition with various coefficients in the mathematical model. [...]
[#] Note added in proof -- In reply to a preprint of this article some correspondents have raised the question of the "transition from possible to actual," arguing that in "reality" there is - as our experience testifies - no such splitting of observer states, so that only one branch can ever actually exist. Since this point may occur to other readers the following is offered in explanation.
The whole issue of the transition from "possible" to "actual" is taken care of in the theory in a very simple way - there is no such transition, nor is such a transition necessary for the theory to be in accord with our experience. From the viewpoint of the theory all elements of a superposition (all "branches") are "actual," none are any more "real" than the rest. It is unnecessary to suppose that all but one are somehow destroyed, since all separate elements of a superposition individually obey the wave equation with complete indifference to the presence or absence ("actuality" or not) of any other elements. This total lack of effect of one branch on another also implies that no observer will ever be aware of any "splitting" process.
Arguments that the world picture presented by this theory is contradicted by experience, because we are unaware of any branching process, are like the criticism of the Copernican theory that the mobility of the earth as a real physical fact is incompatible with the common sense interpretation of nature because we feel no such motion. In both cases the argument fails when it is shown that the theory itself predicts that our experience will be what it in fact is. (In the Copernican case the addition of Newtonian physics was required to be able to show that the earth's inhabitants would be unaware of any motion of the earth.)
A prediction occurs when a theory suggests new phenomena. Many-worlds makes at least three predictions, two of them unique: about linearity (See "Is linearity exact?"), quantum gravity (See "Why quantum gravity?") and reversible quantum computers (See "Could we detect other Everett-worlds?").
Many-Worlds predicts that the Everett-worlds do not interact with each other because of the presumed linearity of the wave equation. However worlds do interfere with each other, and this enables the theory to be tested. (Interfere and interact mean different things in quantum mechanics. Pictorially: Interactions occur at the vertices within Feynman diagrams. Interference occurs when you add together different Feynman diagrams with the same external lines.)
According to the many-worlds model, worlds split with the operation of every thermodynamically irreversible process. The operations of our minds are irreversible, carried along for the ride, so to speak, and divide with the division of worlds. Normally this splitting is undetectable to us. To detect the splitting we need to set up an experiment where a mind is split but the world isn't. We need a reversible mind.
The general consensus in the literature [11], [16] is that the experiment to detect other worlds, with reversible minds, will be doable by, perhaps, about mid-21st century. That date is predicted from two trendlines, both of which are widely accepted in their own respective fields. To detect the other worlds you need a reversible machine intelligence. This requires two things: reversible nanotechnology and AI.
1) Reversible nanoelectronics. This is a straight-line extrapolation based upon the log(energy)/logic-operation figures, which are projected to drop below kT in about 2020. This trend has held good for 50 years. An operation that thermally dissipates much less than kT of energy is reversible. (This implies that frictive or dissipative forces are insignificant by comparison with other processes.) If more than kT of energy is released then, ultimately, new degrees of freedom are activated in the environment and the change becomes irreversible.
2) AI. The complexity of the human brain is approximately 10^17 bits/sec, based on the number of neurons (approx 10^10) per human brain, the average number of synapses per neuron (approx 10^4) and the average firing rate (approx 10^3 Hz). A straight-line projection of log(cost)/logic-operation says that human-level, self-aware machine intelligences will be commercially available by about 2030-2040. There is uncertainty in the present estimate of human-level complexity, but the trend has held good for 40 years.
Assuming that we have a reversible machine intelligence to hand then the experiment consists of the machine making three reversible measurements of the spin of an electron (or polarisation of a photon). (1) First it measures the spin along the z-axis. It records either spin "up" or spin "down" and notes this in its memory. This measurement acts just to prepare the electron in a definite state. (2) Second it measures the spin along the x-axis and records either spin "left" or spin "right" and notes this in its memory. The machine now reverses the entire x-axis measurement - which must be possible, since physics is effectively reversible, if we can describe the measuring process physically - including reversibly erasing its memory of the second measurement. (3) Third the machine takes a spin measurement along the z-axis. Again the machine makes a note of the result.
According to the Copenhagen interpretation the original (1) and final (3) z-axis spin measurements have only a 50% chance of agreeing because the intervention of the x-axis measurement by the conscious observer (the machine) caused the collapse of the electron's wavefunction. According to many-worlds the first and third measurements will always agree, because there was no intermediate wavefunction collapse. The machine was split into two states or different worlds, by the second measurement; one where it observed the electron with spin "left"; one where it observed the electron with spin "right". Hence when the machine reversed the second measurement these two worlds merged back together, restoring the original state of the electron 100% of the time.
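Here is a minimal sketch (Python) of the logic of this experiment, modelling the reversible x-axis measurement as a unitary interaction between the electron and the machine's one-bit memory; the collapse alternative is modelled as a projection after that interaction. The state labels follow the text above, and the code is only an illustration of the two competing predictions:

    import numpy as np

    s2 = np.sqrt(2)
    up_z, down_z = np.array([1, 0]), np.array([0, 1])
    right, left = (up_z + down_z) / s2, (up_z - down_z) / s2
    mem0 = np.array([1, 0])                      # memory before recording

    # Reversible measurement M: joint unitary on electron (x) memory that
    # flips the memory bit if the electron's x-spin is "left".
    I2, X = np.eye(2), np.array([[0, 1], [1, 0]])
    M = np.kron(np.outer(right, right), I2) + np.kron(np.outer(left, left), X)

    psi0 = np.kron(up_z, mem0)                   # step (1): prepared spin-up

    # Many-worlds: measure (2), then reverse the measurement exactly.
    psi = M.conj().T @ (M @ psi0)
    print(abs(np.vdot(psi0, psi)) ** 2)          # 1.0: step (3) always agrees

    # Collapse: after M, the state jumps into the "saw right" branch
    # (probability 1/2); "reversing" then fails to restore the original.
    branch = np.kron(np.outer(right, right), I2) @ (M @ psi0)
    branch /= np.linalg.norm(branch)
    psi_col = M.conj().T @ branch
    print(abs(np.vdot(psi0, psi_col)) ** 2)      # 0.5: agreement half the time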
Only by accepting the existence of the other Everett-worlds is this 100% restoration explicable.
Many-worlds makes a very definite prediction: gravity must be quantised, rather than exist as the purely classical background field of general relativity. Indeed, no one has conclusively and directly detected (classical) gravity waves (as of 1994), although their existence has been indirectly observed in the slowing of the rotation of pulsars and binary systems. Some claims have been made for the detection of gravity waves from supernova explosions in our galaxy, but these are not generally accepted. Neither has anyone directly observed gravitons, which are predicted by quantum gravity, presumably because of the weakness of the gravitational interaction. Their existence has been, and is, the subject of much speculation. Should, in the absence of any empirical evidence, gravity be quantised at all? Why not treat gravity as a classical force, so that quantum physics in the vicinity of a mass becomes quantum physics on a curved Riemannian background? According to many-worlds there is empirical evidence for quantum gravity.
To see why many-worlds predicts that gravity must be quantised, let's suppose that gravity is not quantised, but remains a classical force. If all the other worlds that many-worlds predicts exist then their gravitational presence should be detectable -- we would all share the same background gravitational metric with our co-existing quantum worlds. Some of these effects might be undetectable. For instance if all the parallel Earths shared the same gravitational field small perturbations in one Earth's orbit from the averaged background orbit across all the Everett-worlds would damp down, eventually, and remain undetectable.
However, theories of galactic evolution would need considerable revision if many-worlds were true and gravity were not quantised, since, according to the latest cosmological models, the original density fluctuations derive from quantum fluctuations in the early universe, during the inflationary era. These quantum fluctuations led to the formation of clusters and super-clusters of galaxies, along with variations in the cosmic microwave background (detected by Smoot et al), which vary in location from Everett-cosmos to cosmos. Such fluctuations could not grow to match the observed pattern if all the density perturbations across all the parallel Everett-cosmoses were gravitationally interacting. Stars would bind not only to the observed galaxies, but also to the host of unobserved galaxies.
A theory of classical gravity also breaks down at the scale of objects that are not bound together gravitationally. Henry Cavendish, in 1798, measured the torque produced by the gravitational force on two separated lead spheres suspended from a torsion fibre in his laboratory, to determine the value of Newton's gravitational constant. Cavendish varied the positions of other, more massive lead spheres and noted how the torsion in the suspending fibre varied. Had the suspended lead spheres been gravitationally influenced by their neighbours, placed in different positions by parallel Henry Cavendishes in the parallel Everett-worlds, then the torsion would have been the averaged sum of all these contributions, which was not observed. In retrospect Cavendish established that the Everett-worlds are not detectable gravitationally. More recent experiments, in which the locations of attracting masses were varied by a quantum-random (radioactive) source, have confirmed these findings. [W]
A shared gravitational field would also screw up geo-gravimetric surveys, which have successfully detected the presence of mountains, ores and other density fluctuations at the Earth's surface. Such surveys are not sensitive to the presence of the parallel Everett-Earths with different geological structures. Ergo the other worlds are not detectable gravitationally. That gravity must be quantised emerges as a unique prediction of many-worlds.
[W] Louis Witten Gravitation: an introduction to current research New York, Wiley (1962).
Essays in honor of Louis Witten on his retirement. Topics on quantum gravity and beyond: University of Cincinnati, USA, 3-4 April 1992 / editors, Freydoon Mansouri & Joseph J. Scanio. Singapore ; River Edge, NJ : World Scientific, c1993 ISBN 981021290
Linearity (of the wavefunction) has been verified to hold true to better than 1 part in 10^27 [W]. If slight non-linear effects were ever discovered then the possibility of communication with, or travel to, the other worlds would be opened up. The existence of parallel Everett-worlds can be used to argue that physics must be exactly linear, that non-linear effects will never be detected. (See "Is physics linear?" for more about linearity.)
The argument for exactness uses a version of the weak anthropic principle and proceeds thus: the exploitation of slight non-linear quantum effects could permit communication with and travel to the other Everett-worlds. A sufficiently advanced "early" civilisation [F] might colonise uninhabited other worlds, presumably in an exponentially spreading fashion. Since the course of evolution is dictated by random quantum events (mutations, genetic recombination) and environmental effects (asteroidal induced mass extinctions, etc.) it seems inevitable that in a minority, although still a great many, of these parallel worlds life on Earth has already evolved sapient-level intelligence and developed an advanced technology millions or even billions of years ago. Such early arrivals, under the usual Darwinian pressure to expand, would spread across the parallel time tracks, if they had the ability, displacing their less-evolved quantum neighbours.
The fossil record indicates that evolution, in our ancestral lineage, has proceeded at varying rates at different times. Periods of rapid development in complexity (e.g. the Cambrian explosion of 530 million years ago or the quadrupling of brain size during the recent Ice Ages) are interspersed with long periods of much slower development. This indicates that we are not in the fast lane of evolution, where all the lucky breaks turned out just right for the early development of intelligence and technology. Ergo none of the more advanced civilisations that exist in other worlds have ever been able to cross from one quantum world to another and interrupt our long, slow biological evolution.
The simplest explanation is that physics is sufficiently linear to prevent travel between Everett worlds. If technology is only bounded by physical law (the Feinberg principle [F]) then linearity would have to be exact.
[F] Gerald Feinberg. Physics and Life Prolongation Physics Today Vol 19 #11 45 (1966). "A good approximation for such [technological] predictions is to assume that everything will be accomplished that does not violate known fundamental laws of science as well as many things that do violate these laws."
[W] Steven Weinberg Testing Quantum Mechanics Annals of Physics Vol 194 #2 336-386 (1989) and Dreams of a Final Theory (1992)
What is lost by this approach is a unique past assigned to each future. If you time-evolve the world-we-now-see backwards in time you get a superposition of earlier starting worlds. Similarly if you time evolve a single (initial) world forward you get a superposition of later (final) worlds.
For example consider a photon that hits a half-silvered mirror and turns into a superposition of a transmitted and a reflected photon. If we time-evolve one of these later states backwards we get not the original photon, but the original photon plus a "mirror image" of the original photon. (Try the calculation and see.) Only if we retain both the reflected and transmitted photons, with the correct relative phase, do we recover the single incoming photon when we time-reverse everything. (The mirror image contributions from both the final states have opposite signs and cancel out, when they are evolved backwards in time to before the reflection event.)
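To "try the calculation", here is a sketch in Python: a standard symmetric beam-splitter unitary (the i encodes the phase shift on reflection) applied to a photon in one input mode, then run backwards:

    import numpy as np

    # Half-silvered mirror acting on the (transmitted, reflected) modes.
    B = np.array([[1, 1j],
                  [1j, 1]]) / np.sqrt(2)

    photon_in = np.array([1, 0])       # photon in the input mode
    out = B @ photon_in                # superposition of transmitted/reflected

    Binv = B.conj().T                  # time-reversed evolution

    # Evolving only ONE of the two final branches backwards does NOT
    # recover the input - we get the original photon PLUS a mirror image:
    print(Binv @ np.array([out[0], 0]))    # [0.5, -0.5j]
    # Evolving BOTH branches backwards, with the correct relative phase,
    # recovers the single incoming photon (the mirror images cancel):
    print(Binv @ out)                      # [1, 0]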
All the starting states have to have their relative phases coordinated just right (i.e. coherently) or else it doesn't work out. Needless to say, the chances that the initial states should be arranged coherently just so that they yield the one final observed state are infinitesimal, and such an arrangement would be in violation of observed thermodynamics, which states, in one form, that correlations only increase with time.
[1] Hugh Everett III The Theory of the Universal Wavefunction, Princeton thesis (1956?). The original and most comprehensive paper on many-worlds. Investigates and recasts the foundations of quantum theory in information theoretic terms, before moving on to consider the nature of interactions, observation, entropy, irreversible processes, classical objects etc. 138 pages. Only published in [5].

[2] Hugh Everett III "Relative State" Formulation of Quantum Mechanics Reviews of Modern Physics Vol 29 #3 454-462 (July 1957). A condensation of [1] focusing on observation.

[3] John A Wheeler Assessment of Everett's "Relative State" Formulation of Quantum Theory Reviews of Modern Physics Vol 29 #3 463-465 (July 1957). Wheeler was Everett's PhD supervisor.

[4a] Bryce S DeWitt Quantum Mechanics and Reality Physics Today Vol 23 #9 30-40 (September 1970). An early and accurate popularisation of Everett's work. The April 1971 issue has reader feedback and DeWitt's responses.

[4b] Bryce S DeWitt The Many-Universes Interpretation of Quantum Mechanics in Proceedings of the International School of Physics "Enrico Fermi" Course IL: Foundations of Quantum Mechanics Academic Press (1972).

[5] Bryce S DeWitt, R Neill Graham eds The Many-Worlds Interpretation of Quantum Mechanics. Contains [1],[2],[3],[4a],[4b] plus other material. Princeton Series in Physics, Princeton University Press (1973) ISBN 0-691-08126-3 (hard cover), 0-691-88131-X (paperback). The definitive guide to many-worlds, if you can get hold of a copy, but now (1994) only available xeroxed from microfilm (ISBN 0-7837-1942-6) from Books On Demand, 300 N Zeeb Road, Ann Arbor, MI 48106-1346, USA. Tel: +01-313 761 4700 or 800 521 0600.

[15] Frank J Tipler The many-worlds interpretation of quantum mechanics in quantum cosmology in Quantum Concepts of Space and Time eds Roger Penrose and Chris Isham, Oxford University Press (1986). Has a discussion of Ockham's razor.
On quantum theory, measurement and decoherence generally:

[6] John A Wheeler, Wojciech H Zurek eds Quantum Theory and Measurement Princeton Series in Physics, Princeton University Press (1983) ISBN 0-691-08316-9. Contains 49 classic articles, including [2], covering the history and development of interpretations of quantum theory.

[7a] Wojciech H Zurek Decoherence and the Transition from the Quantum to the Classical Physics Today 36-44 (October 1991). The role of thermodynamics and the properties of large ergodic systems (like the environment) are related to the decoherence or loss of interference effects between superposed macrostates.

[7b] Wojciech H Zurek Preferred States, Predictability, Classicality, and the Environment-Induced Decoherence Progress of Theoretical Physics Vol 89 #2 281-312 (1993). A fuller expansion of [7a].

[8] Max Jammer The Philosophy of Quantum Mechanics Wiley, New York (1974). Almost every interpretation of quantum mechanics is covered and contrasted. Section 11.6 contains a lucid review of many-worlds theories.

[9] Berthold-Georg Englert, Marlan O Scully, Herbert Walther Quantum optical tests of complementarity Nature Vol 351 111-116 (9 May 1991). Demonstrates that quantum interference effects are destroyed by irreversible object-apparatus correlations ("measurement"), not by Heisenberg's uncertainty principle itself. See also The Duality in Matter and Light Scientific American (December 1994).

[10] Murray Gell-Mann, James B Hartle Quantum Mechanics in the Light of Quantum Cosmology Proceedings of the 3rd International Symposium on the Foundations of Quantum Mechanics (1989) 321-343. They accept Everett's decoherence analysis, and have extended it further.
Tests of the Everett metatheory:

[11] David Deutsch Quantum theory as a universal physical theory International Journal of Theoretical Physics Vol 24 #1 (1985). Describes an experiment which tests for the existence of superpositions of consciousness (in an AI).

[16] David Deutsch Three connections between Everett's interpretation and experiment in Quantum Concepts of Space and Time eds Roger Penrose and Chris Isham, Oxford University Press (1986). Discusses a testable split-observer experiment and quantum computing.
On quantum computers:

[12] David Deutsch Quantum theory, the Church-Turing principle and the universal quantum computer Proceedings of the Royal Society of London Vol A400 96-117 (1985).

[13] David Deutsch Quantum computational networks Proceedings of the Royal Society of London Vol A425 73-90 (1989).

[14] David Deutsch, R Jozsa Rapid solution of problems by quantum computation Proceedings of the Royal Society of London Vol A439 553-558 (1992).

[17] Julian Brown A Quantum Revolution for Computing New Scientist 21-24 (24 September 1994).
Note: this is a very inadequate guide. Read a more comprehensive text ASAP. For a more technical exposition of QM the reader is referred to the standard textbooks. Here are 3 I recommend:
Richard P Feynman QED: The Strange Theory of Light and Matter ISBN 0-14-012505-1. (Requires almost no maths and is universally regarded as outstanding, despite being about quantum electrodynamics.)
Richard P Feynman The Feynman Lectures on Physics Volume III Addison-Wesley (1965) ISBN 0-201-02118-8-P. The other volumes are worth reading too!
Daniel T Gillespie A Quantum Mechanics Primer: An Elementary Introduction to the Formal Theory of Non-relativistic Quantum Mechanics (Takes an axiomatic, geometric approach and teaches all the Hilbert space stuff entirely by analogy with Euclidean vector spaces. Not sure if it is still in print.)
Quantum theory is the most successful theory of physics and chemistry ever. It accounts for a wide range of phenomena, such as black body radiation, atomic structure and chemistry, which were very puzzling before quantum mechanics was first developed (c1926) in its modern form. All modern theories of physics are quantum theories, and whole new fields, like semiconductor and microchip technology, are based upon quantum effects. This FAQ assumes familiarity with the basics of quantum theory and with the associated "paradoxes" of wave-particle duality. It will not explain the uncertainty principle or delve into the significance of non-commuting matrix operators. Only those elements of quantum theory necessary for an understanding of many-worlds are covered here.
Quantum theory contains, as a central object, an abstract mathematical entity called the "wavefunction" or "state vector". Determining the equations that describe its form and evolution with time is an unfinished part of fundamental theoretical physics. Presently we only have approximations to some "correct" set of equations, often referred to whimsically as the Theory of Everything.
The wavefunction, in bracket or Dirac notation, is written as |symbol>, where "symbol" labels the object. A dog, for example, might be represented as |dog>.
A general object, labelled "psi" by convention, is represented as |psi> and called a "ket". Objects called "bra"s, written <psi|, may be formed from kets. An arbitrary bra <psi'| and ket |psi> may be combined together to form the bracket, <psi'|psi>, or inner product, which is just a fancy way of constructing a complex number. Amongst the properties of the inner product is:
    <psi'|(|psi1>*a_1 + |psi2>*a_2) = <psi'|psi1>*a_1 + <psi'|psi2>*a_2
where the a_i are arbitrary complex numbers. This is what is meant by saying that the inner product is linear on the right or ket side. It is made linear on the left-hand or bra side by defining
    <psi|psi'> = complex conjugate of <psi'|psi>
Any ket may be expanded as:
    |psi> = sum_i |i><i|psi>
          = |1><1|psi> + |2><2|psi> + ...
where the states |i> form an orthonormal basis, with <i|j> = 1 for i = j and 0 otherwise, and where i labels some parameter of the object (like position or momentum).
The probability amplitudes, <i|psi>, are complex numbers. It is empirically observed (first noted by Max Born and afterwards called the Born interpretation) that their magnitudes squared represent the probability that, upon observation, the value of the parameter labelled by i will be observed, if the system is in the state represented by |psi>. It is also empirically observed that after observing the system in state |i> we can henceforth replace the old value of the wavefunction, |psi>, with the observed value, |i>. This replacement is known as the collapse of the wavefunction and is the source of much philosophical controversy. Somehow the act of measurement has selected out one of the components. This is known as the measurement problem and it was this phenomenon that Everett addressed.
When a bra, <psi|, is formed from a ket, |psi>, and the two are combined in the inner product, the result, <psi|psi>, is a non-negative real number, called the norm of the vector. The norm of a vector provides a basis-independent way of measuring the "volume" of the vector.
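In code, kets are just complex vectors and forming a bra conjugates the components; here is a sketch in Python with a hypothetical two-component state:

    import numpy as np

    psi = np.array([0.6, 0.8j])   # |psi>, a hypothetical ket
    phi = np.array([1.0, 0.0])    # |phi>

    # np.vdot conjugates its first argument, exactly as bra formation
    # requires, so np.vdot(phi, psi) is the bracket <phi|psi>.
    print(np.vdot(phi, psi))                                # a complex number
    print(np.vdot(psi, phi) == np.conj(np.vdot(phi, psi)))  # True

    # Expansion in the orthonormal basis {|1>, |2>}: the amplitudes
    # <i|psi> are the components, and <psi|psi> is the norm ("volume").
    basis = [np.array([1, 0]), np.array([0, 1])]
    print([np.vdot(b, psi) for b in basis])   # [0.6, 0.8j]
    print(np.vdot(psi, psi).real)             # 1.0 for this normalised state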
The wavefunction for a joint system is built out of products of the components from the individual subsystems.
For example, if the two systems composing the joint system are a cat and a dog, each of which may be in two states (alive or dead), and the states of the cat and the dog are independent of each other, then we can write the total wavefunction as a product of terms. If
    |cat> = |cat alive>*c_a + |cat dead>*c_d

and

    |dog> = |dog alive>*d_a + |dog dead>*d_d

then

    |dog+cat> = |cat> x |dog>     (where x = tensor product)
              = (|cat alive>*c_a + |cat dead>*c_d) x (|dog alive>*d_a + |dog dead>*d_d)
              = |cat alive> x |dog alive>*c_a*d_a + |cat alive> x |dog dead>*c_a*d_d
              + |cat dead> x |dog alive>*c_d*d_a + |cat dead> x |dog dead>*c_d*d_d
              = |cat alive, dog alive>*c_a*d_a + |cat alive, dog dead>*c_a*d_d
              + |cat dead, dog alive>*c_d*d_a + |cat dead, dog dead>*c_d*d_d
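Numerically the tensor product is the Kronecker product; a sketch in Python with hypothetical amplitudes:

    import numpy as np

    c_a, c_d = 0.8, 0.6          # hypothetical cat amplitudes (alive, dead)
    d_a, d_d = 0.6, 0.8          # hypothetical dog amplitudes (alive, dead)

    cat = np.array([c_a, c_d])
    dog = np.array([d_a, d_d])

    # Joint wavefunction of independent subsystems = tensor product:
    print(np.kron(cat, dog))     # [c_a*d_a, c_a*d_d, c_d*d_a, c_d*d_d]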
More generally, though, where the states of the subsystems are not independent of each other, we have to use a more general formula:
    |dog+cat> = |cat alive, dog alive>*a_1 + |cat alive, dog dead>*a_2
              + |cat dead, dog alive>*a_3 + |cat dead, dog dead>*a_4
This is sometimes described by saying that the states of the cat and dog have become entangled. It is fairly trivial to define the state of the cat and the dog with respect to each other. For instance we could re-express the above expansion with respect to the cat's two states as:
    |dog+cat> = |cat alive> x (|dog alive>*a_1 + |dog dead>*a_2)
              + |cat dead> x (|dog alive>*a_3 + |dog dead>*a_4)
We term the state of the dog the relative state with respect to the cat (Everett invented this terminology), specifying which cat state (alive or dead) we are interested in. Thus the dog's relative state with respect to the cat-alive state is:
    (|dog alive>*a_1 + |dog dead>*a_2)/sqrt(|a_1|^2 + |a_2|^2)
where the sqrt term has been added to normalise the relative state.
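A sketch of the same extraction in Python, with hypothetical a_i values:

    import numpy as np

    # Hypothetical amplitudes, ordered (alive,alive), (alive,dead),
    # (dead,alive), (dead,dead):
    a = np.array([0.1, 0.7, 0.5, 0.5])
    a = a / np.linalg.norm(a)

    # The dog's relative state with respect to "cat alive" is the first
    # two components, renormalised:
    rel = a[:2] / np.sqrt(np.sum(np.abs(a[:2]) ** 2))
    print(rel)
    print(np.sum(np.abs(rel) ** 2))   # 1.0 - the relative state is normalised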