Quantum foundations

from Wikipedia

Quantum foundations is a discipline of science that seeks to understand the most counter-intuitive aspects of quantum theory, to reformulate it, and even to propose generalizations of it. In contrast to other physical theories, such as general relativity, the defining axioms of quantum theory are quite ad hoc, with no obvious physical intuition behind them. While they lead to the right experimental predictions, they do not come with a mental picture of the world into which they fit.

There exist different approaches to resolve this conceptual gap:

  • First, one can contrast quantum physics with classical physics: by identifying scenarios, such as Bell experiments, in which quantum theory deviates radically from classical predictions, one hopes to gain physical insight into the structure of quantum physics.
  • Second, one can attempt to find a re-derivation of the quantum formalism in terms of operational axioms.
  • Third, one can search for a full correspondence between the mathematical elements of the quantum framework and physical phenomena: any such correspondence is called an interpretation.
  • Fourth, one can renounce quantum theory altogether and propose a different model of the world.

Research in quantum foundations is structured along these roads.

Non-classical features of quantum theory

Quantum nonlocality

Two or more separate parties conducting measurements over a quantum state can observe correlations which cannot be explained with any local hidden variable theory.[1][2] Whether this should be regarded as proving that the physical world itself is "nonlocal" is a topic of debate,[3][4] but the terminology of "quantum nonlocality" is commonplace. Nonlocality research efforts in quantum foundations focus on determining the exact limits that classical or quantum physics enforces on the correlations observed in a Bell experiment or more complex causal scenarios.[5] This research program has so far provided a generalization of Bell's theorem that allows falsifying all classical theories with a superluminal, yet finite, hidden influence.[6]
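
As an illustration of the gap between the classical and quantum bounds in the simplest Bell scenario, the following sketch (a standard textbook example, not taken from this article; the entangled state and measurement choices are assumptions) evaluates the CHSH combination of correlators for a maximally entangled two-qubit state:

```python
import numpy as np

X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

# Bell state |Phi+> = (|00> + |11>)/sqrt(2)
phi_plus = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)

def corr(A, B, psi):
    """Correlator <psi| A (x) B |psi> for +/-1-valued observables A, B."""
    return np.real(np.vdot(psi, np.kron(A, B) @ psi))

# Measurement choices that maximize the CHSH expression for this state
A0, A1 = Z, X
B0, B1 = (Z + X) / np.sqrt(2), (Z - X) / np.sqrt(2)

S = (corr(A0, B0, phi_plus) + corr(A0, B1, phi_plus)
     + corr(A1, B0, phi_plus) - corr(A1, B1, phi_plus))

print(f"Quantum CHSH value: {S:.4f}")   # ~2.8284 = 2*sqrt(2), above the classical bound 2
```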

Quantum contextuality

Nonlocality can be understood as an instance of quantum contextuality. A situation is contextual when the value of an observable depends on the context in which it is measured (namely, on which other observables are being measured as well). The original definition of measurement contextuality can be extended to state preparations and even general physical transformations.[7]

Epistemic models for the quantum wave-function

A physical property is epistemic when it represents our knowledge or beliefs about the value of a second, more fundamental feature. The probability that an event occurs is an example of an epistemic property. In contrast, a non-epistemic or ontic variable captures the notion of a "real" property of the system under consideration.

There is an ongoing debate about whether the wave-function represents the epistemic state of a yet-to-be-discovered ontic variable or whether, on the contrary, it is a fundamental entity.[8] Under some physical assumptions, the Pusey–Barrett–Rudolph (PBR) theorem demonstrates the inconsistency of quantum states as epistemic states, in the sense above.[9] Note that, in QBism[10] and Copenhagen-type[11] views, quantum states are still regarded as epistemic, not with respect to some ontic variable, but with respect to one's expectations about future experimental outcomes. The PBR theorem does not exclude such epistemic views on quantum states.

Axiomatic reconstructions

Some of the counter-intuitive aspects of quantum theory, as well as the difficulty of extending it, follow from the fact that its defining axioms lack a physical motivation. An active area of research in quantum foundations therefore seeks alternative formulations of quantum theory that rely on physically compelling principles. These efforts come in two flavors, depending on the desired level of description of the theory: the so-called Generalized Probabilistic Theories approach and the black box approach.

The framework of generalized probabilistic theories

Generalized Probabilistic Theories (GPTs) are a general framework to describe the operational features of arbitrary physical theories. Essentially, they provide a statistical description of any experiment combining state preparations, transformations and measurements. The framework of GPTs can accommodate classical and quantum physics, as well as hypothetical non-quantum physical theories which nonetheless possess quantum theory's most remarkable features, such as entanglement or teleportation.[12] Notably, a small set of physically motivated axioms is enough to single out the GPT representation of quantum theory.[13]

L. Hardy introduced the concept of GPT in 2001, in an attempt to re-derive quantum theory from basic physical principles.[13] Although Hardy's work was very influential (see the follow-ups below), one of his axioms was regarded as unsatisfactory: it stipulated that, of all the physical theories compatible with the rest of the axioms, one should choose the simplest one.[14] The work of Dakic and Brukner eliminated this "axiom of simplicity" and provided a reconstruction of quantum theory based on three physical principles.[14] This was followed by the more rigorous reconstruction of Masanes and Müller.[15]

Axioms common to these three reconstructions are:

  • The subspace axiom: systems which can store the same amount of information are physically equivalent.
  • Local tomography: to characterize the state of a composite system it is enough to conduct measurements at each part.
  • Reversibility: for any two extremal states [i.e., states which are not statistical mixtures of other states], there exists a reversible physical transformation that maps one into the other.

An alternative GPT reconstruction proposed by Chiribella, D'Ariano and Perinotti [16][17] around the same time is also based on the

  • Purification axiom: for any state of a physical system A there exists a bipartite physical system AB and an extremal state (or purification) on AB whose restriction to system A is the original state. In addition, any two such purifications of a given state can be mapped into one another via a reversible physical transformation on the purifying system B.

The use of purification to characterize quantum theory has been criticized on the grounds that it also applies in the Spekkens toy model.[18]
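
As a concrete and purely illustrative check of what the purification axiom asserts in the quantum case, the following sketch builds a purification of an arbitrarily chosen mixed qubit state and verifies that tracing out the purifying system recovers it; the specific state is an assumption for the example:

```python
import numpy as np

# An arbitrarily chosen mixed qubit state rho_A (trace 1, positive)
rho_A = np.array([[0.7, 0.2], [0.2, 0.3]], dtype=complex)

# Eigen-decomposition rho_A = sum_i p_i |i><i|
p, vecs = np.linalg.eigh(rho_A)

# Purification |psi>_AB = sum_i sqrt(p_i) |i>_A (x) |i>_B on a two-qubit system AB
basis_B = np.eye(2, dtype=complex)
psi_AB = sum(np.sqrt(p[i]) * np.kron(vecs[:, i], basis_B[i]) for i in range(2))

# Restriction to A (partial trace over B) recovers the original mixed state
psi = psi_AB.reshape(2, 2)                # indices (A, B)
rho_A_recovered = psi @ psi.conj().T

print(np.allclose(rho_A, rho_A_recovered))   # True
```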

Against the success of the GPT approach, it can be objected that all such works recover only finite-dimensional quantum theory. In addition, none of the above axioms can be experimentally falsified unless the measurement apparatuses are assumed to be tomographically complete.

Categorical quantum mechanics or process theories

Categorical Quantum Mechanics (CQM), or the framework of process theories, is a general framework for describing physical theories, with an emphasis on processes and their compositions.[19] It was pioneered by Samson Abramsky and Bob Coecke. Besides its influence in quantum foundations, most notably through the use of a diagrammatic formalism, CQM also plays an important role in quantum technologies, notably in the form of the ZX-calculus. It has also been used to model theories outside of physics, for example the DisCoCat compositional model of natural language meaning.

The framework of black boxes

In the black box or device-independent framework, an experiment is regarded as a black box where the experimentalist introduces an input (the type of experiment) and obtains an output (the outcome of the experiment). Experiments conducted by two or more parties in separate labs are hence described by their statistical correlations alone.

From Bell's theorem, we know that classical and quantum physics predict different sets of allowed correlations. It is expected, therefore, that far-from-quantum physical theories should predict correlations beyond the quantum set. In fact, there exist instances of theoretical non-quantum correlations which, a priori, do not seem physically implausible.[20][21][22] The aim of device-independent reconstructions is to show that all such supra-quantum examples are precluded by a reasonable physical principle.
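
A standard example of such a supra-quantum correlation is the Popescu–Rohrlich (PR) box. The sketch below (an assumed illustration, not part of the article) writes down its conditional probabilities, checks that it respects no-signalling, and shows that it reaches the algebraic maximum of the CHSH expression:

```python
import numpy as np

def pr_box(a, b, x, y):
    """PR-box probabilities p(a, b | x, y): a XOR b = x AND y, uniform marginals."""
    return 0.5 if (a ^ b) == (x & y) else 0.0

def correlator(x, y):
    """E(x, y) = sum_{a,b} (-1)^(a+b) p(a, b | x, y)."""
    return sum((-1) ** (a + b) * pr_box(a, b, x, y) for a in (0, 1) for b in (0, 1))

S = correlator(0, 0) + correlator(0, 1) + correlator(1, 0) - correlator(1, 1)
print("CHSH value of the PR box:", S)    # 4.0, beyond the quantum maximum 2*sqrt(2)

# No-signalling: Alice's marginal p(a | x) is independent of Bob's input y
for x in (0, 1):
    for a in (0, 1):
        marginals = [sum(pr_box(a, b, x, y) for b in (0, 1)) for y in (0, 1)]
        assert np.isclose(marginals[0], marginals[1])
print("No-signalling marginals verified.")
```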

The physical principles proposed so far include no-signalling,[22] Non-Trivial Communication Complexity,[23] No-Advantage for Nonlocal computation,[24] Information Causality,[25] Macroscopic Locality,[26] and Local Orthogonality.[27] All these principles limit the set of possible correlations in non-trivial ways. Moreover, they are all device-independent: this means that they can be falsified under the assumption that we can decide whether two or more events are space-like separated. The drawback of the device-independent approach is that, even when taken together, the aforementioned physical principles do not suffice to single out the set of quantum correlations.[28] In other words, all such reconstructions are partial.

Interpretations of quantum theory

An interpretation of quantum theory is a correspondence between the elements of its mathematical formalism and physical phenomena. For instance, in the pilot wave theory, the quantum wave function is interpreted as a field that guides the particle trajectory and evolves with it via a system of coupled differential equations. Most interpretations of quantum theory stem from the desire to solve the quantum measurement problem.

Extensions of quantum theory

In an attempt to reconcile quantum and classical physics, or to identify non-classical models with a dynamical causal structure, some modifications of quantum theory have been proposed.

Collapse models

Collapse models posit the existence of natural processes which periodically localize the wave-function.[29] Such theories provide an explanation for the absence of superpositions of macroscopic objects, at the cost of abandoning unitarity and exact energy conservation.

Quantum measure theory

In Sorkin's quantum measure theory (QMT), physical systems are not modeled via unit rays and Hermitian operators, but through a single matrix-like object, the decoherence functional.[30] The entries of the decoherence functional determine whether it is experimentally feasible to discriminate between two or more different sets of classical histories, as well as the probabilities of each experimental outcome. In some models of QMT the decoherence functional is further constrained to be positive semidefinite (strong positivity). Even under the assumption of strong positivity, there exist models of QMT which generate stronger-than-quantum Bell correlations.[31]

Acausal quantum processes

The formalism of process matrices starts from the observation that, given the structure of quantum states, the set of feasible quantum operations follows from positivity considerations. Namely, for any linear map from states to probabilities one can find a physical system where this map corresponds to a physical measurement. Likewise, any linear transformation that maps composite states to states corresponds to a valid operation in some physical system. In view of this trend, it is reasonable to postulate that any higher-order map from quantum instruments (namely, measurement processes) to probabilities should also be physically realizable.[32] Any such map is termed a process matrix. As shown by Oreshkov et al.,[32] some process matrices describe situations where the notion of global causality breaks down.

The starting point of this claim is the following mental experiment: two parties, Alice and Bob, enter a building and end up in separate rooms. The rooms have ingoing and outgoing channels from which a quantum system periodically enters and leaves the room. While those systems are in the lab, Alice and Bob are able to interact with them in any way; in particular, they can measure some of their properties.

Since Alice and Bob's interactions can be modeled by quantum instruments, the statistics they observe when they apply one instrument or another are given by a process matrix. As it turns out, there exist process matrices which would guarantee that the measurement statistics collected by Alice and Bob are incompatible with Alice interacting with her system before Bob, after Bob, or at the same time as Bob, or with any convex combination of these three situations.[32] Such processes are called acausal.

from Grokipedia
Quantum foundations is the branch of physics that investigates the conceptual, mathematical, and philosophical underpinnings of quantum mechanics, addressing core issues such as the interpretation of wave function collapse, the nature of superposition and entanglement, non-locality, and the reconciliation of quantum theory with classical physics and relativity.[1][2] Despite quantum mechanics' unparalleled predictive accuracy in describing microscopic phenomena like atomic spectra and particle interactions, its foundational aspects remain unresolved, prompting ongoing debates about the ontology of the theory and the reality it describes.[1][2] The field originated in the early 20th century amid the formulation of quantum mechanics, with pivotal contributions from pioneers such as Max Planck, who introduced the quantum hypothesis in 1900 to explain blackbody radiation, and Werner Heisenberg, whose 1925 matrix mechanics laid the groundwork for non-commutative algebra in quantum theory.[3][2] Key historical debates, including the Bohr-Einstein debates that began in 1927, centered on the completeness of quantum mechanics and the role of hidden variables; these were advanced by the Heisenberg uncertainty principle and Bohr's principle of complementarity, though Einstein remained skeptical, leading to further challenges.[3] John von Neumann's 1932 axiomatization formalized the measurement process and introduced quantum entropy, quantifying irreversibility as $ S = -k \sum p_i \log p_i $, where $ k $ is Boltzmann's constant and $ p_i $ are probabilities.[3] Central to quantum foundations are several enduring problems, including the measurement problem, where the Schrödinger equation predicts persistent superpositions but observations yield definite outcomes, necessitating interpretations to explain the apparent collapse.[2][4] Prominent interpretations include the Copenhagen interpretation, which posits wave function collapse upon measurement and distinguishes classical from quantum domains; the Many-Worlds interpretation, proposing branching universes without collapse; the de Broglie-Bohm pilot-wave theory, assigning definite particle trajectories guided by the wave function; and QBism, viewing the wave function as subjective Bayesian probabilities rather than objective reality.[2] Another cornerstone is Bell's theorem (1964), which demonstrated that local hidden variable theories cannot reproduce quantum predictions, with experimental violations of Bell inequalities—such as those confirmed over distances up to 1200 km via satellite—affirming quantum non-locality while challenging classical intuitions of locality and causality. 
This experimental confirmation of quantum nonlocality was recognized by the 2022 Nobel Prize in Physics, awarded to John Clauser, Alain Aspect, and Anton Zeilinger for their pioneering work on entangled photons.[1][4][5] Contemporary quantum foundations research intersects with quantum information science, exploring implications for entropy, irreversibility, and the quantum-to-classical transition through decoherence, where environmental interactions suppress quantum superpositions to yield classical behavior.[4] Open questions, as highlighted in the 2013 Oxford Questions, include the fundamental nature of time and information in quantum gravity, the possibility of macroscopic superpositions, and whether quantum theory requires extension or replacement to unify with general relativity at the Planck scale of $ 10^{-35} $ meters.[4] These inquiries not only deepen theoretical understanding but also underpin practical advancements in quantum technologies, such as computing, cryptography, and sensing, demonstrating the field's profound societal impact.[1]
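
As a quick numerical illustration of the entropy formula quoted above (with Boltzmann's constant set to 1, so the result is in nats; the example states are assumptions), the following sketch evaluates the von Neumann entropy of a pure and of a maximally mixed qubit:

```python
import numpy as np

def von_neumann_entropy(rho):
    """S = -sum_i p_i log p_i over the eigenvalues of a density matrix (k = 1, nats)."""
    p = np.linalg.eigvalsh(rho)
    p = p[p > 1e-12]                # discard numerically zero eigenvalues
    return float(-np.sum(p * np.log(p)))

pure = np.array([[1, 0], [0, 0]], dtype=complex)     # pure state: zero entropy
mixed = np.eye(2, dtype=complex) / 2                 # maximally mixed qubit

print(von_neumann_entropy(pure))    # 0.0
print(von_neumann_entropy(mixed))   # ~0.693 = ln 2
```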

Historical development

Origins in early quantum mechanics

The roots of quantum foundations lie in the breakdowns of classical physics during the late 19th and early 20th centuries, particularly in phenomena involving radiation and atomic structure that defied continuous energy assumptions. A pivotal moment occurred in 1900 when Max Planck addressed the blackbody radiation problem, where classical Rayleigh-Jeans theory predicted infinite energy at high frequencies (the ultraviolet catastrophe), contradicting experimental spectra. Planck derived an empirical formula by postulating that energy from atomic oscillators is exchanged in discrete quanta $E = h\nu$, with $h$ a universal constant and $\nu$ the frequency, successfully matching observations across all wavelengths.[6] This quantization idea gained traction through Albert Einstein's 1905 explanation of the photoelectric effect, where classical wave theory failed to account for the threshold frequency and linear energy dependence of ejected electrons. Einstein hypothesized light as consisting of localized energy quanta (photons) with $E = h\nu$, such that electron emission occurs only if $h\nu > \phi$ (with $\phi$ the material's work function), and the electron's kinetic energy is $h\nu - \phi$. This particle-like view of light extended Planck's discrete energy to electromagnetic waves, challenging classical continuity.[7]

Niels Bohr advanced these concepts in his 1913 model of the hydrogen atom, reconciling Rutherford's nuclear structure with quantization to explain discrete spectral lines. Electrons occupy stationary orbits with quantized angular momentum $L = n\hbar$ (where $n$ is an integer and $\hbar = h/2\pi$), preventing classical radiation losses, and transitions between levels emit or absorb photons of energy $\Delta E = h\nu$. This semi-classical framework reproduced Balmer series frequencies accurately, introducing ad hoc stability rules that hinted at deeper quantum principles.[8] Wave-particle duality emerged explicitly in Louis de Broglie's 1924 thesis, proposing that all matter possesses wave properties analogous to light's duality, with wavelength $\lambda = h/p$ (with $p$ the momentum) derived by extending Einstein's photon momentum $p = h/\lambda$. This hypothesis unified particle and wave behaviors, predicting electron diffraction later confirmed experimentally.

Building on this, Werner Heisenberg formulated matrix mechanics in 1925 as a non-commutative algebra of observables, replacing trajectories with arrays whose products yield transition amplitudes, aligning with Bohr's correspondence principle for large quantum numbers.[9][10] Erwin Schrödinger independently developed wave mechanics in 1926, treating particles as waves governed by a differential equation. For time-independent cases, the state $\psi$ satisfies the eigenvalue equation

$$ H \psi = E \psi, $$

where $H$ is the Hamiltonian operator incorporating kinetic and potential energies, yielding quantized energies $E$ for bound systems like the hydrogen atom. This approach reproduced Bohr's results exactly and extended to multi-electron systems.[11] Max Born provided the probabilistic interpretation that same year, stating that $|\psi|^2 \, dV$ represents the probability of finding the particle in volume $dV$, shifting quantum theory from deterministic waves to statistical predictions and resolving the physical meaning of $\psi$.[12] These foundational formalisms from 1900 to 1926 established quantum mechanics as a predictive theory, yet their non-classical elements—discreteness, duality, and probability—soon sparked philosophical debates on reality and measurement.
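
The following small sketch (using standard physical constants and the familiar 13.6 eV hydrogen ground-state energy, values that are not quoted in this article) numerically checks two of the relations above: the Balmer-alpha wavelength implied by the Bohr levels and the de Broglie wavelength of a 100 eV electron:

```python
import numpy as np

h = 6.62607015e-34        # Planck constant, J*s
c = 2.99792458e8          # speed of light, m/s
m_e = 9.1093837015e-31    # electron mass, kg
eV = 1.602176634e-19      # joules per electron-volt

# Bohr levels E_n = -13.6057 eV / n^2; Balmer-alpha is the n = 3 -> 2 transition
E_n = lambda n: -13.6057 * eV / n**2
delta_E = E_n(3) - E_n(2)                  # photon energy, Delta E = h * nu
print(f"Balmer-alpha wavelength: {h * c / delta_E * 1e9:.1f} nm")   # ~656 nm

# de Broglie wavelength lambda = h / p for an electron of 100 eV kinetic energy
p = np.sqrt(2 * m_e * 100 * eV)
print(f"de Broglie wavelength: {h / p * 1e9:.3f} nm")               # ~0.123 nm
```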

Emergence of foundational questions

The formulation of quantum mechanics in the mid-1920s, while providing a powerful predictive framework, soon gave rise to profound foundational questions about the nature of reality, measurement, and determinism. A pivotal moment came with Werner Heisenberg's introduction of the uncertainty principle in 1927, which posited that certain pairs of physical properties, such as position and momentum, cannot be simultaneously measured with arbitrary precision, implying inherent limits to knowledge in the quantum realm.[13] This principle challenged classical intuitions of a fully determinate world, sparking debates on whether quantum indeterminacy reflected fundamental unpredictability or incomplete theory.[14] These issues intensified at the Fifth Solvay Conference in October 1927, where Niels Bohr and Albert Einstein engaged in a landmark debate on the interpretation of quantum mechanics. Bohr defended his concept of complementarity, arguing that wave-particle duality required mutually exclusive experimental contexts, rendering quantum descriptions inherently observer-dependent and rejecting classical realism. Einstein, countering with thought experiments like the "clock in a box," insisted on the existence of objective reality independent of measurement, questioning whether quantum mechanics could be complete without hidden variables underlying apparent randomness.[15] The exchanges highlighted tensions between probabilistic quantum predictions and the desire for a deterministic, local description of nature. In 1932, John von Neumann formalized these concerns in his axiomatic treatment of quantum mechanics, introducing the projection postulate to describe measurement-induced state collapse and proving a no-hidden-variables theorem that seemed to rule out deterministic underpinnings consistent with quantum statistics.[16] This theorem assumed that hidden variables would need to reproduce quantum probabilities additively, leading to an impossibility result under standard assumptions, though later critiques revealed flaws in its scope.[17] The mid-1930s saw further challenges: the Einstein-Podolsky-Rosen (EPR) paradox of 1935 argued that quantum mechanics was incomplete because entangled particles implied instantaneous influences violating locality, as measuring one particle's property appeared to determine the distant other's without causal connection.[18] Concurrently, Erwin Schrödinger's cat thought experiment illustrated the absurdity of applying superposition to macroscopic objects, where a cat in a sealed box linked to a quantum event would be simultaneously alive and dead until observed, underscoring the measurement problem's scale.[19] These paradoxes persisted into the 1960s, with Eugene Wigner's friend thought experiment in 1961—rooted in earlier ideas from the 1930s—questioning the role of consciousness in wave function collapse. In this scenario, a friend measures a quantum system inside a lab, entangling it with their knowledge, while Wigner outside views the entire setup as superposed until he intervenes, raising irreconcilable descriptions between observers.[20] Such debates crystallized the foundational crises, prompting ongoing scrutiny of quantum mechanics' ontological status without resolving whether reality is observer-independent or fundamentally non-classical.

Core non-classical features

Superposition and the measurement problem

In quantum mechanics, the superposition principle asserts that a quantum system can exist in a linear combination of multiple states simultaneously, described mathematically as $ |\psi\rangle = \alpha |0\rangle + \beta |1\rangle $, where $ \alpha $ and $ \beta $ are complex coefficients satisfying $ |\alpha|^2 + |\beta|^2 = 1 $, and $ |0\rangle $, $ |1\rangle $ represent basis states. This principle, fundamental to the theory's linear structure, allows quantum states to interfere constructively or destructively, leading to observable effects that defy classical intuition. For instance, in the double-slit experiment, particles such as electrons or photons passing through two slits produce an interference pattern on a detection screen, as if each particle explores both paths simultaneously and interferes with itself, demonstrating the wave-like aspect of quantum matter.[21] The measurement problem arises from the apparent conflict between this unitary evolution of superpositions under the Schrödinger equation and the definite, classical outcomes observed upon measurement, questioning why and how a superposition "collapses" into a single definite state with probabilities given by the Born rule. John von Neumann formalized this issue in his analysis of the measurement process, describing a chain where the system's state entangles sequentially with the measuring apparatus, then the environment, and ultimately the observer, yet the collapse occurs only at the subjective boundary of the observer's consciousness, leaving unresolved where exactly the transition to a definite outcome happens. This von Neumann chain highlights the problematic role of the observer, as extending the chain indefinitely without collapse would entangle the observer in a superposition, contradicting everyday experience of classical reality. Decoherence provides a partial explanation for the appearance of definite outcomes by showing how interactions with the environment rapidly suppress interference between superposition components, effectively selecting robust "pointer states" that behave classically without invoking a true collapse.[22] Wojciech Zurek's work in the 1980s and 1990s developed this through the concept of environment-induced superselection (einselection), where environmental entanglement leads to the loss of off-diagonal terms in the density matrix, making superpositions unobservable on macroscopic scales, though it does not fully resolve the origin of the probabilistic Born rule or the preferred basis problem. Complementing these challenges, Gleason's theorem demonstrates that non-contextual hidden variable theories—positing definite pre-measurement values independent of measurement context—cannot reproduce quantum probabilities for systems with Hilbert space dimension greater than two, ruling out such deterministic underpinnings for superposition without contextual influences. While entanglement extends superposition to multi-particle correlations, the measurement problem primarily concerns the collapse in single-system superpositions.[22]
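
To make the decoherence picture concrete, the toy sketch below (an assumed illustration with an ad hoc dephasing factor, not a model from the article) shows how the off-diagonal coherences of an equal superposition decay while the Born-rule populations remain untouched:

```python
import numpy as np

alpha, beta = 1 / np.sqrt(2), 1 / np.sqrt(2)
psi = np.array([alpha, beta], dtype=complex)     # |psi> = alpha|0> + beta|1>
rho = np.outer(psi, psi.conj())                  # coherent superposition

def dephase(rho, gamma):
    """Damp the coherences by exp(-gamma); the populations are untouched."""
    out = rho.copy()
    out[0, 1] *= np.exp(-gamma)
    out[1, 0] *= np.exp(-gamma)
    return out

for gamma in (0.0, 1.0, 10.0):
    r = dephase(rho, gamma)
    print(f"gamma={gamma:4.1f}  p(0)={r[0, 0].real:.2f}  "
          f"p(1)={r[1, 1].real:.2f}  |coherence|={abs(r[0, 1]):.3f}")
# Born-rule probabilities stay at 0.5, 0.5; the decaying off-diagonal term is
# what makes interference between the two branches unobservable.
```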

Entanglement and quantum nonlocality

Quantum entanglement refers to a phenomenon in quantum mechanics where the quantum state of two or more particles cannot be described independently, even when separated by large distances, such that the state of the entire system is inseparable. This concept was first articulated by Erwin Schrödinger in 1935, who introduced the term "entanglement" to describe the peculiar correlations arising from such composite systems. A canonical example is the Bell state $ |\Phi^+\rangle = \frac{1}{\sqrt{2}} (|00\rangle + |11\rangle) $, where two qubits exhibit perfect correlation in their outcomes upon measurement, regardless of the distance between them. The implications of entanglement for locality were dramatically highlighted in the 1935 Einstein-Podolsky-Rosen (EPR) argument, which questioned the completeness of quantum mechanics by considering an entangled pair of particles with perfectly correlated positions and momenta. Einstein, Podolsky, and Rosen argued that measuring one particle's properties instantaneously determines the other's, suggesting either quantum mechanics is incomplete or involves "spooky action at a distance" that violates the principle of locality in special relativity. This paradox underscored the tension between quantum correlations and classical intuitions of separability, prompting decades of debate on whether hidden variables could restore locality. John Stewart Bell resolved this debate in 1964 by deriving an inequality that any local hidden-variable theory must satisfy for measurements on entangled particles. Specifically, for two parties Alice and Bob performing measurements A, A' and B, B' respectively on their shares of an entangled pair, the correlations obey the CHSH inequality:
$$ |\langle AB \rangle + \langle AB' \rangle + \langle A'B \rangle - \langle A'B' \rangle| \leq 2. $$
Quantum mechanics predicts violations of this bound, reaching a maximum of $ 2\sqrt{2} $ for certain entangled states, demonstrating that no local hidden-variable model can reproduce all quantum predictions.[23] This inequality, formalized by Clauser, Horne, Shimony, and Holt in 1969, provided a testable criterion for nonlocality. Experimental validations began with Alain Aspect's 1982 tests using entangled photons, which violated the CHSH inequality by approximately 5 standard deviations, closing the locality loophole through rapid switching of measurement settings. Decades later, fully loophole-free demonstrations were achieved in 2015: Hensen et al. used electron spins in diamond separated by 1.3 km, observing a CHSH violation of $ S = 2.42 \pm 0.20 $, simultaneously addressing detection, locality, and freedom-of-choice loopholes. Independently, Giustina et al. employed entangled photons over 58.5 m, reporting $ S = 2.27 \pm 0.23 $, confirming quantum nonlocality without experimental assumptions favoring quantum predictions.[24] Despite these nonlocal correlations, quantum mechanics preserves relativistic causality through the no-signaling theorem, which ensures that the measurement statistics for one party remain unchanged regardless of the distant party's choice of measurement basis, preventing superluminal information transfer. This theorem, inherent to the quantum formalism, reconciles entanglement's nonlocality with special relativity by prohibiting controllable signaling.[25]
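
The no-signalling property can be checked directly for the Bell state above. The following sketch (an assumed example) computes Alice's reduced state after Bob performs, without communicating, either a Z or an X measurement, and finds the same maximally mixed marginal in both cases:

```python
import numpy as np

X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
phi_plus = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)
rho = np.outer(phi_plus, phi_plus.conj())

def alice_marginal_after_bob_measures(rho, B):
    """Alice's reduced state, averaged over Bob's (uncommunicated) outcomes of B."""
    _, v = np.linalg.eigh(B)
    projectors = [np.outer(v[:, k], v[:, k].conj()) for k in range(2)]
    rho_post = sum(np.kron(np.eye(2), P) @ rho @ np.kron(np.eye(2), P)
                   for P in projectors)
    # partial trace over Bob's subsystem
    return rho_post.reshape(2, 2, 2, 2).trace(axis1=1, axis2=3)

rho_A_if_Z = alice_marginal_after_bob_measures(rho, Z)
rho_A_if_X = alice_marginal_after_bob_measures(rho, X)
print(np.allclose(rho_A_if_Z, rho_A_if_X))   # True: both equal I/2
```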

Contextuality and incompatibility

Quantum contextuality refers to the phenomenon in quantum mechanics where the outcome probabilities of measurements depend on the specific context of compatible observables measured together, violating the assumptions of non-contextual hidden variable models that assign predetermined values to all observables independently of measurement context.[26] In such non-contextual models, each quantum state would correspond to a hidden variable that fixes the value of every observable, with measurement merely revealing that pre-assigned value, but quantum predictions contradict this possibility.[27] The Kochen-Specker theorem, proved in 1967, demonstrates that no non-contextual hidden variable theory can reproduce quantum mechanics for a single quantum system in three-dimensional Hilbert space. Specifically, the theorem shows that it is impossible to assign definite values (0 or 1 for projectors) to all one-dimensional subspaces (rays) such that the assignments are consistent with the functional relations imposed by quantum orthogonality and completeness, without depending on the measurement context. The original proof constructs a set of 117 vectors in $\mathbb{R}^3$, grouped into 40 orthogonal triads, where any non-contextual assignment leads to a contradiction because the number of vectors assigned value 1 in each complete basis must sum to 1, but the interlocking structure makes this impossible. John S. Bell independently arrived at a related result in 1966, emphasizing the contextuality inherent in quantum measurements for single systems, later termed the Bell-Kochen-Specker theorem, which highlights that quantum mechanics requires contextual value assignments even without spatial separation. This single-system contextuality underscores a core non-classical feature distinct from multipartite correlations, though nonlocality can be viewed as a form of multipartite contextuality.

Incompatibility complements contextuality by arising from non-commuting observables, which prevent simultaneous precise measurements and lead to uncertainty relations. The canonical example is the position-momentum commutator $[\hat{x}, \hat{p}] = i\hbar$, derived from the foundational structure of quantum kinematics, implying that the product of uncertainties satisfies $\Delta x \, \Delta p \geq \hbar/2$. This incompatibility ensures that measurement contexts cannot be ignored, as joint probability distributions for incompatible observables do not exist in quantum theory.

Experimental verification of Kochen-Specker contextuality has been achieved through inequality tests that non-contextual models must satisfy but quantum predictions violate. In 2008, Klyachko et al. proposed the KCBS inequality for a spin-1 system using five projectors $P_i$ ($i = 1, \dots, 5$), compatible in adjacent pairs and forming a pentagon compatibility graph, where non-contextual hidden variable theories satisfy $\sum_{i=1}^{5} \langle P_i \rangle \leq 2$, but quantum mechanics achieves up to $\sqrt{5} \approx 2.236$, violating the bound.[28] This was confirmed experimentally with photons in subsequent works. Subsequent loophole-free tests have further solidified these results, closing detection and compatibility issues.
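
A minimal numerical sketch of the KCBS construction (the explicit pentagon of vectors below is a standard choice assumed here for illustration) verifies that adjacent projectors are compatible and that the quantum value reaches $\sqrt{5}$:

```python
import numpy as np

# Cone angle such that adjacent vectors of the pentagon are orthogonal
cos2_theta = np.cos(np.pi / 5) / (1 + np.cos(np.pi / 5))
theta = np.arccos(np.sqrt(cos2_theta))

vectors = [np.array([np.sin(theta) * np.cos(4 * np.pi * j / 5),
                     np.sin(theta) * np.sin(4 * np.pi * j / 5),
                     np.cos(theta)]) for j in range(5)]

# Compatibility: neighbouring rank-1 projectors commute because the vectors are orthogonal
for j in range(5):
    assert abs(vectors[j] @ vectors[(j + 1) % 5]) < 1e-10

psi = np.array([0.0, 0.0, 1.0])                      # the spin-1 state being tested
kcbs_value = sum((v @ psi) ** 2 for v in vectors)    # sum_i <psi| P_i |psi>
print(f"Quantum KCBS value: {kcbs_value:.4f}  (non-contextual bound: 2)")   # ~2.2361
```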

Interpretations of quantum mechanics

Copenhagen and instrumentalist views

The Copenhagen interpretation, developed in the late 1920s and early 1930s primarily by Niels Bohr and Werner Heisenberg, represents a foundational operational approach to quantum mechanics that emphasizes the theory's predictive power for experimental outcomes rather than a description of an underlying objective reality.[29] This view arose as a response to the measurement problem, where quantum superpositions appear to resolve into definite states upon observation, motivating a focus on the conditions under which measurements are performed.[29] Central to the Copenhagen framework is Bohr's principle of complementarity, introduced in 1928, which posits that wave-particle duality in quantum phenomena—such as light behaving as both waves and particles—cannot be observed simultaneously but requires mutually exclusive experimental arrangements.[30] For instance, the double-slit experiment reveals wave-like interference when no which-path information is sought, but particle-like behavior emerges when such information is obtained, illustrating how complementary descriptions are context-dependent and exhaustive for understanding quantum systems.[30] Bohr argued that this principle resolves apparent paradoxes by acknowledging the limitations of classical concepts in the quantum domain, without invoking hidden variables or deeper mechanisms.[30] Heisenberg and Max Born further shaped the probabilistic core of this interpretation by interpreting the quantum formalism as yielding probabilities for measurement outcomes, rather than deterministic trajectories. In 1927, Heisenberg emphasized the uncertainty principle, linking it to the unavoidable disturbance of quantum systems during observation, while Born's 1926 rule specified that the square of the wave function's amplitude gives the probability density for finding a particle at a given location. This statistical approach, formalized in their joint report at the 1927 Solvay Conference, treats quantum mechanics as a tool for calculating likelihoods of observable events, eschewing claims about unobservable intermediate states. John von Neumann provided a rigorous mathematical underpinning in his 1932 treatise, introducing the projection postulate—often called the collapse postulate—whereby the quantum state vector undergoes an instantaneous, non-unitary reduction to an eigenstate upon measurement of an observable. This postulate delineates the boundary between quantum evolution (governed by the Schrödinger equation) and classical measurement outcomes, reinforcing the Copenhagen emphasis on the act of measurement as the point where probabilities actualize. Von Neumann's framework thus operationalizes the theory, assuming a separation between the quantum system and a classical measuring apparatus without specifying the physical process of collapse. 
Instrumentalist extensions of these ideas, such as Quantum Bayesianism (QBism) developed by Christopher Fuchs and collaborators in the 2000s, treat the quantum state not as an objective feature of physical systems but as a subjective catalog of an agent's beliefs about future measurement results.[31] In QBism, probabilities derived from the Born rule reflect personal degrees of credence, updated Bayesian-style upon new data, thereby resolving interpretive puzzles like the measurement problem by relocating them to epistemic rather than ontological domains.[31] This approach aligns with Copenhagen's operationalism, prioritizing how quantum predictions guide empirical inquiry over any commitment to a hidden reality beneath the formalism.[31] A defining feature of Copenhagen and instrumentalist views is their deliberate avoidance of assumptions about hidden variables or an independent quantum reality, instead concentrating exclusively on verifiable predictions for macroscopic measurements.[29] This pragmatic stance has enabled quantum mechanics' extraordinary success in applications, from atomic spectra to quantum technologies, by focusing on what can be tested rather than unobservable entities.[29] Criticisms of these interpretations, particularly from the 1920s through the 1950s, center on the lack of a detailed mechanism for wave function collapse, which appears as an ad hoc addition to the unitary evolution of the Schrödinger equation without physical justification.[29] Einstein, for instance, famously objected that such a view renders quantum theory incomplete, as it fails to explain the transition from probabilistic superpositions to singular outcomes in a causally coherent manner.[29] Despite these challenges, the instrumentalist emphasis on empirical adequacy has sustained Copenhagen's influence as the de facto framework for much of quantum physics practice.[29]

Ontological interpretations

Ontological interpretations of quantum mechanics aim to provide a realist ontology by attributing objective existence to quantum entities, such as particles or wave functions, thereby addressing foundational issues like the measurement problem through underlying physical dynamics rather than observer-dependent collapse. These approaches posit that the quantum state describes an objective reality, often incorporating hidden variables or universal evolution to recover classical-like definiteness while reproducing empirical predictions. Unlike instrumentalist views, they seek a complete description of physical systems independent of measurement contexts.[32]

A prominent example is Bohmian mechanics, also known as the de Broglie-Bohm pilot-wave theory, introduced by David Bohm in 1952. In this framework, particles possess definite trajectories and positions at all times, with their motion determined by a guiding equation derived from the wave function $\psi$. The velocity of a particle is given by

$$ \frac{d\mathbf{x}}{dt} = \frac{\hbar}{m} \, \Im\!\left( \frac{\nabla \psi}{\psi} \right), $$

where $\mathbf{x}$ is the particle position, $m$ its mass, $\hbar$ the reduced Planck constant, and $\Im$ denotes the imaginary part. The wave function evolves according to the standard Schrödinger equation, acting as a nonlocal pilot wave that influences particle motion without being altered in return. This deterministic ontology resolves the measurement problem by eliminating collapse, as apparent randomness arises from ignorance of the initial positions, though it implies instantaneous influences across distances, challenging locality.[33]

Another key ontological interpretation is the many-worlds interpretation, formulated by Hugh Everett III in 1957. Here, the entire universe is described by a single universal wave function that evolves linearly via the Schrödinger equation, with no wave function collapse occurring during measurements. Instead, interactions between quantum systems and observers lead to entanglement, effectively branching the universe into multiple, non-interacting worlds, each realizing a different outcome of a superposition. For instance, in a spin measurement, the observer-system state splits into correlated branches where the particle is "up" in one world and "down" in another, with all branches coexisting objectively. This approach maintains unitarity and avoids special roles for measurement, providing a fully objective dynamics, but it requires accepting the proliferation of worlds and raises questions about the preferred basis for branching.[34]

The consistent histories interpretation offers an alternative ontological framework, pioneered by Robert B. Griffiths in 1984 and extended by Roland Omnès. It generalizes the Born rule to assign probabilities not just to single events but to entire sequences or "histories" of quantum events, defined as chains of projectors on the system's Hilbert space. A set of histories is deemed consistent or decoherent if the interference between distinct histories vanishes, i.e., $\operatorname{Re} \operatorname{Tr}(C_{h} \, \rho \, C_{h'}^{\dagger}) = 0$ for $h \neq h'$, where $\rho$ is the density operator and the $C_h$ are class operators, ensuring that classical probability rules apply without contradiction. This allows objective descriptions of closed quantum systems over time, resolving the measurement problem by selecting decoherent families of histories as physically realizable narratives, akin to classical paths. However, the approach depends on choosing appropriate decoherent sets, which can introduce ambiguity in the preferred basis.[35]

Collectively, these ontological interpretations resolve the measurement problem by invoking objective, collapse-free dynamics—deterministic guidance in Bohmian mechanics, universal branching in many-worlds, or decoherent narratives in consistent histories—but they encounter challenges such as nonlocality, ontological extravagance, or basis selection issues.[32]
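
To illustrate the guiding equation numerically, the sketch below (with the illustrative convention $\hbar = m = 1$ and an assumed wave function) evaluates the Bohmian velocity field for an equal superposition of two plane waves, for which the velocity is constant away from the nodes:

```python
import numpy as np

hbar = m = 1.0
x = np.linspace(-20, 20, 4001)
k1, k2 = 1.0, 2.0

# Equal superposition of two plane waves
psi = np.exp(1j * k1 * x) + np.exp(1j * k2 * x)

dpsi_dx = np.gradient(psi, x, edge_order=2)      # numerical derivative
v = (hbar / m) * np.imag(dpsi_dx / psi)          # Bohmian velocity field

# Analytically this superposition gives the constant velocity (k1 + k2)/2
# away from the nodes of psi; verify wherever |psi| is not close to zero.
away_from_nodes = np.abs(psi) > 0.5
print(np.allclose(v[away_from_nodes], (k1 + k2) / 2, atol=1e-3))   # True
```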

Informational and relational approaches

Informational approaches to quantum foundations posit that the fundamental entities of the universe are not physical objects or fields per se, but rather information and the processes by which it is acquired and processed. This perspective, famously encapsulated in John Archibald Wheeler's "it from bit" hypothesis, suggests that every physical "it"—every particle, force, or spacetime feature—derives its existence and properties from binary yes/no questions answered through quantum measurements, rendering reality as an emergent construct from informational exchanges.[36] Wheeler argued that this framework bridges quantum mechanics with general relativity by treating information as the bedrock, influencing subsequent efforts to derive quantum rules from informational axioms, such as the no-cloning theorem and the Born rule, without presupposing the full Hilbert space structure.[36] Relational quantum mechanics (RQM), proposed by Carlo Rovelli in 1996, extends this informational ethos by viewing quantum states as inherently relative to specific observers or systems, eschewing any absolute, observer-independent facts about the world. In RQM, the state of a system is not a complete description of reality but a tool encoding correlations between that system and another interacting system, such as an observer; outcomes of measurements are thus relativized, with different observers potentially describing the same event differently without contradiction.[37] This relational ontology treats the wave function as epistemic—reflecting an observer's knowledge of correlations rather than an ontic entity representing the system's intrinsic state—thereby avoiding paradoxes like the measurement problem by denying a unique, universal reality and emphasizing observer-dependent perspectives.[37] For instance, in the Schrödinger's cat thought experiment, the cat's state is alive relative to one observer and dead relative to another, resolving apparent superpositions without collapse.[37] Quantum Darwinism, developed by Wojciech H. Zurek starting in the early 2000s, complements these views by explaining how classical objectivity emerges from quantum superpositions through environmental interactions that redundantly broadcast information about preferred states. The theory posits that certain robust "pointer states" of a quantum system—those stable under decoherence—proliferate copies of themselves in the environment via entanglement, allowing multiple observers to access the same classical information without directly interacting with the system. This redundancy acts as a Darwinian selection process, where only the fittest states (those maximizing information survival) become "observable" reality, with the environment serving as a witness that amplifies classical correlations while suppressing quantum alternatives. Decoherence plays a supportive role here by enabling this proliferation without resolving the interpretive issues outright. A key recent development within relational and informational paradigms is the revival of the Page-Wootters mechanism, originally proposed in 1983, which derives the appearance of time and dynamical evolution from static quantum entanglement in a timeless universe. In this framework, time emerges relationally as correlations between a clock subsystem and the rest of the system, with the global wave function remaining stationary while conditional states evolve as if under the Schrödinger equation. 
Revived in the 2010s through experimental tests and theoretical extensions, the mechanism aligns with informational approaches by treating time as an emergent relational property, further supporting the idea that quantum theory describes correlations rather than absolute states. This avoids foundational puzzles like the origin of time in quantum gravity by relativizing temporal facts to entangled subsystems. Despite these diverse approaches, a 2025 Nature survey of over 1,100 physicists revealed sharp divisions, with no single interpretation achieving consensus; the Copenhagen interpretation remains the most favored among respondents, underscoring the field's ongoing debates.[38]

Axiomatic reconstructions

Traditional Hilbert space formalism

The traditional Hilbert space formalism establishes the mathematical foundation of quantum mechanics through a set of axioms primarily formulated by Paul Dirac and John von Neumann in the late 1920s and early 1930s.[39][16] This framework posits that quantum systems are described in an infinite-dimensional, separable Hilbert space $\mathcal{H}$ over the complex numbers, where physical states and observables are represented abstractly without reference to specific coordinates.[16] The axioms emphasize the duality between states and observables, enabling probabilistic predictions for measurement outcomes while incorporating non-classical features like superposition.[39]

Central to the formalism are the representations of quantum states and observables. A pure state of the system is given by a ray in $\mathcal{H}$, corresponding to a normalized vector $|\psi\rangle$ unique up to a global phase factor $e^{i\theta}$, such that $\langle \psi | \psi \rangle = 1$.[16] Mixed states are described by density operators $\rho$, which are positive semi-definite trace-class operators with $\operatorname{Tr}(\rho) = 1$.[16] Observables, representing measurable quantities like position or momentum, are modeled as self-adjoint operators $\hat{A}$ on $\mathcal{H}$, ensuring real-valued expectation values $\langle \hat{A} \rangle = \langle \psi | \hat{A} | \psi \rangle$.[39][16]

The connection between observables and measurement outcomes relies on the spectral theorem for self-adjoint operators, which guarantees a spectral decomposition $\hat{A} = \int_{-\infty}^{\infty} a \, dE(a)$, where $E(a)$ is a projection-valued measure onto the eigenspaces and the spectrum consists of the possible outcomes $a$.[16] For a non-degenerate eigenvector satisfying $\hat{A} |\psi\rangle = a |\psi\rangle$, a measurement yields the definite outcome $a$ with probability 1.[16] In general, the probability of obtaining outcome $a$ for state $|\psi\rangle$ is given by $p(a) = \langle \psi | E(da) | \psi \rangle$, aligning with the Born rule derived within this structure.[39][16]

Time evolution between measurements proceeds deterministically via unitary dynamics governed by the Schrödinger equation:

$$ i \hbar \frac{\partial}{\partial t} |\psi(t)\rangle = \hat{H} |\psi(t)\rangle, $$

where $\hat{H}$ is the self-adjoint Hamiltonian operator encoding the system's energy.[39][16] This generates a one-parameter unitary group $U(t) = e^{-i \hat{H} t / \hbar}$, preserving the norm $\| U(t) |\psi\rangle \| = \| |\psi\rangle \|$ and thus the total probability.[16] The equation applies to closed systems without external interactions, maintaining coherence until a measurement intervenes.[39]

Measurement introduces the projection postulate, which describes the non-unitary collapse of the state upon obtaining an outcome.[16] If the measurement of $\hat{A}$ yields $a$, the post-measurement state is the normalized projection $|\psi'\rangle = E(a)|\psi\rangle / \sqrt{\langle \psi | E(a) | \psi \rangle}$, obtained with probability $\langle \psi | E(a) | \psi \rangle$.[16] This postulate, distinct from unitary evolution, accounts for the irreversible nature of observation but raises foundational questions about its physical mechanism.[16]

A key result supporting the uniqueness of this probabilistic structure is Gleason's theorem, which proves that any probability measure on the closed subspaces of a Hilbert space $\mathcal{H}$ with $\dim \mathcal{H} \geq 3$ must be of the form $\mu(P) = \operatorname{Tr}(\rho P)$ for some density operator $\rho$, assuming non-contextuality (i.e., frame functions independent of the orthonormal basis choice).[40] The theorem applies to complex Hilbert spaces and excludes dimension 2 due to counterexamples like the Kochen-Specker theorem's precursors, reinforcing the formalism's reliance on the standard Born rule without hidden variables.[40] This axiomatic setup assumes a complex, separable Hilbert space of dimension at least 3 and focuses on non-relativistic phenomena, without incorporating special relativity or field-theoretic extensions.[16] Limitations include the absence of a direct relativistic formulation and challenges in treating infinite-dimensional cases rigorously for unbounded operators.[16]
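
The following self-contained sketch (a randomly generated three-level example, assumed for illustration) exercises these axioms numerically: it spectrally decomposes an observable, extracts Born-rule probabilities, and checks that the unitary group generated by a Hamiltonian preserves the norm:

```python
import numpy as np

hbar, t = 1.0, 0.7
rng = np.random.default_rng(0)

# A random Hermitian observable and a random normalized state on C^3
M = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))
A = (M + M.conj().T) / 2
psi = rng.standard_normal(3) + 1j * rng.standard_normal(3)
psi /= np.linalg.norm(psi)

# Spectral theorem: A = sum_a a |a><a|; Born rule: p(a) = |<a|psi>|^2
eigvals, eigvecs = np.linalg.eigh(A)
probs = np.abs(eigvecs.conj().T @ psi) ** 2
print("outcomes:", np.round(eigvals, 3))
print("Born-rule probabilities:", np.round(probs, 3), " sum =", round(probs.sum(), 6))

# Unitary evolution U(t) = exp(-i H t / hbar), built from the spectral decomposition,
# preserves the norm of the state (here the observable doubles as the Hamiltonian)
U = eigvecs @ np.diag(np.exp(-1j * eigvals * t / hbar)) @ eigvecs.conj().T
print("norm after evolution:", round(np.linalg.norm(U @ psi), 6))   # 1.0
```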

Generalized probabilistic theories

Generalized probabilistic theories (GPTs) provide a broad operational framework for axiomatic reconstructions of physical theories, encompassing classical probability theory, quantum theory, and hypothetical extensions beyond both. In this approach, physical systems are described abstractly through their statistical predictions, without presupposing specific mathematical structures like Hilbert spaces. A GPT consists of a state space, which is a convex set representing all possible preparations of the system, with pure states as the extreme points and mixed states as convex combinations thereof. Effects are affine functionals on the state space mapping to probabilities between 0 and 1, corresponding to measurement outcomes, while tests are collections of effects that form complete measurements summing to the unit effect.[41] Atomicity in GPTs posits that every state can be decomposed into a finite mixture of perfectly distinguishable pure states, ensuring that information is carried by indivisible "atoms" of probability. Purification is a key principle stating that every state admits a purification—a pure state on an extended system whose marginal recovers the original state—and that such purifications are unique up to reversible channels on the purifying system. These principles, along with causality (ensuring no superluminal signaling) and atomic parallelism (parallel composition of atomic tests yields atomic tests), structure the theory's composite systems via a convex combination of parallel and sequential compositions.[41] In the 2000s and 2010s, Giulio Chiribella and collaborators developed an information-theoretic approach within GPTs to derive quantum theory from elementary axioms. Their framework begins with causality, which implies a unique normalization for states; perfect distinguishability, allowing non-orthogonal states to be told apart with certainty using multiple copies; and ideal compression, enabling lossless encoding into minimal-dimensional systems. Additional axioms include local distinguishability for composite states and purification, which together imply that the state space is a simplex of rank equal to the dimension squared, leading to the quantum formalism of density operators on complex Hilbert spaces. These axioms rule out classical theory by permitting superposition and entanglement, while excluding more exotic theories through constraints on information processing.[42] Key principles in GPTs, such as the no-cloning and no-deleting theorems, arise as limits on discriminability in non-classical theories. The no-cloning theorem states that no channel can perfectly copy an unknown state onto an ancillary system for all input states, a consequence of the non-orthogonality of states in theories beyond classical probability. Similarly, the no-deleting theorem prohibits the perfect erasure of an unknown state while preserving others, emerging from the same operational constraints on reversible transformations and purification in GPTs. These no-go results highlight quantum theory's uniqueness in balancing information preservation with irreversibility.[43] Quantum theory's uniqueness within GPTs is established by sets of 5 to 8 axioms that exclude both classical probability and supra-quantum correlations like Popescu-Rohrlich (PR) boxes. 
Lucien Hardy's 2001 formulation uses five axioms—probabilistic consistency, simplicity of degrees of freedom, subspace inheritance, tensor product composition for composites, and continuous reversibility—deriving quantum mechanics while ruling out classical theory (which lacks continuous transformations between pure states) and PR boxes (which violate the tensor structure and bounded correlations). Complementing this, Lluís Masanes and Markus P. Müller in 2011 employed seven axioms, including finite dimensionality for binary systems, local tomography (states determined by local measurements), subspace equivalence, continuous symmetry of pure states, and allowance of all effects for qubits, proving that only classical and quantum theories satisfy them; adding continuity selects quantum theory alone. Subsequent reconstructions, including those by Masanes et al. (2013), Goyal (2014), and Höhn (2017), have further developed these information-theoretic axioms, reinforcing quantum theory's uniqueness within GPTs.[44][45] These reconstructions demonstrate that quantum theory is the unique GPT compatible with intuitive physical requirements like efficient information encoding and local observability.[44] A pivotal result in this context is Solér's theorem from 1995, which characterizes infinite-dimensional orthomodular lattices (abstracting quantum logic) with an orthonormal basis as Hilbert spaces over the real numbers, complex numbers, or quaternions, thereby implying the complex Hilbert space structure underlying quantum theory when combined with GPT axioms like purification and atomicity. Quantum theory emerges as a special case of GPTs, where the state space is the set of density matrices on a complex Hilbert space, effects are bounded positive operators, and composites follow the tensor product rule.

Categorical and operational frameworks

Categorical quantum mechanics provides a diagrammatic and compositional framework for quantum theory, treating physical processes as morphisms in a symmetric monoidal category enriched over Hilbert spaces, where objects represent quantum systems and parallel composition corresponds to tensor products.[46] Pioneered by Samson Abramsky and Bob Coecke in 2004, this approach emphasizes dagger-compact structure to capture unitarity and complementarity, with dagger symmetry ensuring that processes are reversible and self-adjoint in a graphical sense. Measurements are modeled using Frobenius algebras, which axiomatize classical structures within the quantum category, allowing complementary observables to be represented as mutually unbiased algebras.[46] A key development in this framework is the ZX-calculus, introduced by Bob Coecke and Ross Duncan in 2008, which extends categorical quantum mechanics with a graphical language using Z- and X-spiders to simplify proofs of quantum protocols and circuit equivalences.[47] In process theories, the diagrammatic representation uses wires for systems and boxes for channels, enabling intuitive composition of quantum operations while highlighting complementarity through the dagger structure that enforces probabilistic interpretations. This categorical perspective reconstructs quantum theory from abstract principles of compositionality, avoiding direct reliance on Hilbert space coordinates.[46] Operational frameworks treat quantum systems as black boxes, focusing on statistics derived from preparations, transformations, and measurements without presupposing an underlying ontology.[44] Lucien Hardy's 2001 axiomatization derives the full quantum formalism, including the Born rule, from five principles: systems as abstract entities, compound systems via direct sum and tensor product, continuous reversible transformations, and a simple composability condition for overlapping measurements.[44] Building on this, Howard Barnum and Christopher Fuchs emphasized operational probabilities as the core of quantum information, where states are compendia of outcome probabilities for all possible measurements, leading to derivations of quantum features like no-cloning from informational constraints. Robert Spekkens' 2007 toy model serves as a classical analogue within operational frameworks, illustrating epistemic interpretations where quantum-like behavior emerges from limited knowledge about ontic states, with "self-dual" information flow mimicking quantum complementarity without true superposition.[48] In this model, systems have hidden degrees of freedom, and operations reveal partial information, reproducing key quantum phenomena such as interference in a restricted epistemic setting. More recently, quantum combs, developed by Giulio Chiribella and collaborators in the late 2000s, extend operational frameworks to higher-order processes, representing networks of channels as combs that input and output quantum instruments, enabling the reconstruction of quantum theory from principles of causality and purification.[49] This formalism captures indefinite causal orders and provides a process-theoretic basis for quantum advantage in information tasks.

Extensions and alternative theories

Objective collapse models

Objective collapse models propose modifications to the Schrödinger equation that introduce spontaneous, objective wave function collapses, aiming to resolve the measurement problem without invoking a special role for observers. These theories maintain the predictive success of standard quantum mechanics for microscopic systems while ensuring that macroscopic superpositions decay rapidly, leading to definite outcomes. Unlike interpretations that preserve unitary evolution, objective collapse models alter the dynamics to include stochastic or nonlinear terms that localize the wave function in position space over time.[50]

The Ghirardi–Rimini–Weber (GRW) model, introduced in 1986, posits spontaneous localization events occurring at a low rate for individual particles but amplified for macroscopic objects. In this framework, the wave function undergoes rare, random collapses modeled as Gaussian multiplications, with a localization rate $ \lambda \approx 10^{-16}\ \mathrm{s}^{-1} $ per particle and a spatial smearing width of about 100 nm. Because the effective collapse rate scales with the number of constituent particles, an object containing on the order of $10^{23}$ nucleons undergoes a localization roughly every $10^{-7}$ s, so superpositions involving macroscopic bodies collapse almost immediately. This mechanism provides an objective criterion for collapse, independent of measurement, while reproducing quantum predictions for isolated systems.

A continuous variant, the continuous spontaneous localization (CSL) model developed in 1990, replaces discrete jumps with a stochastic differential equation incorporating white noise. The evolution is governed by
$ d\psi = -\frac{i}{\hbar} H \, dt \, \psi - \frac{\lambda}{2} (q - \langle q \rangle)^2 \, dt \, \psi + \sqrt{\lambda}\, (q - \langle q \rangle) \, dW \, \psi, $
where $ H $ is the Hamiltonian, $ q $ is the position operator, $ \langle q \rangle $ is its expectation value, $ \lambda $ is the localization rate, and $ dW $ is a Wiener noise increment. CSL avoids the discontinuity of GRW while preserving particle symmetries and leading to similar amplification for macroscopic systems.

The Diósi–Penrose model, proposed in the late 1980s (Diósi 1987–1989; Penrose 1989), links collapse to gravitational effects, suggesting that superpositions of differing spacetime geometries become unstable. Diósi's formulation incorporates gravitational self-energy differences driving diffusion, while Penrose argues for objective reduction when the ambiguity between the superposed geometries reaches the order of the Planck scale, set by the Planck length $ l = \sqrt{\hbar G / c^3} \approx 1.6 \times 10^{-35} $ m. This gravity-induced collapse occurs on a timescale of order $ \hbar / E_G $, where $ E_G $ is the gravitational self-energy of the difference between the superposed mass distributions, so larger masses and wider separations decohere faster, naturally suppressing macroscopic quantum coherence. Recent analyses as of 2024 indicate that the model predicts gravitationally induced entanglement, offering potential new experimental tests in tabletop setups.[51]

These models address the measurement problem by providing an objective, intrinsic collapse mechanism, distinct from environment-induced decoherence, though they share the goal of explaining classical emergence. They predict testable deviations, such as excess radiation from spontaneous localizations or modified interferometry patterns, but current experiments impose tight constraints; for instance, molecular interferometry and optomechanical experiments in the 2020s have set upper limits on the CSL rate $ \lambda $ below approximately $ 10^{-13}\ \mathrm{s}^{-1} $ for correlation lengths around 100 nm, depending on the specific parameters, while underground photon emission tests have ruled out parameter-free versions of the Diósi–Penrose model. Recent developments in Diósi's stochastic gravity approach, updated in the 2020s, refine the model by incorporating quantum fluctuations in the gravitational field to ensure energy conservation and avoid divergences, as explored in analyses of superposition decay rates and experimental tests.
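A minimal numerical sketch of the dynamics above can be obtained by an Euler–Maruyama discretization of the stochastic equation for a single particle on a one-dimensional grid, setting $H = 0$ to isolate the localization term, renormalizing the state after each step, and using an exaggerated collapse rate for visibility. All parameter values here are illustrative, not the physical GRW/CSL values, and the code is not drawn from the cited references.

```python
import numpy as np

rng = np.random.default_rng(0)

# One-dimensional grid and an initial superposition of two Gaussian packets
x = np.linspace(-10.0, 10.0, 400)
dx = x[1] - x[0]
psi = np.exp(-(x - 4.0) ** 2) + np.exp(-(x + 4.0) ** 2)
psi = psi / np.sqrt(np.sum(np.abs(psi) ** 2) * dx)

lam = 0.1      # exaggerated collapse rate (illustrative only)
dt = 1e-3
for _ in range(5000):
    q_mean = np.sum(x * np.abs(psi) ** 2) * dx            # <q>
    dW = np.sqrt(dt) * rng.standard_normal()               # Wiener increment
    # Euler-Maruyama step of the collapse terms (H = 0 for simplicity)
    psi = psi + (-0.5 * lam * (x - q_mean) ** 2 * dt
                 + np.sqrt(lam) * (x - q_mean) * dW) * psi
    psi = psi / np.sqrt(np.sum(np.abs(psi) ** 2) * dx)     # renormalize

# The surviving weight concentrates on one of the two packets at random
left = np.sum(np.abs(psi[x < 0]) ** 2) * dx
print(f"probability weight left of origin: {left:.3f}")
```

Repeated runs with different random seeds localize to the left or right packet with frequencies matching the initial Born-rule weights, which is the qualitative content of the CSL mechanism.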

Stochastic and nonlinear modifications

Stochastic modifications of quantum mechanics seek to reinterpret the theory through continuous random processes, providing an objective description of particle trajectories without invoking wave function collapse. A seminal example is Edward Nelson's stochastic mechanics, introduced in 1966, which models the motion of particles as a Markov diffusion process akin to Brownian motion. In this framework, the particle's position evolves according to the stochastic differential equation $ d\mathbf{x} = \mathbf{b}(\mathbf{x}, t)\, dt + d\mathbf{w} $, where $\mathbf{b}$ represents the mean forward drift velocity and $d\mathbf{w}$ is the increment of a Wiener process with diffusion coefficient $\hbar/2m$, capturing thermal-like fluctuations. Nelson derived the non-relativistic Schrödinger equation from Newtonian mechanics by incorporating an "osmotic velocity" $\mathbf{u}(\mathbf{x}, t) = \frac{\hbar}{m} \nabla \ln \sqrt{\rho(\mathbf{x}, t)}$, where $\rho$ is the probability density, thus linking classical stochastic dynamics to quantum behavior while preserving the equivalence to standard quantum predictions.

Nonlinear extensions, by contrast, directly alter the unitary evolution of the wave function to introduce deterministic but non-standard dynamics, potentially resolving issues like the measurement problem through objective evolution. Steven Weinberg proposed such a framework in 1989, generalizing the Schrödinger equation to $ i\hbar \frac{d\psi}{dt} = h(\psi, \psi^*) \psi $, where $ h $ is a homogeneous function that can include nonlinear dependencies on the wave function and its complex conjugate, allowing for small corrections to linear quantum mechanics while maintaining Hermiticity for observables. These models, however, predict deviations observable in precision experiments, such as shifts in energy levels or EPR correlations; for instance, recent precision experiments, including those with superconducting qubits as of 2025, have set bounds on the nonlinearity parameter $\gamma < 1.15 \times 10^{-12}$ at the 90% confidence level.[52]

Both stochastic and nonlinear approaches aim to endow quantum dynamics with greater objectivity, avoiding observer-dependent collapse, and stochastic variants were extended to relativistic settings in the 1990s via formulations that incorporate Lorentz-invariant diffusions and proper-time stochastic processes, ensuring compatibility with special relativity without violating causality. These relativistic extensions, such as those mapping Klein–Gordon solutions to Markov diffusions, maintain the derivation of relativistic wave equations from stochastic principles.[53][54] In more recent developments during the 2020s, continuous-variable stochastic models have emerged for testing these ideas in optomechanical systems, where quantum fluctuations in light–mechanical-oscillator couplings induce effective stochastic trajectories for macroscopic probes, enabling experimental tests of quantum–stochastic interfaces at larger scales. For example, linear optomechanical interactions generate quantum-induced diffusion in a semiclassical mechanical mode, with variance scaling as the cooperativity parameter, offering pathways to detect deviations from linear quantum predictions.[55]
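How Nelson's diffusion reproduces quantum statistics can be seen in the harmonic-oscillator ground state: the phase is spatially constant (so the current velocity vanishes) and the osmotic velocity formula above gives a forward drift $\mathbf{b} = -\omega x$, so the stationary distribution of the resulting Ornstein–Uhlenbeck process matches the Born-rule density with variance $\hbar/2m\omega$. The sketch below (an illustrative check in natural units $\hbar = m = \omega = 1$, not from the cited sources) verifies this with an Euler–Maruyama ensemble.

```python
import numpy as np

rng = np.random.default_rng(1)

# Natural units: hbar = m = omega = 1.
# Nelson drift for the harmonic-oscillator ground state: b(x) = -omega * x,
# with Wiener increments of variance (hbar/m) * dt (diffusion coefficient hbar/2m).
omega, dt, n_steps, n_traj = 1.0, 1e-3, 20000, 5000
x = np.zeros(n_traj)
for _ in range(n_steps):
    x = x - omega * x * dt + np.sqrt(dt) * rng.standard_normal(n_traj)

print(f"empirical variance:   {x.var():.4f}")
print(f"Born-rule prediction: {0.5:.4f}  (hbar / (2 m omega))")
```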

Superdeterminism and retrocausality

Superdeterminism proposes a deterministic framework for quantum mechanics that resolves apparent nonlocality in Bell inequality violations by positing correlations between measurement settings and hidden variables through a common cause in the initial conditions of the universe. In this view, championed by Gerard 't Hooft in the 2010s, the universe's evolution is fully deterministic, and quantum probabilities emerge from underlying classical variables, eliminating the need for superluminal influences while closing the "freedom-of-choice" loophole in Bell's theorem.[56] Specifically, 't Hooft's cellular automaton models treat quantum systems as emergent from deterministic rules at a fundamental level, where particle positions and momenta are precisely defined, and statistical independence between experimenters' choices and system states is not assumed.[56] As of 2025, ongoing quantum experiments aim to test the free-will assumption by generating measurement choices with high-entropy quantum sources, potentially constraining superdeterministic models if no correlations are found.[57]

A key feature of superdeterminism is the requirement for highly specific initial conditions that correlate all relevant variables, including observer choices, from the Big Bang onward, often described as a "conspiracy" in the universe's setup to produce observed quantum correlations. This setup avoids nonlocality but has drawn criticism for implying a lack of experimenter free will, as measurement settings are predetermined by the same hidden variables influencing outcomes, rendering experiments non-random in a way that challenges statistical assumptions in physics.[58] Debates in the 2020s, including analyses by proponents like Sabine Hossenfelder, highlight that while superdeterminism is logically consistent, its reliance on such fine-tuned correlations makes it empirically indistinguishable from standard quantum mechanics without additional testable predictions.

Retrocausality offers an alternative foundational approach by incorporating time-symmetric influences, where future boundary conditions affect past events through advanced waves propagating backward in time. In John G. Cramer's transactional interpretation, introduced in 1986, quantum events arise from "transactions" between retarded (forward-propagating) and advanced (backward-propagating) waves that satisfy boundary conditions at emission and absorption points, ensuring a fully causal, relativistically invariant description without collapse or hidden variables. This model interprets the Schrödinger equation's solutions as real physical waves, with absorbers in the future "handshaking" with emitters via these waves to form definite outcomes, providing a retrocausal mechanism for entanglement correlations.

The two-state vector formalism (TSVF), originating from the work of Yakir Aharonov, Peter Bergmann, and Joel L. Lebowitz in the 1960s, formalizes this time symmetry by describing quantum states with both a pre-selected forward-evolving state vector and a post-selected backward-evolving one, enabling the computation of weak values through weak measurements that minimally disturb the system.
Recent experiments in the 2020s using weak measurements have tested TSVF predictions, such as anomalous weak values in entangled systems, confirming consistency with standard quantum mechanics while exploring time-symmetric features like the "past of a quantum particle."[59] For instance, studies on multipartite entangled qubits have characterized state updates via nondestructive weak measurements, with the observed weak values agreeing with the theoretical expectations of the formalism. Experimental investigations of retrocausality, particularly in delayed-choice entanglement scenarios, have found no evidence of backward causation beyond standard quantum predictions. Analyses in 2023 of delayed-choice experiments, including entanglement swapping, concluded that forward-time interpretations suffice without invoking retrocausal influences, since the correlations emerge from entanglement alone rather than from future choices altering past states. These results reinforce that while retrocausal models like TSVF provide interpretive tools, they do not necessitate modifications to the quantum formalism to account for observed phenomena.
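In the two-state vector formalism, the weak value of an observable $A$ for pre-selected state $|\psi\rangle$ and post-selected state $|\phi\rangle$ is $A_w = \langle\phi|A|\psi\rangle / \langle\phi|\psi\rangle$, which can lie outside the eigenvalue range when the two states are nearly orthogonal. The short sketch below is an illustrative calculation of such an anomalous weak value for a qubit, not a reproduction of any of the cited experiments.

```python
import numpy as np

sigma_z = np.array([[1, 0], [0, -1]], dtype=complex)

def weak_value(pre, post, A):
    """Weak value A_w = <post|A|pre> / <post|pre> (two-state vector formalism)."""
    return (post.conj() @ A @ pre) / (post.conj() @ pre)

# Pre-selection |+> and a post-selection nearly orthogonal to it
pre = np.array([1, 1], dtype=complex) / np.sqrt(2)
alpha = np.deg2rad(40)                       # post-selection angle (illustrative)
post = np.array([np.cos(alpha), -np.sin(alpha)], dtype=complex)

w = weak_value(pre, post, sigma_z)
print(f"weak value of sigma_z: {w.real:.2f}")  # ~ 11.4, far outside [-1, +1]
```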

References
