Consistent histories
from Wikipedia

In quantum mechanics, the consistent histories interpretation, or simply "consistent quantum theory",[1] generalizes the complementarity aspect of the conventional Copenhagen interpretation. The approach is sometimes called decoherent histories,[2] although in other work "decoherent histories" refers to a more specialized formulation.[1]

First proposed by Robert Griffiths in 1984,[3][4] this interpretation of quantum mechanics is based on a consistency criterion that allows probabilities to be assigned to alternative histories of a system, such that the probabilities for each history obey the rules of classical probability while remaining consistent with the Schrödinger equation. In contrast to some interpretations of quantum mechanics, the framework does not treat "wavefunction collapse" as a description of any physical process, and it emphasizes that measurement theory is not a fundamental ingredient of quantum mechanics. Consistent histories allows predictions about the state of the universe of the kind needed for quantum cosmology.[5]

Key assumptions


The interpretation rests on three assumptions:

  1. states in Hilbert space describe physical objects,
  2. quantum predictions are not deterministic, and
  3. physical systems have no single unique description.

The third assumption generalizes complementarity, and it is this assumption that distinguishes consistent histories from other interpretations of quantum theory.[1]

Formalism


Histories


A homogeneous history $H_i$ (here $i$ labels different histories) is a sequence of propositions $P_{i,j}$ specified at different moments of time $t_{i,j}$ (here $j$ labels the times). We write this as:

$$H_i = (P_{i,1}, P_{i,2}, \ldots, P_{i,n_i})$$

and read it as "the proposition $P_{i,1}$ is true at time $t_{i,1}$, and then the proposition $P_{i,2}$ is true at time $t_{i,2}$, and then $\ldots$". The times $t_{i,1} < t_{i,2} < \cdots < t_{i,n_i}$ are strictly ordered and are called the temporal support of the history.

Inhomogeneous histories are multiple-time propositions which cannot be represented by a homogeneous history. An example is the logical OR of two homogeneous histories: $H_i \lor H_j$.

These propositions can correspond to any set of questions that include all possibilities. Examples might be the three propositions meaning "the electron went through the left slit", "the electron went through the right slit" and "the electron didn't go through either slit". One of the aims of the approach is to show that classical questions such as "Where are my keys?" are consistent. In this case one might use a large number of propositions, each specifying the location of the keys in some small region of space.

Each single-time proposition can be represented by a projection operator acting on the system's Hilbert space (we use "hats" to denote operators). It is then useful to represent homogeneous histories by the time-ordered product of their single-time projection operators. This is the history projection operator (HPO) formalism developed by Christopher Isham and naturally encodes the logical structure of the history propositions.
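For instance, the three slit propositions mentioned above correspond to mutually exclusive and exhaustive projectors (the symbols $\hat{P}_L$, $\hat{P}_R$, $\hat{P}_0$ are illustrative labels, not notation from the original article):

$$\hat{P}_L \hat{P}_R = \hat{P}_L \hat{P}_0 = \hat{P}_R \hat{P}_0 = 0, \qquad \hat{P}_L + \hat{P}_R + \hat{P}_0 = \hat{1}.$$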

Consistency


An important construction in the consistent histories approach is the class operator for a homogeneous history:

$$\hat{C}_{H_i} := T \prod_{j=1}^{n_i} \hat{P}_{i,j}(t_{i,j}) = \hat{P}_{i,n_i}(t_{i,n_i}) \cdots \hat{P}_{i,2}(t_{i,2})\, \hat{P}_{i,1}(t_{i,1})$$

The symbol $T$ indicates that the factors in the product are ordered chronologically according to their values of $t_{i,j}$: the "past" operators with smaller values of $t$ appear on the right side, and the "future" operators with greater values of $t$ appear on the left side. This definition can be extended to inhomogeneous histories as well.
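For example (continuing with the hypothetical slit projectors above), the two-time homogeneous history "the particle passes the left slit at $t_1$ and then arrives in region $D$ of the screen at $t_2$" has the class operator

$$\hat{C}_H = \hat{P}_D(t_2)\, \hat{P}_L(t_1),$$

with the later Heisenberg-picture projector on the left, as required by the time ordering $T$.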

Central to the consistent histories approach is the notion of consistency. A set of histories $\{H_i\}$ is consistent (or strongly consistent) if

$$\operatorname{Tr}\!\left(\hat{C}_{H_i}\, \rho\, \hat{C}_{H_j}^\dagger\right) = 0$$

for all $i \neq j$. Here $\rho$ represents the initial density matrix, and the operators are expressed in the Heisenberg picture.

The set of histories is weakly consistent if

$$\operatorname{Re}\operatorname{Tr}\!\left(\hat{C}_{H_i}\, \rho\, \hat{C}_{H_j}^\dagger\right) = 0$$

for all $i \neq j$.

Probabilities


If a set of histories is consistent, then probabilities can be assigned to them in a consistent way. We postulate that the probability of history $H_i$ is simply

$$\Pr(H_i) = \operatorname{Tr}\!\left(\hat{C}_{H_i}\, \rho\, \hat{C}_{H_i}^\dagger\right)$$

which obeys the axioms of probability if the histories $H_i$ come from the same (strongly) consistent set.

As an example, this means the probability of "$H_i$ OR $H_j$" equals the probability of "$H_i$" plus the probability of "$H_j$" minus the probability of "$H_i$ AND $H_j$", and so forth.
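The consistency condition and probability rule can be checked numerically on a toy model. The sketch below is an illustration only (the spin-1/2 system, the rotation angles, and all function names are assumptions, not taken from the article): it lists the four two-time histories of a qubit whose $z$-basis alternatives are specified at two times, computes $\operatorname{Tr}(\hat{C}_{H_i}\,\rho\,\hat{C}_{H_j}^\dagger)$ for each pair, and contrasts a consistent family (dynamics commuting with the projectors) with an inconsistent one. Because the first alternative is taken at the initial time, the Heisenberg-picture class operator reduces, inside the trace, to the Schrödinger-form product $P_{a_2} U P_{a_1}$ used in the code.

import numpy as np
from itertools import product

# Single-qubit projectors onto the z basis: the two alternatives at each time.
P = [np.array([[1, 0], [0, 0]], dtype=complex),   # "spin up" proposition
     np.array([[0, 0], [0, 1]], dtype=complex)]   # "spin down" proposition

def class_operator(hist, U):
    """Schrodinger-form class operator P_{a2} U P_{a1} for the two-time history (a1, a2)."""
    a1, a2 = hist
    return P[a2] @ U @ P[a1]

def decoherence_functional(rho, U):
    """D[i, j] = Tr(C_i rho C_j^dagger) over all two-time histories."""
    hists = list(product(range(2), repeat=2))
    D = np.zeros((len(hists), len(hists)), dtype=complex)
    for i, hi in enumerate(hists):
        for j, hj in enumerate(hists):
            Ci, Cj = class_operator(hi, U), class_operator(hj, U)
            D[i, j] = np.trace(Ci @ rho @ Cj.conj().T)
    return hists, D

# Initial state |+x> = (|0> + |1>)/sqrt(2).
psi = np.array([1, 1], dtype=complex) / np.sqrt(2)
rho = np.outer(psi, psi.conj())

# Dynamics between the two times: a z-rotation commutes with the projectors
# (consistent family); a y-rotation mixes the z basis (inconsistent family).
Uz = np.diag([np.exp(-0.3j), np.exp(0.3j)])
theta = 0.7
Uy = np.array([[np.cos(theta), -np.sin(theta)],
               [np.sin(theta),  np.cos(theta)]], dtype=complex)

for label, U in [("z-rotation (consistent)", Uz), ("y-rotation (inconsistent)", Uy)]:
    hists, D = decoherence_functional(rho, U)
    off = max(abs(D[i, j]) for i in range(4) for j in range(4) if i != j)
    print(f"{label}: max |D(i, j)| off the diagonal = {off:.3f}")
    for h, d in zip(hists, np.diag(D).real):
        print(f"  history {h}: D(h, h) = {d:.3f}")

For the z-rotation every off-diagonal element vanishes, so the diagonal entries are valid probabilities (0.5 for each of the two surviving histories, 0 for the others); for the y-rotation the nonzero off-diagonal elements signal that the family is not consistent and its diagonal entries cannot be read as classical probabilities.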

Interpretation


The interpretation based on consistent histories is used in combination with insights from quantum decoherence. Quantum decoherence implies that irreversible macroscopic phenomena (hence, all classical measurements) render histories automatically consistent, which allows one to recover classical reasoning and "common sense" when applied to the outcomes of these measurements. More precise analysis of decoherence allows (in principle) a quantitative calculation of the boundary between the classical domain and the quantum domain. According to Roland Omnès,[6]

[the] history approach, although it was initially independent of the Copenhagen approach, is in some sense a more elaborate version of it. It has, of course, the advantage of being more precise, of including classical physics, and of providing an explicit logical framework for indisputable proofs. But, when the Copenhagen interpretation is completed by the modern results about correspondence and decoherence, it essentially amounts to the same physics.

[... There are] three main differences:

1. The logical equivalence between an empirical datum, which is a macroscopic phenomenon, and the result of a measurement, which is a quantum property, becomes clearer in the new approach, whereas it remained mostly tacit and questionable in the Copenhagen formulation.

2. There are two apparently distinct notions of probability in the new approach. One is abstract and directed toward logic, whereas the other is empirical and expresses the randomness of measurements. We need to understand their relation and why they coincide with the empirical notion entering into the Copenhagen rules.

3. The main difference lies in the meaning of the reduction rule for 'wave packet collapse'. In the new approach, the rule is valid but no specific effect on the measured object can be held responsible for it. Decoherence in the measuring device is enough.

In order to obtain a complete theory, the formal rules above must be supplemented with a particular Hilbert space and rules that govern dynamics, for example a Hamiltonian.

In the opinion of others[7] this still does not make a complete theory as no predictions are possible about which set of consistent histories will actually occur. In other words, the rules of consistent histories, the Hilbert space, and the Hamiltonian must be supplemented by a set selection rule. However, Robert B. Griffiths holds the opinion that asking the question of which set of histories will "actually occur" is a misinterpretation of the theory;[8] histories are a tool for description of reality, not separate alternate realities.

Proponents of this consistent histories interpretation—such as Murray Gell-Mann, James Hartle, Roland Omnès and Robert B. Griffiths—argue that their interpretation clarifies the fundamental disadvantages of the old Copenhagen interpretation, and can be used as a complete interpretational framework for quantum mechanics.

In Quantum Philosophy,[9] Roland Omnès provides a less mathematical way of understanding this same formalism.

The consistent histories approach can be interpreted as a way of understanding which sets of classical questions can be consistently asked of a single quantum system, and which sets of questions are fundamentally inconsistent, and thus meaningless when asked together. It thus becomes possible to demonstrate formally why it is that the questions which Einstein, Podolsky and Rosen assumed could be asked together, of a single quantum system, simply cannot be asked together. On the other hand, it also becomes possible to demonstrate that classical, logical reasoning often does apply, even to quantum experiments – but we can now be mathematically exact about the limits of classical logic.

from Grokipedia
The consistent histories interpretation of quantum mechanics is a framework that assigns probabilities to sequences of quantum events, known as histories, across time, using families of such histories that satisfy a mathematical consistency condition to ensure additive probabilities without quantum interference. This approach treats quantum mechanics as a theory of probabilistic predictions for closed systems, avoiding the need for wave function collapse or special measurement postulates. Pioneered by Robert B. Griffiths in 1984, it generalizes the Born rule to conditional probabilities for chains of events represented by projectors on Hilbert space.

Subsequent developments expanded the formalism's scope and rigor. In the late 1980s, Roland Omnès refined the approach by emphasizing logical structure and decoherence, demonstrating how it resolves quantum paradoxes like the Einstein-Podolsky-Rosen paradox through consistent logical propositions. Independently, Murray Gell-Mann and James B. Hartle adapted it for quantum cosmology in the early 1990s, introducing the concept of decoherent histories, in which environmental interactions suppress interference, enabling quasi-classical descriptions of the universe's evolution. These extensions highlighted the role of the decoherence functional, a complex-valued measure whose diagonal elements yield probabilities when the off-diagonal terms vanish.

Central to the interpretation is the single-framework rule, which stipulates that all propositions about a system must be evaluated within one consistent family of histories to avoid contradictions from incompatible descriptions. This allows multiple valid frameworks for different questions—such as position versus momentum—each providing coherent answers without privileging a unique reality or implying nonlocality. For macroscopic systems, decoherence often selects families approximating classical dynamics, explaining the emergence of definite outcomes in everyday observations. Unlike the Copenhagen interpretation, it applies uniformly to isolated systems, including the entire universe, and integrates seamlessly with the standard quantum formalism without additional axioms.

Historical Context

Origins and Development

The consistent histories approach to quantum mechanics originated with Griffiths' proposal in 1984, which introduced a framework for assigning probabilities to sequences of events in closed quantum systems without relying on the collapse of the wave function. This method allowed quantum theory to describe the dynamics of isolated systems, such as the entire universe, by treating histories as chains of alternative quantum events that could be analyzed probabilistically when they satisfied certain consistency conditions. Griffiths' work was motivated by longstanding foundational challenges in quantum mechanics, including the desire for a realist interpretation that could apply Born's probability rule to closed systems without invoking external observers or measurements, thereby addressing concerns about the theory's completeness akin to those expressed by Einstein.

Independently, Roland Omnès advanced a parallel formulation in the late 1980s, emphasizing the role of decoherence in selecting consistent sets of histories and providing a logical structure for quantum predictions. Omnès' development built on Griffiths' ideas but incorporated environmental interactions to explain why certain histories become effectively classical, publishing key papers starting with "Logical reformulation of quantum mechanics I. Foundations" in 1988. His contributions culminated in influential books, including The Interpretation of Quantum Mechanics (1994), which systematized the approach and demonstrated its application to a wide range of quantum phenomena.

The framework gained further prominence through its extension to quantum cosmology by Murray Gell-Mann and James Hartle in 1990, who coined the term "decoherent histories" to highlight the importance of coarse-graining and decoherence in cosmological contexts. Their paper, "Quantum Mechanics in the Light of Quantum Cosmology," integrated the approach with the path-integral formulation, enabling probabilistic descriptions of the universe's evolution from the Big Bang onward without preferential initial conditions or observers. This work addressed the unique challenges of applying quantum mechanics to the universe as a whole, where traditional measurement-based interpretations fail due to the absence of an external environment.

The timeline of seminal publications underscores the rapid evolution of the approach: Griffiths' foundational 1984 article in the Journal of Statistical Physics (volume 36, pages 219–272); Omnès' 1988 paper in the Journal of Statistical Physics (volume 53, pages 893–932), followed by additional papers through 1992 and his 1994 book; and Gell-Mann and Hartle's 1990 contribution in the proceedings of the Third International Symposium on the Foundations of Quantum Mechanics (pages 425–458). Overall, these developments were driven by the need for a probability framework that restores a sense of narrative coherence to quantum theory while preserving its unitary dynamics, particularly for systems lacking classical boundaries.

Key Contributors

Robert B. Griffiths introduced the consistent histories approach in 1984, proposing a framework where sets of quantum histories—sequences of events described by projectors on Hilbert space—could be assigned classical-like probabilities without invoking wave function collapse, provided they satisfy a consistency condition that ensures non-interfering probabilities. This innovation emphasized the unitary evolution of closed quantum systems, allowing meaningful probabilistic statements about alternative histories without measurement-induced changes. Griffiths further elaborated this in his 2002 book Consistent Quantum Theory, solidifying the formalism's foundations for interpreting quantum mechanics in terms of decohering narratives.

Roland Omnès built upon Griffiths' work starting in 1988, developing what he termed the "logical" interpretation of quantum mechanics, which integrated consistent histories with classical emergence through decoherence mechanisms. Omnès emphasized the role of approximate consistency in real-world open systems, where environmental interactions suppress interference, and formalized the approach as a deductive structure akin to classical logic. His 1994 book The Interpretation of Quantum Mechanics provided a comprehensive synthesis, applying the formalism to resolve paradoxes like the Einstein-Podolsky-Rosen paradox and deriving classical behavior from quantum principles.

Murray Gell-Mann and James B. Hartle extended the framework to quantum cosmology in the late 1980s, coining the term "decoherent histories" to describe sets of histories that achieve approximate consistency via interactions with the environment, particularly relevant for closed universes without external observers. Their seminal 1990 paper outlined how the decoherence functional replaces the exact consistency condition, enabling probabilistic predictions in cosmological contexts like the early universe. This adaptation highlighted the formalism's utility beyond laboratory settings, influencing applications in cosmology.

The consistent histories approach drew inspiration from earlier ideas, notably Hugh Everett's 1957 relative-state formulation, which introduced branching quantum states without collapse, providing a conceptual basis for multiple coexisting histories. Wojciech H. Zurek's work on decoherence in the 1980s, particularly his 1981 analysis of environment-induced superselection, supplied the mechanism for why certain histories decohere preferentially, bridging quantum superpositions to classical probabilities.

Griffiths and Omnès engaged in key exchanges through their publications, debating the merits of exact consistency for idealized closed systems versus approximate consistency for practical, decoherence-dominated scenarios, with Omnès advocating broader applicability through logical structures while Griffiths stressed rigorous mathematical constraints. These discussions, echoed in Gell-Mann and Hartle's emphasis on cosmological applications, refined the formalism's scope and applicability.

Core Concepts

Fundamental Assumptions

The consistent histories interpretation of quantum mechanics rests on the foundational assumption that the entire closed system evolves unitarily according to the Schrödinger equation, without any collapse of the wave function upon measurement. This unitary evolution applies to closed quantum systems, treating measurements as interactions within the system rather than external interventions that alter the dynamics. By rejecting the collapse postulate, the approach maintains a deterministic, linear evolution for the universal wave function, allowing probabilities to emerge from the structure of histories rather than state reductions.

In this framework, histories represent alternative, mutually exclusive descriptions of events unfolding for a single quantum system over time, rather than branching into parallel worlds or ontological realities. These histories are constructed as sequences of quantum events, providing a probabilistic description that approximates classical trajectories without invoking multiple universes. The emphasis is on descriptive completeness within a given framework, ensuring that the interpretation aligns with the unitary dynamics of the system as a whole.

A key requirement is the coarse-graining of histories, where events are defined at a sufficiently broad level such that quantum interference between alternative paths becomes negligible, enabling classical-like behavior to emerge. This coarse-graining involves selecting sets of orthogonal projection operators that partition the Hilbert space into alternatives, avoiding fine details where superposition effects would violate probabilistic additivity. Such approximations are essential for the histories to form a consistent framework amenable to probability assignment.

The approach rejects the notion of a single preferred basis for describing quantum events, allowing multiple sets of consistent histories depending on the physical question or context under investigation. Different sets may be incompatible, but within each set probabilities can be meaningfully assigned, reflecting the context-dependent nature of quantum descriptions.

Finally, the Born rule is applied exclusively to consistent families of histories, yielding additive probabilities that mimic classical probabilities for the selected alternatives. For inconsistent families, where interference terms are non-zero, the rule does not hold, underscoring the need for consistency conditions to validate the probabilistic interpretation. This selective application connects to decoherence mechanisms that suppress off-diagonal terms in the decoherence functional, facilitating the emergence of consistent sets.
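As a brief worked statement of that selective application (added here for clarity; the notation follows the Mathematical Framework section below), coarse-graining two distinct histories $\alpha$ and $\alpha'$ of a family with initial state $\rho$ and class operators $C_\alpha$, $C_{\alpha'}$ gives

$$\operatorname{Tr}\!\big((C_\alpha + C_{\alpha'})\,\rho\,(C_\alpha + C_{\alpha'})^\dagger\big) = p(\alpha) + p(\alpha') + 2\operatorname{Re}\operatorname{Tr}\!\big(C_\alpha\, \rho\, C_{\alpha'}^\dagger\big),$$

so the Born-rule weights are additive exactly when the interference term on the right vanishes, which is the consistency requirement discussed in this section.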

Role of Decoherence

Decoherence refers to the process by which quantum superpositions of a system become effectively classical through entanglement with a large environment, leading to the rapid suppression of quantum interference effects. In this mechanism, the system's reduced density matrix loses its off-diagonal elements, which represent coherences between different states, due to the irreversible spread of correlations into the environmental degrees of freedom. This environmental interaction effectively selects preferred states, known as pointer states, that are robust against decoherence and align with classical observables.

Within the consistent histories framework, decoherence plays a crucial role by enabling the selection of sets of histories that exhibit negligible interference, thereby satisfying the consistency conditions approximately. Specifically, the off-diagonal terms in the decoherence functional, which quantify interference between distinct histories, decay exponentially due to environmental entanglement, allowing the diagonal elements—corresponding to individual incoherent histories—to dominate and behave like classical probabilities. This suppression makes it possible to assign meaningful probabilities to alternative sequences of events without paradoxical quantum effects.

Exact consistency among fine-grained histories is rare in open quantum systems, but decoherence provides practical approximate consistency for coarse-grained histories that are sufficiently separated in time and aligned with pointer states. These coarse-grained descriptions capture macroscopic events where environmental interactions have had time to erase interferences, rendering the histories effectively classical. Zurek's concepts of pointer states and einselection further underpin this process, as einselection dynamically favors those history branches that redundantly record information in the environment, ensuring their stability and objectivity across multiple observers.

Unlike the wave function collapse postulate, decoherence does not involve a non-unitary reduction of the quantum state but instead hides interference terms while preserving the overall unitary evolution of the system plus environment. This distinction keeps the fundamental dynamics intact, with apparent classicality arising solely from the observer's limited access to the full entangled state.
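A minimal numerical sketch of this mechanism (illustrative only; the coupling angle, system sizes, and function names are assumptions) prepares a qubit in an equal superposition, lets each of N environment qubits imperfectly record its z value, traces the environment out, and prints the surviving off-diagonal element of the reduced density matrix, which falls off as cos^N(theta) while the global state remains pure and evolves unitarily.

import numpy as np

def entangled_state(n_env, theta):
    """System qubit in |+>, each environment qubit records the system's z value:
    if the system is |1>, that environment qubit is rotated from |0> toward |1>."""
    e_if_0 = np.array([1.0, 0.0])                       # environment record for system |0>
    e_if_1 = np.array([np.cos(theta), np.sin(theta)])   # imperfect record for system |1>
    branch0 = np.array([1.0])
    branch1 = np.array([1.0])
    for _ in range(n_env):
        branch0 = np.kron(branch0, e_if_0)
        branch1 = np.kron(branch1, e_if_1)
    # |psi> = (|0>|E0> + |1>|E1>)/sqrt(2), with the system qubit as the first factor.
    return np.concatenate([branch0, branch1]) / np.sqrt(2)

def reduced_system_rho(psi, n_env):
    """Trace out the environment, keeping only the system qubit."""
    m = psi.reshape(2, 2 ** n_env)   # rows: system basis, columns: environment basis
    return m @ m.conj().T            # rho_sys[a, b] = sum_e psi[a, e] * conj(psi[b, e])

theta = 0.4  # per-qubit "measurement strength" of the environment
for n_env in [0, 1, 2, 5, 10, 16]:
    psi = entangled_state(n_env, theta)
    rho = reduced_system_rho(psi, n_env)
    print(f"N = {n_env:2d}: |rho_01| = {abs(rho[0, 1]):.4f}  "
          f"(cos^N(theta)/2 = {np.cos(theta) ** n_env / 2:.4f})")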

Mathematical Framework

Definition of Histories

In the consistent histories interpretation of quantum mechanics, a history is formally defined as an alternating sequence of projectors and unitary evolution operators representing a chain of quantum events over time. Specifically, a history $\alpha$ is a sequence of projectors $P_{\alpha_k}(t_k)$ for alternatives $\alpha_k$ at ordered times $t_0 \leq t_1 < \cdots < t_n$. This sequence captures the temporal evolution of a quantum system without presupposing measurement at each step, treating the entire process within the closed system's Hilbert space.

Associated with each history $\alpha$ is a class operator $C_\alpha$, which encodes the amplitude for that history and is constructed as

$$C_\alpha = P_{\alpha_n}(t_n)\, U(t_n, t_{n-1})\, P_{\alpha_{n-1}}(t_{n-1}) \cdots P_{\alpha_1}(t_1)\, U(t_1, t_0),$$

where the $U(t_k, t_{k-1})$ are successive unitary evolution operators from $t_{k-1}$ to $t_k$. The projectors $P_{t_k}^{(k)}$ act on the system's Hilbert space $\mathcal{H}$ and must satisfy orthogonality, $P_i P_j = \delta_{ij} P_i$ for distinct alternatives $i \neq j$ at the same time, ensuring mutual exclusivity of the events they represent, as well as completeness, $\sum_i P_i = I$, where $I$ is the identity operator, guaranteeing that the projectors exhaust all possibilities at that time.

Histories can be fine-grained or coarse-grained depending on the level of detail in the projectors. Fine-grained histories employ projectors that resolve the system into maximally refined alternatives, providing the most detailed description possible within a given framework, while coarse-grained histories aggregate multiple fine-grained projectors into broader ones, such as combining sub-events into a single outcome to simplify analysis while preserving the overall structure.

A representative example is the two-slit experiment, where histories describe a particle's path without interference terms by using projectors for passage through slit 1 or slit 2 at an intermediate time, followed by evolution to a final detection outcome at the screen; this formulation assigns amplitudes solely to individual slit paths, avoiding the cross terms that would arise in the full coherent superposition.
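Written out for that two-slit description (a schematic illustration; the labels $s_1$, $s_2$ and $d$ for the slits and a detection region are not from the original text), the coarse-grained class operators are

$$C_{1} = P_d(t_2)\, U(t_2, t_1)\, P_{s_1}(t_1)\, U(t_1, t_0), \qquad C_{2} = P_d(t_2)\, U(t_2, t_1)\, P_{s_2}(t_1)\, U(t_1, t_0),$$

and assigning separate probabilities to the two paths is legitimate only when the interference term $\operatorname{Tr}(C_1\, \rho\, C_2^\dagger)$ is negligible, for instance when which-slit information has leaked into an environment.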

Consistency Conditions

In the consistent histories interpretation of quantum mechanics, consistency conditions provide the mathematical criteria that a set or family of histories must satisfy so that well-defined, additive probabilities can be assigned to them without encountering quantum interference paradoxes. These conditions ensure that the probabilities behave classically, summing appropriately over alternative histories, which is essential for describing the evolution of closed quantum systems where no external measurements occur. The conditions are formulated in terms of the decoherence functional, a central object that quantifies the interference between pairs of histories.

The decoherence functional for a pair of histories $\alpha$ and $\beta$ is defined as $D(\alpha, \beta) = \operatorname{Tr}(C_\alpha \rho C_\beta^\dagger)$, where $\rho$ is the initial density operator of the system, and $C_\alpha$ and $C_\beta$ are the class operators representing the histories (constructed as time-ordered products of projection operators along each history). A family of histories is consistent if the off-diagonal elements of this functional vanish for distinct histories, i.e., $D(\alpha, \beta) = 0$ whenever $\alpha \neq \beta$. This diagonalization condition, introduced by Robert Griffiths, guarantees that there is no quantum interference between the histories, allowing the probability of a history $\alpha$ to be given by the Born rule as $p(\alpha) = D(\alpha, \alpha) = \operatorname{Tr}(C_\alpha \rho C_\alpha^\dagger)$, with the probabilities being additive over the family.

Griffiths' original formulation specifies strong consistency precisely through this full vanishing of the decoherence functional: for all $\alpha \neq \beta$, $\operatorname{Tr}(C_\alpha \rho C_\beta^\dagger) = 0$. This exact condition implies that the family of histories can be treated as a classical sample space, where the quantum mechanical evolution does not mix probabilities across branches. In the special case of a pure initial state $|\psi\rangle$, the condition simplifies to the orthogonality of the chain kets: $\langle \Psi^\alpha | \Psi^\beta \rangle = 0$ for $\alpha \neq \beta$, where $|\Psi^\alpha\rangle = C_\alpha |\psi\rangle$. Strong consistency is satisfied automatically for single-time or two-time measurements but becomes nontrivial for multi-time histories with more than two alternatives.

For practical applications where exact consistency is difficult to achieve due to residual weak interactions, weaker approximations have been developed. Weak consistency, proposed by Gell-Mann and Hartle, requires only that the real part of the off-diagonal elements vanishes: $\operatorname{Re} D(\alpha, \beta) = 0$ for $\alpha \neq \beta$, with probabilities still approximated by $p(\alpha) \approx D(\alpha, \alpha)$. This is sufficient for additive probabilities in many physical scenarios, such as those involving approximate decoherence. A related approximate condition is $\left|\sum_{\alpha \neq \beta} \operatorname{Tr}(C_\alpha^\dagger \rho C_\beta)\right| \ll \sum_\alpha \operatorname{Tr}(C_\alpha^\dagger \rho C_\alpha)$, which quantifies the smallness of the interference terms relative to the total probability.

Roland Omnès introduced graded levels of consistency, including medium and strong conditions, to address different degrees of approximation in real systems. Medium consistency requires that the imaginary parts of the off-diagonal elements are negligible and that the real parts are small compared to the geometric mean of the diagonal probabilities: $|\operatorname{Re} D(\alpha, \beta)| \ll \sqrt{D(\alpha, \alpha)\, D(\beta, \beta)}$.
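These graded conditions can be phrased operationally as checks on a given decoherence-functional matrix. The helper below is a sketch under stated assumptions (the function name, the tolerances, and the numerical stand-in for "much smaller than" are arbitrary choices, not part of the formalism):

import numpy as np

def consistency_checks(D, tol=1e-9, small=1e-3):
    """Evaluate the consistency conditions named above for a decoherence-functional
    matrix D[i, j] = Tr(C_i rho C_j^dagger).  `small` is an arbitrary stand-in for
    the "much smaller than" in Omnes' medium condition."""
    D = np.asarray(D, dtype=complex)
    n = D.shape[0]
    off = [(i, j) for i in range(n) for j in range(n) if i != j]
    geo = lambda i, j: np.sqrt(abs(D[i, i].real * D[j, j].real))
    return {
        "strong (Griffiths): D[i,j] = 0":
            all(abs(D[i, j]) < tol for i, j in off),
        "weak (Gell-Mann/Hartle): Re D[i,j] = 0":
            all(abs(D[i, j].real) < tol for i, j in off),
        "medium (Omnes): Im D negligible, |Re D| << sqrt(D_ii D_jj)":
            all(abs(D[i, j].imag) < tol and
                abs(D[i, j].real) <= small * geo(i, j) + tol
                for i, j in off),
    }

# A diagonal decoherence functional satisfies all three conditions.
print(consistency_checks(np.diag([0.5, 0.5])))
# Purely imaginary interference terms satisfy only the weak condition.
print(consistency_checks([[0.5, 0.2j], [-0.2j, 0.5]]))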