Quantum indeterminacy
from Wikipedia

Quantum indeterminacy is the apparent necessary incompleteness in the description of a physical system, which has become one of the characteristics of the standard description of quantum physics. Prior to quantum physics, it was thought that

  1. a physical system had a determinate state that uniquely determined all the values of its measurable properties, and
  2. conversely, the values of its measurable properties uniquely determined the state.

Quantum indeterminacy can be quantitatively characterized by a probability distribution on the set of outcomes of measurements of an observable. The distribution is uniquely determined by the system state, and moreover quantum mechanics provides a recipe for calculating this probability distribution.

Indeterminacy in measurement was not an innovation of quantum mechanics, since it had been established early on by experimentalists that errors in measurement may lead to indeterminate outcomes. By the latter half of the 18th century, measurement errors were well understood, and it was known that they could either be reduced by better equipment or accounted for by statistical error models. In quantum mechanics, however, indeterminacy is of a much more fundamental nature, having nothing to do with errors or disturbance.

Measurement


An adequate account of quantum indeterminacy requires a theory of measurement. Many theories have been proposed since the beginning of quantum mechanics and quantum measurement continues to be an active research area in both theoretical and experimental physics.[1] Possibly the first systematic attempt at a mathematical theory was developed by John von Neumann. The kinds of measurements he investigated are now called projective measurements. That theory was based in turn on the theory of projection-valued measures for self-adjoint operators that had been recently developed (by von Neumann and independently by Marshall Stone) and the Hilbert space formulation of quantum mechanics (attributed by von Neumann to Paul Dirac).

In this formulation, the state of a physical system corresponds to a vector of length 1 in a Hilbert space H over the complex numbers. An observable is represented by a self-adjoint (i.e. Hermitian) operator A on H. If H is finite dimensional, then by the spectral theorem A has an orthonormal basis of eigenvectors. If the system is in state ψ, then immediately after measurement the system will occupy a state that is an eigenvector e of A, and the observed value λ will be the corresponding eigenvalue of the equation Ae = λe. It is immediate from this that measurement in general will be non-deterministic. Quantum mechanics, moreover, gives a recipe for computing a probability distribution Pr on the possible outcomes given that the initial system state is ψ. The probability is

Pr(λ) = ⟨ψ, E(λ)ψ⟩ = ‖E(λ)ψ‖²,

where E(λ) is the projection onto the space of eigenvectors of A with eigenvalue λ.
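As a concrete illustration of this recipe, the following minimal sketch (an addition, not part of the original article; it assumes a finite-dimensional H represented by a NumPy array) diagonalizes a Hermitian operator, builds the eigenprojections E(λ), and evaluates Pr(λ) = ⟨ψ, E(λ)ψ⟩ for a given unit state vector:

    import numpy as np

    def measurement_distribution(A, psi):
        """Return {eigenvalue: Pr(eigenvalue)} for measuring Hermitian A in state psi."""
        psi = psi / np.linalg.norm(psi)           # states are unit vectors
        eigvals, eigvecs = np.linalg.eigh(A)      # spectral theorem for Hermitian A
        probs = {}
        for lam, v in zip(eigvals, eigvecs.T):
            lam = round(float(lam), 9)            # group (near-)degenerate eigenvalues
            amp = np.vdot(v, psi)                 # <e_lambda | psi>
            probs[lam] = probs.get(lam, 0.0) + abs(amp) ** 2
        return probs

    # Example: measuring spin-z on an equal superposition gives each outcome with probability 1/2.
    sigma_z = np.array([[1, 0], [0, -1]], dtype=complex)
    psi = np.array([1, 1], dtype=complex) / np.sqrt(2)
    print(measurement_distribution(sigma_z, psi))   # approximately {-1.0: 0.5, 1.0: 0.5}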

Example

Bloch sphere showing eigenvectors for the Pauli spin matrices. The Bloch sphere is a two-dimensional surface whose points correspond to the state space of a spin-1/2 particle. In the state ψ the value of σ1 is +1, whereas σ2 and σ3 take the values +1, −1 each with probability 1/2.

In this example, we consider a single spin 1/2 particle (such as an electron) in which we only consider the spin degree of freedom. The corresponding Hilbert space is the two-dimensional complex Hilbert space C2, with each quantum state corresponding to a unit vector in C2 (unique up to phase). In this case, the state space can be geometrically represented as the surface of a sphere, as shown in the figure on the right.

The Pauli spin matrices are self-adjoint and correspond to spin measurements along the three coordinate axes.

The Pauli matrices all have the eigenvalues +1, −1.

  • For σ1, these eigenvalues correspond to the eigenvectors (1/√2)(1, 1) and (1/√2)(1, −1).
  • For σ3, they correspond to the eigenvectors (1, 0) and (0, 1).

Thus in the state ψ = (1/√2)(1, 1), σ1 has the determinate value +1, while measurement of σ3 can produce either +1 or −1, each with probability 1/2. In fact, there is no state in which measurements of both σ1 and σ3 have determinate values.
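A small numerical check of this claim (an added illustration, assuming σ1 and σ3 are the Pauli x and z matrices and ψ is the +1 eigenvector of σ1 listed above):

    import numpy as np

    sigma_1 = np.array([[0, 1], [1, 0]], dtype=complex)    # Pauli x
    sigma_3 = np.array([[1, 0], [0, -1]], dtype=complex)   # Pauli z
    psi = np.array([1, 1], dtype=complex) / np.sqrt(2)     # the +1 eigenvector of sigma_1

    for name, op in [("sigma_1", sigma_1), ("sigma_3", sigma_3)]:
        vals, vecs = np.linalg.eigh(op)
        probs = {round(float(lam), 6): abs(np.vdot(v, psi)) ** 2
                 for lam, v in zip(vals, vecs.T)}
        print(name, probs)
    # sigma_1: approximately {-1.0: 0.0, 1.0: 1.0}  -> determinate value +1
    # sigma_3: approximately {-1.0: 0.5, 1.0: 0.5}  -> either outcome, probability 1/2 each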

There are various questions that can be asked about the above indeterminacy assertion.

  1. Can the apparent indeterminacy be construed as in fact deterministic, but dependent upon quantities not modeled in the current theory, which would therefore be incomplete? More precisely, are there hidden variables that could account for the statistical indeterminacy in a completely classical way?
  2. Can the indeterminacy be understood as a disturbance of the system being measured?

Von Neumann formulated question (1) and provided an argument for why the answer had to be no, if one accepted the formalism he was proposing. However, according to Bell, von Neumann's formal proof did not justify his informal conclusion.[2] A definitive but partial negative answer to (1) has been established by experiment: because Bell's inequalities are violated, any such hidden variable(s) cannot be local (see Bell test experiments).

The answer to (2) depends on how disturbance is understood, particularly since measurement entails disturbance (note, however, that this is the observer effect, which is distinct from the uncertainty principle). Still, in the most natural interpretation the answer is also no. To see this, consider two sequences of measurements: (A), which measures exclusively σ1, and (B), which measures only σ3, of a spin system in the state ψ. The measurement outcomes of (A) are all +1, while the statistical distribution of the measurements in (B) is still divided between +1 and −1 with equal probability.
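The following Monte Carlo sketch (an added illustration, not from the article) simulates both sequences, including the post-measurement collapse, and reproduces this behavior:

    import numpy as np

    rng = np.random.default_rng(0)

    def measure(op, state):
        """Projectively measure `op` in `state`; return (outcome, collapsed state)."""
        vals, vecs = np.linalg.eigh(op)
        probs = np.abs(vecs.conj().T @ state) ** 2
        k = rng.choice(len(vals), p=probs / probs.sum())
        return float(vals[k]), vecs[:, k]

    sigma_1 = np.array([[0, 1], [1, 0]], dtype=complex)
    sigma_3 = np.array([[1, 0], [0, -1]], dtype=complex)
    psi = np.array([1, 1], dtype=complex) / np.sqrt(2)       # the sigma_1 = +1 eigenstate

    outcomes_A = [measure(sigma_1, psi)[0] for _ in range(1000)]   # sequence (A)
    outcomes_B = [measure(sigma_3, psi)[0] for _ in range(1000)]   # sequence (B)
    print("A outcomes:", set(round(x) for x in outcomes_A))        # {1}: always +1
    print("B average:", sum(outcomes_B) / len(outcomes_B))         # near 0: +1 and -1 equally likely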

Other examples of indeterminacy


Quantum indeterminacy can also be illustrated in terms of a particle with a definitely measured momentum for which there must be a fundamental limit to how precisely its location can be specified. This quantum uncertainty principle can be expressed in terms of other variables, for example, a particle with a definitely measured energy has a fundamental limit to how precisely one can specify how long it will have that energy. The magnitude involved in quantum uncertainty is on the order of the Planck constant (6.62607015×10⁻³⁴ J⋅Hz⁻¹[3]).
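For reference, the two relations described verbally above are conventionally written as follows (a standard formulation added here, with ℏ the reduced Planck constant; the energy-time relation is heuristic rather than a strict operator inequality):

    \Delta x \, \Delta p \;\geq\; \frac{\hbar}{2},
    \qquad
    \Delta E \, \Delta t \;\geq\; \frac{\hbar}{2}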

Indeterminacy and incompleteness


Quantum indeterminacy is the assertion that the state of a system does not determine a unique collection of values for all its measurable properties. Indeed, according to the Kochen–Specker theorem, in the quantum mechanical formalism it is impossible that, for a given quantum state, each one of these measurable properties (observables) has a determinate (sharp) value. The values of an observable will be obtained non-deterministically in accordance with a probability distribution that is uniquely determined by the system state. Note that the state is destroyed by measurement, so when we refer to a collection of values, each measured value in this collection must be obtained using a freshly prepared state.

This indeterminacy might be regarded as a kind of essential incompleteness in our description of a physical system. Notice however, that the indeterminacy as stated above only applies to values of measurements not to the quantum state. For example, in the spin 1/2 example discussed above, the system can be prepared in the state ψ by using measurement of σ1 as a filter that retains only those particles such that σ1 yields +1. By the von Neumann (so-called) postulates, immediately after the measurement the system is assuredly in the state ψ.

However, Albert Einstein believed that the quantum state cannot be a complete description of a physical system and, it is commonly thought, never came to terms with quantum mechanics. In fact, Einstein, Boris Podolsky and Nathan Rosen showed that if quantum mechanics is correct, then the classical view of how the real world works (at least after special relativity) is no longer tenable. This view included the following two ideas:

  1. A measurable property of a physical system whose value can be predicted with certainty is actually an element of (local) reality (this was the terminology used by EPR).
  2. Effects of local actions have a finite propagation speed.

This failure of the classical view was one of the conclusions of the EPR thought experiment in which two remotely located observers, now commonly referred to as Alice and Bob, perform independent measurements of spin on a pair of electrons, prepared at a source in a special state called a spin singlet state. It was a conclusion of EPR, using the formal apparatus of quantum theory, that once Alice measured spin in the x direction, Bob's measurement in the x direction was determined with certainty, whereas immediately before Alice's measurement Bob's outcome was only statistically determined. From this it follows that either the value of spin in the x direction is not an element of reality or that the effect of Alice's measurement has an infinite speed of propagation.
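For concreteness, the spin singlet state mentioned above is conventionally written as follows (a standard form added here; its rotational invariance means it has the same shape in any measurement basis, which is what produces the perfect anticorrelation between Alice's and Bob's x-direction outcomes):

    |\psi_{\mathrm{singlet}}\rangle \;=\; \frac{1}{\sqrt{2}}
    \left( |{\uparrow}\rangle_A |{\downarrow}\rangle_B
         - |{\downarrow}\rangle_A |{\uparrow}\rangle_B \right)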

Indeterminacy for mixed states


We have described indeterminacy for a quantum system that is in a pure state. Mixed states are a more general kind of state, obtained as a statistical mixture of pure states. For mixed states, the "quantum recipe" for the probability distribution of a measurement is as follows:

Let A be an observable of a quantum mechanical system. A is given by a densely defined self-adjoint operator on H. The spectral measure of A is the projection-valued measure U ↦ E_A(U), defined for every Borel subset U of R, which satisfies

∫_R λ dE_A(λ) = A.

Given a mixed state S, we introduce the distribution of A under S as follows:

D_A(U) = Tr(E_A(U) S).

This is a probability measure defined on the Borel subsets of R that is the probability distribution obtained by measuring A in S.
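A minimal numerical sketch of this recipe (an addition; it assumes a finite-dimensional H, the mixed state S given as a density matrix, and U supplied as a predicate on R) builds E_A(U) from the eigenprojections of A and evaluates Tr(E_A(U) S):

    import numpy as np

    def spectral_projection(A, U):
        """E_A(U): projection onto the eigenspaces of Hermitian A with eigenvalue in U."""
        vals, vecs = np.linalg.eigh(A)
        E = np.zeros_like(A, dtype=complex)
        for lam, v in zip(vals, vecs.T):
            if U(lam):                              # U supplied as a predicate on R
                E += np.outer(v, v.conj())
        return E

    def distribution(A, S, U):
        """D_A(U) = Tr(E_A(U) S) for a density matrix S."""
        return float(np.trace(spectral_projection(A, U) @ S).real)

    # Example: a maximally mixed spin-1/2 state measured with sigma_x;
    # the outcome lies in (0, infinity), i.e. equals +1, with probability 1/2.
    sigma_x = np.array([[0, 1], [1, 0]], dtype=complex)
    S = 0.5 * np.eye(2, dtype=complex)              # equal mixture of spin-up and spin-down
    print(distribution(sigma_x, S, lambda lam: lam > 0))    # 0.5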

Logical independence and quantum randomness


Quantum indeterminacy is often understood as information (or lack of it) whose existence we infer, occurring in individual quantum systems, prior to measurement. Quantum randomness is the statistical manifestation of that indeterminacy, witnessable in results of experiments repeated many times. However, the relationship between quantum indeterminacy and randomness is subtle and can be considered differently.[4]

In classical physics, experiments of chance, such as coin-tossing and dice-throwing, are deterministic, in the sense that, perfect knowledge of the initial conditions would render outcomes perfectly predictable. The ‘randomness’ stems from ignorance of physical information in the initial toss or throw. In diametrical contrast, in the case of quantum physics, the theorems of Kochen and Specker,[5] the inequalities of John Bell,[6] and experimental evidence of Alain Aspect,[7][8] all indicate that quantum randomness does not stem from any such physical information.

In 2008, Tomasz Paterek et al. provided an explanation in terms of mathematical information. They proved that quantum randomness is, exclusively, the output of measurement experiments whose input settings introduce logical independence into quantum systems.[9][10]

Logical independence is a well-known phenomenon in Mathematical Logic. It refers to the null logical connectivity that exists between mathematical propositions (in the same language) that neither prove nor disprove one another.[11]

In the work of Paterek et al., the researchers demonstrate a link connecting quantum randomness and logical independence in a formal system of Boolean propositions. In experiments measuring photon polarisation, Paterek et al. demonstrate statistics correlating predictable outcomes with logically dependent mathematical propositions, and random outcomes with propositions that are logically independent.[12][13]

from Grokipedia
Quantum indeterminacy refers to the fundamental principle in quantum mechanics that certain pairs of physical properties of subatomic particles, such as position and momentum, cannot be simultaneously known with arbitrary precision, introducing an inherent probabilistic element to quantum predictions. This concept, first articulated by Werner Heisenberg in 1927, arises from the wave-like behavior of particles and is mathematically expressed through the uncertainty relation Δx Δp ≥ ℏ/2, where Δx and Δp are the standard deviations of position and momentum, and ℏ is the reduced Planck constant.

The principle of quantum indeterminacy emerged during the development of quantum theory in the mid-1920s, as physicists grappled with the limitations of classical mechanics in describing atomic phenomena. Heisenberg's formulation highlighted that the act of measurement itself disturbs the system, preventing simultaneous exact knowledge of conjugate pairs like energy and time or angular position and angular momentum. Complementing this, Max Born's 1926 interpretation of the wave function provided the probabilistic framework, stating that the probability density of finding a particle in a given state is proportional to the square of the modulus of its wave function amplitude, P = |ψ|². Together, these ideas shifted physics from deterministic trajectories to statistical outcomes, fundamentally challenging classical notions of causality.

Beyond foundational implications, quantum indeterminacy manifests in observable effects like the broadening of spectral lines in atomic spectra and the random decay times of radioactive particles, confirming its role in real-world quantum processes. It also underpins advanced applications, including quantum computing, where superposition and measurement-induced collapse exploit this indeterminacy for information processing, and quantum cryptography, which leverages inherent uncertainties for secure key distribution. While interpretations vary, ranging from the Copenhagen view emphasizing observer-dependent reality to more deterministic hidden-variable theories, indeterminacy remains a cornerstone of the theory, verified through countless experiments since the 1920s.

Fundamentals

Definition and principles

Quantum indeterminacy refers to the fundamental unpredictability inherent in the outcomes of measurements on quantum systems, even when the system is fully described by a pure state. In quantum mechanics, a system's state is represented by a vector in a Hilbert space, which encodes all available information about the system, yet this state only determines probability distributions for measurement results rather than definite values. This probabilistic nature arises from the Born rule, which states that the probability of obtaining a particular outcome corresponding to an eigenstate |φ⟩ when measuring a system in state |ψ⟩ is given by |⟨φ|ψ⟩|².

Central to this framework are the principles governing observables in quantum mechanics. Physical observables, such as position, momentum, or spin, are represented by self-adjoint operators on the Hilbert space. The possible outcomes of a measurement of an observable are the eigenvalues of its corresponding operator, and upon measurement, the system collapses to the associated eigenstate. The probabilities of these outcomes follow the Born rule, ensuring that the measurement process introduces intrinsic indeterminacy, as the pre-measurement state does not specify a unique result but only a set of weighted possibilities. For incompatible observables represented by non-commuting operators, uncertainty relations impose fundamental limits on the simultaneous precision of measurements. The position-momentum uncertainty relation, Δx Δp ≥ ℏ/2, exemplifies this, where Δx and Δp are the standard deviations and ℏ = h/2π, with h the Planck constant. This operator formalism underpins the theory's predictive power while highlighting the non-deterministic character of quantum events.

Unlike classical physics, where a complete specification of a system's state allows deterministic prediction of all future properties, quantum indeterminacy stems from an intrinsic randomness that cannot be eliminated by acquiring more information. In classical mechanics, uncertainties are epistemic, arising from incomplete knowledge, but in quantum mechanics they are ontic, reflecting a fundamental limitation where not all properties can be simultaneously well-defined for a given state. This leads to the absence of a unique trajectory or value for all observables, distinguishing quantum systems from their classical counterparts.

The mathematical structure of quantum mechanics relies on Hilbert space, a complete inner product space over the complex numbers, which provides the arena for state vectors. These vectors, normalized to unit length, represent pure quantum states and evolve unitarily according to the Schrödinger equation until a measurement occurs. This setup is essential for formalizing superposition and entanglement, though the focus here remains on how it enables the probabilistic interpretation central to indeterminacy.
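As an added numerical illustration of such limits for incompatible observables, the sketch below checks the general Robertson form of the uncertainty relation, ΔA ΔB ≥ ½|⟨[A, B]⟩| (which specializes to Δx Δp ≥ ℏ/2 for position and momentum); the choice of spin operators and random state is purely illustrative:

    import numpy as np

    rng = np.random.default_rng(1)

    def expectation(op, psi):
        return np.vdot(psi, op @ psi).real

    def std_dev(op, psi):
        return np.sqrt(expectation(op @ op, psi) - expectation(op, psi) ** 2)

    sigma_x = np.array([[0, 1], [1, 0]], dtype=complex)
    sigma_y = np.array([[0, -1j], [1j, 0]], dtype=complex)

    # Random pure state of a spin-1/2 system.
    psi = rng.normal(size=2) + 1j * rng.normal(size=2)
    psi /= np.linalg.norm(psi)

    lhs = std_dev(sigma_x, psi) * std_dev(sigma_y, psi)          # Delta A * Delta B
    commutator = sigma_x @ sigma_y - sigma_y @ sigma_x
    rhs = 0.5 * abs(np.vdot(psi, commutator @ psi))              # (1/2) |<[A, B]>|
    print(lhs >= rhs - 1e-12, round(lhs, 4), round(rhs, 4))      # True: the bound holds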

Historical development

The concept of quantum indeterminacy traces its origins to the early twentieth century, when efforts to resolve inconsistencies in classical physics led to probabilistic interpretations of atomic and subatomic phenomena. In 1900, Max Planck introduced the quantum hypothesis to explain blackbody radiation, proposing that energy is emitted and absorbed in discrete quanta rather than continuously, which implied an inherent discreteness in physical processes that challenged deterministic classical physics. Five years later, Albert Einstein extended this idea in his explanation of the photoelectric effect, arguing that light behaves as discrete packets of energy (quanta, later called photons) whose interactions with matter produce probabilistic outcomes, such as electron ejection probabilities dependent on frequency rather than intensity alone. These developments marked the first shifts toward viewing quantum events as fundamentally unpredictable, laying groundwork for indeterminacy without fully embracing it.

The 1920s saw rapid advancements in quantum theory that explicitly incorporated non-deterministic elements. Werner Heisenberg's 1925 formulation of matrix mechanics provided a mathematical framework for quantum phenomena, emphasizing observable quantities through non-commuting matrices representing dynamical variables. In 1927, Heisenberg introduced the uncertainty principle, articulating that conjugate variables like position and momentum cannot be simultaneously measured with arbitrary precision, establishing intrinsic uncertainty as a core feature. Complementing this, Erwin Schrödinger's 1926 wave mechanics described particles via wave functions that evolve deterministically but yield probabilistic measurement outcomes, further underscoring the non-deterministic nature of quantum predictions. Experimental support came from Arthur Compton's 1923 discovery of the Compton effect, where X-rays scattered off electrons exhibited wavelength shifts consistent with particle-like collisions, providing evidence of wave-particle duality and the probabilistic scattering of quanta. The Born rule, interpreting the wave function's magnitude squared as a probability density, formalized this indeterminacy in measurements.

Debates over indeterminacy intensified at the 1927 Solvay Conference, where Niels Bohr defended the Copenhagen interpretation, positing that quantum mechanics inherently limits predictability due to measurement disturbances, against Albert Einstein's objections that such randomness indicated an incomplete theory. In the 1930s, John von Neumann's 1932 book Mathematical Foundations of Quantum Mechanics axiomatized the theory mathematically, rigorously defining the measurement process as a non-deterministic projection onto eigenstates and solidifying indeterminacy as a core postulate. This era culminated in the 1935 Einstein-Podolsky-Rosen (EPR) paper, which argued through a thought experiment on entangled particles that quantum mechanics' probabilistic predictions implied "spooky action at a distance" and failed to capture all elements of physical reality, challenging the theory's completeness.

Post-1960s developments provided empirical vindication of quantum indeterminacy over deterministic alternatives. John Bell's 1964 theorem derived inequalities that any local hidden-variable theory must satisfy, whereas quantum mechanics predicts violations, offering a testable distinction from Einstein's local realism. Alain Aspect's 1982 experiments with entangled photons confirmed these violations by over five standard deviations, ruling out local hidden variables and affirming the non-local, indeterministic character of quantum mechanics.

Measurement process

Projective measurements

In quantum mechanics, projective measurements are formalized through the model developed by John von Neumann, in which a measurement of an observable corresponds to the collapse of the state vector onto one of the eigenstates of the observable's associated operator, with the possible outcomes determined by the eigenvalues of that operator. This collapse, often termed the projection postulate, ensures that the post-measurement state is an eigenstate, yielding a definite value for the measured observable, while the probability of each outcome arises from the Born rule applied to the state vector representing the system. The spectral theorem guarantees that such operators can be diagonalized in an appropriate basis, providing a discrete spectrum of possible measurement results for systems with finite-dimensional Hilbert spaces.

The mathematical framework for projective measurements extends to both pure and mixed states using the density operator formalism. For a quantum system described by a density operator ρ, a projective measurement associated with an observable A is defined by a set of orthogonal projectors {P_i} satisfying ∑_i P_i = I and P_i P_j = δ_ij P_i, where each P_i projects onto the eigenspace corresponding to eigenvalue λ_i. The probability p_i of obtaining outcome λ_i is given by

p_i = Tr(ρ P_i),

which follows from the Born rule generalized to density operators. Upon measurement yielding λ_i, the post-measurement state becomes the normalized projection

ρ′ = P_i ρ P_i / p_i,

ensuring that subsequent measurements of the same observable yield λ_i with certainty. This update rule captures the irreversible nature of the measurement process in the projective model.

In this formalism, observables are represented by self-adjoint operators on the system's Hilbert space, which for projective measurements are assumed to have a discrete spectrum to allow complete resolution into eigenprojections. Self-adjointness ensures real eigenvalues, aligning with the real-valued character of empirical measurement outcomes, and the completeness of the projectors guarantees that the measurement exhausts all possibilities without ambiguity. Determinate outcomes occur only for states that are eigenstates of the measured operator; otherwise, the measurement introduces indeterminacy through probabilistic collapse. Compatible observables, whose operators commute, share a common set of eigenstates, permitting simultaneous projective measurements with definite joint outcomes.

Indeterminacy in projective measurements arises fundamentally from the non-commutativity of operators representing incompatible observables, where [A, B] = AB − BA ≠ 0. Non-commuting operators lack a complete set of simultaneous eigenstates, precluding the existence of states with definite values for both observables. This incompatibility manifests in sequential measurements: measuring A first projects the state onto an eigenstate of A, disrupting any prior preparation for B and leading to unpredictable outcomes for B unless [A, B] = 0. Thus, projective measurements enforce indeterminacy as an intrinsic feature of the formalism for non-commuting observables.
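A minimal sketch of this density-operator rule (an added illustration; the maximally mixed state and the σ_x observable are assumptions chosen for concreteness) computes p_i = Tr(ρ P_i), samples an outcome, applies the update ρ′ = P_i ρ P_i / p_i, and confirms that repeating the measurement returns the same result:

    import numpy as np

    rng = np.random.default_rng(42)

    def eigenprojectors(A):
        """Pairs (eigenvalue, projector) for a Hermitian operator A."""
        vals, vecs = np.linalg.eigh(A)
        return [(float(lam), np.outer(v, v.conj())) for lam, v in zip(vals, vecs.T)]

    def projective_measurement(rho, A):
        """Sample an outcome with p_i = Tr(rho P_i) and apply rho' = P_i rho P_i / p_i."""
        pairs = eigenprojectors(A)
        probs = np.array([max(np.trace(rho @ P).real, 0.0) for _, P in pairs])
        k = rng.choice(len(pairs), p=probs / probs.sum())   # clip/renormalize rounding noise
        lam, P = pairs[k]
        return lam, P @ rho @ P / probs[k]

    sigma_x = np.array([[0, 1], [1, 0]], dtype=complex)
    rho = 0.5 * np.eye(2, dtype=complex)                    # maximally mixed state

    outcome1, rho_post = projective_measurement(rho, sigma_x)
    outcome2, _ = projective_measurement(rho_post, sigma_x)
    print(outcome1, outcome2)                               # equal: the measurement repeats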

Observer effect and disturbance

In quantum mechanics, the observer effect refers to the unavoidable interaction between a quantum system and the measuring apparatus, which entangles the two, resulting in decoherence that mimics collapse and yields a definite outcome without invoking conscious observation. This process arises from the physical coupling required for measurement, where the system's state becomes correlated with the apparatus's state, suppressing superpositions and producing classical-like behavior. A key distinction exists between this observer effect and mere classical disturbance: quantum indeterminacy is inherent to the system's pre-measurement state, which is already probabilistic, rather than solely resulting from measurement-induced back-action that perturbs the system. Delayed-choice experiments demonstrate this intrinsic nature, showing that the decision to measure which-path information can be made after the particle has passed through the slits, yet still eliminates interference, confirming that the indeterminacy stems from the quantum description itself, not post-facto disturbance.

It is important to distinguish the observer effect from related quantum concepts such as Heisenberg's uncertainty principle and Schrödinger's cat thought experiment. Heisenberg's uncertainty principle concerns fundamental limits on the precision of simultaneous measurements of incompatible observables, such as position and momentum. In contrast, Schrödinger's cat thought experiment illustrates the superposition of states and the role of observation in resolving them into definite outcomes. While both involve aspects of quantum indeterminacy, they address distinct phenomena: the uncertainty principle focuses on measurement precision for conjugate variables, whereas the cat experiment highlights the measurement problem and superposition in macroscopic contexts.

The role of the environment in this process is central to decoherence theory, which posits that interactions with surrounding particles or fields rapidly entangle the system with a vast environmental bath, selecting "preferred" bases, such as position over momentum, in which outcomes appear determinate and classical even without a formal collapse. This environmental selection explains why macroscopic systems exhibit definite states, as the entanglement spreads irreversibly, making interference unobservable on practical timescales. Experimentally, the double-slit interference setup with which-path detectors illustrates this mechanism: when detectors are introduced to identify the slit traversed by photons or atoms, the interference pattern vanishes not primarily because physical momentum transfer disturbs the particle, but because the particle's state entangles with the detector, creating which-way information that decoheres the superposition. This loss of coherence persists even in setups minimizing direct disturbance, underscoring entanglement as the core cause.
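The loss of interference through entanglement alone can be illustrated with a toy model (added here; a "path" qubit plays the system and a second qubit plays the which-path detector, coupled by a CNOT-style interaction): tracing out the detector removes the off-diagonal, interference-carrying terms of the system's density matrix.

    import numpy as np

    ket0 = np.array([1, 0], dtype=complex)
    ket1 = np.array([0, 1], dtype=complex)
    path_superposition = (ket0 + ket1) / np.sqrt(2)     # system: equal superposition of two paths

    # Detector starts in |0>; a CNOT-style coupling records which path was taken.
    state_before = np.kron(path_superposition, ket0)    # ordering: |system, detector>
    cnot = np.array([[1, 0, 0, 0],
                     [0, 1, 0, 0],
                     [0, 0, 0, 1],
                     [0, 0, 1, 0]], dtype=complex)
    state_after = cnot @ state_before

    def reduced_system_state(state):
        """Partial trace over the detector qubit; returns the system's 2x2 density matrix."""
        rho = np.outer(state, state.conj()).reshape(2, 2, 2, 2)
        return np.einsum('ikjk->ij', rho)

    print(np.round(reduced_system_state(state_before), 3))  # off-diagonals 0.5: interference possible
    print(np.round(reduced_system_state(state_after), 3))   # off-diagonals 0:   coherence lost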

Examples of indeterminacy

Spin-1/2 particle measurement

A quintessential example of quantum indeterminacy arises in the measurement of the spin component of a spin-1/2 particle, such as an electron, along different spatial directions. Consider such a particle prepared in the spin-up eigenstate along the z-axis, |ψ⟩ = |↑⟩, where |↑⟩ denotes the eigenstate of the spin operator S_z with eigenvalue +ℏ/2. This preparation yields a definite value for the z-component of spin prior to measurement. When the spin is measured along the orthogonal x-axis, the observable is represented by the operator S_x = (ℏ/2) σ_x, where σ_x is the Pauli matrix with rows (0, 1) and (1, 0). The possible outcomes are +ℏ/2 and −ℏ/2 (or equivalently ±1 in units where ℏ = 1), each occurring with equal probability of 1/2, due to the Born rule applied to the initial state. Upon obtaining the +1 outcome, the state collapses to the eigenstate |→⟩ = (1/√2)(|↑⟩ + |↓⟩).