Variable speed of light
from Wikipedia

A variable speed of light (VSL) is a feature of a family of hypotheses stating that the speed of light may in some way not be constant, for example varying with frequency, in space, or over time. Accepted classical theories of physics, and in particular general relativity, predict a constant speed of light in any local frame of reference. In some situations they predict apparent variations of the speed of light depending on the frame of reference, but this article does not refer to those effects as a variable speed of light. Various alternative theories of gravitation and cosmology, many of them non-mainstream, incorporate variations in the local speed of light.

Attempts to incorporate a variable speed of light into physics were made by Robert Dicke in 1957, and by several researchers starting from the late 1980s.

VSL should not be confused with faster-than-light theories, with the dependence of light's speed on a medium's refractive index, or with its measurement in a remote observer's frame of reference in a gravitational potential. In this context, the "speed of light" refers to the limiting speed c of the theory rather than to the velocity of propagation of photons.

Historical proposals


Background


While the speed of light is generally considered to be constant, the idea that physical "constants" might vary has a long history. One of the earliest proposals was the Dirac large numbers hypothesis. Searching for variation in these constants remains an important way of testing physical laws.[1][2]

Einstein's equivalence principle, on which general relativity is founded, requires that in any local, freely falling reference frame, the speed of light is always the same.[3][4] This leaves open the possibility, however, that an inertial observer inferring the apparent speed of light in a distant region might calculate a different value. Spatial variation of the speed of light in a gravitational potential as measured against a distant observer's time reference is implicitly present in general relativity.[5] The apparent speed of light will change in a gravity field and, in particular, go to zero at an event horizon as viewed by a distant observer.[6] In deriving the gravitational redshift due to a spherically symmetric massive body, a radial speed of light dr/dt can be defined in Schwarzschild coordinates, with t being the time recorded on a stationary clock at infinity. The result is

dr/dt = 1 − 2m/r,

where m is GM/c² and where natural units are used such that c0 is equal to one.[7][8]
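As a quick numerical check of this coordinate speed (a sketch using standard SI values for G, the solar mass, and c0, none of which appear in the text above), the SI form dr/dt = c0(1 − 2GM/(r c0²)) can be evaluated at the solar surface and at the event horizon:

```python
# Coordinate (Schwarzschild-time) radial speed of light, dr/dt = c0*(1 - 2GM/(r*c0^2)).
# Illustrative sketch; G, M_sun, c0 are standard SI values, not taken from the article.
G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
M_sun = 1.989e30     # solar mass, kg
c0 = 2.998e8         # speed of light far from the mass, m/s

def coordinate_light_speed(r, M):
    """Radial dr/dt as measured against a stationary clock at infinity."""
    rs = 2 * G * M / c0**2          # Schwarzschild radius
    return c0 * (1 - rs / r)

rs = 2 * G * M_sun / c0**2          # ~2.95 km for the Sun
print(coordinate_light_speed(6.96e8, M_sun) / c0)  # at the solar surface: just below 1
print(coordinate_light_speed(rs, M_sun))           # at the horizon: 0
```

The ratio at the solar surface differs from 1 by only ~4×10⁻⁶, while the coordinate speed goes to zero at the horizon, matching the qualitative statement above.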

Dicke's proposal (1957)


Robert Dicke, in 1957, developed a VSL theory of gravity in which (unlike general relativity) the speed of light measured locally by a free-falling observer could vary.[9] Dicke assumed that both frequencies and wavelengths could vary, which together result in a relative change of c. He introduced a refractive index (eqn. 5) and showed it to be consistent with the observed deflection of light. In a comment related to Mach's principle, Dicke suggested that, while the right-hand term in eq. 5 is small, the leading term, 1, could have "its origin in the remainder of the matter in the universe".

Given that in a universe with an increasing horizon more and more masses contribute to the above refractive index, Dicke considered a cosmology where c decreased in time, providing an alternative explanation to the cosmological redshift.[9]: 374 

Subsequent proposals


Several hypotheses for a varying speed of light, seemingly in contradiction with general relativity, have been published, including those of Giere and Tan (1986)[10] and Sanejouand (2009).[11] In 2003, Magueijo gave a review of such hypotheses.[12]

Cosmological models with varying speeds of light[13] have been proposed independently by Jean-Pierre Petit in 1988,[14] John Moffat in 1992,[15] and the team of Andreas Albrecht and João Magueijo in 1998[16] to explain the horizon problem of cosmology and propose an alternative to cosmic inflation.

Relation to other constants and their variation


Dimensionless and dimensionful quantities


Units are vital for experimental measurements, and comparing the results of experiments to theory necessarily entangles units with the physical constants. Physical constants that carry units are not fundamental: any equation of physical law can be expressed in a form in which all dimensional quantities are normalized against like-dimensioned quantities (called nondimensionalization), leaving only dimensionless quantities. Only variation of these dimensionless quantities changes the nature of the physics.[17] A physical theory that postulates a varying fine-structure constant can be expressed equivalently as either a variable speed of light or a variable electric charge.[18]
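This equivalence can be made concrete: since α = e²/(4πε0ħc) is dimensionless, the same fractional shift in α can be attributed either to a change in c or to a change in e. A minimal sketch, using standard SI values and illustrative (hypothetical) shift sizes:

```python
import math

# alpha = e^2 / (4*pi*eps0*hbar*c) is dimensionless; only its value is physical.
e    = 1.602176634e-19    # elementary charge, C
eps0 = 8.8541878128e-12   # vacuum permittivity, F/m
hbar = 1.054571817e-34    # reduced Planck constant, J s
c    = 2.99792458e8       # speed of light, m/s

def alpha(e, eps0, hbar, c):
    return e**2 / (4 * math.pi * eps0 * hbar * c)

a0 = alpha(e, eps0, hbar, c)
print(1 / a0)  # ~137.036

# The same ~ -1e-5 fractional shift in alpha can be written two ways:
da_from_c = alpha(e, eps0, hbar, c * 1.00001) / a0 - 1      # alpha ∝ 1/c
da_from_e = alpha(e * (1 - 0.5e-5), eps0, hbar, c) / a0 - 1  # alpha ∝ e^2
print(da_from_c, da_from_e)  # both ~ -1e-5, agreeing to first order
```

An experiment sensitive only to α cannot tell these two descriptions apart, which is the point of the paragraph above.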

Physicists often adopt natural units in which the physical constants c, G, ħ = h/(2π), ε0, and kB take the value one, resulting in every physical quantity being normalized against its corresponding Planck unit.[18] When Planck units are used and such equations of physical law are expressed in this nondimensionalized form, no dimensional physical constants such as c, G, ħ, ε0, or kB remain, only dimensionless quantities,[19] as predicted by the Buckingham π theorem.
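A short sketch of this normalization, assuming standard SI values: the Planck length and Planck time are built from c, G, and ħ, and by construction the speed of light comes out as exactly one Planck length per Planck time.

```python
import math

# Standard SI values (full CODATA precision is not needed for the illustration).
c    = 2.99792458e8       # m/s
G    = 6.67430e-11        # m^3 kg^-1 s^-2
hbar = 1.054571817e-34    # J s

l_P = math.sqrt(hbar * G / c**3)   # Planck length, ~1.6e-35 m
t_P = math.sqrt(hbar * G / c**5)   # Planck time,   ~5.4e-44 s
m_P = math.sqrt(hbar * c / G)      # Planck mass,   ~2.2e-8 kg

# Normalizing against Planck units removes the dimensional constants:
# c expressed in Planck units is one Planck length per Planck time.
print(l_P / t_P / c)   # 1.0 up to floating-point rounding
```

After the normalization only dimensionless ratios (quantity divided by its Planck unit) carry physical content, which is the sense in which c, G, and ħ "disappear" from the equations.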

Gravitational constant G


In 1937, Paul Dirac and others began investigating the consequences of natural constants changing with time.[20] For example, Dirac proposed a change of only 5 parts in 10¹¹ per year of the Newtonian constant of gravitation G to explain the relative weakness of the gravitational force compared to other fundamental forces. This has become known as the Dirac large numbers hypothesis.

However, Richard Feynman showed[21] that the gravitational constant most likely could not have changed this much in the past 4 billion years based on geological and solar system observations, although this may depend on assumptions about G varying in isolation. (See also strong equivalence principle.)

Fine-structure constant α


One group, studying distant quasars, has claimed to detect a variation of the fine-structure constant[22] at the level of one part in 10⁵. Other authors dispute these results. Other groups studying quasars claim no detectable variation at much higher sensitivities.[23][24][25]

The natural nuclear reactor of Oklo has been used to check whether the fine-structure constant α might have changed over the past 2 billion years, because α influences the rate of various nuclear reactions. For example, ¹⁴⁹Sm captures a neutron to become ¹⁵⁰Sm, and since the rate of neutron capture depends on the value of α, the ratio of the two samarium isotopes in samples from Oklo can be used to calculate the value of α from 2 billion years ago. Several studies have analysed the relative concentrations of radioactive isotopes left behind at Oklo, and most have concluded that nuclear reactions then were much the same as they are today, which implies α was the same too.[26][27]

Paul Davies and collaborators have suggested that it is in principle possible to disentangle which of the dimensionful constants composing the fine-structure constant (the elementary charge, the Planck constant, and the speed of light) is responsible for any variation.[28] However, this has been disputed by others and is not generally accepted.[29][30]

Criticisms of various VSL concepts


General critique of varying c cosmologies


From a very general point of view, G. F. R. Ellis and Jean-Philippe Uzan expressed concerns that a varying c would require rewriting much of modern physics, which is built on a constant c.[31][32] Ellis claimed that any varying-c theory (1) must redefine distance measurements; (2) must provide an alternative expression for the metric tensor in general relativity; (3) might contradict Lorentz invariance; (4) must modify Maxwell's equations; and (5) must be consistent with all other physical theories. VSL cosmologies remain outside mainstream physics.

References

from Grokipedia
Variable speed of light (VSL) theories propose that the speed of light in vacuum, traditionally considered a universal constant in special relativity, may vary across space or over cosmic time, potentially resolving longstanding issues in cosmology and quantum gravity. These hypotheses emerged as alternatives to cosmic inflation, suggesting that an initially much higher speed of light in the early universe allowed for greater causal connectivity, thereby explaining the observed uniformity of the cosmic microwave background without invoking an exponential expansion phase. Unlike standard models where c is fixed at approximately 299,792 km/s, VSL frameworks modify fundamental equations, such as Maxwell's, to accommodate dynamic variations while preserving key experimental tests of relativity in the present epoch. The conceptual roots of VSL trace back to mid-20th-century physics, but systematic modern development began in the late 1990s through works by researchers including John Moffat, João Magueijo, and Andreas Albrecht. Motivated by the horizon problem—why distant regions of the universe appear thermally equilibrated despite never having been in causal contact under constant c—VSL models posit an early phase in which c decreases dramatically shortly after the Big Bang, expanding light cones retroactively to enable information exchange across the observable universe. This approach also addresses the flatness problem by naturally driving the density parameter Ω toward unity through the varying c, reducing the extreme initial fine-tuning required in Friedmann-Robertson-Walker cosmologies. Additionally, VSL has been linked to quantum-gravity phenomena, such as "doubly special relativity," where spacetime at Planck scales exhibits effective Lorentz invariance breaking, potentially explaining anomalies in high-energy particle propagation.
Key VSL models include bimetric theories, which assign different propagation speeds to gravity and to light; color-dependent variants altering c for photons of varying energy; and brane-world scenarios in which extra dimensions induce variability. Observational support remains tentative but includes early hints (now controversial and largely disputed) of a redshift-dependent α, with 2001–2003 measurements suggesting Δα/α ≈ −0.72 × 10⁻⁵ at z ≈ 0.5–3.5, interpretable as tied to c variations since α ∝ 1/c; later analyses find no significant change. Ultra-high-energy cosmic rays exceeding the Greisen-Zatsepin-Kuzmin cutoff may also signal an energy-dependent c, though constraints from atomic clock comparisons limit past changes to |Δc/c| ≲ 10⁻⁴ over the last billion years. Overall, VSL offers a parsimonious framework for generating primordial fluctuations, predicting a scale-invariant, Gaussian spectrum consistent with data, while avoiding inflation's reliance on unverified scalar fields.
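Taking the stated proportionality α ∝ 1/c at face value (an assumption that holds only if e, ħ, and ε0 are held fixed), the quoted shift translates directly into a fractional change in c:

```python
# Reported 2001-2003 quasar value (controversial and largely disputed).
delta_alpha_over_alpha = -0.72e-5

# Under the model-dependent assumption alpha ∝ 1/c with e, hbar, eps0 fixed,
# d(alpha)/alpha = -dc/c to first order.
delta_c_over_c = -delta_alpha_over_alpha
print(delta_c_over_c)   # ~ +7.2e-06: c ~7 ppm larger at z ≈ 0.5-3.5 under this reading
```

The sign matters: a smaller α in the past would correspond to a larger c in the past under this particular attribution, consistent with VSL models in which c decreases over cosmic time.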

Fundamentals of VSL

Definition and physical meaning

Variable speed of light (VSL) theories propose that the speed of light in vacuum, denoted by c, is not a fixed universal constant but varies as a function of time, position, energy scale, or cosmic epoch, either as a fundamental dynamical parameter or as an effective quantity emerging from underlying physics. This variation contrasts with the foundational postulate of special relativity, where c is invariant for all observers, serving as the universal speed limit. In VSL frameworks, such changes to c can arise from modifications to vacuum properties, from scalar fields influencing electromagnetic propagation, or from altered metrics, potentially simplifying theoretical descriptions when units are chosen such that c is non-constant. Physically, c carries multiple interpretations in VSL contexts: it represents the propagation speed of electromagnetic waves, the maximum causal limit for information transfer between events, or a coefficient in the spacetime metric that governs geodesic paths. As the propagation speed, a varying c implies that photons follow dispersion relations modified from the standard E = pc (for massless particles), such as E = p c(t) in temporally varying cases, where the energy E and momentum p of photons scale with the local or epoch-dependent c(t). More generally, VSL can introduce energy-dependent modifications like E² f1²(E; λ) − p² f2²(E; λ) = m², where f1 and f2 are functions incorporating the variation, altering how light travels compared to massive particles. In terms of causality, c defines the boundaries of light cones in spacetime; a varying c distorts these null cones, expanding or contracting the region of causally connected events and potentially allowing superluminal effective speeds for light relative to a fixed gravitational metric, while preserving local Lorentz invariance in some formulations.
For instance, in bimetric approaches, photons may propagate along null geodesics of an effective electromagnetic metric distinct from the gravitational one, leading to tilted or widened light cones that reflect differing speeds for light and gravity. This variability distinguishes VSL from special and general relativity, where c is invariant and sets the scale for spacetime intervals, ensuring the same laws of physics in all inertial frames without preferred directions or times. In VSL, such invariance may hold only locally or be broken globally, requiring modified Lorentz transformations that account for the changing scale, such as a position-dependent factor in the transformation between coordinates. Consequently, units of time and length rescale with c, so variations can be interpreted as shifts in measurement standards rather than absolute changes, though some models treat c as a dynamical field reflecting evolving fundamental interactions. These interpretations maintain the core geometric role of null geodesics for light paths but adapt them to a non-uniform spacetime, emphasizing c's role as both a physical limit and a metric descriptor.
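The energy-dependent modification E² f1² − p² f2² = m² mentioned above can be illustrated with a toy choice of f1 and f2 (hypothetical, not drawn from any specific VSL model): taking f1 = 1 and f2 = 1 + λE for a massless particle gives E = pc/(1 − λpc), whose group velocity dE/dp exceeds c for λ > 0.

```python
# Toy modified dispersion for a photon (m = 0): E^2*f1^2 - (p*c)^2*f2^2 = 0,
# with the illustrative (hypothetical) choice f1 = 1, f2 = 1 + lam*E.
# Solving E = p*c*(1 + lam*E) gives E = p*c / (1 - lam*p*c).
c = 1.0      # natural units
lam = 0.1    # toy Planck-suppressed parameter, exaggerated for visibility

def energy(p):
    return p * c / (1 - lam * p * c)

def group_velocity(p, dp=1e-6):
    """Numerical group velocity dE/dp via central differences."""
    return (energy(p + dp) - energy(p - dp)) / (2 * dp)

for p in (0.1, 0.5, 1.0):
    print(p, group_velocity(p))   # exceeds c, growing with momentum
```

Analytically the group velocity is c/(1 − λpc)², so higher-momentum photons travel faster in this toy model, the kind of energy-dependent propagation the paragraph describes.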

Implications for relativity and causality

In variable speed of light (VSL) theories, the variation of c fundamentally challenges the foundational postulate of special relativity that the speed of light is invariant in all inertial frames, leading to a breakdown of global Lorentz invariance. Instead, these models often introduce a preferred cosmological frame in which c = c(t) evolves with cosmic time, necessitating modified Lorentz transformations that depend on the local value of c. This breaking of Lorentz invariance can manifest as the emergence of a preferred direction or frame, altering the relationship between space and time and potentially allowing for anisotropic effects in astrophysics and cosmology. However, some formulations aim to preserve local Lorentz invariance at each epoch by treating c(t) as constant on hypersurfaces of constant time, ensuring that physics remains relativistic on small scales while permitting global variations. The variability of c raises significant concerns for causality, as the light cones defining causal boundaries would evolve over time, potentially permitting superluminal signaling relative to later epochs. In the early universe, a dramatically higher c (e.g., exceeding the current value by factors of 10³⁰ or more) expands past light cones, enabling causal connections across regions that would otherwise be disconnected in standard relativity, thus addressing issues like the horizon problem without inflation. This temporal variation risks tachyonic instabilities or closed timelike curves if not carefully constrained, as signals propagating at the local c(t) might appear to violate causality when viewed from frames with different c, though most VSL models mitigate this by enforcing subluminal propagation within each local frame and avoiding acausal loops through couplings to matter. Such dynamics highlight the tension between VSL and the strict causal structure of Minkowski spacetime, where fixed light cones prevent information from traveling backward in time.
VSL can be interpreted through unit rescaling as equivalent to variations in other fundamental constants or effective couplings, rather than a literal change in light's propagation speed. For instance, an effective c(t) = c0 f(t), where f(t) is a scaling function and c0 is the current value, is mathematically indistinguishable from rescaling Planck's constant ħ or the unit of time, preserving the form of physical laws while altering their numerical values across cosmic history. This equivalence arises because c enters dimensionally in the definitions of units; varying c while keeping ħ fixed is akin to a global clock adjustment, which simplifies comparisons but complicates direct measurements of constancy. Broader implications extend to the foundational equations of physics, where c(t) modifies the structure of electromagnetism and of Einstein's field equations. In electromagnetism, the covariant form becomes ∂_μ(c F^{μν}) = 4π j^ν, leading to a time-dependent wave equation ∂²A/∂t² = c²(t)∇²A for the vector potential, which alters dispersion and propagation without violating local causality. For gravity, minimal substitution in Einstein's equations yields G_{μν} = (8πG/c⁴(t)) T_{μν}, introducing non-conservation of energy-momentum through terms involving ċ/c, which can mimic dark-energy effects or resolve flatness problems through dynamical adjustments. These modifications underscore how VSL reframes relativity not as an absolute framework but as an emergent property tied to cosmic scales.
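The time-dependent wave equation ∂²A/∂t² = c²(t)∇²A can be checked numerically. The sketch below (one spatial dimension, a hypothetical linearly decreasing c(t), fixed boundaries; not any specific VSL model) propagates a Gaussian pulse and confirms that the wavefront travels a distance ∫c(t)dt rather than c0·t:

```python
# 1-D finite-difference sketch of d^2u/dt^2 = c(t)^2 d^2u/dx^2 with a
# hypothetical decreasing c(t) = c0*(1 - 0.3*t).
import math

c0, N, L = 1.0, 800, 2.0
dx = L / N
dt = 0.5 * dx / c0              # CFL-stable step (c never exceeds c0)

def c(t):
    return c0 * (1 - 0.3 * t)

x0 = 0.5                        # initial pulse centre
u = [math.exp(-((i * dx - x0) / 0.03) ** 2) for i in range(N)]
up = u[:]                       # previous step equal to current -> zero initial velocity

t, T = 0.0, 0.8
while t < T:
    r2 = (c(t) * dt / dx) ** 2
    un = [0.0] * N              # fixed (zero) boundaries at both ends
    for i in range(1, N - 1):
        un[i] = 2 * u[i] - up[i] + r2 * (u[i + 1] - 2 * u[i] + u[i - 1])
    up, u = u, un
    t += dt

# The right-moving half of the pulse should sit near x0 + integral of c(t) dt
# = x0 + c0*(T - 0.15*T^2), not x0 + c0*T.
expected = x0 + c0 * (T - 0.15 * T ** 2)
peak = max(range(N // 2, N), key=lambda i: u[i]) * dx
print(peak, expected)
```

With c decreasing by 24% over the run, the pulse lands noticeably short of the constant-c prediction x0 + c0·T = 1.3, in line with the modified propagation described above.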

Historical proposals

Early background and motivations

In the 19th century, discussions surrounding the luminiferous ether provided early precursors to ideas about a variable speed of light, though these were primarily local and non-cosmological in nature. George Gabriel Stokes proposed in 1845 that the ether could be fully dragged along by moving matter, such as the Earth, which would imply that the speed of light relative to an observer varies depending on the motion of the medium through which it propagates. This ether-drag hypothesis, building on earlier partial-drag ideas by Fresnel, suggested that light's velocity could differ in moving media compared to a stationary ether, challenging the notion of an absolute constant speed but remaining tied to classical ether models rather than universal variation. Similarly, William Thomson (Lord Kelvin) and Peter Guthrie Tait speculated in 1874 on the possibility of light's speed varying over time, at a time when c held no privileged role in physics, reflecting a broader willingness to treat propagation speeds as mutable. By the early 20th century, Einstein introduced a more formal consideration of a variable light speed in the context of gravity, predating the full development of general relativity. In his 1911 paper, Einstein proposed that the speed of light decreases in a gravitational potential to account for the predicted shift of spectral lines, deriving a formula in which c varies as c(1 + φ/c²), with φ the gravitational potential. This idea, though superseded by the curved-spacetime description of 1915, marked an influential shift toward viewing c as potentially non-constant under gravitational influence, motivating further exploration of its variability. Later, amid debates over the nature of cosmological redshifts, some physicists invoked a varying c as an alternative to galactic recession or "tired light" hypotheses, suggesting that a decreasing speed of light over cosmic distances could mimic the observed stretching without requiring an expanding universe. Philosophically, these early ideas stemmed from the view that the speed of light was not inherently fundamental or immutable, especially before special relativity elevated c to a cornerstone of spacetime structure.
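Einstein's 1911 relation c(1 + φ/c²) implies a tiny fractional slowing of light near a mass. A minimal sketch, assuming standard solar values (not given in the text), evaluates the shift at the solar surface:

```python
# Fractional change of c in Einstein's 1911 proposal, c' = c*(1 + phi/c^2),
# evaluated at the solar surface with phi = -G*M/R (standard SI values).
G = 6.674e-11          # m^3 kg^-1 s^-2
M = 1.989e30           # solar mass, kg
R = 6.96e8             # solar radius, m
c = 2.998e8            # m/s

phi = -G * M / R                   # Newtonian potential at the surface
fractional_shift = phi / c**2      # (c' - c)/c
print(fractional_shift)            # ~ -2.1e-6
```

A change of about two parts per million is far too small to have been directly measurable in 1911, which is why the prediction was tested instead through spectral-line shifts and light deflection.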
Proponents argued that physical constants might evolve with the universe's development, avoiding ad hoc assumptions about their permanence and allowing for a more dynamic cosmology. A key catalyst for work in the 1950s was Paul Dirac's 1937 large numbers hypothesis, which posited that the gravitational constant G decreases over cosmic time to explain coincidences between atomic and astronomical scales, thereby inspiring consideration of variation in other fundamental constants, including the speed of light. This pre-Dicke era lacked rigorous formal theories, but it laid groundwork through practical motivations such as reconciling redshift observations and a philosophical openness to evolving laws.

Dicke's 1957 proposal

In 1957, Robert Dicke proposed a novel framework for gravitation and cosmology that eschewed the principle of equivalence, instead describing gravitational effects through a flat spacetime with a variable speed of light (VSL). In this model, the speed of light c varies spatially near masses, acting like a refractive index ε = c0/c = 1 + 2GM/(rc²), where c0 is the speed far from masses, G is the gravitational constant, M is the mass, and r is the distance; this formulation successfully accounted for the observed deflection of light by the Sun of 1.75 arcseconds. Cosmologically, Dicke extended this to a time-varying c(t) that decreases over cosmic history, proportional to 1/t in an expanding universe, mimicking the slowing of atomic clocks and linking local measurements to global cosmic evolution. The model tied the variation in c to atomic timescales, such that changes in speed directly affect observed frequencies: the relative shift is given by δν/ν = −δc/c, reflecting how a decreasing c redshifts light as photons propagate through evolving cosmic conditions. In this setup, the rate of change of the cosmic scale factor is governed by Ṙ(t) = c(t), allowing for a dynamic expansion in which light propagation adjusts to resolve causal disconnects without invoking expansion-driven mechanisms alone. Dicke aimed to address key cosmological puzzles, including the large redshifts of quasars, which he interpreted through "rod shortening" effects from varying c rather than pure Doppler or expansion shifts, yielding a redshift formula z + 1 = (t2/t1)^{1/4} for radiation-dominated epochs. This approach also tackled the horizon problem by permitting faster light travel in the early universe, enabling causal contact across vast distances without reliance on steady-state cosmology, while connecting to the flatness issue via adjusted dynamics.
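Dicke's effective index ε = 1 + 2GM/(rc²) reproduces the full general-relativistic deflection of a grazing light ray, 4GM/(c²b) for impact parameter b. A quick check with standard solar values (a sketch; the 4GM/(c²b) integral result is assumed here, not derived):

```python
import math

# Grazing-ray deflection implied by Dicke's index eps = 1 + 2GM/(r c^2):
# integrating the ray bending gives the standard result 4GM/(c^2 b).
G = 6.674e-11          # m^3 kg^-1 s^-2
M = 1.989e30           # solar mass, kg
c = 2.998e8            # m/s
b = 6.96e8             # impact parameter ~ solar radius, m

deflection_rad = 4 * G * M / (c**2 * b)
deflection_arcsec = deflection_rad * 180 / math.pi * 3600
print(deflection_arcsec)   # ~1.75 arcseconds
```

Recovering the observed 1.75-arcsecond value is exactly the consistency check the text credits to Dicke's refractive-index formulation.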
The proposal integrated with scalar-tensor gravity ideas, foreshadowing the Brans-Dicke theory, in which a scalar field modulates the gravitational coupling and embodies Mach's principle by tying gravitational strength to the universe's total mass distribution. While Dicke's ideas garnered initial interest for their elegance in linking gravitation, electromagnetism, and cosmology, particularly in relation to Dirac's large number hypothesis, the model saw limited adoption due to insufficient empirical support and challenges in reconciling it with observations such as particle number conservation.

Late 20th-century developments

In the 1990s, physicist John W. Moffat developed a bimetric theory of gravity that introduced two distinct metrics: one governing the propagation of matter with a constant speed cm and another for light with a variable speed cl. This framework allowed for a varying speed of light while maintaining consistency with special relativity in local frames, aiming to address cosmological issues like the horizon problem through non-constant light propagation. In the early 1990s, Fred Hoyle, Geoffrey Burbidge, and Jayant V. Narlikar proposed the quasi-steady-state cosmology (QSSC), an alternative to the standard model that incorporated a varying gravitational constant alongside other evolving physical constants to better fit observational data on galaxy distributions and cosmic microwave background features. In this model, the universe undergoes periodic expansions and contractions with matter creation, where adjustments to G helped reconcile the theory with observations of large-scale structure without relying on a hot Big Bang. Jean-Pierre Petit contributed to VSL ideas in 1988 with a gauge cosmological model featuring a variable light velocity tied to evolving fundamental constants such as the Planck constant h and G. In Petit's approach, characteristic lengths (such as Compton and Schwarzschild radii) scale with the cosmic scale factor R(t), leading to c ∝ 1/R and enabling interpretations of redshifts as arising from secular variations in these constants rather than solely from expansion. A significant breakthrough came in 1998 when Andreas Albrecht and João Magueijo introduced a VSL model in which the speed of light varies inversely with the cosmic scale factor, c ∝ 1/a(t), to resolve the horizon problem without invoking inflation.
This variation allows distant regions of the early universe to achieve thermal equilibrium via faster light travel, while the modified Friedmann equation becomes

H² = (8πG/3) ρ (c/c0)⁴,

where H is the Hubble parameter, ρ is the energy density, G is the gravitational constant, c0 is the current speed of light, and the (c/c0)⁴ term accounts for the enhanced effective density in a radiation-dominated era with varying c. Their work demonstrated that such a model could also alleviate the flatness problem and suppress magnetic monopoles, providing a Lorentz-violating alternative to inflationary cosmology.
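The horizon-problem argument can be illustrated numerically. In a toy radiation era with a(t) = √t (units t0 = c0 = 1, an assumption for illustration only), the comoving distance light travels from an early time ε, ∫ε¹ c(t)/a(t) dt, saturates near 2 for constant c but grows without bound as ε → 0 when c ∝ 1/a:

```python
import math

# Toy comparison (units t0 = c0 = 1; radiation era, a(t) = sqrt(t)):
# comoving horizon = integral of c(t)/a(t) dt from eps to 1,
# integrated in u = ln t for accuracy near t -> 0.
def comoving_horizon(c_of_t, eps, n=20000):
    u0, du = math.log(eps), -math.log(eps) / n
    total = 0.0
    for i in range(n):
        t = math.exp(u0 + (i + 0.5) * du)    # midpoint in log-time
        total += c_of_t(t) / math.sqrt(t) * t * du
    return total

const_c = lambda t: 1.0                   # standard cosmology: horizon -> 2 (finite)
vsl_c = lambda t: 1.0 / math.sqrt(t)      # c ∝ 1/a: horizon ~ ln(1/eps), unbounded

for eps in (1e-4, 1e-8):
    print(eps, comoving_horizon(const_c, eps), comoving_horizon(vsl_c, eps))
```

With constant c the integral converges (analytically to 2(1 − √ε)), so early patches can never exchange signals across the whole sky; with c ∝ 1/a it diverges logarithmically as ε → 0, which is the sense in which the Albrecht–Magueijo variation restores causal contact.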

Key VSL models and frameworks

Bimetric and varying-e theories

Bimetric gravity theories introduce a framework for variable speed of light (VSL) by employing two distinct metrics to describe different sectors of physics, thereby allowing the effective speed of light to vary without fundamentally breaking Lorentz invariance in local frames. In this approach, one metric, g_{μν}, governs the dynamics of gravity and massive matter, while a second metric, γ_{μν} (often denoted ĝ_{μν}), is associated with electromagnetism and massless particles. The speed of light emerges as the ratio of proper distances measured in these metrics, c = ds_γ/ds_g, enabling c to vary spatiotemporally as a function of the relative scaling between the metrics. This formulation was pioneered by John W. Moffat in 2002, who incorporated an interaction term in the action to couple the two metrics, ensuring consistency with general relativity in the appropriate limits while permitting VSL effects on cosmological scales. The interaction between the metrics is typically mediated by a scalar field φ, which dynamically adjusts the conformal relation γ_{μν} = Ω²(φ) g_{μν}, where Ω(φ) determines the variation in c. This scalar-tensor extension leads to modified field equations, including a wave equation for φ: γ^{μν} ∇_μ ∇_ν φ + K V′[φ] = 0, where V[φ] is the potential and K is a coupling constant. Such models preserve local Lorentz invariance for each metric separately, meaning observers in the gravitational frame experience standard relativity, while the electromagnetic frame allows for varying light-propagation speeds. This duality provides a mechanism for global VSL without local violations of causality or equivalence principles.
Complementing bimetric approaches, varying-ε models treat the electric permittivity of the vacuum, ε(t), as a time-dependent quantity driven by a scalar field, leading to an effective VSL through the relation c(t) = 1/√(ε(t) μ0).
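A minimal sketch of this relation, using standard vacuum values and a hypothetical small permittivity shift: today's ε0 and μ0 recover c, and a fractional change δε/ε maps to δc/c ≈ −δε/(2ε).

```python
import math

# c(t) = 1 / sqrt(eps(t) * mu0): with today's vacuum values this recovers c,
# and a small permittivity shift gives dc/c = -(1/2) * d(eps)/eps to first order.
eps0 = 8.8541878128e-12   # vacuum permittivity, F/m
mu0 = 1.25663706212e-6    # vacuum permeability, H/m

def c_of_eps(eps):
    return 1.0 / math.sqrt(eps * mu0)

c_today = c_of_eps(eps0)
print(c_today)                      # ~2.99792458e8 m/s

delta = 1e-6                        # hypothetical fractional increase in eps
dc_over_c = c_of_eps(eps0 * (1 + delta)) / c_today - 1
print(dc_over_c)                    # ~ -5e-7, i.e. -delta/2
```

The factor of one half comes from the square root: a slowly growing permittivity yields a correspondingly half-as-fast decrease in the effective speed of light.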