Thermalisation
from Wikipedia

In physics, thermalisation (or thermalization) is the process of physical bodies reaching thermal equilibrium through mutual interaction. In general, the natural tendency of a system is towards a state of equipartition of energy and uniform temperature that maximizes the system's entropy. Thermalisation, thermal equilibrium, and temperature are therefore important fundamental concepts within statistical physics, statistical mechanics, and thermodynamics, all of which serve as a basis for many other fields of scientific understanding and engineering application.

Most introductory textbooks treating quantum statistical mechanics[4] assume that systems reach thermal equilibrium (thermalisation), a process that erases local memory of the initial conditions. The eigenstate thermalisation hypothesis is a hypothesis about when quantum states will undergo thermalisation and why.

Not all quantum states undergo thermalisation. Some states have been discovered which do not (see below), and their reasons for not reaching thermal equilibrium are unclear as of March 2019.

Theoretical description


The process of equilibration can be described using the H-theorem or the relaxation theorem,[5] see also entropy production.

Systems resisting thermalisation


Classical systems


Broadly-speaking, classical systems with non-chaotic behavior will not thermalise. Systems with many interacting constituents are generally expected to be chaotic, but this assumption sometimes fails. A notable counter example is the Fermi–Pasta–Ulam–Tsingou problem, which displays unexpected recurrence and will only thermalise over very long time scales.[6] Non-chaotic systems which are perturbed by weak non-linearities will not thermalise for a set of initial conditions, with non-zero volume in the phase space, as stated by the KAM theorem, although the size of this set decreases exponentially with the number of degrees of freedom.[7] Many-body integrable systems, which have an extensive number of conserved quantities, will not thermalise in the usual sense, but will equilibrate according to a generalized Gibbs ensemble.[8][9]

Quantum systems


Several phenomena resist the tendency to thermalise (see, e.g., quantum scars), including:[10]

  • Conventional quantum scars,[11][12][13][14] which refer to eigenstates with probability density enhanced along unstable periodic orbits, far beyond what one would intuitively predict from classical mechanics.
  • Perturbation-induced quantum scarring:[15][16][17][18][19] despite the similarity in appearance to conventional scarring, these scars have a novel underlying mechanism stemming from the combined effect of nearly-degenerate states and spatially localized perturbations,[15][19] and they can be employed to propagate quantum wave packets in a disordered quantum dot with high fidelity.[16]
  • Many-body quantum scars.
  • Many-body localisation (MBL),[20] in which quantum many-body systems retain memory of their initial conditions in local observables for arbitrarily long times.[21][22]

Other systems that resist thermalisation and are better understood are quantum integrable systems[23] and systems with dynamical symmetries.[24]

from Grokipedia
Thermalisation, also spelled thermalization, is the process by which an isolated physical system evolves from a non-equilibrium initial state to a state of thermal equilibrium through unitary or classical dynamics, with local observables relaxing to the values predicted by statistical ensembles such as the microcanonical or canonical (Gibbs) distribution. This phenomenon reconciles the reversible, deterministic laws governing microscopic constituents with the irreversible, probabilistic behavior observed macroscopically, forming a cornerstone of statistical mechanics and thermodynamics. In classical mechanics, thermalisation occurs via ergodic dynamics in chaotic systems, where trajectories densely explore the available phase space, ensuring that long-time averages of observables equal averages over energy shells and producing uniform probability distributions consistent with conserved quantities such as the total energy. This behavior is typical of non-integrable systems, such as gases or coupled oscillators, but fails in integrable cases where additional conserved quantities constrain the dynamics and prevent full phase-space delocalization.

In quantum mechanics, particularly for isolated many-body systems, thermalisation is often explained by the eigenstate thermalization hypothesis (ETH), which posits that individual energy eigenstates of generic, non-integrable Hamiltonians exhibit thermal properties locally, with matrix elements of observables fluctuating around microcanonical expectations. Under ETH, initial states with subextensive energy fluctuations evolve such that expectation values of local operators stabilize at thermal predictions after a relaxation time, even without coupling to an external bath. Quantum thermalisation is linked to quantum chaos, evidenced by level-spacing statistics resembling random matrix theory, and has been experimentally verified in ultracold atomic gases and spin chains. Notable exceptions include integrable quantum systems and those exhibiting many-body localisation (MBL), where disorder induces a breakdown of thermalisation, leading to persistent non-equilibrium behavior or failure to reach equilibrium.

Additionally, prethermalisation describes an intermediate regime in which a system approaches a quasi-steady state resembling a generalized Gibbs ensemble before full thermalisation, common in nearly integrable or driven systems with well-separated timescales. These concepts extend to open systems and nonequilibrium steady states, with applications ranging from condensed matter physics to cosmology.

Introduction

Definition and Core Concepts

Thermalisation refers to the process by which a non-equilibrium system evolves toward thermal equilibrium, characterized by the statistical distribution of energy among its degrees of freedom according to a single temperature. In this state, the system's macroscopic properties, such as pressure and volume for a gas, become uniform and time-independent, reflecting the underlying microscopic dynamics that lead to energy equipartition. This process is fundamental in statistical mechanics, where it bridges the gap between deterministic microscopic laws and emergent thermodynamic behavior.

At the core of thermalisation lies the Maxwell-Boltzmann distribution: for a classical gas in equilibrium, the probability of a particle having speed v is proportional to e^{-mv^2/2kT}, with m the particle mass, v the speed, k Boltzmann's constant, and T the temperature. Equilibrium can be described using statistical ensembles: the microcanonical ensemble applies to isolated systems with fixed energy, treating all microstates with that energy as equally probable, while the canonical ensemble is used for systems in contact with a heat bath, where energy fluctuates but temperature is fixed. The second law of thermodynamics drives thermalisation by dictating that the entropy of an isolated system cannot decrease, ensuring the system progresses toward the most probable configuration. Temperature serves as a basic prerequisite for understanding thermalisation, defined as a measure of the average kinetic energy per degree of freedom, amounting to \frac{3}{2}kT of translational kinetic energy per particle in a monatomic ideal gas. The endpoint of this process is entropy maximization, where the system reaches the macrostate with the highest number of accessible microstates, thereby achieving maximum disorder and stability. For instance, consider a simple gas confined in a container and initially perturbed by heating one side: through molecular collisions, energy redistributes until the entire gas attains a uniform temperature, exemplifying classical thermalisation.

In quantum systems, extensions such as the eigenstate thermalization hypothesis provide analogous mechanisms, though detailed treatments are beyond this foundational scope.
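The equipartition statement above can be checked numerically. The minimal sketch below samples velocities from the Maxwell-Boltzmann distribution (each Cartesian component Gaussian with variance kT/m) and compares the mean kinetic energy per particle with \frac{3}{2}kT; the mass and temperature are arbitrary illustrative values, not taken from the text.

```python
import numpy as np

rng = np.random.default_rng(0)
k_B = 1.380649e-23      # Boltzmann constant, J/K
T = 300.0               # illustrative temperature, K
m = 6.6e-26             # illustrative atomic mass (argon-like), kg
N = 200_000             # number of sampled particles

# In equilibrium each velocity component is Gaussian with variance kT/m;
# together the three components realise the 3D Maxwell-Boltzmann distribution.
v = rng.normal(0.0, np.sqrt(k_B * T / m), size=(N, 3))

# Mean kinetic energy per particle should approach (3/2) kT by equipartition.
mean_ke = 0.5 * m * (v**2).sum(axis=1).mean()
print(mean_ke / (1.5 * k_B * T))   # ratio close to 1
```

The ratio printed at the end deviates from 1 only by sampling noise, which shrinks as 1/\sqrt{N}.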

Significance Across Physics Disciplines

Thermalisation plays a pivotal role in thermodynamics by enabling the attainment of equilibrium states essential for the operation of engines and refrigerators, where systems exchange heat to convert heat into work or vice versa while adhering to the second law. In statistical mechanics, it underpins ensemble theory, allowing the description of macroscopic properties through probabilistic averaging over microstates, thus bridging microscopic dynamics to observable thermodynamic quantities. Within cosmology, thermalisation governed the cooling of the early universe following the Big Bang, enabling processes such as nucleosynthesis and recombination by distributing energy among particles until equilibrium was reached. In biological systems, thermalisation contributes to metabolic balance, where organisms maintain internal equilibrium against environmental fluctuations through dissipative processes that shed excess energy as heat.

The phenomenon is central to understanding irreversibility: thermalisation drives systems toward states of higher entropy, providing the thermodynamic basis for the arrow of time and explaining why processes like diffusion or heat flow appear unidirectional in macroscopic observations. This irreversibility underlies practical technologies such as refrigeration, which exploits controlled thermalisation to achieve cooling by reversing natural heat flows through external work. Thermalisation connects to chaos and ergodicity by ensuring that, in sufficiently complex systems, trajectories densely explore phase space, allowing time averages to equal ensemble averages and enabling predictive statistical descriptions of otherwise unpredictable dynamics. Its failure in disordered systems, however, leads to phenomena such as amorphous solids (glasses), where structural arrest prevents equilibrium, or many-body localisation, where disorder prevents quantum systems from delocalizing energy, preserving initial correlations indefinitely.

Historical Context

Early Foundations in Thermodynamics

The foundations of thermalisation in thermodynamics emerged in the early 19th century, driven by efforts to understand heat engines and the directional flow of heat, without reliance on microscopic explanations. Sadi Carnot's 1824 publication, Réflexions sur la puissance motrice du feu, analyzed the efficiency of idealized heat engines operating between hot and cold reservoirs, establishing that maximum work extraction occurs when the system approaches equilibrium through reversible processes, laying the groundwork for concepts of thermal balance in macroscopic systems. This work highlighted the inevitability of heat transfer from hotter to cooler bodies, a key aspect of thermalisation, as engines inevitably dissipate energy toward uniform temperature states. Building on Carnot's ideas, William Thomson (Lord Kelvin) in the 1840s and 1850s advanced the view of heat as molecular motion, interpreting thermal equilibrium as the state in which such motions equalize across interacting bodies, though still within a phenomenological framework. Thomson's contributions emphasized the transitivity of thermal equilibrium (now formalized as the zeroth law), whereby if two systems are each in equilibrium with a third, they are in equilibrium with one another, enabling the consistent definition of temperature as a measure of thermalisation progress. Rudolf Clausius further refined these principles, introducing entropy in 1865 as a measure of energy dispersal and quantifying how isolated systems evolve toward equilibrium through irreversible heat flows, with the second law asserting that entropy tends to increase in such processes. These laws described thermalisation as a macroscopic, directional phenomenon governed by empirical observations, such as heat spontaneously flowing from hot to cold until equilibrium is reached, without probing underlying mechanisms.

The development of these thermodynamic principles played a pivotal role in the Industrial Revolution, particularly by informing improvements to steam engines, where understanding the conversion of heat into work allowed engineers to optimize efficiency and reduce waste. For instance, Carnot's cycle analysis guided refinements in engine design, enabling higher performance in factories and transportation, thus fueling industrial expansion. However, 19th-century thermodynamics remained limited to phenomenological descriptions, unable to explain microscopically why systems inevitably thermalize or why reversibility is an idealization rather than a reality, prompting later shifts toward statistical mechanics for deeper insights.

Developments in Statistical and Quantum Mechanics

The groundwork for statistical mechanics was laid by James Clerk Maxwell in the 1860s, who developed the kinetic theory of gases in papers such as his 1860 Illustrations of the Dynamical Theory of Gases, deriving the Maxwell distribution of molecular velocities and emphasizing probabilistic descriptions of particle motions to explain macroscopic thermal properties such as viscosity and diffusion. In the late 19th century, Ludwig Boltzmann advanced the microscopic understanding of thermalisation through his development of kinetic theory, particularly the H-theorem introduced in 1872, which mathematically demonstrates the approach to equilibrium in a dilute gas via the monotonic decrease of the H-function, corresponding to an increase in entropy. The theorem relies on the Boltzmann equation, which governs the time evolution of the single-particle distribution function f(\mathbf{r}, \mathbf{v}, t):

\frac{\partial f}{\partial t} + \mathbf{v} \cdot \nabla f + \mathbf{F} \cdot \nabla_v f = \left( \frac{\partial f}{\partial t} \right)_{\text{coll}},

where the left-hand side accounts for streaming and external forces, and the collision term on the right drives thermalisation by redistributing energy through binary collisions, assuming weak interactions and large mean free paths. Central to Boltzmann's framework is the ergodic hypothesis, positing that time averages over a single system's trajectory equal ensemble averages over many systems, justifying the replacement of dynamical calculations with statistical ones for equilibrium properties. Additionally, the molecular chaos assumption, or Stosszahlansatz, underpins the collision term by supposing that particle velocities are uncorrelated before collisions, enabling the derivation of irreversible behavior from reversible microscopic laws.

Building on Boltzmann's kinetic approach, J. Willard Gibbs formalized ensemble theory in his 1902 treatise Elementary Principles in Statistical Mechanics, introducing the microcanonical, canonical, and grand canonical ensembles as conceptual collections of systems used to compute thermodynamic averages without explicit dynamics, thus providing a probabilistic foundation for equilibrium statistical mechanics across isolated, closed, and open systems. Gibbs' ensembles shifted focus from trajectory-specific dynamics to phase-space probabilities, complementing Boltzmann's collision-driven picture by emphasizing configurational aspects of thermalisation. In the early 20th century, Albert Einstein extended these ideas toward quantum precursors, developing fluctuation theories and early quantum ideas that anticipated quantized distributions, as in his 1905 work on light quanta and his 1907 model for the specific heat of solids. Meanwhile, Max Planck's 1900 quantum hypothesis, which resolved the ultraviolet catastrophe by positing energy emission in discrete quanta E = h\nu for blackbody radiation, indirectly bolstered thermalisation concepts by implying quantized energy exchanges consistent with equilibrium distributions at finite temperatures. These developments collectively carried thermalisation from classical kinetic descriptions toward quantum-compatible frameworks, emphasizing probabilistic relaxation mechanisms over deterministic ones.

Theoretical Framework

Classical Thermalisation Processes

In classical systems, thermalisation occurs primarily through dynamical processes involving particle interactions, such as elastic collisions in dilute gases, which redistribute energy and momentum among the particles. These collisions, modeled by the Boltzmann equation, determine how the one-particle distribution function f(\mathbf{r}, \mathbf{v}, t) evolves toward equilibrium under binary interactions, assuming molecular chaos. The collision integral in the Boltzmann equation accounts for the gain and loss of particles in specific velocity states due to collisions, driving the system toward a state in which local thermodynamic variables such as temperature and pressure become well-defined.

A key mathematical demonstration of this irreversible approach to equilibrium is provided by Boltzmann's H-theorem, which quantifies the monotonic growth of entropy in isolated systems. The H-function is defined as

H(t) = \int f(\mathbf{v}, t) \ln f(\mathbf{v}, t) \, d^3v,

and its time derivative satisfies \frac{dH}{dt} \leq 0, with equality only at equilibrium, when f is the Maxwell-Boltzmann distribution. This inequality arises from the structure of the collision term, ensuring that deviations from equilibrium diminish over time. Through repeated elastic collisions, the velocity distribution relaxes to the Maxwell-Boltzmann form f(\mathbf{v}) \propto \exp(-mv^2/2kT), in which the average kinetic energy per particle is (3/2)kT in three dimensions. The equipartition theorem underpins this relaxation, stating that in thermal equilibrium each quadratic term in the Hamiltonian contributes an average energy of kT/2. For an ideal gas, this implies equal partitioning of energy among the translational degrees of freedom, leading to the observed Maxwell-Boltzmann statistics after sufficient collisions. The process relies on the ergodic hypothesis, which posits that chaotic dynamics in classical systems ensure time averages equal ensemble averages, allowing exploration of the phase space consistent with equilibrium.

The timescales for thermalisation are governed by the mean collision time \tau_\text{coll} \approx 1/(n \sigma \langle v \rangle), where n is the particle density, \sigma the collision cross-section, and \langle v \rangle the average relative speed; this sets the local relaxation rate, typically much shorter than macroscopic timescales in larger systems. For instance, in a dilute gas, local thermalisation occurs on scales of \tau_\text{coll}, after which heat and momentum diffuse according to transport laws such as Fourier's law, \mathbf{q} = -\kappa \nabla T, which describes heat conduction in the hydrodynamic regime. A representative example is the compression of an ideal gas by a piston: sudden volume reduction creates a non-equilibrium state with an anisotropic velocity distribution. Elastic collisions then redistribute the added energy, relaxing the system to a new equilibrium determined by the work done, with each particle's three translational degrees of freedom acquiring kT/2 of kinetic energy on average, restoring isotropy and the Maxwell-Boltzmann form. If gradients persist after this local relaxation, the system enters a hydrodynamic description via the Euler or Navier-Stokes equations, in which conserved quantities such as mass, momentum, and energy evolve on diffusive timescales that grow with system size.
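The collisional relaxation described above can be illustrated with a toy model rather than a full Boltzmann-equation solver. The sketch below, a Kac-style model under simplifying assumptions (equal masses, isotropic scattering, random collision partners), starts all particles at the same speed with random directions, then applies binary elastic collisions that conserve momentum and energy while randomizing the relative direction; a velocity component's kurtosis relaxes from the non-Gaussian value 1.8 to the Maxwell-Boltzmann (Gaussian) value 3.

```python
import numpy as np

rng = np.random.default_rng(1)
N = 20_000

# Far-from-equilibrium start: every particle has unit speed,
# directions uniform on the sphere (a "delta shell" in velocity space).
n = rng.normal(size=(N, 3))
v = n / np.linalg.norm(n, axis=1, keepdims=True)

def kurtosis(x):
    """Fourth standardized moment; 3 for a Gaussian."""
    x = x - x.mean()
    return (x**4).mean() / (x**2).mean()**2

print("initial kurtosis:", kurtosis(v[:, 0]))   # ~1.8, non-Gaussian

# Random binary elastic collisions (equal masses): keep the centre-of-mass
# velocity and the relative speed, randomize the relative direction.
for _ in range(30):
    idx = rng.permutation(N)
    i, j = idx[:N // 2], idx[N // 2:]
    vcm = 0.5 * (v[i] + v[j])
    g = np.linalg.norm(v[i] - v[j], axis=1, keepdims=True)
    d = rng.normal(size=(N // 2, 3))
    d /= np.linalg.norm(d, axis=1, keepdims=True)
    v[i] = vcm + 0.5 * g * d
    v[j] = vcm - 0.5 * g * d

print("relaxed kurtosis:", kurtosis(v[:, 0]))   # approaches 3 (Gaussian)
```

The collision rule conserves total energy exactly, so the relaxation is purely a reshaping of the distribution, mirroring how the collision integral drives f toward the Maxwell-Boltzmann form.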

Statistical Mechanics Underpinnings

In statistical mechanics, the microcanonical ensemble provides the foundational description for the thermal equilibrium of isolated systems with fixed energy E, volume V, and particle number N, where the probability density is uniform across the hypersurface of constant energy in phase space, ensuring that macroscopic observables approach well-defined averages as the system size increases. This ensemble underpins thermalization by positing that, for large systems, time averages of observables equal ensemble averages due to the near-equivalence of phase space volumes corresponding to similar macroscopic states. J. Willard Gibbs introduced this framework in his seminal work on statistical ensembles, demonstrating its role in deriving thermodynamic properties from probabilistic considerations. For systems coupled to a heat bath at fixed temperature T, the canonical ensemble applies, with the probability of a state proportional to e^{-\beta E_i}, where \beta = 1/(k_B T) and k_B is Boltzmann's constant; thermalization here manifests as relaxation to this Boltzmann distribution, allowing energy exchange while conserving the bath's temperature. Gibbs formalized the canonical ensemble as a collection of systems in thermal contact, resolving inconsistencies in earlier approaches by emphasizing ensemble averaging over single-system trajectories. The grand canonical ensemble extends this to open systems permitting particle exchange with a reservoir at fixed chemical potential \mu, temperature T, and volume V, where the probability distribution incorporates both energy and particle number fluctuations, given by e^{-\beta(E_i - \mu N_i)}. Thermalization in this context involves convergence to equilibrium distributions that account for diffusive particle flows, maintaining chemical equilibrium alongside thermal balance.

Gibbs developed this ensemble to handle systems with variable particle content, such as gases in contact with a particle reservoir, showing its equivalence to the canonical and microcanonical limits under appropriate constraints. Central to these ensembles is the partition function for the canonical case, Z = \sum_i e^{-\beta E_i}, which normalizes probabilities and yields thermodynamic quantities; the average energy is then \langle E \rangle = -\frac{\partial \ln Z}{\partial \beta}, illustrating how thermalization equates ensemble predictions with observed relaxation dynamics. Gibbs derived this structure from the principle of equal a priori probabilities, ensuring consistency across ensembles. Key concepts reinforcing thermalization include detailed balance, which requires that in equilibrium the transition rates between states j and i satisfy W_{i \leftarrow j} / W_{j \leftarrow i} = e^{-\beta(E_i - E_j)}, guaranteeing that the Boltzmann distribution is stationary under stochastic dynamics. Ludwig Boltzmann established this condition in his kinetic theory, proving it ensures the positivity of entropy production and convergence to equilibrium without net fluxes. Complementing this, the fluctuation-dissipation theorem connects equilibrium fluctuations, manifest as noise, to the system's dissipative response to perturbations, quantifying how thermal noise drives relaxation toward equilibrium states. Ryogo Kubo provided a general proof in linear response theory, showing that the response function is proportional to the correlation of fluctuations, thus linking microscopic randomness to macroscopic irreversibility.
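The detailed-balance condition can be demonstrated numerically: a Metropolis chain whose transition rates satisfy W_{i \leftarrow j}/W_{j \leftarrow i} = e^{-\beta(E_i - E_j)} has the Boltzmann distribution e^{-\beta E_i}/Z as its stationary state. The three-level system and \beta value in this sketch are arbitrary illustrations, not taken from the text.

```python
import numpy as np

rng = np.random.default_rng(2)
E = np.array([0.0, 1.0, 2.0])    # illustrative 3-level system, energies in units of k_B T
beta = 1.0
Z = np.exp(-beta * E).sum()       # canonical partition function
p_exact = np.exp(-beta * E) / Z   # Boltzmann probabilities

# Metropolis rule: accept a proposed move with probability min(1, e^{-beta dE}),
# which enforces detailed balance with respect to the Boltzmann distribution.
state = 0
counts = np.zeros(3)
for _ in range(200_000):
    prop = rng.integers(3)
    if rng.random() < np.exp(-beta * (E[prop] - E[state])):
        state = prop
    counts[state] += 1
p_emp = counts / counts.sum()

print(np.round(p_exact, 3))   # exact Boltzmann weights
print(np.round(p_emp, 3))     # empirical visit frequencies, converging to them
```

The empirical frequencies approach the exact Boltzmann weights as the chain length grows, which is the stochastic-dynamics counterpart of relaxation to the canonical ensemble.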
A rigorous underpinning for thermalization in isolated Hamiltonian systems stems from Liouville's theorem, which states that the phase-space density \rho(\mathbf{q}, \mathbf{p}, t) obeys \frac{\partial \rho}{\partial t} + \{\rho, H\} = 0, so that \frac{d\rho}{dt} = 0 along trajectories and the volume of any phase-space region is conserved under the Hamiltonian flow \dot{\mathbf{q}} = \partial H / \partial \mathbf{p}, \dot{\mathbf{p}} = -\partial H / \partial \mathbf{q}. In ergodic systems, this conservation implies that trajectories densely fill the energy surface, leading to uniform sampling of the accessible phase space and thus thermalization of observables. Gibbs integrated this theorem into his theory, arguing that for sufficiently chaotic dynamics, initial non-equilibrium distributions evolve to fill the energy surface uniformly, justifying the approach to equilibrium. The Boltzmann equation serves as a complementary tool for describing the evolution of distributions in dilute gases, bridging microscopic collisions to macroscopic thermalization.

Quantum Aspects

Eigenstate Thermalization Hypothesis

The eigenstate thermalization hypothesis (ETH) posits that in isolated quantum many-body systems exhibiting chaotic behavior, individual energy eigenstates serve as thermal states, effectively reproducing the predictions of statistical mechanics for local observables. The hypothesis provides a microscopic foundation for thermalization in closed quantum systems, analogous to classical ergodicity, where time averages equal ensemble averages in chaotic dynamics. ETH is formally expressed through an ansatz for the matrix elements of a local operator O in the energy eigenbasis \{|n\rangle\}, where the eigenvalues E_n are densely distributed. For diagonal elements, \langle n|O|n \rangle \approx f(O, E_n), with f a smooth function that matches the microcanonical average at energy E_n. Off-diagonal elements are suppressed as \langle n|O|m \rangle \approx e^{-S(E_n)/2} g(O, E_n) R_{nm} for n \neq m, where S(E_n) is the thermodynamic entropy at energy E_n, g is another smooth function, and R_{nm} is a random variable with zero mean and unit variance, ensuring rapid decay with energy difference. This structure implies that eigenstates appear thermal when restricted to subsystems, as the diagonal dominance yields expectation values indistinguishable from those in a thermal ensemble. The diagonal part of the ansatz ensures that the infinite-time average of an observable's expectation value in any initial state aligns with the thermal average, while the off-diagonal part governs the approach to equilibrium by causing dephasing of coherent superpositions. For a non-equilibrium initial state |\psi(0)\rangle = \sum_n c_n |n\rangle, the time-evolved expectation value is

\langle O(t) \rangle = \sum_n |c_n|^2 \langle n|O|n \rangle + \sum_{n \neq m} c_n^* c_m \langle n|O|m \rangle e^{i(E_n - E_m)t/\hbar}.

The first term directly gives the thermal value by the diagonal part of the ansatz, while the second term oscillates and averages to zero over long times because of the random, exponentially small off-diagonal elements, the many-body level spacing being exponentially small in system size. Subsystem thermalization follows, as entanglement entropy grows linearly with subsystem size in such eigenstates (volume-law scaling), making reduced density matrices of local regions equivalent to thermal ones at the corresponding energy. The hypothesis was independently proposed by Deutsch in 1991 and Srednicki in 1994, with Srednicki coining the term and providing an explicit formulation for systems such as a quantum hard-sphere gas. Numerical studies have confirmed the ansatz in one-dimensional spin chains, such as the quantum Ising model with transverse and longitudinal fields, where matrix elements exhibit the predicted random-matrix-like statistics in chaotic parameter regimes, leading to thermal relaxation of local correlations. In these systems, deviations from ETH occur only in integrable cases, underscoring its applicability to non-integrable quantum many-body dynamics.
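The two-term decomposition of \langle O(t) \rangle above can be checked in a small toy model. The sketch below uses a random GOE-like matrix as a stand-in for a chaotic Hamiltonian (an illustrative assumption, not a physical many-body model) and verifies that late-time values of \langle O(t) \rangle agree with the diagonal-ensemble prediction \sum_n |c_n|^2 \langle n|O|n \rangle, the off-diagonal terms having dephased away.

```python
import numpy as np

rng = np.random.default_rng(3)
D = 400                                  # toy Hilbert-space dimension
A = rng.normal(size=(D, D))
H = (A + A.T) / np.sqrt(2 * D)           # GOE-like random "Hamiltonian"
O = np.diag(np.linspace(-1.0, 1.0, D))   # a simple observable

w, V = np.linalg.eigh(H)                 # H |n> = E_n |n>, columns of V
psi0 = np.zeros(D); psi0[0] = 1.0        # out-of-equilibrium initial state
c = V.T @ psi0                           # c_n = <n|psi(0)>

# Diagonal-ensemble prediction: sum_n |c_n|^2 <n|O|n>
O_nn = np.einsum('in,ij,jn->n', V, O, V)
diag_avg = (np.abs(c)**2 * O_nn).sum()

# Late-time average of <O(t)>: off-diagonal terms dephase to zero (hbar = 1)
ts = np.linspace(100.0, 200.0, 50)
vals = [np.real(psi_t.conj() @ O @ psi_t)
        for t in ts
        for psi_t in [V @ (np.exp(-1j * w * t) * c)]]

print(diag_avg, np.mean(vals))           # the two agree up to small fluctuations
```

The residual difference between the two printed numbers is set by the size of the off-diagonal matrix elements, which here scale as 1/\sqrt{D}, echoing the e^{-S/2} suppression in the ETH ansatz.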

Pre-Thermalization and Relaxation Dynamics

In isolated quantum many-body systems, pre-thermalization describes the transient regime following a quantum quench in which the system evolves toward a long-lived quasi-equilibrium state characterized by quasi-conserved quantities, such as approximate integrals of motion, that delay the onset of full thermalization. These quasi-conserved operators, often emerging from weak perturbations to integrable models, suppress rapid heating and maintain a pre-thermal plateau on intermediate timescales, distinct from the eventual ergodic mixing predicted by the eigenstate thermalization hypothesis (ETH) as the endpoint mechanism. The thermalization time, typically scaling polynomially with system size in chaotic many-body systems, is markedly shorter than the Heisenberg time t_H = 2\pi\hbar/\Delta, where \Delta is the mean energy level spacing; this disparity arises because local observables equilibrate via short-time chaotic dynamics before the system fully resolves its dense spectrum at longer scales. In this regime, interactions play a crucial role in dephasing: many-body scattering events destroy quantum coherences, enabling subsystems to approach local thermal-like distributions without requiring global equilibrium. For weakly interacting systems, the relaxation rate governing early decay can be estimated from Fermi's golden rule, \Gamma = (2\pi/\hbar) |V|^2 \rho(E), with V the perturbation matrix element and \rho(E) the density of final states, highlighting the perturbative origin of these processes. In quantum fluids, such as those realized in ultracold atomic gases, hydrodynamic modes emerge during pre-thermalization as long-wavelength, low-frequency excitations that propagate diffusively after initial ballistic dephasing, bridging microscopic interactions to macroscopic transport. These modes, including density and temperature fluctuations, relax on hydrodynamic timescales set by viscosities and conductivities, providing a framework for understanding intermediate dynamics in translationally invariant systems.

Recent investigations (2025) have reported quantum enhancement effects in which initial entanglement between system components accelerates relaxation toward thermal states, potentially optimizing energy transfer in molecular aggregates.

Non-Thermalising Systems

Classical Instances of Resistance

In classical mechanics, thermalization refers to the process by which a system reaches equilibrium through ergodic exploration of its phase space, distributing energy according to the equilibrium ensemble. However, certain classical systems resist this fate because of underlying integrability or near-integrability, where conserved quantities prevent full mixing of phase-space trajectories. These instances are exceptions to the ergodic hypothesis, which posits that time averages equal ensemble averages over sufficiently long times.

A prototypical example of resistance to thermalization is provided by systems of uncoupled or weakly coupled harmonic oscillators. In a collection of independent harmonic oscillators, each mode conserves its energy separately, leading to quasi-periodic motion confined to invariant tori in phase space without chaotic diffusion or energy redistribution across modes. This lack of chaos means the system does not achieve the equipartition of energy predicted by standard thermalization; instead it equilibrates to a generalized ensemble that respects the individual mode invariants. Even in coupled linear harmonic chains, integrability preserves these features, resulting in recurrent dynamics rather than irreversible relaxation. Near-integrable systems, those perturbed slightly from exact integrability, exhibit similar resistance through the persistence of regular orbits. For instance, in near-integrable billiards, such as a slightly deformed circular billiard, the Kolmogorov-Arnold-Moser (KAM) theorem guarantees the survival of most invariant tori under small perturbations, trapping trajectories on these structures and hindering ergodic filling of the energy surface. This leads to slow chaotic diffusion in phase space, where chaotic layers form around resonances but large regions remain non-ergodic, delaying thermalization. The Fermi-Pasta-Ulam (FPU) chain, a near-integrable model of nonlinearly coupled oscillators, exemplifies this: initial excitations recur periodically instead of thermalizing rapidly, owing to the persistence of nearly conserved actions.

The mechanisms underlying this resistance often involve action-angle variables, which transform the Hamiltonian into a form in which the actions are adiabatic invariants conserved under slow perturbations. These invariants restrict motion to toroidal manifolds, preventing the exploration necessary for thermalization and leading to bounded, non-diffusive dynamics. In nearly integrable cases, slow leakage from KAM tori occurs via weak resonances, but overall equilibration remains suppressed. A key result formalizing this stability is the Nekhoroshev theorem, which proves that in nearly integrable Hamiltonian systems with sufficiently small perturbations, trajectories remain close to their initial tori for exponentially long times, on timescales scaling as \exp(c/\epsilon^a), where \epsilon is the perturbation strength and a > 0 depends on the number of degrees of freedom, effectively postponing thermalization to impractically long durations. Unlike quantum analogs involving localization, classical resistance arises purely from regular, non-chaotic motion on invariant structures, breaking ergodicity without disorder. This framework underscores how conserved quantities beyond the energy alone impede the approach to thermal equilibrium in deterministic dynamics.
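The claim that linear chains never equipartition between normal modes can be verified directly. The sketch below (unit masses, an illustrative coupling constant) integrates two coupled harmonic oscillators with velocity Verlet and checks that the two normal-mode energies stay at their initial values: energy sloshes between the particles, but never between the modes.

```python
import numpy as np

# Two unit-mass oscillators with on-site stiffness 1 and coupling k (illustrative):
# an integrable linear system whose normal-mode energies never mix.
k = 0.3
w2 = np.array([1.0, 1.0 + 2.0 * k])      # squared normal-mode frequencies

x = np.array([1.0, 0.0])                 # excite only particle 1
p = np.array([0.0, 0.0])

def force(x):
    return np.array([-x[0] - k * (x[0] - x[1]),
                     -x[1] - k * (x[1] - x[0])])

def mode_energies(x, p):
    # normal coordinates q± = (x1 ± x2)/sqrt(2), likewise for momenta
    q = np.array([x[0] + x[1], x[0] - x[1]]) / np.sqrt(2)
    pq = np.array([p[0] + p[1], p[0] - p[1]]) / np.sqrt(2)
    return 0.5 * (pq**2 + w2 * q**2)

E0 = mode_energies(x, p)
dt = 0.01
f = force(x)
for _ in range(50_000):                  # velocity-Verlet integration
    p += 0.5 * dt * f
    x += dt * p
    f = force(x)
    p += 0.5 * dt * f

print(E0)                                # initial mode energies
print(mode_energies(x, p))               # unchanged: no equipartition between modes
```

The two mode energies act as the "individual mode invariants" of the text: they are exact conserved actions, so the dynamics stays on an invariant torus instead of exploring the full energy surface.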

Quantum Many-Body Localization

Quantum many-body localization (MBL) represents the extension of Anderson localization from non-interacting single-particle systems to interacting many-body quantum systems, where sufficiently strong disorder inhibits the delocalization of excitations and prevents thermalization. In this regime, the system's eigenstates remain localized even at finite temperatures, exhibiting an area-law scaling for entanglement entropy rather than the volume-law typical of thermal states. This phenomenon was first theoretically proposed in the context of weakly interacting electrons, demonstrating that repulsive interactions do not destroy localization when disorder exceeds a critical strength, leading to an insulating phase despite the presence of interactions. The primary mechanism underlying MBL involves disorder that pins individual particle or spin configurations, suppressing the resonant hybridization of states that would otherwise drive thermalization. In disordered spin chains or lattice models, random on-site potentials or couplings localize the single-particle wavefunctions exponentially, with the localization length ξ remaining finite such that ψ(r) ~ e^{-r/ξ}, where r is the distance from the localization center. Interactions between particles introduce perturbative corrections but fail to delocalize the system fully when disorder is strong, resulting in a basis of quasi-local integrals of motion (often called l-bits) that preserve initial state information over long times. This localization breaks the eigenstate thermalization hypothesis (ETH), as eigenstates do not resemble thermal ensembles and exhibit negligible off-diagonal matrix elements in the energy basis. Additionally, after a quantum quench, entanglement in MBL systems grows logarithmically with time, contrasting the linear growth in thermalizing systems. 
A rigorous proof of MBL's existence was established by Imbrie in 2016 for certain one-dimensional quantum spin chains with random interactions, relying on a physically reasonable assumption of limited level attraction in the eigenvalue statistics to demonstrate the stability of localized eigenstates. Numerical and perturbative analyses further support the existence of a transition as a function of disorder strength, separating the ergodic, thermalizing phase from the non-ergodic, localized one. Recent discussions, including at the 2025 Workshop on Recent Advances in Disordered Systems (RAD), have explored the nature of MBL transitions in disordered oscillator systems and their indicators, highlighting ongoing debates on the sharpness and universality of the transition.
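A standard numerical indicator of the transition as a function of disorder strength is the mean gap ratio r = ⟨min(δ_n, δ_{n+1}) / max(δ_n, δ_{n+1})⟩ of consecutive level spacings: it approaches ≈ 0.53 (GOE statistics) in the thermalizing phase and ≈ 0.39 (Poisson statistics) deep in the localized phase. The sketch below (illustrative system size and disorder values) computes it for a random-field Heisenberg chain restricted to the S^z = 0 sector, the model commonly used in such studies:

```python
import numpy as np
from itertools import combinations

def gap_ratio(L=10, W=1.0, nreal=20, seed=0):
    """Mean consecutive-gap ratio of a random-field Heisenberg chain (S^z = 0 sector)."""
    # basis: bitstrings with exactly L//2 up spins
    states = [sum(1 << i for i in c) for c in combinations(range(L), L // 2)]
    index = {s: k for k, s in enumerate(states)}
    dim = len(states)
    rng = np.random.default_rng(seed)
    ratios = []
    for _ in range(nreal):
        h = rng.uniform(-W, W, L)              # random longitudinal fields
        H = np.zeros((dim, dim))
        for k, s in enumerate(states):
            for i in range(L - 1):             # open boundary conditions
                bi, bj = (s >> i) & 1, (s >> (i + 1)) & 1
                if bi == bj:
                    H[k, k] += 0.25            # Sz.Sz, parallel neighbours
                else:
                    H[k, k] -= 0.25            # Sz.Sz, antiparallel neighbours
                    t = s ^ (0b11 << i)        # spin flip-flop term
                    H[index[t], k] += 0.5
            H[k, k] += sum(h[i] * (0.5 if (s >> i) & 1 else -0.5) for i in range(L))
        E = np.linalg.eigvalsh(H)
        d = np.diff(E[dim // 4 : 3 * dim // 4])        # middle of the spectrum
        ratios.extend(np.minimum(d[:-1], d[1:]) / np.maximum(d[:-1], d[1:]))
    return float(np.mean(ratios))

r_ergodic = gap_ratio(W=1.0)    # weak disorder: expect r near the GOE value ~0.53
r_mbl = gap_ratio(W=8.0)        # strong disorder: expect r near the Poisson value ~0.39
print(r_ergodic, r_mbl)
```

Restricting to a single symmetry sector is essential: mixing conserved-S^z blocks would produce Poisson-like statistics even in the ergodic phase and obscure the diagnostic.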

Applications and Recent Advances

Observations in Condensed Matter Experiments

Experiments with ultracold atoms confined in optical lattices provide key evidence for thermalization in interacting quantum many-body systems. These setups simulate lattice models such as the Bose-Hubbard Hamiltonian, with on-site interactions and hopping amplitudes tuned through the lattice laser intensity. Following a sudden quench, such as a rapid change in interaction strength, the system evolves unitarily and relaxes toward a thermal state, as observed through the broadening of momentum distributions and the approach to a steady-state density profile. In bosonic gases, quantum effects enhance the thermalization rate compared to classical predictions, with relaxation dynamics showing faster equilibration due to coherent many-body interference.

A seminal 2018 experiment using arrays of 6 to 13 Rydberg atoms demonstrated thermal-like behavior in individual energy eigenstates, supporting the eigenstate thermalization hypothesis through measurements of entanglement entropy and local observables. The atoms were prepared in pure states, and the measured diagonal matrix elements of the density operator aligned with thermal expectations, indicating subsystem thermalization without an external bath. Equilibration times in such quenched ultracold systems typically range from tens to hundreds of milliseconds, scaling with interaction strength and system size, as tracked by in-situ imaging of atomic densities.

Nuclear magnetic resonance (NMR) experiments on solid-state spin chains, such as those using ¹⁹F nuclei in crystals, have verified thermalization by engineering effective Hamiltonians that mimic interacting models. Local observables, such as spin correlations, evolve to thermal values after initialization in non-equilibrium states, with relaxation dynamics consistent with ergodic behavior in the absence of disorder.
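The quench protocol used in these experiments can be mimicked in a small exact-diagonalization calculation. The sketch below (a toy model with illustrative couplings, not the experimental Hamiltonian) evolves a spin-polarized product state under a non-integrable mixed-field Ising chain and checks that the long-time average of a local observable agrees with the diagonal-ensemble prediction Σ_n |c_n|² ⟨n|A|n⟩, which is what thermalisation of an isolated system requires:

```python
import numpy as np

def op(single, site, L):
    """Embed a single-site operator at `site` in an L-spin chain."""
    m = np.eye(1)
    for j in range(L):
        m = np.kron(m, single if j == site else np.eye(2))
    return m

L = 8
sx = np.array([[0.0, 1.0], [1.0, 0.0]])
sz = np.array([[1.0, 0.0], [0.0, -1.0]])

# mixed-field Ising chain: non-integrable, expected to thermalize
H = sum(op(sz, i, L) @ op(sz, i + 1, L) for i in range(L - 1))
H += sum(0.9 * op(sx, i, L) + 0.45 * op(sz, i, L) for i in range(L))

E, V = np.linalg.eigh(H)
psi0 = np.zeros(2**L)
psi0[0] = 1.0                                # |up, up, ..., up> product state
c = V.T @ psi0                               # overlaps with energy eigenstates
A = op(sz, L // 2, L)                        # local observable: one-site magnetization
A_eig = V.T @ A @ V

diag_ens = float(np.sum(c**2 * np.diag(A_eig)))   # diagonal-ensemble prediction

# long-time average of <A(t)> over a late time window
ts = np.linspace(50, 150, 200)
vals = [(c * np.exp(-1j * E * t)).conj() @ A_eig @ (c * np.exp(-1j * E * t)) for t in ts]
avg = float(np.mean(vals).real)
print(diag_ens, avg)   # the two agree once off-diagonal terms have dephased
```

The agreement relies on dephasing of the off-diagonal terms e^{i(E_m−E_n)t}, which is why the average is taken over a window long compared to the inverse level spacing.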
Recent advances in 2024-2025 quantum simulators, such as 69-qubit superconducting platforms, have extended these observations to larger scales, revealing thermalization near quantum critical points in mixed analogue-digital evolutions. In contrast to these thermalizing regimes, setups with engineered disorder in similar spin chains show suppressed relaxation, highlighting many-body localization as a non-thermalizing counterpart. In addition, 2025 studies on Rydberg-atom arrays have explored non-thermalizing dynamics through quantum many-body scars, advancing quantum simulation capabilities.

Thermalisation in High-Energy Physics

In high-energy physics, thermalisation plays a crucial role in the dynamics of the quark-gluon plasma (QGP) formed during ultrarelativistic heavy-ion collisions at facilities such as the Relativistic Heavy Ion Collider (RHIC) and the Large Hadron Collider (LHC). These collisions create an initial state of highly anisotropic, far-from-equilibrium gluon fields, which rapidly evolve toward local thermal equilibrium through processes such as elastic and inelastic scatterings, leading to QGP formation. Isotropization, the transition from anisotropic to isotropic momentum distributions, occurs on timescales of approximately 0.5–1 fm/c after the collision, enabling the system to behave as a near-ideal fluid described by relativistic hydrodynamics.

A parallel context for thermalisation arises in the early universe during the reheating phase following cosmic inflation, where the energy stored in the oscillating inflaton field is transferred to particles, establishing a hot thermal plasma. This process involves parametric resonance and perturbative decays, with thermalisation potentially occurring almost instantaneously through efficient interactions among the produced particles, achieving temperatures up to 10^15 GeV depending on the inflaton potential. Unlike the rapid, collision-driven thermalisation of heavy-ion experiments, reheating unfolds over an extended period, but both scenarios highlight the transition from non-equilibrium initial conditions to a thermal bath supporting subsequent cosmological evolution.

Experimental observations in heavy-ion collisions provide evidence for thermalisation, notably through jet quenching, in which high-energy partons lose energy via interactions with the medium, suppressing high-p_T yields and indicating a dense QGP with temperatures exceeding 200 MeV. Similarly, the observation of azimuthal anisotropies in particle yields, quantified by Fourier coefficients v_n, signals collective hydrodynamic flow, which requires local equilibration to generate the pressure gradients driving the expansion.
These flow patterns, observed across collision centralities at RHIC and the LHC, align with viscous hydrodynamic simulations, confirming thermalisation within about 1 fm/c. The shear viscosity to entropy density ratio η/s of the QGP, a key measure of the degree of thermalisation, approaches the universal lower bound of 1/(4π) derived from the AdS/CFT correspondence, with experimental extractions from flow data yielding values near 0.1–0.2 (in units of ħ/k_B), close to the bound expected for strongly coupled systems. Recent numerical simulations, including 2025 studies of pre-QGP non-equilibrium states, employ classical-statistical approaches to model initial gluon cascades and bottom-up thermalisation, revealing prolonged lifetimes and enhanced particle production in far-from-equilibrium regimes before QGP onset. These advances underscore the challenge of bridging microscopic QCD dynamics and macroscopic hydrodynamic descriptions.
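The rapid cooling implied by these timescales follows from ideal Bjorken flow, in which boost-invariant longitudinal expansion of a conformal fluid gives T(τ) = T₀ (τ₀/τ)^{1/3}. The snippet below (illustrative initial conditions, not fitted experimental values) evaluates how quickly a plasma initialized well above the QCD crossover temperature cools:

```python
def bjorken_T(tau, T0=500.0, tau0=0.5):
    """Ideal Bjorken-flow temperature (MeV) at proper time tau (fm/c),
    for an assumed initial temperature T0 at proper time tau0."""
    return T0 * (tau0 / tau) ** (1.0 / 3.0)

for tau in (0.5, 1.0, 4.0, 10.0):
    print(f"tau = {tau:5.1f} fm/c  ->  T = {bjorken_T(tau):5.1f} MeV")
```

The τ^{-1/3} law follows from entropy conservation in one-dimensional boost-invariant expansion; viscous corrections slow the cooling slightly, which is one handle used to extract η/s from data.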

References
