Lindeberg's condition
from Wikipedia

In probability theory, Lindeberg's condition is a sufficient condition (and under certain conditions also a necessary condition) for the central limit theorem (CLT) to hold for a sequence of independent random variables.[1][2][3] Unlike the classical CLT, which requires that the random variables in question have finite variance and be both independent and identically distributed, Lindeberg's CLT only requires that they have finite variance, satisfy Lindeberg's condition, and be independent. It is named after the Finnish mathematician Jarl Waldemar Lindeberg.[4]

Statement

Let $(\Omega, \mathcal{F}, \mathbb{P})$ be a probability space, and $X_k : \Omega \to \mathbb{R}$, $k \in \mathbb{N}$, be independent random variables defined on that space. Assume the expected values $\mathbb{E}[X_k] = \mu_k$ and variances $\mathrm{Var}[X_k] = \sigma_k^2$ exist and are finite. Also let
$$s_n^2 := \sum_{k=1}^{n} \sigma_k^2.$$

If this sequence of independent random variables satisfies Lindeberg's condition:

$$\lim_{n \to \infty} \frac{1}{s_n^2} \sum_{k=1}^{n} \mathbb{E}\left[ (X_k - \mu_k)^2 \, \mathbf{1}_{\{|X_k - \mu_k| > \varepsilon s_n\}} \right] = 0$$

for all $\varepsilon > 0$, where $\mathbf{1}_{\{\dots\}}$ is the indicator function, then the central limit theorem holds, i.e. the random variables

$$Z_n := \frac{\sum_{k=1}^{n} (X_k - \mu_k)}{s_n}$$

converge in distribution to a standard normal random variable as $n \to \infty$.
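As a numerical sketch (not part of the original article), the following simulation illustrates the theorem for independent but non-identically distributed variables. The uniform variables and the bounded scale sequence $b_k$ are illustrative assumptions under which Lindeberg's condition holds:

```python
import numpy as np

rng = np.random.default_rng(0)

# Independent but non-identical: X_k ~ Uniform(-b_k, b_k), mean 0, variance b_k^2 / 3.
# The scales b_k stay bounded while s_n^2 grows linearly in n, so no single
# term dominates and Lindeberg's condition is satisfied.
n = 500
b = 1.0 + 1.0 / np.arange(1, n + 1)      # illustrative bounded scale sequence
s_n = np.sqrt((b**2 / 3.0).sum())        # s_n^2 = sum of the variances

# Simulate many replications of Z_n = (sum_k X_k) / s_n.
reps = 10_000
X = rng.uniform(-b, b, size=(reps, n))   # b broadcasts across the columns
Z = X.sum(axis=1) / s_n

# Z should be approximately standard normal: near-zero mean, unit spread.
print(abs(Z.mean()) < 0.05, abs(Z.std() - 1.0) < 0.05)
```

Replacing the uniform draws with any other independent, zero-mean family of bounded scale leaves the conclusion unchanged, which is exactly the freedom the Lindeberg CLT provides over the i.i.d. version.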

Lindeberg's condition is sufficient, but not in general necessary (i.e. the inverse implication does not hold in general). However, if the sequence of independent random variables in question satisfies

$$\max_{k=1,\dots,n} \frac{\sigma_k^2}{s_n^2} \to 0 \quad \text{as } n \to \infty,$$

then Lindeberg's condition is both sufficient and necessary, i.e. it holds if and only if the result of the central limit theorem holds.

Remarks

Feller's theorem

Feller's theorem can be used as an alternative method to prove that Lindeberg's condition holds.[5] Letting $S_n := \sum_{k=1}^{n} X_k$ and for simplicity $\mathbb{E}[X_k] = 0$, the theorem states

if $\forall \varepsilon > 0$, $\lim_{n \to \infty} \max_{1 \le k \le n} \mathbb{P}(|X_k| > \varepsilon s_n) = 0$, and $S_n / s_n$ converges weakly to a standard normal distribution as $n \to \infty$, then the sequence $(X_k)$ satisfies Lindeberg's condition.


This theorem can be used to disprove that the central limit theorem holds for a given sequence $(X_k)$ by contraposition: the procedure involves showing that Lindeberg's condition fails for $(X_k)$.

Interpretation

Because the Lindeberg condition implies $\max_{k=1,\dots,n} \frac{\sigma_k^2}{s_n^2} \to 0$ as $n \to \infty$, it guarantees that the contribution of any individual random variable $X_k$ ($1 \le k \le n$) to the variance $s_n^2$ is arbitrarily small, for sufficiently large values of $n$.
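This negligibility can be checked directly. The sketch below (an illustration added here, not from the article) computes $\max_{k \le n} \sigma_k^2 / s_n^2$ for a bounded variance sequence, where it vanishes, and for an exploding sequence $\sigma_k^2 = 4^k$, where the last term always dominates and the condition fails:

```python
import numpy as np

def max_variance_share(sigma2):
    """max_{k<=n} sigma_k^2 / s_n^2 for each n (uniform asymptotic negligibility)."""
    s_n2 = np.cumsum(sigma2)                     # running s_n^2
    return np.maximum.accumulate(sigma2) / s_n2  # largest single share so far

# Bounded variances: the largest single share shrinks like 1/n.
share_ok = max_variance_share(np.ones(1000))

# Exploding variances sigma_k^2 = 4^k: the final term contributes about 3/4
# of s_n^2 no matter how large n gets, so negligibility (and with it the
# Lindeberg condition) fails.
share_bad = max_variance_share(4.0 ** np.arange(500))

print(share_ok[-1])   # 0.001
print(share_bad[-1])  # ~0.75
```

The geometric case makes the interpretation concrete: the normalized sum $S_n / s_n$ is essentially its last summand, so no Gaussian limit can emerge.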

Example

Consider the following informative example which satisfies the Lindeberg condition. Let $\xi_i$ be a sequence of zero-mean, variance-1 iid random variables and $(a_i)$ a non-random sequence satisfying:

$$\frac{\max_{i \le n} |a_i|}{\|a\|_n} \to 0, \qquad \text{where } \|a\|_n^2 := \sum_{i=1}^{n} a_i^2.$$

Now, define the normalized elements of the linear combination:

$$X_{n,i} = \frac{a_i \xi_i}{\|a\|_n},$$

which satisfies the Lindeberg condition:

$$\sum_{i=1}^{n} \mathbb{E}\left[ |X_{n,i}|^2 \, \mathbf{1}_{\{|X_{n,i}| > \varepsilon\}} \right] \le \mathbb{E}\left[ |\xi_1|^2 \, \mathbf{1}_{\left\{ |\xi_1| > \varepsilon \, \|a\|_n / \max_{i \le n} |a_i| \right\}} \right],$$

but $\mathbb{E}[\xi_1^2]$ is finite, so by the dominated convergence theorem (DCT) and the condition on the $a_i$ we have that this goes to 0 for every $\varepsilon > 0$.
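A Monte Carlo sketch of this example (added here for illustration; the standard normal choice for $\xi_i$ and the equal weights $a_i = 1$ are simplifying assumptions) shows the Lindeberg sum vanishing as $n$ grows:

```python
import numpy as np

rng = np.random.default_rng(1)

def lindeberg_sum(a, eps, draws=200_000):
    """Monte Carlo estimate of sum_i E[X_{n,i}^2 1{|X_{n,i}| > eps}] for
    X_{n,i} = a_i * xi_i / ||a||_n with iid standard normal xi_i."""
    norm = np.sqrt(np.sum(a**2))               # ||a||_n
    xi = rng.standard_normal(draws)            # shared draws: the xi_i are iid
    xi2, abs_xi = xi**2, np.abs(xi)
    w2 = (a / norm) ** 2                       # weights a_i^2 / ||a||_n^2
    thresh = eps * norm / np.abs(a)            # |xi_i| must exceed this
    return sum(w * np.mean(xi2 * (abs_xi > t)) for w, t in zip(w2, thresh))

# Equal weights a_i = 1 satisfy max|a_i| / ||a||_n = 1/sqrt(n) -> 0.
vals = [lindeberg_sum(np.ones(n), eps=0.5) for n in (10, 100, 1000)]
print(vals)  # decreasing toward 0
```

The truncation threshold $\varepsilon \|a\|_n / |a_i|$ grows with $n$, so the truncated second moment it controls is squeezed to zero, mirroring the DCT step in the displayed bound.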

References

from Grokipedia
Lindeberg's condition is a fundamental criterion in probability theory that guarantees the applicability of the central limit theorem to sums of independent but not necessarily identically distributed random variables with finite variances. Formally, for a triangular array of row-wise independent, zero-mean random variables $X_{n,k}$, $k = 1, \dots, r_n$, with variances $\sigma_{n,k}^2 = \mathbb{E}[X_{n,k}^2]$ and row sum of variances $s_n^2 = \sum_{k=1}^{r_n} \sigma_{n,k}^2 \to \infty$, the condition requires that for every $\varepsilon > 0$,

$$\lim_{n \to \infty} \frac{1}{s_n^2} \sum_{k=1}^{r_n} \mathbb{E}\left[ X_{n,k}^2 \, \mathbf{1}_{\{|X_{n,k}| > \varepsilon s_n\}} \right] = 0.$$

This ensures that no single term dominates the sum, allowing the normalized sum $S_n / s_n$, where $S_n = \sum_{k=1}^{r_n} X_{n,k}$, to converge in distribution to a standard normal random variable.

Introduced by the Finnish mathematician Jarl Waldemar Lindeberg in his 1922 paper, the condition generalized earlier results by Lyapunov and marked a significant advance in understanding asymptotic normality under heterogeneous distributions. Its importance was further solidified by William Feller in 1937, who demonstrated its necessity (along with the uniform asymptotic negligibility condition) for the central limit theorem in the context of independent random variables, forming what is now known as the Lindeberg–Feller theorem. This theorem underpins much of modern statistical theory, particularly in scenarios involving heterogeneous data where variables exhibit varying scales and tails. Extensions of Lindeberg's condition have since been developed for dependent processes, infinite variances, and non-Euclidean spaces, broadening its utility in advanced probabilistic modeling.

Background Concepts

Central Limit Theorem for Identical Distributions

The classical central limit theorem (CLT) addresses the asymptotic distribution of sums of independent and identically distributed (i.i.d.) random variables, providing a foundational result in probability theory. Specifically, consider i.i.d. random variables $X_1, X_2, \dots, X_n$, each with finite mean $\mu$ and positive finite variance $\sigma^2 > 0$. The standardized sum

$$\frac{1}{\sqrt{n}\,\sigma} \sum_{i=1}^{n} (X_i - \mu)$$

converges in distribution to a standard normal random variable as $n \to \infty$.
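As a quick sanity check (an added sketch, not part of the original text), simulating the standardized sum for i.i.d. Exponential(1) variables, which have $\mu = \sigma = 1$, shows it approaching a standard normal even though the summands are heavily skewed:

```python
import numpy as np

rng = np.random.default_rng(2)

# X_i ~ Exponential(1): mu = 1, sigma = 1. The summands are skewed, so any
# approximate normality of the standardized sum is entirely due to the CLT.
n, reps = 200, 20_000
x = rng.exponential(1.0, size=(reps, n))
z = (x - 1.0).sum(axis=1) / np.sqrt(n)   # (1 / (sqrt(n) * sigma)) * sum(X_i - mu)

# Approximately standard normal: near-zero mean, unit spread, small skew.
print(abs(z.mean()) < 0.05, abs(z.std() - 1.0) < 0.05, abs((z**3).mean()) < 0.3)
```

The residual third moment of roughly $2/\sqrt{n}$ is the standard skewness decay rate for sums of i.i.d. exponentials, visible here as a small but nonzero value.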