Tridiagonal matrix

from Wikipedia

In linear algebra, a tridiagonal matrix is a band matrix that has nonzero elements only on the main diagonal, the subdiagonal (the first diagonal below the main diagonal), and the superdiagonal (the first diagonal above the main diagonal). For example, the following matrix is tridiagonal:

\begin{pmatrix} 1 & 4 & 0 & 0 \\ 3 & 4 & 1 & 0 \\ 0 & 2 & 3 & 4 \\ 0 & 0 & 1 & 3 \end{pmatrix}

The determinant of a tridiagonal matrix is given by the continuant of its elements.[1]

An orthogonal transformation of a symmetric (or Hermitian) matrix to tridiagonal form can be done with the Lanczos algorithm.

Properties

A tridiagonal matrix is a matrix that is both an upper and a lower Hessenberg matrix.[2] In particular, a tridiagonal matrix is a direct sum of p 1-by-1 and q 2-by-2 matrices such that p + 2q = n, the dimension of the tridiagonal matrix. Although a general tridiagonal matrix is not necessarily symmetric or Hermitian, many of those that arise when solving linear algebra problems have one of these properties. Furthermore, if a real tridiagonal matrix A satisfies a_{k,k+1} a_{k+1,k} > 0 for all k, so that the signs of its entries are symmetric, then it is similar to a Hermitian matrix by a diagonal change-of-basis matrix; hence its eigenvalues are real. If the strict inequality is relaxed to a_{k,k+1} a_{k+1,k} ≥ 0, then by continuity the eigenvalues are still guaranteed to be real, but the matrix need no longer be similar to a Hermitian matrix.[3]

The set of all n × n tridiagonal matrices forms a (3n − 2)-dimensional vector space.

Many linear algebra algorithms require significantly less computational effort when applied to diagonal matrices, and this improvement often carries over to tridiagonal matrices as well.

Determinant

The determinant of a tridiagonal matrix A of order n can be computed from a three-term recurrence relation.[4] Write the diagonal entries as a_1, …, a_n, the superdiagonal entries as b_1, …, b_{n−1}, and the subdiagonal entries as c_1, …, c_{n−1}. Write f_1 = |a_1| = a_1 (i.e., f_1 is the determinant of the 1-by-1 matrix consisting only of a_1), and let f_n = det(A).

The sequence (f_i) is called the continuant and satisfies the recurrence relation

f_n = a_n f_{n-1} - c_{n-1} b_{n-1} f_{n-2}

with initial values f_0 = 1 and f_{−1} = 0. The cost of computing the determinant of a tridiagonal matrix using this formula is linear in n, while the cost is cubic for a general matrix.
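The recurrence translates directly into an O(n) loop; a minimal sketch, using the a/b/c naming for the diagonal, superdiagonal, and subdiagonal (the function name is illustrative):

```python
def tridiag_det(a, b, c):
    """Determinant of a tridiagonal matrix with main diagonal a (length n),
    superdiagonal b and subdiagonal c (length n - 1), via the continuant
    recurrence f_k = a_k f_{k-1} - c_{k-1} b_{k-1} f_{k-2}."""
    n = len(a)
    f_prev2, f_prev1 = 0.0, 1.0   # f_{-1} = 0, f_0 = 1
    for k in range(n):
        # first step has no two-back term (f_{-1} = 0)
        f = a[k] * f_prev1 - (c[k-1] * b[k-1] * f_prev2 if k > 0 else 0.0)
        f_prev2, f_prev1 = f_prev1, f
    return f_prev1
```

Only the last two continuant values are kept, so the computation uses O(1) extra storage.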

Inversion

The inverse of a non-singular tridiagonal matrix T

T = \begin{pmatrix} a_1 & b_1 & & & \\ c_1 & a_2 & b_2 & & \\ & c_2 & \ddots & \ddots & \\ & & \ddots & \ddots & b_{n-1} \\ & & & c_{n-1} & a_n \end{pmatrix}

is given by

(T^{-1})_{ij} = \begin{cases} (-1)^{i+j} b_i \cdots b_{j-1} \,\theta_{i-1} \phi_{j+1} / \theta_n & \text{if } i \le j \\ (-1)^{i+j} c_j \cdots c_{i-1} \,\theta_{j-1} \phi_{i+1} / \theta_n & \text{if } i > j \end{cases}

where the θ_i satisfy the recurrence relation

\theta_i = a_i \theta_{i-1} - b_{i-1} c_{i-1} \theta_{i-2}, \qquad i = 2, 3, \dots, n,

with initial conditions θ_0 = 1, θ_1 = a_1, and the ϕ_i satisfy

\phi_i = a_i \phi_{i+1} - b_i c_i \phi_{i+2}, \qquad i = n-1, \dots, 1,

with initial conditions ϕ_{n+1} = 1 and ϕ_n = a_n.[5][6]

Closed form solutions can be computed for special cases such as symmetric matrices with all diagonal and off-diagonal elements equal[7] or Toeplitz matrices[8] and for the general case as well.[9][10]
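The θ/ϕ recurrences described above can be turned into a direct O(n²) construction of the inverse; a minimal sketch, assuming θ_n = det T ≠ 0 (the function name is illustrative):

```python
from math import prod

def tridiag_inverse(a, b, c):
    """Inverse of a nonsingular tridiagonal matrix with diagonal a,
    superdiagonal b, subdiagonal c, via the theta/phi recurrences."""
    n = len(a)
    theta = [1.0, a[0]] + [0.0] * (n - 1)      # theta[0..n], theta_0 = 1
    for i in range(2, n + 1):
        theta[i] = a[i-1] * theta[i-1] - b[i-2] * c[i-2] * theta[i-2]
    phi = [0.0] * (n + 2)                      # phi[1..n+1], phi_{n+1} = 1
    phi[n + 1], phi[n] = 1.0, a[n-1]
    for i in range(n - 1, 0, -1):
        phi[i] = a[i-1] * phi[i+1] - b[i-1] * c[i-1] * phi[i+2]
    inv = [[0.0] * n for _ in range(n)]
    for i in range(1, n + 1):
        for j in range(1, n + 1):
            if i <= j:   # upper triangle: product of superdiagonal entries
                inv[i-1][j-1] = ((-1) ** (i + j) * prod(b[i-1:j-1])
                                 * theta[i-1] * phi[j+1] / theta[n])
            else:        # lower triangle: product of subdiagonal entries
                inv[i-1][j-1] = ((-1) ** (i + j) * prod(c[j-1:i-1])
                                 * theta[j-1] * phi[i+1] / theta[n])
    return inv
```

Note that θ_n equals det T, so a zero value signals a singular matrix.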

In general, the inverse of a tridiagonal matrix is a semiseparable matrix and vice versa.[11] The inverse of a symmetric tridiagonal matrix can be written as a single-pair matrix (a.k.a. generator-representable semiseparable matrix), i.e., a matrix whose (i, j) entry has the form u_{\min(i,j)} v_{\max(i,j)} for suitable vectors u and v.[12][13]

Solution of linear system

A system of equations Ax = b for unknown x can be solved by an efficient form of Gaussian elimination when A is tridiagonal, called the tridiagonal matrix algorithm (Thomas algorithm), requiring O(n) operations.[14]
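The tridiagonal matrix algorithm amounts to one forward-elimination sweep and one back-substitution sweep; a minimal sketch, assuming no zero pivots arise (guaranteed, e.g., for diagonally dominant A):

```python
def thomas_solve(a, b, c, d):
    """Solve T x = d for tridiagonal T with diagonal a (length n),
    superdiagonal b and subdiagonal c (length n - 1), in O(n) time."""
    n = len(a)
    cp = [0.0] * n   # modified superdiagonal
    dp = [0.0] * n   # modified right-hand side
    cp[0] = b[0] / a[0] if n > 1 else 0.0
    dp[0] = d[0] / a[0]
    for i in range(1, n):                     # forward elimination
        m = a[i] - c[i-1] * cp[i-1]           # pivot (assumed nonzero)
        cp[i] = b[i] / m if i < n - 1 else 0.0
        dp[i] = (d[i] - c[i-1] * dp[i-1]) / m
    x = [0.0] * n
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):            # back substitution
        x[i] = dp[i] - cp[i] * x[i+1]
    return x
```

Unlike general Gaussian elimination, no fill-in occurs outside the three diagonals, which is where the O(n) cost comes from.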

Eigenvalues

When a tridiagonal matrix is also Toeplitz, with constant diagonal entries a and constant off-diagonal entries b (superdiagonal) and c (subdiagonal), there is a simple closed-form solution for its eigenvalues, namely:[15][16]

\lambda_k = a + 2\sqrt{bc} \cos\left(\frac{k\pi}{n+1}\right), \qquad k = 1, \dots, n.
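As a quick check, the closed form λ_k = a + 2√(bc) cos(kπ/(n+1)), k = 1, …, n, can be evaluated directly; a sketch assuming bc ≥ 0 so the result is real:

```python
from math import cos, pi, sqrt

def toeplitz_tridiag_eigs(a, b, c, n):
    """Eigenvalues of the n x n tridiagonal Toeplitz matrix with constant
    diagonal a, superdiagonal b, subdiagonal c (assumes b*c >= 0)."""
    s = sqrt(b * c)
    return [a + 2 * s * cos(k * pi / (n + 1)) for k in range(1, n + 1)]
```

For example, the 2×2 matrix with diagonal 2 and off-diagonals 1 has eigenvalues 3 and 1, matching 2 + 2cos(π/3) and 2 + 2cos(2π/3).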

A real symmetric tridiagonal matrix has real eigenvalues, and all the eigenvalues are distinct (simple) if all off-diagonal elements are nonzero.[17] Numerous methods exist for the numerical computation of the eigenvalues of a real symmetric tridiagonal matrix to arbitrary finite precision, typically requiring O(n²) operations for a matrix of size n × n, although fast algorithms exist which (without parallel computation) require only O(n log n).[18]

An unreduced symmetric tridiagonal matrix is one whose off-diagonal entries are all nonzero; its eigenvalues are distinct, and its eigenvectors, which are unique up to a scale factor, are mutually orthogonal.[19]

Similarity to symmetric tridiagonal matrix

For unsymmetric or nonsymmetric tridiagonal matrices one can compute the eigendecomposition using a similarity transformation. Given a real tridiagonal, nonsymmetric matrix

T = \begin{pmatrix} a_1 & b_1 & & & \\ c_1 & a_2 & b_2 & & \\ & c_2 & \ddots & \ddots & \\ & & \ddots & \ddots & b_{n-1} \\ & & & c_{n-1} & a_n \end{pmatrix}

where b_i ≠ c_i, assume that each product of off-diagonal entries is strictly positive, b_i c_i > 0, and define a transformation matrix D by[20]

D := \operatorname{diag}(\delta_1, \dots, \delta_n), \qquad \delta_i := \begin{cases} 1 & i = 1 \\ \sqrt{\dfrac{c_1 \cdots c_{i-1}}{b_1 \cdots b_{i-1}}} & i = 2, \dots, n. \end{cases}

The similarity transformation D^{-1} T D yields a symmetric tridiagonal matrix J:[21][20]

J := D^{-1} T D = \begin{pmatrix} a_1 & \operatorname{sgn}(b_1)\sqrt{b_1 c_1} & & \\ \operatorname{sgn}(b_1)\sqrt{b_1 c_1} & a_2 & \ddots & \\ & \ddots & \ddots & \operatorname{sgn}(b_{n-1})\sqrt{b_{n-1} c_{n-1}} \\ & & \operatorname{sgn}(b_{n-1})\sqrt{b_{n-1} c_{n-1}} & a_n \end{pmatrix}.

Note that T and J have the same eigenvalues.
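The symmetrized matrix can be formed without ever building D explicitly, since its diagonal equals that of T and its off-diagonal entries are sgn(b_i)√(b_i c_i); a minimal sketch (function name is illustrative):

```python
from math import sqrt, copysign

def symmetrize_tridiag(a, b, c):
    """Given real tridiagonal T with diagonal a, superdiagonal b,
    subdiagonal c and b_i * c_i > 0 for all i, return the off-diagonal
    of the similar symmetric tridiagonal J: sgn(b_i) * sqrt(b_i * c_i).
    (J's diagonal is a itself, unchanged by the transformation.)"""
    return [copysign(sqrt(b[i] * c[i]), b[i]) for i in range(len(b))]
```

The eigenvalues of J (computable by symmetric tridiagonal methods) are then exactly those of the original nonsymmetric T.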

Computer programming

A transformation that reduces a general matrix to Hessenberg form will reduce a Hermitian matrix to tridiagonal form. So, many eigenvalue algorithms, when applied to a Hermitian matrix, reduce the input Hermitian matrix to (symmetric real) tridiagonal form as a first step.[22]

A tridiagonal matrix can also be stored more efficiently than a general matrix by using a special storage scheme. For instance, the LAPACK Fortran package stores an unsymmetric tridiagonal matrix of order n in three one-dimensional arrays, one of length n containing the diagonal elements, and two of length n − 1 containing the subdiagonal and superdiagonal elements.
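The same three-array layout supports O(n) matrix operations directly; for example, a matrix–vector product sketch (illustrative only, not LAPACK's actual API):

```python
def tridiag_matvec(a, b, c, x):
    """y = T x using only the three diagonal arrays (O(n) work and storage):
    a: diagonal (length n), b: superdiagonal, c: subdiagonal (length n - 1)."""
    n = len(a)
    y = [a[i] * x[i] for i in range(n)]   # diagonal contribution
    for i in range(n - 1):
        y[i] += b[i] * x[i + 1]           # superdiagonal contribution
        y[i + 1] += c[i] * x[i]           # subdiagonal contribution
    return y
```

Storing 3n − 2 numbers instead of n² is what makes tridiagonal computations on very large n practical.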

Applications

The discretization in space of the one-dimensional diffusion or heat equation

\frac{\partial u(t, x)}{\partial t} = \alpha \frac{\partial^2 u(t, x)}{\partial x^2}

using second-order central finite differences results in

\frac{\partial u_i(t)}{\partial t} = \frac{\alpha}{\Delta x^2} \left( u_{i+1}(t) - 2 u_i(t) + u_{i-1}(t) \right)

with discretization constant Δx. The matrix is tridiagonal with diagonal entries −2α/Δx² and off-diagonal entries α/Δx². Note: no boundary conditions are used here.
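The resulting coefficient matrix can be assembled directly from the stencil; a minimal sketch (dense for clarity, boundary conditions omitted as in the text; the function name is illustrative):

```python
def heat_matrix(n, alpha, dx):
    """Dense n x n second-order central-difference matrix for u_t = alpha * u_xx:
    -2*alpha/dx^2 on the diagonal, alpha/dx^2 on the sub- and superdiagonals.
    No boundary conditions are applied."""
    k = alpha / dx**2
    A = [[0.0] * n for _ in range(n)]
    for i in range(n):
        A[i][i] = -2 * k
        if i > 0:
            A[i][i - 1] = k
        if i < n - 1:
            A[i][i + 1] = k
    return A
```

In practice one would store only the three diagonals, as described in the Computer programming section above.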

from Grokipedia
A tridiagonal matrix is a square matrix in which all entries are zero except those on the main diagonal, the subdiagonal (immediately below the main diagonal), and the superdiagonal (immediately above the main diagonal). This structure makes it a special case of a band matrix with bandwidth 1, allowing compact storage using only O(n) space for an n × n matrix, compared to O(n²) for a dense matrix.

Tridiagonal matrices possess several advantageous properties that facilitate efficient numerical computation. For instance, systems of linear equations Ax = b with a tridiagonal coefficient matrix can be solved in O(n) time using specialized algorithms like the Thomas algorithm, a variant of Gaussian elimination tailored to this sparse structure that avoids fill-in during factorization. Additionally, for symmetric tridiagonal matrices, eigenvalues can be computed using stable O(n²) methods, which is particularly useful in spectral analysis. Determinants of tridiagonal matrices can also be evaluated recursively in O(n) time via a continued-fraction-like formula, leveraging the matrix's banded form.

These matrices arise naturally in numerous applications across mathematics, science, and engineering. In numerical analysis, they model finite difference approximations to second-order boundary value problems for ordinary and partial differential equations, such as the Poisson equation in one dimension. Tridiagonal forms also appear in the three-term recurrence relations of orthogonal polynomials, in Markov chains modeling sequential processes, and in Tikhonov regularization for ill-posed inverse problems, where their Toeplitz variants (constant diagonals) enable closed-form solutions and fast computations. Furthermore, in lattice models such as particle deposition, tridiagonal matrices capture nearest-neighbor interactions efficiently.

Definition and Notation

Formal Definition

A tridiagonal matrix is a square matrix of order n × n in which all entries are zero except those on the main diagonal, the subdiagonal (immediately below the main diagonal), and the superdiagonal (immediately above the main diagonal). This structure confines the nonzero elements to a narrow band around the main diagonal, distinguishing it from denser matrices. In terms of bandwidth, a tridiagonal matrix has a lower bandwidth of 1 and an upper bandwidth of 1, meaning the nonzero entries extend at most one position below and above the main diagonal. More formally, an n × n matrix A = (a_{i,j}) is tridiagonal if a_{i,j} = 0 whenever |i − j| > 1. This contrasts with a diagonal matrix, which has bandwidth (0, 0) and only nonzero main-diagonal entries, and with bidiagonal matrices, which have bandwidth (1, 0) or (0, 1), including only one off-diagonal. Broader banded matrices, such as pentadiagonal ones, allow nonzeros up to two positions away, giving bandwidth (2, 2). Under certain conditions, such as the matrix being irreducible and diagonally dominant (the absolute value of each diagonal entry is at least the sum of the absolute values of the off-diagonal entries in its row), tridiagonal matrices exhibit desirable properties like nonsingularity. A tridiagonal matrix is irreducible if all subdiagonal and superdiagonal entries are nonzero, which ensures that the associated directed graph is strongly connected. These properties make tridiagonal matrices particularly valuable for efficient storage and computation in numerical linear algebra.
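The defining condition a_{i,j} = 0 whenever |i − j| > 1 translates into a one-line check; a minimal sketch for a matrix given as a list of rows:

```python
def is_tridiagonal(A):
    """Check the defining condition: A[i][j] == 0 whenever |i - j| > 1."""
    n = len(A)
    return all(A[i][j] == 0
               for i in range(n) for j in range(n)
               if abs(i - j) > 1)
```

Diagonal and bidiagonal matrices also pass this test, since the definition only forbids nonzeros outside the band, without requiring the band itself to be full.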

Matrix Representation and Examples

A tridiagonal matrix T of order n is typically written using vectors for its three nonzero diagonals: the main diagonal consists of elements b_i for i = 1, …, n, the subdiagonal (immediately below the main diagonal) consists of elements a_i for i = 2, …, n, and the superdiagonal (immediately above the main diagonal) consists of elements c_i for i = 1, …, n − 1. This notation compactly represents the sparse structure in which all other entries are zero, as seen in the general form:

T = \begin{pmatrix} b_1 & c_1 & 0 & \cdots & 0 \\ a_2 & b_2 & c_2 & \cdots & 0 \\ 0 & a_3 & b_3 & \ddots & \vdots \\ \vdots & \vdots & \ddots & \ddots & c_{n-1} \\ 0 & 0 & \cdots & a_n & b_n \end{pmatrix}.