Exchange matrix
From Wikipedia
In mathematics, especially linear algebra, the exchange matrix (also called the reversal matrix, backward identity, or standard involutory permutation) is a special case of a permutation matrix, in which the 1 elements reside on the antidiagonal and all other elements are zero. In other words, it is a 'row-reversed' or 'column-reversed' version of the identity matrix.[1]
Definition
If $ J $ is an $ n \times n $ exchange matrix, then the elements of $ J $ are
$ J_{i,j} = \begin{cases} 1, & j = n + 1 - i, \\ 0, & \text{otherwise}. \end{cases} $
Properties
- Premultiplying a matrix by an exchange matrix flips vertically the positions of the former's rows, i.e., $ (J_n A)_{i,j} = A_{n+1-i,\,j} $.
- Postmultiplying a matrix by an exchange matrix flips horizontally the positions of the former's columns, i.e., $ (A J_n)_{i,j} = A_{i,\,n+1-j} $.
- Exchange matrices are symmetric; that is, $ J_n^{\mathsf T} = J_n $.
- For any integer $ k $: $ J_n^k = I_n $ if $ k $ is even and $ J_n^k = J_n $ if $ k $ is odd. In particular, $ J_n $ is an involutory matrix; that is, $ J_n^2 = I_n $ and $ J_n^{-1} = J_n $.
- The trace of $ J_n $ is 1 if $ n $ is odd and 0 if $ n $ is even. In other words, $ \operatorname{tr}(J_n) = n \bmod 2 $.
- The determinant of $ J_n $ is $ \det(J_n) = (-1)^{\lfloor n/2 \rfloor} $. As a function of $ n $, it has period 4, giving 1, 1, −1, −1 when $ n $ is congruent modulo 4 to 0, 1, 2, and 3 respectively.
- The characteristic polynomial of $ J_n $ is $ \det(\lambda I_n - J_n) = (\lambda - 1)^{\lceil n/2 \rceil} (\lambda + 1)^{\lfloor n/2 \rfloor} $; its eigenvalues are 1 (with multiplicity $ \lceil n/2 \rceil $) and −1 (with multiplicity $ \lfloor n/2 \rfloor $).
- The adjugate matrix of $ J_n $ is $ \operatorname{adj}(J_n) = \operatorname{sgn}(\pi_n)\, J_n $ (where $ \operatorname{sgn}(\pi_k) $ is the sign of the reversal permutation $ \pi_k $ of $ k $ elements, which equals $ (-1)^{\lfloor k/2 \rfloor} $).
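The properties above are easy to verify numerically. A minimal sketch using NumPy (the helper name `exchange` is ad hoc, not from the source):

```python
import numpy as np

def exchange(n):
    """n-by-n exchange matrix: 1s on the antidiagonal, 0s elsewhere."""
    return np.fliplr(np.eye(n, dtype=int))

for n in range(1, 6):
    J = exchange(n)
    I = np.eye(n, dtype=int)
    assert np.array_equal(J.T, J)                       # symmetric
    assert np.array_equal(J @ J, I)                     # involutory: J^2 = I
    assert np.trace(J) == n % 2                         # trace is n mod 2
    assert round(np.linalg.det(J)) == (-1) ** (n // 2)  # det = (-1)^floor(n/2)
    for k in range(1, 6):                               # powers cycle with period 2
        expected = I if k % 2 == 0 else J
        assert np.array_equal(np.linalg.matrix_power(J, k), expected)

# Pre-/postmultiplication reverses rows/columns:
A = np.arange(9).reshape(3, 3)
J3 = exchange(3)
assert np.array_equal(J3 @ A, A[::-1, :])  # rows flipped vertically
assert np.array_equal(A @ J3, A[:, ::-1])  # columns flipped horizontally
```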
Relationships
- An exchange matrix is the simplest anti-diagonal matrix.
- Any matrix $ A $ satisfying the condition $ AJ = JA $ is said to be centrosymmetric.
- Any matrix $ A $ satisfying the condition $ AJ = JA^{\mathsf T} $ is said to be persymmetric.
- Symmetric matrices $ A $ that satisfy the condition $ AJ = JA $ are called bisymmetric matrices. Bisymmetric matrices are both centrosymmetric and persymmetric.
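These conditions can be illustrated with small concrete matrices (a NumPy sketch; the particular Toeplitz examples are chosen here for illustration, not taken from the source):

```python
import numpy as np

J = np.fliplr(np.eye(3, dtype=int))  # 3x3 exchange matrix

# A Toeplitz matrix (constant diagonals) is persymmetric: A J = J A^T.
T = np.array([[1, 2, 3],
              [4, 1, 2],
              [5, 4, 1]])
assert np.array_equal(T @ J, J @ T.T)

# A symmetric Toeplitz matrix is bisymmetric: symmetric and A J = J A.
B = np.array([[1, 2, 3],
              [2, 1, 2],
              [3, 2, 1]])
assert np.array_equal(B, B.T)
assert np.array_equal(B @ J, J @ B)
```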
See also
- Pauli matrices (the first Pauli matrix is a 2 × 2 exchange matrix)
References
- Horn, Roger A.; Johnson, Charles R. (2012), "§0.9.5.1 n-by-n reversal matrix", Matrix Analysis (2nd ed.), Cambridge University Press, p. 33, ISBN 978-1-139-78888-5.
Exchange matrix
From Grokipedia
The exchange matrix reverses the order of the entries in any vector it multiplies (on the left or right, due to its symmetry), effectively permuting the components from first to last.[2]
The exchange matrix possesses several key properties that make it fundamental in matrix analysis. It is symmetric ($ J^{\mathsf T} = J $), orthogonal ($ J^{\mathsf T} J = I $), and an involution ($ J^2 = I $), meaning it is its own inverse and transpose.[2][1] These attributes ensure that multiplying by $ J $ is a reversible operation without scaling, preserving norms and inner products in vector spaces. Additionally, a matrix $ A $ is centrosymmetric if it is invariant under conjugation by $ J $, i.e., $ JAJ = A $ (equivalently, $ AJ = JA $), which implies symmetry with respect to the center of the matrix.[2]
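As a quick numerical check of this orthogonality (a NumPy sketch, not from the source), multiplication by $ J $ leaves inner products and Euclidean norms unchanged:

```python
import numpy as np

rng = np.random.default_rng(0)
J = np.fliplr(np.eye(4))  # 4x4 exchange matrix

x = rng.standard_normal(4)
y = rng.standard_normal(4)

# Orthogonal, so inner products and norms are preserved.
assert np.isclose((J @ x) @ (J @ y), x @ y)
assert np.isclose(np.linalg.norm(J @ x), np.linalg.norm(x))

# Involution: J is its own inverse and its own transpose.
assert np.array_equal(J @ J, np.eye(4))
assert np.array_equal(J.T, J)
```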
Exchange matrices play a crucial role in various applications, including the study of symmetric and persymmetric matrices, where conjugation with $ J $ transforms or symmetrizes structures like Hankel or Toeplitz matrices. They also appear in numerical algorithms for Gaussian elimination, eigenvalue computations, and the analysis of structured matrices in signal processing and control theory, facilitating efficient reversals and permutations without full recomputation.[1]
Definition and Examples
Formal Definition
The $ n \times n $ exchange matrix $ J_n $, also known as the reversal matrix, is a specific permutation matrix whose entries are given by $ J_{i,j} = \delta_{i,\,n+1-j} $, where $ \delta $ denotes the Kronecker delta; this configuration places 1s exclusively along the anti-diagonal and 0s elsewhere.[3][4] Equivalently, $ J_n $ is the identity matrix with its rows (or columns) reversed in order.[2] As a permutation matrix, $ J_n $ represents the reversal permutation $ \sigma: \{1, \dots, n\} \to \{1, \dots, n\} $ defined by $ \sigma(k) = n+1-k $, which acts on the standard basis by mapping the vector $ e_k $ to $ e_{n+1-k} $.[3] When applied to an arbitrary matrix $ A $, premultiplication by $ J_n $ reverses the order of the rows of $ A $, while postmultiplication $ A J_n $ reverses the order of the columns of $ A $.[5] Furthermore, the conjugation (similarity transformation) $ J_n A J_n $ reverses both the rows and columns of $ A $, producing a matrix whose entries are given by $ (J_n A J_n)_{i,j} = A_{n+1-i,\,n+1-j} $. This corresponds to a 180-degree rotation of the entries of $ A $ (reflection across the center), which is distinct from the transpose $ A^{\mathsf T} $, a reflection across the main diagonal. This has a direct interpretation in terms of change of basis for linear operators. If $ M $ is the matrix representation of a linear operator $ T $ with respect to the ordered basis $ \mathcal{B} = \{v_1, v_2, \dots, v_n\} $, then in the reversed basis $ \mathcal{B}' = \{v_n, v_{n-1}, \dots, v_1\} $, the matrix representation $ M' $ of $ T $ satisfies
$ M' = J_n^{-1} M J_n, $
or equivalently, $ M' = J_n M J_n $, since $ J_n^{-1} = J_n $. Thus, reversing the basis order conjugates the operator's matrix by the exchange matrix, rotating its entries 180 degrees rather than transposing them.
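The 180-degree-rotation reading of $ J_n A J_n $ can be sketched numerically (NumPy; `np.rot90(A, 2)` rotates an array by 180 degrees):

```python
import numpy as np

J = np.fliplr(np.eye(3, dtype=int))  # 3x3 exchange matrix
A = np.arange(1, 10).reshape(3, 3)

# Conjugation by J reverses both rows and columns: a 180-degree rotation.
assert np.array_equal(J @ A @ J, np.rot90(A, 2))

# It is not the transpose (reflection across the main diagonal).
assert not np.array_equal(J @ A @ J, A.T)
```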
Examples in Low Dimensions
In low dimensions, the exchange matrix provides concrete illustrations of its role as a reversal operator on basis vectors. For dimension $ n=2 $, the exchange matrix $ J_2 $ is given by
$ J_2 = \begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix}, $
which swaps the standard basis vectors $ e_1 = (1, 0)^T $ and $ e_2 = (0, 1)^T $, as $ J_2 e_1 = e_2 $ and $ J_2 e_2 = e_1 $.[6] This structure places 1s on the anti-diagonal, reflecting the general form where entries are 1 only when the row index plus column index equals $ n+1 $.[3]
For dimension $ n=3 $, the exchange matrix $ J_3 $ takes the form
$ J_3 = \begin{pmatrix} 0 & 0 & 1 \\ 0 & 1 & 0 \\ 1 & 0 & 0 \end{pmatrix}, $
reversing the order of the standard basis vectors such that $ J_3 e_1 = e_3 $, $ J_3 e_2 = e_2 $, and $ J_3 e_3 = e_1 $.[3] This anti-diagonal pattern emerges consistently, positioning 1s to permute coordinates in reverse. To verify its reversal effect, consider the action on a sample vector $ v = (1, 2, 3)^T $: multiplying yields $ J_3 v = (3, 2, 1)^T $, directly exchanging the first and third components while leaving the middle unchanged.[3]
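A short NumPy sketch of these low-dimensional actions (constructing $ J_n $ via `np.fliplr`, which reverses the identity's columns):

```python
import numpy as np

J2 = np.fliplr(np.eye(2, dtype=int))  # [[0, 1], [1, 0]]
J3 = np.fliplr(np.eye(3, dtype=int))

# J2 swaps the two standard basis vectors.
e1, e2 = np.array([1, 0]), np.array([0, 1])
assert np.array_equal(J2 @ e1, e2)
assert np.array_equal(J2 @ e2, e1)

# J3 reverses the components of the sample vector v = (1, 2, 3)^T.
v = np.array([1, 2, 3])
assert np.array_equal(J3 @ v, np.array([3, 2, 1]))
```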
Algebraic Properties
Symmetry and Involutory Nature
The exchange matrix $ J_n $, also known as the reversal matrix, is symmetric, satisfying $ J_n^{\mathsf T} = J_n $. This property arises from its structure, where the entries are 1 on the anti-diagonal and 0 elsewhere, such that $ (J_n)_{i,j} = \delta_{i,\,n+1-j} $. Transposing the matrix reflects it across the main diagonal, but due to the anti-diagonal placement of the 1s, the result remains unchanged, preserving the original form.[2] Furthermore, $ J_n $ is involutory, meaning $ J_n^2 = I_n $, where $ I_n $ is the $ n \times n $ identity matrix. This follows from the reversal permutation it induces: multiplying a vector by $ J_n $ reverses its components, and applying the reversal again restores the original order. To see this algebraically, the $ (i,j) $-th entry of $ J_n^2 $ is $ \sum_{k=1}^n (J_n)_{i,k} (J_n)_{k,j} = \sum_{k=1}^n \delta_{i,\,n+1-k}\, \delta_{k,\,n+1-j} $, which simplifies to $ \delta_{i,j} $: the only term that can be nonzero is $ k = n+1-j $, and it equals 1 exactly when $ i = n+1-k = j $. This confirms that double reversal yields the identity.[2] As a symmetric involutory matrix, $ J_n $ is orthogonal, satisfying $ J_n J_n^{\mathsf T} = I_n $. Substituting the symmetry gives $ J_n J_n = I_n $, which aligns directly with the involutory property, implying that $ J_n $ preserves the Euclidean norm of vectors under multiplication. This orthogonality underscores its role as a permutation matrix corresponding to the reversal operation.[2]
Powers and Inverse
The exchange matrix $ J_n $, being the permutation matrix associated with the reversal permutation $ \sigma(i) = n+1-i $, satisfies $ J_n^2 = I_n $, where $ I_n $ is the $ n \times n $ identity matrix.[7] This relation holds because composing the reversal permutation with itself yields the identity permutation, as reversing the order twice restores the original sequence; explicitly, the $ (i,j) $-entry of $ J_n^2 $ is 1 if and only if the first factor sends $ j $ to $ n+1-j $ and the second sends $ n+1-j $ back to $ n+1-(n+1-j) = j = i $, resulting in the identity.[7] Consequently, the powers of $ J_n $ exhibit a simple periodicity: $ J_n^k = I_n $ for even positive integers $ k $, and $ J_n^k = J_n $ for odd positive integers $ k $. This order-2 cycling distinguishes the exchange matrix from permutation matrices corresponding to higher-order permutations, such as cycles of length greater than 2, whose powers require more steps to return to the identity.[7] The self-inverse property follows directly from the squared identity: $ J_n^{-1} = J_n $, confirming that $ J_n $ is an involution in the group of invertible matrices. For verification in general dimension $ n $, one may compute the product $ J_n J_n $ entrywise, noting that the anti-diagonal structure ensures each standard basis vector $ e_i $ is mapped to $ e_{n+1-i} $ and then back to $ e_i $.[7]
Spectral Properties
Eigenvalues and Eigenvectors
The exchange matrix $ J_n \in \mathbb{R}^{n \times n} $, defined as the permutation matrix corresponding to the reversal permutation $ \sigma(i) = n+1-i $, possesses eigenvalues solely of $ +1 $ and $ -1 $. The algebraic multiplicity of $ +1 $ is $ \lceil n/2 \rceil $, while that of $ -1 $ is $ \lfloor n/2 \rfloor $. These values arise from the cycle decomposition of the reversal permutation, which consists of $ \lfloor n/2 \rfloor $ disjoint 2-cycles (each pairing indices $ i $ and $ n+1-i $ for $ i = 1, \dots, \lfloor n/2 \rfloor $) and, when $ n $ is odd, one additional 1-cycle at the central index $ (n+1)/2 $. For a permutation matrix, the eigenvalues are the roots of unity matching the lengths of its cycles: a 2-cycle yields eigenvalues $ +1 $ and $ -1 $, while a 1-cycle yields $ +1 $. Thus, the $ \lfloor n/2 \rfloor $ 2-cycles contribute $ \lfloor n/2 \rfloor $ instances each of $ +1 $ and $ -1 $, augmented by an extra $ +1 $ for odd $ n $.[8]
The corresponding eigenspaces reflect the symmetry properties induced by the reversal action. The eigenspace for eigenvalue $ +1 $ comprises all palindromic vectors $ \mathbf{v} $ satisfying $ v_i = v_{n+1-i} $ for $ i = 1, \dots, n $, forming a subspace of dimension $ \lceil n/2 \rceil $ (spanned by basis vectors where the first $ \lceil n/2 \rceil $ components are chosen freely and the remainder are mirrored). The eigenspace for eigenvalue $ -1 $ comprises all skew-palindromic (or anti-palindromic) vectors satisfying $ v_i = -v_{n+1-i} $ for $ i = 1, \dots, n $ (with the central component zero if $ n $ is odd), forming a subspace of dimension $ \lfloor n/2 \rfloor $. These characterizations follow from the fact that $ J_n $ is symmetric, so $ \mathbb{R}^n $ decomposes orthogonally into subspaces that are symmetric and skew-symmetric with respect to reversal.
For illustration with $ n=3 $, the exchange matrix is
$ J_3 = \begin{pmatrix} 0 & 0 & 1 \\ 0 & 1 & 0 \\ 1 & 0 & 0 \end{pmatrix}. $
To find the eigenvectors, solve $ (J_3 - \lambda I_3) \mathbf{v} = \mathbf{0} $ for each eigenvalue. For $ \lambda = -1 $, the equation $ J_3 \mathbf{v} = -\mathbf{v} $ yields $ v_3 = -v_1 $, $ v_2 = -v_2 $ (implying $ v_2 = 0 $), and $ v_1 = -v_3 $, so a basis vector is $ \begin{pmatrix} 1 \\ 0 \\ -1 \end{pmatrix} $ (up to scaling); direct verification confirms $ J_3 \begin{pmatrix} 1 \\ 0 \\ -1 \end{pmatrix} = \begin{pmatrix} -1 \\ 0 \\ 1 \end{pmatrix} = -\begin{pmatrix} 1 \\ 0 \\ -1 \end{pmatrix} $. For $ \lambda = +1 $ (multiplicity 2), the equation $ J_3 \mathbf{v} = \mathbf{v} $ yields $ v_3 = v_1 $ and $ v_1 = v_3 $, with $ v_2 $ free, so the eigenspace is all vectors of the form $ \begin{pmatrix} a \\ b \\ a \end{pmatrix} $ for scalars $ a, b $; basis vectors are $ \begin{pmatrix} 1 \\ 0 \\ 1 \end{pmatrix} $ and $ \begin{pmatrix} 0 \\ 1 \\ 0 \end{pmatrix} $. Any palindromic vector, such as $ \begin{pmatrix} 1 \\ 2 \\ 1 \end{pmatrix} $, lies in this eigenspace, satisfying $ J_3 \begin{pmatrix} 1 \\ 2 \\ 1 \end{pmatrix} = \begin{pmatrix} 1 \\ 2 \\ 1 \end{pmatrix} $.
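These spectral facts can be checked with NumPy's symmetric eigensolver (a sketch; `np.linalg.eigh` applies because $ J_n $ is symmetric, and returns eigenvalues in ascending order):

```python
import numpy as np

n = 3
J = np.fliplr(np.eye(n))  # 3x3 exchange matrix

w, V = np.linalg.eigh(J)
# Multiplicities: floor(n/2) copies of -1, ceil(n/2) copies of +1.
assert np.allclose(w, [-1.0, 1.0, 1.0])

# Eigenvectors for +1 are palindromic, those for -1 anti-palindromic.
for val, vec in zip(w, V.T):
    if val > 0:
        assert np.allclose(vec, vec[::-1])   # v_i = v_{n+1-i}
    else:
        assert np.allclose(vec, -vec[::-1])  # v_i = -v_{n+1-i}
```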