Orthogonal transformation
from Wikipedia

In linear algebra, an orthogonal transformation is a linear transformation T : V → V on a real inner product space V that preserves the inner product. That is, for each pair u, v of elements of V, we have[1]

⟨u, v⟩ = ⟨T(u), T(v)⟩.

Since the lengths of vectors and the angles between them are defined through the inner product, orthogonal transformations preserve lengths of vectors and angles between them. In particular, orthogonal transformations map orthonormal bases to orthonormal bases.

Orthogonal transformations are injective: if T(v) = 0 then ‖v‖ = ‖T(v)‖ = 0, hence v = 0, so the kernel of T is trivial.

Orthogonal transformations in two- or three-dimensional Euclidean space are rigid rotations, reflections, or combinations of a rotation and a reflection (also known as improper rotations). Reflections are transformations that reverse direction front to back, orthogonal to the mirror plane, like (real-world) mirrors do. The matrices corresponding to proper rotations (without reflection) have a determinant of +1. Transformations with reflection are represented by matrices with a determinant of −1. This allows the concepts of rotation and reflection to be generalized to higher dimensions.

In finite-dimensional spaces, the matrix representation (with respect to an orthonormal basis) of an orthogonal transformation is an orthogonal matrix. Its rows are mutually orthogonal vectors with unit norm, so that the rows constitute an orthonormal basis of V. The columns of the matrix form another orthonormal basis of V.

If an orthogonal transformation is invertible (which is always the case when V is finite-dimensional), then its inverse T⁻¹ is another orthogonal transformation, identical to the transpose or adjoint of T: T⁻¹ = Tᵀ.
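These matrix-level facts are easy to check numerically. A minimal NumPy sketch, using an arbitrary rotation about the z-axis of ℝ³ (the angle 0.7 is an arbitrary choice, not from the text):

```python
import numpy as np

# A 3D rotation about the z-axis: an orthogonal matrix in the standard basis.
theta = 0.7
Q = np.array([
    [np.cos(theta), -np.sin(theta), 0.0],
    [np.sin(theta),  np.cos(theta), 0.0],
    [0.0,            0.0,           1.0],
])

# Rows and columns are orthonormal: Q Q^T = Q^T Q = I.
assert np.allclose(Q @ Q.T, np.eye(3))
assert np.allclose(Q.T @ Q, np.eye(3))

# The inverse is simply the transpose -- no matrix inversion needed.
v = np.array([1.0, 2.0, 3.0])
w = Q @ v
assert np.allclose(Q.T @ w, v)  # Q^T undoes Q
```

Using the transpose in place of an explicit inverse is both exact and cheap, which is one reason orthogonal matrices are prized in numerical work.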

Examples


Consider the inner-product space (ℝ², ⟨·,·⟩) with the standard Euclidean inner product and standard basis. Then, the matrix transformation

T = ( cos θ  −sin θ ; sin θ  cos θ ) : ℝ² → ℝ²

is orthogonal. To see this, consider the standard basis vectors e₁ = (1, 0) and e₂ = (0, 1). Then,

T(e₁) = (cos θ, sin θ),  T(e₂) = (−sin θ, cos θ),

so ⟨T(e₁), T(e₁)⟩ = cos²θ + sin²θ = 1, ⟨T(e₂), T(e₂)⟩ = 1, and ⟨T(e₁), T(e₂)⟩ = 0: T preserves the inner products of the basis vectors, and hence, by linearity, all inner products.

The previous example can be extended to construct all orthogonal transformations. For example, the following matrices define orthogonal transformations on (ℝ², ⟨·,·⟩):

( cos θ  −sin θ ; sin θ  cos θ )  (a rotation),  ( cos θ  sin θ ; sin θ  −cos θ )  (a reflection).
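Both families of matrices can be verified numerically. A NumPy sketch (the angle 0.4 is arbitrary; the test vectors are illustrative):

```python
import numpy as np

theta = 0.4
rotation = np.array([[np.cos(theta), -np.sin(theta)],
                     [np.sin(theta),  np.cos(theta)]])
reflection = np.array([[np.cos(theta),  np.sin(theta)],
                       [np.sin(theta), -np.cos(theta)]])

for T in (rotation, reflection):
    # Orthogonality: T^T T = I, so inner products are preserved.
    assert np.allclose(T.T @ T, np.eye(2))
    u, v = np.array([1.0, 2.0]), np.array([3.0, -1.0])
    assert np.isclose((T @ u) @ (T @ v), u @ v)

# The determinant distinguishes proper from improper transformations.
assert np.isclose(np.linalg.det(rotation), 1.0)
assert np.isclose(np.linalg.det(reflection), -1.0)
```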

from Grokipedia
In linear algebra, an orthogonal transformation is a linear transformation T : V → V on a real inner product space V that preserves the inner product, meaning ⟨Tv, Tw⟩ = ⟨v, w⟩ for all vectors v, w ∈ V. Equivalently, it preserves the Euclidean norm of vectors (‖Tv‖ = ‖v‖) and the angles between them, ensuring that distances and orientations are maintained under the mapping. When represented in a standard orthonormal basis, an orthogonal transformation corresponds to an orthogonal matrix Q, a square matrix satisfying QᵀQ = Iₙ, where Qᵀ is the transpose of Q and Iₙ is the n × n identity matrix; this implies Q⁻¹ = Qᵀ. The columns (and rows) of Q form an orthonormal basis for ℝⁿ, and the determinant of Q is either +1 (for proper rotations) or −1 (for improper rotations involving a reflection). Orthogonal transformations form the orthogonal group O(n), a group under matrix multiplication consisting of all such matrices; the subgroup of proper rotations is the special orthogonal group SO(n). Common examples include rotations in the plane, given by matrices of the form ( cos θ  −sin θ ; sin θ  cos θ ), and reflections across a hyperplane, both of which preserve vector lengths and orthogonality. These transformations are fundamental in various fields: in physics for describing rigid motions and coordinate changes, in computer graphics for rotations and reflections, and in numerical methods such as QR decomposition, where the Q factor is orthogonal to stabilize computations. They also underpin data-analysis and compression techniques by enabling efficient, norm-preserving representations of data.

Definition and Properties

Definition

An orthogonal transformation is a linear transformation T : V → V on a finite-dimensional real inner product space V that satisfies ⟨T(u), T(v)⟩ = ⟨u, v⟩ for all u, v ∈ V. This preservation of the inner product is equivalent to the transformation preserving the Euclidean norm, since ‖T(v)‖² = ⟨T(v), T(v)⟩ = ⟨v, v⟩ = ‖v‖² for all v ∈ V. While the primary focus is on real inner product spaces, the concept extends to complex spaces through unitary transformations, which preserve the Hermitian inner product in an analogous manner. The term originated in the 19th century, notably in work analyzing symmetric matrices. These transformations relate to isometries of Euclidean space, as preserving the inner product implies preserving distances between points.

Key Properties

Orthogonal transformations on a real inner product space preserve the inner product, which implies they preserve the Euclidean norm of vectors, so lengths remain unchanged under the transformation. This norm preservation leads directly to invertibility: preservation of lengths guarantees that the transformation is injective, hence (in finite dimensions) bijective, and thus has an inverse. Specifically, the inverse of an orthogonal transformation T is its adjoint T†, satisfying T⁻¹ = T†, where the adjoint is defined with respect to the inner product. A key algebraic consequence is closure under composition: if S and T are orthogonal transformations, then their composition S ∘ T is also orthogonal, since it preserves the inner product by successive application of the preservation property. This composition property, combined with invertibility and the existence of the identity transformation (which is orthogonal), establishes that the set of all orthogonal transformations on an n-dimensional real inner product space forms a group under composition, known as the orthogonal group O(n). Furthermore, the preservation of the inner product implies that orthogonal transformations maintain angles between vectors. For any vectors u and v, the cosine of the angle θ between T(u) and T(v) equals the cosine of the angle between u and v:

cos θ = ⟨T(u), T(v)⟩ / (‖T(u)‖ ‖T(v)‖) = ⟨u, v⟩ / (‖u‖ ‖v‖),

by inner product preservation and norm invariance. This angle preservation underscores the isometric nature of orthogonal transformations in the algebraic framework.
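The group properties (closure under composition, an orthogonal identity, orthogonal inverses) can be illustrated numerically. This sketch obtains random orthogonal matrices from the QR factorization of Gaussian random matrices, a common convenience not mentioned in the text:

```python
import numpy as np

rng = np.random.default_rng(0)

def random_orthogonal(n, rng):
    # The Q factor of a QR factorization is orthogonal.
    q, _ = np.linalg.qr(rng.normal(size=(n, n)))
    return q

S = random_orthogonal(4, rng)
T = random_orthogonal(4, rng)
I = np.eye(4)

assert np.allclose((S @ T).T @ (S @ T), I)  # closure: S o T is orthogonal
assert np.allclose(I.T @ I, I)              # the identity is orthogonal
assert np.allclose(S.T @ (S.T).T, I)        # the inverse (= transpose) is orthogonal
```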

Matrix Representation

Orthogonal Matrices

An orthogonal transformation on an n-dimensional real vector space can be represented by an n × n real matrix Q in a chosen orthonormal basis, where Q satisfies the condition QᵀQ = Iₙ, with Iₙ denoting the n × n identity matrix. This defining property ensures that Q is invertible with inverse equal to its transpose, Q⁻¹ = Qᵀ. The columns of such a matrix Q form an orthonormal basis for ℝⁿ, meaning each column vector has unit length and the columns are pairwise orthogonal; the rows satisfy the same orthonormality condition. This orthonormality corresponds directly to the transformation preserving lengths and angles in the basis representation. The determinant of an orthogonal matrix Q is either +1 or −1, reflecting whether the transformation is orientation-preserving (a proper rotation) or orientation-reversing (including a reflection). One explicit method to construct an orthogonal matrix is via the Gram-Schmidt process, which orthogonalizes and normalizes a given set of linearly independent vectors to produce an orthonormal basis; the resulting basis vectors can then serve as the columns (or rows) of Q.
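The Gram-Schmidt construction can be sketched as follows (classical Gram-Schmidt for clarity; in floating-point practice a modified variant or a library QR factorization is usually preferred for stability):

```python
import numpy as np

def gram_schmidt(A):
    """Orthonormalize the columns of A (assumed linearly independent)."""
    Q = np.zeros_like(A, dtype=float)
    for j in range(A.shape[1]):
        v = A[:, j].astype(float)
        for i in range(j):
            # Subtract the component of column j along each earlier basis vector.
            v -= (Q[:, i] @ A[:, j]) * Q[:, i]
        Q[:, j] = v / np.linalg.norm(v)  # normalize to unit length
    return Q

A = np.array([[1.0, 1.0, 0.0],
              [1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0]])
Q = gram_schmidt(A)
assert np.allclose(Q.T @ Q, np.eye(3))  # columns are orthonormal, so Q is orthogonal
```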

Eigenvalues and Singular Value Decomposition

The eigenvalues of a real orthogonal matrix lie on the unit circle in the complex plane, meaning their absolute values all equal 1. This property arises because if λ is an eigenvalue with eigenvector v, then Qv = λv implies ‖Qv‖ = |λ| ‖v‖, and since Q preserves norms, |λ| = 1. For real orthogonal matrices, the eigenvalues are either real, in which case they must be ±1, or they occur in complex conjugate pairs e^{iθ} and e^{−iθ} for some real θ. The real eigenvalues ±1 correspond to invariant directions under the transformation, while the complex pairs reflect rotational components in the spectral decomposition. The singular value decomposition (SVD) provides a factorization of any real matrix, and for a square orthogonal matrix Q it simplifies significantly. Specifically, Q = U Σ Vᵀ, where U and V are orthogonal matrices and Σ is a diagonal matrix with all diagonal entries equal to 1 (the singular values of Q). This form underscores that orthogonal matrices have full rank and unit singular values, in line with their norm-preserving nature; in fact, one possible SVD is Q = Q I Iᵀ, taking U = Q, V = I, and Σ = I. A special case is the proper orthogonal matrices, those with determinant 1, which form the special orthogonal group SO(n) and represent pure rotations without reflection. Their eigenvalues consist of ±1 (with −1 occurring with even multiplicity) and complex conjugate pairs on the unit circle, with 1 having multiplicity at least 1 in odd dimensions.
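Both spectral facts are easy to confirm numerically. A sketch, again drawing a random orthogonal matrix from a QR factorization (a convenience assumption):

```python
import numpy as np

rng = np.random.default_rng(1)
Q, _ = np.linalg.qr(rng.normal(size=(5, 5)))  # a random 5x5 orthogonal matrix

# All eigenvalues lie on the unit circle: |lambda| = 1.
eigvals = np.linalg.eigvals(Q)
assert np.allclose(np.abs(eigvals), 1.0)

# All singular values equal 1, so Sigma is the identity in the SVD.
singular_values = np.linalg.svd(Q, compute_uv=False)
assert np.allclose(singular_values, 1.0)
```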

Geometric Aspects

Preservation of Inner Products

Orthogonal transformations, by definition, preserve the inner product on Euclidean space: for any orthogonal transformation T : ℝⁿ → ℝⁿ, we have ⟨Tu, Tv⟩ = ⟨u, v⟩ for all vectors u, v ∈ ℝⁿ, so the geometric structure induced by the inner product remains unchanged. This preservation directly implies that norms are maintained, as ‖Tu‖ = √⟨Tu, Tu⟩ = √⟨u, u⟩ = ‖u‖.
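A numeric sketch of these invariances, with a random orthogonal Q (obtained from a QR factorization, as a convenience) and arbitrary test vectors:

```python
import numpy as np

rng = np.random.default_rng(2)
Q, _ = np.linalg.qr(rng.normal(size=(3, 3)))  # a random orthogonal transformation

u = np.array([1.0, -2.0, 0.5])
v = np.array([0.3, 4.0, -1.0])

# Inner products, hence norms and angles, are unchanged.
assert np.isclose((Q @ u) @ (Q @ v), u @ v)
assert np.isclose(np.linalg.norm(Q @ u), np.linalg.norm(u))

cos_before = (u @ v) / (np.linalg.norm(u) * np.linalg.norm(v))
cos_after = ((Q @ u) @ (Q @ v)) / (np.linalg.norm(Q @ u) * np.linalg.norm(Q @ v))
assert np.isclose(cos_before, cos_after)
```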