Symmetric matrix

Matrix equal to its transpose

[Figure: symmetry pattern of a 5×5 matrix]


In linear algebra, a symmetric matrix is a square matrix that is equal to its transpose. Formally,



$A \text{ symmetric} \iff A = A^{\mathsf{T}}$



Because equal matrices have equal dimensions, only square matrices can be symmetric.


The entries of a symmetric matrix are symmetric with respect to the main diagonal. So if $a_{ij}$ denotes the entry in the $i$-th row and $j$-th column, then



$A \text{ symmetric} \iff a_{ji} = a_{ij}$



for all indices $i$ and $j$.


Every square diagonal matrix is symmetric, since all off-diagonal elements are zero. Similarly, in characteristic different from 2, each diagonal element of a skew-symmetric matrix must be zero, since each is its own negative.


In linear algebra, a real symmetric matrix represents a self-adjoint operator[1] over a real inner product space. The corresponding object for a complex inner product space is a Hermitian matrix with complex-valued entries, which is equal to its conjugate transpose. Therefore, in linear algebra over the complex numbers, it is often assumed that a symmetric matrix refers to one which has real-valued entries. Symmetric matrices appear naturally in a variety of applications, and typical numerical linear algebra software makes special accommodations for them.




Contents






  • 1 Example


  • 2 Properties


    • 2.1 Basic properties


    • 2.2 Decomposition into Hermitian and skew-Hermitian


    • 2.3 Matrix congruent to a symmetric matrix


    • 2.4 Symmetry implies normality


    • 2.5 Real symmetric matrices


    • 2.6 Complex symmetric matrices




  • 3 Decomposition


  • 4 Hessian


  • 5 Symmetrizable matrix


  • 6 See also


  • 7 Notes


  • 8 References


  • 9 External links





Example


The following $3 \times 3$ matrix is symmetric:


$A = \begin{bmatrix} 1 & 7 & 3 \\ 7 & 4 & -5 \\ 3 & -5 & 6 \end{bmatrix}$
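
As a quick numerical check, here is a minimal NumPy sketch using the example matrix above (equality with the transpose is exactly the definition of symmetry):

    import numpy as np

    # The example matrix from above.
    A = np.array([[1, 7, 3],
                  [7, 4, -5],
                  [3, -5, 6]])

    # A matrix is symmetric exactly when it equals its transpose.
    print(np.array_equal(A, A.T))  # True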


Properties



Basic properties


  • The sum and difference of two symmetric matrices are again symmetric.

  • This is not always true for the product: given symmetric matrices $A$ and $B$, the product $AB$ is symmetric if and only if $A$ and $B$ commute, i.e., if $AB = BA$ (see the sketch after this list).

  • For any integer $n$, $A^n$ is symmetric if $A$ is symmetric.

  • If $A^{-1}$ exists, it is symmetric if and only if $A$ is symmetric.
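
A small NumPy sketch of the product rule above (the particular matrices are illustrative choices, not from the original text):

    import numpy as np

    A = np.array([[2.0, 1.0],
                  [1.0, 3.0]])   # symmetric
    B = np.array([[0.0, 1.0],
                  [1.0, 0.0]])   # symmetric

    AB = A @ B
    # These two matrices do not commute, so their product is not symmetric.
    print(np.allclose(A @ B, B @ A))   # False
    print(np.allclose(AB, AB.T))       # False

    # Powers of a symmetric matrix remain symmetric.
    A3 = np.linalg.matrix_power(A, 3)
    print(np.allclose(A3, A3.T))       # True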


Decomposition into Hermitian and skew-Hermitian


Any square matrix can uniquely be written as the sum of a symmetric and a skew-symmetric matrix. This decomposition is known as the Toeplitz decomposition.
Let $\mathrm{Mat}_n$ denote the space of $n \times n$ matrices. If $\mathrm{Sym}_n$ denotes the space of $n \times n$ symmetric matrices and $\mathrm{Skew}_n$ the space of $n \times n$ skew-symmetric matrices, then $\mathrm{Mat}_n = \mathrm{Sym}_n + \mathrm{Skew}_n$ and $\mathrm{Sym}_n \cap \mathrm{Skew}_n = \{0\}$, i.e.


$\mathrm{Mat}_n = \mathrm{Sym}_n \oplus \mathrm{Skew}_n ,$

where $\oplus$ denotes the direct sum. Let $X \in \mathrm{Mat}_n$; then



$X = \tfrac{1}{2}\left(X + X^{\mathsf{T}}\right) + \tfrac{1}{2}\left(X - X^{\mathsf{T}}\right).$

Notice that $\tfrac{1}{2}\left(X + X^{\mathsf{T}}\right) \in \mathrm{Sym}_n$ and $\tfrac{1}{2}\left(X - X^{\mathsf{T}}\right) \in \mathrm{Skew}_n$. This is true for every square matrix $X$ with entries from any field whose characteristic is different from 2.
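
In code, the two projections are one line each. A minimal NumPy sketch (the random test matrix is an illustrative choice):

    import numpy as np

    rng = np.random.default_rng(0)
    X = rng.standard_normal((4, 4))          # an arbitrary square matrix

    sym = (X + X.T) / 2                      # symmetric part
    skew = (X - X.T) / 2                     # skew-symmetric part

    print(np.allclose(sym, sym.T))           # True
    print(np.allclose(skew, -skew.T))        # True
    print(np.allclose(X, sym + skew))        # True: X is recovered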


A symmetric $n \times n$ matrix is determined by $\tfrac{1}{2}n(n+1)$ scalars (the number of entries on or above the main diagonal). Similarly, a skew-symmetric matrix is determined by $\tfrac{1}{2}n(n-1)$ scalars (the number of entries above the main diagonal).



Matrix congruent to a symmetric matrix


Any matrix congruent to a symmetric matrix is again symmetric: if $X$ is a symmetric matrix, then so is $A X A^{\mathsf{T}}$ for any matrix $A$.



Symmetry implies normality


A (real-valued) symmetric matrix is necessarily a normal matrix, since $A = A^{\mathsf{T}}$ immediately gives $A A^{\mathsf{T}} = A^2 = A^{\mathsf{T}} A$.



Real symmetric matrices


Denote by $\langle \cdot , \cdot \rangle$ the standard inner product on $\mathbb{R}^n$. The real $n \times n$ matrix $A$ is symmetric if and only if


$\langle Ax, y \rangle = \langle x, Ay \rangle \quad \forall x, y \in \mathbb{R}^n.$

Since this definition is independent of the choice of basis, symmetry is a property that depends only on the linear operator $A$ and a choice of inner product. This characterization of symmetry is useful, for example, in differential geometry: each tangent space to a manifold may be endowed with an inner product, giving rise to what is called a Riemannian manifold. This formulation is also used in the setting of Hilbert spaces.


The finite-dimensional spectral theorem says that any symmetric matrix whose entries are real can be diagonalized by an orthogonal matrix. More explicitly: for every real symmetric matrix $A$ there exists a real orthogonal matrix $Q$ such that $D = Q^{\mathsf{T}} A Q$ is a diagonal matrix. Every real symmetric matrix is thus, up to choice of an orthonormal basis, a diagonal matrix.
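
Numerically, this diagonalization is what eigensolvers for symmetric matrices compute. A minimal NumPy sketch (the random matrix is an illustrative choice; np.linalg.eigh is NumPy's routine for symmetric/Hermitian matrices):

    import numpy as np

    rng = np.random.default_rng(1)
    M = rng.standard_normal((4, 4))
    A = (M + M.T) / 2                        # a real symmetric matrix

    # eigh returns real eigenvalues and an orthogonal matrix Q of eigenvectors.
    eigenvalues, Q = np.linalg.eigh(A)

    D = Q.T @ A @ Q                          # Q^T A Q is diagonal
    print(np.allclose(D, np.diag(eigenvalues)))   # True
    print(np.allclose(Q.T @ Q, np.eye(4)))        # True: Q is orthogonal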


If $A$ and $B$ are $n \times n$ real symmetric matrices that commute, then they can be simultaneously diagonalized: there exists a basis of $\mathbb{R}^n$ such that every element of the basis is an eigenvector for both $A$ and $B$.


Every real symmetric matrix is Hermitian, and therefore all its eigenvalues are real. (In fact, the eigenvalues are the entries of the diagonal matrix $D$ above, and therefore $D$ is uniquely determined by $A$ up to the order of its entries.) Essentially, the property of being symmetric for real matrices corresponds to the property of being Hermitian for complex matrices.



Complex symmetric matrices


A complex symmetric matrix can be 'diagonalized' using a unitary matrix: thus if $A$ is a complex symmetric matrix, there is a unitary matrix $U$ such that $U A U^{\mathsf{T}}$ is a real diagonal matrix with non-negative entries. This result is referred to as the Autonne–Takagi factorization. It was originally proved by Léon Autonne (1915) and Teiji Takagi (1925) and rediscovered with different proofs by several other mathematicians.[2][3] In fact, the matrix $B = A^{\dagger} A$ is Hermitian and positive semi-definite, so there is a unitary matrix $V$ such that $V^{\dagger} B V$ is diagonal with non-negative real entries. Thus $C = V^{\mathsf{T}} A V$ is complex symmetric with $C^{\dagger} C$ real. Writing $C = X + iY$ with $X$ and $Y$ real symmetric matrices, $C^{\dagger} C = X^2 + Y^2 + i(XY - YX)$. Thus $XY = YX$. Since $X$ and $Y$ commute, there is a real orthogonal matrix $W$ such that both $W X W^{\mathsf{T}}$ and $W Y W^{\mathsf{T}}$ are diagonal. Setting $U = W V^{\mathsf{T}}$ (a unitary matrix), the matrix $U A U^{\mathsf{T}}$ is complex diagonal. Pre-multiplying $U$ by a suitable diagonal unitary matrix (which preserves the unitarity of $U$), the diagonal entries of $U A U^{\mathsf{T}}$ can be made real and non-negative as desired. Since their squares are the eigenvalues of $A^{\dagger} A$, they coincide with the singular values of $A$. (Note that, concerning the eigen-decomposition of a complex symmetric matrix $A$, the Jordan normal form of $A$ may not be diagonal, and therefore $A$ may not be diagonalizable by any similarity transformation.)
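
An SVD-based construction of this factorization can be sketched directly in NumPy. The following is a minimal sketch rather than a robust implementation: it assumes the singular values of $A$ are distinct (repeated singular values need extra care), the helper name takagi is ours, and the result is returned in the equivalent form $A = U \Sigma U^{\mathsf{T}}$ (this $U$ is the conjugate transpose of the $U$ in the text above):

    import numpy as np

    def takagi(A):
        # Autonne-Takagi sketch: A = U @ diag(s) @ U.T for complex symmetric A.
        V, s, Wh = np.linalg.svd(A)
        # Symmetry of A forces conj(W) = V @ D for a diagonal unitary D
        # (when the singular values are distinct).
        D = V.conj().T @ Wh.T
        # Absorb a square root of the phases of D into V.
        return V @ np.diag(np.sqrt(np.diagonal(D))), s

    rng = np.random.default_rng(2)
    M = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))
    A = (M + M.T) / 2                        # complex symmetric (not Hermitian)

    U, s = takagi(A)
    print(np.allclose(A, U @ np.diag(s) @ U.T))    # True
    print(np.allclose(U.conj().T @ U, np.eye(3)))  # True: U is unitary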



Decomposition


Using the Jordan normal form, one can prove that every square real matrix can be written as a product of two real symmetric matrices, and every square complex matrix can be written as a product of two complex symmetric matrices.[4]


Every real non-singular matrix can be uniquely factored as the product of an orthogonal matrix and a symmetric positive definite matrix, which is called a polar decomposition. Singular matrices can also be factored, but not uniquely.
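
A minimal NumPy sketch of the polar decomposition via the SVD (the random non-singular matrix is an illustrative choice; SciPy users can obtain the same factorization from scipy.linalg.polar):

    import numpy as np

    rng = np.random.default_rng(3)
    A = rng.standard_normal((4, 4))          # almost surely non-singular

    # From the SVD A = V S Wh: Q = V Wh is orthogonal and
    # P = Wh.T S Wh is symmetric positive definite, with A = Q P.
    V, s, Wh = np.linalg.svd(A)
    Q = V @ Wh
    P = Wh.T @ np.diag(s) @ Wh

    print(np.allclose(A, Q @ P))                   # True
    print(np.allclose(Q.T @ Q, np.eye(4)))         # True: Q orthogonal
    print(np.allclose(P, P.T))                     # True: P symmetric
    print((np.linalg.eigvalsh(P) > 0).all())       # True: P positive definite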


The Cholesky decomposition states that every real positive-definite symmetric matrix $A$ is the product of a lower-triangular matrix $L$ and its transpose, $A = L L^{\mathsf{T}}$. If the matrix is symmetric indefinite, it may still be decomposed as $P A P^{\mathsf{T}} = L D L^{\mathsf{T}}$, where $P$ is a permutation matrix (arising from the need to pivot), $L$ a lower unit triangular matrix, and $D$ a direct sum of symmetric $1 \times 1$ and $2 \times 2$ blocks; this is called the Bunch–Kaufman decomposition.[5]
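
Both factorizations are available off the shelf. A minimal sketch using np.linalg.cholesky and scipy.linalg.ldl (the latter based on LAPACK's Bunch-Kaufman routines; the matrices are illustrative choices):

    import numpy as np
    from scipy.linalg import ldl

    # Positive-definite symmetric matrix: Cholesky gives A = L @ L.T.
    A = np.array([[4.0, 2.0],
                  [2.0, 3.0]])
    L = np.linalg.cholesky(A)
    print(np.allclose(A, L @ L.T))           # True

    # Indefinite symmetric matrix: LDL^T with pivoting still works.
    B = np.array([[0.0, 1.0],
                  [1.0, 0.0]])               # eigenvalues 1 and -1
    lu, d, perm = ldl(B)                     # d holds 1x1 and 2x2 blocks
    print(np.allclose(lu @ d @ lu.T, B))     # True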


A complex symmetric matrix may not be diagonalizable by similarity; every real symmetric matrix is diagonalizable by a real orthogonal similarity.


Every complex symmetric matrix $A$ can be diagonalized by unitary congruence


$A = Q \Lambda Q^{\mathsf{T}}$

where $Q$ is a unitary matrix. If $A$ is real, the matrix $Q$ is a real orthogonal matrix (the columns of which are eigenvectors of $A$), and $\Lambda$ is real and diagonal (having the eigenvalues of $A$ on the diagonal). To see orthogonality, suppose $x$ and $y$ are eigenvectors corresponding to distinct eigenvalues $\lambda_1$, $\lambda_2$. Then


$\lambda_1 \langle x, y \rangle = \langle Ax, y \rangle = \langle x, Ay \rangle = \lambda_2 \langle x, y \rangle .$

Since $\lambda_1$ and $\lambda_2$ are distinct, we have $\langle x, y \rangle = 0$.



Hessian


Symmetric $n \times n$ matrices of real functions appear as the Hessians of twice continuously differentiable functions of $n$ real variables.


Every quadratic form $q$ on $\mathbb{R}^n$ can be uniquely written in the form $q(\mathbf{x}) = \mathbf{x}^{\mathsf{T}} A \mathbf{x}$ with a symmetric $n \times n$ matrix $A$. Because of the above spectral theorem, one can then say that every quadratic form, up to the choice of an orthonormal basis of $\mathbb{R}^n$, "looks like"


$q\left(x_1, \ldots, x_n\right) = \sum_{i=1}^{n} \lambda_i x_i^2$

with real numbers $\lambda_i$. This considerably simplifies the study of quadratic forms, as well as the study of the level sets $\left\{ \mathbf{x} : q(\mathbf{x}) = 1 \right\}$, which are generalizations of conic sections.
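
As a concrete instance, the following sketch diagonalizes the quadratic form $q(x_1, x_2) = x_1^2 + 4 x_1 x_2 + x_2^2$ (an illustrative choice; the cross term is split evenly between the two off-diagonal entries of $A$):

    import numpy as np

    # q(x) = x^T A x with the cross term 4*x1*x2 split symmetrically.
    A = np.array([[1.0, 2.0],
                  [2.0, 1.0]])

    eigenvalues, Q = np.linalg.eigh(A)       # A = Q diag(lam) Q^T
    print(eigenvalues)                       # [-1.  3.]

    # In the rotated coordinates y = Q^T x the form is -y1^2 + 3*y2^2,
    # so the level set q = 1 is a hyperbola.
    x = np.array([0.3, -1.1])
    y = Q.T @ x
    print(np.isclose(x @ A @ x, np.sum(eigenvalues * y**2)))  # True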


This is important partly because the second-order behavior of every smooth multi-variable function is described by the quadratic form belonging to the function's Hessian; this is a consequence of Taylor's theorem.



Symmetrizable matrix


An $n \times n$ matrix $A$ is said to be symmetrizable if there exists an invertible diagonal matrix $D$ and a symmetric matrix $S$ such that $A = DS$.


The transpose of a symmetrizable matrix is symmetrizable, since $A^{\mathsf{T}} = (DS)^{\mathsf{T}} = SD = D^{-1}(DSD)$ and $DSD$ is symmetric. A matrix $A = (a_{ij})$ is symmetrizable if and only if the following conditions are met (a numerical sketch follows the list):




  1. $a_{ij} = 0$ implies $a_{ji} = 0$ for all $1 \leq i \leq j \leq n$.


  2. $a_{i_1 i_2} a_{i_2 i_3} \cdots a_{i_k i_1} = a_{i_2 i_1} a_{i_3 i_2} \cdots a_{i_1 i_k}$ for any finite sequence $\left(i_1, i_2, \ldots, i_k\right)$.
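
A minimal NumPy sketch of the definition (the matrices $D$ and $S$ are illustrative choices):

    import numpy as np

    # Build a symmetrizable A = D @ S from an invertible diagonal D
    # and a symmetric S.
    D = np.diag([1.0, 2.0, -3.0])
    S = np.array([[1.0, 4.0, 0.0],
                  [4.0, 2.0, 5.0],
                  [0.0, 5.0, 3.0]])
    A = D @ S

    print(np.allclose(A, A.T))               # False: A itself is not symmetric
    Dinv_A = np.linalg.inv(D) @ A            # recovers S
    print(np.allclose(Dinv_A, Dinv_A.T))     # True: A is symmetrizable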



See also


Other types of symmetry or pattern in square matrices have special names; see for example:




  • Antimetric matrix

  • Centrosymmetric matrix

  • Circulant matrix

  • Covariance matrix

  • Coxeter matrix

  • Hankel matrix

  • Hilbert matrix

  • Persymmetric matrix

  • Skew-symmetric matrix

  • Sylvester's law of inertia

  • Toeplitz matrix



See also symmetry in mathematics.



Notes





  1. ^ Jesús Rojo García (1986). Álgebra lineal (in Spanish) (2nd ed.). Editorial AC. ISBN 84-7288-120-2.


  2. ^ Horn, R.A.; Johnson, C.R. (2013). Matrix Analysis (2nd ed.). Cambridge University Press. pp. 263, 278. MR 2978290.


  3. ^ See:


    • Autonne, L. (1915), "Sur les matrices hypohermitiennes et sur les matrices unitaires", Ann. Univ. Lyon, 38: 1–77


    • Takagi, T. (1925), "On an algebraic problem related to an analytic theorem of Carathéodory and Fejér and on an allied theorem of Landau", Japan. J. Math., 1: 83–93


    • Siegel, Carl Ludwig (1943), "Symplectic Geometry", American Journal of Mathematics, 65: 1–86, doi:10.2307/2371774, JSTOR 2371774, Lemma 1, page 12


    • Hua, L.-K. (1944), "On the theory of automorphic functions of a matrix variable I–geometric basis", Amer. J. Math., 66: 470–488, doi:10.2307/2371910


    • Schur, I. (1945), "Ein Satz über quadratische formen mit komplexen koeffizienten", Amer. J. Math., 67: 472–480, doi:10.2307/2371974


    • Benedetti, R.; Cragnolini, P. (1984), "On simultaneous diagonalization of one Hermitian and one symmetric form", Linear Algebra Appl., 57: 215–226, doi:10.1016/0024-3795(84)90189-7




  4. ^ Bosch, A. J. (1986). "The factorization of a square matrix into two symmetric matrices". American Mathematical Monthly. 93 (6): 462–464. doi:10.2307/2323471. JSTOR 2323471.


  5. ^ Golub, G.H.; Van Loan, C.F. (1996). Matrix Computations. Baltimore: The Johns Hopkins University Press.




References



  • Horn, Roger A.; Johnson, Charles R. (2013), Matrix analysis (2nd ed.), Cambridge University Press, ISBN 978-0-521-54823-6


External links




  • Hazewinkel, Michiel, ed. (2001) [1994], "Symmetric matrix", Encyclopedia of Mathematics, Springer Science+Business Media B.V. / Kluwer Academic Publishers, ISBN 978-1-55608-010-4

  • A brief introduction and proof of eigenvalue properties of the real symmetric matrix

  • How to implement a Symmetric Matrix in C++



