Skew-symmetric matrix

In mathematics, particularly in linear algebra, a skew-symmetric (or antisymmetric or antimetric) matrix is a square matrix whose transpose equals its negative. That is, it satisfies the condition
 * $$A^\textsf{T} = -A.$$

In terms of the entries of the matrix, if $a_{ij}$ denotes the entry in the $i$-th row and $j$-th column, then the skew-symmetric condition is equivalent to
 * $$a_{ji} = -a_{ij}.$$

Example
The matrix
 * $$A = \begin{bmatrix} 0 & 2 & -45 \\ -2 & 0 & -4 \\ 45 & 4 & 0 \end{bmatrix}$$ is skew-symmetric because
 * $$-A = \begin{bmatrix} 0 & -2 & 45 \\ 2 & 0 & 4 \\ -45 & -4 & 0 \end{bmatrix} = A^\textsf{T}.$$
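The defining condition $A^\textsf{T} = -A$ is easy to check numerically; a minimal sketch using NumPy on the example matrix above:

```python
import numpy as np

# The example matrix from the text.
A = np.array([[0, 2, -45],
              [-2, 0, -4],
              [45, 4, 0]])

# A matrix is skew-symmetric exactly when its transpose equals its negative.
is_skew = np.array_equal(A.T, -A)
print(is_skew)
```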

Properties
Throughout, we assume that all matrix entries belong to a field $\mathbb{F}$ whose characteristic is not equal to 2. That is, we assume that 1 + 1 ≠ 0, where 1 denotes the multiplicative identity and 0 the additive identity of the given field. If the characteristic of the field is 2, then a skew-symmetric matrix is the same thing as a symmetric matrix.


 * The sum of two skew-symmetric matrices is skew-symmetric.
 * A scalar multiple of a skew-symmetric matrix is skew-symmetric.
 * The elements on the diagonal of a skew-symmetric matrix are zero, and therefore its trace equals zero.
 * If $A$ is a real skew-symmetric matrix and $\lambda$ is a real eigenvalue of $A$, then $\lambda = 0$, i.e. the nonzero eigenvalues of a skew-symmetric matrix are purely imaginary.
 * If $A$ is a real skew-symmetric matrix, then $I + A$ is invertible, where $I$ is the identity matrix.
 * If $A$ is a skew-symmetric matrix, then $A^2$ is a symmetric matrix.

Vector space structure
As a result of the first two properties above, the set of all skew-symmetric matrices of a fixed size forms a vector space. The space of $n \times n$ skew-symmetric matrices has dimension $\frac{1}{2}n(n - 1).$

Let $$\mbox{Mat}_n$$ denote the space of $n \times n$ matrices. A skew-symmetric matrix is determined by $\frac{1}{2}n(n - 1)$ scalars (the number of entries above the main diagonal); a symmetric matrix is determined by $\frac{1}{2}n(n + 1)$ scalars (the number of entries on or above the main diagonal). Let $\mbox{Skew}_n$ denote the space of $n \times n$ skew-symmetric matrices and $\mbox{Sym}_n$ denote the space of $n \times n$ symmetric matrices. If $A \in \mbox{Mat}_n,$ then
 * $$ A = \frac{1}{2}\left(A - A^\mathsf{T}\right) + \frac{1}{2}\left(A + A^\mathsf{T}\right). $$

Notice that $\frac{1}{2}\left(A - A^\textsf{T}\right) \in \mbox{Skew}_n$ and $\frac{1}{2}\left(A + A^\textsf{T}\right) \in \mbox{Sym}_n.$ This is true for every square matrix $A$ with entries from any field whose characteristic is different from 2. Then, since $\mbox{Mat}_n = \mbox{Skew}_n + \mbox{Sym}_n$ and $\mbox{Skew}_n \cap \mbox{Sym}_n = 0,$
 * $$\mbox{Mat}_n = \mbox{Skew}_n \oplus \mbox{Sym}_n,$$

where $$\oplus$$ denotes the direct sum.
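The decomposition $A = \frac{1}{2}\left(A - A^\textsf{T}\right) + \frac{1}{2}\left(A + A^\textsf{T}\right)$ can be illustrated directly; a small NumPy sketch (the random seed is arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))

# Unique splitting of a square matrix into skew-symmetric and symmetric parts.
skew_part = (A - A.T) / 2
sym_part = (A + A.T) / 2

assert np.allclose(skew_part, -skew_part.T)  # lies in Skew_n
assert np.allclose(sym_part, sym_part.T)     # lies in Sym_n
assert np.allclose(A, skew_part + sym_part)  # the parts sum back to A
```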

Denote by $\langle \cdot, \cdot \rangle$ the standard inner product on $$\mathbb{R}^n.$$ The real $$n \times n$$ matrix $A$ is skew-symmetric if and only if
 * $$\langle Ax,y \rangle = - \langle x, Ay\rangle \quad \forall x, y \in \mathbb{R}^n.$$

This is also equivalent to $\langle x, Ax \rangle = 0$ for all $$x \in \mathbb{R}^n$$ (one implication being obvious, the other a plain consequence of $\langle x + y, A(x + y)\rangle = 0$  for all $$x$$ and $$y$$).
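The equivalence between skew-symmetry and the identity $\langle Ax, y \rangle = -\langle x, Ay \rangle$ can be spot-checked on random vectors; a sketch, assuming NumPy:

```python
import numpy as np

rng = np.random.default_rng(1)
M = rng.standard_normal((3, 3))
A = (M - M.T) / 2  # make A skew-symmetric

x = rng.standard_normal(3)
y = rng.standard_normal(3)

# <Ax, y> equals -<x, Ay> for a skew-symmetric A ...
lhs = np.dot(A @ x, y)
rhs = -np.dot(x, A @ y)

# ... and in particular <x, Ax> vanishes.
quad = np.dot(x, A @ x)
```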

Since this definition is independent of the choice of basis, skew-symmetry is a property that depends only on the linear operator $$A$$ and a choice of inner product.

$$3 \times 3$$ skew-symmetric matrices can be used to represent cross products as matrix multiplications.

Determinant
Let $$A$$ be an $$n \times n$$ skew-symmetric matrix. The determinant of $$A$$ satisfies


 * $$\det\left(A^\textsf{T}\right) = \det(-A) = (-1)^n \det(A).$$

In particular, if $$n$$ is odd, and since the underlying field is not of characteristic 2, the determinant vanishes. Hence, all odd-dimensional skew-symmetric matrices are singular, as their determinants are always zero. This result is called Jacobi's theorem, after Carl Gustav Jacobi (Eves, 1980).

The even-dimensional case is more interesting. It turns out that the determinant of $$A$$ for $$n$$ even can be written as the square of a polynomial in the entries of $$A$$, which was first proved by Cayley:


 * $$\det{(A)} = \operatorname{Pf}(A)^2.$$

This polynomial is called the Pfaffian of $$A$$ and is denoted $$\operatorname{Pf}(A)$$. Thus the determinant of a real skew-symmetric matrix is always non-negative. However, this last fact can be proved in an elementary way as follows: the eigenvalues of a real skew-symmetric matrix are purely imaginary (see below), and to every eigenvalue there corresponds the conjugate eigenvalue with the same multiplicity; therefore, as the determinant is the product of the eigenvalues, each one repeated according to its multiplicity, it follows at once that the determinant, if it is not 0, is a positive real number.
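Both facts (vanishing determinant in odd dimension, non-negative determinant in even dimension) are visible numerically; an illustrative NumPy sketch, where `random_skew` is a helper defined here:

```python
import numpy as np

rng = np.random.default_rng(2)

def random_skew(n, rng):
    """Return a random real n-by-n skew-symmetric matrix."""
    M = rng.standard_normal((n, n))
    return (M - M.T) / 2

# Odd order: the determinant vanishes (Jacobi's theorem).
det_odd = np.linalg.det(random_skew(5, rng))

# Even order: the determinant is the square of the Pfaffian,
# hence non-negative.
det_even = np.linalg.det(random_skew(6, rng))
```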

The number of distinct terms $$s(n)$$ in the expansion of the determinant of a skew-symmetric matrix of order $$n$$ was considered already by Cayley, Sylvester, and Pfaff. Due to cancellations, this number is quite small compared to the number of terms of a generic matrix of order $$n$$, which is $$n!$$. The sequence $$s(n)$$ is
 * 1, 0, 1, 0, 6, 0, 120, 0, 5250, 0, 395010, 0, …

and it is encoded in the exponential generating function
 * $$\sum_{n=0}^\infty \frac{s(n)}{n!}x^n = \left(1 - x^2\right)^{-\frac{1}{4}}\exp\left(\frac{x^2}{4}\right).$$

The latter yields the asymptotics (for $$n$$ even)
 * $$s(n) = \pi^{-\frac{1}{2}} 2^\frac{3}{4} \Gamma\left(\frac{3}{4}\right)\left(\frac{n}{e}\right)^{n - \frac{1}{4}} \left(1 + O\left(\frac{1}{n}\right)\right).$$

The numbers of positive and negative terms are approximately half of the total, although their difference takes larger and larger positive and negative values as $$n$$ increases.

Cross product
Three-by-three skew-symmetric matrices can be used to represent cross products as matrix multiplications. Consider $\mathbf{a} = \left(a_1\ a_2\ a_3\right)^\textsf{T}$  and $\mathbf{b} = \left(b_1\ b_2\ b_3\right)^\textsf{T}.$  Then, defining the matrix


 * $$[\mathbf{a}]_{\times} = \begin{bmatrix} 0 & -a_3 & a_2 \\ a_3 & 0 & -a_1 \\ -a_2 & a_1 & 0 \end{bmatrix},$$

the cross product can be written as
 * $$\mathbf{a}\times\mathbf{b} = [\mathbf{a}]_{\times}\mathbf{b}.$$

This can be immediately verified by computing both sides of the previous equation and comparing each corresponding element of the results.
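The identity $\mathbf{a}\times\mathbf{b} = [\mathbf{a}]_{\times}\mathbf{b}$ can also be verified in code; a sketch using NumPy's `cross` (the helper name `skew` is ours):

```python
import numpy as np

def skew(a):
    """Build the skew-symmetric matrix [a]_x representing 'a cross'."""
    a1, a2, a3 = a
    return np.array([[0, -a3, a2],
                     [a3, 0, -a1],
                     [-a2, a1, 0]])

a = np.array([1.0, 2.0, 3.0])
b = np.array([-4.0, 0.5, 2.0])

# [a]_x b equals the cross product a x b.
via_matrix = skew(a) @ b
via_cross = np.cross(a, b)
```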

One actually has
 * $$[\mathbf{a \times b}]_{\times} = [\mathbf{a}]_{\times}[\mathbf{b}]_{\times} - [\mathbf{b}]_{\times}[\mathbf{a}]_{\times};$$

i.e., the commutator of skew-symmetric three-by-three matrices can be identified with the cross product of three-vectors. Since the skew-symmetric three-by-three matrices form the Lie algebra of the rotation group $SO(3)$, this elucidates the relation between three-space $\mathbb{R}^3$, the cross product and three-dimensional rotations. More on infinitesimal rotations can be found below.

Spectral theory
Since a matrix is similar to its own transpose, they must have the same eigenvalues. It follows that the eigenvalues of a skew-symmetric matrix always come in pairs ±λ (except in the odd-dimensional case where there is an additional unpaired 0 eigenvalue). From the spectral theorem, for a real skew-symmetric matrix the nonzero eigenvalues are all purely imaginary and thus are of the form $$\lambda_1 i, -\lambda_1 i, \lambda_2 i, -\lambda_2 i, \ldots$$ where each of the $$\lambda_k$$ are real.
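Both features (purely imaginary spectrum, unpaired zero eigenvalue in odd dimension) can be observed numerically; a NumPy sketch on an odd-order random skew-symmetric matrix (seed arbitrary):

```python
import numpy as np

rng = np.random.default_rng(3)
M = rng.standard_normal((5, 5))
A = (M - M.T) / 2  # real skew-symmetric, odd order

eig = np.linalg.eigvals(A)

# All eigenvalues are purely imaginary ...
max_real = np.max(np.abs(eig.real))

# ... and the odd dimension forces at least one zero eigenvalue.
min_abs = np.min(np.abs(eig))
```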

Real skew-symmetric matrices are normal matrices (they commute with their transpose) and are thus subject to the spectral theorem, which states that any real skew-symmetric matrix can be diagonalized by a unitary matrix. Since the eigenvalues of a real skew-symmetric matrix are imaginary, it is not possible to diagonalize one by a real matrix. However, it is possible to bring every skew-symmetric matrix to a block diagonal form by a special orthogonal transformation. Specifically, every $$2n \times 2n$$ real skew-symmetric matrix can be written in the form $$A = Q\Sigma Q^\textsf{T}$$ where $$Q$$ is orthogonal and
 * $$\Sigma = \begin{bmatrix} \begin{matrix}0 & \lambda_1 \\ -\lambda_1 & 0\end{matrix} & 0 & \cdots & 0 \\ 0 & \begin{matrix}0 & \lambda_2 \\ -\lambda_2 & 0\end{matrix} & & 0 \\ \vdots & & \ddots & \vdots \\ 0 & 0 & \cdots & \begin{matrix}0 & \lambda_r \\ -\lambda_r & 0\end{matrix} \\ & & & & \begin{matrix}0 \\ & \ddots \\ & & 0\end{matrix} \end{bmatrix}$$

for real positive $$\lambda_k$$. The nonzero eigenvalues of this matrix are $$\pm\lambda_k i$$. In the odd-dimensional case $$\Sigma$$ always has at least one row and column of zeros.

More generally, every complex skew-symmetric matrix can be written in the form $$A = U \Sigma U^{\mathrm T}$$ where $$U$$ is unitary and $$\Sigma$$ has the block-diagonal form given above with the $$\lambda_k$$ still real and positive. This is an example of the Youla decomposition of a complex square matrix.

Skew-symmetric and alternating forms
A skew-symmetric form $$\varphi$$ on a vector space $$V$$ over a field $$K$$ of arbitrary characteristic is defined to be a bilinear form


 * $$\varphi: V \times V \to K$$

such that for all $$v, w$$ in $$V,$$


 * $$\varphi(v, w) = -\varphi(w, v).$$

This defines a form with desirable properties for vector spaces over fields of characteristic not equal to 2, but in a vector space over a field of characteristic 2, the definition is equivalent to that of a symmetric form, as every element is its own additive inverse.

Where the vector space $$V$$ is over a field of arbitrary characteristic, including characteristic 2, we may define an alternating form as a bilinear form $$\varphi$$ such that for all vectors $$v$$ in $$V$$


 * $$\varphi(v, v) = 0.$$

This is equivalent to a skew-symmetric form when the field is not of characteristic 2, as seen from


 * $$0 = \varphi(v + w, v + w) = \varphi(v, v) + \varphi(v, w) + \varphi(w, v) + \varphi(w, w) = \varphi(v, w) + \varphi(w, v),$$

whence


 * $$\varphi(v, w) = -\varphi(w, v).$$

A bilinear form $$\varphi$$ will be represented by a matrix $$A$$ such that $$\varphi(v,w) = v^\textsf{T}Aw$$, once a basis of $$V$$ is chosen, and conversely an $$n \times n$$ matrix $$A$$ on $$K^n$$ gives rise to a form sending $$(v, w)$$ to $$v^\textsf{T}Aw.$$ For each of symmetric, skew-symmetric and alternating forms, the representing matrices are symmetric, skew-symmetric and alternating respectively.

Infinitesimal rotations
Skew-symmetric matrices over the field of real numbers form the tangent space to the real orthogonal group $$O(n)$$ at the identity matrix; formally, the special orthogonal Lie algebra. In this sense, then, skew-symmetric matrices can be thought of as infinitesimal rotations.

Another way of saying this is that the space of skew-symmetric matrices forms the Lie algebra $$\mathfrak{o}(n)$$ of the Lie group $$O(n).$$ The Lie bracket on this space is given by the commutator:


 * $$[A, B] = AB - BA.\,$$

It is easy to check that the commutator of two skew-symmetric matrices is again skew-symmetric:


 * $$\begin{align} {[}A, B{]}^\textsf{T} &= B^\textsf{T} A^\textsf{T} - A^\textsf{T} B^\textsf{T} \\ &= (-B)(-A) - (-A)(-B) = BA - AB = -[A, B]. \end{align}$$
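The same computation can be confirmed numerically; a small NumPy sketch (`random_skew` is a helper introduced here):

```python
import numpy as np

rng = np.random.default_rng(4)

def random_skew(n, rng):
    """Return a random real n-by-n skew-symmetric matrix."""
    M = rng.standard_normal((n, n))
    return (M - M.T) / 2

A = random_skew(4, rng)
B = random_skew(4, rng)

# The commutator [A, B] = AB - BA of two skew-symmetric
# matrices is again skew-symmetric.
C = A @ B - B @ A
```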

The matrix exponential of a skew-symmetric matrix $$A$$ is then an orthogonal matrix $$R$$:


 * $$R = \exp(A) = \sum_{n=0}^\infty \frac{A^n}{n!}.$$

The image of the exponential map of a Lie algebra always lies in the connected component of the Lie group that contains the identity element. In the case of the Lie group $$O(n),$$ this connected component is the special orthogonal group $$SO(n),$$ consisting of all orthogonal matrices with determinant 1. So $$R = \exp(A)$$ will have determinant +1. Moreover, since the exponential map of a connected compact Lie group is always surjective, it turns out that every orthogonal matrix with unit determinant can be written as the exponential of some skew-symmetric matrix. In the particularly important case of dimension $$n = 2,$$ the exponential representation for an orthogonal matrix reduces to the well-known polar form of a complex number of unit modulus. Indeed, if $$n = 2,$$ a special orthogonal matrix has the form
 * $$\begin{bmatrix} a & -b \\ b & a \end{bmatrix},$$

with $$a^2 + b^2 = 1$$. Therefore, putting $$a = \cos\theta$$ and $$b = \sin\theta,$$ it can be written
 * $$\begin{bmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{bmatrix} = \exp\left(\theta\begin{bmatrix} 0 & -1 \\ 1 & 0 \end{bmatrix}\right),$$

which corresponds exactly to the polar form $$\cos \theta + i \sin \theta = e^{i \theta}$$ of a complex number of unit modulus.
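The $n = 2$ case can be reproduced with a short power-series evaluation of the matrix exponential; a NumPy sketch (the truncated-series `expm` is our own helper, adequate for small matrices):

```python
import numpy as np

def expm(A, terms=30):
    """Matrix exponential via its truncated power series (fine for small A)."""
    result = np.eye(A.shape[0])
    term = np.eye(A.shape[0])
    for k in range(1, terms):
        term = term @ A / k  # term is now A^k / k!
        result = result + term
    return result

theta = 0.7
# Generator of planar rotations: the basic 2x2 skew-symmetric matrix.
G = np.array([[0.0, -1.0],
              [1.0, 0.0]])

R = expm(theta * G)

# exp(theta * G) should be the rotation by angle theta.
R_expected = np.array([[np.cos(theta), -np.sin(theta)],
                       [np.sin(theta), np.cos(theta)]])
```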

The exponential representation of an orthogonal matrix of order $$n$$ can also be obtained starting from the fact that in dimension $$n$$ any special orthogonal matrix $$R$$ can be written as $$R = QSQ^\textsf{T},$$ where $$Q$$ is orthogonal and $$S$$ is a block diagonal matrix with $\lfloor n/2\rfloor$ blocks of order 2, plus one of order 1 if $$n$$ is odd; since each single block of order 2 is also an orthogonal matrix, it admits an exponential form. Correspondingly, the matrix $$S$$ can be written as the exponential of a skew-symmetric block matrix $$\Sigma$$ of the form above, $$S = \exp(\Sigma),$$ so that $$R = Q\exp(\Sigma)Q^\textsf{T} = \exp(Q\Sigma Q^\textsf{T}),$$ the exponential of the skew-symmetric matrix $$Q\Sigma Q^\textsf{T}.$$ Conversely, the surjectivity of the exponential map, together with the above-mentioned block-diagonalization for skew-symmetric matrices, implies the block-diagonalization for orthogonal matrices.

Coordinate-free
More intrinsically (i.e., without using coordinates), skew-symmetric linear transformations on a vector space $$V$$ with an inner product may be defined as the bivectors on the space, which are sums of simple bivectors $v \wedge w.$ The correspondence is given by the map $v \wedge w \mapsto v^* \otimes w - w^* \otimes v,$ where $v^*$ is the covector dual to the vector $v$; in orthonormal coordinates these are exactly the elementary skew-symmetric matrices. This characterization is used in interpreting the curl of a vector field (naturally a 2-vector) as an infinitesimal rotation or "curl", hence the name.

Skew-symmetrizable matrix
An $$n \times n$$ matrix $$A$$ is said to be skew-symmetrizable if there exists an invertible diagonal matrix $$D$$ such that $$DA$$ is skew-symmetric. For real $$n \times n$$ matrices, the condition that $$D$$ have positive entries is sometimes added.