Notes on the Trace of Skew-Symmetric Matrices

This article studies some properties of, and inequalities for, the traces of antisymmetric matrices. An antisymmetric (skew-symmetric) matrix is a matrix whose transpose equals its negative, that is, $A^T=-A$; such matrices are important in matrix theory and its applications. The trace of a matrix is the sum of its main diagonal elements; it is an important numerical characteristic of a matrix and is widely used in many fields.

The article first gives the definitions of antisymmetric matrices, Hermitian matrices, and the trace, as well as some basic lemmas. It then presents seven theorems discussing the relationships between the trace of an antisymmetric matrix and the matrix itself, its invertibility, transposition, products, eigenvalues, and so on. Finally, the article uses the imaginary unit to link real antisymmetric matrices with Hermitian matrices and gives several inequalities relating the traces of real antisymmetric and Hermitian matrices.

Below we present each theorem together with its proof and an explanation.


Theorem 1: Suppose $A$ is a square matrix of order $n$, then:

(1) If $A$ is an antisymmetric matrix, then $\mathrm{tr}A=0$;

(2) If $A$ is an antisymmetric matrix and is invertible, then $\mathrm{tr}A^{-1}=0$;

(3) $\mathrm{tr}(A-A^T)=0$.

Proof: Part (1) follows directly from Definitions 1 and 2. It is easy to verify that if $A$ is antisymmetric and invertible, then $A^{-1}$ is also antisymmetric, so (2) follows from (1). Finally, for an arbitrary square matrix $A$ of order $n$, the matrix $A-A^T$ is clearly antisymmetric, so (1) gives:
$$
\mathrm{tr}(A-A^T)=0.
$$

Explanation: This theorem shows that the trace of an antisymmetric matrix is always zero, whether or not the matrix is invertible. This is because the main diagonal elements of an antisymmetric matrix are all zero, and the trace is the sum of the main diagonal elements. In addition, the theorem shows that subtracting its transpose from any square matrix produces a matrix whose trace is also zero.
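The three parts of Theorem 1 are easy to check numerically; the following is a minimal sketch, assuming NumPy is available (the matrix size and random seed are arbitrary):

```python
import numpy as np

# Numerical check of Theorem 1 (a sketch; assumes NumPy is available).
rng = np.random.default_rng(0)
M = rng.standard_normal((4, 4))           # arbitrary real square matrix
A = M - M.T                               # antisymmetric: A.T == -A

trace_A = np.trace(A)                     # (1) tr A = 0
trace_A_inv = np.trace(np.linalg.inv(A))  # (2) tr A^{-1} = 0 (this A is invertible)
trace_diff = np.trace(M - M.T)            # (3) tr(M - M^T) = 0 for any M
```

All three traces come out zero up to floating-point rounding.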


Theorem 2: Assume $A$ is an antisymmetric matrix of order $n$. Then for any square matrix $P$ of order $n$, $\mathrm{tr}(P^TAP)=\mathrm{tr}(PAP^T)=0$.

Proof: Because $A$ is an antisymmetric matrix of order $n$, that is, $A^T=-A$,
$$
(P^TAP)^T=P^TA^T(P^T)^T=P^T(-A)P=-P^TAP,
$$
That is, $P^TAP$ is an antisymmetric matrix, and similarly $PAP^T$ is also antisymmetric. By Theorem 1,
$$
\mathrm{tr}(P^TAP)=\mathrm{tr}(PAP^T)=0.
$$

Explanation: This theorem shows that transforming an antisymmetric matrix by congruence ($P^TAP$ or $PAP^T$) with any square matrix yields another antisymmetric matrix, whose trace is therefore zero. Antisymmetry is thus invariant under congruence transformations.
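This invariance can be illustrated numerically; a small sketch, assuming NumPy is available:

```python
import numpy as np

# Sketch verifying Theorem 2: congruence transforms of an antisymmetric
# matrix stay antisymmetric and hence have zero trace (assumes NumPy).
rng = np.random.default_rng(1)
M = rng.standard_normal((5, 5))
A = M - M.T                      # antisymmetric
P = rng.standard_normal((5, 5))  # arbitrary square matrix

B = P.T @ A @ P                  # P^T A P is again antisymmetric
C = P @ A @ P.T                  # so is P A P^T
antisym_ok = np.allclose(B.T, -B) and np.allclose(C.T, -C)
traces = (np.trace(B), np.trace(C))
```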


Theorem 3: Suppose $A,B\in \mathbb{C}^{n\times n}$, one of which is a symmetric matrix and the other an antisymmetric matrix; then $\mathrm{tr}AB=0$.

Proof: Without loss of generality, suppose $A$ is symmetric and $B$ is antisymmetric, that is, $A^T=A$ and $B^T=-B$. Entrywise this means $a_{ji}=a_{ij}$, $b_{ji}=-b_{ij}$, and in particular $b_{ii}=0$.

Write $C=AB$, so that $c_{ii}=\sum_{j=1}^n a_{ij}b_{ji}$. Then
$$
\mathrm{tr}AB=\sum_{i=1}^n c_{ii}=\sum_{i=1}^n\sum_{j=1}^n a_{ij}b_{ji}=-\sum_{i=1}^n\sum_{j=1}^n a_{ji}b_{ij}=-\mathrm{tr}AB,
$$
where the middle step uses $a_{ij}b_{ji}=a_{ji}(-b_{ij})$. Therefore $\mathrm{tr}AB=0$.

Explanation: This theorem shows that when one factor equals its transpose and the other equals the negative of its transpose, the trace of their product is zero. Equivalently, symmetric and antisymmetric matrices are orthogonal with respect to the trace inner product $\langle A,B\rangle=\mathrm{tr}(A^TB)$.
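A quick numerical illustration of Theorem 3 (a sketch, assuming NumPy is available):

```python
import numpy as np

# Sketch of Theorem 3: the product of a symmetric and an antisymmetric
# matrix has zero trace (assumes NumPy is available).
rng = np.random.default_rng(2)
M = rng.standard_normal((6, 6))
S = M + M.T        # symmetric:      S.T ==  S
K = M - M.T        # antisymmetric:  K.T == -K

t = np.trace(S @ K)
```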


Theorem 4: Suppose $A,B,C$ are all antisymmetric matrices of order $n$; then
$$\mathrm{tr}(ABC)=\mathrm{tr}(BCA)=\mathrm{tr}(CAB)=-\mathrm{tr}(CBA)=-\mathrm{tr}(BAC)=-\mathrm{tr}(ACB).$$

Proof: Because $A,B,C$ are all antisymmetric matrices of order $n$, we have $A^T=-A$, $B^T=-B$, $C^T=-C$. Since a matrix has the same trace as its transpose,
$$
\mathrm{tr}(ABC)=\mathrm{tr}[(ABC)^T]=\mathrm{tr}(C^TB^TA^T)=\mathrm{tr}[(-C)(-B)(-A)]=-\mathrm{tr}(CBA).
$$
Moreover, by the cyclic property of the trace,
$$
\mathrm{tr}(ABC)= \mathrm{tr}(BCA)= \mathrm{tr}(CAB),
$$

$$
\mathrm{tr}(CBA)= \mathrm{tr}(BAC)= \mathrm{tr}(ACB).
$$

Therefore,
$$
\mathrm{tr}(ABC)=\mathrm{tr}(BCA)=\mathrm{tr}(CAB)=-\mathrm{tr}(CBA)=-\mathrm{tr}(BAC)=-\mathrm{tr}(ACB).
$$

Explanation: This theorem shows that the trace of a product of three antisymmetric matrices depends on the order of the factors: a cyclic permutation of the factors leaves the trace unchanged, while swapping any two factors reverses its sign. This can be seen as a cyclic or exchange property.
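The cyclic and sign-reversal identities of Theorem 4 can be checked numerically; a sketch, assuming NumPy is available:

```python
import numpy as np

# Sketch of Theorem 4: for antisymmetric A, B, C, cyclic permutations
# preserve tr(ABC), while swapping two factors flips its sign.
rng = np.random.default_rng(3)

def random_antisym(n=4):
    M = rng.standard_normal((n, n))
    return M - M.T               # M - M^T is antisymmetric

A, B, C = random_antisym(), random_antisym(), random_antisym()

t = np.trace(A @ B @ C)
cyclic_equal = np.allclose([np.trace(B @ C @ A), np.trace(C @ A @ B)], t)
swaps_negate = np.allclose(
    [np.trace(C @ B @ A), np.trace(B @ A @ C), np.trace(A @ C @ B)], -t)
```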


Theorem 5: Assume $A$ and $B$ are both antisymmetric matrices of order $n$; then $\mathrm{tr}(AB-BA)=0$.

Proof: Since $\mathrm{tr}(AB)=\mathrm{tr}(BA)$ holds for any two square matrices of the same order,
$$
\mathrm{tr}(AB-BA)=\mathrm{tr}AB-\mathrm{tr}BA=0.
$$

Explanation: This theorem shows that the commutator $AB-BA$ of two antisymmetric matrices has zero trace. In fact this holds for any two square matrices, not only antisymmetric ones; it can be seen as a balance property of the trace.
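A quick numerical check (a sketch, assuming NumPy is available):

```python
import numpy as np

# Sketch of Theorem 5: the commutator AB - BA has zero trace,
# because tr(AB) = tr(BA) for any square matrices of equal order.
rng = np.random.default_rng(4)
A = rng.standard_normal((5, 5))
A = A - A.T                       # antisymmetric, as in the theorem
B = rng.standard_normal((5, 5))
B = B - B.T

t = np.trace(A @ B - B @ A)
```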


Theorem 6: Suppose $A$ is a real antisymmetric matrix of order $n$; then the eigenvalues of $A$ are all pure imaginary numbers or zero.

Proof: Suppose $\lambda$ is an eigenvalue of $A$ and $\alpha\neq 0$ is a corresponding eigenvector, that is,
$$
A\alpha=\lambda\alpha.
$$
Taking the conjugate transpose of both sides gives
$$
\alpha^HA^H=\bar{\lambda}\alpha^H.
$$
Because $A$ is real and antisymmetric, $A^H=A^T=-A$, so
$$
\alpha^HA=-\bar{\lambda}\alpha^H.
$$
Multiplying this equation on the right by $\alpha$ gives
$$
\alpha^HA\alpha=-\bar{\lambda}\alpha^H\alpha,
$$
while multiplying the original equation on the left by $\alpha^H$ gives
$$
\alpha^HA\alpha=\lambda\alpha^H\alpha.
$$
Because $\alpha\neq 0$, we have $\alpha^H\alpha>0$, and comparing the two expressions yields
$$
\lambda=-\bar{\lambda},
$$
so the real part of $\lambda$ is zero, that is, $\lambda=0$ or $\lambda$ is a pure imaginary number.

Explanation: This theorem shows that the eigenvalues of a real antisymmetric matrix are pure imaginary numbers or zero. Because the matrix equals the negative of its transpose, each eigenvalue satisfies $\lambda=-\bar{\lambda}$, which forces its real part to be zero.

(Note: $A^H$ in the text denotes the conjugate transpose of $A$, that is, $A^H=\bar{A}^T$.)
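Theorem 6 can be illustrated numerically by computing the eigenvalues of a random real antisymmetric matrix; a sketch, assuming NumPy is available:

```python
import numpy as np

# Sketch of Theorem 6: the eigenvalues of a real antisymmetric matrix
# have zero real part (pure imaginary or zero).
rng = np.random.default_rng(5)
M = rng.standard_normal((5, 5))
A = M - M.T                       # real antisymmetric

eigvals = np.linalg.eigvals(A)
max_real_part = np.max(np.abs(eigvals.real))
```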


Theorem 7: Suppose $A$ is a real antisymmetric matrix of order $n$; then $\mathrm{tr}(A^2)\leqslant 0$.

Proof: From Theorem 6, the eigenvalues of $A$ can be written as $\lambda_k=\mathrm{i}\,a_k$ with $a_k$ real, $k=1,\dots,n$. The eigenvalues of $A^2$ are $\lambda_k^2$, and the trace of a matrix equals the sum of its eigenvalues, so
$$
\mathrm{tr}(A^2)=\sum_{k=1}^{n}\lambda_k^2=\sum_{k=1}^{n}(\mathrm{i}\,a_k)^2=-\sum_{k=1}^{n}a_k^2\leqslant 0.
$$

Explanation: This theorem shows that the trace of the square of a real antisymmetric matrix is never positive. In fact $\mathrm{tr}(A^2)=-\mathrm{tr}(A^TA)=-\|A\|_F^2$, so equality holds only for $A=0$. Note that the analogous statement for two different antisymmetric matrices, $\mathrm{tr}(AB)\leqslant 0$, is not true in general: taking $B=-A$ gives $\mathrm{tr}(AB)=-\mathrm{tr}(A^2)\geqslant 0$.
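These sign facts can be checked numerically for a random real antisymmetric matrix; a sketch, assuming NumPy is available:

```python
import numpy as np

# Sketch for Theorem 7: tr(A^2) <= 0 for a real antisymmetric A;
# in fact tr(A^2) = -||A||_F^2 (minus the squared Frobenius norm).
rng = np.random.default_rng(6)
M = rng.standard_normal((5, 5))
A = M - M.T                       # real antisymmetric

t = np.trace(A @ A)
frob_sq = np.sum(A * A)           # squared Frobenius norm of A
```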

Author

Evan Mi

Posted on

2023-05-31

Updated on

2023-05-31

