4. Eigenvalues and Eigenvectors
For a square matrix \(A\), its eigenvectors are the vectors \(x\) such that,

\[ A x = \lambda x \]

Where \(\lambda \in \mathbb{R}\) is the eigenvalue of \(A\) associated with \(x\). In other words, the eigenvectors of \(A\) are the vectors that, when transformed by \(A\), are only scaled by some factor \(\lambda\) rather than, for example, changing direction.
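As a quick sanity check, here is a minimal NumPy sketch (the matrix below is only an example) that computes the eigenvalues and eigenvectors of a matrix and verifies that each eigenvector is merely scaled by \(A\):

```python
import numpy as np

# A small matrix, chosen only as an example.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# np.linalg.eig returns the eigenvalues and a matrix whose
# columns are the corresponding (unit-norm) eigenvectors.
eigenvalues, eigenvectors = np.linalg.eig(A)

for i in range(len(eigenvalues)):
    x = eigenvectors[:, i]
    lam = eigenvalues[i]
    # A x equals lambda * x: the vector is only scaled, not rotated.
    assert np.allclose(A @ x, lam * x)

print(eigenvalues)  # e.g. [3. 1.] (order may vary)
```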
4.1. Finding Eigenvalues and Eigenvectors
In our equation above, there are two unknowns: \(\lambda\) and \(x\). Rearranging the terms, we get,

\[ (A - \lambda I) x = 0 \]
Where \(I\) is the identity matrix. If \(A - \lambda I\) were invertible, then \((A - \lambda I) ^ {-1} (A - \lambda I) x = (A - \lambda I) ^ {-1} 0 \implies x = 0\). Therefore, for \(x\) to have a non-zero solution, \(A - \lambda I\) cannot be invertible, and thus,

\[ \det(A - \lambda I) = 0 \]
This gives an equation for the eigenvalues of \(A\), which can then be substituted into \((A - \lambda I) x = 0\) to find the eigenvectors \(x\).
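For example (this particular matrix is just an illustrative choice), take

\[ A = \begin{pmatrix} 2 & 1 \\ 1 & 2 \end{pmatrix} \]

Then,

\[ \det(A - \lambda I) = (2 - \lambda)^2 - 1 = \lambda^2 - 4\lambda + 3 = (\lambda - 1)(\lambda - 3) = 0 \]

giving \(\lambda_1 = 1\) and \(\lambda_2 = 3\). Substituting \(\lambda_2 = 3\) into \((A - \lambda I) x = 0\) forces the two components of \(x\) to be equal, so \(x = (1, 1)^T\) is an eigenvector.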
4.2. Properties of Eigenvalues and Eigenvectors
Several properties of eigenvalues and eigenvectors are listed here; proofs follow in the next section.
Squaring a square matrix \(A\) squares its eigenvalues but its eigenvectors remain the same.
The eigenvectors of a symmetric matrix corresponding to distinct eigenvalues are orthogonal.
A square matrix \(A\) is invertible IFF its columns are linearly independent (equivalently, IFF \(Ax = 0\) has only the trivial solution \(x = 0\)).
4.3. Properties of Eigenvalues and Eigenvectors (Proofs)
Squaring a square matrix \(A\) squares its eigenvalues but its eigenvectors remain the same.
Suppose \(\mathbf{A}\) has the eigenvector \(\mathbf{x}_i\) and eigenvalue \(\lambda_i\). Then,

\[ \mathbf{A}^2 \mathbf{x}_i = \mathbf{A} (\mathbf{A} \mathbf{x}_i) = \mathbf{A} (\lambda_i \mathbf{x}_i) = \lambda_i (\mathbf{A} \mathbf{x}_i) = \lambda_i^2 \mathbf{x}_i \]
Thus, \(\mathbf{x}_i\) is an eigenvector of both \(\mathbf{A}\) and \(\mathbf{A}^2\), while \(\lambda_i\) is an eigenvalue of \(\mathbf{A}\) and \(\lambda_i^2\) is the corresponding eigenvalue of \(\mathbf{A}^2\).
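A short NumPy check of this property (the matrix is an arbitrary example):

```python
import numpy as np

# An example matrix with real, distinct eigenvalues (2 and 3).
A = np.array([[2.0, 1.0],
              [0.0, 3.0]])

vals, vecs = np.linalg.eig(A)
vals_sq = np.linalg.eigvals(A @ A)

# The eigenvalues of A^2 are the squares of the eigenvalues of A...
assert np.allclose(np.sort(vals_sq), np.sort(vals ** 2))

# ...and each eigenvector of A is still an eigenvector of A^2.
for i in range(len(vals)):
    x = vecs[:, i]
    assert np.allclose((A @ A) @ x, (vals[i] ** 2) * x)
```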
The eigenvectors of a symmetric matrix corresponding to distinct eigenvalues are orthogonal.
Suppose \(\mathbf{A}\) is a symmetric matrix (\(\mathbf{A}^T = \mathbf{A}\)) with eigenvectors \(x_1, x_2\) and two distinct eigenvalues \(\lambda_1, \lambda_2\), so that,

\[ A x_1 = \lambda_1 x_1 \]

And,

\[ A x_2 = \lambda_2 x_2 \]

Pre-multiplying the first equation by \(x_2^T\) and using the symmetry of \(A\),

\[ \lambda_1 x_2^T x_1 = x_2^T A x_1 = (A x_2)^T x_1 = \lambda_2 x_2^T x_1 \]

Subtracting one side from the other, we get,

\[ (\lambda_2 - \lambda_1) x_2^T x_1 = 0 \]

Since \(\lambda_1\) and \(\lambda_2\) are distinct, \((\lambda_2 - \lambda_1)\) is non-zero; therefore, \(x_2^T x_1\) must be zero. Thus, \(x_2\) and \(x_1\) are orthogonal.
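This can also be checked numerically. The sketch below builds a random symmetric matrix and verifies that the eigenvectors returned by np.linalg.eigh are orthonormal:

```python
import numpy as np

# S + S^T is always symmetric; the seed is arbitrary.
rng = np.random.default_rng(0)
S = rng.standard_normal((4, 4))
A = S + S.T

# eigh is NumPy's eigensolver for symmetric/Hermitian matrices;
# the columns of Q are the eigenvectors of A.
vals, Q = np.linalg.eigh(A)

# For a symmetric matrix the eigenvectors are orthonormal,
# so Q^T Q is the identity matrix.
assert np.allclose(Q.T @ Q, np.eye(4))
```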
4.3.1. The determinant of \(A\) is the product of its eigenvalues
The eigenvalues \(\lambda_1, \lambda_2, \dots, \lambda_n\) of \(A\) are the roots of its characteristic polynomial, so the polynomial factors as,

\[ \det(A - \lambda I) = (\lambda_1 - \lambda)(\lambda_2 - \lambda) \cdots (\lambda_n - \lambda) \]

Setting \(\lambda\) to \(0\), we get,

\[ \det(A) = \lambda_1 \lambda_2 \cdots \lambda_n \]
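A quick numerical check of this identity (the matrix is an arbitrary example):

```python
import numpy as np

A = np.array([[4.0, 1.0, 2.0],
              [0.0, 3.0, 5.0],
              [1.0, 0.0, 6.0]])

# det(A) equals the product of the eigenvalues. Eigenvalues may be
# complex, but for a real matrix they come in conjugate pairs, so
# their product has (numerically) zero imaginary part.
vals = np.linalg.eigvals(A)
assert np.allclose(np.linalg.det(A), np.prod(vals).real)
```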
4.3.2. The trace of \(A\) is equal to the sum of its eigenvalues
For this, we only prove the \(2 \times 2\) case. Let \(A\) be the following matrix,

\[ A = \begin{pmatrix} a & b \\ c & d \end{pmatrix} \]

Find the eigenvalues of \(A\) using its characteristic polynomial,

\[ \det(A - \lambda I) = (a - \lambda)(d - \lambda) - bc = \lambda^2 - (a + d)\lambda + (ad - bc) = 0 \]

Using the quadratic formula, we get,

\[ \lambda = \frac{(a + d) \pm \sqrt{(a + d)^2 - 4(ad - bc)}}{2} \]

Therefore, since the square-root terms cancel when the two roots are added,

\[ \lambda_1 + \lambda_2 = a + d = \operatorname{tr}(A) \]
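Although the proof above covers only the \(2 \times 2\) case, the property holds for any square matrix, which a quick NumPy check suggests (example matrix):

```python
import numpy as np

A = np.array([[1.0, 2.0, 0.0],
              [3.0, 4.0, 1.0],
              [0.0, 1.0, 5.0]])

# trace(A) equals the sum of the eigenvalues; as with the determinant,
# any imaginary parts from conjugate pairs cancel out.
vals = np.linalg.eigvals(A)
assert np.allclose(np.trace(A), vals.sum().real)
```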
4.3.3. \(Ax = 0\) has a non-trivial solution IFF the columns of \(A\) are linearly dependent
Suppose the columns of \(A\) are linearly dependent. Let the vectors \(v_1, v_2, \dots, v_n\) be the columns of \(A\), and consider,

\[ A x = 0 \]

This can be rewritten as,

\[ x_1 v_1 + x_2 v_2 + \dots + x_n v_n = 0 \]

Since the columns of \(A\) are linearly dependent, some column \(v_i\) can be expressed as a combination of the other columns. Therefore, there must be a solution to the equation above where some of the scalars \(x_1, x_2, \dots, x_n\) are non-zero, i.e., a non-trivial solution. Conversely, if \(Ax = 0\) has a non-trivial solution, the same equation expresses a column with a non-zero coefficient as a combination of the others, so the columns are linearly dependent.
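A numerical sketch of the forward direction: the example matrix below has a third column equal to the sum of the first two, and a non-trivial null vector can be recovered from its SVD:

```python
import numpy as np

# The third column is the sum of the first two, so the
# columns are linearly dependent and A is singular.
A = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0],
              [2.0, 3.0, 5.0]])

# The row of V^T for the smallest singular value spans the
# null space when that singular value is (near) zero.
_, s, Vt = np.linalg.svd(A)
x = Vt[-1]

assert np.isclose(s[-1], 0.0)    # A is singular
assert np.allclose(A @ x, 0.0)   # x solves Ax = 0
assert not np.allclose(x, 0.0)   # and x is non-trivial
```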