Eigenvalues and Eigenvectors
June 8, 2019
Eigenvectors: the non-zero vectors $x$ such that $Ax$ is parallel to $x$. That is, $Ax$ is a multiple of $x$: $Ax = \lambda x$. A matrix has only a few such directions. The scalar $\lambda$ is the eigenvalue; it can be negative, and it can be zero.
Zero Eigenvalue
One thing we can be sure of is that a singular (square) matrix always has $0$ as an eigenvalue. In the special case where the matrix is not of full column rank, there is a non-zero eigenvector with zero eigenvalue. The non-square analogue of this will be explained later, with singular values.
If the matrix is square and singular, then the number of zero eigenvalues equals (at least — eigenvalues can repeat, but the number of independent eigenvectors for $\lambda = 0$ equals) the dimension of the null space $N(A)$, i.e., the number of free variables. Remember the null space is all $x$ such that
$$Ax = 0$$
If there are two free variables, then $\dim N(A) = 2$ and $\lambda = 0$ appears (at least) twice. This means there are two independent eigenvectors $x_1, x_2$ corresponding to $\lambda = 0$, such that
$$Ax_1 = 0 \cdot x_1 = 0, \qquad Ax_2 = 0 \cdot x_2 = 0$$
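As a quick numerical check (the rank-1 matrix below is just an illustrative choice, not from the notes), a singular matrix has a zero eigenvalue, and the corresponding eigenvector lies in the null space:

```python
import numpy as np

# A singular 2x2 matrix: the second row is a multiple of the first,
# so the rank is 1 and the null space has dimension 1.
A = np.array([[1.0, 2.0],
              [2.0, 4.0]])

eigvals, eigvecs = np.linalg.eig(A)
print(sorted(eigvals))  # one eigenvalue is 0, the other is 5 (the trace)

# The eigenvector for lambda = 0 lies in the null space: A x = 0
zero_idx = np.argmin(np.abs(eigvals))
x = eigvecs[:, zero_idx]
print(A @ x)  # essentially the zero vector
```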
Identity Matrix
The identity matrix $I$ has every vector as an eigenvector, because $\lambda = 1$ is an eigenvalue; then
$$I - \lambda I = I - I = 0$$
is the all-zero matrix. Any vector whose dimension matches the number of columns makes $(I - \lambda I)x = 0$ true and is therefore an eigenvector.
Examples
Now let's look at the projection matrix $P$, which projects a vector $b$ onto a plane. The projected vector $Pb$ is (generally) not an eigenvector, because it does not point in the same direction as the original vector. One exception is when $b$ is already on the plane: then we no longer need to project, and $Pb = b$. In this case any vector in the plane is an eigenvector of $P$, with eigenvalue equal to $1$. The other eigenvectors are the vectors perpendicular to the plane: their projection is $Pb = 0$, with $\lambda = 0$.
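A sketch of this with a concrete plane in $\mathbb{R}^3$ (the plane spanned by the columns of $B$ below is an arbitrary choice), using the projection formula $P = B(B^TB)^{-1}B^T$:

```python
import numpy as np

# Projection onto the plane spanned by the columns of B (a plane in R^3)
B = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])
P = B @ np.linalg.inv(B.T @ B) @ B.T

print(np.sort(np.linalg.eigvals(P).real))  # approximately [0, 1, 1]

# A vector already in the plane is unchanged: Pb = b (eigenvalue 1)
b_in = B @ np.array([2.0, 3.0])
print(np.allclose(P @ b_in, b_in))  # True

# A vector perpendicular to the plane projects to zero (eigenvalue 0)
n = np.array([1.0, 1.0, -1.0])   # orthogonal to both columns of B
print(np.allclose(P @ n, 0))     # True
```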
Let's look at another example, a permutation matrix:
$$A = \begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix}$$
A permutation matrix switches the rows (or columns, if you want) of a vector. What vector stays the same if we switch its row 1 and row 2? It is
$$x_1 = \begin{pmatrix} 1 \\ 1 \end{pmatrix}$$
and the corresponding eigenvalue is $1$. Another possibility is
$$x_2 = \begin{pmatrix} 1 \\ -1 \end{pmatrix}, \qquad \lambda_2 = -1$$
You may notice that any constant times $x_1$ or $x_2$ will also work. This is true. In fact, $x_1$ and $x_2$ are bases for the eigenvector directions: we have a whole line of eigenvectors for each eigenvalue. But note that the eigenvalues of $x_1$ and $x_2$ do not change when they are multiplied by constants.
One neat fact is that the sum of the eigenvalues $\lambda_i$ equals the sum of the matrix's diagonal entries (the trace). So for the $2 \times 2$ permutation matrix above, the sum of the eigenvalues is $1 + (-1) = 0$, matching the zero diagonal.
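Both facts about the permutation matrix can be verified numerically:

```python
import numpy as np

# 2x2 permutation matrix that swaps the two components of a vector
A = np.array([[0.0, 1.0],
              [1.0, 0.0]])

eigvals = np.linalg.eigvals(A)
print(np.sort(eigvals))          # [-1.  1.]

# Sum of eigenvalues equals the trace (sum of the diagonal): 0 here
print(np.isclose(eigvals.sum(), np.trace(A)))  # True

# x1 = (1, 1) is unchanged by the swap; x2 = (1, -1) flips sign
x1 = np.array([1.0, 1.0])
x2 = np.array([1.0, -1.0])
print(np.allclose(A @ x1, 1 * x1), np.allclose(A @ x2, -1 * x2))  # True True
```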
How to Solve $Ax = \lambda x$
Rewrite it as
$$(A - \lambda I)x = 0$$
And we are interested in non-zero eigenvectors $x$. For $x$ to be non-zero, we need the matrix $A - \lambda I$ to be singular, or to have free columns, to be exact. In this case $\det(A - \lambda I) = 0$. Then we can find $\lambda$ first, and then find $x$. For example, let's find the eigenvalues and eigenvectors of
$$A = \begin{pmatrix} 3 & 1 \\ 1 & 3 \end{pmatrix}$$
Then,
$$\det(A - \lambda I) = (3 - \lambda)^2 - 1 = \lambda^2 - 6\lambda + 8 = (\lambda - 4)(\lambda - 2) = 0$$
so $\lambda_1 = 4$ and $\lambda_2 = 2$.
Before we continue, it's important to note that $\det(A - \lambda I)$ is a polynomial of degree $n$ in $\lambda$. And we know any such polynomial can be factored into:
$$\det(A - \lambda I) = (\lambda_1 - \lambda)(\lambda_2 - \lambda)\cdots(\lambda_n - \lambda)$$
The constant term on the very right is equal to the product of all the eigenvalues. And you may guess that the product of all eigenvalues equals the determinant; that is in fact true. Knowing there are always $n$ eigenvalues (counted with repetition, even when some are zero) gives us the following equation:
$$\det A = \lambda_1 \lambda_2 \cdots \lambda_n$$
Setting $\lambda = 0$ in the factorization gives us the desired result.
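A quick numerical check of $\det A = \lambda_1 \lambda_2 \cdots \lambda_n$ on a small symmetric example:

```python
import numpy as np

# Check det(A) = product of eigenvalues on a 2x2 symmetric matrix
A = np.array([[3.0, 1.0],
              [1.0, 3.0]])

eigvals = np.linalg.eigvals(A)
print(np.sort(eigvals))  # [2. 4.]

# 2 * 4 = 8, and det A = 3*3 - 1*1 = 8
print(np.isclose(np.prod(eigvals), np.linalg.det(A)))  # True
```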
Back to finding eigenvectors. For each eigenvalue, we find its corresponding eigenvector by substitution. For $\lambda_1 = 4$:
$$(A - 4I)x = \begin{pmatrix} -1 & 1 \\ 1 & -1 \end{pmatrix} x = 0 \quad\Rightarrow\quad x_1 = \begin{pmatrix} 1 \\ 1 \end{pmatrix}$$
And for $\lambda_2 = 2$:
$$(A - 2I)x = \begin{pmatrix} 1 & 1 \\ 1 & 1 \end{pmatrix} x = 0 \quad\Rightarrow\quad x_2 = \begin{pmatrix} 1 \\ -1 \end{pmatrix}$$
One useful observation from this, comparing with the permutation matrix above: we added $3$ to each diagonal entry there, our eigenvalues got $3$ added as well, and the eigenvectors did not change. What happened is: if $Ax = \lambda x$, then
$$(A + 3I)x = Ax + 3x = (\lambda + 3)x$$
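This shift property can be checked numerically (a NumPy sketch):

```python
import numpy as np

# If Ax = lambda*x, then (A + 3I)x = (lambda + 3)x: shifting by 3I shifts
# every eigenvalue by 3 and leaves the eigenvectors unchanged.
P = np.array([[0.0, 1.0],
              [1.0, 0.0]])           # permutation matrix, eigenvalues -1, 1
A = P + 3 * np.eye(2)                # [[3, 1], [1, 3]]

print(np.sort(np.linalg.eigvals(P)))  # [-1.  1.]
print(np.sort(np.linalg.eigvals(A)))  # [2. 4.]

# Same eigenvector x = (1, 1) works for both matrices
x = np.array([1.0, 1.0])
print(np.allclose(P @ x, 1 * x), np.allclose(A @ x, 4 * x))  # True True
```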
Complex Eigenvalues
Let's have another example:
$$Q = \begin{pmatrix} 0 & -1 \\ 1 & 0 \end{pmatrix}$$
This matrix rotates every vector counterclockwise by 90 degrees. If you imagine it, there should be no vector $x$ such that $Qx$ is still parallel to the original vector, because $Q$ rotates the vector. But things can still work out:
$$\det(Q - \lambda I) = \lambda^2 + 1 = 0 \quad\Rightarrow\quad \lambda_1 = i, \quad \lambda_2 = -i$$
These two eigenvalues are complex numbers, and they are complex conjugates of each other: their real parts are the same (zero) and their imaginary parts have opposite signs. In general, when we have a symmetric matrix (or one close to symmetric), we will have real eigenvalues. But in this case the matrix is as far from symmetric as possible ($Q^T = -Q$); it rotates the vector, and we get complex eigenvalues.
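Numerically, the 90-degree rotation indeed produces a complex-conjugate pair:

```python
import numpy as np

# 90-degree counterclockwise rotation: no real vector stays parallel to itself
Q = np.array([[0.0, -1.0],
              [1.0,  0.0]])

eigvals = np.linalg.eigvals(Q)
print(eigvals)  # the complex-conjugate pair i and -i, in some order
```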
Triangular Matrix
Let
$$A = \begin{pmatrix} 3 & 1 \\ 0 & 3 \end{pmatrix}$$
Then solving $\det(A - \lambda I) = 0$ leads to
$$(3 - \lambda)^2 = 0$$
Then
$$\lambda_1 = \lambda_2 = 3$$
The eigenvalues of a triangular matrix sit right on its diagonal. But if we try to solve $(A - 3I)x = 0$, we will find only one eigenvector basis:
$$x = \begin{pmatrix} 1 \\ 0 \end{pmatrix}$$
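The repeated eigenvalue and the shortage of eigenvectors can be seen numerically:

```python
import numpy as np

# Upper triangular matrix: the eigenvalues sit on the diagonal
A = np.array([[3.0, 1.0],
              [0.0, 3.0]])

eigvals = np.linalg.eigvals(A)
print(eigvals)  # [3. 3.]: lambda = 3 is repeated

# Solving (A - 3I)x = 0: A - 3I = [[0, 1], [0, 0]], so x2 = 0 is forced
# and the only eigenvector direction is x = (1, 0)
x = np.array([1.0, 0.0])
print(np.allclose(A @ x, 3 * x))  # True
```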