Markov Matrices; Fourier Series

June 10, 2019

Markov Matrices

Take the matrix

$$A = \begin{bmatrix} .1 & .01 & .3 \\ .2 & .99 & .3 \\ .7 & 0 & .4 \end{bmatrix}$$

This is a Markov matrix because all of its entries are nonnegative, and every column sums to 1.

The key points for this matrix are: $\lambda = 1$ is always an eigenvalue, and every other eigenvalue satisfies $|\lambda_i| \le 1$.

Let’s focus on why $\lambda = 1$ will always be an eigenvalue. If that’s true, then $\det(A - I) = 0$ should be true. Take the above matrix as an example:

$$A - I = \begin{bmatrix} -.9 & .01 & .3 \\ .2 & -.01 & .3 \\ .7 & 0 & -.6 \end{bmatrix}$$

We may not see directly why the determinant is zero at first glance, but for a matrix to be singular, it must have one or more dependent columns (or rows; in a square matrix the number of dependent rows equals the number of dependent columns). A Markov matrix has all columns summing to 1, and here, after subtracting 1 from each diagonal entry, every column sums to zero. This means that

$$\text{row}_1 + \text{row}_2 + \text{row}_3 = \begin{bmatrix} 0 & 0 & 0 \end{bmatrix}$$

So the rows are dependent, and $(1, 1, 1)^\top$ is in the left nullspace $N\big((A - I)^\top\big)$. Therefore $\det(A-I)=0$, and we can conclude that there exists an eigenvalue $\lambda_1=1$.
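As a quick numerical sanity check, here's a minimal NumPy sketch using the example matrix above:

```python
import numpy as np

# The example Markov matrix: nonnegative entries, columns sum to 1.
A = np.array([[0.1, 0.01, 0.3],
              [0.2, 0.99, 0.3],
              [0.7, 0.0,  0.4]])

# det(A - I) should be (numerically) zero, so lambda = 1 is an eigenvalue.
print(np.linalg.det(A - np.eye(3)))   # ~0

# The all-ones vector lies in the left nullspace of A - I.
print(np.ones(3) @ (A - np.eye(3)))   # ~[0, 0, 0]

# And 1 shows up among the eigenvalues.
print(np.linalg.eigvals(A))
```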

Eigenvalues of $A$ same as Eigenvalues of $A^\top$

This is true because of property 10 of the determinant (link): $\det(A^\top) = \det(A)$.

And

$$\det(A - \lambda I) = \det\big((A - \lambda I)^\top\big) = \det(A^\top - \lambda I).$$

So any eigenvalue of $A$ is also an eigenvalue of $A^\top$, and vice versa. This matters here because it is the columns of a Markov matrix that sum to 1, which directly gives $A^\top \mathbf{1} = \mathbf{1}$, i.e. $\lambda = 1$ for $A^\top$, and hence for $A$.
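Numerically (same example matrix, NumPy again):

```python
import numpy as np

A = np.array([[0.1, 0.01, 0.3],
              [0.2, 0.99, 0.3],
              [0.7, 0.0,  0.4]])

# A and A.T share the same characteristic polynomial,
# hence the same eigenvalues (up to ordering and roundoff).
print(np.sort(np.linalg.eigvals(A)))
print(np.sort(np.linalg.eigvals(A.T)))
```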

Application

Say we have the populations of California and Massachusetts at time step $k$:

$$u_k = \begin{bmatrix} u_{\text{Cal}} \\ u_{\text{Mass}} \end{bmatrix}_k$$

And each year there’s a portion of people moving from California to Massachusetts, and vice versa: .9 of the Cal people will stay and .1 will move; .8 of the Mass people will stay and .2 will move. So we can construct the following equation:

$$\begin{bmatrix} u_{\text{Cal}} \\ u_{\text{Mass}} \end{bmatrix}_{k+1} = \begin{bmatrix} .9 & .2 \\ .1 & .8 \end{bmatrix} \begin{bmatrix} u_{\text{Cal}} \\ u_{\text{Mass}} \end{bmatrix}_k$$

Since it’s $2 \times 2$, we can use tricks to find the eigenvalues. It’s a Markov matrix, so one of its eigenvalues must be 1; since the trace of the matrix is $.9 + .8 = 1.7$, the other eigenvalue will be $.7$. We can find the eigenvector for $\lambda_1 = 1$ by solving $(A - I)x = 0$; that gives $x_1 = \begin{bmatrix} 2 \\ 1 \end{bmatrix}$. The other eigenvector is not that important if we are looking for the steady state, because its eigenvalue $.7 < 1$ will vanish as $k$ goes to infinity. Writing $u_0 = c_1 x_1 + c_2 x_2$, we get

$$u_k = A^k u_0 = c_1 \cdot 1^k x_1 + c_2 \cdot (.7)^k x_2 \to c_1 x_1,$$

so the steady state is a multiple of $\begin{bmatrix} 2 \\ 1 \end{bmatrix}$: the total population stays constant, and in the limit it splits $2:1$ between California and Massachusetts.
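Here's a small NumPy sketch of that convergence; the starting split of 0 people in California and 1000 in Massachusetts is an assumed example:

```python
import numpy as np

A = np.array([[0.9, 0.2],
              [0.1, 0.8]])

# Assumed starting populations: 0 in Cal, 1000 in Mass.
u = np.array([0.0, 1000.0])

# Repeatedly apply A; the (.7)^k component dies out.
for k in range(100):
    u = A @ u

print(u)                            # ~[666.67, 333.33], a multiple of (2, 1)
print(1000 * np.array([2, 1]) / 3)  # the predicted steady state
```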

Fourier Series

The Fourier series for a function $f(t)$ is:

$$f(t) = a_0 + a_1 \cos t + b_1 \sin t + a_2 \cos 2t + b_2 \sin 2t + \cdots \tag{1}$$

Any function with period $2\pi$ can be approximated by a Fourier series with its corresponding coefficients $a_n$ and $b_n$.

We say a vector or a function is in infinite-dimensional Hilbert space if and only if it has, of course, infinitely many components, and has finite length. For a vector $v = (v_1, v_2, v_3, \ldots)$ and a function $f$:

$$\|v\|^2 = v^\top v = v_1^2 + v_2^2 + v_3^2 + \cdots < \infty, \qquad \|f\|^2 = \int_0^{2\pi} f(t)^2 \, dt < \infty$$

You’ve seen how the length of a vector is defined. The length of a function is defined that way because integration is the continuous analogue of summation: the inner product $v^\top w = \sum_i v_i w_i$ becomes

$$(f, g) = \int_0^{2\pi} f(t)\, g(t)\, dt,$$

and the squared length is the inner product of the function with itself.

One good thing about the Fourier series is that the trig functions in the series are all orthogonal to each other; that is, the inner products $\int_0^{2\pi} \cos mt \cos nt \, dt$ and $\int_0^{2\pi} \sin mt \sin nt \, dt$ (for $m \ne n$), and $\int_0^{2\pi} \cos mt \sin nt \, dt$, are all zero, except the inner product of a trig function with itself.

Two functions continuous on $[0, 2\pi]$ are said to be orthogonal if and only if

$$\int_0^{2\pi} f(t)\, g(t)\, dt = 0.$$
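We can check this orthogonality numerically; a minimal sketch with a plain Riemann sum:

```python
import numpy as np

# Sample [0, 2*pi) finely and integrate with a simple Riemann sum.
N = 100_000
t = np.linspace(0, 2 * np.pi, N, endpoint=False)
dt = 2 * np.pi / N

def inner(f, g):
    # Inner product (f, g) = integral of f(t) g(t) dt over one period.
    return np.sum(f(t) * g(t)) * dt

print(inner(np.sin, np.cos))                   # ~0: sin t orthogonal to cos t
print(inner(np.sin, lambda x: np.sin(2 * x)))  # ~0: sin t orthogonal to sin 2t
print(inner(np.sin, np.sin))                   # ~pi: not orthogonal to itself
```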

With this, we can calculate the coefficients of the Fourier series for a specific function $f$. Integrating both sides of (1):

$$\int_0^{2\pi} f(t)\, dt = a_0 \cdot 2\pi + 0 + 0 + \cdots$$

since every $\cos nt$ and $\sin nt$ integrates to zero over a full period.

Therefore,

$$a_0 = \frac{1}{2\pi} \int_0^{2\pi} f(t)\, dt.$$

Similar things can be done after we first multiply both sides of (1) by either $\cos nt$ or $\sin nt$: every cross term integrates to zero, and $\int_0^{2\pi} \cos^2 nt \, dt = \int_0^{2\pi} \sin^2 nt \, dt = \pi$. This gives us:

$$a_n = \frac{1}{\pi} \int_0^{2\pi} f(t) \cos nt \, dt, \qquad b_n = \frac{1}{\pi} \int_0^{2\pi} f(t) \sin nt \, dt.$$
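As a worked sketch of these formulas, here is NumPy code computing the coefficients of a square wave (the square wave itself is an assumed example, not from the notes):

```python
import numpy as np

N = 100_000
t = np.linspace(0, 2 * np.pi, N, endpoint=False)
dt = 2 * np.pi / N

# Assumed example: a square wave, +1 on (0, pi), -1 on (pi, 2*pi).
f = np.where(t < np.pi, 1.0, -1.0)

# The coefficient formulas above, as Riemann sums.
a0 = np.sum(f) * dt / (2 * np.pi)
a = [np.sum(f * np.cos(n * t)) * dt / np.pi for n in range(1, 6)]
b = [np.sum(f * np.sin(n * t)) * dt / np.pi for n in range(1, 6)]

print(a0)  # ~0
print(a)   # all ~0 (the square wave is odd)
print(b)   # ~[4/pi, 0, 4/(3*pi), 0, 4/(5*pi)]
```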

How does this correlate with linear algebra? First, there are the orthogonal trig functions in the series. They are not an orthonormal basis as they stand, because that requires each of them to have length 1; but as long as we divide each by its length, they are. And you see the squared length of a function, like that of a vector, is also just a sum: $\|f\|^2 = \int_0^{2\pi} f(t)^2\, dt$ is the continuous version of $v_1^2 + v_2^2 + \cdots$. Hilbert space is a vector space as well; it’s easy to prove with the linearity of integration.

Remember an orthogonal matrix is a square matrix whose columns are orthonormal, that is

$$Q^\top Q = I.$$

If we have a vector $v = Qc = c_1 q_1 + c_2 q_2 + \cdots + c_n q_n$, where $c$ is a vector of constants and $Q$ is an orthogonal matrix, then

$$Q^\top v = Q^\top Q c = c$$

since $Q$ is orthogonal. And each coefficient is just an inner product, $c_i = q_i^\top v$, exactly like the Fourier coefficient $a_n = \frac{1}{\pi} \int_0^{2\pi} f(t) \cos nt \, dt$: the function is expanded in an orthogonal basis, and each coefficient is recovered by taking the inner product with the corresponding basis function (divided by its squared length).
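A minimal NumPy sketch of this coefficient recovery; the particular $Q$ here is an arbitrary assumed example:

```python
import numpy as np

# An assumed example of an orthogonal matrix: its columns are orthonormal.
Q = np.array([[1.0,  1.0],
              [1.0, -1.0]]) / np.sqrt(2)

c = np.array([3.0, -2.0])   # the "Fourier coefficients"
v = Q @ c                   # expand v in the orthonormal basis

# Recover each coefficient by an inner product with a basis vector,
# just as a_n is recovered by integrating f(t) against cos(nt).
print(Q.T @ v)              # [ 3., -2.]
```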

Markov Matrices; Fourier Series - June 10, 2019 - Ruizhen Mai