Elimination with Matrices

May 20, 2019

Elimination

A pivot in a matrix is the first non-zero element in each row. Later we will have pivot rows, pivot columns, and row echelon form, but let's just assume this simple definition for now. We want the column position of each pivot to equal its row position, i.e., we want the pivot of row 1 in column 1, the pivot of row 2 in column 2, etc. If we cannot find a pivot for each row, that is not a good sign.
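For example (my own illustration), the matrix

$$\begin{bmatrix}\mathbf{1}&2&1\\0&\mathbf{2}&-2\\0&0&\mathbf{5}\end{bmatrix}$$

is already in the ideal form: the bold entries are the pivots, and the pivot of row $i$ sits in column $i$.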

So our task is to do operations between or among rows to achieve the ideal situation where the pivots are in their positions.

Example:

We want to solve $Ax = b$:

$$\begin{bmatrix}1&2&1\\3&8&1\\0&4&1\end{bmatrix}\begin{bmatrix}x\\y\\z\end{bmatrix}=\begin{bmatrix}2\\12\\2\end{bmatrix}$$

Let's care about the matrix $A$ for now. I have labeled each row as $r_1, r_2, r_3$. As the first number in $r_1$ is non-zero, the pivot for this row is found automatically. Then we find the pivot of the second row. To do so, we do $r_2 - 3r_1$, and this results in

$$\begin{bmatrix}1&2&1\\3&8&1\\0&4&1\end{bmatrix}\xrightarrow{\,r_2 - 3r_1\,}\begin{bmatrix}1&2&1\\0&2&-2\\0&4&1\end{bmatrix}$$

Then we have found the second pivot, $2$, by eliminating the $3$ in $r_2$. Note that the first element in $r_3$ is zero, so we don't have to do extra steps to eliminate it. If it were non-zero, we would still have to make it zero in order to find the third pivot. Right now, to find the third pivot, we just need to do $r_3 - 2r_2'$,

$$\begin{bmatrix}1&2&1\\0&2&-2\\0&4&1\end{bmatrix}\xrightarrow{\,r_3 - 2r_2'\,}\begin{bmatrix}1&2&1\\0&2&-2\\0&0&5\end{bmatrix}$$

where I coined the new $r_2$ as $r_2'$. Then finally we have found the final upper triangular matrix $U$. This is the ideal form we want to achieve. But don't forget that we also need to do the same operations on $b$, which leads to $(2, 6, -10)$, and this final constant vector we call $c$. The new equation is then $Ux = c$. In general, we want to augment the coefficient matrix in case we forget the steps to obtain $c$, that is, to attach $b$ to it:

$$[\,A \mid b\,] = \left[\begin{array}{ccc|c}1&2&1&2\\3&8&1&12\\0&4&1&2\end{array}\right]$$
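As a sanity check, here is a minimal NumPy sketch of this forward elimination on the augmented matrix $[A \mid b]$ (the helper name `eliminate` is my own, and it assumes every pivot it meets is non-zero, so no row exchanges are needed):

```python
import numpy as np

def eliminate(aug):
    """Forward elimination on an augmented matrix [A | b].

    Subtracts multiples of each pivot row from the rows below it.
    Assumes every pivot is non-zero (no row exchanges needed).
    """
    aug = aug.astype(float)
    n = aug.shape[0]
    for i in range(n):                     # pivot row i
        for j in range(i + 1, n):          # rows below the pivot
            multiplier = aug[j, i] / aug[i, i]
            aug[j] -= multiplier * aug[i]  # e.g. r2 <- r2 - 3*r1
    return aug

A = np.array([[1, 2, 1],
              [3, 8, 1],
              [0, 4, 1]])
b = np.array([[2], [12], [2]])

aug = eliminate(np.hstack([A, b]))
U, c = aug[:, :3], aug[:, 3]
print(U)  # [[ 1.  2.  1.]  [ 0.  2. -2.]  [ 0.  0.  5.]]
print(c)  # [  2.   6. -10.]
```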

Back Substitution

Right now our equations are

$$\begin{aligned} x + 2y + z &= 2 \\ 2y - 2z &= 6 \\ 5z &= -10 \end{aligned}$$

First we see $z = -2$ from the last row. Substituting into the second row we get $y = 1$. And finally $x = 2$.
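A minimal sketch of this in NumPy (the helper name `back_substitute` is my own):

```python
import numpy as np

def back_substitute(U, c):
    """Solve Ux = c for an upper triangular U, from the last row up."""
    n = len(c)
    x = np.zeros(n)
    for i in range(n - 1, -1, -1):
        # subtract the already-solved unknowns, then divide by the pivot
        x[i] = (c[i] - U[i, i + 1:] @ x[i + 1:]) / U[i, i]
    return x

U = np.array([[1., 2.,  1.],
              [0., 2., -2.],
              [0., 0.,  5.]])
c = np.array([2., 6., -10.])
print(back_substitute(U, c))  # [ 2.  1. -2.], i.e. x = 2, y = 1, z = -2
```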

Matrix Operations

Before we start, note that multiplying any matrix by a constant $c$ is equal to multiplying it by a diagonal matrix (one with non-zero values only on the diagonal) with the constant lying on the diagonal:

$$cA = (cI)A, \qquad cI = \begin{bmatrix}c&0&0\\0&c&0\\0&0&c\end{bmatrix}$$

where $I$ is the identity matrix. This is also true when we put $cI$ on the right side of the matrix: $A(cI) = cA$.
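A quick NumPy check of this identity (my own illustration):

```python
import numpy as np

A = np.array([[1, 2, 1],
              [3, 8, 1],
              [0, 4, 1]])
c = 3
cI = c * np.eye(3)                 # 3s on the diagonal, 0s elsewhere
assert np.allclose(cI @ A, c * A)  # multiplying on the left...
assert np.allclose(A @ cI, c * A)  # ...and on the right
```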

Column

This was mentioned in the preliminaries. When multiplying a vector on the right of a matrix, we are manipulating its columns: the product is a combination of the columns, weighted by the entries of the vector:

$$\begin{bmatrix}a_{11}&a_{12}&a_{13}\\a_{21}&a_{22}&a_{23}\\a_{31}&a_{32}&a_{33}\end{bmatrix}\begin{bmatrix}x_1\\x_2\\x_3\end{bmatrix}=x_1\begin{bmatrix}a_{11}\\a_{21}\\a_{31}\end{bmatrix}+x_2\begin{bmatrix}a_{12}\\a_{22}\\a_{32}\end{bmatrix}+x_3\begin{bmatrix}a_{13}\\a_{23}\\a_{33}\end{bmatrix}$$

Note each column is $3 \times 1$, so the final answer is a vector of size $3 \times 1$. If the right factor is a matrix $B$, not a vector, then each column of $B$ controls the corresponding column of the product: column $j$ of $AB$ is $A$ times column $j$ of $B$.
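A small NumPy check of this column picture (my own illustration; `B` is an arbitrary matrix I made up):

```python
import numpy as np

A = np.array([[1, 2, 1],
              [3, 8, 1],
              [0, 4, 1]])
x = np.array([2, 1, -2])

# A @ x is a combination of A's columns, weighted by the entries of x
combo = x[0] * A[:, 0] + x[1] * A[:, 1] + x[2] * A[:, 2]
assert np.allclose(A @ x, combo)

# With a matrix B on the right: column j of A @ B is A times column j of B
B = np.array([[1, 0, 2],
              [0, 1, 1],
              [1, 1, 0]])
for j in range(3):
    assert np.allclose((A @ B)[:, j], A @ B[:, j])
```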

Row

Row operations are similar: multiplying a row vector on the left of a matrix gives a combination of the matrix's rows:

$$\begin{bmatrix}y_1&y_2&y_3\end{bmatrix}\begin{bmatrix}\;r_1\;\\\;r_2\;\\\;r_3\;\end{bmatrix}=y_1 r_1 + y_2 r_2 + y_3 r_3$$

Note the output vector is $1 \times 3$.
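A matching NumPy check for the row picture (my own illustration):

```python
import numpy as np

A = np.array([[1, 2, 1],
              [3, 8, 1],
              [0, 4, 1]])
y = np.array([0, 1, -2])

# y @ A is a combination of A's rows, weighted by the entries of y
combo = y[0] * A[0] + y[1] * A[1] + y[2] * A[2]
assert np.allclose(y @ A, combo)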

Thus, if we want to reproduce the elimination step $r_2 - 3r_1$ on $A$, we can do

$$\begin{bmatrix}1&0&0\\-3&1&0\\0&0&1\end{bmatrix}\begin{bmatrix}1&2&1\\3&8&1\\0&4&1\end{bmatrix}=\begin{bmatrix}1&2&1\\0&2&-2\\0&4&1\end{bmatrix}$$

The second row of the left matrix produces the same effect as the row operation $r_2 - 3r_1$ on the original matrix. Let's call this matrix $E_{21}$ because it targets the row-2, column-1 element (though I prefer another name for it). Learning from this, we can add another matrix $E_{32}$ to simulate the second elimination step $r_3 - 2r_2'$:

$$E_{32} = \begin{bmatrix}1&0&0\\0&1&0\\0&-2&1\end{bmatrix}, \qquad E_{32}(E_{21}A) = U$$
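Checking both elimination matrices in NumPy (a sketch based on the example above):

```python
import numpy as np

A = np.array([[1, 2, 1],
              [3, 8, 1],
              [0, 4, 1]])

E21 = np.array([[ 1, 0, 0],
                [-3, 1, 0],    # row 2 <- row 2 - 3 * row 1
                [ 0, 0, 1]])
E32 = np.array([[1,  0, 0],
                [0,  1, 0],
                [0, -2, 1]])   # row 3 <- row 3 - 2 * (new) row 2

print(E32 @ (E21 @ A))  # U: [[1 2 1] [0 2 -2] [0 0 5]]
```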

Permutation
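If a pivot position holds a zero, we need to exchange rows before eliminating; the matrix that does a row exchange is a permutation matrix. For example, multiplying on the left by $P$ swaps the two rows:

$$PA = \begin{bmatrix}0&1\\1&0\end{bmatrix}\begin{bmatrix}a&b\\c&d\end{bmatrix}=\begin{bmatrix}c&d\\a&b\end{bmatrix}$$

Multiplying by $P$ on the right would swap the columns instead.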

Inverse Preview

In summary, to express the whole sequence of operations, it is

$$E_{32}(E_{21}A) = U$$

By the associative law,

$$(E_{32}E_{21})A = U$$

But notice that we cannot mess around with the order of the multiplication when multiplying matrices: we may regroup the factors, not reorder them. So to find the elimination matrix $E$ which does the whole job of converting $A$ to $U$, there are two ways: we can multiply $E_{32}$ with $E_{21}$ to get $E = E_{32}E_{21}$, or, the way the professor introduces, use inverses. We will introduce the simple concept here.
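A quick NumPy check that the grouping doesn't matter while the order does (my own sketch):

```python
import numpy as np

A   = np.array([[1, 2, 1], [3, 8, 1], [0, 4, 1]])
E21 = np.array([[1, 0, 0], [-3, 1, 0], [0, 0, 1]])
E32 = np.array([[1, 0, 0], [0, 1, 0], [0, -2, 1]])

E = E32 @ E21  # one matrix that does both elimination steps
assert np.allclose(E @ A, E32 @ (E21 @ A))  # associativity: regrouping is free

# ...but the order is not: swapping the factors gives a different matrix
print(E)          # [[ 1  0  0]  [-3  1  0]  [ 6 -2  1]]
print(E21 @ E32)  # [[ 1  0  0]  [-3  1  0]  [ 0 -2  1]]
```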

To find the inverse of a matrix, we find the matrix that can offset its effect. Let's say we want to find the inverse of

$$E_{21} = \begin{bmatrix}1&0&0\\-3&1&0\\0&0&1\end{bmatrix}$$

We offset the effect of subtracting three times row 1 from row 2 by adding three times row 1 back:

$$E_{21}^{-1} = \begin{bmatrix}1&0&0\\3&1&0\\0&0&1\end{bmatrix}, \qquad E_{21}^{-1}E_{21} = I$$
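And a NumPy check that this matrix really offsets $E_{21}$ (my own sketch):

```python
import numpy as np

E21     = np.array([[1, 0, 0], [-3, 1, 0], [0, 0, 1]])
E21_inv = np.array([[1, 0, 0], [ 3, 1, 0], [0, 0, 1]])  # adds 3*row1 back

assert np.allclose(E21_inv @ E21, np.eye(3))  # the effects cancel
assert np.allclose(E21 @ E21_inv, np.eye(3))
```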

Elimination with Matrices - May 20, 2019 - Ruizhen Mai