Theory of General Second-Order Linear Homogeneous ODEs
June 8, 2019
(Book 2.1)
Recall that a second-order homogeneous linear ODE has the form
$$y'' + p(x)y' + q(x)y = 0. \tag{1}$$
Superposition principle
Let $y_1$ and $y_2$ be solutions to the homogeneous ODE (1), and let $c_1$ and $c_2$ be constants. Then
$$y = c_1 y_1 + c_2 y_2$$
is also a solution to the ODE.
Proof (since this is simple): If $y = c_1 y_1 + c_2 y_2$, then
$$y' = c_1 y_1' + c_2 y_2', \qquad y'' = c_1 y_1'' + c_2 y_2''.$$
And:
$$y'' + p y' + q y = c_1 \left( y_1'' + p y_1' + q y_1 \right) + c_2 \left( y_2'' + p y_2' + q y_2 \right) = c_1 \cdot 0 + c_2 \cdot 0 = 0,$$
because $y_1$ and $y_2$ are solutions, so substituting each into (1) gives 0. Thus $y$ also results in zero, and it is a solution.
What this implies is that the homogeneous equation has a “twofold infinity” of solutions: one for each choice of the constants $c_1$ and $c_2$.
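The superposition principle can be checked numerically. This sketch uses the equation $y'' + y = 0$ with solutions $\cos t$ and $\sin t$ (a sample equation chosen for illustration, not taken from the notes) and verifies that a linear combination still satisfies the ODE:

```python
import math

# Sample equation y'' + y = 0 has solutions y1 = cos(t), y2 = sin(t).
# Check that y = c1*y1 + c2*y2 also satisfies it, using the exact
# second derivatives of cos and sin.
c1, c2 = 3.0, -2.0   # arbitrary constants

def y(t):
    return c1 * math.cos(t) + c2 * math.sin(t)

def y_dd(t):
    # (c1*cos + c2*sin)'' = -c1*cos - c2*sin
    return -c1 * math.cos(t) - c2 * math.sin(t)

# Residual of y'' + y at several sample points: should vanish.
residuals = [abs(y_dd(t) + y(t)) for t in (0.0, 0.5, 1.0, 2.0)]
print(max(residuals))  # ~0 up to floating-point error
```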
Existence and Uniqueness for Linear Equations
Let the second-order inhomogeneous linear ODE be
$$y'' + p(x)y' + q(x)y = f(x), \tag{2}$$
where $p$, $q$, and $f$ are continuous on an open interval $I$ containing the point $a$. Then this equation has one and only one solution on $I$ satisfying the initial conditions
$$y(a) = b_0, \qquad y'(a) = b_1.$$
Eq. (1) is also called the associated homogeneous equation of eq. (2). The proof is hard, so I will leave it out here.
Def. Linearly Independent
Two functions defined on an open interval are said to be linearly independent if neither is a constant multiple of the other.
Wronskians
The Wronskian of $f$ and $g$ is
$$W = \begin{vmatrix} f & g \\ f' & g' \end{vmatrix} = f g' - f' g.$$
We can write either $W(f, g)$ or $W(x)$; maybe I will write $W(x)$ because $f$ and $g$ are functions of $x$. But if $f$ and $g$ are linearly dependent, that is, $g = k f$ for some constant $k$, then
$$W = f (k f)' - f' (k f) = k \left( f f' - f' f \right) = 0.$$
This leads to: if $y_1$ and $y_2$ are solutions to (1) on an open interval $I$ where $p$ and $q$ are continuous, then
- If $y_1$ and $y_2$ are linearly dependent, then $W(y_1, y_2) = 0$ at every point of $I$.
- If $y_1$ and $y_2$ are linearly independent, then $W(y_1, y_2) \neq 0$ at every point of $I$.
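The dependence test can be illustrated directly from the definition $W = f g' - f' g$. The function pairs below ($\cos, \sin$ as an independent pair, $e^t, 3e^t$ as a dependent pair) are sample choices for illustration:

```python
import math

def wronskian(f, fp, g, gp, t):
    # W(f, g)(t) = f(t) g'(t) - f'(t) g(t)
    return f(t) * gp(t) - fp(t) * g(t)

# Independent pair: cos and sin -> W = cos^2 + sin^2 = 1 (never zero).
w_indep = wronskian(math.cos, lambda t: -math.sin(t),
                    math.sin, math.cos, 0.7)

# Dependent pair: e^t and 3 e^t -> W = 3 e^{2t} - 3 e^{2t} = 0.
w_dep = wronskian(math.exp, math.exp,
                  lambda t: 3 * math.exp(t), lambda t: 3 * math.exp(t), 0.7)

print(w_indep, w_dep)  # approximately 1.0 and 0.0
```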
This leads to the theorem for general solutions of homogeneous equations.
General Solutions of Homogeneous Equations
Let $y_1$ and $y_2$ be two linearly independent solutions to (1). Then if $Y$ is any solution whatsoever of (1), there must exist constants $c_1$ and $c_2$ s.t.
$$Y = c_1 y_1 + c_2 y_2.$$
This is saying that if we can find two linearly independent solutions to the second-order homogeneous equation (1), then we have found all of its solutions.
Proof: Pick any point $a$ in the interval and set up the following simultaneous equations:
$$c_1 y_1(a) + c_2 y_2(a) = Y(a), \qquad c_1 y_1'(a) + c_2 y_2'(a) = Y'(a),$$
where $c_1$ and $c_2$ are the unknowns. Because $y_1$ and $y_2$ are independent, the Wronskian $W(a) \neq 0$, so this system has a solution $c_1, c_2$. Let such $c_1, c_2$ be the coefficients of the linear combination
$$G(x) = c_1 y_1(x) + c_2 y_2(x).$$
We have the following, when $G$ is evaluated at $a$:
$$G(a) = Y(a), \qquad G'(a) = Y'(a).$$
By the Existence and Uniqueness theorem, there is one and only one solution to (1) s.t.
$$y(a) = Y(a), \qquad y'(a) = Y'(a).$$
Therefore, since $G$ satisfies (1) and these conditions, and so does $Y$, $G$ and $Y$ are the same function.
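The $2 \times 2$ system in the proof can be solved explicitly by Cramer's rule, dividing by the Wronskian at $a$. This sketch uses the assumed example $y'' + y = 0$ with $y_1 = \cos t$, $y_2 = \sin t$, $a = 0$, and made-up initial data $Y(0) = 5$, $Y'(0) = -4$:

```python
import math

# Solve the 2x2 system from the proof at a = 0 for y'' + y = 0,
# with y1 = cos t, y2 = sin t, and target initial data Y(a), Y'(a).
a = 0.0
y1, y1p = math.cos(a), -math.sin(a)   # y1(a), y1'(a)
y2, y2p = math.sin(a), math.cos(a)    # y2(a), y2'(a)
Y0, Y0p = 5.0, -4.0                   # assumed initial data Y(a), Y'(a)

W = y1 * y2p - y1p * y2               # Wronskian at a; nonzero by independence
c1 = (Y0 * y2p - Y0p * y2) / W        # Cramer's rule
c2 = (y1 * Y0p - y1p * Y0) / W

print(c1, c2)  # 5.0 -4.0, i.e. Y(t) = 5 cos t - 4 sin t
```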
All these theorems can be generalized to higher-order linear differential equations.
Polynomial Differential Operator
Let $D$ denote the operation of differentiation with respect to $t$, so that
$$Dy = \frac{dy}{dt}, \qquad D^2 y = \frac{d^2 y}{dt^2},$$
and so on. In terms of $D$, we can define another operator $L$ s.t.
$$L = D^2 + aD + b, \qquad \text{i.e.,} \quad Ly = y'' + a y' + b y.$$
It's probably fine to just write $L$, because generally we won't have things other than $y$ to put into it ($L$ stands for linear, which I will explain a bit later), and we will restrict our discussion to second-order linear ODEs with constant coefficients. The benefit of differential operators is that we can carry the factorization of the characteristic equation back into the original DE. Say we have
$$y'' + a y' + b y = 0;$$
its characteristic equation will be
$$r^2 + a r + b = (r - r_1)(r - r_2) = 0.$$
The corresponding $L$ is
$$L = (D - r_1)(D - r_2).$$
Also note that $L$ is a linear operator; this is saying
$$L(c_1 y_1 + c_2 y_2) = c_1 L y_1 + c_2 L y_2.$$
This is obvious because $D$ is linear, s.t. $D(u + v) = Du + Dv$, $D(cu) = c\,Du$, etc. One useful property is that constant-coefficient factors commute:
$$(D - r_1)(D - r_2) = (D - r_2)(D - r_1).$$
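The commuting property can be checked exactly by representing a polynomial as a coefficient list and implementing $(D - r)$ on it; applying the two factors in either order gives the same result. The representation and test polynomial below are illustrative choices:

```python
# Represent a polynomial by its coefficient list [a0, a1, a2, ...]
# meaning a0 + a1*t + a2*t^2 + ..., and implement D and (D - r) on it.

def deriv(p):
    # D: differentiate a coefficient list.
    return [i * p[i] for i in range(1, len(p))] or [0]

def shift(p, r):
    # (D - r)p = p' - r*p, padded to matching length.
    dp = deriv(p)
    dp = dp + [0] * (len(p) - len(dp))
    return [dp[i] - r * p[i] for i in range(len(p))]

p = [1, -2, 0, 4]                      # 1 - 2t + 4t^3 (arbitrary test poly)
lhs = shift(shift(p, 2), 5)            # (D-5)(D-2)p
rhs = shift(shift(p, 5), 2)            # (D-2)(D-5)p
print(lhs == rhs)  # True: constant-coefficient factors commute
```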
Repeated Real Roots
So we would like to use the differential operator to handle repeated roots. Let's consider an arbitrary $n$th-order linear ODE with constant coefficients:
$$a_n y^{(n)} + a_{n-1} y^{(n-1)} + \cdots + a_1 y' + a_0 y = 0.$$
Then it has a characteristic equation:
$$a_n r^n + a_{n-1} r^{n-1} + \cdots + a_1 r + a_0 = 0.$$
Suppose the characteristic equation can be factored into
$$(r - r_1)^k (r - r_2)^{n-k} = 0,$$
where $r_1 \neq r_2$. Solving this equation will only give us two distinct roots $r_1$ and $r_2$, but we need $n$ total independent solutions. Similar to the factored characteristic equation, $L$ can also be factored:
$$Ly = (D - r_1)^k (D - r_2)^{n-k} y = 0.$$
Therefore our problem reduces to finding the solutions of the $k$th-order equation:
$$(D - r_1)^k u = 0. \tag{3}$$
The fact that $e^{r_1 t}$ satisfies $(D - r_1)u = 0$ suggests that we can try substituting $u = v(t)\, e^{r_1 t}$ into equation (3). Before that, observe
$$(D - r_1)\left(v e^{r_1 t}\right) = (Dv)\, e^{r_1 t} + r_1 v e^{r_1 t} - r_1 v e^{r_1 t} = (Dv)\, e^{r_1 t}.$$
Therefore, applying the factor $k$ times,
$$(D - r_1)^k \left(v e^{r_1 t}\right) = (D^k v)\, e^{r_1 t}.$$
Since $e^{r_1 t} \neq 0$, the only way to satisfy $(D - r_1)^k (v e^{r_1 t}) = 0$ is $D^k v = 0$, i.e., the $k$th derivative of $v$ vanishes. This happens exactly when $v$ is a polynomial of degree at most $k - 1$:
$$v(t) = c_1 + c_2 t + \cdots + c_k t^{k-1}.$$
Note we don't need to have $t$'s exponent go up to $k - 1$, but since we need $k$ independent solutions, this is the best we can have. Therefore, a root $r_1$ of multiplicity $k$ contributes the $k$ independent solutions
$$e^{r_1 t},\; t e^{r_1 t},\; \ldots,\; t^{k-1} e^{r_1 t}.$$
When we have two repeated roots for a second-order linear ODE, $r_1 = r_2 = r$, i.e., $(D - r)^2 y = 0$:
The solution is
$$y(t) = (c_1 + c_2 t)\, e^{r t}.$$
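The repeated-root solution can be sanity-checked against the expanded equation $y'' - 2ry' + r^2 y = 0$, whose characteristic polynomial is $(s - r)^2$. The values of $r$, $c_1$, $c_2$ below are arbitrary samples; the derivatives of $(c_1 + c_2 t)e^{rt}$ are written out exactly:

```python
import math

# Verify that y = (c1 + c2*t) e^{rt} solves y'' - 2r y' + r^2 y = 0,
# using the exact derivatives of (c1 + c2*t) e^{rt}.
r, c1, c2 = 3.0, 1.5, -0.5   # arbitrary sample values

def y(t):
    return (c1 + c2 * t) * math.exp(r * t)

def yp(t):
    # product rule: (c2 + r*(c1 + c2*t)) e^{rt}
    return (c2 + r * (c1 + c2 * t)) * math.exp(r * t)

def ypp(t):
    # differentiate yp once more: (2*r*c2 + r^2*(c1 + c2*t)) e^{rt}
    return (2 * r * c2 + r * r * (c1 + c2 * t)) * math.exp(r * t)

residuals = [abs(ypp(t) - 2 * r * yp(t) + r * r * y(t)) for t in (0.0, 0.3, 1.0)]
print(max(residuals))  # ~0 up to floating-point error
```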