Objectives
To understand that the existence and uniqueness of solutions of differential equations have important implications.
If \(x' = f(t, x)\) and \(x(t_0) = x_0\) is a linear differential equation, we have already shown that a solution exists and is unique. We will now take up the question of existence and uniqueness of solutions for all first-order differential equations. The existence and uniqueness of solutions will prove to be very important even when we consider applications of differential equations.
The following theorem tells us that solutions to first-order differential equations exist and are unique under certain reasonable conditions.
Let \(x' = f(t, x)\) have the initial condition \(x(t_0) = x_0\text{.}\) If \(f\) and \(\partial f/ \partial x\) are continuous functions on the rectangle
\begin{equation*}
R = \{ (t, x) : |t - t_0| \lt a, |x - x_0| \lt b \},
\end{equation*}
there exists a unique solution \(u = u(t)\) for \(x' = f(t, x)\) and \(x(t_0) = x_0\) on some interval \(|t - t_0| \lt h\) contained in the interval \(|t - t_0| \lt a\text{.}\)
Let us examine some consequences of the existence and uniqueness of solutions.
Consider the initial value problem
\begin{equation*}
x' = \frac{\sin(tx)}{x^2 + t^2}, \qquad x(0) = 1.
\end{equation*}
In this case \(f(t,x) = \sin(tx)/(x^2 + t^2)\) is continuous at \((0,1)\) as is
\begin{equation*}
\frac{\partial f}{\partial x} = \frac{t \cos(tx)}{x^2 + t^2} - \frac{2x \sin(tx)}{(x^2 + t^2)^2}.
\end{equation*}
Therefore, a solution to the initial value problem must exist. However, finding such a solution in terms of elementary functions may be quite difficult if not impossible.
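Although a closed form may be out of reach, the guaranteed solution can still be approximated numerically. The following sketch (not part of the text; a standard fourth-order Runge–Kutta scheme with an arbitrarily chosen step count) estimates \(x(0.5)\) for this initial value problem:

```python
import math

def f(t, x):
    # Right-hand side of the IVP x' = sin(tx)/(x^2 + t^2), x(0) = 1.
    return math.sin(t * x) / (x**2 + t**2)

def rk4(f, t0, x0, t_end, n=1000):
    """Classical fourth-order Runge-Kutta approximation of x(t_end)."""
    h = (t_end - t0) / n
    t, x = t0, x0
    for _ in range(n):
        k1 = f(t, x)
        k2 = f(t + h / 2, x + h * k1 / 2)
        k3 = f(t + h / 2, x + h * k2 / 2)
        k4 = f(t + h, x + h * k3)
        x += h * (k1 + 2 * k2 + 2 * k3 + k4) / 6
        t += h
    return x

x_half = rk4(f, 0.0, 1.0, 0.5)
print(x_half)
```

Since \(f(0, 1) = 0\) and \(|f|\) stays small near \((0, 1)\), the computed value remains close to the initial value \(1\), as the theorem's local guarantee suggests.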
Consider the initial value problem \(y' = y^{1/3}\) with \(y(0) = 0\) and \(t \geq 0\text{.}\) Separating the variables,
\begin{equation*}
y^{-1/3} \, dy = dt.
\end{equation*}
Thus,
\begin{equation*}
\frac{3}{2} y^{2/3} = t + C
\end{equation*}
or
\begin{equation*}
y = \left( \frac{2}{3} (t + C) \right)^{3/2}.
\end{equation*}
If \(C = 0\text{,}\) the initial condition is satisfied and
\begin{equation*}
y = \left( \frac{2t}{3} \right)^{3/2}
\end{equation*}
is a solution for \(t \geq 0\text{.}\) However, we can find two additional solutions for \(t \geq 0\text{:}\)
\begin{equation*}
y(t) \equiv 0 \qquad \text{and} \qquad y = -\left( \frac{2t}{3} \right)^{3/2}.
\end{equation*}
This is especially troubling if we are looking for equilibrium solutions. Although \(y' = y^{1/3}\) is an autonomous differential equation with equilibrium solution \(y \equiv 0\text{,}\) solutions through \(y = 0\) are not unique. The problem is that
\begin{equation*}
\frac{\partial}{\partial y}\, y^{1/3} = \frac{1}{3 y^{2/3}}
\end{equation*}
is not defined at \(y = 0\text{.}\)
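We can corroborate the nonuniqueness numerically. The short check below (an illustration, not part of the text) verifies that both \(y(t) \equiv 0\) and \(y = (2t/3)^{3/2}\) satisfy \(y' = y^{1/3}\) with \(y(0) = 0\text{:}\)

```python
# Two distinct solutions of y' = y^(1/3), y(0) = 0, checked pointwise.

def y2(t):
    return (2 * t / 3) ** 1.5        # y = (2t/3)^(3/2)

def y2_prime(t):
    return (2 * t / 3) ** 0.5        # derivative of (2t/3)^(3/2)

for t in [0.0, 0.5, 1.0, 2.0]:
    # y = 0: its derivative 0 equals 0^(1/3) = 0, so it solves the ODE.
    assert abs(0.0 - 0.0 ** (1 / 3)) < 1e-12
    # y = (2t/3)^(3/2): derivative equals y^(1/3) at every sample point.
    assert abs(y2_prime(t) - y2(t) ** (1 / 3)) < 1e-9

print("both functions satisfy the ODE and y(0) = 0")
```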
Suppose that \(y' = y^2\) with \(y(0) = 1\text{.}\) Since \(f(t,y) = y^2\) and \(\partial f/ \partial y = 2y\) are continuous everywhere, a unique solution exists near \(t = 0\text{.}\) Separating the variables,
\begin{equation*}
\frac{dy}{y^2} = dt,
\end{equation*}
we see that
\begin{equation*}
-\frac{1}{y} = t + C
\end{equation*}
or
\begin{equation*}
y = -\frac{1}{t + C}.
\end{equation*}
If \(y(0) = 1\text{,}\) then \(C = -1\) and the solution is
\begin{equation*}
y = \frac{1}{1 - t},
\end{equation*}
so a solution exists on \((-\infty, 1)\text{.}\) In the case that \(y(0) = -1\text{,}\) the solution is
\begin{equation*}
y = -\frac{1}{1 + t},
\end{equation*}
and a solution exists on \((-1, \infty)\text{.}\) Solutions are only guaranteed to exist on an open interval containing the initial value and are very dependent on the initial condition.
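A quick numeric illustration (not in the original text) of the finite interval of existence: the solution \(y = 1/(1-t)\) for \(y(0) = 1\) satisfies the equation yet grows without bound as \(t \to 1^{-}\text{:}\)

```python
# The solution y = 1/(1 - t) of y' = y^2, y(0) = 1 blows up as t -> 1^-.

def y(t):
    return 1.0 / (1.0 - t)

# The closed form satisfies the equation: y'(t) = 1/(1 - t)^2 = y(t)^2.
for t in [0.0, 0.5, 0.9]:
    derivative = 1.0 / (1.0 - t) ** 2     # derivative of 1/(1 - t)
    assert abs(derivative - y(t) ** 2) < 1e-9

# The solution grows without bound as t approaches 1 from the left.
for t in [0.9, 0.99, 0.999]:
    print(t, y(t))
```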
The Existence and Uniqueness Theorem tells us that the integral curves of any differential equation satisfying the appropriate hypotheses cannot cross. If the curves did cross, we could take the point of intersection as the initial value for the differential equation. In this case, we would no longer be guaranteed a unique solution to the differential equation.
Which of the following initial value problems are guaranteed to have a unique solution by the Existence and Uniqueness Theorem (Theorem 1.6.1)? In each case, justify your conclusion.
\(y' = 4 + y^3\text{,}\) \(y(0) = 1\)
\(y' = \sqrt{y}\text{,}\) \(y(1) = 0\)
\(y' = \sqrt{y}\text{,}\) \(y(1) = 1\)
\(x' = \dfrac{t}{x-2}\text{,}\) \(x(0) = 2\)
\(x' = \dfrac{t}{x-2}\text{,}\) \(x(2) = 0\)
\(y' = x \tan y\text{,}\) \(y(0) = 0\)
\(y' = \dfrac{1}{t} y + 2t\text{,}\) \(y(0) = 1\)
It was Émile Picard (1856–1941) who developed the method of successive approximations to show the existence of solutions of ordinary differential equations. He proved that it is possible to construct a sequence of functions that converges to a solution of the differential equation. One of the first steps towards understanding Picard iteration is to realize that an initial value problem can be recast in terms of an integral equation.
The function \(u = u(t)\) is a solution to the initial value problem
\begin{equation*}
x' = f(t, x), \qquad x(t_0) = x_0,
\end{equation*}
if and only if \(u\) is a solution to the integral equation
\begin{equation*}
x(t) = x_0 + \int_{t_0}^t f(s, x(s)) \, ds.
\end{equation*}
Suppose that \(u = u(t)\) is a solution to
\begin{equation*}
x' = f(t, x), \qquad x(t_0) = x_0,
\end{equation*}
on some interval \(I\) containing \(t_0\text{.}\) Since \(u\) is continuous on \(I\) and \(f\) is continuous on \(R\text{,}\) the function \(F(t) = f(t, u(t))\) is also continuous on \(I\text{.}\) Integrating both sides of \(u'(t) = f(t, u(t))\) and applying the Fundamental Theorem of Calculus, we obtain
\begin{equation*}
u(t) - u(t_0) = \int_{t_0}^t f(s, u(s)) \, ds.
\end{equation*}
Since \(u(t_0) = x_0\text{,}\) the function \(u\) is a solution of the integral equation.
Conversely, assume that
\begin{equation*}
u(t) = x_0 + \int_{t_0}^t f(s, u(s)) \, ds.
\end{equation*}
If we differentiate both sides of this equation, we obtain \(u'(t) = f(t, u(t))\text{.}\) Since
\begin{equation*}
u(t_0) = x_0 + \int_{t_0}^{t_0} f(s, u(s)) \, ds = x_0,
\end{equation*}
the initial condition is fulfilled.
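As a sanity check outside the proof, this equivalence can be tested numerically for a concrete case, say \(f(t, x) = x\) with \(x_0 = 1\) and solution \(u(t) = e^t\) (the example and function names below are illustrative, not from the text):

```python
import math

# Check numerically that u(t) = e^t satisfies the integral equation
#   u(t) = x0 + integral from 0 to t of f(s, u(s)) ds,  f(t, x) = x, x0 = 1,
# which is equivalent to the IVP x' = x, x(0) = 1.

def trapezoid(g, a, b, n=10000):
    """Composite trapezoid rule approximation of the integral of g on [a, b]."""
    h = (b - a) / n
    total = 0.5 * (g(a) + g(b))
    for k in range(1, n):
        total += g(a + k * h)
    return total * h

t = 1.5
lhs = math.exp(t)                          # u(t)
rhs = 1.0 + trapezoid(math.exp, 0.0, t)    # x0 + integral of f(s, u(s)) ds
print(lhs, rhs)
assert abs(lhs - rhs) < 1e-6
```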
To show the existence of a solution to the initial value problem
\begin{equation*}
x' = f(t, x), \qquad x(t_0) = x_0,
\end{equation*}
we will construct a sequence of functions, \(\{ u_n(t) \}\text{,}\) that will converge to a function \(u(t)\) that is a solution to the integral equation
\begin{equation*}
x(t) = x_0 + \int_{t_0}^t f(s, x(s)) \, ds.
\end{equation*}
We define the first function of the sequence using the initial condition,
\begin{equation*}
u_0(t) = x_0.
\end{equation*}
We derive the next function in our sequence using the right-hand side of the integral equation,
\begin{equation*}
u_1(t) = x_0 + \int_{t_0}^t f(s, u_0(s)) \, ds.
\end{equation*}
Subsequent terms in the sequence can be defined recursively,
\begin{equation*}
u_{n+1}(t) = x_0 + \int_{t_0}^t f(s, u_n(s)) \, ds.
\end{equation*}
Our goal is to show that \(u_n(t) \rightarrow u(t)\) as \(n \rightarrow \infty\text{.}\) Furthermore, we need to show that \(u\) is the continuous, unique solution to our initial value problem. We will leave the proof of Picard's Theorem to a series of exercises (Exercise Group 1.6.4.4–12), but let us see how this works by developing an example.
Consider the exponential growth equation,
\begin{equation*}
\frac{dx}{dt} = kx, \qquad x(0) = 1.
\end{equation*}
We already know that the solution is \(x(t) = e^{kt}\text{.}\) We define the first few terms of our sequence \(\{ u_n(t) \}\) as follows:
\begin{align*}
u_0(t) & = 1,\\
u_1(t) & = 1 + \int_0^t k u_0(s) \, ds = 1 + \int_0^t k \, ds = 1 + kt.
\end{align*}
The next term in the sequence is
\begin{equation*}
u_2(t) = 1 + \int_0^t k u_1(s) \, ds = 1 + \int_0^t k(1 + ks) \, ds = 1 + kt + \frac{(kt)^2}{2},
\end{equation*}
and the \(n\)th term is
\begin{align*}
u_n(t) & = 1 + \int_0^t k u_{n-1}(s) \, ds\\
& = 1 + kt + \frac{(kt)^2}{2} + \frac{(kt)^3}{3!} + \cdots + \frac{(kt)^n}{n!}.
\end{align*}
However, this is just the \(n\)th partial sum for the power series for \(u(t) = e^{kt}\text{,}\) which is what we expected.
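The iteration above can be carried out exactly by machine if we represent each \(u_n\) by its polynomial coefficients. The sketch below (an illustration, using exact rational arithmetic and \(k = 1\)) reproduces the Taylor partial sums of \(e^{kt}\text{:}\)

```python
from fractions import Fraction

# Picard iteration for x' = kx, x(0) = 1, on polynomial coefficient lists:
# poly[m] is the coefficient of t^m. With k = 1 the iterates are the
# partial sums of the Taylor series for e^t.

k = Fraction(1)

def picard_step(u):
    """u_{n+1}(t) = 1 + integral from 0 to t of k * u_n(s) ds."""
    # Integrating c * s^m term by term gives c * t^(m+1) / (m + 1).
    integral = [Fraction(0)] + [k * c / (m + 1) for m, c in enumerate(u)]
    integral[0] += 1           # add the initial value x0 = 1
    return integral

u = [Fraction(1)]              # u_0(t) = 1
for _ in range(4):
    u = picard_step(u)

# Coefficients of u_4: 1, 1, 1/2!, 1/3!, 1/4!.
print(u)
```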
Existence and uniqueness of solutions of differential equations have important implications. Let \(x' = f(t, x)\) have the initial condition \(x(t_0) = x_0\text{.}\) If \(f\) and \(\partial f/ \partial x\) are continuous functions on the rectangle
\begin{equation*}
R = \{ (t, x) : |t - t_0| \lt a, |x - x_0| \lt b \},
\end{equation*}
there exists a unique solution \(u = u(t)\) for \(x' = f(t, x)\) and \(x(t_0) = x_0\) on some interval \(|t - t_0| \lt h\) contained in the interval \(|t - t_0| \lt a\text{.}\) In particular,
Solutions are only guaranteed to exist locally.
Uniqueness is especially important when it comes to finding equilibrium solutions.
Uniqueness of solutions tells us that the integral curves for a differential equation cannot cross.
Explain Theorem 1.6.1 in your own words.
The initial value problem \(y' = \sqrt[5]{y}\) and \(y(0) = 0\) has two solutions, \(y(t) \equiv 0\) and \(y(t) = (4t/5)^{5/4}\text{.}\) Why does this not contradict Theorem 1.6.1?
Find an explicit solution to the initial value problem
Use your solution to determine the interval of existence.
Consider the initial value problem
Show that the constant function, \(y(t) \equiv 0\text{,}\) is a solution to the initial value problem.
Show that
is a solution for the initial value problem, where \(t_0\) is any real number. Hence, there exists an infinite number of solutions to the initial value problem.
Explain why this example does not contradict the Existence and Uniqueness Theorem.
(b) Make sure that the derivative of \(y(t)\) exists at \(t = t_0\text{.}\)
Consider the initial value problem
Use the fact that \(y' = 2ty + t\) is a first-order linear differential equation to find a solution to the initial value problem.
Let \(\phi_0(t) = 1\) and use Picard iteration to find \(\phi_n(t)\text{.}\)
Show that the sequence \(\{ \phi_n(t) \}\) converges to the exact solution that you found in part (a) as \(n \to \infty\text{.}\)
In Exercise Group 1.6.4.4–12, prove the Existence and Uniqueness Theorem for first-order differential equations.
Use the Fundamental Theorem of Calculus to show that the function \(u = u(t)\) is a solution to the initial value problem
\begin{equation*}
x' = f(t, x), \qquad x(t_0) = x_0,
\end{equation*}
if and only if \(u\) is a solution to the integral equation
\begin{equation*}
x(t) = x_0 + \int_{t_0}^t f(s, x(s)) \, ds.
\end{equation*}
If \(\partial f/ \partial x\) is continuous on the rectangle
\begin{equation*}
R = \{ (t, x) : |t - t_0| \lt a, |x - x_0| \lt b \},
\end{equation*}
prove that there exists a \(K \gt 0\) such that
\begin{equation*}
|f(t, x_1) - f(t, x_2)| \leq K |x_1 - x_2|
\end{equation*}
for all \((t, x_1)\) and \((t, x_2)\) in \(R\text{.}\)
Define the sequence \(\{ u_n \}\) by
\begin{align*}
u_0(t) & = x_0,\\
u_{n+1}(t) & = x_0 + \int_{t_0}^t f(s, u_n(s)) \, ds, \qquad n = 0, 1, 2, \ldots.
\end{align*}
Use the result of the previous exercise to show that
\begin{equation*}
|f(t, u_n(t)) - f(t, u_{n-1}(t))| \leq K |u_n(t) - u_{n-1}(t)|.
\end{equation*}
Show that there exists an \(M \gt 0\) such that
\begin{equation*}
|u_1(t) - u_0(t)| \leq M |t - t_0|.
\end{equation*}
Show that
\begin{equation*}
|u_2(t) - u_1(t)| \leq \frac{KM |t - t_0|^2}{2}.
\end{equation*}
Use mathematical induction to show that
\begin{equation*}
|u_n(t) - u_{n-1}(t)| \leq \frac{K^{n-1} M |t - t_0|^n}{n!}.
\end{equation*}
Since
\begin{equation*}
u_n(t) = u_0(t) + \sum_{k=1}^n \left[ u_k(t) - u_{k-1}(t) \right],
\end{equation*}
we can view \(u_n(t)\) as a partial sum for the series
\begin{equation*}
u_0(t) + \sum_{k=1}^\infty \left[ u_k(t) - u_{k-1}(t) \right].
\end{equation*}
If we can show that this series converges absolutely, then our sequence will converge to a function \(u(t)\text{.}\) Show that
\begin{equation*}
\sum_{k=1}^\infty |u_k(t) - u_{k-1}(t)| \leq \frac{M}{K} \sum_{k=1}^\infty \frac{(Kh)^k}{k!} = \frac{M}{K} \left( e^{Kh} - 1 \right),
\end{equation*}
where \(h\) is the maximum distance between \((t_0, x_0)\) and the boundary of the rectangle \(R\text{.}\) Since \(|u_n(t) - u_{n-1}(t)| \to 0\text{,}\) we know that \(u_n(t)\) converges to a continuous function \(u(t)\) that solves our equation. (We must use a theorem from advanced calculus here to ensure that the limit is continuous; see Exercise 1.6.4.13. Any sequence of continuous functions that converges uniformly must converge to a continuous function.)
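Assuming the comparison series has the standard form \(\sum_{k \geq 1} M K^{k-1} h^k / k!\) (an assumption on my part, matching the bounds in the preceding exercises), its sum can be checked numerically against \((M/K)(e^{Kh} - 1)\) for illustrative values of \(M\text{,}\) \(K\text{,}\) and \(h\text{:}\)

```python
import math

# Numeric check of the comparison series
#   sum over k >= 1 of M * K^(k-1) * h^k / k!  =  (M/K) * (e^(Kh) - 1),
# with illustrative values M = 2, K = 3, h = 0.5 (chosen arbitrarily).
M, K, h = 2.0, 3.0, 0.5

partial = 0.0
for k_idx in range(1, 30):   # 30 terms are plenty for Kh = 1.5
    partial += M * K ** (k_idx - 1) * h ** k_idx / math.factorial(k_idx)

closed_form = (M / K) * (math.exp(K * h) - 1.0)
print(partial, closed_form)
assert abs(partial - closed_form) < 1e-9
```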
To show uniqueness, assume that \(u(t)\) and \(v(t)\) are both solutions to
\begin{equation*}
x' = f(t, x), \qquad x(t_0) = x_0.
\end{equation*}
Show that
\begin{equation*}
|u(t) - v(t)| \leq K \int_{t_0}^t |u(s) - v(s)| \, ds.
\end{equation*}
Define (a similar argument will work for \(t \leq t_0\))
\begin{equation*}
\phi(t) = \int_{t_0}^t |u(s) - v(s)| \, ds;
\end{equation*}
then \(\phi(t_0) = 0\) and \(\phi(t) \geq 0\) for \(t \geq t_0\text{.}\) Show that
\begin{equation*}
\phi'(t) = |u(t) - v(t)|.
\end{equation*}
Since
\begin{equation*}
|u(t) - v(t)| \leq K \int_{t_0}^t |u(s) - v(s)| \, ds,
\end{equation*}
we know that
\begin{equation*}
\phi'(t) \leq K \phi(t).
\end{equation*}
Use this fact to show that
\begin{equation*}
\frac{d}{dt} \left( e^{-Kt} \phi(t) \right) \leq 0.
\end{equation*}
Conclude that
\begin{equation*}
\phi(t) = 0
\end{equation*}
for all \(t \geq t_0\text{,}\) and hence \(u(t) = v(t)\) for \(t \geq t_0\text{.}\)
Let \(\phi_n(x) = x^n\) for \(0 \leq x \leq 1\) and show that
\begin{equation*}
\lim_{n \to \infty} \phi_n(x) =
\begin{cases}
0, & 0 \leq x \lt 1 \\
1, & x = 1.
\end{cases}
\end{equation*}
This is an example of a sequence of continuous functions that does not converge to a continuous function, which helps explain the need for uniform convergence in the proof of the Existence and Uniqueness Theorem.
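The claimed pointwise limit is easy to observe numerically (an illustration, not required by the exercise):

```python
# phi_n(x) = x^n on [0, 1]: each phi_n is continuous, but the pointwise
# limit is 0 on [0, 1) and 1 at x = 1, a discontinuous function.

def phi(n, x):
    return x ** n

for n in [1, 10, 100, 1000]:
    print(n, phi(n, 0.9), phi(n, 1.0))

# For any fixed x < 1 the values die off, while phi_n(1) = 1 for every n,
# so the convergence cannot be uniform on [0, 1].
assert phi(1000, 0.9) < 1e-6
assert phi(1000, 1.0) == 1.0
```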