Course Materials
©️ License CC BY 4.0
Code | Worked Example
Graph | 🧩 Exercise
🏷️ Definition | 💻 Numerical Method
Theorem | 🧮 Analytical Method
Remark | 🧠 Theory
ℹ️ Information | Hint
⚠️ Warning | Solution
Make sure that the system is “sane” (not “pathological”).
Well-posedness combines three properties: existence, uniqueness, and continuity with respect to the initial value of the (maximal) solution.
We will define and study each one in the sequel.
So far, we have mostly dealt with global solutions \(x(t)\) of IVPs, defined for any \(t \geq t_0\).
This concept is sometimes too stringent.
Consider the IVP
\[ \dot{x} = x^2, \; x(0)=1. \]
Ouch: there is actually no global solution.
However, there is a local solution \(x(t)\),
defined for \(t \in \left[t_0, \tau\right[\)
for some \(\tau > t_0\).
Indeed, the function \(\displaystyle x(t) := \frac{1}{1 - t}\) satisfies
\[ \dot{x}(t) = \frac{d}{dt} \frac{1}{1-t} = \frac{1}{(1 - t)^2} = (x(t))^2 \] and \(x(0) = 1.\)
⚠️ But it's defined (continuously) only for \(t<1.\)
This local solution is also maximal:
it cannot be extended beyond \(\tau=1\).
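To see this blow-up numerically, here is a minimal sketch (assuming `numpy`, `scipy` and `matplotlib` are available; the solver choice and tolerances are ours, not prescribed by the course) that integrates the IVP up to a time slightly below \(\tau = 1\) and compares the result with the closed-form solution \(1/(1-t)\).

```python
import numpy as np
import matplotlib.pyplot as plt
from scipy.integrate import solve_ivp

# IVP: dx/dt = x^2, x(0) = 1; the exact solution 1/(1 - t) blows up at t = 1.
def f(t, x):
    return x**2

t_span = (0.0, 0.99)                     # stop just before the blow-up time
t_eval = np.linspace(*t_span, 200)
result = solve_ivp(f, t_span, [1.0], t_eval=t_eval, rtol=1e-8, atol=1e-8)

plt.plot(result.t, result.y[0], label="numerical")
plt.plot(t_eval, 1.0 / (1.0 - t_eval), "--", label=r"$1/(1-t)$")
plt.xlabel("$t$"); plt.ylabel("$x(t)$"); plt.legend()
plt.show()
```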
A solution \(x: I \to \mathbb{R}^n\) of the IVP \[ \dot{x} = f(x), \; x(t_0) = x_0 \] is (forward and) local if \(I = \left[t_0, \tau\right[\) for some \(\tau\) such that \(t_0 < \tau \leq +\infty\).
A solution \(x: I \to \mathbb{R}^n\) of the IVP \[ \dot{x} = f(x), \; x(t_0) = x_0 \] is (forward and) global if \(I = \left[t_0, +\infty\right[\).
A (local) solution \(x : \left[t_0, \tau\right[ \to \mathbb{R}^n\) of an IVP is maximal if there is no other solution
defined on \(\left[t_0, \tau'\right[\) with \(\tau' > \tau\),
whose restriction to \(\left[t_0, \tau\right[\) is \(x\).
Consider the IVP
\[ \dot{x} = x^2, \; x(0)=x_0 \neq 0. \]
Find a closed-form local solution \(x(t)\) of the IVP.
Hint: assume that \(x(t) \neq 0\), then compute
\[ \frac{d}{dt} \frac{1}{x(t)}. \]
Make sure that your solutions are maximal.
As long as \(x(t) \neq 0\),
\[ \frac{d}{dt} \frac{1}{x(t)} = - \frac{\dot{x}(t)}{x(t)^2} = -1. \]
By integration, this leads to
\[ \frac{1}{x(t)} - \frac{1}{x_0} = -t \]
and thus provides
\[ x(t) = \frac{1}{\frac{1}{x_0} - t} = \frac{x_0}{1 - x_0 t}, \]
which is indeed a solution as long as the denominator does not vanish.
If \(x_0 < 0\), this solution is valid for all \(t\geq 0\) and thus maximal.
If \(x_0 > 0\), the solution is defined only until \(t=1/x_0\), where it blows up. Thus, this solution is also maximal.
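As a quick sanity check (not part of the original exercise), one can compare the closed-form solution \(x_0/(1 - x_0 t)\) with a numerical integration for one negative and one positive initial value; this assumes `numpy` and `scipy` are available, and the tolerances and horizons are our choices.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Check x(t) = x0 / (1 - x0 t) against a numerical integration for both signs of x0.
def f(t, x):
    return x**2

for x0 in (-1.0, 0.5):
    # For x0 > 0, stay below the blow-up time 1/x0; for x0 < 0 any horizon works.
    t_max = 0.9 / x0 if x0 > 0 else 10.0
    t_eval = np.linspace(0.0, t_max, 100)
    sol = solve_ivp(f, (0.0, t_max), [x0], t_eval=t_eval, rtol=1e-9, atol=1e-9)
    exact = x0 / (1.0 - x0 * t_eval)
    print(f"x0 = {x0:+.1f}, max error = {np.max(np.abs(sol.y[0] - exact)):.2e}")
```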
Sometimes things get worse than simply having no global solution.
Consider the IVP in \(\mathbb{R}^2\) with initial value \(x(0) = (0,0)\) and right-hand side
\[ f(x_1,x_2) = \left| \begin{array}{rl} (+1,0) & \mbox{if } \; x_1< 0 \\ (-1,0) & \mbox{if } \; x_1 \geq 0. \end{array} \right. \]
This system has no solution, not even a local one, when \(x(0) = (0,0)\).
Assume that \(x = (x_1, x_2): [0, \tau[ \to \mathbb{R}^2\) is a local solution.
Since \(\dot{x}_1(0) = -1 < 0\), for some small enough \(0 < \epsilon < \tau\) and any \(t \in \left]0, \epsilon\right]\), we have \(x_1(t) < 0\).
Consequently, \(\dot{x}_1(t) = +1\) and thus by integration
\[ x_1(\epsilon) = x_1(0) + \int_0^{\epsilon} \dot{x}_1(t) \, dt = \epsilon > 0, \]
which is a contradiction.
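For illustration only (this is not part of the original argument), a fixed-step Euler scheme applied to this right-hand side "chatters" around \(x_1 = 0\) with an amplitude of the order of the step size, which is consistent with the fact that no true solution exists; the step size and horizon below are arbitrary.

```python
import numpy as np

# Fixed-step Euler on the discontinuous field: the first component chatters
# around 0 with an amplitude of one step, hinting that no true solution exists.
def f(x):
    return np.array([1.0, 0.0]) if x[0] < 0 else np.array([-1.0, 0.0])

dt, x = 0.01, np.array([0.0, 0.0])
trajectory = [x.copy()]
for _ in range(500):
    x = x + dt * f(x)
    trajectory.append(x.copy())

x1 = np.array(trajectory)[:, 0]
print("min/max of x1:", x1.min(), x1.max())   # stays within one step of 0
```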
However, a local solution exists under very mild assumptions.
If \(f\) is continuous,
there is (at least one) local solution to the IVP
\(\dot{x} = f(x)\) and \(x(t_0) = x_0\).
Any local solution, defined on some \(\left[t_0, \tau \right[\), can be extended to (at least one) maximal solution, defined on some \(\left[t_0, t_{\infty}\right[\).
Note: a maximal solution is global iff \(t_{\infty} = +\infty\).
A solution on \(\left[t_0, \tau \right[\) is maximal if and only if either
\(\tau = +\infty\) (the solution is global), or
\(\tau < +\infty\) and \(\displaystyle \lim_{t \to \tau} \|x(t)\| = +\infty.\)
In plain words: a non-global solution cannot be extended further in time if and only if it “blows up”.
Let's assume that a (local) maximal solution exists.
You wonder whether this solution is defined on \([t_0, t_f[\) or blows up before \(t_f\).
For example, asking whether a solution is global corresponds to the case \(t_f = +\infty\).
Task. Show that any solution defined on some sub-interval \([t_0, \tau[\) with \(\tau < t_f\) is bounded.
Then, no solution can be maximal on any such \([t_0, \tau[\) (since it does not blow up!). Since a maximal solution does exist, its domain is \([t_0, t_{\infty}[\) with \(t_{\infty} \geq t_f\).
\(\Rightarrow\) a solution is defined (at least) on \([t_0, t_f[\).
Consider the dynamical system
\[ \dot{x} = \sigma(x) := \frac{1}{1 + e^{-x}}. \]
Show that there is (at least one) maximal solution for each initial condition.
Show that any such solution is global.
The sigmoid function \(\sigma\) is continuous.
Consequently, the Existence theorem provides (at least one) maximal solution.
Let \(x: \left[0, \tau \right[ \to \mathbb{R}\) be a maximal solution to the IVP. We have
\[ 0 \leq \dot{x}(t) = \sigma(x(t)) \leq 1, \; 0 \leq t < \tau \]
and by integration,
\[ |x(t)| \leq |x(0)| + t. \]
Thus, it cannot blow up in finite time; by the Maximal Solutions theorem, it is global.
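A small numerical check of the bound \(|x(t)| \leq |x(0)| + t\), assuming `numpy` and `scipy` are available (initial value, horizon and tolerances are our choices):

```python
import numpy as np
from scipy.integrate import solve_ivp

# Integrate dx/dt = sigma(x) and check the linear bound |x(t)| <= |x(0)| + t.
def sigma(t, x):
    return 1.0 / (1.0 + np.exp(-x))

x0, t_max = -5.0, 50.0
t_eval = np.linspace(0.0, t_max, 500)
sol = solve_ivp(sigma, (0.0, t_max), [x0], t_eval=t_eval, rtol=1e-8, atol=1e-8)
assert np.all(np.abs(sol.y[0]) <= abs(x0) + sol.t + 1e-6)
print("bound |x(t)| <= |x(0)| + t holds on [0, 50]")
```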
Consider the pendulum, subject to a torque \(c\):
\[ m\ell^2 \ddot{\theta} + b \dot{\theta} + mg \ell \sin \theta = c(\theta, \dot{\theta}) \]
We assume that the torque provides a bounded power:
\[ P := c(\theta, \dot{\theta}) \dot{\theta} \leq P_M < +\infty. \]
Show that for any initial state, there is a global solution \((\theta, \dot{\theta})\).
Hint. Compute the derivative with respect to \(t\) of
\[ E = \frac{1}{2} m\ell^2 \dot{\theta}^2 - m g \ell \cos \theta. \]
Since the system vector field
\[ (\theta, \dot{\theta}) \to \left( \dot{\theta}, (-b/m\ell^2) \dot{\theta} - (g / \ell) \sin \theta + c(\theta, \dot{\theta})/m\ell^2 \right) \]
is continuous, the Existence theorem yields (at least one) maximal solution.
Additionally,
\[ \begin{split} \dot{E} &= \frac{d}{dt} \left( \frac{1}{2} m\ell^2 \dot{\theta}^2 - m g \ell \cos \theta \right) \\ &= -b \dot{\theta}^2 + c(\theta,\dot{\theta}) \dot{\theta} \\ &\leq P_M < +\infty. \end{split} \]
By integration
\[ E(t) = \frac{1}{2} m\ell^2 \dot{\theta}^2(t) - m g \ell \cos \theta(t) \leq E(0) + P_M t \]
Hence, since \(|\cos \theta(t)| \leq 1\),
\[ |\dot{\theta}(t)| \leq \sqrt{\frac{2E(0)}{m\ell^2} + \frac{2g}{\ell} +\frac{2P_M}{m\ell^2}t} \]
Thus, \(\dot{\theta}(t)\) cannot blow up in finite time. Since
\[ |\theta(t)| \leq |\theta(0)| + \int_0^t |\dot{\theta}(s)| \, ds, \]
\(\theta(t)\) cannot blow up in finite time either.
By the Maximal Solutions theorem, any maximal solution is global.
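The sketch below simulates one concrete instance of this situation; the torque \(c(\theta, \dot{\theta}) = P_M \dot{\theta}/(1 + \dot{\theta}^2)\) is our own illustrative choice of a bounded-power torque, and the physical parameters are arbitrary. It checks that the solver reaches the end of the horizon and that the energy stays below \(E(0) + P_M t\) (assuming `numpy` and `scipy` are available).

```python
import numpy as np
from scipy.integrate import solve_ivp

# Pendulum with a torque whose power c * dtheta is bounded by P_M (illustrative choice).
m, ell, b, g, P_M = 1.0, 1.0, 0.1, 9.81, 2.0

def torque(theta, dtheta):
    return P_M * dtheta / (1.0 + dtheta**2)   # power = P_M * dtheta^2 / (1 + dtheta^2) <= P_M

def field(t, state):
    theta, dtheta = state
    ddtheta = (-b * dtheta - m * g * ell * np.sin(theta) + torque(theta, dtheta)) / (m * ell**2)
    return [dtheta, ddtheta]

t_max = 100.0
sol = solve_ivp(field, (0.0, t_max), [0.0, 3.0], rtol=1e-8, atol=1e-8)
E = 0.5 * m * ell**2 * sol.y[1]**2 - m * g * ell * np.cos(sol.y[0])
print("solver reached t =", sol.t[-1])        # no finite-time blow-up on [0, 100]
print("max E(t):", E.max(), " E(0) + P_M * t_max:", E[0] + P_M * t_max)
```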
Let \(A \in \mathbb{R}^{n \times n}\).
Consider the dynamical system
\[ \dot{x} = A x , \; x \in \mathbb{R}^n. \]
Show that
\[ y(t) := \|x(t)\|^2 \]
is differentiable and satisfies
\[ \dot{y}(t) \leq 2\alpha y(t) \]
for some \(\alpha \geq 0\).
Let
\[ z(t) := y(t) e^{-2\alpha t}. \]
Compute \(\dot{z}(t)\) and deduce that
\[ 0 \leq y(t) \leq y(0) e^{2\alpha t}. \]
Prove that for any initial state \(x(0) \in \mathbb{R}^n\) there is a corresponding global solution \(x(t)\).
By definition of \(y(t)\) and since \(\dot{x}(t) = Ax(t)\),
\[ \begin{split} \dot{y}(t) &= \frac{d}{dt} \|x(t)\|^2 \\ &= \frac{d}{dt} x(t)^t x(t) \\ &= \dot{x}(t)^t x(t) + x(t)^t \dot{x}(t) \\ &= x(t)^t A^t x(t) + x(t)^t A x(t). \end{split} \]
Let \(\alpha\) denote the largest singular value of \(A\) (i.e.ย the operator norm \(\|A\|\)).
\[ \alpha := \sigma_{\rm max} (A) = \|A\|. \]
For any vector \(u \in \mathbb{R}^n\), we have \[ \|A u\| \leq \|A\| \|u\|. \]
By the triangle inequality and the Cauchy-Schwarz inequality, we obtain
\[ \begin{split} \dot{y}(t) &= x(t)^t A^t x(t) + x(t)^t A x(t) \\ &\leq |(Ax(t))^t x(t)| + |x(t)^t (A x(t))| \\ &\leq \|A x(t)\|\|x(t)\| + \|x(t)\|\|A x(t)\| \\ &\leq \|A\| \|x(t)\|\|x(t)\| + \|x(t)\|\|A\|\|x(t)\| \\ &= 2 \|A\| y(t) \\ \end{split} \]
and thus \(\dot{y}(t) \leq 2\alpha y(t)\) with \(\alpha := \|A\|.\)
Since \(y(t) = \|x(t)\|^2\), the inequality \(0 \leq y(t)\) is clear.
Since \(z(t) = y(t)e^{-2\alpha t}\),
\[ \begin{split} \dot{z}(t) & = \frac{d}{dt} y(t) e^{-2\alpha t} \\ & = \dot{y}(t) e^{-2\alpha t} + y(t) (-2\alpha e^{-2\alpha t}) \\ & = (\dot{y}(t) - 2\alpha y(t)) e^{-2\alpha t} \\ & \leq 0. \end{split} \]
By integration
\[ \begin{split} y(t) e^{-2\alpha t} = z(t) & = z(0) + \int_0^t \dot{z}(s) \, ds \\ & \leq z(0) = y(0), \end{split} \]
hence
\[ y(t) \leq y(0) e^{2\alpha t}. \]
The vector field \[ x \in \mathbb{R}^n \to A x \] is continuous, thus by the Existence theorem there is a maximal solution \(x:\left[0, t_{\infty}\right[ \to \mathbb{R}^n\) for any initial state \(x(0).\)
Moreover,
\[ \|x(t)\| = \sqrt{y(t)} \leq \sqrt{y(0) e^{2\alpha t}} = \|x(0)\| e^{\alpha t}. \]
Hence there is no finite-time blow-up and the maximal solution is global.
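A numerical verification of the bound \(\|x(t)\| \leq \|x(0)\| e^{\alpha t}\) with \(\alpha = \|A\|\), for a randomly drawn matrix \(A\) (the matrix, horizon and tolerances are our choices; `numpy` and `scipy` are assumed available):

```python
import numpy as np
from scipy.integrate import solve_ivp

# Check ||x(t)|| <= ||x(0)|| * exp(alpha * t) with alpha = ||A|| (largest singular value).
rng = np.random.default_rng(0)
n = 4
A = rng.standard_normal((n, n))
alpha = np.linalg.norm(A, 2)                 # operator (spectral) norm of A
x0 = rng.standard_normal(n)

sol = solve_ivp(lambda t, x: A @ x, (0.0, 3.0), x0, dense_output=True, rtol=1e-9, atol=1e-9)
t = np.linspace(0.0, 3.0, 200)
norms = np.linalg.norm(sol.sol(t), axis=0)
bound = np.linalg.norm(x0) * np.exp(alpha * t)
print("bound satisfied:", np.all(norms <= bound * (1 + 1e-6)))
```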
In the current context, uniqueness means uniqueness of the maximal solution to an IVP.
Uniqueness of solutions, even of the maximal ones, is not guaranteed either.
The IVP
\[\dot{x} = \sqrt{x}, \;x(0) = 0\]
has several maximal (global) solutions.
For any \(\tau \geq 0\), \(x_{\tau}\) is a solution:
\[ x_{\tau}(t) = \left| \begin{array}{ll} 0 & \mbox{if} \; t \leq \tau, \\ 1/4 \times (t-\tau)^2 & \mbox{if} \; t > \tau. \end{array} \right. \]
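A short plot of a few of these solutions \(x_{\tau}\) makes the non-uniqueness visible (the values of \(\tau\) are arbitrary; `numpy` and `matplotlib` are assumed available):

```python
import numpy as np
import matplotlib.pyplot as plt

# Several distinct solutions of dx/dt = sqrt(x), x(0) = 0: x stays at 0 until tau,
# then follows (t - tau)^2 / 4.
t = np.linspace(0.0, 5.0, 500)
for tau in (0.0, 1.0, 2.0, 3.0):
    x_tau = np.where(t <= tau, 0.0, 0.25 * (t - tau)**2)
    plt.plot(t, x_tau, label=rf"$\tau = {tau}$")
plt.xlabel("$t$"); plt.ylabel(r"$x_\tau(t)$"); plt.legend()
plt.show()
```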
However, uniqueness of the maximal solution holds under mild assumptions.
\[ x=(x_1, \dots, x_n), \;f(x) = (f_1(x), \dots, f_n(x)). \]
Jacobian matrix of \(f\):
\[ \frac{\partial f}{\partial x} := \left[ \begin{array}{ccc} \frac{\partial f_1}{\partial x_1} & \cdots & \frac{\partial f_1}{\partial x_n} \\ \vdots & \ddots & \vdots \\ \frac{\partial f_n}{\partial x_1} & \cdots & \frac{\partial f_n}{\partial x_n} \\ \end{array} \right] \]
If \(\partial f/\partial x\) exists and is continuous, the maximal solution is unique.
A priori, an arbitrarily small error in the initial value could result in a finite error in the solution, even in finite time.
That would severely undermine the utility of any approximation method.
Instead of denoting the solution \(x(t)\), we write \(x(t, x_0)\) to emphasize its dependency on the initial state.
Continuity w.r.t. the initial state means that if \(x(t, x_0)\) is defined on \([t_0, \tau]\), then for every \(t\in [t_0, \tau]\),
\[ x(t, y) \to x(t, x_0) \; \mbox{when} \; y \to x_0, \]
and that this convergence is uniform w.r.t. \(t \in [t_0, \tau]\).
However, continuity w.r.t. the initial value holds under mild assumptions.
Assume that \(\partial f / \partial x\) exists and is continuous.
Then the dynamical system is continuous w.r.t. the initial state.
Consider for instance the prey-predator system
\[ \begin{array}{rcl} \dot{x} &=& \alpha x - \beta xy \\ \dot{y} &=& \delta x y - \gamma y \\ \end{array} \]
with \(\alpha = 2/3\), \(\beta = 4/3\) and \(\delta = \gamma = 1\): on any fixed time window, nearby initial states yield nearby trajectories, as illustrated by the sketch below.
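A minimal simulation sketch, assuming `numpy`, `scipy` and `matplotlib` are available (the initial states, horizon and tolerances are our choices):

```python
import numpy as np
import matplotlib.pyplot as plt
from scipy.integrate import solve_ivp

# Prey-predator system with the parameters above; two nearby initial states
# give nearby trajectories on a fixed time window (continuity w.r.t. the initial state).
alpha, beta, delta, gamma = 2/3, 4/3, 1.0, 1.0

def field(t, state):
    x, y = state
    return [alpha * x - beta * x * y, delta * x * y - gamma * y]

t_eval = np.linspace(0.0, 20.0, 1000)
for x0, y0 in [(1.0, 1.0), (1.05, 1.0)]:      # initial states chosen for illustration
    sol = solve_ivp(field, (0.0, 20.0), [x0, y0], t_eval=t_eval, rtol=1e-9, atol=1e-9)
    plt.plot(sol.y[0], sol.y[1], label=f"$(x_0, y_0) = ({x0}, {y0})$")
plt.xlabel("$x$ (prey)"); plt.ylabel("$y$ (predators)"); plt.legend()
plt.show()
```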
Let \(h \geq 0\) and \(x^h(t)\) be the solution of the IVP
\[\dot{x} = x, \; x^h(0) = 1+ h.\]
Let \(\epsilon > 0\) and \(\tau \geq 0\).
Find the largest \(\delta > 0\) such that \(|h| < \delta\) ensures that \[\mbox{for any $t \in [0, \tau]$}, \; |x^{h}(t) - x^0(t)| \leq \epsilon.\]
What is the behavior of \(\delta\) when \(\tau\) goes to infinity?
The solution \(x^h(t)\) to the IVP is \[ x^h(t) = (1+h) e^{t}. \] Hence, \[ |x^h(t) - x^0(t)| = |(1+h) e^{t} - e^{t}| = |h| e^{t} \] \[ \max_{t \in [0, \tau]} |x^h(t) - x^0(t)| = |h| e^{\tau}. \]
Thus, the largest \(\delta\) such that \(|h| \leq \delta\) yields \[ \max_{t \in [0, \tau]} |x^h(t) - x^0(t)| \leq \epsilon \] is \(\delta = \epsilon e^{-\tau}.\)
For any \(\epsilon > 0\), \[ \lim_{\tau \to +\infty} \delta = 0. \] The longer the time horizon, the smaller the admissible error on the initial value.
Consider the IVP \[\dot{x} = \sqrt{|x|}, \; x(0)=x_0 \in \mathbb{R}.\]
Solve this IVP numerically for \(t \in [0,1]\) and \(x_0 = 0\), and plot the result.
Then, solve it again for \(x_0 = 0.1\), \(x_0=0.01\), etc. and plot the results.
Does the solution seem to be continuous with respect to the initial value?
Explain this experimental result.
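One possible way to run this experiment, assuming `numpy`, `scipy` and `matplotlib` are available (the solver, tolerances and list of initial values are our choices):

```python
import numpy as np
import matplotlib.pyplot as plt
from scipy.integrate import solve_ivp

# dx/dt = sqrt(|x|) for several initial values approaching 0 from above.
def f(t, x):
    return np.sqrt(np.abs(x))

t_eval = np.linspace(0.0, 1.0, 200)
for x0 in (0.0, 0.1, 0.01, 0.001):
    sol = solve_ivp(f, (0.0, 1.0), [x0], t_eval=t_eval, rtol=1e-9, atol=1e-9)
    plt.plot(sol.t, sol.y[0], label=f"$x_0 = {x0}$")
plt.plot(t_eval, 0.25 * t_eval**2, "k--", label=r"$t^2/4$")
plt.xlabel("$t$"); plt.ylabel("$x(t)$"); plt.legend()
plt.show()
```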
The solution does not seem to be continuous with respect to the initial value: the graph of the solution seems to have a limit when \(x_0 \to 0^+\), but this limit differs from \(x(t)= 0\), which is the numerical solution obtained when \(x_0=0\).
The Jacobian matrix of the vector field is not defined when \(x=0\), thus the continuity was not guaranteed to begin with. Actually, uniqueness of the solution does not even hold here, see the Non-Uniqueness example. The function \(x(t)=0\) is a valid solution when \(x_0=0\), but so is \[ x(t) = \frac{1}{4}t^2, \] and the numerical solution seems to converge to the latter when \(x_0 \to 0^+\).
Consider the system
\[ \begin{array}{rcl} \dot{x} &=& \alpha x - \beta xy \\ \dot{y} &=& \delta x y - \gamma y \\ \end{array} \]
where \(\alpha\), \(\beta\), \(\delta\) and \(\gamma\) are positive.
Prove that the system is well-posed.
Prove that all maximal solutions such that \(x(0) > 0\) and \(y(0) > 0\) are global and satisfy \(x(t)>0\) and \(y(t)>0\) for every \(t\geq 0\).
Hint. Compute the ODE satisfied by \(u=\ln x\) and \(v= \ln y\) and then the derivative w.r.t. time of \[ V := \delta e^u - \gamma u +\beta e^v - \alpha v. \]
The Jacobian matrix of the system vector field \[ f(x, y)= (\alpha x - \beta xy, \delta x y - \gamma y) \] is defined and continuous: \[ \frac{\partial f}{\partial (x, y)} = \left[ \begin{array}{rr} \alpha -\beta y & - \beta x \\ \delta y & \delta x - \gamma \\ \end{array} \right] \] thus the system is well-posed.
The (continuously differentiable) change of variable \[ F: (x, y) \mapsto (u, v) := (\ln x, \ln y) \] is a bijection between \(\left]0, +\infty\right[^2\) and \(\mathbb{R}^2\).
Since \[ \frac{d}{dt} \ln x = \frac{\dot{x}}{x}, \; \frac{d}{dt} \ln y = \frac{\dot{y}}{y} \] the prey-predator ODE is equivalent to \[ \begin{array}{rcl} \dot{u} &=& \alpha - \beta e^v \\ \dot{v} &=& \delta e^u - \gamma \\ \end{array} \]
Accordingly, \[ \begin{split} \frac{d}{dt}{V} &= \delta e^u \dot{u} - \gamma \dot{u} +\beta e^v \dot{v} - \alpha \dot{v} \\ &= (\delta e^u - \gamma) \dot{u} + (\beta e^v - \alpha) \dot{v} \\ &= (\delta e^u - \gamma) (\alpha - \beta e^v) + (\beta e^v - \alpha) (\delta e^u - \gamma) \\ & =0 \end{split} \]
Therefore \(V(u(t), v(t))\) is constant.
Now, the functions \[ \phi(u) := \delta e^u - \gamma u, \; \psi(v) := \beta e^v - \alpha v \] are continuous and \[ \lim_{|u| \to +\infty} \phi(u) = +\infty, \; \lim_{|v| \to +\infty} \psi(v) = +\infty. \] As \(V(u, v) = \phi(u) + \psi(v)\), \[ \lim_{\|(u, v)\| \to +\infty} V(u, v) = +\infty. \]
Consequently, since \(V(u(t), v(t))\) is constant, the solution \((u(t), v(t))\) remains in a sublevel set of \(V\) and thus cannot blow up (either in finite or infinite time).
Therefore the solution \((u(t), v(t))\) is global, and so is the solution \((x(t), y(t))\) in the original variables.
Since \((x, y) = F^{-1}(u, v)\) and the domain of \(F\) is \(\left]0, +\infty\right[^2\), \(x(t)>0\) and \(y(t)>0\) for any \(t\geq 0\).
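As an optional numerical cross-check (not required by the exercise), one can verify that \(V\) is numerically conserved along a trajectory, here with the parameter values \(\alpha = 2/3\), \(\beta = 4/3\), \(\delta = \gamma = 1\) used earlier (initial state, horizon and tolerances are our choices):

```python
import numpy as np
from scipy.integrate import solve_ivp

# Numerical check that V = delta*exp(u) - gamma*u + beta*exp(v) - alpha*v
# (with u = ln x, v = ln y) stays constant along prey-predator trajectories.
alpha, beta, delta, gamma = 2/3, 4/3, 1.0, 1.0

def field(t, state):
    x, y = state
    return [alpha * x - beta * x * y, delta * x * y - gamma * y]

sol = solve_ivp(field, (0.0, 50.0), [1.0, 2.0], rtol=1e-10, atol=1e-10,
                t_eval=np.linspace(0.0, 50.0, 2000))
x, y = sol.y
u, v = np.log(x), np.log(y)
V = delta * np.exp(u) - gamma * u + beta * np.exp(v) - alpha * v
print("relative variation of V:", (V.max() - V.min()) / abs(V.mean()))
```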