📖 Documents (GitHub)
©️ License CC BY 4.0

| Symbol | Meaning | Symbol | Meaning |
|---|---|---|---|
| 🐍 | Code | 🔍 | Worked Example |
| 📈 | Graph | 🧩 | Exercise |
| 🏷️ | Definition | 💻 | Numerical Method |
| 📝 | Theorem | 🧮 | Analytical Method |
| 📚 | Remark | 🧠 | Theory |
| ℹ️ | Information | 🗝️ | Hint |
| ⚠️ | Warning | 🔓 | Solution |
We are interested in the behavior of the solution to
\[ \dot{x} = A x, \; x(0) = x_0 \in \mathbb{R}^n \]
First, we study some elementary systems in this class.
\[ \dot{x} = a x \]
with \(a \in \mathbb{R}\) and \(x(0) = x_0 \in \mathbb{R}\).
🔓 Solution: \[ x(t) = e^{a t} x_0 \]
📝 Proof: \[ \frac{d}{dt} e^{at} x_0 = a e^{at} x_0 = a x(t) \] and \[ x(0) = e^{a \times 0} x_0 = x_0. \]
The origin is globally asymptotically stable when
\[ a < 0 \]
i.e. \(a\) is in the open left half-plane.
Let the time constant \(\tau\) be
\[ \tau := 1 / |a|. \]
When the system is asymptotically stable,
\[ x(t) = e^{-t/\tau} x_0. \]
\(\tau\) controls the speed of convergence to the origin:
| time \(t\) | distance to the origin \(\lvert x(t)\rvert\) |
|---|---|
| \(0\) | \(\lvert x(0)\rvert\) |
| \(\tau\) | \(\simeq (1/3) \lvert x(0)\rvert\) |
| \(3\tau\) | \(\simeq (5/100) \lvert x(0)\rvert\) |
| \(\vdots\) | \(\vdots\) |
| \(+\infty\) | \(0\) |
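The approximations in this table are easy to check numerically; a quick sketch (the value of \(\tau\) is illustrative, since only the ratio \(t/\tau\) matters):

```python
import numpy as np

# Decay factor e^{-t/tau} at the times listed in the table
tau = 1.0  # illustrative value
t = np.array([0.0, tau, 3 * tau])
ratios = np.exp(-t / tau)
print(ratios)  # 1.0, ~0.368 (close to 1/3), ~0.0498 (close to 5/100)
```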
Next, consider a system of two decoupled scalar equations:
\[ \dot{x}_1 = a_1 x_1, \; x_1(0) = x_{10} \]
\[ \dot{x}_2 = a_2 x_2, \; x_2(0) = x_{20} \]
i.e.
\[ A = \left[ \begin{array}{cc} a_1 & 0 \\ 0 & a_2 \end{array} \right] \]
🔓 Solution: by linearity
\[ x(t) = e^{a_1 t} \left[\begin{array}{c} x_{10} \\ 0 \end{array}\right] + e^{a_2 t} \left[\begin{array}{c} 0 \\ x_{20} \end{array}\right] \]
from numpy import *
from matplotlib.pyplot import *

a1 = -1.0; a2 = 2.0; x10 = x20 = 1.0
figure()
t = linspace(0.0, 3.0, 1000)
x1 = exp(a1*t)*x10; x2 = exp(a2*t)*x20
xn = sqrt(x1**2 + x2**2)
plot(t, xn, "k")
plot(t, x1, "k--")
plot(t, x2, "k--")
xlabel("$t$"); ylabel(r"$\|x(t)\|$"); title(rf"$a_1={a1}, \; a_2={a2}$")
grid(); axis([0.0, 2.0, 0.0, 10.0])
a1 = -1.0; a2 = -2.0; x10 = x20 = 1.0
figure()
t = linspace(0.0, 3.0, 1000)
x1 = exp(a1*t)*x10; x2 = exp(a2*t)*x20
xn = sqrt(x1**2 + x2**2)
plot(t, xn, "k")
plot(t, x1, "k--")
plot(t, x2, "k--")
xlabel("$t$"); ylabel(r"$\|x(t)\|$"); title(rf"$a_1={a1}, \; a_2={a2}$")
grid(); axis([0.0, 2.0, 0.0, 10.0])
📚 The rightmost \(a_i\) determines the asymptotic behavior.
📝 The origin is globally asymptotically stable if and only if
every \(a_i\) is in the open left half-plane.
Now let the data be complex-valued:
\[ \dot{x} = a x \]
with \(a \in \mathbb{C}\) and \(x(0) = x_0 \in \mathbb{C}\).
🔓 Solution: formally, the same solution as before
\[ x(t) = e^{at} x_0 \]
But now, \(x(t) \in \mathbb{C}\):
if \(a = \sigma + i \omega\) and \(x_0 = |x_0| e^{i \angle x_0}\)
\[ |x(t)| = |x_0| e^{\sigma t} \, \mbox{ and } \, \angle x(t) = \angle x_0 + \omega t. \]
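These relations can be verified numerically; a minimal sketch with illustrative values of \(a\) and \(x_0\):

```python
import numpy as np

# Check |x(t)| = |x0| e^(sigma t) and angle x(t) = angle x0 + omega t
sigma, omega = -0.5, 2.0   # illustrative values
a = sigma + 1j * omega
x0 = 1.0 + 1.0j
t = 0.7
x_t = np.exp(a * t) * x0
assert np.isclose(abs(x_t), abs(x0) * np.exp(sigma * t))
assert np.isclose(np.angle(x_t), np.angle(x0) + omega * t)
```

(The angle check works here because \(\angle x_0 + \omega t\) stays within \((-\pi, \pi]\); in general the phases match only modulo \(2\pi\).)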
📝 The origin is globally asymptotically stable iff
\(a\) is in the open left half-plane: \(\Re (a) < 0\).
📚 If \(a =: \sigma + i \omega\),
🏷️ \(\tau = 1 /|\sigma|\) is the time constant,
🏷️ \(\omega\) is the rotational frequency of the oscillations.
If \(M \in \mathbb{C}^{n \times n}\), its exponential is defined as:
\[ e^{M} = \sum_{k=0}^{+\infty} \frac{M^k}{k !} \in \mathbb{C}^{n \times n} \]
⚠️ The exponential of a matrix \(M\) is not the matrix with elements \(e^{M_{ij}}\) (the elementwise exponential).
🐍 elementwise exponential: `exp` (`numpy` module),
🐍 matrix exponential: `expm` (`scipy.linalg` module).
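As a sanity check of the series definition, a truncated partial sum \(\sum_{k=0}^{N} M^k / k!\) should agree with `scipy.linalg.expm`; a sketch with an illustrative \(2 \times 2\) matrix:

```python
import numpy as np
from scipy.linalg import expm

M = np.array([[0.0, 1.0], [1.0, 0.0]])  # illustrative matrix

# Accumulate the partial sum S = sum_{k=0}^{18} M^k / k!
S = np.zeros_like(M)
term = np.eye(2)  # current term M^k / k!, starting with k = 0
for k in range(1, 20):
    S += term
    term = term @ M / k
assert np.allclose(S, expm(M))
```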
Let
\[ M = \left[ \begin{array}{cc} 0 & 1 \\ 1 & 0 \end{array} \right] \]
Compute the exponential of \(M\).
🗝️ Hint:
\[ \cosh x := \frac{e^x + e^{-x}}{2}, \; \sinh x := \frac{e^x - e^{-x}}{2}. \]
Compute numerically `exp(M)` (`numpy`) and `expm(M)` (`scipy.linalg`), and check that the results are consistent.
We have
\[ M = \left[ \begin{array}{cc} 0 & 1 \\ 1 & 0 \end{array} \right], \; M^2 = \left[ \begin{array}{cc} 1 & 0 \\ 0 & 1 \end{array} \right] = I \]
and hence for any \(j \in \mathbb{N}\),
\[ M^{2j+1} = \left[ \begin{array}{cc} 0 & 1 \\ 1 & 0 \end{array} \right], \; M^{2j} = \left[ \begin{array}{cc} 1 & 0 \\ 0 & 1 \end{array} \right] = I. \]
\[ \begin{split} e^{M} &= \sum_{k=0}^{+\infty} \frac{M^k}{k !} \\ &= \left( \sum_{j=0}^{+\infty} \frac{1}{(2j) !} \right)I + \left(\sum_{j=0}^{+\infty} \frac{1}{(2j+1) !} \right)M \\ &= \left(\sum_{k=0}^{+\infty} \frac{1^k + (-1)^k}{2(k !)} \right)I + \left(\sum_{k=0}^{+\infty} \frac{1^k - (-1)^k}{2(k !)} \right)M \\ &= (\cosh 1)I + (\sinh 1)M \end{split} \]
Thus,
\[ e^M = \left[ \begin{array}{cc} \cosh 1 & \sinh 1 \\ \sinh 1 & \cosh 1 \end{array} \right]. \]
>>> from numpy import array, cosh, exp, sinh
>>> from scipy.linalg import expm
>>> M = [[0.0, 1.0], [1.0, 0.0]]
>>> exp(M)
array([[1. , 2.71828183],
[2.71828183, 1. ]])
>>> expm(M)
array([[1.54308063, 1.17520119],
[1.17520119, 1.54308063]])
These results are consistent:
>>> array([[exp(0.0), exp(1.0)],
... [exp(1.0), exp(0.0)]])
array([[1. , 2.71828183],
[2.71828183, 1. ]])
>>> array([[cosh(1.0), sinh(1.0)],
... [sinh(1.0), cosh(1.0)]])
array([[1.54308063, 1.17520119],
[1.17520119, 1.54308063]])
Note that
\[ \begin{align} \frac{d}{dt} e^{A t} &= \frac{d}{dt} \sum_{k=0}^{+\infty} \frac{A^k}{k!} t^k \\ &= \sum_{k=1}^{+\infty} \frac{A^{k}}{(k-1)!} t^{k-1} \\ &= A \sum_{k=1}^{+\infty} \frac{A^{k-1}}{(k-1)!} t^{k-1} = A e^{A t} \end{align} \]
Thus, for any \(A \in \mathbb{C}^{n\times n}\) and \(x_0 \in \mathbb{C}^n\),
\[ \frac{d}{dt} (e^{A t} x_0) = A (e^{At} x_0) \]
The solution of
\[ \dot{x} = A x\; \mbox{ and } \; x(0) = x_0 \]
is
\[ x(t) = e^{A t} x_0. \]
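This closed-form solution can be cross-checked against a numerical integrator; a minimal sketch, where \(A\) and \(x_0\) are illustrative values:

```python
import numpy as np
from scipy.linalg import expm
from scipy.integrate import solve_ivp

A = np.array([[0.0, 1.0], [-2.0, -3.0]])  # illustrative matrix
x0 = np.array([1.0, 0.0])
t_final = 2.0

x_exact = expm(A * t_final) @ x0                 # x(t) = e^{At} x0
sol = solve_ivp(lambda t, x: A @ x, (0.0, t_final), x0,
                rtol=1e-9, atol=1e-12)           # numerical solution
assert np.allclose(x_exact, sol.y[:, -1], atol=1e-6)
```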
📚 For any dynamical system, if the origin is a globally asymptotically stable equilibrium, then it is a locally attractive equilibrium.
📚 For linear systems, the converse result also holds.
📝 Let's prove this!
Show that for any linear system \(\dot{x} = A x\), if the origin is locally attractive, then it is also globally attractive.
Show that for any linear system \(\dot{x} = A x\), if the origin is globally attractive, then it is also globally asymptotically stable.
🗝️ Hint: Consider the solutions \(e_k(t) := e^{At}e_k\) associated to \(e_k(0)=e_k\), where \((e_1, \dots, e_n)\) is the canonical basis of the state space.
If the origin is locally attractive, then there is an \(\varepsilon > 0\) such that, for any \(x_0 \in \mathbb{R}^n\) with \(\|x_0\| \leq \varepsilon\),
\[ \lim_{t \to +\infty} e^{At} x_0 = 0. \]
Now let \(x_0 \in \mathbb{R}^n\) be arbitrary and nonzero (if \(x_0 = 0\), the conclusion is immediate). Since the norm of \(\varepsilon x_0 /\|x_0\|\) is \(\varepsilon\), and by linearity of \(e^{At}\), we obtain
\[ \begin{split} \lim_{t \to +\infty} e^{At} x_0 &= \lim_{t \to +\infty} e^{At} \left(\frac{\|x_0\|}{\varepsilon} \varepsilon \frac{x_0}{\|x_0\|} \right) \\ &= \frac{\|x_0\|}{\varepsilon} \lim_{t \to +\infty} e^{At} \left(\varepsilon \frac{x_0}{\|x_0\|} \right)\\ &= 0. \end{split} \]
Thus the origin is globally attractive.
Let \(X_0\) be a bounded subset of \(\mathbb{R}^n\) and let \(x_0 \in X_0\). Since \[ x_0 = \sum_{k=1}^n x_{0k} e_k, \]
the solution \(x(t)\) of \(\dot{x}=Ax\), \(x(0)=x_0\) satisfies \[ \begin{split} x(t) &= e^{At} x_0 = e^{At} \left(\sum_{k=1}^n x_{0k} e_k\right) = \sum_{k=1}^n x_{0k} e^{At} e_k. \end{split} \]
Hence
\[ \begin{split} \|x(t)\| &= \left\| \sum_{k=1}^n x_{0k} e^{At} e_k \right\| \\ &\leq \sum_{k=1}^n |x_{0k}| \left\| e^{At} e_k \right\| \\ &= \sum_{k=1}^n |x_{0k}| \left\| e_k(t) \right\| \\ &\leq \left(\sum_{k=1}^n |x_{0k}|\right) \max_{k=1, \dots, n} \|e_k(t)\| \end{split} \]
Since \(X_0\) is bounded, there is an \(\alpha > 0\) such that for any \(x_0 = (x_{01}, \dots, x_{0n})\) in \(X_0\),
\[ \|x_0\|_1 := \sum_{k=1}^n |x_{0k}| \leq \alpha. \]
Since the origin is globally attractive, \(\lim_{t\to +\infty} \|e_k(t)\| = 0\) for every \(k=1, \dots, n\), and therefore
\[ \lim_{t\to +\infty} \max_{k=1, \dots, n} \|e_k(t)\| =0. \]
Finally, denoting by \(x(t, x_0)\) the solution with initial condition \(x_0\),
\[ \begin{split} \|x(t, x_0)\| &\leq \left(\sum_{k=1}^n |x_{0k}| \right) \max_{k=1, \dots, n} \|e_k(t)\| \\ &\leq \alpha \max_{k=1, \dots, n} \|e_k(t)\| \\ \end{split} \]
Thus \(\|x(t, x_0)\| \to 0\) when \(t \to \infty\), uniformly w.r.t. \(x_0 \in X_0\). In other words, the origin is globally asymptotically stable.
Let \(A \in \mathbb{C}^{n \times n}\). If \(x \in \mathbb{C}^n\), \(x \neq 0\), \(s \in \mathbb{C}\) and
\[ A x = s x, \]
then \(x\) is an eigenvector of \(A\) and \(s\) is an eigenvalue of \(A\).
The spectrum of \(A\) is the set of its eigenvalues.
It is characterized by:
\[ \sigma(A) := \{s \in \mathbb{C} \; | \; \det (sI -A) = 0\}. \]
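This characterization is easy to explore numerically: `numpy.linalg.eigvals` returns the spectrum, and each eigenvalue should be a root of \(\det(sI - A)\). A sketch with an illustrative matrix:

```python
import numpy as np

A = np.array([[0.0, 1.0], [-2.0, -3.0]])  # illustrative matrix
spectrum = np.linalg.eigvals(A)
for s in spectrum:
    # Each eigenvalue is a root of det(sI - A)
    assert np.isclose(np.linalg.det(s * np.eye(2) - A), 0.0)
```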
Consider the system \(\dot{x} = Ax\):

- a mode of the system is an eigenvector of \(A\),
- a pole of the system is an eigenvalue of \(A\).
Let \(A \in \mathbb{C}^{n \times n}\).
The origin of \(\dot{x} = A x\) is globally asymptotically stable
\[\Longleftrightarrow\]
all eigenvalues of \(A\) have a negative real part.
\[\Longleftrightarrow\]
\[ \max \{\Re \, s \; | \; s \in \sigma(A)\} < 0. \]
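The criterion translates directly into a numerical test; a sketch (the helper name and the example matrices are illustrative):

```python
import numpy as np

def is_asymptotically_stable(A):
    """Check max { Re(s) : s in sigma(A) } < 0."""
    return max(np.linalg.eigvals(A).real) < 0.0

assert is_asymptotically_stable([[0.0, 1.0], [-2.0, -3.0]])      # poles -1 and -2
assert not is_asymptotically_stable([[1.0, 0.0], [0.0, -1.0]])   # poles +1 and -1
```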
Assume that \(A\) is diagonalizable.
(📚 This is very likely unless \(A\) has some special structure.)
Let \(\sigma(A) = \{\lambda_1, \dots, \lambda_n\}.\)
There is an invertible matrix \(P \in \mathbb{C}^{n \times n}\) such that
\[ P^{-1} A P = \mathrm{diag}(\lambda_1, \dots, \lambda_n) = \left[ \begin{array}{cccc} \lambda_1 & 0 & \cdots & 0 \\ 0 & \lambda_2 & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & \lambda_n \end{array} \right] \]
Thus, if \(y = P^{-1} x\), \(\dot{x} = A x\) is equivalent to
\[ \left| \begin{array}{ccc} \dot{y}_1 &=& \lambda_1 y_1 \\ \dot{y}_2 &=& \lambda_2 y_2 \\ \vdots &=& \vdots \\ \dot{y}_n &=& \lambda_n y_n \\ \end{array} \right. \]
The system is G.A.S. iff each component of the system is, which holds iff \(\Re \lambda_i < 0\) for each \(i\).
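The diagonalization step can be reproduced with `numpy.linalg.eig`, whose second output collects the eigenvectors as the columns of \(P\); a sketch with an illustrative diagonalizable matrix:

```python
import numpy as np

A = np.array([[0.0, 1.0], [-2.0, -3.0]])  # illustrative diagonalizable matrix
eigenvalues, P = np.linalg.eig(A)
D = np.linalg.inv(P) @ A @ P
assert np.allclose(D, np.diag(eigenvalues))  # P^{-1} A P is diagonal
```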
Consider the scalar ODE
\[ \ddot{x} + k x = 0, \; \mbox{ with } k > 0 \]
1. Represent this system as a first-order ODE.
2. Is this system asymptotically stable?
3. Do the solutions have oscillatory components? Find the set of associated rotational frequencies.
Same set of questions (1., 2., 3.) for
\[ \ddot{x} + b \dot{x} + k x = 0 \]
when \(b>0\).
\[ \frac{d}{dt} \left[ \begin{array}{c} x \\ \dot{x} \end{array} \right] = \left[ \begin{array}{rr} 0 & 1 \\ -k & 0 \end{array} \right] \left[ \begin{array}{c} x \\ \dot{x} \end{array} \right] = A \left[ \begin{array}{c} x \\ \dot{x} \end{array} \right] \]
Since
\[ \det (sI -A) = \det \left( \begin{array}{rr} s & -1 \\ k & s \end{array} \right) = s^2 + k, \]
the spectrum of \(A\) is
\[ \sigma (A) = \{s \in \mathbb{C} \; | \; \det (sI -A) = 0\} = \left\{ i\sqrt{k}, -i \sqrt{k} \right\}. \]
The system poles are \(\pm i\sqrt{k}\). Hence
\[ \max \{\Re \, s \; | \; s \in \sigma(A)\} = 0, \]
and the system is not globally asymptotically stable.
The general solution \(x(t)\) can be decomposed as \[ x(t) = x_+ e^{i\sqrt{k} t} + x_- e^{-i\sqrt{k}t} \] for constants \(x_+, x_- \in \mathbb{C}\) determined by the initial conditions.
Thus the components of \(x(t)\) oscillate at the rotational frequency
\[ \omega = \sqrt{k}. \]
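The poles \(\pm i \sqrt{k}\) can be confirmed numerically for a sample value of \(k\):

```python
import numpy as np

k = 4.0  # illustrative stiffness
A = np.array([[0.0, 1.0], [-k, 0.0]])
poles = np.linalg.eigvals(A)
assert np.allclose(poles.real, 0.0)
assert np.allclose(sorted(poles.imag), [-np.sqrt(k), np.sqrt(k)])
```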
\[ \frac{d}{dt} \left[ \begin{array}{c} x \\ \dot{x} \end{array} \right] = \left[ \begin{array}{rr} 0 & 1 \\ -k & -b \end{array} \right] \left[ \begin{array}{c} x \\ \dot{x} \end{array} \right] = A \left[ \begin{array}{c} x \\ \dot{x} \end{array} \right] \]
\[ \det (sI -A) = \det \left( \begin{array}{rr} s & -1 \\ k & s +b \end{array} \right) =s^2 + bs+ k, \]
Let \(\Delta := b^2 - 4k\). If \(b \geq 2\sqrt{k}\), then \(\Delta \geq 0\) and
\[ \sigma(A) = \left\{ \frac{-b +\sqrt{\Delta}}{2}, \frac{-b - \sqrt{\Delta}}{2} \right\}. \]
Otherwise,
\[ \sigma(A) = \left\{ \frac{-b + i \sqrt{-\Delta}}{2}, \frac{-b - i \sqrt{-\Delta}}{2} \right\}. \]
Thus, if \(b \geq 2\sqrt{k}\),
\[ \max \{\Re \, s \; | \; s \in \sigma(A)\} = \frac{-b +\sqrt{b^2 - 4k}}{2} < 0 \]
and otherwise
\[ \max \{\Re \, s \; | \; s \in \sigma(A)\} = -\frac{b}{2} < 0. \]
In each case, the system is globally asymptotically stable.
If \(b \geq 2\sqrt{k}\), the poles are real-valued; the components of the solution do not oscillate.
If \(0 < b < 2\sqrt{k}\), the imaginary part of the poles is
\[ \pm \frac{\sqrt{4k - b^2}}2 = \pm \sqrt{k - (b/2)^2}, \]
thus the solution components oscillate at the rotational frequency
\[ \omega = \sqrt{k - (b/2)^2}. \]
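Both damping regimes can be checked against these formulas; a sketch with illustrative values of \(b\) and \(k\):

```python
import numpy as np

def poles(b, k):
    """Poles of x'' + b x' + k x = 0 (eigenvalues of the companion matrix)."""
    return np.linalg.eigvals(np.array([[0.0, 1.0], [-k, -b]]))

# Underdamped: 0 < b < 2 sqrt(k), poles -b/2 +/- i sqrt(k - (b/2)^2)
k, b = 4.0, 1.0
p = poles(b, k)
w = np.sqrt(k - (b / 2) ** 2)
assert np.allclose(p.real, -b / 2)
assert np.allclose(sorted(p.imag), [-w, w])

# Overdamped: b >= 2 sqrt(k), real poles (-b +/- sqrt(b^2 - 4k)) / 2
k, b = 4.0, 5.0
p = poles(b, k)
assert np.allclose(p.imag, 0.0)
assert np.allclose(sorted(p.real), [-4.0, -1.0])  # (-5 - 3)/2 and (-5 + 3)/2
```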
Consider the system
\[ \dot{x} = J x \; \mbox{ with } \; J = \left[ \begin{array}{ccccc} 0 & 1 & 0 & \cdots & 0 \\ 0 & 0 & 1 & \cdots & 0 \\ \vdots & \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & 0 & \cdots & 1 \\ 0 & 0 & 0 & \cdots & 0 \end{array} \right] \]
Compute the solution \(x(t)\) when
\[ x(0) = \left[ \begin{array}{c} 0 \\ \vdots \\ 0 \\ 1 \end{array} \right]. \]
Compute the solution for an arbitrary \(x(0)\)
\[ x(0) = \left[ \begin{array}{c} x_{1}(0) \\ \vdots \\ \vdots \\ x_{n}(0) \end{array} \right].\]
Same questions for the system
\[ \dot{x} = (\lambda I + J)x \]
for some \(\lambda \in \mathbb{C}\).
🗝️ Hint: Find the ODE satisfied by \(y(t):= x(t)e^{-\lambda t}\).
Is the system asymptotically stable?
Why does the stability analysis of this system matter?
Let \(x=(x_1, \dots, x_n)\).
The ODE \(\dot{x} = J x\) is equivalent to: \[ \begin{array}{lcl} \dot{x}_1 &=& x_2 \\ \dot{x}_2 &=& x_3 \\ \vdots &\vdots & \vdots \\ \dot{x}_{n-1} &=& x_n \\ \dot{x}_n &=& 0. \end{array} \]
When \(x(0) = (0, \dots, 0, 1)\),
\(\dot{x}_n=0\) yields \(x_n(t) = 1\), then
\(\dot{x}_{n-1}=x_n\) yields \(x_{n-1}(t) = t\),
…
and, by induction, \(\dot{x}_{k} = x_{k+1}\) yields \[ x_{k}(t) = \frac{t^{n-k}}{(n-k)!}. \]
To summarize: \[ x(t) = \left[ \begin{array}{c} t^{n-1} / (n-1)! \\ \vdots \\ t^{n-1-k}/(n-1-k)! \\ \vdots \\ t \\ 1 \end{array} \right] \]
We note that
\[ x(0) = x_1(0) \left[ \begin{array}{c} 1 \\ \vdots \\ 0 \\ 0 \end{array} \right] + \dots + x_{n-1}(0) \left[ \begin{array}{c} 0 \\ \vdots \\ 1 \\ 0 \end{array} \right] + x_n(0) \left[ \begin{array}{c} 0 \\ \vdots \\ 0 \\ 1 \end{array} \right].\]
Similarly to the previous question, we find that:
\[ x(0) = \left[ \begin{array}{c} 0 \\ \vdots \\ 0 \\ 1 \\ 0 \end{array} \right] \; \to \; x(t) = \left[ \begin{array}{c} t^{n-2} / (n-2)! \\ \vdots \\ t \\ 1 \\ 0 \end{array} \right] \]
\[ x(0) = \left[ \begin{array}{c} 0 \\ \vdots \\ 1 \\ 0 \\ 0 \end{array} \right] \; \to \; x(t) = \left[ \begin{array}{c} t^{n-3} / (n-3)! \\ \vdots \\ 1 \\ 0 \\ 0 \end{array} \right] \]
And more generally, by linearity:
\[ x(t) = \left[ \begin{array}{c} \displaystyle x_1(0) + \dots + x_{n-1}(0) \frac{t^{n-2}}{(n-2)!} + x_n(0) \frac{t^{n-1}}{(n-1)!} \\ \vdots \\ \displaystyle x_{n-2}(0) + x_{n-1}(0) t + x_{n}(0) \frac{t^2}{2} \\ x_{n-1}(0) + x_n(0) t \\ x_n(0) \end{array} \right]. \]
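The polynomial structure of the solutions also appears directly in \(e^{Jt}\), whose entry \((i, j)\) is \(t^{j-i}/(j-i)!\) for \(j \geq i\); a numerical sketch with an illustrative size \(n\):

```python
import math

import numpy as np
from scipy.linalg import expm

n, t = 4, 0.5  # illustrative size and time
J = np.diag(np.ones(n - 1), k=1)  # ones on the superdiagonal
E = expm(J * t)
# Entry (i, j) of e^{Jt} is t^(j-i) / (j-i)! above the diagonal, 0 below
expected = np.array([[t ** (j - i) / math.factorial(j - i) if j >= i else 0.0
                      for j in range(n)] for i in range(n)])
assert np.allclose(E, expected)
```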
If \(\dot{x}(t) = (\lambda I + J) x(t)\) and \(y(t) = x(t)e^{-\lambda t}\), then
\[ \begin{split} \dot{y}(t) &= \dot{x}(t) e^{-\lambda t} + x(t) (-\lambda e^{-\lambda t}) \\ &= (\lambda I + J)x(t) e^{-\lambda t} - \lambda I x(t) e^{-\lambda t} \\ &= J x(t) e^{-\lambda t} \\ &= J y(t). \end{split} \]
Since \(y(0) = x(0) e^{-\lambda \cdot 0} = x(0)\), the previous question yields \(y(t)\), and thus \(x(t) = y(t) e^{\lambda t}\):
\[ x(t) = \left[ \begin{array}{c} \displaystyle x_1(0) + \dots + x_n(0) \frac{t^{n-1}}{(n-1)!} \\ \vdots \\ x_{n-1}(0) + x_n(0) t \\ x_n(0) \end{array} \right] e^{\lambda t}. \]
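Equivalently, since \(\lambda I\) commutes with \(J\), \(e^{(\lambda I + J)t} = e^{\lambda t} e^{Jt}\), which is the factor \(e^{\lambda t}\) appearing above; a quick check with illustrative values:

```python
import numpy as np
from scipy.linalg import expm

lam, t, n = -0.5 + 1.0j, 0.8, 4  # illustrative values
J = np.diag(np.ones(n - 1), k=1)
I = np.eye(n)
# lambda*I commutes with J, so the exponential factors
assert np.allclose(expm((lam * I + J) * t), np.exp(lam * t) * expm(J * t))
```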
The structure of \(x(t)\) shows that every component of the solution is a polynomial in \(t\) multiplied by \(e^{\lambda t}\). Consequently:
If \(\Re \lambda < 0\), then the system is asymptotically stable.
If \(\Re \lambda \geq 0\), then the system is not.
For example, when \(x(0)= (1, 0, \dots, 0)\), we have \[ x(t) = (e^{\lambda t}, 0, \dots, 0), \] whose norm \(e^{t \Re \lambda} \geq 1\) does not converge to \(0\).
Every square complex matrix \(A\), even if it is not diagonalizable, is similar to a block-diagonal matrix (its Jordan normal form) in which each block has the structure \(\lambda I + J\).
Thus, the result of the previous question allows us to prove the 📝 Stability Criteria in the general case.