17. Linear Time-Invariant Systems
A linear time-invariant (LTI) system is governed by a linear ODE with constant coefficients. These equations can be recast as a first-order system of ODEs,

\[
\dot{\mathbf{x}}(t) = \mathbf{A}\,\mathbf{x}(t) + \mathbf{B}\,\mathbf{u}(t) , \qquad
\mathbf{y}(t) = \mathbf{C}\,\mathbf{x}(t) + \mathbf{D}\,\mathbf{u}(t) ,
\]

with state \(\mathbf{x}(t)\), input \(\mathbf{u}(t)\), and output \(\mathbf{y}(t)\).
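As a concrete sketch of the recasting (the mechanical system here is an illustrative choice, not from the notes), a mass-spring-damper \(m \ddot{q}(t) + c \dot{q}(t) + k q(t) = f(t)\) becomes a first-order system by taking \(\mathbf{x} = [q, \dot{q}]^T\) and \(u = f\),

\[
\mathbf{A} = \begin{bmatrix} 0 & 1 \\ -k/m & -c/m \end{bmatrix} , \qquad
\mathbf{B} = \begin{bmatrix} 0 \\ 1/m \end{bmatrix} ,
\]

so that the second-order ODE reads \(\dot{\mathbf{x}} = \mathbf{A}\mathbf{x} + \mathbf{B} u\).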
Exploiting the properties of the matrix exponential, the general expression of the state can be written as the sum of the free response to the initial condition and the forced response.
Proof in time domain
Multiplying by \(e^{-\mathbf{A} t}\),

\[
e^{-\mathbf{A} t}\,\dot{\mathbf{x}}(t) - e^{-\mathbf{A} t}\,\mathbf{A}\,\mathbf{x}(t) = \frac{d}{dt}\left( e^{-\mathbf{A} t}\,\mathbf{x}(t) \right) = e^{-\mathbf{A} t}\,\mathbf{B}\,\mathbf{u}(t) ,
\]
and integrating from \(0^-\) to a generic time value \(t\),

\[
e^{-\mathbf{A} t}\,\mathbf{x}(t) - \mathbf{x}(0^-) = \int_{0^-}^{t} e^{-\mathbf{A} \tau}\,\mathbf{B}\,\mathbf{u}(\tau)\, d\tau .
\]
Multiplying by \(e^{\mathbf{A} t}\), the state \(\mathbf{x}(t)\) can be written as the sum of the free response and the forced response. The general expression of the state and the output as functions of time reads

\[
\mathbf{x}(t) = e^{\mathbf{A} t}\,\mathbf{x}(0^-) + \int_{0^-}^{t} e^{\mathbf{A}(t - \tau)}\,\mathbf{B}\,\mathbf{u}(\tau)\, d\tau , \qquad
\mathbf{y}(t) = \mathbf{C}\,\mathbf{x}(t) + \mathbf{D}\,\mathbf{u}(t) .
\]
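A minimal numerical sketch of this formula (the matrices, input, and quadrature resolution are illustrative choices, not from the notes), comparing the free plus forced response against scipy's LTI simulator:

```python
# Check x(t) = e^{At} x(0) + ∫ e^{A(t-τ)} B u(τ) dτ against scipy.signal.lsim.
import numpy as np
from scipy.linalg import expm
from scipy.integrate import trapezoid
from scipy.signal import lsim

A = np.array([[0.0, 1.0], [-4.0, -0.4]])     # illustrative damped oscillator
B = np.array([[0.0], [1.0]])
C = np.eye(2)
D = np.zeros((2, 1))
x0 = np.array([1.0, 0.0])
u = lambda tau: np.sin(2.0 * tau)            # regular (non-impulsive) input

def state(tk, n=401):
    """Free response plus quadrature of the convolution integral."""
    tau = np.linspace(0.0, tk, n)
    integrand = np.stack([expm(A * (tk - s)) @ B[:, 0] * u(s) for s in tau])
    return expm(A * tk) @ x0 + trapezoid(integrand, tau, axis=0)

t = np.linspace(0.0, 10.0, 1001)
_, _, x_ref = lsim((A, B, C, D), U=u(t), T=t, X0=x0)

for tk in (2.5, 5.0, 10.0):
    k = np.argmin(np.abs(t - tk))
    print(np.max(np.abs(state(tk) - x_ref[k])))   # small discretization error
```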
Laplace domain. Denoting Laplace transforms with a tilde, the Laplace transform of the problem reads

\[
s\,\tilde{\mathbf{x}}(s) - \mathbf{x}(0^-) = \mathbf{A}\,\tilde{\mathbf{x}}(s) + \mathbf{B}\,\tilde{\mathbf{u}}(s)
\quad \rightarrow \quad
\tilde{\mathbf{x}}(s) = \left( s \mathbf{I} - \mathbf{A} \right)^{-1} \mathbf{x}(0^-) + \left( s \mathbf{I} - \mathbf{A} \right)^{-1} \mathbf{B}\,\tilde{\mathbf{u}}(s) ,
\]

with output \(\tilde{\mathbf{y}}(s) = \mathbf{C}\,\tilde{\mathbf{x}}(s) + \mathbf{D}\,\tilde{\mathbf{u}}(s)\).
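With zero initial conditions this reduces to \(\tilde{\mathbf{y}}(s) = \left[ \mathbf{C} (s\mathbf{I} - \mathbf{A})^{-1} \mathbf{B} + \mathbf{D} \right] \tilde{\mathbf{u}}(s)\); a minimal sketch (illustrative matrices and test point) cross-checks this expression against scipy.signal.ss2tf:

```python
# Evaluate C (sI - A)^{-1} B + D at a test point and compare with ss2tf.
import numpy as np
from scipy.signal import ss2tf

A = np.array([[0.0, 1.0], [-4.0, -0.4]])
B = np.array([[0.0], [1.0]])
C = np.array([[1.0, 0.0]])
D = np.zeros((1, 1))

num, den = ss2tf(A, B, C, D)
s = 0.3 + 2.0j                                   # arbitrary complex test point
H_poly = np.polyval(num[0], s) / np.polyval(den, s)
H_ss = (C @ np.linalg.inv(s * np.eye(2) - A) @ B + D)[0, 0]
print(np.isclose(H_poly, H_ss))                  # True
```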
Performing the inverse Laplace transform brings the solution back to the time domain (use the inverse Laplace transform of the matrix exponential, and formula (13.1) for the Laplace transform of a convolution).
17.1. Impulsive force
The effect of an impulsive force at time \(t = 0\) is equivalent to an instantaneous change of the initial state, from its value at time \(0^-\) before the impulse to its value at time \(0^+\) after the impulse. Splitting the input \(\mathbf{u}(t)\) into the sum of an impulsive input and a regular input,

\[
\mathbf{u}(t) = \mathbf{u}_0\, \delta(t) + \mathbf{u}_r(t) ,
\]
the solution in the time and Laplace domains reads

\[
\mathbf{x}(t) = e^{\mathbf{A} t} \left( \mathbf{x}(0^-) + \mathbf{B}\,\mathbf{u}_0 \right) + \int_{0^+}^{t} e^{\mathbf{A}(t - \tau)}\,\mathbf{B}\,\mathbf{u}_r(\tau)\, d\tau , \qquad
\tilde{\mathbf{x}}(s) = \left( s \mathbf{I} - \mathbf{A} \right)^{-1} \left( \mathbf{x}(0^-) + \mathbf{B}\,\mathbf{u}_0 \right) + \left( s \mathbf{I} - \mathbf{A} \right)^{-1} \mathbf{B}\,\tilde{\mathbf{u}}_r(s) ,
\]

i.e. the impulse produces the jump \(\mathbf{x}(0^+) = \mathbf{x}(0^-) + \mathbf{B}\,\mathbf{u}_0\) in the state.
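A minimal sketch of this equivalence (illustrative matrices; a single output keeps scipy.signal.impulse in SISO form), comparing the impulse response from rest with the free response restarted from \(\mathbf{x}(0^+) = \mathbf{B}\):

```python
# An impulse B·δ(t) applied at rest equals a free response from x(0+) = B.
import numpy as np
from scipy.linalg import expm
from scipy.signal import impulse

A = np.array([[0.0, 1.0], [-4.0, -0.4]])
B = np.array([[0.0], [1.0]])
C = np.array([[1.0, 0.0]])                  # observe the first state only
D = np.zeros((1, 1))

t = np.linspace(0.0, 10.0, 500)
_, y_imp = impulse((A, B, C, D), T=t)       # u(t) = δ(t), x(0-) = 0
y_free = np.array([(C @ expm(A * tk) @ B)[0, 0] for tk in t])

print(np.max(np.abs(y_imp - y_free)))       # ≈ 0
```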
17.2. Properties
Matrix exponential. The matrix exponential is defined by the power series

\[
e^{\mathbf{A} t} = \sum_{k=0}^{\infty} \frac{\left( \mathbf{A} t \right)^k}{k!} .
\]

Assuming it's possible to swap the derivative operator and the summation (when? Whenever the power series converges uniformly on compact time intervals, which is always the case here, so term-by-term differentiation is legitimate), it's possible to write

\[
\frac{d}{dt} e^{\mathbf{A} t} = \sum_{k=1}^{\infty} \frac{\mathbf{A}^{k} t^{k-1}}{(k-1)!} = \mathbf{A}\, e^{\mathbf{A} t} = e^{\mathbf{A} t}\, \mathbf{A} .
\]
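A short numerical sketch of both facts (the random matrix, truncation order, and step size are illustrative choices):

```python
# Truncated series Σ (A t)^k / k! vs scipy's expm, and d/dt e^{At} = A e^{At}.
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))
t = 0.7

# Partial sums of the defining power series.
series, term = np.zeros((3, 3)), np.eye(3)
for k in range(1, 30):
    series += term                   # add the (k-1)-th term
    term = term @ (A * t) / k        # build the k-th term recursively
print(np.max(np.abs(series - expm(A * t))))          # ~ machine precision

# Term-by-term differentiation, checked with a centered finite difference.
h = 1e-6
dexp_dt = (expm(A * (t + h)) - expm(A * (t - h))) / (2.0 * h)
print(np.max(np.abs(dexp_dt - A @ expm(A * t))))     # small
```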
Laplace transform of the matrix exponential.

\[
\mathcal{L}\left\{ e^{\mathbf{A} t} \right\}(s) = \int_{0^-}^{+\infty} e^{-s t}\, e^{\mathbf{A} t}\, dt = \int_{0^-}^{+\infty} e^{\left( -s \mathbf{I} + \mathbf{A} \right) t}\, dt = \left( s \mathbf{I} - \mathbf{A} \right)^{-1} ,
\]

for all the values of \(s\) for which \(-s\mathbf{I} + \mathbf{A}\) is asymptotically stable, i.e. has all its eigenvalues with negative real part (thus assuming that the matrix \(\mathbf{A}\) is diagonalizable. What happens if not? Exploit other matrix decompositions to draw conclusions), and thus for all the values of \(s\) with \(\text{re}\{ s \} > \max_k \text{re}\left\{ s_k(\mathbf{A}) \right\}\), as shown in Example 17.1.
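A numerical sketch of this transform pair (illustrative \(\mathbf{A}\) and test value \(s\); the improper integral is truncated at a finite horizon):

```python
# Check ∫ e^{-st} e^{At} dt = (sI - A)^{-1} for re{s} > max re{s_k(A)}.
import numpy as np
from scipy.linalg import expm
from scipy.integrate import trapezoid

A = np.array([[0.0, 1.0], [-4.0, -0.4]])        # eigenvalues ≈ -0.2 ± 2.0j
s = 1.0                                          # re{s} > max re{s_k(A)} = -0.2
assert s > np.max(np.linalg.eigvals(A).real)

t = np.linspace(0.0, 60.0, 6001)                 # e^{(A - sI)t} has decayed by t = 60
F = np.stack([np.exp(-s * tk) * expm(A * tk) for tk in t])
L_expm = trapezoid(F, t, axis=0)

print(np.max(np.abs(L_expm - np.linalg.inv(s * np.eye(2) - A))))  # small
```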
Example 17.1 (Asymptotic stability of a matrix \(\mathbf{A}\))
An \(N \times N\) diagonalizable matrix \(\mathbf{A}\),

\[
\mathbf{A}\,\mathbf{v}_k = s_k\,\mathbf{v}_k , \qquad k = 1:N , \tag{17.1}
\]

is asymptotically stable, i.e. \(e^{\mathbf{A} t} \to 0\) as \(t \to +\infty\), if and only if all its eigenvalues have negative real part, \(\text{re}\left\{ s_k \right\} < 0\), \(\forall k = 1:N\).
The eigenvalues of a matrix \(a \mathbf{I} + \mathbf{A}\) are \(a + s_k\), while the eigenvectors are the same as those of the matrix \(\mathbf{A}\). This can be easily proved by adding \(a \mathbf{I}\, \mathbf{v}_k\) to both sides of equation (17.1),

\[
\left( a \mathbf{I} + \mathbf{A} \right) \mathbf{v}_k = \left( a + s_k \right) \mathbf{v}_k .
\]

With \(a = -s\), the matrix \(-s\mathbf{I} + \mathbf{A}\) has eigenvalues \(s_k - s\), which all have negative real part precisely when \(\text{re}\{ s \} > \max_k \text{re}\left\{ s_k(\mathbf{A}) \right\}\).
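A quick numerical sketch of this eigenvalue shift (the random matrix and the shift \(a\) are illustrative choices):

```python
# Check that eig(aI + A) = a + eig(A) with unchanged eigenvectors.
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((4, 4))
a = -2.5

w, V = np.linalg.eig(A)
w_shift, _ = np.linalg.eig(a * np.eye(4) + A)

# Same spectrum up to the shift a (sort both spectra before comparing).
print(np.allclose(np.sort_complex(w + a), np.sort_complex(w_shift)))  # True
# The original eigenvectors still satisfy (aI + A) v_k = (a + s_k) v_k.
print(np.allclose((a * np.eye(4) + A) @ V, V * (w + a)))              # True
```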
Transform of the convolution. The Laplace transform maps a convolution in time into a product of transforms,

\[
\mathcal{L}\left\{ \int_{0^-}^{t} f(t - \tau)\, g(\tau)\, d\tau \right\}(s) = \tilde{f}(s)\, \tilde{g}(s) ,
\]

which is the property used above to transform the forced response.
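A numerical sketch of the property (the signals, grid, and test point are illustrative choices; the discrete grid introduces a small quadrature error):

```python
# Check L{f * g}(s) ≈ L{f}(s) · L{g}(s) at a single test point s.
import numpy as np
from scipy.integrate import trapezoid

t = np.linspace(0.0, 40.0, 4001)
dt = t[1] - t[0]
f = np.exp(-1.0 * t)
g = np.exp(-2.0 * t) * np.sin(3.0 * t)
s = 0.5

fg = np.convolve(f, g)[: t.size] * dt            # (f * g)(t) on the same grid
laplace = lambda h: trapezoid(np.exp(-s * t) * h, t)

print(laplace(fg), laplace(f) * laplace(g))      # nearly equal
```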