5.1. Sensitivity of spectral decomposition
Matrices involved in an eigenvalue problem can be functions of some parameters \(p\). Sensitivity analysis evaluates the first-order changes [1] of eigenvalues and eigenvectors following an increment of the parameter \(p = \overline{p} + \Delta p\),

\[
s_i(\overline{p} + \Delta p) = s_i(\overline{p}) + \Delta p \, s_{i/p}(\overline{p}) + o(\Delta p) \ , \qquad
\mathbf{u}_i(\overline{p} + \Delta p) = \mathbf{u}_i(\overline{p}) + \Delta p \, \mathbf{u}_{i/p}(\overline{p}) + o(\Delta p) \ .
\]
Here the terms \(s_{i/p}(\overline{p})\) and \(\mathbf{u}_{i/p}(\overline{p})\) are the sensitivities of the \(i^{th}\) eigenvalue and eigenvector, respectively, w.r.t. an increment \(\Delta p\) of the parameter \(p\), starting from the reference configuration determined by the value \(\overline{p}\) of the parameter.
Rank of sensitivity. The sensitivity of an eigenvalue to a scalar parameter is a scalar quantity; the sensitivity of an eigenvector to a scalar parameter is a vector quantity. If the parameter \(\mathbf{p}\) is itself a vector quantity, the sensitivity of an eigenvalue w.r.t. \(\mathbf{p}\) is a vector quantity, and the sensitivity of an eigenvector w.r.t. \(\mathbf{p}\) is a matrix (or tensor, sometimes…) quantity, as the sketch below illustrates.
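A minimal sketch of these shapes, with hypothetical dimensions \(n\) and \(m\) introduced here only for illustration:

```python
import numpy as np

n, m = 4, 3              # state dimension and number of parameters (illustrative)

s_p = np.zeros(m)        # sensitivity of one eigenvalue w.r.t. a vector parameter:
                         # one scalar per component p_k, i.e. a vector of length m
u_p = np.zeros((n, m))   # sensitivity of one eigenvector: one n-vector per
                         # parameter component, i.e. an n-by-m matrix
```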
5.1.1. Generalized eigenvalue problem (first order)
The sensitivities of the eigenvalues and right eigenvectors to a parameter \(p\) of the matrices \(\mathbf{A}(p)\), \(\mathbf{B}(p)\) read

\[
s_{i/p} = \mathbf{v}_i^* \left( \mathbf{A}_{/p} - s_i \mathbf{B}_{/p} \right) \mathbf{u}_i \ , \qquad
\mathbf{u}_{i/p} = \sum_{j \ne i} \frac{\mathbf{v}_j^* \left( \mathbf{A}_{/p} - s_i \mathbf{B}_{/p} \right) \mathbf{u}_i}{s_i - s_j} \, \mathbf{u}_j \ ,
\]
having exploited the normalization condition \(\mathbf{v}_i^* \mathbf{B} \mathbf{u}_i = 1\), where \(\mathbf{u}_i\) and \(\mathbf{v}_i\) are the right and left eigenvectors [2] associated with the eigenvalue \(s_i\).
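A minimal numerical sketch of this setting, with hypothetical matrices \(\mathbf{A}(p)\), \(\mathbf{B}(p)\) chosen only for illustration. scipy.linalg.eig with left=True returns left eigenvectors satisfying \(\mathbf{v}_i^* \mathbf{A} = s_i \, \mathbf{v}_i^* \mathbf{B}\); the right eigenvectors are then rescaled to enforce the normalization \(\mathbf{v}_i^* \mathbf{B} \mathbf{u}_i = 1\):

```python
import numpy as np
from scipy import linalg

# hypothetical parameter-dependent matrices, for illustration only
def A(p): return np.array([[2.0 + p, 1.0], [0.5, 3.0 - p]])
def B(p): return np.array([[1.0, 0.0], [0.0, 1.0 + 0.1 * p]])

p0 = 0.3
# right eigenvectors u_i (columns of VR), left eigenvectors v_i (columns of VL)
s, VL, VR = linalg.eig(A(p0), B(p0), left=True, right=True)
VL, VR = VL.astype(complex), VR.astype(complex)      # allow complex rescaling

for i in range(len(s)):
    VR[:, i] /= VL[:, i].conj() @ B(p0) @ VR[:, i]   # enforce v_i^* B u_i = 1
    # with this scaling, v_i^* A u_i = s_i
    assert np.isclose(VL[:, i].conj() @ A(p0) @ VR[:, i], s[i])
```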
5.1.1.1. Right and left eigenvalue problem
Right and left eigenvalue problems are defined as

\[
\left( \mathbf{A} - s_i \mathbf{B} \right) \mathbf{u}_i = \mathbf{0} \ , \qquad
\mathbf{v}_i^* \left( \mathbf{A} - s_i \mathbf{B} \right) = \mathbf{0}^* \ . \tag{5.2}
\]
Property 5.1 (Eigenvalues of the right and left spectral problems)
Left and right eigenvalue problems have the same set of eigenvalues.
Proof
A nontrivial left eigenvector exists iff the rows of \(\mathbf{A} - s \mathbf{B}\) are linearly dependent, i.e. iff \(\det \left( \mathbf{A} - s \mathbf{B} \right) = 0\): this is the same characteristic equation that determines the eigenvalues of the right problem, so the two problems share the same set of eigenvalues.
Property 5.2 (Orthogonality conditions of right and left spectral problems)
Left and right eigenvectors associated with different eigenvalues are orthogonal w.r.t. the matrices of the system, i.e.

\[
\mathbf{v}_j^* \mathbf{A} \mathbf{u}_i = a^{(i)} \delta_{ij} \ , \qquad
\mathbf{v}_j^* \mathbf{B} \mathbf{u}_i = b^{(i)} \delta_{ij} \ ,
\]
with \(a^{(i)} = s_i b^{(i)}\). The parameters \(a^{(i)}\), \(b^{(i)}\) are not uniquely determined, and one of them (or a combination of them) can be used in a normalization condition, removing the arbitrariness of the eigenvectors up to a multiplicative factor.
Proof
Starting from the left and right eigenvalue problems (5.2) for two different eigenvalues \(s_i \ne s_j\), multiplying (on the left) the right eigenvalue problem by \(\mathbf{v}_j^*\) and multiplying (on the right) the left eigenvalue problem by \(\mathbf{u}_i\),

\[
\mathbf{v}_j^* \mathbf{A} \mathbf{u}_i = s_i \, \mathbf{v}_j^* \mathbf{B} \mathbf{u}_i \ , \qquad
\mathbf{v}_j^* \mathbf{A} \mathbf{u}_i = s_j \, \mathbf{v}_j^* \mathbf{B} \mathbf{u}_i \ , \tag{5.3}
\]
and subtracting the two equations,

\[
0 = \left( s_i - s_j \right) \mathbf{v}_j^* \mathbf{B} \mathbf{u}_i
\quad \Rightarrow \quad
\mathbf{v}_j^* \mathbf{B} \mathbf{u}_i = 0 \ , \quad i \ne j \ , \tag{5.4}
\]
as the left-hand-side terms are the same, and \(s_i \ne s_j\). As (5.4) holds, from (5.3) it also follows that

\[
\mathbf{v}_j^* \mathbf{A} \mathbf{u}_i = 0 \ , \quad i \ne j \ .
\]
These relations do not hold for left and right eigenvectors associated with the same eigenvalue: for \(i = j\) the products \(\mathbf{v}_i^* \mathbf{A} \mathbf{u}_i\) and \(\mathbf{v}_i^* \mathbf{B} \mathbf{u}_i\) are in general nonzero, and thus can be used in normalization conditions.
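These orthogonality conditions are easy to verify numerically. A sketch with hypothetical matrices (chosen only for illustration, with distinct eigenvalues), where \(\mathbf{V}^* \mathbf{A} \mathbf{U}\) and \(\mathbf{V}^* \mathbf{B} \mathbf{U}\) come out numerically diagonal, with diagonal entries related by \(a^{(i)} = s_i b^{(i)}\):

```python
import numpy as np
from scipy import linalg

# hypothetical matrices with distinct eigenvalues, for illustration only
A = np.array([[2.0, 1.0, 0.0],
              [0.0, 3.0, 0.5],
              [1.0, 0.0, 4.0]])
B = np.diag([1.1, 1.2, 1.3])

s, VL, VR = linalg.eig(A, B, left=True, right=True)

VAU = VL.conj().T @ A @ VR   # entries v_j^* A u_i
VBU = VL.conj().T @ B @ VR   # entries v_j^* B u_i

# off-diagonal entries vanish (Property 5.2) ...
assert np.allclose(VAU, np.diag(np.diag(VAU)), atol=1e-8)
assert np.allclose(VBU, np.diag(np.diag(VBU)), atol=1e-8)
# ... and the diagonals satisfy a^(i) = s_i b^(i)
assert np.allclose(np.diag(VAU), s * np.diag(VBU))
```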
5.1.1.2. Sensitivity of eigenvalue
The derivative w.r.t. the parameter \(p\) of the right eigenvalue problem

\[
\left( \mathbf{A}(p) - s_i(p) \mathbf{B}(p) \right) \mathbf{u}_i(p) = \mathbf{0}
\]

reads

\[
\left( \mathbf{A}_{/p} - s_{i/p} \mathbf{B} - s_i \mathbf{B}_{/p} \right) \mathbf{u}_i + \left( \mathbf{A} - s_i \mathbf{B} \right) \mathbf{u}_{i/p} = \mathbf{0} \ .
\]
Multiplying on the left by the left eigenvector \(\mathbf{v}_i^*\), and recalling that the left eigenvalue problem reads \(\mathbf{0}^* = \mathbf{v}_i^* \left( \mathbf{A} - s_i \mathbf{B} \right)\), the last term is identically zero and

\[
\mathbf{v}_i^* \left( \mathbf{A}_{/p} - s_{i/p} \mathbf{B} - s_i \mathbf{B}_{/p} \right) \mathbf{u}_i = 0 \ ,
\]
so that it's possible to find the expression of the eigenvalue sensitivity as

\[
s_{i/p} = \frac{\mathbf{v}_i^* \left( \mathbf{A}_{/p} - s_i \mathbf{B}_{/p} \right) \mathbf{u}_i}{\mathbf{v}_i^* \mathbf{B} \mathbf{u}_i}
= \mathbf{v}_i^* \left( \mathbf{A}_{/p} - s_i \mathbf{B}_{/p} \right) \mathbf{u}_i \ , \tag{5.6}
\]
where the last step exploits the normalization condition \(\mathbf{v}_i^* \mathbf{B} \mathbf{u}_i = b^{(i)} = 1\).
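Expression (5.6) lends itself to a direct numerical check. A sketch with hypothetical matrices and their analytic derivatives dA, dB (illustrative choices), comparing the formula against a finite-difference approximation of the eigenvalue:

```python
import numpy as np
from scipy import linalg

# hypothetical parameter-dependent matrices and derivatives, for illustration only
def A(p): return np.array([[2.0 + p, 1.0], [0.5, 3.0 - p]])
def B(p): return np.array([[1.0, 0.0], [0.0, 1.0 + 0.1 * p]])
dA = np.array([[1.0, 0.0], [0.0, -1.0]])   # dA/dp
dB = np.array([[0.0, 0.0], [0.0, 0.1]])    # dB/dp

p0, dp, i = 0.3, 1e-6, 0

s, VL, VR = linalg.eig(A(p0), B(p0), left=True, right=True)
u, v = VR[:, i], VL[:, i]
u = u / (v.conj() @ B(p0) @ u)             # normalization v_i^* B u_i = 1

s_p = v.conj() @ (dA - s[i] * dB) @ u      # eigenvalue sensitivity, eq. (5.6)

s1 = linalg.eigvals(A(p0 + dp), B(p0 + dp))
j = np.argmin(np.abs(s1 - s[i]))           # match the perturbed eigenvalue
print(s_p, (s1[j] - s[i]) / dp)            # should agree to O(dp)
```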
5.1.1.3. Sensitivity of eigenvector
The sensitivity of the eigenvector, \(\mathbf{u}_{i/p}\), is the solution of the linear system that can be obtained from the derivative of the eigenvalue problem,

\[
\left( \mathbf{A} - s_i \mathbf{B} \right) \mathbf{u}_{i/p} = - \left( \mathbf{A}_{/p} - s_{i/p} \mathbf{B} - s_i \mathbf{B}_{/p} \right) \mathbf{u}_i =: \mathbf{b}_i \ ,
\]
once the sensitivity of the eigenvalue \(s_{i/p}\) is known from expression (5.6).
Property 5.3 (Linear system is singular, \(\text{K}(\mathbf{A} - s_i \mathbf{B}) = \text{span}\{ \mathbf{u}_i \}\))
The linear system is singular, as \(s_i\) is an eigenvalue of the problem, \(|\mathbf{A} - s_i \mathbf{B}| = 0\), and the kernel of the matrix is the space spanned by the right eigenvectors associated with \(s_i\), as

\[
\left( \mathbf{A} - s_i \mathbf{B} \right) \mathbf{u}_i = \mathbf{0} \ .
\]
As the matrix of the linear system is singular, the linear system has no solution if the right-hand side is not in the range of the matrix of the system, i.e. \(\mathbf{b}_i \notin \text{R}(\mathbf{A}-s_i \mathbf{B})\).
Fortunately, the RHS belongs to the range of the matrix, \(\mathbf{b}_i \in \text{R}(\mathbf{A}- s_i \mathbf{B})\), as it’s proved in the box below.
Property 5.4 (\(\ \mathbf{b}_i \in \text{R}(\mathbf{A} - s_i \mathbf{B})\))
\(\mathbf{b}_i\) belongs to \(\text{R}(\mathbf{A} - s_i \mathbf{B})\) if and only if it is orthogonal to \(\text{K}(\mathbf{A}^* - s_i^* \mathbf{B}^*)\), as the range of a matrix is the orthogonal complement of the kernel of its adjoint, \(\text{R}(\mathbf{M}) = \text{K}(\mathbf{M}^*)^\perp\). todo add link
The kernel of \(\mathbf{A}^* - s_i^* \mathbf{B}^*\) is spanned by the left eigenvectors \(\mathbf{v}_i\). Thus, using the expression (5.6) of the eigenvalue sensitivity in evaluating the product,

\[
\mathbf{v}_i^* \mathbf{b}_i = - \mathbf{v}_i^* \left( \mathbf{A}_{/p} - s_{i/p} \mathbf{B} - s_i \mathbf{B}_{/p} \right) \mathbf{u}_i
= - s_{i/p} + s_{i/p} \, \mathbf{v}_i^* \mathbf{B} \mathbf{u}_i = 0 \ ,
\]
it's readily proved that \(\mathbf{v}_i^* \mathbf{b}_i = 0\), using the normalization \(\mathbf{v}_i^* \mathbf{B} \mathbf{u}_i = 1\). todo: treat the case where the eigenvalue multiplicity is larger than 1.
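The compatibility condition \(\mathbf{v}_i^* \mathbf{b}_i = 0\) can also be checked numerically, continuing the illustrative example used in the previous sketches:

```python
import numpy as np
from scipy import linalg

# same hypothetical matrices and derivatives as in the previous sketches
def A(p): return np.array([[2.0 + p, 1.0], [0.5, 3.0 - p]])
def B(p): return np.array([[1.0, 0.0], [0.0, 1.0 + 0.1 * p]])
dA = np.array([[1.0, 0.0], [0.0, -1.0]])
dB = np.array([[0.0, 0.0], [0.0, 0.1]])

p0, i = 0.3, 0
s, VL, VR = linalg.eig(A(p0), B(p0), left=True, right=True)
u, v = VR[:, i], VL[:, i]
u = u / (v.conj() @ B(p0) @ u)            # v_i^* B u_i = 1

s_p = v.conj() @ (dA - s[i] * dB) @ u     # eq. (5.6)
b = -(dA - s_p * B(p0) - s[i] * dB) @ u   # RHS of the singular system
print(np.abs(v.conj() @ b))               # ~ machine precision: b_i orthogonal to the left kernel
```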
As the singular linear system has at least one solution, it has infinitely many solutions, since the kernel of the matrix is not empty. If \(\widetilde{\mathbf{u}}_{i/p}\) is a solution of the system, adding any linear combination of the vectors of \(\text{K}(\mathbf{A} - s_i \mathbf{B}) = \text{span}\{ \mathbf{u}_i \}\) produces a solution as well,

\[
\mathbf{u}_{i/p} = \widetilde{\mathbf{u}}_{i/p} + \mathbf{U}_i \boldsymbol\beta \ ,
\]
as \(\left( \mathbf{A} - s_i \mathbf{B} \right) \mathbf{U}_i = \mathbf{0}\), where \(\mathbf{U}_i\) is the matrix whose columns are the right eigenvectors \(\mathbf{u}_i\) associated with the eigenvalue \(s_i\), spanning the kernel of the matrix.
In order to remove this arbitrariness, it's possible to introduce the orthogonality condition \(\mathbf{V}_i^* \mathbf{B} \widetilde{\mathbf{u}}_{i/p} = \mathbf{0}\), which sets to zero the coefficients \(\boldsymbol\beta = \mathbf{0}\) of the linear combination of the vectors of the kernel.
Property 5.5 (Augmented linear system)
The singular system can be augmented with the orthogonality condition \(\mathbf{V}_i^* \mathbf{B} \mathbf{u}_{i/p} = \mathbf{0}\); if everything goes right, the coefficients \(\boldsymbol{\beta}\) in the solution of the augmented system must be zero.
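A minimal sketch of one possible realization of the augmented system (a bordered formulation assumed here, not prescribed by the text): for a simple eigenvalue, the singular matrix is bordered with the column \(\mathbf{B} \mathbf{u}_i\) and the row \(\mathbf{v}_i^* \mathbf{B}\), the resulting system is non-singular, and the coefficient \(\beta\) indeed comes out zero:

```python
import numpy as np
from scipy import linalg

# same hypothetical matrices and derivatives as in the previous sketches
def A(p): return np.array([[2.0 + p, 1.0], [0.5, 3.0 - p]])
def B(p): return np.array([[1.0, 0.0], [0.0, 1.0 + 0.1 * p]])
dA = np.array([[1.0, 0.0], [0.0, -1.0]])
dB = np.array([[0.0, 0.0], [0.0, 0.1]])

p0, i, n = 0.3, 0, 2
s, VL, VR = linalg.eig(A(p0), B(p0), left=True, right=True)
u, v = VR[:, i], VL[:, i]
u = u / (v.conj() @ B(p0) @ u)                    # v_i^* B u_i = 1

s_p = v.conj() @ (dA - s[i] * dB) @ u             # eq. (5.6)
b = -(dA - s_p * B(p0) - s[i] * dB) @ u           # RHS of the singular system

# bordered system: [A - s_i B, B u_i; v_i^* B, 0] [u_p; beta] = [b_i; 0]
M = np.block([[A(p0) - s[i] * B(p0), (B(p0) @ u)[:, None]],
              [(v.conj() @ B(p0))[None, :], np.zeros((1, 1))]])
x = np.linalg.solve(M, np.concatenate([b, [0.0]]))
u_p, beta = x[:n], x[n]
print(np.abs(beta))                               # ~ 0, as expected
```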
Property 5.6 (Decomposition of the solution using the modal basis)
Writing the solution as a linear combination of the right eigenvectors,

\[
\widetilde{\mathbf{u}}_{i/p} = \mathbf{U}_{\notin i} \boldsymbol\alpha + \mathbf{U}_i \boldsymbol\beta \ ,
\]
it's possible to solve for \(\boldsymbol\alpha\) and \(\boldsymbol\beta\) and then retrieve the solution of the underdetermined linear system. Projecting the linear system onto \(\mathbf{V}_{\notin i}\), and prescribing the orthogonality condition \(\mathbf{V}_i^* \mathbf{B} \widetilde{\mathbf{u}}_{i/p} = \mathbf{0}\) on the solution, two decoupled sets of equations for \(\boldsymbol\alpha\) and \(\boldsymbol\beta\) appear,

\[
\mathbf{V}_{\notin i}^* \left( \mathbf{A} - s_i \mathbf{B} \right) \mathbf{U}_{\notin i} \, \boldsymbol\alpha = \mathbf{V}_{\notin i}^* \mathbf{b}_i \ , \qquad
\mathbf{V}_i^* \mathbf{B} \mathbf{U}_i \, \boldsymbol\beta = \mathbf{0} \ ,
\]
as \(\left( \mathbf{A} - s_i \mathbf{B} \right) \mathbf{U}_i = \mathbf{0}\) and \(\mathbf{V}_{i}^* \mathbf{B} \mathbf{U}_{\notin i} = \mathbf{0}\). Exploiting the diagonalization properties, the system can be written as two uncoupled diagonal systems

\[
\left( a^{(j)} - s_i b^{(j)} \right) \alpha_j = \mathbf{v}_j^* \mathbf{b}_i \ , \quad j \ne i \ , \qquad
b^{(i)} \boldsymbol\beta = \mathbf{0} \ ,
\]
or, exploiting the normalization condition \(\mathbf{v}_j^* \mathbf{B} \mathbf{u}_j = 1\), and thus \(b^{(j)} = 1\) and \(a^{(j)} = s_j\),

\[
\left( s_j - s_i \right) \alpha_j = \mathbf{v}_j^* \mathbf{b}_i \ , \quad j \ne i \ , \qquad
\boldsymbol\beta = \mathbf{0} \ .
\]
The solution reads

\[
\alpha_j = \frac{\mathbf{v}_j^* \mathbf{b}_i}{s_j - s_i} \ , \quad j \ne i \ , \qquad
\boldsymbol\beta = \mathbf{0} \ ,
\]
and thus

\[
\mathbf{u}_{i/p} = \sum_{j \ne i} \frac{\mathbf{v}_j^* \mathbf{b}_i}{s_j - s_i} \, \mathbf{u}_j
= \sum_{j \ne i} \frac{\mathbf{v}_j^* \left( \mathbf{A}_{/p} - s_i \mathbf{B}_{/p} \right) \mathbf{u}_i}{s_i - s_j} \, \mathbf{u}_j \ ,
\]

having used \(\mathbf{v}_j^* \mathbf{b}_i = - \mathbf{v}_j^* \left( \mathbf{A}_{/p} - s_i \mathbf{B}_{/p} \right) \mathbf{u}_i\) for \(j \ne i\), which follows from the orthogonality \(\mathbf{v}_j^* \mathbf{B} \mathbf{u}_i = 0\). This is the expression anticipated at the beginning of the section.
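A sketch checking this modal-summation expression against a finite difference, with the same hypothetical matrices as above; the perturbed eigenvector is rescaled so that \(\mathbf{v}_i^* \mathbf{B}(\overline{p}) \, \mathbf{u}_i(\overline{p} + \Delta p) = 1\), consistently with the gauge \(\mathbf{v}_i^* \mathbf{B} \mathbf{u}_{i/p} = 0\) chosen above:

```python
import numpy as np
from scipy import linalg

# same hypothetical matrices and derivatives as in the previous sketches
def A(p): return np.array([[2.0 + p, 1.0], [0.5, 3.0 - p]])
def B(p): return np.array([[1.0, 0.0], [0.0, 1.0 + 0.1 * p]])
dA = np.array([[1.0, 0.0], [0.0, -1.0]])
dB = np.array([[0.0, 0.0], [0.0, 0.1]])

def modes(p):
    # sorted eigenpairs, normalized so that v_j^* B u_j = 1
    s, VL, VR = linalg.eig(A(p), B(p), left=True, right=True)
    order = np.argsort(s.real)
    s, VL, VR = s[order], VL[:, order].astype(complex), VR[:, order].astype(complex)
    for k in range(len(s)):
        VR[:, k] /= VL[:, k].conj() @ B(p) @ VR[:, k]
    return s, VL, VR

p0, dp, i = 0.3, 1e-6, 0
s, VL, VR = modes(p0)
u, v = VR[:, i], VL[:, i]

# u_{i/p} = sum_{j != i} v_j^* (dA - s_i dB) u_i / (s_i - s_j) * u_j
u_p = sum((VL[:, j].conj() @ (dA - s[i] * dB) @ u) / (s[i] - s[j]) * VR[:, j]
          for j in range(len(s)) if j != i)

# finite-difference check in the same gauge: rescale u_i(p0 + dp)
s1, VL1, VR1 = modes(p0 + dp)
u1 = VR1[:, i] / (v.conj() @ B(p0) @ VR1[:, i])
print(u_p, (u1 - u) / dp)                         # should agree to O(dp)
```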
5.1.2. Generalized eigenvalue problem (second order)
5.1.2.1. Right and left eigenvalue problem
[1] First-order changes are meant as derivatives w.r.t. the parameter \(p\), evaluated at the reference value \(\overline{p}\), that appear in a Taylor series of the quantity of interest, \(s(p_i) = s(\overline{p}_i) + \Delta p_k \, \partial_{p_k} s(\overline{p}_i) + o(\Delta p_i)\).
[2] Eigenvalues with algebraic and geometric multiplicity larger than one, \(m_i^{(a)} = m_i^{(g)} > 1\), have associated eigenvectors that span a subspace of dimension \(m_i\). In this subspace, it's always possible to define a set of orthogonal vectors. todo: write it better; refer to the introduction to spectral decomposition… todo: treat the sensitivity to parameters of eigenvalues and eigenvectors with multiplicity larger than one…