Formulation of the Eigenproblem
Definition.
Let \(\,F\ \) be a linear operator defined on a vector space \(\,V(K)\,.\ \) If a vector \(\,v\in V\!\smallsetminus\!\{\theta\}\ \) and a scalar \(\,\lambda\in K\ \) satisfy the equality

\(Fv\,=\,\lambda\,v\,,\qquad(1)\)
then \(\,\lambda\ \) is an eigenvalue of the operator \(\,F,\ \) and \(\,v\,\) is an eigenvector of the operator \(\,F\ \) associated with the eigenvalue \(\,\lambda\ \) (an eigenvector for the value \(\,\lambda\)).
Formula (1) is the eigenequation (the eigenproblem) of the linear operator \(\,F\,.\)
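As a quick illustration (not part of the definition), condition (1) can be checked numerically; the matrix, the vector and the use of NumPy below are our own illustrative choices:

```python
import numpy as np

# Illustrative example: for this matrix A the vector v = (1, 1)
# satisfies A v = 3 v, so v is an eigenvector for lambda = 3.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
v = np.array([1.0, 1.0])
lam = 3.0

print(A @ v)                         # [3. 3.]
print(np.allclose(A @ v, lam * v))   # True
```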
Remarks and comments.
The definition excludes the zero vector \(\,\theta\ \) from the set of eigenvectors, even though it satisfies the condition (1) with an arbitrary eigenvalue \(\,\lambda.\ \) On the other hand, the eigenvalue \(\,\lambda\ \) may be equal to \(\,0.\ \) If \(\,0\ \) is an eigenvalue of the operator \(\,F,\ \) then the set of all eigenvectors associated with this eigenvalue, together with the zero vector, is the kernel of the operator \(\,F.\)
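A minimal numerical sketch of this remark, with an illustrative singular matrix of our own choice:

```python
import numpy as np

# Illustrative singular matrix: 0 is one of its eigenvalues, and the
# eigenvectors for 0, together with the zero vector, form the kernel.
A = np.array([[1.0, 2.0],
              [2.0, 4.0]])
v = np.array([2.0, -1.0])            # a non-zero vector from the kernel

print(A @ v)                         # [0. 0.]  ->  A v = 0 * v
print(np.linalg.det(A))              # 0.0, consistent with 0 being an eigenvalue
```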
The action of the operator \(\,F\ \) on its eigenvector \(\,v\ \) boils down to multiplying \(\,v\ \) by a scalar. If \(\,V\ \) is a space of geometric vectors, this means that the operator \(\,F\ \) acting on the vector \(\,v\ \) does not change its direction (it may change its length or orientation).
In quantum mechanics, physical quantities that are measurable in a quantum system (observables) are represented by Hermitian linear operators defined on a unitary space of states of the system. If the operator \(\,F\ \) represents an observable \(\,\mathcal{F},\ \) then its eigenvalues are possible measurement results of this observable. The Hermitian property of the operator \(\,F\ \) guarantees that the measurement results obtained in such a way are real numbers.
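A small numerical check of this fact (the Hermitian matrix below is an illustrative choice, not taken from the text):

```python
import numpy as np

# Illustrative Hermitian matrix (H equals its conjugate transpose).
H = np.array([[2.0,        1.0 - 1.0j],
              [1.0 + 1.0j, 3.0       ]])

print(np.allclose(H, H.conj().T))    # True: H is Hermitian
print(np.linalg.eigvalsh(H))         # [1. 4.]  ->  the eigenvalues are real
```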
To solve the eigenvalue problem (1) for the operator \(\,F\ \) means to find all its eigenvalues \(\,\lambda\ \) and the eigenvectors \(\,v\ \) associated with these values.
Let \(\,\mathcal{B}\ \) be a basis of the \(\,n\)-dimensional space \(\,V.\ \) Then the operator \(\,F\ \) corresponds to the matrix \(\,M_{\mathcal{B}}(F)=\boldsymbol{A}=[\alpha_{ij}]_{n\times n}\in M_n(K)\,,\ \) and a vector \(\,v\ \) corresponds to the column of its coefficients in the basis \(\,\mathcal{B}:\ \ I_{\mathcal{B}}(v)=\boldsymbol{x}\in K^n.\ \) The eigenequation may be rewritten as follows:

\(\boldsymbol{A}\,\boldsymbol{x}\,=\,\lambda\,\boldsymbol{x}\qquad\Leftrightarrow\qquad(\boldsymbol{A}-\lambda\,\boldsymbol{I}_n)\,\boldsymbol{x}\,=\,\boldsymbol{0}\,.\qquad(\clubsuit)\)
The last equation, where \(\,\boldsymbol{I}_n\,\) denotes the identity matrix of size \(\,n,\ \) is a homogeneous system of linear equations with the matrix \(\ \boldsymbol{A}-\lambda\,\boldsymbol{I}_n\,.\ \) The theory of systems of linear equations tells us that the non-zero solutions \(\,\boldsymbol{x}\,\) (and these are the ones we are interested in here) exist if and only if

\(\det\,(\boldsymbol{A}-\lambda\,\boldsymbol{I}_n)\,=\,0\,.\qquad(2)\)
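This dichotomy can be observed numerically; in the sketch below the matrix and the two test values of \(\,\lambda\ \) are illustrative choices:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
I = np.eye(2)

# lambda = 3 satisfies det(A - lambda*I) = 0, so the homogeneous system has
# non-zero solutions; lambda = 5 does not, so only the zero solution exists.
for lam in (3.0, 5.0):
    M = A - lam * I
    print(lam, np.linalg.det(M), np.linalg.matrix_rank(M))
# 3.0  ~0.0  1   (rank deficient -> non-trivial solutions)
# 5.0   8.0  2   (full rank      -> only x = 0)
```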
Definition.
Let \(\,\boldsymbol{A}\,\in\,M_n(K)\,,\ \) and \(\,\boldsymbol{I}_n\ \) be the identity matrix of size \(\,n.\,\) The degree \(\,n\) polynomial \(\,w(\lambda)=\det\,(\boldsymbol{A}-\lambda\,\boldsymbol{I}_n)\ \) is the characteristic polynomial of the matrix \(\,\boldsymbol{A}\,.\ \) The equation \(\ w(\lambda)=0\ \) is the characteristic equation, and its solutions are the characteristic roots of the matrix \(\,\boldsymbol{A}.\)
We have deduced that the solutions \(\,v\neq\theta\ \) of the eigenvalue problem (1) exist if and only if \(\,\lambda\ \) is a characteristic root of the matrix of the operator \(\,F\ \) in a certain basis \(\,\mathcal{B}\,.\)
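For the same illustrative matrix as above, the characteristic polynomial and its roots can be obtained numerically (note that np.poly uses the convention \(\det(\lambda\,\boldsymbol{I}_n-\boldsymbol{A})\), which differs from \(\,w(\lambda)\ \) only by the factor \((-1)^n\)):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# np.poly returns the coefficients of the monic polynomial det(lambda*I - A);
# its roots are exactly the characteristic roots of A.
coeffs = np.poly(A)                  # [ 1. -4.  3.]  i.e. lambda^2 - 4*lambda + 3
print(coeffs)
print(np.roots(coeffs))              # [3. 1.]  ->  the characteristic roots
```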
However, in different bases the operator \(\,F\ \) is represented by different matrices. This raises the question of whether the admissible values \(\,\lambda\ \) depend on the chosen basis.
It turns out that even though a change of basis results in a change of the matrix of the linear operator, the characteristic polynomial and its roots remain the same.
Indeed, let \(\,\mathcal{B}'\ \) be another basis of the space \(\,V\ \) and let \(\,M_{\mathcal{B}'}(F)=\boldsymbol{A}'\,.\ \) Then \(\,\boldsymbol{A}'=\boldsymbol{S}^{-1}\boldsymbol{A}\,\boldsymbol{S}\,,\ \) where \(\,\boldsymbol{S}\ \) is the transition matrix from the basis \(\,\mathcal{B}\ \) to the basis \(\,\mathcal{B}'\,.\ \) Moreover,

\(\det\,(\boldsymbol{A}'-\lambda\,\boldsymbol{I}_n)\ =\ \det\,(\boldsymbol{S}^{-1}\boldsymbol{A}\,\boldsymbol{S}-\lambda\,\boldsymbol{S}^{-1}\boldsymbol{I}_n\,\boldsymbol{S})\ =\ \det\,\left[\,\boldsymbol{S}^{-1}(\boldsymbol{A}-\lambda\,\boldsymbol{I}_n)\,\boldsymbol{S}\,\right]\ =\ \det\,\boldsymbol{S}^{-1}\cdot\det\,(\boldsymbol{A}-\lambda\,\boldsymbol{I}_n)\cdot\det\,\boldsymbol{S}\ =\ \det\,(\boldsymbol{A}-\lambda\,\boldsymbol{I}_n)\,.\)
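A numerical sanity check of this invariance, with an illustrative matrix \(\,\boldsymbol{A}\ \) and transition matrix \(\,\boldsymbol{S}\ \) of our own choice:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
S = np.array([[1.0, 1.0],            # an invertible matrix, playing the role of
              [0.0, 1.0]])           # the transition matrix between two bases
A_prime = np.linalg.inv(S) @ A @ S   # the matrix of the same operator in the new basis

print(np.poly(A))                    # [ 1. -4.  3.]
print(np.poly(A_prime))              # [ 1. -4.  3.]  (up to rounding)
```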
The above considerations lead to
Corollary.
If \(\,V\ \) is a finite dimensional vector space, then the eigenvalues of the linear operator \(\,F\in\text{End}(V)\ \) are characteristic roots of the matrix of this operator in any basis of the space \(\,V\,.\)
The question of solvability of the eigenvalue problem is treated in
Theorem 1.
Every linear operator defined on a finite dimensional complex vector space has eigenvectors.
This follows from the fundamental theorem of algebra, which states that every polynomial of positive degree with complex coefficients has at least one complex root.
Hence, if \(\,K=C\,,\ \) then the characteristic equation (2) has a complex root \(\,\lambda_0\,,\ \) which, substituted into the equation (\(\clubsuit\)), determines a suitable eigenvector (in fact, at least a \(\,1\)-dimensional subspace of eigenvectors).
Theorem 1 is not true for real vector spaces. For example, consider the operator of rotation by an angle \(\,\phi\neq k\pi,\ k\in Z\,,\ \) defined on the (real) vector space of geometric vectors with the initial point at the origin of the Cartesian coordinate system. This operator changes the direction of each non-zero vector, and thus has no eigenvectors.
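The same counterexample can be checked numerically; over the real numbers there are no eigenvalues, while over the complex numbers the rotation matrix does have (complex) eigenvalues:

```python
import numpy as np

phi = np.pi / 3                      # an angle that is not a multiple of pi
R = np.array([[np.cos(phi), -np.sin(phi)],
              [np.sin(phi),  np.cos(phi)]])

# The characteristic polynomial lambda^2 - 2*cos(phi)*lambda + 1 has a
# negative discriminant, so no real roots; over C the eigenvalues are
# cos(phi) +/- i*sin(phi).
print(np.linalg.eigvals(R))          # [0.5+0.866...j  0.5-0.866...j]
```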
If \(\,V\ \) is an \(\,n\)-dimensional complex vector space, then the characteristic polynomial \(\,w(\lambda)\ \) of the linear operator \(\,F\in\text{End}(V)\ \) has \(\,n\ \) (not necessarily different) roots:

\(w(\lambda)\,=\,(-1)^n\,(\lambda-\lambda_1)^{k_1}\,(\lambda-\lambda_2)^{k_2}\ldots\,(\lambda-\lambda_s)^{k_s}\,,\qquad k_1+k_2+\ldots+k_s\,=\,n\qquad(3)\)
(In the case of a real vector space, \(\,K=R,\ \) the factorisation (3) may contain quadratic factors \(\,\lambda^2+p\,\lambda+q\ \) with negative discriminant \(\,\Delta\).) The power \(\,k_i\ \) is the multiplicity of the root \(\,\lambda_i\ \) of the polynomial \(\,w(\lambda)\ \) and at the same time the algebraic multiplicity of the eigenvalue \(\,\lambda_i\ \) of the operator \(\,F\,.\) The geometric multiplicity of the eigenvalue \(\,\lambda_i\ \) is, by definition, the number of linearly independent eigenvectors associated with this eigenvalue.
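A standard illustrative example of the distinction between the two multiplicities (the matrix below is our own choice):

```python
import numpy as np

# Here w(lambda) = (lambda - 2)^2, so the eigenvalue 2 has algebraic
# multiplicity 2, but A - 2I has rank 1, so only one linearly independent
# eigenvector exists: the geometric multiplicity equals 1.
A = np.array([[2.0, 1.0],
              [0.0, 2.0]])
lam = 2.0

print(np.poly(A))                                               # [ 1. -4.  4.]
print(A.shape[0] - np.linalg.matrix_rank(A - lam * np.eye(2)))  # 1
```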
Note that if \(\ \ Fv_1=\lambda\,v_1\,,\ \ Fv_2=\lambda\,v_2\,,\quad v_1,v_2\in V\!\smallsetminus\!\{\theta\}\,,\ \,\) then for \(\ \ \alpha_1,\alpha_2\in K:\)

\(F(\alpha_1 v_1+\alpha_2 v_2)\,=\,\alpha_1\,Fv_1+\alpha_2\,Fv_2\,=\,\alpha_1\lambda\,v_1+\alpha_2\lambda\,v_2\,=\,\lambda\,(\alpha_1 v_1+\alpha_2 v_2)\,.\)
Hence, each linear combination (which is not the zero vector) of eigenvectors associated with the eigenvalue \(\,\lambda\ \) gives an eigenvector associated with the same eigenvalue.
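A quick numerical confirmation, using an illustrative diagonal matrix with a two-dimensional eigenspace:

```python
import numpy as np

# Illustrative matrix with a 2-dimensional eigenspace for lambda = 3.
A = np.diag([3.0, 3.0, 1.0])
v1 = np.array([1.0, 0.0, 0.0])       # eigenvector for 3
v2 = np.array([0.0, 1.0, 0.0])       # another eigenvector for 3
w = 2.0 * v1 - 5.0 * v2              # a non-zero linear combination

print(np.allclose(A @ w, 3.0 * w))   # True: w is again an eigenvector for 3
```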
Referring to the subspace criterion, we can formulate
Corollary.
The set of all eigenvectors associated with the eigenvalue \(\,\lambda\ \) of the linear operator \(\,F\in\text{End}(V),\,\) together with the zero vector \(\,\theta,\,\) forms a vector space (a subspace of the space \(\,V\)) of dimension equal to the geometric multiplicity of the eigenvalue \(\,\lambda.\)
Practical solution of the eigenvalue problem for the linear operator \(\,F\ \) acting on the \(\,n\)-dimensional vector space \(\,V(K)\ \) consists of three stages (a numerical sketch follows the list):
1. Construction of the matrix \(\,M_{\mathcal{B}}(F)\equiv\boldsymbol{A}=[\alpha_{ij}]_{n\times n}\in M_n(K)\ \) of the operator \(\,F\ \) in a basis \(\,\mathcal{B}\ \) of the space \(\,V\,.\)
2. Calculation of the eigenvalues \(\,\lambda\ \) of the operator \(\,F\ \) as the roots of the characteristic equation (2) of the matrix \(\,\boldsymbol{A}\,\) and determination of their algebraic multiplicities.
3. Substitution of each eigenvalue \(\,\lambda\ \) into the equation (\(\clubsuit\)), calculation of the associated eigenvectors (determined by their coordinates in the basis \(\,\mathcal{B}\)) and determination of the geometric multiplicity of \(\,\lambda\,.\)
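A minimal numerical sketch of the three stages, assuming NumPy and an illustrative \(3\times 3\) matrix of our own choice (in practice np.linalg.eig combines stages 2 and 3):

```python
import numpy as np

# Stage 1: the matrix A of the operator F in a chosen basis B
# (here an illustrative symmetric matrix, assumed to be given).
A = np.array([[4.0, 1.0, 0.0],
              [1.0, 4.0, 0.0],
              [0.0, 0.0, 3.0]])

# Stage 2: eigenvalues as roots of the characteristic equation (2);
# here 5 is a simple root and 3 has algebraic multiplicity 2.
print(np.round(np.roots(np.poly(A)), 6))

# Stage 3: eigenvectors, i.e. non-zero solutions of (A - lambda*I) x = 0;
# np.linalg.eig returns them as coordinate columns in the basis B.
vals, vecs = np.linalg.eig(A)
for lam, x in zip(vals, vecs.T):
    print(np.round(lam, 6), np.round(x, 6), np.allclose(A @ x, lam * x))
```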