Problems
Exercise 1.
Let \(\,V\ \) be a unitary space and \(\ \,Y\ \) its subspace: \(\ Y<V\,.\ \) The orthogonal complement of the subspace \(\,Y\,\) is the set \(\,Y^\perp\,\) of all vectors in the space \(\,V\,\) that are orthogonal to every vector in the space \(\,Y:\)
\(\quad Y^\perp\ =\ \{\,x\in V:\ \ \langle x,y\rangle = 0\ \ \text{for every}\ \ y\in Y\,\}\,.\)
Prove that \(\ Y^\perp\ \) is a subspace: \(\ \,Y^\perp<V\,.\)
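A possible outline, not part of the original exercise text (the inner product is taken to be linear in its first argument): the subspace criterion follows from linearity, since for \(\,u,v\in Y^\perp,\ \ \alpha,\beta\in C\ \) and every \(\,y\in Y\)
\(\quad \langle\,\alpha\,u+\beta\,v,\,y\,\rangle\ =\ \alpha\,\langle u,y\rangle+\beta\,\langle v,y\rangle\ =\ 0\,,\)
so \(\ \alpha\,u+\beta\,v\in Y^\perp;\ \) moreover \(\ Y^\perp\neq\varnothing\,,\ \) because \(\,\langle\boldsymbol{0},y\rangle=0\ \) for every \(\,y\in Y\,,\ \) and thus \(\,\boldsymbol{0}\in Y^\perp.\)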
Exercise 2.
The method gram_schmidt() of Sage orthogonalizes the set of rows of a given matrix \(\,\boldsymbol{A}.\ \)
The method returns a pair of matrices \(\,(\boldsymbol{G},\boldsymbol{M}).\ \) The rows of the matrix \(\,\boldsymbol{G}\ \) form an orthogonal (though, in general, not orthonormal) set obtained by applying the Gram-Schmidt process to the rows of the matrix \(\boldsymbol{A},\ \) and the lower triangular matrix \(\boldsymbol{M}\) satisfies the condition \(\,\boldsymbol{A}=\boldsymbol{M}\boldsymbol{G}\ \) (in older versions of Sage one had to add the identity matrix \(\,\boldsymbol{I}\ \) to \(\,\boldsymbol{M}\)).
If \(\,\boldsymbol{A}\ \) is a square matrix of size \(\,n,\ \) then, by the row rule of matrix multiplication, the entries of the \(\,i\)-th row of the matrix \(\,\boldsymbol{M}\ \) are the coefficients of the linear combination of the rows of the matrix \(\,\boldsymbol{G}\ \) that equals the \(\,i\)-th row of the matrix \(\,\boldsymbol{A},\ \ i=1,2,\dots,n.\)
Run the following code and check whether:
- the condition \(\,\boldsymbol{A}=\boldsymbol{M}\boldsymbol{G}\ \) holds;
- the product \(\,\boldsymbol{G}\boldsymbol{G}^{\,T}\ \) is a diagonal matrix (which is equivalent to orthogonality of the set of rows of the matrix \(\,\boldsymbol{G}\));
- the product \(\,\boldsymbol{G}^{\,T}\boldsymbol{G}\ \) is a diagonal matrix (which is equivalent to orthogonality of the set of columns of the matrix \(\,\boldsymbol{G}\)).
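The original code cell is not reproduced here; the following is a minimal Sage sketch of such a check, with an arbitrarily chosen invertible matrix \(\,\boldsymbol{A}\ \) (the particular entries are an assumption of this sketch):

```python
# A minimal sketch: Gram-Schmidt via Sage (the matrix A is an arbitrary example)
A = matrix(QQ, [[ 1, 2, 2],
                [-1, 0, 2],
                [ 0, 1, 1]])

G, M = A.gram_schmidt()        # rows of G: orthogonalized rows of A

print(A == M*G)                # expected: True  (A = M*G)
print(G*G.transpose())         # expected: diagonal -- rows of G are orthogonal
print(G.transpose()*G)         # in general NOT diagonal
```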
Let \(\,\boldsymbol{Q}\ \) be the matrix obtained from the matrix \(\,\boldsymbol{G}\ \) by normalizing its rows.
Compute the products \(\,\boldsymbol{Q}\boldsymbol{Q}^{\,T}\ \) and \(\,\boldsymbol{Q}^{\,T}\boldsymbol{Q}\,:\)
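Again, the original code cell is not shown; below is a hedged sketch of one way to do this in Sage. The row norms become symbolic square roots, so the result lives over the symbolic ring (if the products are not simplified automatically, simplify_full() can be applied to them):

```python
# Normalize each row of G; Q then has orthonormal rows
Q = matrix([r / r.norm() for r in G.rows()])

print(Q*Q.transpose())         # expected: the identity matrix I_n
print(Q.transpose()*Q)         # expected: also I_n, since a square matrix
                               # with orthonormal rows is orthogonal
```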
Corollary.
Orthogonality of rows of matrix \(\,\boldsymbol{G}\in M_n(R)\ \) does not imply orthogonality of its columns and vice versa: in general \(\,\boldsymbol{G}\,\boldsymbol{G}^{\,T}\neq\boldsymbol{G}^{\,T}\boldsymbol{G},\,\) unless \(\,\boldsymbol{G}=\lambda\,\boldsymbol{Q},\ \lambda\in R,\) where \(\,\boldsymbol{Q}\ \) is an orthogonal matrix; \(\,\) then \(\,\boldsymbol{G}\,\boldsymbol{G}^{\,T}=\,\boldsymbol{G}^{\,T}\boldsymbol{G}\,=\, \lambda^2\,\boldsymbol{I}_n\,.\)
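For instance (an illustrative example, not part of the original text), the matrix \(\ \boldsymbol{G}=\begin{bmatrix}\,1&1\,\\ \,2&-2\,\end{bmatrix}\ \) has orthogonal rows of unequal norms \(\,\sqrt{2}\ \) and \(\,2\sqrt{2}\,;\ \) here \(\ \boldsymbol{G}\,\boldsymbol{G}^{\,T}=\text{diag}\,(2,8)\ \) is diagonal, while \(\ \boldsymbol{G}^{\,T}\boldsymbol{G}=\begin{bmatrix}\,5&-3\,\\ \,-3&5\,\end{bmatrix}\ \) is not.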
Discussion
The above corollary is related to a property of diagonal matrices, described in the following
Lemma.
In the algebra \(\,M_n(R)\ \) of real square matrices of size \(\,n\,,\,\) a diagonal matrix \(\,\boldsymbol{D}\ \) commutes with every matrix \(\,\boldsymbol{A}\ \) of this algebra if and only if it is proportional to the identity matrix:
\(\quad \boldsymbol{A}\boldsymbol{D}\,=\,\boldsymbol{D}\boldsymbol{A}\ \ \text{for every}\ \ \boldsymbol{A}\in M_n(R) \qquad\Longleftrightarrow\qquad \boldsymbol{D}\,=\,\alpha\,\boldsymbol{I}_n\,,\quad \alpha\in R\,.\)
We will denote a diagonal matrix with entries \(\ \alpha_1,\,\alpha_2,\,\ldots,\,\alpha_n\ \) by:
\(\quad \text{diag}\,(\alpha_1,\,\alpha_2,\,\ldots,\,\alpha_n)\,.\)
Consider a matrix \(\ \boldsymbol{A}\in M_n(R)\ \) with columns \(\ \boldsymbol{C}_1,\,\boldsymbol{C}_2,\,\ldots,\,\boldsymbol{C}_n\ \) and rows \(\ \boldsymbol{R}_1,\,\boldsymbol{R}_2,\,\ldots,\,\boldsymbol{R}_n\,.\ \) This notation allows us to write the column and row rules of matrix multiplication for the product of the matrix \(\ \boldsymbol{A}\ \) and a diagonal matrix:
\(\quad \boldsymbol{A}\cdot\text{diag}\,(\alpha_1,\ldots,\alpha_n)\ \) has columns \(\ \alpha_1\,\boldsymbol{C}_1,\ \alpha_2\,\boldsymbol{C}_2,\ \ldots,\ \alpha_n\,\boldsymbol{C}_n\,,\ \) while \(\ \text{diag}\,(\alpha_1,\ldots,\alpha_n)\cdot\boldsymbol{A}\ \) has rows \(\ \alpha_1\,\boldsymbol{R}_1,\ \alpha_2\,\boldsymbol{R}_2,\ \ldots,\ \alpha_n\,\boldsymbol{R}_n\,.\)
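A quick Sage check of these rules (the sample matrices below are arbitrary):

```python
# Right multiplication by D scales the columns; left multiplication scales the rows
A = matrix(QQ, [[1, 2],
                [3, 4]])
D = diagonal_matrix(QQ, [5, 7])

print(A*D)   # columns scaled: [ 5 14]
             #                 [15 28]
print(D*A)   # rows scaled:    [ 5 10]
             #                 [21 28]
```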
Proof of the lemma.
\(\ \Rightarrow\ :\ \) We will show that if \(\ \boldsymbol{D}\neq\alpha\,\boldsymbol{I}_n\,,\ \) then there exists a matrix \(\,\boldsymbol{A}\ \) such that \(\,\boldsymbol{A}\boldsymbol{D}\neq\boldsymbol{D}\boldsymbol{A}\,.\)
If \(\ \boldsymbol{D}=\text{diag}\,(\alpha_1,\alpha_2,\dots,\alpha_n)\,,\ \) where \(\ \alpha_p\neq\alpha_q\,,\quad 1\leq p<q \leq n\,,\ \) then we may choose \(\ \boldsymbol{A}\ \) to be the matrix whose only non-zero element, equal to 1, say, is in the \(\,p\)-th row and the \(\,q\)-th column:
\(\quad \boldsymbol{A}\,=\,[\,a_{ij}\,]_{n\times n}\,,\qquad a_{ij}\,=\,\delta_{ip}\,\delta_{jq}\,,\quad i,j=1,2,\dots,n\,.\)
Denote: \(\ \boldsymbol{A}\boldsymbol{D}=[\,b_{ij}\,]_{n\times n}\,,\ \boldsymbol{D}\boldsymbol{A}=[\,c_{ij}\,]_{n\times n}\,.\ \)
By the column and row rules above, direct computation gives
\(\quad b_{ij}\,=\,a_{ij}\,\alpha_j\,,\qquad c_{ij}\,=\,\alpha_i\,a_{ij}\,,\qquad i,j=1,2,\dots,n\,,\)
so \(\,\boldsymbol{A}\boldsymbol{D}\neq\boldsymbol{D}\boldsymbol{A}\,,\ \) because the only non-zero element of each matrix is in the same place but takes different values: \(\ \ b_{pq}=\alpha_q\ \neq\ \alpha_p=c_{pq}\,.\)
\(\ \Leftarrow\ :\ \) If \(\ \boldsymbol{D}=\alpha\,\boldsymbol{I}_n\,,\ \) then the properties of matrix operations imply immediately that
\(\quad \boldsymbol{A}\boldsymbol{D}\,=\,\alpha\,\boldsymbol{A}\,=\,\boldsymbol{D}\boldsymbol{A} \qquad\text{for every}\ \ \boldsymbol{A}\in M_n(R)\,.\)
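Both directions of the lemma are easy to illustrate in Sage (the sample matrices are arbitrary):

```python
# Distinct diagonal entries: D fails to commute with the matrix E_{pq}
D = diagonal_matrix(QQ, [1, 2, 3])
A = matrix(QQ, 3, 3)       # zero matrix ...
A[0, 2] = 1                # ... with a single 1 in row p=1, column q=3

print(A*D == D*A)          # False: (A*D)[0,2] = 3, while (D*A)[0,2] = 1

# A scalar matrix commutes with every matrix
S = 5*identity_matrix(QQ, 3)
B = random_matrix(QQ, 3, 3)
print(B*S == S*B)          # True
```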
Now we can explain the relation between the above Corollary and Lemma.
Assume that the rows \(\ \boldsymbol{R}_1,\,\boldsymbol{R}_2,\,\ldots,\,\boldsymbol{R}_n\ \) of a matrix \(\,\boldsymbol{G}\in M_n(R)\ \) comprise an orthogonal set:
\(\quad \boldsymbol{R}_i\,\boldsymbol{R}_j^{\,T}\,=\,0\qquad\text{for}\ \ i\neq j\,,\quad i,j=1,2,\dots,n\,.\)
Then the matrix \(\ \boldsymbol{G}\,\boldsymbol{G}^{\,T}\ \) is diagonal: \(\ \ \boldsymbol{G}\,\boldsymbol{G}^{\,T}\ =\ \boldsymbol{D}\ =\ \text{diag}\,(\alpha_1,\alpha_2,\dots,\alpha_n)\,,\ \) \(\\\) where \(\ \alpha_i=\|\,\boldsymbol{R}_i\,\|^2\,,\quad i=1,2,\dots,n\,.\)
If additionally the norms of all the rows are equal:
\(\quad \|\,\boldsymbol{R}_1\,\|^2\,=\,\|\,\boldsymbol{R}_2\,\|^2\,=\,\ldots\,=\,\|\,\boldsymbol{R}_n\,\|^2\,=\,\alpha\,,\qquad\qquad (1)\)
then \(\ \boldsymbol{D}=\alpha\,\boldsymbol{I}_n\ \,\) and the matrix \(\ \boldsymbol{D}\ \) commutes with all matrices \(\ \boldsymbol{A}\in M_n(R)\,.\ \) Then
\(\quad \boldsymbol{G}\,\boldsymbol{G}^{\,T}=\,\boldsymbol{D}\quad\Leftrightarrow\quad \boldsymbol{G}^{\,T}=\,\boldsymbol{G}^{-1}\boldsymbol{D}\quad\Leftrightarrow\quad \boldsymbol{G}^{\,T}\boldsymbol{G}\,=\,\boldsymbol{G}^{-1}\boldsymbol{D}\,\boldsymbol{G}\,=\,\boldsymbol{D}\,\boldsymbol{G}^{-1}\boldsymbol{G}\,=\,\boldsymbol{D}\,,\qquad\qquad (2)\)
and so orthogonality of rows is equivalent to orthogonality of columns of the matrix \(\ \boldsymbol{G}\,.\ \) Moreover, \(\ \,\boldsymbol{G}=\lambda\,\boldsymbol{Q}\,,\ \) and for \(\ \,\lambda=\sqrt{\alpha}\ \,\) the matrix \(\ \,\boldsymbol{Q}\ \,\) is orthogonal:
\(\quad \boldsymbol{Q}\,\boldsymbol{Q}^{\,T}\,=\,\boldsymbol{Q}^{\,T}\boldsymbol{Q}\,=\, \frac{1}{\alpha}\;\boldsymbol{G}\,\boldsymbol{G}^{\,T}\,=\,\frac{1}{\alpha}\;\alpha\,\boldsymbol{I}_n\,=\,\boldsymbol{I}_n\,.\)
However, if the norms of the rows are not equal, that is, condition (1) does not hold, then \(\ \boldsymbol{D}\neq\alpha\,\boldsymbol{I}_n\ \,\) and the matrix \(\ \boldsymbol{D}\ \) does not have to commute with \(\ \boldsymbol{G}^{-1}.\ \) Therefore the equivalences (2) may fail, and orthogonality of rows does not imply orthogonality of columns of the matrix \(\ \boldsymbol{G}\,.\)
Exercise 3.
A linear operator \(\,F\ \) defined on a unitary space \(\,V(C)\ \) is anti-Hermitian if \(\,F^+=-F.\)
Show that eigenvalues of such an operator are purely imaginary numbers \(\\\) (a complex number \(\,z\ \) is purely imaginary if \(\,\text{re}\,z=0,\ \) that is, if \(\,z=i\,\alpha,\ \alpha\in R.\))
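A standard line of attack (a hint, not part of the original text; the inner product is assumed linear in its first argument): if \(\ Fx=\lambda\,x\,,\ \ x\neq\boldsymbol{0}\,,\ \) then
\(\quad \lambda\,\langle x,x\rangle\,=\,\langle Fx,\,x\rangle\,=\,\langle x,\,F^{+}x\rangle\,=\,-\,\langle x,\,Fx\rangle\,=\,-\,\bar{\lambda}\,\langle x,x\rangle\,,\)
and \(\ \langle x,x\rangle\neq 0\ \) gives \(\ \lambda=-\bar{\lambda}\,,\ \) that is \(\ \text{re}\,\lambda=0\,.\)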
Exercise 4.
Prove that a product of two Hermitian operators \(\,F_1,\,F_2\ \) is Hermitian \(\\\) if and only if these operators commute: \(\ [F_1,F_2]=0.\)
For comparison, a product of unitary operators is always unitary.
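A possible starting point (a hint, not part of the original text): the adjoint reverses products, so
\(\quad (F_1F_2)^{+}\,=\,F_2^{+}F_1^{+}\,=\,F_2\,F_1\,,\)
and hence \(\ (F_1F_2)^{+}=F_1F_2\ \) holds exactly when \(\ F_2F_1=F_1F_2\,.\)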