Orthogonality of Vectors

We assume that \(\,V\ \) is a unitary or Euclidean vector space.

If the inner product of vectors \(\,x,y\in V\ \) equals zero: \(\,\langle x,y\rangle=0\,,\ \) then we say that these vectors are \(\,\) orthogonal. \(\,\) Orthogonality thus generalizes the notion of perpendicularity of geometric vectors.

Orthogonal Set of Vectors

Definition.

A set \(\ (x_1,x_2,\dots,x_r)\ \) of pairwise orthogonal non-zero vectors from the space \(\, V\), i.e.

\[x_i\neq \theta \qquad\text{and}\qquad \langle\,x_i,x_j\rangle=0\quad\text{for}\quad i\neq j\,,\qquad i,j=1,2,\dots,r\,,\]

is called an \(\,\) orthogonal set. \(\,\) An orthogonal set of unit vectors (that is, vectors having norm \(\,1\)) \(\,\) is an \(\,\) orthonormal set.

Hence, the inner product of any two vectors from an orthonormal set \(\ (x_1,x_2,\dots,x_r)\ \) is given by

\[\begin{split}\langle\,x_i,x_j\rangle=\delta_{ij}\,,\quad\text{where}\quad\delta_{ij}\ \,=\ \, \left\{\ \begin{array}{cc} 1 & \text{for}\ \ i=j, \\ 0 & \text{for}\ \ i\ne j; \end{array} \right.\quad i,j=1,2,\ldots,r\quad \text{(the Kronecker delta).}\end{split}\]
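The Kronecker-delta condition above can be checked numerically. The following is a minimal sketch, assuming NumPy is available; the vectors \(x_1,x_2\) are illustrative choices, not taken from the text.

```python
import numpy as np

def is_orthonormal(vectors, tol=1e-12):
    """Check that <x_i, x_j> = delta_ij for all pairs (real case)."""
    for i, xi in enumerate(vectors):
        for j, xj in enumerate(vectors):
            expected = 1.0 if i == j else 0.0
            if abs(np.dot(xi, xj) - expected) > tol:
                return False
    return True

# Two orthonormal vectors in R^3 (an illustrative choice)
x1 = np.array([1.0, 1.0, 0.0]) / np.sqrt(2)
x2 = np.array([1.0, -1.0, 0.0]) / np.sqrt(2)
print(is_orthonormal([x1, x2]))        # True
print(is_orthonormal([x1, 2 * x2]))    # False: 2*x2 does not have norm 1
```

Note that \(\,(x_1,\,2\,x_2)\,\) is still an orthogonal set; only the normalization fails.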

The relation between orthogonality and linear independence of vectors is given by the following theorem.

Theorem 3.

Every orthogonal set of vectors of the space \(\,V\ \) is linearly independent.

Proof. \(\,\) Assume that the set \(\ (x_1,x_2,\dots,x_r)\ \) of vectors from the space \(\,V\ \) is orthogonal:

(1)\[\langle\,x_i,x_i\rangle>0\,,\qquad\quad \langle\,x_i,x_j\rangle=0\quad\text{for}\quad i\neq j\,,\qquad\quad i,j=1,2,\dots,r\,.\]

Let \(\quad\alpha_1\,x_1\,+\;\alpha_2\,x_2\,+\,\dots\,+\,\alpha_r\,x_r\ =\ \theta\,.\)

Taking the inner product of each of the vectors \(\ x_1,\;x_2,\,\dots,\,x_r\ \) with both sides of the above equality \(\,\) and using linearity of the inner product with respect to the second variable, \(\,\) we obtain

\begin{alignat*}{5} \alpha_1\,\langle x_1,x_1\rangle & {\,} + {\ } & \alpha_2\,\langle x_1,x_2\rangle & {\,} + {\ } & \ldots & {\ \ } + {\ } & \alpha_r\,\langle x_1,x_r\rangle & {\ } = {\ \,} & 0 \\ \alpha_1\,\langle x_2,x_1\rangle & {\,} + {\ } & \alpha_2\,\langle x_2,x_2\rangle & {\,} + {\ } & \ldots & {\ \ } + {\ } & \alpha_r\,\langle x_2,x_r\rangle & {\ } = {\ \,} & 0 \\ \dots\quad\ \ & & \dots\quad\ \ & & \ \ldots & & \dots\quad\ \ & & \\ \alpha_1\,\langle x_r,x_1\rangle & {\,} + {\ } & \alpha_2\,\langle x_r,x_2\rangle & {\,} + {\ } & \ldots & {\ \ } + {\ } & \alpha_r\,\langle x_r,x_r\rangle & {\ } = {\ \,} & 0 \end{alignat*}

Conditions (1) imply that \(\quad\alpha_1\,=\;\alpha_2\,=\;\dots\;=\,\alpha_r\ =\ 0\,.\)

Hence, the implication

\[\alpha_1\,x_1+\,\alpha_2\,x_2+\ldots+\,\alpha_r\,x_r\ =\ \theta \qquad\Rightarrow\qquad \alpha_1=\,\alpha_2=\ldots=\,\alpha_r\,=\,0\,\]

is true, and thus the vectors \(\, x_1,\,x_2,\,\dots,\,x_r\,\) are linearly independent.
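Another way to see the same fact: the Gram matrix \(\,G=[\,\langle x_i,x_j\rangle\,]\,\) of an orthogonal set is diagonal with positive entries, hence nonsingular. A minimal numerical sketch, assuming NumPy; the orthogonal vectors below are illustrative choices.

```python
import numpy as np

# An orthogonal (not normalized) set in R^3 -- illustrative values
X = np.array([[1.0,  1.0, 0.0],
              [1.0, -1.0, 0.0],
              [0.0,  0.0, 2.0]])   # rows are x_1, x_2, x_3

G = X @ X.T                        # Gram matrix: G[i, j] = <x_i, x_j>
print(G)                           # diagonal with positive entries
print(np.linalg.matrix_rank(X))    # 3 -> the rows are linearly independent
```

The diagonal Gram matrix mirrors conditions (1), and full rank is exactly linear independence of the rows.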

Corollary.

In an \(\,n\)-dimensional unitary or Euclidean vector space:

  1. every orthogonal set of \(\,n\ \) vectors is a basis;

  2. an orthogonal set of vectors cannot contain more than \(\,n\ \) vectors.

Orthonormal Basis

Definition.

A basis of finite dimensional space \(\,V\ \) whose vectors comprise an orthogonal (orthonormal) set is called an orthogonal basis (resp. an orthonormal basis).

Assume that a basis \(\,\mathcal{B}=(u_1,u_2,\dots,u_n)\ \) of the space \(\,V\ \) is orthonormal:

\[\langle\,u_i,u_j\rangle\,=\,\delta_{ij}\,,\qquad i,j=1,2,\dots,n.\]
  1. Let \(\ \,v\,=\,\displaystyle\sum_{k\,=\,1}^n\ \alpha_k\,u_k\,.\ \,\)

Then, by linearity of the inner product in the second variable:

(2)\[\begin{split}\begin{array}{l} \displaystyle \langle\,u_i,v\,\rangle\ \,=\ \, \left\langle u_i\,,\ \sum_{k\,=\,1}^n\ \alpha_k\,u_k\right\rangle \ =\ \sum_{k\,=\,1}^n\ \alpha_k\,\langle u_i,u_k\rangle \ =\ \sum_{k\,=\,1}^n\ \alpha_k\,\delta_{ik}\ =\ \alpha_i\,; \\ \\ \blacktriangleright\quad\alpha_i\ =\ \langle\,u_i,v\,\rangle\,,\qquad i=1,2,\dots,n. \end{array}\end{split}\]

The \(\,i\)-th coordinate of the vector \(\,v\ \) in basis \(\ \mathcal{B}\ \) is equal to an inner product of the \(\,i\)-th vector from the basis \(\,\mathcal{B}\ \) and the vector \(\,v\,,\quad i=1,2,\dots,n.\)

  2. Let \(\quad v\,=\,\displaystyle\sum_{i\,=\,1}^n\ \alpha_i\,u_i\,,\ \ w\,=\,\displaystyle\sum_{j\,=\,1}^n\ \beta_j\,u_j\,:\quad I_{\mathcal{B}}(v)= \left[\begin{array}{c} \alpha_1 \\ \alpha_2 \\ \dots \\ \alpha_n \end{array}\right]\,,\ \ I_{\mathcal{B}}(w)= \left[\begin{array}{c} \beta_1 \\ \beta_2 \\ \dots \\ \beta_n \end{array}\right]\,.\)

    \[\begin{split}\begin{array}{rcl} \langle\,v,w\,\rangle & = & \left\langle\ \displaystyle\sum_{i\,=\,1}^n\ \alpha_i\,u_i\,, \ \displaystyle\sum_{j\,=\,1}^n\ \beta_j\,u_j\right\rangle\ \ =\ \ \displaystyle\sum_{i,j\,=\,1}^n\ \alpha_i^*\,\beta_j\ \langle\,u_i,u_j\rangle\ \ =\ \ \\ \\ & = & \displaystyle\sum_{i,j\,=\,1}^n\ \alpha_i^*\ \beta_j\ \delta_{ij}\ \ =\ \ \displaystyle\sum_{i\,=\,1}^n\ \alpha_i^*\,\beta_i\ \ =\ \ \langle\,I_{\mathcal{B}}(v),\,I_{\mathcal{B}}(w)\,\rangle\,; \\ \\ \blacktriangleright\quad\langle\,v,w\,\rangle & = & \langle\,I_{\mathcal{B}}(v), \,I_{\mathcal{B}}(w)\,\rangle\,. \end{array}\end{split}\]

    The inner product of the vectors \(\,v\,\) and \(\,w\,\) (in a unitary or Euclidean space \(\,V\)) \(\,\) is equal to the inner product \(\,\) (in the space \(\,C^n\) or \(\,R^n,\,\) respectively) \(\,\) of the column vectors representing the coordinates of the vectors \(\,v\,\) and \(\,w\,\) in the basis \(\,\mathcal{B}.\)

  3. Let \(\,F\in\text{End}(V)\,,\ \ M_{\mathcal{B}}(F)=[\,\varphi_{ij}\,]_{n\times n}\,.\ \) By definition of the matrix of a linear operator:

    (3)\[\begin{split}\begin{array}{rcl} \langle\,u_i,Fu_j\rangle & = & \left\langle u_i\,,\,\displaystyle\sum_{k\,=\,1}^n\ \varphi_{kj}\,u_k\right\rangle\ \ = \ \ \displaystyle\sum_{k\,=\,1}^n\ \varphi_{kj}\,\langle u_i,u_k\rangle\ \ = \\ \\ & = & \displaystyle\sum_{k\,=\,1}^n\ \varphi_{kj}\ \delta_{ik}\ \ =\ \ \displaystyle\sum_{k\,=\,1}^n\ \delta_{ik}\ \varphi_{kj}\ \ = \ \ \varphi_{ij}\ ; \\ \\ \blacktriangleright\quad\varphi_{ij} & = & \langle\,u_i,Fu_j\rangle\,,\qquad i,j=1,2,\dots,n. \end{array}\end{split}\]

    An element \(\,\varphi_{ij}\ \) of the matrix of a linear operator \(\,F\,\) in the basis \(\,\mathcal{B}\ \) is equal to the inner product of the \(\,i\)-th vector of the basis \(\,\mathcal{B}\ \) and the image \(\,\) (under the transformation \(F\)) \(\,\) of the \(\ \,j\)-th vector of this basis, \(\ \ i,j=1,2,\dots,n.\)
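The formulas (2) and (3) can be verified numerically in \(\,R^2.\ \) Below is a minimal sketch, assuming NumPy; the rotated orthonormal basis, the vector \(v\), and the operator matrix \(A\) are all illustrative choices.

```python
import numpy as np

# Orthonormal basis of R^2: rotate the canonical basis by angle t (illustrative)
t = 0.3
u1 = np.array([np.cos(t), np.sin(t)])
u2 = np.array([-np.sin(t), np.cos(t)])
B = [u1, u2]

v = np.array([2.0, -1.0])

# Formula (2): the coordinates of v in B are the inner products <u_i, v>
alpha = np.array([np.dot(u, v) for u in B])
assert np.allclose(alpha[0] * u1 + alpha[1] * u2, v)

# Formula (3): the matrix of F in B has entries phi_ij = <u_i, F u_j>
A = np.array([[1.0, 2.0],
              [0.0, 3.0]])          # F in canonical coordinates (illustrative)
F = lambda x: A @ x
M = np.array([[np.dot(ui, F(uj)) for uj in B] for ui in B])

# Sanity check: M equals the change-of-basis conjugation P^T A P, P = [u1 u2]
P = np.column_stack(B)
assert np.allclose(M, P.T @ A @ P)
```

The final check works because \(P\), whose columns are the orthonormal basis vectors, is an orthogonal matrix, so conjugating \(A\) by \(P\) gives exactly the matrix of entries \(\,u_i^{\,T} A\, u_j.\)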

It is worth noticing that, while in an arbitrary basis \(\,\mathcal{B}=(v_1,v_2,\dots,v_n)\ \) the coordinates \(\,\alpha_i\ \) of a vector \(\,v\ \) and the elements \(\,\varphi_{ij}\ \) of the matrix of a linear operator \(\,F\ \) are defined implicitly by the relations

\[v\,=\,\sum_{i\,=\,1}^n\ \alpha_i\,v_i\,,\qquad Fv_j\,=\,\sum_{i\,=\,1}^n\ \varphi_{ij}\,v_i\,, \quad j=1,2,\dots,n\,,\]

in an orthonormal basis these quantities are given explicitly by formulas (2) \(\,\) and \(\,\) (3).

Moreover, equation (2) implies that every vector \(\,v\in V\ \) may be written as

(4)\[v\ \,=\ \,\sum_{i\;\,=\ \,1}^n\ \alpha_i\,u_i\ =\ \sum_{i\,=\,1}^n\ \langle u_i,v\rangle\;u_i\,.\]

Definition.

Let \(\,u,v\in V\,.\ \) If a vector \(\,u\ \) has norm \(\,1:\ \ \|u\|=1\,,\ \) then the inner product \(\,\langle u,v\rangle\ \) is called the coordinate of the vector \(\,v\ \) on the axis \(\,\) u.

Formula (4) states that the coordinates of a vector \(\,v\ \) in an orthonormal basis \(\,\mathcal{B}=(u_1,u_2,\dots,u_n)\ \) are its coordinates on the axes \(\,\text{u}_1,\,\text{u}_2,\,\dots,\,\text{u}_n\,.\)
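In the unitary case the first argument of the inner product is conjugated (consistent with the \(\,\alpha_i^*\,\beta_i\,\) terms above); NumPy's `np.vdot` follows the same convention. A minimal sketch of formula (4) in \(\,C^2,\ \) with an illustrative orthonormal basis:

```python
import numpy as np

# An orthonormal basis of C^2 (illustrative choice)
u1 = np.array([1.0, 1.0j]) / np.sqrt(2)
u2 = np.array([1.0, -1.0j]) / np.sqrt(2)

# np.vdot conjugates its first argument, matching <u, v> conjugate-linear in u
assert np.isclose(np.vdot(u1, u1), 1) and np.isclose(np.vdot(u1, u2), 0)

v = np.array([2.0 + 1.0j, -3.0j])

# Formula (4): v = sum_i <u_i, v> u_i  (coordinates of v on the axes u_1, u_2)
coords = [np.vdot(u, v) for u in (u1, u2)]
v_rebuilt = coords[0] * u1 + coords[1] * u2
assert np.allclose(v_rebuilt, v)
```

Replacing `np.vdot` with `np.dot` here would silently drop the conjugation and reconstruct the wrong vector, which is why the convention matters in a unitary space.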

Example.

  1. An orthonormal basis of the real 3-dimensional space of geometric vectors consists of three mutually perpendicular unit vectors \(\,\mathcal{E}=(\vec{e}_1,\vec{e}_2,\vec{e}_3).\ \) The inner product of vectors \(\,\vec{a}=\alpha_1\,\vec{e}_1+\alpha_2\,\vec{e}_2+\alpha_3\,\vec{e}_3\,,\ \vec{b}=\beta_1\,\vec{e}_1+\beta_2\,\vec{e}_2+\beta_3\,\vec{e}_3\ \) equals

    \[\vec{a}\cdot\vec{b}\ =\ (\alpha_1\,\vec{e}_1+\alpha_2\,\vec{e}_2+\alpha_3\,\vec{e}_3)\cdot (\beta_1\,\vec{e}_1+\beta_2\,\vec{e}_2+\beta_3\,\vec{e}_3)\ =\ \alpha_1\,\beta_1\,+\,\alpha_2\,\beta_2\,+\,\alpha_3\,\beta_3\,.\]
  2. An example of an orthonormal basis of the unitary space \(\,C^n\ \) (and also of the Euclidean space \(\,R^n\)) is the canonical basis \(\,\mathcal{E}=(e_1,e_2,\dots,e_n),\ \) where the \(\,i\)-th vector equals

    \[\begin{split}e_i\ =\ \left[\begin{array}{c} 0 \\ \dots \\ 1 \\ \dots \\ 0 \end{array}\right] \begin{array}{c} \; \\ \; \\ \leftarrow\ i \\ \; \\ \; \end{array}\,, \qquad i=1,2,\dots,n\,.\end{split}\]