Inner product spaces
Inner product
An inner product is a generalization of the notion of the dot product.
An inner product over a \(K\)-vector space \(V\) is any map
\[\langle \cdot, \cdot \rangle : V \times V \to K\]
satisfying the following requirements:
Positive definiteness
(1)\[ \langle v, v \rangle \geq 0 \text{ and } \langle v, v \rangle = 0 \iff v = 0\]
Conjugate symmetry
(2)\[ \langle v_1, v_2 \rangle = \overline{\langle v_2, v_1 \rangle} \quad \forall v_1, v_2 \in V\]
Linearity in the first argument
(3)\[\begin{split} \begin{aligned} &\langle \alpha v, w \rangle = \alpha \langle v, w \rangle \quad \forall v, w \in V; \forall \alpha \in K\\ &\langle v_1 + v_2, w \rangle = \langle v_1, w \rangle + \langle v_2, w \rangle \quad \forall v_1, v_2, w \in V \end{aligned}\end{split}\]
Remarks
- Linearity in the first argument extends to any arbitrary linear combination:
\[\left\langle \sum_{i} \alpha_i v_i, w \right\rangle = \sum_{i} \alpha_i \langle v_i, w \rangle\]
- Similarly, combining (2) and (3), we have conjugate linearity in the second argument for any arbitrary linear combination:
\[\left\langle v, \sum_{i} \alpha_i w_i \right\rangle = \sum_{i} \overline{\alpha_i} \langle v, w_i \rangle\]
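As a sanity check, the axioms can be exercised numerically; a minimal sketch, assuming the standard inner product \(\langle u, v \rangle = \sum_i u_i \overline{v_i}\) on complex \(n\)-tuples (the sample vectors below are arbitrary illustrative data):

```python
def inner(u, v):
    # Standard inner product on complex n-tuples,
    # linear in the first argument: <u, v> = sum_i u_i * conj(v_i)
    return sum(a * b.conjugate() for a, b in zip(u, v))

# Arbitrary sample vectors and scalar (illustrative data only)
u = [1 + 2j, 3 - 1j]
v = [2 - 1j, 0 + 4j]
w = [1j, 1 + 1j]
alpha = 2 - 3j

# (1) Positive definiteness: <u, u> is real and positive for u != 0
assert abs(inner(u, u).imag) < 1e-12 and inner(u, u).real > 0

# (2) Conjugate symmetry: <u, v> = conj(<v, u>)
assert abs(inner(u, v) - inner(v, u).conjugate()) < 1e-12

# (3) Linearity in the first argument: <alpha u + w, v> = alpha <u, v> + <w, v>
lhs = inner([alpha * a + b for a, b in zip(u, w)], v)
rhs = alpha * inner(u, v) + inner(w, v)
assert abs(lhs - rhs) < 1e-12
```

Checks on finitely many samples do not prove the axioms, but they catch sign and conjugation mistakes quickly.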
Orthogonality
A set of non-zero vectors \(\{v_1, \dots, v_p\}\) is called orthogonal if
\[\langle v_i, v_j \rangle = 0 \quad \forall i \neq j.\]
A set of non-zero vectors \(\{v_1, \dots, v_p\}\) is called orthonormal if it is orthogonal and every vector in it has unit norm:
\[\begin{split}\langle v_i, v_j \rangle = \begin{cases} 1 & \text{if } i = j\\ 0 & \text{if } i \neq j \end{cases}\end{split}\]
i.e. \(\langle v_i, v_j \rangle = \delta(i, j)\).
Remarks:
- A set of orthogonal vectors is linearly independent. To prove this, take an inner product of \(\sum_i \alpha_i v_i = 0\) with any \(v_j\); orthogonality leaves \(\alpha_j \langle v_j, v_j \rangle = 0\), and since \(v_j \neq 0\), positive definiteness forces \(\alpha_j = 0\).
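The condition \(\langle v_i, v_j \rangle = \delta(i, j)\) is easy to check mechanically; a small sketch with an assumed candidate set in \(\RR^3\):

```python
import math

def inner(u, v):
    # Standard dot product on R^n
    return sum(a * b for a, b in zip(u, v))

# A candidate set in R^3 (assumed example vectors)
s = 1 / math.sqrt(2)
vecs = [[s, s, 0.0], [s, -s, 0.0], [0.0, 0.0, 1.0]]

# Check <v_i, v_j> = delta(i, j) over all pairs
for i, vi in enumerate(vecs):
    for j, vj in enumerate(vecs):
        expected = 1.0 if i == j else 0.0
        assert abs(inner(vi, vj) - expected) < 1e-12
```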
Norm
Norms are a generalization of the notion of length.
A norm over a \(K\)-vector space \(V\) is any map
\[\| \cdot \| : V \to \RR\]
satisfying the following requirements:
Positive definiteness
(5)\[ \| v \| \geq 0 \quad \forall v \in V \text{ and } \| v \| = 0 \iff v = 0\]
Scalar multiplication
\[\| \alpha v \| = | \alpha | \| v \| \quad \forall \alpha \in K; \forall v \in V\]
Triangle inequality
\[\| v_1 + v_2 \| \leq \| v_1 \| + \| v_2 \| \quad \forall v_1, v_2 \in V\]
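The three requirements can be spot-checked numerically for the Euclidean norm \(\| v \| = \sqrt{\langle v, v \rangle}\) on \(\RR^n\); the sketch below uses random sample vectors and is illustrative rather than a proof:

```python
import math
import random

def norm(v):
    # Euclidean norm on R^n, induced by the dot product
    return math.sqrt(sum(x * x for x in v))

random.seed(0)  # reproducible samples
for _ in range(100):
    v = [random.uniform(-1, 1) for _ in range(4)]
    w = [random.uniform(-1, 1) for _ in range(4)]
    alpha = random.uniform(-5, 5)
    # Positive definiteness (the v = 0 case holds trivially)
    assert norm(v) >= 0
    # Scalar multiplication: ||alpha v|| = |alpha| ||v||
    assert abs(norm([alpha * x for x in v]) - abs(alpha) * norm(v)) < 1e-9
    # Triangle inequality: ||v + w|| <= ||v|| + ||w||
    assert norm([x + y for x, y in zip(v, w)]) <= norm(v) + norm(w) + 1e-12
```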
Projection
A projection is a linear transformation \(P\) from a vector space \(V\) to itself such that \(P^2 = P\), i.e. if \(P v = \beta\), then \(P \beta = \beta\). Thus, whenever \(P\) is applied twice to any vector, it gives the same result as if it were applied once.
Thus \(P\) is an idempotent operator.
For example, consider the operator \(P : \RR^3 \to \RR^3\) defined as
\[\begin{split}P = \begin{bmatrix} 1 & 0 & 0\\ 0 & 1 & 0\\ 0 & 0 & 0 \end{bmatrix}.\end{split}\]
Then application of \(P\) on an arbitrary vector is given by
\[\begin{split}P \begin{bmatrix} x\\ y\\ z \end{bmatrix} = \begin{bmatrix} x\\ y\\ 0 \end{bmatrix}.\end{split}\]
A second application doesn't change it:
\[\begin{split}P \begin{bmatrix} x\\ y\\ 0 \end{bmatrix} = \begin{bmatrix} x\\ y\\ 0 \end{bmatrix}.\end{split}\]
Thus \(P\) is a projection operator.
Usually we can directly verify the property by computing \(P^2\) as
\[\begin{split}P^2 = \begin{bmatrix} 1 & 0 & 0\\ 0 & 1 & 0\\ 0 & 0 & 0 \end{bmatrix} \begin{bmatrix} 1 & 0 & 0\\ 0 & 1 & 0\\ 0 & 0 & 0 \end{bmatrix} = \begin{bmatrix} 1 & 0 & 0\\ 0 & 1 & 0\\ 0 & 0 & 0 \end{bmatrix} = P.\end{split}\]
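The idempotence check \(P^2 = P\) can be reproduced in a few lines of code; the sketch below assumes the coordinate projection onto the \(xy\)-plane as a concrete example matrix:

```python
def matmul(A, B):
    # Plain matrix multiplication for lists of rows
    n, m, p = len(A), len(B), len(B[0])
    return [[sum(A[i][k] * B[k][j] for k in range(m)) for j in range(p)]
            for i in range(n)]

# Coordinate projection onto the xy-plane (assumed example matrix)
P = [[1, 0, 0],
     [0, 1, 0],
     [0, 0, 0]]

# Idempotence: P^2 = P
assert matmul(P, P) == P

# Applying P twice to a vector gives the same result as applying it once
x = [[3], [5], [7]]
Px = matmul(P, x)
assert matmul(P, Px) == Px
```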
Orthogonal projection
Consider a projection operator \(P : V \to V\) where \(V\) is an inner product space.
The range of \(P\) is given by
\[\Range(P) = \{ v \in V : v = P x \text{ for some } x \in V \}.\]
The null space of \(P\) is given by
\[\NullSpace(P) = \{ v \in V : P v = 0 \}.\]
A projection operator \(P : V \to V\) over an inner product space \(V\) is called an orthogonal projection operator if its range \(\Range(P)\) and its null space \(\NullSpace(P)\) as defined above are orthogonal to each other, i.e.
\[\langle r, n \rangle = 0 \quad \forall r \in \Range(P); \forall n \in \NullSpace(P).\]
Consider a unit norm vector \(u \in \RR^N\). Thus \(u^T u = 1\).
Consider the operator
\[P_u = u u^T.\]
Now
\[P_u^2 = (u u^T)(u u^T) = u (u^T u) u^T = u u^T = P_u.\]
Thus \(P_u\) is a projection operator.
Now
\[P_u^T = (u u^T)^T = u u^T = P_u.\]
Thus \(P_u\) is self-adjoint. Hence \(P_u\) is an orthogonal projection operator.
Now
\[P_u u = (u u^T) u = u (u^T u) = u.\]
Thus \(P_u\) leaves \(u\) intact, i.e. the projection of \(u\) onto \(u\) is \(u\) itself.
Let \(v \in u^{\perp}\), i.e. \(\langle u, v \rangle = 0\).
Then
\[P_u v = (u u^T) v = u (u^T v) = u \langle u, v \rangle = 0.\]
Thus \(P_u\) annihilates all vectors orthogonal to \(u\).
Now any vector \(x \in \RR^N\) can be broken down into two components
\[x = x_{\parallel} + x_{\perp}\]
such that \(\langle u , x_{\perp} \rangle = 0\) and \(x_{\parallel}\) is collinear with \(u\).
Then
\[P_u x = P_u x_{\parallel} + P_u x_{\perp} = x_{\parallel}.\]
Thus \(P_u\) retains the projection of \(x\) on \(u\) given by \(x_{\parallel}\).
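These properties of \(P_u = u u^T\) can be confirmed numerically; the sketch below assumes a particular unit vector \(u \in \RR^3\) and sample vectors:

```python
import math

# A unit-norm vector in R^3 (assumed sample data)
u = [1 / math.sqrt(3)] * 3
assert abs(sum(x * x for x in u) - 1) < 1e-12  # u^T u = 1

# P_u = u u^T as an explicit matrix
P = [[ui * uj for uj in u] for ui in u]

def apply(M, v):
    # Matrix-vector product
    return [sum(M[i][j] * v[j] for j in range(len(v))) for i in range(len(M))]

# P_u leaves u intact: P_u u = u
Pu = apply(P, u)
assert all(abs(a - b) < 1e-12 for a, b in zip(Pu, u))

# P_u annihilates vectors orthogonal to u, e.g. v = (1, -1, 0)
v = [1.0, -1.0, 0.0]
assert abs(sum(a * b for a, b in zip(u, v))) < 1e-12  # <u, v> = 0
assert all(abs(c) < 1e-12 for c in apply(P, v))

# For a general x, P_u x is the component of x along u
x = [2.0, 0.5, -1.0]
coeff = sum(a * b for a, b in zip(u, x))   # <u, x>
x_par = [coeff * ui for ui in u]           # x_parallel, collinear with u
assert all(abs(a - b) < 1e-12 for a, b in zip(apply(P, x), x_par))
```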
Let \(A \in \RR^{M \times N}\) with \(N \leq M\) be a matrix given by
\[A = \begin{bmatrix} a_1 & a_2 & \dots & a_N \end{bmatrix}\]
where \(a_i \in \RR^M\) are its columns, which are linearly independent.
The column space of \(A\) is given by
\[C(A) = \{ A x : x \in \RR^N \} = \text{span} \{ a_1, a_2, \dots, a_N \}.\]
It can be shown that \(A^T A\) is invertible, since the columns of \(A\) are linearly independent.
Consider the operator
\[P_A = A (A^T A)^{-1} A^T.\]
Now
\[P_A^2 = A (A^T A)^{-1} A^T A (A^T A)^{-1} A^T = A (A^T A)^{-1} A^T = P_A.\]
Thus \(P_A\) is a projection operator.
Also
\[P_A^T = \left( A (A^T A)^{-1} A^T \right)^T = A \left( (A^T A)^T \right)^{-1} A^T = A (A^T A)^{-1} A^T = P_A.\]
Thus \(P_A\) is self-adjoint.
Hence \(P_A\) is an orthogonal projection operator onto the column space of \(A\).
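A numerical sketch of the column-space projector \(P_A = A (A^T A)^{-1} A^T\), assuming a particular \(3 \times 2\) matrix \(A\) with linearly independent columns:

```python
def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

def transpose(X):
    return [list(col) for col in zip(*X)]

def inv2(M):
    # Inverse of a 2x2 matrix via the adjugate formula
    (a, b), (c, d) = M
    det = a * d - b * c
    return [[d / det, -b / det], [-c / det, a / det]]

# Assumed example matrix with linearly independent columns
A = [[1.0, 0.0],
     [1.0, 1.0],
     [0.0, 1.0]]

At = transpose(A)
G = matmul(At, A)                     # A^T A, invertible here
P = matmul(matmul(A, inv2(G)), At)    # P_A = A (A^T A)^{-1} A^T

def close(X, Y):
    return all(abs(x - y) < 1e-9 for rx, ry in zip(X, Y)
               for x, y in zip(rx, ry))

assert close(matmul(P, P), P)     # idempotent: P_A^2 = P_A
assert close(P, transpose(P))     # self-adjoint: P_A^T = P_A
assert close(matmul(P, A), A)     # P_A fixes the columns of A
```

The last assertion reflects that each column of \(A\) already lies in \(C(A)\), so the projector leaves it unchanged.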
Parallelogram identity
Expanding the norms in terms of inner products,
\[\begin{split}\begin{aligned} \| v_1 + v_2 \|^2 &= \| v_1 \|^2 + \| v_2 \|^2 + \langle v_1, v_2 \rangle + \langle v_2, v_1 \rangle\\ \| v_1 - v_2 \|^2 &= \| v_1 \|^2 + \| v_2 \|^2 - \langle v_1, v_2 \rangle - \langle v_2, v_1 \rangle. \end{aligned}\end{split}\]
Thus
\[\| v_1 + v_2 \|^2 + \| v_1 - v_2 \|^2 = 2 \| v_1 \|^2 + 2 \| v_2 \|^2.\]
When the inner product is a real number, the following identity is quite useful. Subtracting the two expansions above gives
\[\langle v_1, v_2 \rangle = \frac{1}{4} \left( \| v_1 + v_2 \|^2 - \| v_1 - v_2 \|^2 \right)\]
since for real inner products
\[\langle v_1, v_2 \rangle = \langle v_2, v_1 \rangle.\]
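Both identities are straightforward to confirm numerically for the real dot product; random sample vectors are assumed:

```python
import random

def inner(u, v):
    # Real dot product on R^n
    return sum(a * b for a, b in zip(u, v))

def norm_sq(v):
    return inner(v, v)

random.seed(1)
v1 = [random.uniform(-1, 1) for _ in range(4)]
v2 = [random.uniform(-1, 1) for _ in range(4)]
plus = [a + b for a, b in zip(v1, v2)]
minus = [a - b for a, b in zip(v1, v2)]

# Parallelogram identity
assert abs(norm_sq(plus) + norm_sq(minus)
           - 2 * norm_sq(v1) - 2 * norm_sq(v2)) < 1e-12

# Real polarization identity
assert abs(inner(v1, v2) - 0.25 * (norm_sq(plus) - norm_sq(minus))) < 1e-12
```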
Polarization identity
When the inner product is a complex number, the polarization identity is quite useful. Expanding \(\| v_1 + i^k v_2 \|^2\) for \(k = 0, 1, 2, 3\) and combining the terms, we get
\[\langle v_1, v_2 \rangle = \frac{1}{4} \left( \| v_1 + v_2 \|^2 - \| v_1 - v_2 \|^2 + i \| v_1 + i v_2 \|^2 - i \| v_1 - i v_2 \|^2 \right).\]
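A quick numerical check of the complex polarization identity with Python's built-in complex numbers; the sample vectors are assumed, and the inner product is taken linear in the first argument, matching (3):

```python
def inner(u, v):
    # Complex inner product, linear in the first argument
    return sum(a * b.conjugate() for a, b in zip(u, v))

def norm_sq(v):
    # ||v||^2 = <v, v>, which is real
    return inner(v, v).real

# Assumed sample vectors in complex 3-space
v1 = [1 + 2j, -1j, 0.5 + 0j]
v2 = [2 - 1j, 1 + 1j, -3 + 0j]

def comb(a, b, s):
    # Componentwise a + s*b
    return [x + s * y for x, y in zip(a, b)]

rhs = 0.25 * (norm_sq(comb(v1, v2, 1)) - norm_sq(comb(v1, v2, -1))
              + 1j * norm_sq(comb(v1, v2, 1j))
              - 1j * norm_sq(comb(v1, v2, -1j)))
assert abs(inner(v1, v2) - rhs) < 1e-9
```

Note that the signs of the imaginary terms depend on which argument the inner product is linear in; flipping the convention swaps the \(+i\) and \(-i\) terms.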