$$ \newcommand{\qed}{\tag*{$\square$}} \newcommand{\span}{\operatorname{span}} \newcommand{\dim}{\operatorname{dim}} \newcommand{\rank}{\operatorname{rank}} \newcommand{\norm}[1]{\|#1\|} \newcommand{\grad}{\nabla} \newcommand{\prox}[1]{\operatorname{prox}_{#1}} \newcommand{\inner}[2]{\langle{#1}, {#2}\rangle} \newcommand{\mat}[1]{\mathcal{M}[#1]} \newcommand{\null}[1]{\operatorname{null} \left(#1\right)} \newcommand{\range}[1]{\operatorname{range} \left(#1\right)} \newcommand{\rowvec}[1]{\begin{bmatrix} #1 \end{bmatrix}^T} \newcommand{\Reals}{\mathbf{R}} \newcommand{\RR}{\mathbf{R}} \newcommand{\Complex}{\mathbf{C}} \newcommand{\Field}{\mathbf{F}} \newcommand{\Pb}{\operatorname{Pr}} \newcommand{\E}[1]{\operatorname{E}[#1]} \newcommand{\Var}[1]{\operatorname{Var}[#1]} \newcommand{\argmin}[2]{\underset{#1}{\operatorname{argmin}} {#2}} \newcommand{\optmin}[3]{ \begin{align*} & \underset{#1}{\text{minimize}} & & #2 \\ & \text{subject to} & & #3 \end{align*} } \newcommand{\optmax}[3]{ \begin{align*} & \underset{#1}{\text{maximize}} & & #2 \\ & \text{subject to} & & #3 \end{align*} } \newcommand{\optfind}[2]{ \begin{align*} & {\text{find}} & & #1 \\ & \text{subject to} & & #2 \end{align*} } $$
Definition 9.1. An inner product on a real vector space $V$ is a function $\inner{\cdot}{\cdot} : V \times V \to \Reals$ that satisfies the following properties:

1. symmetry: $\inner{u}{v} = \inner{v}{u}$ for all $u, v \in V$;
2. linearity: $\inner{au + bw}{v} = a\inner{u}{v} + b\inner{w}{v}$ for all $u, w, v \in V$ and $a, b \in \Reals$;
3. positive definiteness: $\inner{v}{v} \geq 0$ for all $v \in V$, with $\inner{v}{v} = 0$ if and only if $v = 0$.
Examples of inner products include the dot product on $\Reals^n$, $\inner{x}{y} = x^T y = \sum_{i=1}^n x_i y_i$, and, on the space of continuous real-valued functions on $[0, 1]$, $\inner{f}{g} = \int_0^1 f(t) g(t) \, dt$.
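Both examples can be checked numerically. The sketch below is pure Python; the helper names `dot` and `l2_inner` are my own, and the integral inner product is only approximated by a midpoint rule:

```python
import math

def dot(x, y):
    """Dot product on R^n: <x, y> = sum_i x_i * y_i."""
    return sum(xi * yi for xi, yi in zip(x, y))

def l2_inner(f, g, a=0.0, b=1.0, n=10_000):
    """Approximate <f, g> = integral_a^b f(t) g(t) dt with the midpoint rule."""
    h = (b - a) / n
    return h * sum(f(a + (k + 0.5) * h) * g(a + (k + 0.5) * h) for k in range(n))

print(dot([1, 2, 3], [4, 5, 6]))                   # 32
print(l2_inner(math.sin, math.cos, 0.0, math.pi))  # approximately 0
```

On $C[0, \pi]$, for instance, $\sin$ and $\cos$ are orthogonal, which the approximation reflects.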
Definition 9.2. The norm on a vector space $V$ induced by an inner product $\inner{\cdot}{\cdot}$ is the function $\norm{\cdot} : V \to \Reals$ given by

$$\norm{v} = \sqrt{\inner{v}{v}},$$

for $v \in V$.
For example, on $\Reals^n$, the norm induced by the dot product is the Euclidean norm, given by $\norm{x} = \sqrt{x^T x} = \left( \sum_{i=1}^n x_i^2 \right)^{1/2}$.
Definition 9.3. Two vectors $u, v \in V$ are orthogonal if $\inner{u}{v} = 0$.
In $\Reals^2$, two vectors are orthogonal if and only if they are perpendicular to each other. In fact, in $\Reals^2$, we can relate the inner product between two nonzero vectors $u$ and $v$ to the angle $\theta$ between them:

$$\inner{u}{v} = \norm{u} \norm{v} \cos \theta.$$
In higher dimensions, we define the angle between two nonzero vectors $u$ and $v$ as

$$\theta = \arccos \left( \frac{\inner{u}{v}}{\norm{u} \norm{v}} \right),$$

which is well-defined because the Cauchy-Schwarz inequality (Theorem 9.5 below) guarantees that the argument of $\arccos$ lies in $[-1, 1]$.
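The induced norm and the angle formula translate directly into a short Python sketch (helper names are illustrative):

```python
import math

def dot(x, y):
    """Dot product on R^n."""
    return sum(xi * yi for xi, yi in zip(x, y))

def norm(x):
    """Norm induced by the dot product: ||x|| = sqrt(<x, x>)."""
    return math.sqrt(dot(x, x))

def angle(x, y):
    """Angle between nonzero vectors x and y, in radians."""
    return math.acos(dot(x, y) / (norm(x) * norm(y)))

print(norm([3.0, 4.0]))               # 5.0
print(angle([1.0, 0.0], [0.0, 1.0]))  # pi/2: orthogonal vectors meet at a right angle
```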
A set of vectors is called orthonormal if the vectors are mutually orthogonal and each has norm equal to one. It is simple to determine the coordinates of $v \in V$ with respect to an orthonormal basis $e_1, \ldots, e_n$ of $V$. In particular,

$$v = \inner{v}{e_1} e_1 + \inner{v}{e_2} e_2 + \cdots + \inner{v}{e_n} e_n.$$
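As a small numerical sketch (the basis and variable names are chosen for illustration): with the standard basis of $\RR^2$ rotated by 45 degrees, the coordinates $\inner{v}{e_i}$ reconstruct $v$ exactly:

```python
import math

def dot(x, y):
    """Dot product on R^n."""
    return sum(xi * yi for xi, yi in zip(x, y))

# An orthonormal basis of R^2: the standard basis rotated by 45 degrees.
s = 1.0 / math.sqrt(2.0)
e1, e2 = [s, s], [-s, s]

v = [3.0, 1.0]
c1, c2 = dot(v, e1), dot(v, e2)  # coordinates <v, e_1> and <v, e_2>
reconstructed = [c1 * e1[i] + c2 * e2[i] for i in range(2)]
print(reconstructed)  # [3.0, 1.0] up to floating-point rounding
```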
In two dimensions, the Pythagorean theorem states that for any right triangle, the sum of the squares of the two legs equals the square of the hypotenuse. We can generalize this result to higher dimensions using linear algebra.
Theorem 9.4. For orthogonal vectors $u, v \in V$,

$$\norm{u + v}^2 = \norm{u}^2 + \norm{v}^2.$$
The proof follows by expanding the left-hand side and simplifying using the definition of orthogonality:

$$\norm{u + v}^2 = \inner{u + v}{u + v} = \norm{u}^2 + 2 \inner{u}{v} + \norm{v}^2 = \norm{u}^2 + \norm{v}^2.$$
The Pythagorean theorem pairs naturally with an orthogonal decomposition: given a nonzero vector $v$, we can write any vector $u$ as $u = cv + w$, with $\inner{v}{w} = 0$; simply take $c = \inner{u}{v} / \norm{v}^2$ and $w = u - \frac{\inner{u}{v}}{\norm{v}^2} v$.
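A sketch of this decomposition in Python (the function name `decompose` is my own):

```python
def dot(x, y):
    """Dot product on R^n."""
    return sum(xi * yi for xi, yi in zip(x, y))

def decompose(u, v):
    """Split u = c*v + w with <v, w> = 0, taking c = <u, v> / ||v||^2."""
    c = dot(u, v) / dot(v, v)
    w = [ui - c * vi for ui, vi in zip(u, v)]
    return c, w

u, v = [2.0, 3.0], [1.0, 1.0]
c, w = decompose(u, v)
print(c, w)       # 2.5 [-0.5, 0.5]
print(dot(v, w))  # 0.0: the remainder w is orthogonal to v
```

The Pythagorean theorem then gives $\norm{u}^2 = c^2 \norm{v}^2 + \norm{w}^2$, which the numbers above satisfy.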
The inequality stated in the following theorem is called the Cauchy-Schwarz inequality. It is one of the most important inequalities in mathematics.
Theorem 9.5. Let $u, v \in V$. Then $|\inner{u}{v}| \leq \norm{u} \norm{v}$, with equality if and only if one of $u$, $v$ is a scalar multiple of the other.
Proof. If $v = 0$, both sides are zero, so assume $v \neq 0$. Write

$$u = \frac{\inner{u}{v}}{\norm{v}^2} v + w, \quad \text{where } \inner{w}{v} = 0.$$

By the Pythagorean theorem,

$$\norm{u}^2 = \left\| \frac{\inner{u}{v}}{\norm{v}^2} v \right\|^2 + \norm{w}^2 = \frac{|\inner{u}{v}|^2}{\norm{v}^2} + \norm{w}^2 \geq \frac{|\inner{u}{v}|^2}{\norm{v}^2}.$$

Multiplying through by $\norm{v}^2$ and taking the square root of both sides furnishes the result. Equality holds exactly when $w = 0$, that is, when $u$ is a scalar multiple of $v$.
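The inequality and its equality condition can be spot-checked numerically; a sketch under the dot product on $\RR^5$, using random trials (names illustrative):

```python
import math
import random

def dot(x, y):
    """Dot product on R^n."""
    return sum(xi * yi for xi, yi in zip(x, y))

def norm(x):
    """Euclidean norm induced by the dot product."""
    return math.sqrt(dot(x, x))

random.seed(0)
for _ in range(1000):
    u = [random.uniform(-1.0, 1.0) for _ in range(5)]
    v = [random.uniform(-1.0, 1.0) for _ in range(5)]
    assert abs(dot(u, v)) <= norm(u) * norm(v) + 1e-12  # Cauchy-Schwarz

# Equality when u is a scalar multiple of v.
v = [1.0, 2.0, 3.0]
u = [-2.0 * vi for vi in v]
print(abs(dot(u, v)), norm(u) * norm(v))  # both sides equal 28, up to rounding
```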
Theorem 9.6. Let $u, v \in V$. Then $\norm{u + v} \leq \norm{u} + \norm{v}$.
One way to prove the above is to square and then expand the left-hand side, and then to bound the inner-product term with Cauchy-Schwarz:

$$\norm{u + v}^2 = \norm{u}^2 + 2 \inner{u}{v} + \norm{v}^2 \leq \norm{u}^2 + 2 \norm{u} \norm{v} + \norm{v}^2 = (\norm{u} + \norm{v})^2.$$
Note that this implies that $\norm{u} - \norm{v} \leq \norm{u - v}$. The triangle inequality also implies $\left| \norm{u} - \norm{v} \right| \leq \norm{u - v}$, which is known as the reverse triangle inequality.
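Both inequalities are easy to spot-check numerically; a sketch with random vectors in $\RR^4$:

```python
import math
import random

def norm(x):
    """Euclidean norm on R^n."""
    return math.sqrt(sum(xi * xi for xi in x))

random.seed(1)
for _ in range(1000):
    u = [random.uniform(-1.0, 1.0) for _ in range(4)]
    v = [random.uniform(-1.0, 1.0) for _ in range(4)]
    s = [ui + vi for ui, vi in zip(u, v)]
    d = [ui - vi for ui, vi in zip(u, v)]
    assert norm(s) <= norm(u) + norm(v) + 1e-12       # triangle inequality
    assert abs(norm(u) - norm(v)) <= norm(d) + 1e-12  # reverse triangle inequality
print("all checks passed")
```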
Linear Algebra Done Right, by Sheldon Axler.