## The Inner Product

The *inner product* (or "dot product", or "scalar product") is an operation on two vectors which produces a scalar. Defining an inner product for a Banach space specializes it to a *Hilbert space* (or "inner product space"). There are many examples of Hilbert spaces, but we will only need $\{\mathbb{C}^N, \mathbb{C}\}$ for this book (complex length $N$ vectors, and complex scalars).

The *inner product* between (complex) $N$-vectors $u$ and $v$ is defined by^{5.9}

$$
\langle u, v \rangle \triangleq \sum_{n=0}^{N-1} u(n)\overline{v(n)}.
$$

The complex conjugation of the second vector is done in order that a *norm* will be *induced* by the inner product:^{5.10}

$$
\langle u, u \rangle = \sum_{n=0}^{N-1} u(n)\overline{u(n)} = \sum_{n=0}^{N-1} \left|u(n)\right|^2 \triangleq \|u\|^2.
$$

As a result, the inner product is *conjugate symmetric*:

$$
\langle v, u \rangle = \overline{\langle u, v \rangle}.
$$
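As a quick numerical sketch (plain Python with built-in complex numbers; the helper names `inner` and `norm` are ours, not from the text), the definition and its conjugate symmetry can be checked directly:

```python
import math

def inner(u, v):
    """Inner product <u, v> = sum over n of u[n] * conj(v[n])."""
    return sum(a * b.conjugate() for a, b in zip(u, v))

def norm(u):
    """Norm induced by the inner product: ||u|| = sqrt(<u, u>)."""
    return math.sqrt(inner(u, u).real)  # <u, u> is always real and nonnegative

u = [1 + 2j, 3 - 1j]
v = [2 - 1j, 0 + 1j]

# Conjugate symmetry: <v, u> equals the complex conjugate of <u, v>.
assert inner(v, u) == inner(u, v).conjugate()

# <u, u> = sum of |u(n)|^2 = ||u||^2.
assert abs(inner(u, u).real - norm(u) ** 2) < 1e-12
```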

### Linearity of the Inner Product

Any function $f(v)$ of a vector $v \in \mathbb{C}^N$ (which we may call an *operator* on $\mathbb{C}^N$) is said to be *linear* if for all $v_1$ and $v_2$ in $\mathbb{C}^N$, and for all scalars $\alpha_1$ and $\alpha_2$ in $\mathbb{C}$,

$$
f(\alpha_1 v_1 + \alpha_2 v_2) = \alpha_1 f(v_1) + \alpha_2 f(v_2).
$$

Linearity combines two properties:

*additivity*: $f(v_1 + v_2) = f(v_1) + f(v_2)$
*homogeneity*: $f(\alpha v) = \alpha f(v)$

A function of two or more vectors, *e.g.*, $f(v_1, v_2)$, can be linear or not with respect to each of its arguments. The inner product is *linear in its first argument*, *i.e.*, for all $u, v, w \in \mathbb{C}^N$, and for all $\alpha, \beta \in \mathbb{C}$,

$$
\langle \alpha u + \beta v, w \rangle = \alpha \langle u, w \rangle + \beta \langle v, w \rangle.
$$

It is *additive* in its second argument, *i.e.*,

$$
\langle u, v + w \rangle = \langle u, v \rangle + \langle u, w \rangle,
$$

but only *conjugate homogeneous* (or *antilinear*) in its second argument, since

$$
\langle u, \alpha v \rangle = \overline{\alpha} \langle u, v \rangle \neq \alpha \langle u, v \rangle.
$$

The inner product *is* strictly linear in its second argument with respect to *real* scalars $a$ and $b$:

$$
\langle u, a v + b w \rangle = a \langle u, v \rangle + b \langle u, w \rangle, \qquad a, b \in \mathbb{R}.
$$

Since the inner product is linear in both of its arguments for real scalars, it may be called a *bilinear operator* in that context.
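These linearity rules are easy to spot-check numerically (a plain-Python sketch; the `inner` helper is our own and is redefined here so the snippet stands alone):

```python
def inner(u, v):
    """Inner product <u, v> = sum over n of u[n] * conj(v[n])."""
    return sum(a * b.conjugate() for a, b in zip(u, v))

u = [1 + 1j, 2 + 0j]
v = [0 + 0j, 1 - 1j]
w = [0 + 3j, 1 + 0j]
alpha, beta = 2 - 1j, 1 + 3j

# Linear in the first argument: <alpha*u + beta*v, w> = alpha<u,w> + beta<v,w>.
lhs = inner([alpha * a + beta * b for a, b in zip(u, v)], w)
assert abs(lhs - (alpha * inner(u, w) + beta * inner(v, w))) < 1e-12

# Only conjugate-homogeneous (antilinear) in the second argument:
assert abs(inner(u, [alpha * c for c in w])
           - alpha.conjugate() * inner(u, w)) < 1e-12
```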

### Norm Induced by the Inner Product

We may define a *norm* on $u \in \mathbb{C}^N$ using the inner product:

$$
\|u\| \triangleq \sqrt{\langle u, u \rangle}.
$$

### Cauchy-Schwarz Inequality

The *Cauchy-Schwarz Inequality* (or "Schwarz Inequality") states that for all $u \in \mathbb{C}^N$ and $v \in \mathbb{C}^N$, we have

$$
\left|\langle u, v \rangle\right| \leq \|u\| \cdot \|v\|.
$$
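A numeric sanity check (plain-Python sketch; the `inner` and `norm` helpers are our own, redefined so the snippet stands alone), including the equality case that occurs when one vector is a scalar multiple of the other:

```python
import math

def inner(u, v):
    """Inner product <u, v> = sum over n of u[n] * conj(v[n])."""
    return sum(a * b.conjugate() for a, b in zip(u, v))

def norm(u):
    """Norm induced by the inner product."""
    return math.sqrt(inner(u, u).real)

u = [1 + 2j, 3 - 1j, 0 + 1j]
v = [2 + 0j, 1 + 1j, 4 - 2j]

# |<u, v>| never exceeds ||u|| * ||v||.
assert abs(inner(u, v)) <= norm(u) * norm(v)

# Equality holds when v is a scalar multiple of u.
c = 3 - 2j
cu = [c * a for a in u]
assert abs(abs(inner(u, cu)) - norm(u) * norm(cu)) < 1e-9
```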

### Triangle Inequality

The *triangle inequality* states that the length of any side of a triangle is less than or equal to the sum of the lengths of the other two sides, with equality occurring only when the triangle degenerates to a line. In $\mathbb{C}^N$, this becomes

$$
\|u + v\| \leq \|u\| + \|v\|.
$$

### Triangle Difference Inequality

A useful variation on the triangle inequality is that the length of any side of a triangle is *greater* than or equal to the *absolute difference* of the lengths of the other two sides:

$$
\|u - v\| \geq \left|\,\|u\| - \|v\|\,\right|.
$$

*Proof:* By the triangle inequality,

$$
\|u\| = \|(u - v) + v\| \leq \|u - v\| + \|v\|
\quad\Rightarrow\quad
\|u - v\| \geq \|u\| - \|v\|.
$$

Interchanging $u$ and $v$ gives $\|u - v\| = \|v - u\| \geq \|v\| - \|u\|$, and combining the two bounds yields the stated inequality. $\Box$
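Both the triangle inequality and its difference variant can be spot-checked numerically (plain-Python sketch; `inner` and `norm` are our own helpers, redefined so the snippet stands alone):

```python
import math

def inner(u, v):
    """Inner product <u, v> = sum over n of u[n] * conj(v[n])."""
    return sum(a * b.conjugate() for a, b in zip(u, v))

def norm(u):
    """Norm induced by the inner product."""
    return math.sqrt(inner(u, u).real)

u = [3 + 1j, -2 + 0j, 1 - 4j]
v = [1 + 0j, 5 - 2j, -1 + 1j]

s = [a + b for a, b in zip(u, v)]
assert norm(s) <= norm(u) + norm(v)          # triangle inequality

d = [a - b for a, b in zip(u, v)]
assert norm(d) >= abs(norm(u) - norm(v))     # triangle difference inequality
```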

### Vector Cosine

The Cauchy-Schwarz Inequality can be written

$$
\frac{\left|\langle u, v \rangle\right|}{\|u\| \cdot \|v\|} \leq 1.
$$

In the case of real vectors, we may thus define the *angle* $\theta$ between two vectors in $\mathbb{R}^N$ by

$$
\cos\theta \triangleq \frac{\langle u, v \rangle}{\|u\| \cdot \|v\|}.
$$
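For real vectors, the vector cosine recovers the familiar geometric angle, as this small sketch illustrates (the `inner` and `norm` helper names are ours):

```python
import math

def inner(u, v):
    """Inner product <u, v> = sum over n of u[n] * conj(v[n])."""
    return sum(a * b.conjugate() for a, b in zip(u, v))

def norm(u):
    """Norm induced by the inner product."""
    return math.sqrt(inner(u, u).real)

x = [1.0, 0.0]
y = [1.0, 1.0]

cos_theta = inner(x, y) / (norm(x) * norm(y))
theta = math.acos(cos_theta)

# [1,0] and [1,1] meet at 45 degrees (pi/4 radians).
assert abs(theta - math.pi / 4) < 1e-9
```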

### Orthogonality

The vectors (signals) $u$ and $v$^{5.11} are said to be *orthogonal* if $\langle u, v \rangle = 0$, denoted $u \perp v$. That is to say

$$
u \perp v \;\Leftrightarrow\; \langle u, v \rangle = 0.
$$

Note that if $u$ and $v$ are real and orthogonal, the cosine of the angle between them is zero; the vectors meet at a *right angle* and are thus *perpendicular* geometrically.

**Example ($N=2$):** Let $x = [1, 1]$ and $y = [1, -1]$, as shown in Fig.5.8. The inner product is $\langle x, y \rangle = 1 \cdot \overline{1} + 1 \cdot \overline{(-1)} = 1 - 1 = 0$. This shows that the vectors are *orthogonal*. As marked in the figure, the lines intersect at a right angle and are therefore perpendicular.
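The example can be verified in a line of code (plain Python; the `inner` helper is our own):

```python
def inner(u, v):
    """Inner product <u, v> = sum over n of u[n] * conj(v[n])."""
    return sum(a * b.conjugate() for a, b in zip(u, v))

x = [1, 1]
y = [1, -1]

# 1*1 + 1*(-1) = 0, so x and y are orthogonal.
assert inner(x, y) == 0
```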

### The Pythagorean Theorem in N-Space

In 2D, the Pythagorean Theorem says that when $x$ and $y$ are orthogonal, as in Fig.5.8 (*i.e.*, when the vectors $x$ and $y$ intersect at a *right angle*), then we have

$$
\|x + y\|^2 = \|x\|^2 + \|y\|^2. \qquad (5.1)
$$

This relationship generalizes to $N$ dimensions, as we can easily show:

$$
\|x + y\|^2 = \langle x + y, x + y \rangle
= \langle x, x \rangle + \langle x, y \rangle + \langle y, x \rangle + \langle y, y \rangle
= \|x\|^2 + \|y\|^2 + \langle x, y \rangle + \overline{\langle x, y \rangle}.
$$

If $x \perp y$, then $\langle x, y \rangle = 0$ and Eq.(5.1) holds in $N$ dimensions. Note that the converse is not true in $\mathbb{C}^N$. That is, $\|x + y\|^2 = \|x\|^2 + \|y\|^2$ does not imply $x \perp y$ in $\mathbb{C}^N$. For a counterexample, consider $x = (1, 1)$, $y = (j, j)$, in which case

$$
\langle x, y \rangle = 1 \cdot \overline{j} + 1 \cdot \overline{j} = -2j \neq 0,
$$

while $\|x + y\|^2 = 2\left|1 + j\right|^2 = 4 = \|x\|^2 + \|y\|^2$. The cross terms $\langle x, y \rangle + \overline{\langle x, y \rangle} = 2\,\mbox{re}\left\{\langle x, y \rangle\right\}$ vanish because the inner product is purely imaginary.
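Both the $N$-dimensional Pythagorean theorem and the failure of its converse for complex vectors can be confirmed numerically (plain-Python sketch; `inner` and `norm` are our own helpers):

```python
import math

def inner(u, v):
    """Inner product <u, v> = sum over n of u[n] * conj(v[n])."""
    return sum(a * b.conjugate() for a, b in zip(u, v))

def norm(u):
    """Norm induced by the inner product."""
    return math.sqrt(inner(u, u).real)

# Pythagorean theorem for orthogonal vectors:
x, y = [1, 1], [1, -1]
s = [a + b for a, b in zip(x, y)]
assert abs(norm(s) ** 2 - (norm(x) ** 2 + norm(y) ** 2)) < 1e-12

# Converse fails for complex vectors: norms can add without orthogonality.
x, y = [1, 1], [1j, 1j]
s = [a + b for a, b in zip(x, y)]
assert abs(norm(s) ** 2 - (norm(x) ** 2 + norm(y) ** 2)) < 1e-12
assert inner(x, y) != 0      # <x, y> = -2j: not orthogonal
```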

### Projection

The *orthogonal projection* (or simply "projection") of $x$ onto $y$ is defined by

$$
\mathcal{P}_y(x) \triangleq \frac{\langle x, y \rangle}{\|y\|^2}\, y.
$$

The complex scalar $\langle x, y \rangle / \|y\|^2$ is called the *coefficient of projection*. When projecting $x$ onto a *unit length* vector $y$, the coefficient of projection is simply the inner product of $x$ with $y$.

**Motivation:** The basic idea of orthogonal projection of $x$ onto $y$ is to "drop a perpendicular" from $x$ onto $y$ to define a new vector along $y$ which we call the "projection" of $x$ onto $y$. This is illustrated for $N=2$ in Fig.5.9 for $x = [4, 1]$ and $y = [2, 3]$, in which case

$$
\mathcal{P}_y(x) = \frac{\langle x, y \rangle}{\|y\|^2}\, y
= \frac{4 \cdot 2 + 1 \cdot 3}{2^2 + 3^2}\, y
= \frac{11}{13}\, y.
$$

**Derivation:** (1) Since any projection onto $y$ must lie along the line collinear with $y$, write the projection as $\mathcal{P}_y(x) = \alpha y$, where $\alpha$ is a scalar. (2) Since by definition the *projection error* $x - \alpha y$ is orthogonal to $y$, we must have

$$
\langle x - \alpha y, y \rangle = 0
\;\Rightarrow\;
\langle x, y \rangle = \alpha \|y\|^2
\;\Rightarrow\;
\alpha = \frac{\langle x, y \rangle}{\|y\|^2}.
$$

Thus $\mathcal{P}_y(x) = \alpha y = \dfrac{\langle x, y \rangle}{\|y\|^2}\, y$, as claimed.
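The derivation translates directly into code; a sketch in plain Python (the helper names `inner`, `norm`, and `project` are our own, with `project` standing in for $\mathcal{P}_y$):

```python
import math

def inner(u, v):
    """Inner product <u, v> = sum over n of u[n] * conj(v[n])."""
    return sum(a * b.conjugate() for a, b in zip(u, v))

def norm(u):
    """Norm induced by the inner product."""
    return math.sqrt(inner(u, u).real)

def project(x, y):
    """Orthogonal projection of x onto y: (<x, y> / ||y||^2) * y."""
    alpha = inner(x, y) / inner(y, y)   # coefficient of projection
    return [alpha * b for b in y]

x = [4, 1]
y = [2, 3]
p = project(x, y)                        # (11/13) * [2, 3]
err = [a - b for a, b in zip(x, p)]

# The projection error is orthogonal to y, as required by the derivation.
assert abs(inner(err, y)) < 1e-12
```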
