## Matrix Multiplication

Let $A$ be a general $m \times n$ matrix and let $B$ denote a general $n \times p$ matrix. Denote the matrix product by $C = AB$. Then *matrix multiplication* is carried out by computing the *inner product* of every row of $A$ with every column of $B$. Let the $i$th row of $A$ be denoted by $a_i^T$, $i = 1, 2, \ldots, m$, and the $j$th column of $B$ by $b_j$, $j = 1, 2, \ldots, p$. Then the matrix product $C = AB$ is defined as

$$
C = AB \triangleq
\begin{bmatrix}
a_1^T b_1 & a_1^T b_2 & \cdots & a_1^T b_p \\
a_2^T b_1 & a_2^T b_2 & \cdots & a_2^T b_p \\
\vdots & \vdots & \ddots & \vdots \\
a_m^T b_1 & a_m^T b_2 & \cdots & a_m^T b_p
\end{bmatrix}.
$$

This definition can be extended to *complex* matrices by using a definition of inner product which does not conjugate its second argument.
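The difference between this unconjugated product and the usual complex inner product can be checked numerically; a minimal NumPy sketch, with example values chosen only for illustration:

```python
import numpy as np

a = np.array([1 + 1j, 2 - 1j])
b = np.array([1j, 3])

# Matrix multiplication uses the unconjugated product sum(a[k] * b[k]):
print(a @ b)           # (5-2j)

# whereas the usual complex inner product conjugates one argument
# (np.vdot conjugates its first argument):
print(np.vdot(a, b))   # (7+4j)
```

The two results differ precisely because `a @ b` performs no conjugation.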


**Examples:**
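For instance, a $2 \times 3$ matrix times a $3 \times 2$ matrix yields a $2 \times 2$ matrix (the entries below are chosen only for illustration):

$$
\begin{bmatrix} 1 & 2 & 3 \\ 4 & 5 & 6 \end{bmatrix}
\begin{bmatrix} 1 & 0 \\ 0 & 1 \\ 1 & 1 \end{bmatrix}
=
\begin{bmatrix}
1\cdot 1 + 2\cdot 0 + 3\cdot 1 & 1\cdot 0 + 2\cdot 1 + 3\cdot 1 \\
4\cdot 1 + 5\cdot 0 + 6\cdot 1 & 4\cdot 0 + 5\cdot 1 + 6\cdot 1
\end{bmatrix}
=
\begin{bmatrix} 4 & 5 \\ 10 & 11 \end{bmatrix}.
$$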

An $m \times n$ matrix can be multiplied on the *right* by an $n \times q$ matrix, where $q$ is any positive integer. An $m \times n$ matrix can be multiplied on the *left* by a $p \times m$ matrix, where $p$ is any positive integer. Thus, the number of columns in the matrix on the left must equal the number of rows in the matrix on the right.

Matrix multiplication is *non-commutative*, in general. That is, normally $AB \neq BA$ even when both products are defined (such as when the matrices are square). The *transpose of a matrix product* is the product of the transposes in *reverse order*:

$$
(AB)^T = B^T A^T.
$$
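These shape rules, non-commutativity, and the transpose identity can all be checked numerically; a minimal NumPy sketch, with matrix values assumed for illustration:

```python
import numpy as np

A = np.arange(6).reshape(2, 3)   # a 2x3 matrix
B = np.arange(6).reshape(3, 2)   # a 3x2 matrix

# Columns of the left factor must equal rows of the right factor:
C = A @ B                        # (2x3)(3x2) -> 2x2
print(C.shape)                   # (2, 2)

# Non-commutativity: here B @ A is 3x3, so A @ B and B @ A
# are not even the same shape.
print((B @ A).shape)             # (3, 3)

# Transpose of a product is the product of transposes in reverse order:
print(np.array_equal((A @ B).T, B.T @ A.T))   # True
```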

The *identity matrix* is denoted by $I$ and is defined as

$$
I \triangleq
\begin{bmatrix}
1 & 0 & \cdots & 0 \\
0 & 1 & \cdots & 0 \\
\vdots & \vdots & \ddots & \vdots \\
0 & 0 & \cdots & 1
\end{bmatrix}.
$$

Note that the identity matrix is *square*. The $n \times n$ identity matrix $I$, sometimes denoted as $I_n$, satisfies $A \cdot I_n = A$ for every $m \times n$ matrix $A$. Similarly, $I_m \cdot A = A$ for every $m \times n$ matrix $A$.

As a special case, a matrix $A$ times a vector $x$ produces a new vector $y = Ax$ which consists of the inner product of every row of $A$ with $x$:

$$
y = Ax, \qquad y_i = a_i^T x, \quad i = 1, 2, \ldots, m.
$$

A matrix times a vector defines a *linear transformation* of $x$. In fact, every linear function of a vector can be expressed as a matrix multiply. In particular, every linear *filtering* operation can be expressed as a matrix multiply applied to the input signal. As a special case, every linear, time-invariant (LTI) filtering operation can be expressed as a matrix multiply in which the matrix is *Toeplitz*, *i.e.*, $A[i,j] = A[i-j]$ (constant along *diagonals*). As a further special case, a row vector on the left may be multiplied by a column vector on the right to form a *single inner product*:

$$
y = a^T b,
$$

which is a $1 \times 1$ matrix, *i.e.*, a scalar.
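The Toeplitz view of LTI filtering can be made concrete by building the filtering matrix explicitly; a minimal NumPy sketch, with the filter and signal values assumed for illustration:

```python
import numpy as np

# A causal FIR filter h applied to a length-N input x
# (values chosen only for illustration).
h = np.array([1.0, 0.5, 0.25])    # impulse response
x = np.array([1.0, 2.0, 3.0, 4.0])
N = len(x)

# Build the N x N Toeplitz filtering matrix: A[i, j] = h[i - j]
# (constant along diagonals, zero outside the filter's support).
A = np.zeros((N, N))
for i in range(N):
    for j in range(N):
        if 0 <= i - j < len(h):
            A[i, j] = h[i - j]

y = A @ x                         # LTI filtering as a matrix multiply
print(y)                          # [1.   2.5  4.25 6.  ]

# Same result as direct convolution truncated to the input length:
print(np.convolve(h, x)[:N])
```

Each row of `A` is the previous row shifted one place to the right, which is exactly the "constant along diagonals" Toeplitz property.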
