
Section 4.4 Matrix Multiplication

Objectives
  1. Understand compositions of transformations.
  2. Understand the relationship between matrix products and compositions of matrix transformations.
  3. Become comfortable doing basic algebra involving matrices.
  4. Recipe: matrix multiplication (two ways).
  5. Picture: composition of transformations.
  6. Vocabulary: composition.

In this section, we study compositions of transformations. As we will see, composition is a way of chaining transformations together. The composition of matrix transformations corresponds to a notion of multiplying two matrices together. We also discuss addition and scalar multiplication of transformations and of matrices.

Subsection 4.4.1 Composition of linear transformations

Composition means the same thing in linear algebra as it does in Calculus. Here is the definition.

Definition

Let T : R^n → R^m and U : R^p → R^n be transformations. Their composition is the transformation T ∘ U : R^p → R^m defined by

(T ∘ U)(x) = T(U(x)).

Composing two transformations means chaining them together: T ∘ U is the transformation that first applies U, then applies T (note the order of operations). More precisely, to evaluate T ∘ U on an input vector x, first you evaluate U(x), then you take this output vector of U and use it as an input vector of T: that is, (T ∘ U)(x) = T(U(x)). Of course, this only makes sense when the outputs of U are valid inputs of T, that is, when the range of U is contained in the domain of T.
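The evaluation order can be sketched in a few lines of NumPy. The particular matrices below (a projection U : R^3 → R^2 followed by a rotation T : R^2 → R^2) are illustrative choices, not examples from the text:

```python
import numpy as np

A = np.array([[0.0, -1.0],
              [1.0,  0.0]])        # standard matrix of T : R^2 -> R^2 (rotation by 90 degrees)
B = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0]])    # standard matrix of U : R^3 -> R^2 (drop last coordinate)

def T(x):
    return A @ x

def U(x):
    return B @ x

def T_after_U(x):
    # (T o U)(x) = T(U(x)): first apply U, then feed its output into T.
    return T(U(x))

print(T_after_U(np.array([1.0, 2.0, 3.0])))
```

Note that U must be applied first: its output lives in R^2, which is exactly the domain of T.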

[Diagram: x in R^p is sent by U to U(x) in R^n, which is sent by T to T(U(x)) in R^m; the combined arrow is T ∘ U.]

Here is a picture of the composition T ∘ U as a “machine” that first runs U, then takes its output and feeds it into T; there is a similar picture in this subsection in Section 4.1.

[Figure: the composition T ∘ U as a machine that feeds x in R^p through U to get U(x) in R^n, then through T to produce (T ∘ U)(x) in R^m.]
Domain and codomain of a composition

  • In order for T ∘ U to be defined, the codomain of U must equal the domain of T.
  • The domain of T ∘ U is the domain of U.
  • The codomain of T ∘ U is the codomain of T.

Recall from this definition in Section 4.1 that the identity transformation is the transformation Id_{R^n} : R^n → R^n defined by Id_{R^n}(x) = x for every vector x.

Properties of composition

Let S, T, U be transformations and let c be a scalar. Suppose that T : R^n → R^m, and that in each of the following identities, the domains and the codomains are compatible when necessary for the composition to be defined. The following properties are easily verified:

T ∘ Id_{R^n} = T
Id_{R^m} ∘ T = T
c(T ∘ U) = (cT) ∘ U
c(T ∘ U) = T ∘ (cU)   if T is linear
S ∘ (T + U) = S ∘ T + S ∘ U   if S is linear
(S + T) ∘ U = S ∘ U + T ∘ U
S ∘ (T ∘ U) = (S ∘ T) ∘ U

The final property is called associativity. Unwrapping both sides, it says:

(S ∘ (T ∘ U))(x) = S((T ∘ U)(x)) = S(T(U(x))) = (S ∘ T)(U(x)) = ((S ∘ T) ∘ U)(x).

In other words, both S ∘ (T ∘ U) and (S ∘ T) ∘ U are the transformation defined by first applying U, then T, then S.

Composition of transformations is not commutative in general. That is, in general, T ∘ U ≠ U ∘ T, even when both compositions are defined.
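A quick numerical illustration of non-commutativity. The two transformations of R^2 below (a 90-degree rotation and a reflection across the x-axis) are illustrative choices:

```python
import numpy as np

def T(x):
    # Rotation of R^2 by 90 degrees counterclockwise.
    return np.array([[0.0, -1.0], [1.0, 0.0]]) @ x

def U(x):
    # Reflection of R^2 across the x-axis.
    return np.array([[1.0, 0.0], [0.0, -1.0]]) @ x

x = np.array([1.0, 0.0])
print(T(U(x)))   # rotate after reflect
print(U(T(x)))   # reflect after rotate: a different vector
```

On the input (1, 0) the two orders give (0, 1) and (0, −1), so T ∘ U ≠ U ∘ T.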

Subsection 4.4.2 Matrix multiplication

In this subsection, we introduce a seemingly unrelated operation on matrices, namely, matrix multiplication. As we will see in the next subsection, matrix multiplication exactly corresponds to the composition of the corresponding linear transformations. First we need some terminology.

Notation

Let A be an m×n matrix. We will generally write a_{ij} for the entry in the ith row and the jth column. It is called the i,j entry of the matrix.

$$A = \begin{pmatrix}
a_{11} & \cdots & a_{1j} & \cdots & a_{1n} \\
\vdots &        & \vdots &        & \vdots \\
a_{i1} & \cdots & a_{ij} & \cdots & a_{in} \\
\vdots &        & \vdots &        & \vdots \\
a_{m1} & \cdots & a_{mj} & \cdots & a_{mn}
\end{pmatrix}$$

(the entry $a_{ij}$ sits in the $i$th row and the $j$th column)
Definition(Matrix multiplication)

Let A be an m×n matrix and let B be an n×p matrix. Denote the columns of B by v_1, v_2, ..., v_p:

$$B = \begin{pmatrix} | & | & & | \\ v_1 & v_2 & \cdots & v_p \\ | & | & & | \end{pmatrix}.$$

The product AB is the m×p matrix with columns Av_1, Av_2, ..., Av_p:

$$AB = \begin{pmatrix} | & | & & | \\ Av_1 & Av_2 & \cdots & Av_p \\ | & | & & | \end{pmatrix}.$$

In other words, matrix multiplication is defined column-by-column, or “distributes over the columns of B.”

In order for the vectors Av_1, Av_2, ..., Av_p to be defined, the number of rows of B has to equal the number of columns of A.

The sizes of the matrices in the matrix product

  • In order for AB to be defined, the number of rows of B has to equal the number of columns of A.
  • The product of an m×n matrix and an n×p matrix is an m×p matrix.

If B has only one column, then AB also has one column. A matrix with one column is the same as a vector, so the definition of the matrix product generalizes the definition of the matrix-vector product from this definition in Section 2.4.
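The column-by-column definition translates directly into code. Here is a minimal sketch (the matrices A and B are illustrative) that builds AB one column at a time and checks the result against NumPy's built-in product:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])          # 3 x 2
B = np.array([[1.0, 0.0, 2.0],
              [0.0, 1.0, 3.0]])     # 2 x 3

# The j-th column of AB is A times the j-th column of B.
cols = [A @ B[:, j] for j in range(B.shape[1])]
AB = np.column_stack(cols)          # 3 x 3

assert np.allclose(AB, A @ B)
```

Each `A @ B[:, j]` is an ordinary matrix-vector product, which is why this definition generalizes the matrix-vector product.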

If A is a square matrix, then we can multiply it by itself; we define its powers to be

A² = AA,   A³ = AAA,   etc.
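In NumPy, repeated products can be written out directly or computed with `numpy.linalg.matrix_power`; the matrix below is an illustrative choice:

```python
import numpy as np

A = np.array([[1, 1],
              [0, 1]])

# A^2 = AA and A^3 = AAA, spelled out and via matrix_power.
assert np.array_equal(A @ A, np.linalg.matrix_power(A, 2))
assert np.array_equal(A @ A @ A, np.linalg.matrix_power(A, 3))
print(np.linalg.matrix_power(A, 3))
```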
The row-column rule for matrix multiplication

Recall from this definition in Section 2.4 that the product of a row vector and a column vector is the scalar

$$\begin{pmatrix} a_1 & a_2 & \cdots & a_n \end{pmatrix}
\begin{pmatrix} x_1 \\ x_2 \\ \vdots \\ x_n \end{pmatrix}
= a_1x_1 + a_2x_2 + \cdots + a_nx_n.$$

The following procedure for finding the matrix product is much better adapted to computations by hand; the previous definition is more suitable for proving theorems, such as this theorem below.

Recipe: The row-column rule for matrix multiplication

Let A be an m×n matrix, let B be an n×p matrix, and let C = AB. Then the i,j entry of C is the ith row of A times the jth column of B:

$$c_{ij} = a_{i1}b_{1j} + a_{i2}b_{2j} + \cdots + a_{in}b_{nj}.$$

Here is a diagram:

$$\begin{pmatrix}
a_{11} & \cdots & a_{1k} & \cdots & a_{1n} \\
\vdots &        & \vdots &        & \vdots \\
a_{i1} & \cdots & a_{ik} & \cdots & a_{in} \\
\vdots &        & \vdots &        & \vdots \\
a_{m1} & \cdots & a_{mk} & \cdots & a_{mn}
\end{pmatrix}
\begin{pmatrix}
b_{11} & \cdots & b_{1j} & \cdots & b_{1p} \\
\vdots &        & \vdots &        & \vdots \\
b_{k1} & \cdots & b_{kj} & \cdots & b_{kp} \\
\vdots &        & \vdots &        & \vdots \\
b_{n1} & \cdots & b_{nj} & \cdots & b_{np}
\end{pmatrix}
=
\begin{pmatrix}
c_{11} & \cdots & c_{1j} & \cdots & c_{1p} \\
\vdots &        & \vdots &        & \vdots \\
c_{i1} & \cdots & c_{ij} & \cdots & c_{ip} \\
\vdots &        & \vdots &        & \vdots \\
c_{m1} & \cdots & c_{mj} & \cdots & c_{mp}
\end{pmatrix}$$

(the $i$th row of the first matrix times the $j$th column of the second gives the $i,j$ entry of the product)
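The row-column rule is exactly the triple loop one would write by hand. A minimal sketch, checked against NumPy's `@` operator (the test matrices are illustrative):

```python
import numpy as np

def matmul_row_column(A, B):
    """Row-column rule: c_ij is the i-th row of A times the j-th column of B."""
    m, n = A.shape
    n2, p = B.shape
    assert n == n2, "number of rows of B must equal number of columns of A"
    C = np.zeros((m, p))
    for i in range(m):
        for j in range(p):
            C[i, j] = sum(A[i, k] * B[k, j] for k in range(n))
    return C

A = np.array([[1.0, 2.0], [3.0, 4.0]])
B = np.array([[5.0, 6.0], [7.0, 8.0]])
assert np.allclose(matmul_row_column(A, B), A @ B)
```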

Although matrix multiplication satisfies many of the properties one would expect (see the end of the section), one must be careful when doing matrix arithmetic, as there are several properties that are not satisfied in general.

Matrix multiplication caveats

  • Matrix multiplication is not commutative: AB is not usually equal to BA, even when both products are defined and have the same size. See this example.
  • Matrix multiplication does not satisfy the cancellation law: AB = AC does not imply B = C, even when A ≠ 0. For example,
    $$\begin{pmatrix}1&0\\0&0\end{pmatrix}\begin{pmatrix}1&2\\3&4\end{pmatrix} = \begin{pmatrix}1&2\\0&0\end{pmatrix} = \begin{pmatrix}1&0\\0&0\end{pmatrix}\begin{pmatrix}1&2\\5&6\end{pmatrix}.$$
  • It is possible for AB = 0, even when A ≠ 0 and B ≠ 0. For example,
    $$\begin{pmatrix}1&0\\1&0\end{pmatrix}\begin{pmatrix}0&0\\1&1\end{pmatrix} = \begin{pmatrix}0&0\\0&0\end{pmatrix}.$$
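These caveats are easy to confirm numerically. The sketch below uses the same example matrices as the bullets above:

```python
import numpy as np

# No cancellation: AB = AC even though B != C (and A != 0).
A = np.array([[1, 0], [0, 0]])
B = np.array([[1, 2], [3, 4]])
C = np.array([[1, 2], [5, 6]])
assert np.array_equal(A @ B, A @ C)
assert not np.array_equal(B, C)

# A zero product from two nonzero factors.
D = np.array([[1, 0], [1, 0]])
E = np.array([[0, 0], [1, 1]])
assert np.array_equal(D @ E, np.zeros((2, 2)))

# And non-commutativity: AB != BA in general.
assert not np.array_equal(A @ B, B @ A)
```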

While matrix multiplication is not commutative in general, there are examples of matrices A and B with AB = BA. For example, this always works when A is the zero matrix, or when A = B. The reader is encouraged to find other examples.

Subsection 4.4.3 Composition and Matrix Multiplication

The point of this subsection is to show that matrix multiplication corresponds to composition of transformations; that is, the standard matrix for T ∘ U is the product of the standard matrices for T and for U. It may be hard to believe that our complicated formula for matrix multiplication actually means something as intuitive as “chaining two transformations together”!

Theorem

Let T : R^n → R^m and U : R^p → R^n be linear transformations, and let A and B be their standard matrices, respectively. Then the standard matrix of the composition T ∘ U is the product AB.

The theorem justifies our choice of definition of the matrix product. This is the one and only reason that matrix products are defined in this way. To rephrase:

Products and compositions

The matrix of the composition of two linear transformations is the product of the matrices of the transformations.
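This correspondence can be sanity-checked numerically: applying the two transformations one after the other agrees with multiplying by the product matrix. The rotation and scaling matrices below are illustrative choices:

```python
import numpy as np

B = np.array([[2.0, 0.0], [0.0, 2.0]])   # standard matrix of U : R^2 -> R^2 (scale by 2)
A = np.array([[0.0, -1.0], [1.0, 0.0]])  # standard matrix of T : R^2 -> R^2 (rotate 90 degrees)

x = np.array([3.0, 1.0])
# Applying T o U to x, i.e. A(Bx), agrees with multiplying x by the product AB.
assert np.allclose(A @ (B @ x), (A @ B) @ x)
```

Note the order: the matrix of T ∘ U is AB, with the matrix of the first-applied transformation U on the right.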

Recall from this definition in Section 4.3 that the identity matrix is the n×n matrix I_n whose columns are the standard coordinate vectors in R^n. The identity matrix is the standard matrix of the identity transformation: that is, x = Id_{R^n}(x) = I_n x for all vectors x in R^n. For any linear transformation T : R^n → R^m we have

Id_{R^m} ∘ T = T

and by the same token, for any m×n matrix A we have

I_m A = A.

Similarly, we have T ∘ Id_{R^n} = T and A I_n = A.
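Both identity-matrix identities are one-liners to verify; the 2×3 matrix is an illustrative choice:

```python
import numpy as np

A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0]])     # a 2 x 3 matrix

I2, I3 = np.eye(2), np.eye(3)
assert np.array_equal(I2 @ A, A)    # I_m A = A  (m = 2)
assert np.array_equal(A @ I3, A)    # A I_n = A  (n = 3)
```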

Subsection 4.4.4 The algebra of transformations and matrices

In this subsection we describe two more operations that one can perform on transformations: addition and scalar multiplication. We then translate these operations into the language of matrices. This is analogous to what we did for the composition of linear transformations, but much less subtle.

Definition

  • Let T, U : R^n → R^m be two transformations. Their sum is the transformation T + U : R^n → R^m defined by
    (T + U)(x) = T(x) + U(x).
    Note that addition of transformations is only defined when both transformations have the same domain and codomain.
  • Let T : R^n → R^m be a transformation, and let c be a scalar. The scalar product of c with T is the transformation cT : R^n → R^m defined by
    (cT)(x) = c · T(x).

To emphasize, the sum of two transformations T, U : R^n → R^m is another transformation called T + U; its value on an input vector x is the sum of the outputs of T and U. Similarly, the product of T with a scalar c is another transformation called cT; its value on an input vector x is the vector c · T(x).
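These definitions are pointwise, which makes them very short in code. The transformations T and U below are hypothetical examples on R^2:

```python
import numpy as np

def T(x):
    return np.array([x[0] + x[1], x[0]])

def U(x):
    return np.array([2.0 * x[0], -x[1]])

def T_plus_U(x):
    return T(x) + U(x)          # (T + U)(x) = T(x) + U(x)

def cT(c, x):
    return c * T(x)             # (cT)(x) = c * T(x)

x = np.array([1.0, 2.0])
print(T_plus_U(x), cT(3.0, x))
```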

Properties of addition and scalar multiplication for transformations

Let S,T,U:RnRm be transformations and let c,d be scalars. The following properties are easily verified:

T + U = U + T
S + (T + U) = (S + T) + U
c(T + U) = cT + cU
(c + d)T = cT + dT
c(dT) = (cd)T
T + 0 = T

In one of the above properties, we used 0 to denote the transformation R^n → R^m that is zero on every input vector: 0(x) = 0 for all x. This is called the zero transformation.

We now give the analogous operations for matrices.

Definition

  • The sum of two m×n matrices A and B is the matrix A + B obtained by summing the entries of A and B individually:
    $$\begin{pmatrix}a_{11}&a_{12}&a_{13}\\a_{21}&a_{22}&a_{23}\end{pmatrix} + \begin{pmatrix}b_{11}&b_{12}&b_{13}\\b_{21}&b_{22}&b_{23}\end{pmatrix} = \begin{pmatrix}a_{11}+b_{11}&a_{12}+b_{12}&a_{13}+b_{13}\\a_{21}+b_{21}&a_{22}+b_{22}&a_{23}+b_{23}\end{pmatrix}$$
    In other words, the i,j entry of A + B is the sum of the i,j entries of A and B. Note that addition of matrices is only defined when both matrices have the same size.
  • The scalar product of a scalar c with a matrix A is obtained by scaling all entries of A by c:
    $$c\begin{pmatrix}a_{11}&a_{12}&a_{13}\\a_{21}&a_{22}&a_{23}\end{pmatrix} = \begin{pmatrix}ca_{11}&ca_{12}&ca_{13}\\ca_{21}&ca_{22}&ca_{23}\end{pmatrix}$$
    In other words, the i,j entry of cA is c times the i,j entry of A.
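Both operations are entrywise, as a quick check confirms (the matrices are illustrative):

```python
import numpy as np

A = np.array([[1, 2, 3],
              [4, 5, 6]])
B = np.array([[10, 20, 30],
              [40, 50, 60]])

# The i,j entry of A + B is the sum of the i,j entries of A and B;
# the i,j entry of cA is c times the i,j entry of A.
assert np.array_equal(A + B, np.array([[11, 22, 33], [44, 55, 66]]))
assert np.array_equal(2 * A, np.array([[2, 4, 6], [8, 10, 12]]))
```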

Since these operations on matrices correspond to the same operations on the associated transformations, the following properties are consequences of the corresponding properties of transformations. They are easily verified directly from the definitions as well.

Properties of addition and scalar multiplication for matrices

Let A,B,C be m×n matrices and let c,d be scalars. Then:

A + B = B + A
C + (A + B) = (C + A) + B
c(A + B) = cA + cB
(c + d)A = cA + dA
c(dA) = (cd)A
A + 0 = A

In one of the above properties, we used 0 to denote the m×n matrix whose entries are all zero. This is the standard matrix of the zero transformation, and is called the zero matrix.

We can also combine addition and scalar multiplication of matrices with multiplication of matrices. Since matrix multiplication corresponds to composition of transformations (theorem), the following properties are consequences of the corresponding properties of transformations.

Properties of matrix multiplication

Let A,B,C be matrices and let c be a scalar. Suppose that A is an m×n matrix, and that in each of the following identities, the sizes of B and C are compatible when necessary for the product to be defined. Then:

C(A + B) = CA + CB
(A + B)C = AC + BC
c(AB) = (cA)B
c(AB) = A(cB)
A I_n = A
I_m A = A
(AB)C = A(BC)

Most of the above properties are easily verified directly from the definitions. The associativity property (AB)C=A(BC), however, is not (try it!). It is much easier to prove by relating matrix multiplication to composition of transformations, and using the obvious fact that composition of transformations is associative.
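Associativity is easy to check on random matrices of compatible sizes (a numerical spot check, not a proof; the sizes below are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((2, 3))
B = rng.standard_normal((3, 4))
C = rng.standard_normal((4, 5))

# Both groupings give the same 2 x 5 matrix, up to floating-point rounding.
assert np.allclose((A @ B) @ C, A @ (B @ C))
```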