



11/16/11  Orthogonal Complements and Orthogonal Projections




 Let W be a subspace of Rn. The orthogonal complement of W is the set of all vectors in Rn that are orthogonal to every vector in W. The orthogonal complement of W is denoted W⊥ and read "W perp".





 The orthogonal complement of the rowspace of A is the nullspace of A.






Note that matrix A has 3 independent rows (and therefore rank(A) = 3). The 3 independent rows provide a basis for the rowspace (i.e., every vector in the rowspace is a linear combination of those 3 rows). Note that the rowspace is a 3-dimensional subspace of R4 (the space the row vectors live in).
The nullspace has a single basis vector and every vector in the nullspace is a multiple of that vector.
Note that each of the basis vectors in row(A) is orthogonal to the basis vector in null(A). In fact, this is required by the definition of the nullspace: the nullspace contains all the vectors x with Ax = 0, and each entry of Ax is the dot product of a row of A with x, so every row of A is orthogonal to every vector in the nullspace. Of course, why didn't we think of that sooner?
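This orthogonality is easy to check numerically. Here is a small NumPy sketch; the matrix is a made-up rank-3 stand-in, since the lecture's actual matrix did not survive into these notes:

```python
import numpy as np

# Hypothetical 3x4 matrix with 3 independent rows (a stand-in for the
# matrix from the lecture, which is missing from these notes).
A = np.array([[1, 0, 0, 1],
              [0, 1, 0, 1],
              [0, 0, 1, 1]], dtype=float)

# Basis vector for null(A), found by solving Ax = 0 by hand:
# x1 = x2 = x3 = -x4, so take x = (1, 1, 1, -1).
x = np.array([1, 1, 1, -1], dtype=float)

print(A @ x)                    # each entry is a row of A dotted with x; all zero
print([row @ x for row in A])   # every row of A is orthogonal to x
```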





 The orthogonal complement of the columnspace of A is the nullspace of AT.






Note that matrix A has 3 independent columns (and therefore rank(A) = 3). The 3 independent columns provide a basis for the columnspace (i.e., every vector in the columnspace is a linear combination of those 3 columns). Note that the columnspace is a 3-dimensional subspace of R4 (the space the column vectors live in).
The nullspace of AT has a single basis vector and every vector in the nullspace of AT is a multiple of that vector.
Note that each of the basis vectors in col(A) is orthogonal to the basis vector in null(AT). Again, this is required by the definitions: the columns of A are the rows of AT, and the nullspace of AT contains all the vectors y with ATy = 0, so the rows of AT (= the columns of A) are orthogonal to every vector in null(AT).
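The same numerical check works for columns. Again the matrix is a made-up rank-3 stand-in for the missing one:

```python
import numpy as np

# Hypothetical 4x3 matrix with 3 independent columns (a stand-in for the
# lecture's matrix, which is missing from these notes).
A = np.array([[1, 0, 0],
              [0, 1, 0],
              [0, 0, 1],
              [1, 1, 1]], dtype=float)

# Basis vector for null(AT), found by solving (AT)y = 0 by hand.
y = np.array([1, 1, 1, -1], dtype=float)

print(A.T @ y)                          # zero vector
print([A[:, j] @ y for j in range(3)])  # each column of A is orthogonal to y
```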





 Let W be a subspace of Rn and let {u1,...,uk} be an orthogonal basis for W. For any vector v in Rn, the orthogonal projection of v onto W is defined as:

 proj_W(v) = ((u1·v)/(u1·u1)) u1 + ((u2·v)/(u2·u2)) u2 + ... + ((uk·v)/(uk·uk)) uk




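This definition translates directly into code. A minimal NumPy sketch, with a made-up basis and vector for illustration:

```python
import numpy as np

def proj_W(v, basis):
    # Orthogonal projection of v onto span(basis), where basis is a list
    # of mutually orthogonal (not necessarily unit) vectors:
    # proj_W(v) = sum over u in basis of ((u . v) / (u . u)) u
    return sum(((u @ v) / (u @ u)) * u for u in basis)

# Made-up example: project v onto the xy-plane, whose orthogonal basis
# is the two standard vectors e1 and e2.
e1 = np.array([1.0, 0.0, 0.0])
e2 = np.array([0.0, 1.0, 0.0])
v  = np.array([2.0, 3.0, 5.0])
print(proj_W(v, [e1, e2]))   # [2. 3. 0.]
```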

 The set of vectors below is an orthogonal basis for a subspace W of R3:
The vector
is independent of the basis above (how do we know?).
We can find its projection onto the plane W in two ways:





 First, we project the vector
onto each of the two orthogonal basis vectors in W and then add the projections together:





 Second, we project the vector
onto the normal to the plane formed by
and subtract that projection from

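Both routes can be checked numerically. A NumPy sketch with made-up vectors, since the lecture's actual basis and v are missing from these notes: u1 and u2 form an orthogonal basis for a plane W in R3, and n = u1 × u2 is normal to that plane.

```python
import numpy as np

# Assumed data: an orthogonal basis {u1, u2} for a plane W and a vector v.
u1 = np.array([1.0, 1.0, 0.0])
u2 = np.array([1.0, -1.0, 1.0])     # u1 . u2 == 0, so the basis is orthogonal
v  = np.array([1.0, 2.0, 3.0])

# Way 1: project v onto each basis vector and add the projections.
p1 = (u1 @ v) / (u1 @ u1) * u1 + (u2 @ v) / (u2 @ u2) * u2

# Way 2: project v onto the normal n = u1 x u2 and subtract that from v.
n  = np.cross(u1, u2)
p2 = v - (n @ v) / (n @ n) * n

print(p1)
print(p2)   # both routes give the same projection of v onto W
```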




 The component of v orthogonal to W is the vector:

 perp_W(v) = v - proj_W(v)





 In the previous example, we computed the projection of v onto W.
Subtracting that projection from the original vector v yields the component of v orthogonal to W.





 Let W be a subspace of Rn and let v be a vector in Rn. Then there are unique vectors w in W and w⊥ in W⊥ such that v = w + w⊥.





 This simply says that in the last example, the vector v is equal to its projection onto the plane plus its component perpendicular to the plane.
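A quick numerical check of this decomposition, again with made-up stand-ins for the lecture's missing vectors:

```python
import numpy as np

# Assumed orthogonal basis for a plane W in R^3, and a vector v.
u1 = np.array([1.0, 1.0, 0.0])
u2 = np.array([1.0, -1.0, 1.0])    # u1 . u2 == 0
v  = np.array([1.0, 2.0, 3.0])

w = (u1 @ v) / (u1 @ u1) * u1 + (u2 @ v) / (u2 @ u2) * u2  # w = proj_W(v), in W
w_perp = v - w                                             # w_perp, in W-perp

print(np.allclose(v, w + w_perp))   # v = w + w_perp
print(w_perp @ u1, w_perp @ u2)     # both (numerically) zero: w_perp is in W-perp
```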





 If W is a subspace of Rn, then dim W + dim W⊥ = n





 In the previous example, the two orthogonal basis vectors in W span a 2-dimensional subspace (a plane) of R3. The orthogonal complement of W has a single basis vector
in W⊥ which spans a 1-dimensional subspace (a line). W and W⊥ together account for all of R3. Hence the sum of the dimensions is 2 + 1 = 3, the dimension of R3.
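The dimension count can be checked with NumPy's matrix_rank, again using stand-in basis vectors for the missing ones:

```python
import numpy as np

# Assumed: W spanned by two orthogonal vectors in R^3 (stand-ins for the
# lecture's basis, which is missing from these notes).
U = np.array([[1.0, 1.0, 0.0],
              [1.0, -1.0, 1.0]])

n = U.shape[1]                       # ambient dimension (here 3)
dim_W = np.linalg.matrix_rank(U)     # dim W = number of independent basis vectors
dim_W_perp = n - dim_W               # dim W + dim W-perp = n
print(dim_W, dim_W_perp)             # 2 1
```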





 Read Section 5.2 of the text, pp. 389-398.






