We saw in the last video that orthonormal bases make for good coordinate systems, where it's easy to figure out the coordinates. Let's see if there are other useful reasons to have an orthonormal basis.

Say I have some subspace V of R^n, and say B = {v1, v2, ..., vk} is an orthonormal basis for V. That's just a fancy way of saying that all of these vectors have length 1 and they're all orthogonal to each other.

Now, we've seen many times before that any vector x in R^n can be represented as x = v + w, where v is a member of my subspace V and w is a member of V's orthogonal complement. We saw this in the set of videos on orthogonal complements. By definition, v is the projection of x onto V, and w is the projection of x onto the orthogonal complement.

We also saw in the past that this is not an easy thing to find. If I set up a matrix A whose columns are my basis vectors v1, v2, ..., vk, then we learned that the projection of any vector x onto V is

proj_V(x) = A (A^T A)^(-1) A^T x,

and that is a pain to figure out.
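As a concrete illustration, here is a minimal NumPy sketch of that general formula; the basis and vector values are hypothetical, chosen so the columns of A are linearly independent but not orthonormal.

```python
import numpy as np

# Hypothetical basis for a 2-D subspace V of R^3; the columns of A
# are linearly independent but deliberately NOT orthonormal.
A = np.array([[1.0, 1.0],
              [0.0, 1.0],
              [1.0, 0.0]])
x = np.array([1.0, 2.0, 3.0])

# General projection formula: proj_V(x) = A (A^T A)^(-1) A^T x
proj = A @ np.linalg.inv(A.T @ A) @ A.T @ x

# The residual x - proj lies in the orthogonal complement of V,
# so it should be orthogonal to every column of A.
print(np.allclose(A.T @ (x - proj), 0))  # True
```

Note the matrix inverse in the middle; that is the part we would like to avoid.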
But let's see if the assumption that these vectors form an orthonormal set simplifies this in any way.

The first thing we can do is explore this a little. The vector v is a member of our subspace, which means it can be represented as a linear combination of my basis vectors, so I can write

x = c1 v1 + c2 v2 + ... + ck vk + w.

The sum of the first k terms is v, some unique member of my subspace, and you can also view it as the projection of x onto V; the last term w is the member of the orthogonal complement.

One clarification: in the last video we assumed that x was inside the subspace, so that x itself could be represented with these coordinates. Now x can be any member of R^n, and we're looking at its projection, so x is some combination of the basis vectors plus some member of the orthogonal complement.

Now, what happens if we dot both sides of this equation with one of the basis vectors, say vi, the i-th basis vector in the basis for my subspace V? On the left I just get vi . x, and on the right I get

vi . x = c1 (vi . v1) + c2 (vi . v2) + ... + ci (vi . vi) + ... + ck (vi . vk) + vi . w.
Something very similar happens to what we saw in the last video. What is vi . v1? These are different members of an orthonormal set, so they're orthogonal, and that term is 0. vi . v2 is 0 as well, assuming i isn't 2. vi . vi is 1, so that term is just ci. And vi . vk is 0 too; it doesn't matter what the constant is, because 0 times anything is 0. Finally, what is vi . w? By definition w is a member of the orthogonal complement of V, which means it is orthogonal to every member of V; vi is a member of V, so that dot product is also 0. Just like that, you get

ci = vi . x.

This is a very similar result to the one we got last time, but remember, we're not assuming that x is a member of V. In that case the ci's would be the coordinates of x itself; here they're the coordinates of the projection of x onto V, the member of V that represents x's component in V. So if we now want to find the projection of x onto V, it's equal to these ci's times the respective basis vectors, and now we know what the ci's are:

proj_V(x) = (v1 . x) v1 + (v2 . x) v2 + ... + (vk . x) vk.

Just like that, we get a pretty simple way of figuring out the projection onto a subspace with an orthonormal basis.
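A quick sketch of this shortcut, using a hypothetical orthonormal basis for a plane in R^3 (the vectors are made up for the example, but you can check they have length 1 and are orthogonal):

```python
import numpy as np

# Hypothetical orthonormal basis {v1, v2} for a 2-D subspace of R^3:
# both vectors have length 1 and v1 . v2 = 0.
v1 = np.array([1.0, 1.0, 0.0]) / np.sqrt(2.0)
v2 = np.array([0.0, 0.0, 1.0])
x = np.array([3.0, 4.0, 5.0])

# Each coefficient is just a dot product, c_i = v_i . x,
# and the projection is the sum of c_i * v_i.
proj = (v1 @ x) * v1 + (v2 @ x) * v2
print(proj)  # [3.5 3.5 5. ]
```

No matrix inverse is needed; each coefficient costs one dot product.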
Recall what we did when we took the projection of x onto a line L, the span of some unit vector u, where u has length 1 (tu for any real number t is just a line). The projection onto the line simplified to

proj_L(x) = (x . u) u.

Notice: when we're dealing with an orthonormal basis for a subspace, taking the projection of any vector in R^n onto that subspace is essentially just finding its projection onto the line spanned by each of the basis vectors, (v1 . x) v1 and so on, and adding them up. That's all it is. But clearly this is a much, much simpler way of finding a projection than going through the mess of A (A^T A)^(-1) A^T x.

You might say: okay, this is easier, but you've told me that a projection is a linear transformation, so I want to figure out the matrix. We could always, for any particular x, take the dot product with each of the basis vectors, use those as the coefficients, multiply them by the basis vectors, add them up, and there's your projection. But some of us might actually want the transformation matrix, so let's figure out what it is. Let me rewrite what we already know: the projection onto any subspace V is

proj_V(x) = A (A^T A)^(-1) A^T x,

where the columns of A are just the basis vectors v1, v2, ..., vk. Now let's see if the assumption that these vectors are an orthonormal basis simplifies this at all.
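The line case can be sketched the same way; the unit vector u and the vector x below are hypothetical values for illustration:

```python
import numpy as np

# Projection onto the line L = span{u}, where u is a unit vector:
# proj_L(x) = (x . u) u
u = np.array([1.0, 1.0]) / np.sqrt(2.0)  # length-1 direction vector
x = np.array([3.0, 1.0])

proj = (x @ u) * u
print(proj)  # [2. 2.]
```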
In particular, let's look at A^T A. These basis vectors are members of R^n, so A is an n-by-k matrix, A^T is k-by-n, and a k-by-n matrix times an n-by-k matrix gives a k-by-k product. So A^T A is k-by-k. And what does A^T look like? Each of the columns of A becomes a row, so the first row of A^T is v1^T, the second row is v2^T, and you go all the way down to the k-th row, vk^T. A is, of course, the matrix with columns v1, v2, ..., vk.

Now, what happens when we take this product? Let's do a couple of entries. What's the entry in the first row, first column? It's the first row of A^T dotted with the first column of A, which is v1 . v1, and that's nice: it's just 1. The entry in the second row, second column is v2 . v2, which is also 1. In general, for any entry a_ii along the diagonal, you're taking the i-th row dotted with the i-th column, so you get 1's all the way down the diagonal.

Now what about everything else? Say you're looking for the entry in the first row, second column.
That entry is the dot product of the first row of A^T with the second column of A, which is v1 . v2. These two vectors are orthogonal, so it's equal to 0. The entry next to it is v1 . v3, which is also 0; v1 dotted with anything other than v1 is 0. Similarly, in the second row, the first entry is v2 . v1, which is clearly 0, then v2 . v2, which is 1, and v2 dotted with all the rest is 0, because they're all orthogonal to each other. In general, if your row and column index are the same, you're dotting a vector with itself, and you get 1 because all the lengths are 1; if they're not the same, you're taking the dot product of two different members of your orthonormal basis, which are orthogonal, so you get 0.

So what is this? Ones down the diagonal and zeros everywhere else in a k-by-k matrix: that's the k-by-k identity matrix. Normally, the formula above was our way of finding the transformation matrix for the projection of x onto a subspace, but if we assume an orthonormal basis, then A^T A becomes the k-by-k identity matrix I_k. And what's the inverse of the identity matrix? (A^T A)^(-1) is just I_k again. So the projection simplifies to

proj_V(x) = A I_k A^T x.
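This is easy to check numerically. The sketch below builds a matrix with orthonormal columns by taking the Q factor of a QR factorization of a random matrix; the sizes and the seed are arbitrary choices for the example.

```python
import numpy as np

# Build a 5x3 matrix A whose columns are orthonormal by taking the Q
# factor of a QR factorization of a random matrix.
rng = np.random.default_rng(0)
A, _ = np.linalg.qr(rng.standard_normal((5, 3)))

# With orthonormal columns, A^T A is the 3x3 identity matrix.
print(np.allclose(A.T @ A, np.eye(3)))  # True
```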
Multiplying by I_k does nothing, so this is just

proj_V(x) = A A^T x,

which is a huge simplification. I still have to do a matrix-matrix product, but finding the transpose of a matrix is pretty straightforward: you just switch the rows and the columns. Before, multiplying A^T times A was a lot of work, and then it was a huge amount of work to find the inverse of that product. But since we assumed that these columns form an orthonormal set, all of that reduces to the identity matrix, and the projection of x onto V is just A times A^T, where A is the matrix whose column vectors are the basis vectors for our subspace V. Anyway, hopefully that gives you even more appreciation for orthonormal bases.
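As a final sanity check, here is a sketch comparing the shortcut A A^T x against the general formula for a random orthonormal basis (again, the sizes and seed are arbitrary):

```python
import numpy as np

# Random 4x2 matrix with orthonormal columns (Q factor of a QR).
rng = np.random.default_rng(1)
A, _ = np.linalg.qr(rng.standard_normal((4, 2)))
x = rng.standard_normal(4)

full = A @ np.linalg.inv(A.T @ A) @ A.T @ x  # general formula
shortcut = A @ A.T @ x                       # orthonormal-basis shortcut
print(np.allclose(full, shortcut))  # True
```

The two agree because A^T A is the identity, so the inverse in the middle of the general formula drops out.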