[MUSIC] Previously we introduced the idea of a matrix and related it to the problem of solving simultaneous equations. And we showed that the columns of a matrix just tell us what it does to the unit vector along each axis. Now we'll look at different types of matrices and what they do to space, and at what happens if we apply one matrix transformation and then another, which is termed composition.

Now, because we can make any vector out of a vector sum of scaled versions of e1 hat and e2 hat, the result of the transformation is just going to be some sum of the transformed basis vectors, which I'm calling e1 prime and e2 prime. This is a bit hard to see, but what it means is that the grid lines of our space stay parallel and evenly spaced. They might be stretched or sheared, but the origin stays where it is and there isn't any curviness to the space; it doesn't get warped. And that's a consequence of our scalar addition and multiplication rules for vectors.

So that is, if I write down the matrix as capital A, and the vector it's transforming as r, whatever it was, (a, b) in our apples and bananas problem, that gives me some altered version. We said it was (8, 13) before, but I'm going to call it r transformed, or r prime. Then we can look at what happens when I do algebra with it. If I multiply r by some number, just a number, let's call it n, and I apply A to (nr), what I'm saying is that I will get n r prime. And hopefully you can see that if I put an n in there, when I multiply it all out I'm going to get an n out there. Similarly, if I multiply A by the vector sum (r + s), then I will get Ar + As. That is, if I do the whole multiplication again with another vector s, and add the two results, that will be true. So what I'm saying here is that if I can think of a vector as being built from the original basis vectors, e1 hat and e2 hat, I'll then get some sum of e1 prime and e2 prime.
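The two linearity rules described here, A(nr) = n(Ar) and A(r + s) = Ar + As, can be checked numerically. This is a minimal sketch using NumPy, with the apples-and-bananas matrix from the lecture; the vector s and the scalar n are arbitrary values chosen just for the demonstration.

```python
import numpy as np

# The apples-and-bananas matrix from the lecture.
A = np.array([[2, 3],
              [10, 1]])

r = np.array([3, 2])   # an example vector
s = np.array([1, 4])   # another vector (chosen arbitrarily)
n = 5                  # an arbitrary scalar

# A(nr) = n(Ar): scaling a vector first, then transforming,
# gives the same result as transforming first, then scaling.
print(A @ (n * r))      # -> [ 60 160]
print(n * (A @ r))      # -> [ 60 160]

# A(r + s) = Ar + As: the transformation distributes over vector sums.
print(A @ (r + s))      # -> [26 46]
print(A @ r + A @ s)    # -> [26 46]
```

Any choice of r, s, and n gives the same agreement, which is exactly what it means for the matrix transformation to be linear.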
So if I say that vector is n e1 hat + m e2 hat, I'll get n A e1 hat + m A e2 hat, which is n e1 prime + m e2 prime. That's just the vector sum of some multiples of the transformed basis vectors. So the space gets transformed, e1 and e2 get moved, and then I can just add up vectors with them. That's very nice; it means that all of our vector sum rules work.

Now maybe that's a bit confusing, so let's try it with an example. I've got my matrix A here from my apples and bananas problem: (2 3; 10 1), or if we like, the column vectors (2, 10) and (3, 1). Now let's try it on the vector (3, 2). If I multiply that out just straightforwardly, as we probably did at school, I've got 2 times 3 plus 3 times 2; that's 6 plus 6, which is 12. And I've got 10 times 3, which is 30, plus 1 times 2, which is 2; so that's 32.

But I could instead think of (3, 2) as 3 times (1, 0) plus 2 times (0, 1), that is, 3 of our e1 hat and 2 of e2 hat in a vector sum. Now I can take the numbers out, so that's 3 times (A times (1, 0)) plus 2 times (A times (0, 1)). And we know what happens to e1 hat to get to e1 prime: it goes to the first column, so that's 3 times (2, 10). And we know what happens to e2 hat: it goes to the second column, so that's 2 times (3, 1). So that gives us 6 plus 6 is 12, and 30 plus 2 is 32, the same answer.

So it really is true: these rules really do work. We can think of a matrix multiplication as just being the vector sum of multiples of the transformed basis vectors. So pause now for a moment and try that, maybe with an example of your own, and verify that it really does work, because that's really quite deep. This matrix just tells us where the basis vectors go; that's the transformation it does. It's not a complicated multiplying-out thing. We don't need to worry about the mechanics of doing the sum. We can just think of it in terms of what it does to vectors in the space. [MUSIC]
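The worked example above can be reproduced in a few lines of NumPy: multiplying A by (3, 2) the school way, and then rebuilding the same answer as 3 times the first column plus 2 times the second column.

```python
import numpy as np

# Matrix from the apples-and-bananas problem.
A = np.array([[2, 3],
              [10, 1]])
r = np.array([3, 2])

# Ordinary matrix-vector multiplication.
print(A @ r)                      # -> [12 32]

# The same result as a sum of scaled columns:
# r = 3*e1_hat + 2*e2_hat, so Ar = 3*(A e1_hat) + 2*(A e2_hat),
# and A e1_hat, A e2_hat are just the columns of A.
print(3 * A[:, 0] + 2 * A[:, 1])  # -> [12 32]
```

Both lines print the same vector, confirming that the multiplication is nothing more than a weighted sum of the columns, i.e. of where the basis vectors land.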