Lecture 7: Positive Definite Day




Instructor: Prof. Gilbert Strang

 

The following content is provided under a Creative Commons license. Your support will help MIT OpenCourseWare continue to offer high quality educational resources for free. To make a donation or to view additional materials from hundreds of MIT courses visit MIT OpenCourseWare at ocw.mit.edu.

PROFESSOR STRANG: Finally we get to positive definite matrices. I've used the word and now it's time to pin it down. And so this would be my thank you for staying with it while we do this important preliminary stuff about linear algebra. So starting the next lecture we'll really make a big start on engineering applications. But these matrices are going to be the key to everything. And I'll call these matrices K and positive definite, I will only use that word about a symmetric matrix. So the matrix is already symmetric and that means it has real eigenvalues and many other important properties, orthogonal eigenvectors. And now we're asking for more. And it's that extra bit that is terrific in all kinds of applications. So if I can do this bit of linear algebra.

So what's coming then, my review session this afternoon at four, I'm very happy that we've got, I think, the best MATLAB problem ever invented in 18.085 anyway. So that should get onto the website probably by tomorrow. And Peter Buchak is like the MATLAB person. So his review sessions are Friday at noon. And I just saw him and suggested Friday at noon he might as well just stay in here. And knowing that that isn't maybe a good hour for everybody. So you could see him also outside of that hour. But that's the hour he will be ready for all kinds of questions about MATLAB or about the homeworks. Actually you'll probably be thinking more also about the homework questions on this topic.

Ready for positive definite? You said yes, right? And you have a hint about these things. So we have a symmetric matrix and the beauty is that it brings together all of linear algebra. Including elimination, that's when we see pivots. Including determinants, which are closely related to the pivots; the test there is that all the upper left determinants are positive. And what do I mean by upper left? I mean that if I have a three by three symmetric matrix and I want to test it for positive definite, and I guess actually this would be the easiest test if I had a tiny matrix, three by three, and I had numbers, then this would be a good test. The determinants, by upper left determinants I mean that one by one determinant. So just that first number has to be positive. Then the two by two determinant, that times that minus that times that, has to be positive. Oh, I've already been saying that. Let me just put in some letters. So a has to be positive. The matrix is symmetric, so a times c has to be bigger than b squared. So that will tell us. And then for two by two we finish. For three by three we would also require the three by three determinant to be positive.
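This determinant test is easy to code directly. Here is a minimal sketch in plain Python, for two by two and three by three matrices only (the function names are mine, not from the lecture):

```python
def leading_determinants(M):
    """Upper-left determinants of a symmetric 2x2 or 3x3 matrix M (list of rows)."""
    d1 = M[0][0]
    d2 = M[0][0]*M[1][1] - M[0][1]*M[1][0]
    if len(M) == 2:
        return [d1, d2]
    # 3x3 determinant by cofactor expansion along the first row
    d3 = (M[0][0]*(M[1][1]*M[2][2] - M[1][2]*M[2][1])
          - M[0][1]*(M[1][0]*M[2][2] - M[1][2]*M[2][0])
          + M[0][2]*(M[1][0]*M[2][1] - M[1][1]*M[2][0]))
    return [d1, d2, d3]

def is_positive_definite(M):
    """Symmetric M passes the test when every upper-left determinant is > 0."""
    return all(d > 0 for d in leading_determinants(M))
```

For example, [[1, 2], [2, 9]] has determinants 1 and 5, so it passes, while [[1, 3], [3, 9]] has a two by two determinant of zero and fails the strict test.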

But already here you're seeing one point about a positive definite matrix. Its diagonal will have to be positive. And somehow its diagonal has to be not just above zero, but somehow it has to defeat b squared. So the diagonal has to be somehow more positive than whatever negative stuff might come from off the diagonal. That's why I would need a*c > b squared. So both of those will be positive and their product has to be bigger than the other guy.

And then finally, a third test is that all the eigenvalues are positive. And of course if I give you a three by three matrix, that's probably not the easiest test since you'd have to find the eigenvalues. Much easier to find the determinants or the pivots. Actually, just while I'm at it, so the first pivot of course is a itself. No difficulty there. The second pivot turns out to be the ratio of a*c - b squared to a. So the connection between pivots and determinants is just really close. Pivots are ratios of determinants if you work it out. The second pivot, maybe I would call that d_2, is the ratio (a*c - b squared)/a. In other words it's c - (b squared)/a. So the pivots are positive exactly when the determinants are positive, and vice versa. Then it's fantastic that the eigenvalues come into the picture.
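That pivot-determinant connection can be checked in a few lines of Python for the two by two case (a sketch; the function name is mine):

```python
def pivots_2x2(a, b, c):
    """Pivots of the symmetric matrix [[a, b], [b, c]] from one elimination step."""
    p1 = a                    # first pivot: the (1,1) entry
    p2 = c - (b/a)*b          # subtract (b/a) times row 1 from row 2
    return p1, p2
```

The second pivot comes out equal to (a*c - b**2)/a, the ratio of the two by two determinant to the one by one determinant, exactly as in the lecture.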

So those are three ways, three important properties of a positive definite matrix. But I'd like to make the definition something different. Now I'm coming to the meaning. If I think of those as the tests, that's done. Now the meaning of positive definite. The meaning of positive definite and the applications are closely related to looking for a minimum. And so what I'm going to bring in here, so it's symmetric. Now for a symmetric matrix I want to introduce the energy. So this is the reason why it has so many applications and such important physical meaning is that what I'm about to introduce, which is a function of x, and here it is, it's x transpose times A, not A, I'm sticking with K for my matrix, times x. I think of that as some f(x).

And let's just see what it would be if the matrix was two by two, [a, b; b, c]. Suppose that's my matrix. We want to get a handle on what, this is the first time I've ever written something that has x's times x's. So it's going to be quadratic. They're going to be x's times x's. And x is a general vector of the right size so it's got components x_1, x_2. And there it's transpose, so it's a row. And now I put in the [a, b; b, c]. And then I put in x again. This is going to give me a very nice, simple, important expression. Depending on x_1 and x_2. Now what is, can we do that multiplication? Maybe above I'll do the multiplication of this pair and then I have the other guy to bring in. So here, that would be ax_1+bx_2. And this would be bx_1+cx_2. So that's the first, that's this times this. What am I going to get? What shape, what size is this result going to be? This K is n by n. x is a column vector. n by one. x transpose, what's the shape of x transpose? One by n? So what's the total result? One by one. Just a number. Just a function. It's a number. But it depends on the x's and the matrix inside.

Can we do it now? So I've got this to multiply by this. Do you see an x_1 squared showing up? Yes, from there times there. And what's it multiplied by? The a. The first term is this times the ax_1 is a(x_1 squared). So that's our first quadratic. Now there'd be an x_1, x_2. Let me leave that for a minute and find the x_2 squared because it's easy. So where am I going to get x_2 squared? I'm going to get that from x_2, second guy here times second guy here. There's a c(x_2 squared).

So you're seeing already where the diagonal shows up. The diagonal a, c, whatever is multiplying the perfect squares. And it'll be the off-diagonal that multiplies the x_1, x_2. We might call those the crossterms. And what do we get for that then? We have x_1 times this guy. So that's a crossterm. bx_1*x_2, right? And here's another one coming from x_2 times this guy. And what's that one? It's also bx_1*x_2. So x_1, multiply that, x_2 multiply that, and so what do we have for this crossterm here? Two of them. 2bx_1*x_2. In other words, that b and that b came together in the 2bx_1*x_2. So here's my energy. Can I just loosely call it energy? And then as we get to applications we'll see why.
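The multiplication just carried out on the board can be verified in Python: the expanded quadratic a*x1^2 + 2*b*x1*x2 + c*x2^2 agrees with computing Kx first and then the dot product with x (a sketch; function names are mine):

```python
def energy_quadratic(a, b, c, x1, x2):
    """x^T K x expanded: squares from the diagonal, 2b from the cross terms."""
    return a*x1**2 + 2*b*x1*x2 + c*x2**2

def energy_by_multiplication(a, b, c, x1, x2):
    """First compute Kx, then dot with x, as done on the board."""
    y1 = a*x1 + b*x2          # first component of Kx
    y2 = b*x1 + c*x2          # second component of Kx
    return x1*y1 + x2*y2
```

Whatever numbers you feed in, the two agree, which is just the board calculation done by machine.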

So I'm interested in that because it has important meaning. Well, so now I'm ready to define positive definite matrices. So I'll call that number four. But I'm going to give it a big star. Even more because it's the sort of key. So the test will be, you can probably guess it, I look at this expression, that x transpose Ax. And if it's a positive definite matrix and this represents energy, the key will be that this should be positive. This one should be positive for all x's. Well, with one exception, of course. All x's except, which vector is it? x=0 would just give me-- See, I put A, my default for a matrix, but it should be K. Except x=0, except the zero vector. Of course. If x_1 and x_2 are both zero.

Now that looks a little maybe less straightforward, I would say, because it's a statement about this is true for all x_1 and x_2. And we better do some examples and draw a picture. Let me draw a picture right away. So here's x_1 direction. Here's x_2 direction. And here is the x transpose Ax, my function. So this depends on two variables. So it's going to be a sort of a surface if I draw it. Now, what point do we absolutely know? And I put A again. I am so sorry. Well, we know one point. It's there whatever that matrix might be. It's there. Zero, right? You just told me that if both x's are zero then we automatically get zero.

Now what do you think the shape of this curve, the shape of this graph is going to look like? The point is, if we're positive definite now. So I'm drawing the picture for positive definite. So my definition is that the energy goes up. It's positive, right? When I leave, when I move away from that point I go upwards. That point will be a minimum. Let me just draw it roughly. So it sort of goes up like this. These cheap 2-D boards and I've got a three-dimensional picture here. But you see it somehow? What word or what's your visualization? It has a minimum there. That's why minimization, which was like, the core problem in calculus, is here now. But for functions of two x's or n x's. We're up a dimension over the basic minimum problem of calculus. It's sort of like a parabola. Its cross-sections cutting down through the thing would be just parabolas, because of the x squared.

I'm going to call this a bowl. It's a short word. Do you see it? It opens up. That's the key point, that it opens upward. And let's do some examples. Tell me some positive definite. So positive definite and then let me here put some not positive definite cases. Tell me a matrix. Well, what's the easiest, first matrix that occurs to you as a positive definite matrix? The identity. That passes all our tests, its eigenvalues are one, its pivots are one, the determinants are one. And the function is x_1 squared plus x_2 squared with no b in it. It's just a perfect bowl, perfectly symmetric, the way it would come off a potter's wheel.

Now let me take one that's maybe not so, let me put a nine there. So I'm off to a reasonable start. I have an x_1 squared and a nine x_2 squared. And now I want to ask you, what could I put in there that would leave it positive definite? Well, give me a couple of possibilities. What's a nice, not too big now, that's the thing. Two. Two would be fine. So if I had a two there and a two there I would have a 4x_1*x_2 and, like, this, instead of being a circle, which it was for the identity, the plane there would cut out an ellipse instead. But it would be a good ellipse. Because we're doing squares, we're really, the Greeks understood these second degree things and they would have known this would have been an ellipse.

How high can I go with that two or where do I have to stop? Where would I have to, if I wanted to change the two, let me just focus on that one, suppose I wanted to change it. First of all, give me one that's, how about the borderline. Three would be the borderline. Why's that? Because at three we have nine minus nine for the determinant. So the determinant is zero. Of course it passed the first test. One by one was ok. But two by two was not, was at the borderline. What else should I think? Oh, that's a very interesting case. The borderline. You know, it almost makes it. But can you tell me the eigenvalues of that matrix? Don't do any quadratic equations.

How do I know, what's one eigenvalue of a matrix? You made it singular, right? You made that matrix singular. Determinant zero. So one of its eigenvalues is zero. And the other one is visible by looking at the trace. I just quickly mentioned that if I add the diagonal, I get the same answer as if I add the two eigenvalues. So that other eigenvalue must be ten. And this is entirely typical, that ten and zero, the extreme eigenvalues, lambda_max and lambda_min, are outside: these diagonal guys are inside, between zero and ten. And it's the off-diagonal terms that entered somehow and gave us an eigenvalue of ten and an eigenvalue of zero.
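That trace argument can be spelled out: for a symmetric two by two matrix the eigenvalues come straight from the trace and the determinant. A quick Python check (a sketch; the function name is mine):

```python
import math

def eigenvalues_2x2(a, b, c):
    """Eigenvalues of symmetric [[a, b], [b, c]]: roots of x^2 - (trace)x + det."""
    tr = a + c
    det = a*c - b*b
    disc = math.sqrt(tr*tr - 4*det)   # always real for a symmetric matrix
    return (tr - disc)/2, (tr + disc)/2
```

For [[1, 3], [3, 9]] the determinant is zero and the trace is ten, so the eigenvalues are 0 and 10, with the diagonal entries 1 and 9 squeezed in between.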

I guess I'm tempted to try to draw that figure. Just to get a feeling of what's with that one. It always helps to get the borderline case. So what's with this one? Let me see what my quadratic would be. Can I just change it up here? Rather than rewriting it. So I'm going to, I'll put it up here. So I have to change that four to what? Now that I'm looking at this matrix. That four is now a six. Six. This is my guy for this one. Which is not positive definite.

Let me tell you right away the word that I would use for this one. I would call it positive semi-definite because it's almost there, but not quite. So semi-definite allows the matrix to be singular. So semi-definite, maybe I'll do it in green what semi-definite would be. Semi-def would be eigenvalues greater than or equal to zero. Determinants greater than or equal to zero. Pivots greater than zero while they're there, until we run out of pivots; you could say greater than or equal to zero then. And energy greater than or equal to zero for semi-definite. And when would the energy be zero? What x's, you could say the ground states or something, give equality? So greater than or equal to zero, emphasizing that possibility of equal in the semi-definite case.

Suppose I have a semi-definite matrix, yeah, I've got one. But it's singular. So that means a singular matrix takes some vector x to zero. Right? If my matrix is actually singular, then there'll be an x where Kx is zero. And then, of course, multiplying by x transpose, I'm still at zero. So the x's, the zero energy guys, this is straightforward, the zero energy guys, the ones where x transpose Kx is zero, will happen when Kx is zero. If Kx is zero, and we'll see it in that example.

Let's see it in that example. What's the x for which, I could say in the null space, what's the vector x that that matrix kills? (3, -1), right? The vector (3, -1) gives me (0, 0). That's the vector, so I get 3-3, the zero, 9-9, the zero. So I believe that this thing will be-- Is it zero at three, minus one? I think that it has to be, right? If I take x_1 to be three and x_2 to be minus one, I think I've got zero energy here. Do I? x_1 squared will be the nine and nine x_2 squared will be nine more. And what will be this 6x_1*x_2? What will that come out for this x_1 and x_2? Minus 18. Had to, right? So I'd get nine from there, nine from there, minus 18, zero.
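That zero-energy direction is easy to check numerically (plain Python; a sketch of the board calculation):

```python
a, b, c = 1, 3, 9        # the semi-definite matrix [[1, 3], [3, 9]]
x1, x2 = 3, -1           # the vector the matrix kills

# K x, component by component: both entries come out zero
kx1 = a*x1 + b*x2        # 3 - 3 = 0
kx2 = b*x1 + c*x2        # 9 - 9 = 0

# energy x^T K x: nine plus nine minus eighteen
e = a*x1**2 + 2*b*x1*x2 + c*x2**2
print(kx1, kx2, e)       # 0 0 0
```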

So the graph for this positive semi-definite will look a bit like this. There'll be a direction in which it doesn't climb. It doesn't go below the base, right? It's never negative. This is now the semi-definite picture. But it can run along the base. And it will for the vector x_1=3, x_2=-1, I don't know where that is, one, two, three, and then maybe minus one. Along some line here the graph doesn't go up. It's sitting, can you imagine that sitting in the base? I'm not Rembrandt here, but in the other direction it goes up. Oh, the hell with that one. Do you see, sort of? It's like a trough, would you say? I mean, it's like a, you know, a bit of a drainpipe or something. It's running along the ground, along this direction and in the other directions it does go up. So it's shaped like this with the base not climbing. Whereas here, there's no bad direction. Climbs every way you go. So that's positive definite and that's positive semi-definite.

Well suppose I push it a little further. Let me make a place here for a matrix that isn't even positive semi-definite. Now it's just going to go down somewhere. I'll start again with one and nine and tell me what to put in now. So this is going to be a case where the off-diagonal is too big, it wins. And prevents positive definite. So what number would you like here? Five? Five is certainly plenty. So now I have [1, 5; 5, 9]. Let me take a little space on a board just to show you. Sorry about that. So I'm going to do the [1, 5; 5, 9] just because they're all important, but then we're coming back to positive definite. So if it's [1, 5; 5, 9] and I do that usual x, x transpose Kx and I do the multiplication out, I see the one x_1 squared and I see the nine x_2 squareds. And how many x_1*x_2's do I see? Five from there, five from there, ten. And I believe that can be negative. The fact of having all nice plus signs is not going to help it because we can choose, as we already did, x_1 to be like a negative number and x_2 to be a positive. And we can get this guy to be negative and make it, in this case we can make it defeat these positive parts.

What choice would do it? Let me take x_1 to be minus one and tell me an x_2 that's good enough to show that this thing is not positive definite or even semi-definite, it goes downhill. Take x_2 equal? What do you say? 1/2? Yeah, I don't want too big an x_2 because if I have too big an x_2, then this'll be important. Does 1/2 do it? So I've got 1 from x_1 squared, that's positive, but not very. 9/4, so I'm up to 13/4, but this guy is what? Ten times x_1*x_2, and with the minus that's minus five. Yeah. So that absolutely goes, at this one I come out less than zero. And I might as well complete.
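That downhill direction can be verified in one short computation (plain Python; a sketch):

```python
a, b, c = 1, 5, 9        # the indefinite matrix [[1, 5], [5, 9]]
x1, x2 = -1, 0.5         # the direction that goes downhill

# energy x^T K x: 1 - 5 + 2.25, which comes out negative
e = a*x1**2 + 2*b*x1*x2 + c*x2**2
print(e)                 # -1.75, so not even semi-definite
```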

So this is the case where I would call it indefinite. Indefinite. It goes up, like if x_2 is zero, then it's just got x_1 squared, that's up. If x_1 is zero, it's only got x_2 squared, that's up. But there are other directions where it goes downhill. So it goes both up in some ways and down in others. And what kind of a graph, what kind of a surface would I now have for this x transpose Kx, this indefinite guy? So up in some ways and down in others. This gets really hard to draw. I believe that if you ride horses you have an edge on visualizing this. So it's called, what kind of a point's it called? Saddle point, it's called a saddle point. So what's a saddle point? That's not bad, right? So this is a direction where it went up. This is a direction where it went down. And so it sort of fills in somehow.

Or maybe, if you don't, I mean, who rides horses now? Actually maybe something we do do is drive over mountains. So the path, if the road is sort of well-chosen, the road will go, it'll look for the, this would be-- Yeah, here's our road. We would do as little climbing as possible. The mountain would go like this, sort of. So this would be like, the bottom part looking along the peaks of the mountains. But it's the top part looking along the driving direction. So driving, it's a maximum, but in the mountain range direction it's a minimum. So it's a saddle point. So that's what you get from a typical symmetric matrix. And if it was minus five it would still be the same saddle point, would still be 9-25, it would still be negative and a saddle.

Positive guys are our thing. Alright. So now back to positive definite. With these four tests and then the discussion of semi-definite. Very key, that energy. Let me just look ahead a moment. Most physical problems, many, many physical problems, you have an option. Either you solve some equations, either you find the solution from our equations, Ku=f, typically. Matrix equation or differential equation. Or there's another option of minimizing some function. Some energy. And it gives the same equations. So this minimizing energy will be a second way to describe the applications.

Now can I get a number five? There's an important number five and then you know really all you need to know about symmetric matrices. This gives me, about positive definite matrices, this gives me a chance to recap. So I'm going to put down a number five. Because this is where the matrices come from. Really important. And it's where they'll come from in all these applications that chapter two is going to be all about, that we're going to start. So they come, these positive definite matrices, so this is another way to, it's a test for positive definite matrices and, actually, it's where they come from. So here's a positive definite matrix. They come from A transpose A. A fundamental message is that if I have just an average matrix, possibly rectangular, could be square but not symmetric, then sooner or later, in fact usually sooner, you end up looking at A transpose A. We've seen that already. And we already know that A transpose A is square, we already know it's symmetric, and now we're going to know that it's positive definite. So matrices like A transpose A are positive definite or possibly semi-definite. There's that possibility. If A was the zero matrix, of course, we would just get the zero matrix, which would be only semi-definite, and there are other ways to get a semi-definite one.

So I'm saying that if K, if I have a matrix, any matrix, and I form A transpose A, I get a positive definite matrix or maybe just semi-definite, but not indefinite. Can we see why? Why is this positive definite or semi-? So that's my question. And the answer is really worth, it's just neat and worth seeing. So do I want to look at the pivots of A transpose A? No. They're something, but whatever they are, I can't really follow those well. Or the eigenvalues very well, or the determinants. None of those come out nicely. But the real guy works perfectly. So look at x transpose Kx. So I'm just doing, following my instinct here.

So if K is A transpose A, my claim is, what am I saying then about this energy? What is it that I want to discover and understand? Why it's positive. Why does taking any matrix, multiplying by its transpose produce something that's positive? Can you see any reason why that quantity, which looks kind of messy, I just want to look at it the right way to see why that should be positive, that should come out positive. So I'm not going to get into numbers, I'm not going to get into diagonals and off-diagonals. I'm just going to do one thing to understand that particular combination, x transpose A transpose Ax. What shall I do? Anybody see what I might do? Yeah, you're seeing here if you look at it again, what are you seeing here? Tell me again. If I take Ax together, then what's the other half? It's the transpose of Ax. So I just want to write that as, I just want to think of it that way, as Ax. And here's the transpose of Ax. Right? Because transposes of Ax, so transpose guys in the opposite order, and the multiplication--

This is the great fact. I call these proofs by parenthesis because I'm just putting parentheses in the right place, but the key law of matrix multiplication is that I can put (AB)C is the same as A(BC). That rule, which is just multiply it out and you see that parentheses are not needed, because if you keep them in the right order you can do this first, or you can do this first. Same answer. What do I learn from that? What was the point? This is some vector, I don't know especially what it is, times its transpose. So that's the length squared. What's the key fact about that? That it is never negative. It's always greater than zero or possibly equal.

When does that quantity equal zero? When Ax is zero. When Ax is zero. Because this is a vector. That's the same vector transposed. And everybody's got that picture. When I take any y transpose y, I get y_1 squared plus y_2 squared through y_n squared. And I get a positive answer except if the vector is zero. So it's zero when Ax is zero. So that's going to be the key. If I pick any matrix A, and I can just take an example, but the chapter, the applications, are just going to be full of examples. Where the problem begins with a matrix A and then A transpose shows up and it's the combination A transpose A that we work with. And we're just learning that it's positive definite.
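The proof by parenthesis translates directly into code: compute x transpose (A transpose A) x as the squared length of Ax (plain Python; the particular matrix in the test and the function names are mine for illustration):

```python
def matvec(A, x):
    """Multiply a matrix (list of rows) by a vector."""
    return [sum(aij*xj for aij, xj in zip(row, x)) for row in A]

def quadratic_form(A, x):
    """x^T (A^T A) x, computed as (Ax)^T (Ax) = ||Ax||^2 >= 0."""
    y = matvec(A, x)
    return sum(yi*yi for yi in y)
```

Whatever A and x you try, the result is a sum of squares, so it can never be negative; it is zero exactly when Ax = 0.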

Unless, shall I just hang on since I've got here, I have to say when is it, have to get these two possibilities. Positive definite or only semi-definite. So what's the key to that borderline question? This thing will be only semi-definite if there's a solution to Ax=0. If there is an x, well, there's always the zero vector. Zero vector I can't expect to be positive. So I'm looking for if there's an x so that Ax is zero but x is not zero, then I'll only be semi-definite. That's the test. If there is a solution to Ax=0.

When we see applications that'll mean there's a displacement with no stretching. We might have a line of springs and when could the line of springs displace with no stretching? When it's free-free, right? If I have a line of springs and no supports at the ends, then that would be the case where it could shift over by a constant vector. So that would be the case where the matrix is singular. The matrix is now positive semi-definite. We just learned that. So the free-free matrix, like B, both ends free, or C. So our answer is going to be that K and T are positive definite. And our other two guys, the singular ones, of course, just don't make it. B, free at both ends, the free-free line of springs, it can shift without stretching. Since Ax measures the stretching, when it just shifts as a rigid motion the Ax is zero and we see only positive semi-definite. And also C, the circular one. There it can displace with no stretching because it can just turn in the circle. So these guys will be only positive semi-definite.

Maybe I better say this another way. When is this positive definite? Can I use just a different sentence to describe this possibility? This is positive definite provided, so what I'm going to write now is to remove this possibility and get positive definite. This is positive definite provided, now, I could say it this way. The A has independent columns. So I just needed to give you another way of looking at this Ax=0 question. If A has independent columns, what does that mean? That means that the only solution to Ax=0 is the zero solution. In other words, it means that this thing works perfectly and gives me positive. When A has independent columns.

Let's just remember our K, T, B, C. So here's a matrix, so let me take the T matrix, that's this one, this guy, and then the third column. Those three columns are independent. They point off. They don't lie in a plane. They point off in three different directions. And then there are no solutions, no x's that solve Tx=0. So that would be a case of independent columns. Let me make a case of dependent columns. So I'm going to make it B now. Now the columns of that matrix are dependent. There's a combination of them that gives zero. They all lie in the same plane. There's a solution to that matrix times x equal zero. What combination of those columns shows me that they are dependent? That some amount of this plus some amount of this plus some amount of that column gives me the zero vector. You see the combination. What should I take? (1, 1, 1) again. No surprise. That's the vector that we know: everything shifting the same amount, nothing stretching.

Talking fast here about positive definite matrices. This is the key. Let's just ask a few questions about positive definite matrices as a way to practice. Suppose I had one. Positive definite. What about its inverse? Is that positive definite or not? So I've got a positive definite one, it's not singular, it's got positive eigenvalues, everything else. Its inverse will be symmetric, so I'm allowed to think about it. Will it be positive definite? What do you think? Well, you've got a whole bunch of tests to sort of mentally run through. Pivots of the inverse, you don't want to touch that stuff. Determinants, no. What about eigenvalues? What would be the eigenvalues? Say I have this positive definite symmetric matrix and its eigenvalues are one, four, five. What can you tell me about the eigenvalues of the inverse matrix? They're the inverses. So those three eigenvalues are? 1, 1/4, 1/5, what's the conclusion here? It is positive definite. Those are all positive, it is positive definite. So if I invert a positive definite matrix, I'm still positive definite.
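The inverse claim can also be checked through the determinant test, in a two by two sketch (plain Python; names and numbers are mine):

```python
def inverse_2x2(a, b, c):
    """Inverse of symmetric [[a, b], [b, c]]; returns its entries (a', b', c')."""
    det = a*c - b*b
    return c/det, -b/det, a/det   # still symmetric

a, b, c = 1, 2, 9                 # positive definite: 1 > 0 and 9 - 4 = 5 > 0
ai, bi, ci = inverse_2x2(a, b, c)
print(ai > 0, ai*ci - bi*bi > 0)  # both determinant tests pass for the inverse
```

The inverse passes the same upper left determinant tests, matching the eigenvalue argument from the lecture.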

All the tests would have to pass. It's just I'm looking each time for the easiest test. Let me look now for the easiest test on K_1+K_2. Suppose that's positive definite and that's positive definite. What if I add them? What do you think? Well, we hope so. But we have to say which of my one, two, three, four, five would be a good way to see it. Good question. Four? We certainly don't want to touch pivots and now we don't want to touch eigenvalues either. Of course, if number four works, others will also work. The eigenvalues will come out positive. But not too easy to say what they are. Let's try test number four. So K_1. What's the test? So test number four tells us that this part, x transpose K_1*x, that that part is positive, right? That that part is positive. If we know that's positive definite. Now, about K_2 we also know that for every x, you see it's for every x, that helps, don't let me put x_2 there, for every x this will be positive.

And now what's the step I want to take? To get some information on the matrix K_1+K_2. I should add. If I add these guys, you see that it just, then I can write that as, I can write that this way. And what have I learned? I've learned that that's positive, even greater than, except for the zero vector. Because this was greater than, this is greater than. If I add two positive numbers, the energies are positive and the energies just add. The energies just add. So that definition four was the good way, just a nice, easy way to see that if I have a couple of positive definite matrices, a couple of positive energies, I'm really coupling the two systems. This is associated somehow. I've got two systems, I'm putting them together and the energy is just even more positive. It's more positive than either of these guys because I'm adding.

As I'm speaking here, will you allow me to try test number five, this A transpose A business? Suppose K_1 was A transpose A; being in that form, it will be positive definite. And suppose K_2 is B transpose B; in that form, it will be too. Now I would like to write the sum somehow in this something transpose something form. And I just do it now because I think it's like, you won't perhaps have thought of this way to do it. Watch. Suppose I create the matrix [A; B]. That'll be my new matrix. Say, call it C. Am I allowed to do that? I mean, that creates a matrix? These A and B, they had the same number of columns, n. So I can put one over the other and I still have something with n columns. So that's my new matrix C. And now I want C transpose. By the way, I'd call that a block matrix. You know, instead of numbers, it's got two blocks in there. Block matrices are really handy.

Now what's the transpose of that block matrix? You just have faith, just have faith with blocks. It's just like numbers. If I had a matrix [1; 5] then I'd get the row [1, 5]. But what do you think? This is worth thinking about even after class. What would be, if this C matrix is this block A above B, what do you think for C transpose? A transpose, B transpose side by side. Just put in numbers and you'd see it. And now I'm going to take C transpose times C. I'm calling it C now because I've used A in the first guy and B in the second one and now I'm ready for C. How do you multiply block matrices? Again, you just have faith. What do you think? Tell me the answer. A transpose A, I multiply that by that just as if they were numbers. And I add B transpose B, that times that just as if they were numbers. And what do I have? I've got K_1+K_2. So this is K_1+K_2 and it's in my form C transpose C that I was looking for, that number five was looking for. So it's done it. K_1 in this form, K_2 in this form, and I just made a block matrix and I got K_1+K_2. That's not a big deal in itself, but block matrices are really handy. It's good to take that step with matrices. Think of, possibly, the entries as coming in blocks and not just one at a time.
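That block computation can be verified with small matrices in Python (a sketch; the particular A and B are mine, chosen just to have matching column counts):

```python
def transpose(M):
    return [list(col) for col in zip(*M)]

def matmul(P, Q):
    return [[sum(P[i][k]*Q[k][j] for k in range(len(Q)))
             for j in range(len(Q[0]))] for i in range(len(P))]

A = [[1, 2], [0, 1]]
B = [[3, 0], [1, 1]]
C = A + B                           # stack A above B: the block matrix [A; B]

K1 = matmul(transpose(A), A)        # A^T A
K2 = matmul(transpose(B), B)        # B^T B
CtC = matmul(transpose(C), C)       # C^T C

sum_K = [[K1[i][j] + K2[i][j] for j in range(2)] for i in range(2)]
print(CtC == sum_K)                 # C^T C = A^T A + B^T B
```

Stacking the blocks and multiplying really does reproduce K_1 + K_2, entry by entry.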

Well, thank you, okay. I swear Friday we'll start applications in all kinds of engineering problems and you'll have new applications.