1
00:00:08 --> 00:00:12
-- one and -- the lecture on
symmetric matrixes.
2
00:00:12 --> 00:00:17
So that's the most important
class of matrixes,
3
00:00:17 --> 00:00:19
symmetric matrixes.
4
00:00:19 --> 00:00:21
A equals A transpose.
5
00:00:21 --> 00:00:26
So the first points,
the main points of the lecture
6
00:00:26 --> 00:00:28
I'll tell you right away.
7
00:00:28 --> 00:00:33.4
What's special about the
eigenvalues?
8
00:00:33.4 --> 00:00:36
What's special about the
eigenvectors?
9
00:00:36 --> 00:00:39
This is -- the way we now look
at a matrix.
10
00:00:39 --> 00:00:43
We want to know about its
eigenvalues and eigenvectors and
11
00:00:43 --> 00:00:48
if we have a special type of
matrix, that should tell us
12
00:00:48 --> 00:00:52.4
something about eigenvalues and
eigenvectors.
13
00:00:52.4 --> 00:00:56
Like Markov matrixes,
they have an eigenvalue equal
14
00:00:56 --> 00:00:56
one.
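That fact about Markov matrixes can be checked numerically. A quick NumPy sketch, with an illustrative column-stochastic matrix that is not taken from the lecture:

```python
import numpy as np

# A (column-stochastic) Markov matrix: nonnegative entries, columns sum to 1.
# This particular 3x3 matrix is just an illustrative example.
A = np.array([[0.5, 0.2, 0.3],
              [0.3, 0.7, 0.1],
              [0.2, 0.1, 0.6]])

eigenvalues = np.linalg.eigvals(A)

# One eigenvalue equals 1 (up to roundoff), as the lecture says.
assert np.any(np.isclose(eigenvalues, 1.0))
```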
15
00:00:56 --> 00:01:00
Now symmetric matrixes,
can I just tell you right off
16
00:01:00 --> 00:01:04
what the main facts -- the two
main facts are?
17
00:01:04 --> 00:01:08
The eigenvalues of a symmetric
matrix, real -- this is a real
18
00:01:08 --> 00:01:13
symmetric matrix,
we -- talking mostly about real
19
00:01:13 --> 00:01:14
matrixes.
20
00:01:14 --> 00:01:16
The eigenvalues are also real.
21
00:01:16 --> 00:01:22
So our examples of rotation
matrixes, where -- where we got
22
00:01:22 --> 00:01:27
E- eigenvalues that were
complex, that won't happen now.
23
00:01:27 --> 00:01:32
For symmetric matrixes,
the eigenvalues are real and
24
00:01:32 --> 00:01:37.2
the eigenvectors are also very
special.
25
00:01:37.2 --> 00:01:41.29
The eigenvectors are
perpendicular,
26
00:01:41.29 --> 00:01:45.38
orthogonal, so which do you
prefer?
27
00:01:45.38 --> 00:01:48
I'll say perpendicular.
28
00:01:48 --> 00:01:52
Perp- well, they're both long
words.
29
00:01:52 --> 00:01:53
Okay, right.
30
00:01:53 --> 00:02:00
So -- I have a -- you should
say "why?" and I'll at least
31
00:02:00 --> 00:02:06
answer why for case one,
maybe case two,
32
00:02:06 --> 00:02:11.32
the checking the Eigen -- that
the eigenvectors are
33
00:02:11.32 --> 00:02:14.15
perpendicular,
I'll leave to,
34
00:02:14.15 --> 00:02:16
the -- to the book.
35
00:02:16 --> 00:02:21
But let's just realize what --
well, first I have to say,
36
00:02:21 --> 00:02:28
it -- it could happen,
like for the identity matrix --
37
00:02:28 --> 00:02:30
there's a symmetric matrix.
38
00:02:30 --> 00:02:34
Its eigenvalues are certainly
all real, they're all one for
39
00:02:34 --> 00:02:36
the identity matrix.
40
00:02:36 --> 00:02:38.82
What about the eigenvectors?
41
00:02:38.82 --> 00:02:43
Well, for the identity,
every vector is an eigenvector.
42
00:02:43 --> 00:02:46
So how can I say they're
perpendicular?
43
00:02:46 --> 00:02:51
What I really mean is the --
they -- this word are should
44
00:02:51 --> 00:02:55
really be written can be chosen
perpendicular.
45
00:02:55 --> 00:02:59
That is, if we have -- it's the
usual case.
46
00:02:59 --> 00:03:05
If the eigenvalues are all
different, then each eigenvalue
47
00:03:05 --> 00:03:10
has one line of eigenvectors and
those lines are perpendicular
48
00:03:10 --> 00:03:12
here.
49
00:03:12 --> 00:03:15
But if an eigenvalue's
repeated, then there's a whole
50
00:03:15 --> 00:03:20
plane of eigenvectors and all
I'm saying is that in that
51
00:03:20 --> 00:03:22
plane, we can choose
perpendicular ones.
52
00:03:22 --> 00:03:27
So that's why it's a can be
chosen part, is -- this is in
53
00:03:27 --> 00:03:30
the case of a repeated
eigenvalue where there's some
54
00:03:30 --> 00:03:33
real, substantial freedom.
55
00:03:33 --> 00:03:38
But the typical case is
different eigenvalues,
56
00:03:38 --> 00:03:42
all real, one dimensional
eigenvector space,
57
00:03:42 --> 00:03:46
eigenspaces,
and all perpendicular.
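The typical case — distinct real eigenvalues with perpendicular eigenvectors — can be seen in numbers. A small NumPy sketch with an illustrative 2 by 2 symmetric matrix (not one worked in the lecture):

```python
import numpy as np

# An illustrative real symmetric matrix with distinct eigenvalues.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# eigh is the routine for symmetric/Hermitian matrices; it returns
# real eigenvalues (here 1 and 3) and orthonormal eigenvector columns.
eigenvalues, Q = np.linalg.eigh(A)

# The two eigenvector columns are perpendicular.
dot = Q[:, 0] @ Q[:, 1]
assert abs(dot) < 1e-12
```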
58
00:03:46 --> 00:03:50
So, just -- let's just see the
conclusion.
59
00:03:50 --> 00:03:56
If we accept those as correct,
what happens -- and I also mean
60
00:03:56 --> 00:04:00
that there's a full set of them.
61
00:04:00 --> 00:04:04
I -- so that's part of this
picture here,
62
00:04:04 --> 00:04:08
that there -- there's a
complete set of eigenvectors,
63
00:04:08 --> 00:04:10
perpendicular ones.
64
00:04:10 --> 00:04:15
So, having a complete set of
eigenvectors means -- so normal
65
00:04:15 --> 00:04:20.6
-- so the usual -- maybe I put
the -- usually -- usual --
66
00:04:20.6 --> 00:04:27
usual case is that the matrix A
we can write in terms of its
67
00:04:27 --> 00:04:32
eigenvalue matrix and its
eigenvector matrix this way,
68
00:04:32 --> 00:04:33.63
right?
69
00:04:33.63 --> 00:04:39
We can do that in the usual
case, but now what's special
70
00:04:39 --> 00:04:42
when the matrix is symmetric?
71
00:04:42 --> 00:04:48
So this is the usual case,
and now let me go to the
72
00:04:48 --> 00:04:51.3
symmetric case.
73
00:04:51.3 --> 00:04:55
So in the symmetric case,
A, this -- this should become
74
00:04:55 --> 00:04:57
somehow a little special.
75
00:04:57 --> 00:05:02
Well, the lambdas on the
diagonal are still on the
76
00:05:02 --> 00:05:02
diagonal.
77
00:05:02 --> 00:05:07
They're -- they're real,
but that's where they are.
78
00:05:07 --> 00:05:09
What about the eigenvector
matrix?
79
00:05:09 --> 00:05:13.55
So what can I do now special
about
80
00:05:13.55 --> 00:05:18
the eigenvector matrix when --
when the A itself is symmetric,
81
00:05:18 --> 00:05:22
that says something good about
the eigenvector matrix,
82
00:05:22 --> 00:05:25
so what is this -- what does
this lead to?
83
00:05:25 --> 00:05:28
This -- these perpendicular
eigenvectors,
84
00:05:28 --> 00:05:32
I can not only -- I can not
only guarantee they're
85
00:05:32 --> 00:05:36
perpendicular,
I could also make them unit
86
00:05:36 --> 00:05:42
vectors, no problem,
just s- scale their length to
87
00:05:42 --> 00:05:42
one.
88
00:05:42 --> 00:05:44
So what do I have?
89
00:05:44 --> 00:05:47
I have orthonormal
eigenvectors.
90
00:05:47 --> 00:05:53
And what does that tell me
about the eigenvector matrix?
91
00:05:53 --> 00:05:59
What -- what letter should I
now use in place of S --
92
00:05:59 --> 00:06:04
remember
S has the eigenvectors in its
93
00:06:04 --> 00:06:08
columns, but now those columns
are orthonormal,
94
00:06:08 --> 00:06:11
so the right letter to use is
Q.
95
00:06:11 --> 00:06:16
So that's where -- so we've got
the letter all set up,
96
00:06:16 --> 00:06:20
so this should be Q lambda Q
inverse.
97
00:06:20 --> 00:06:26
Q standing in our minds always
for this matrix -- in this case
98
00:06:26 --> 00:06:31
it's square,
it's -- so these are the
99
00:06:31 --> 00:06:33
columns of Q,
of course.
100
00:06:33 --> 00:06:36
And one more thing.
101
00:06:36 --> 00:06:37
What's Q inverse?
102
00:06:37 --> 00:06:43
For a matrix that has these
orthonormal columns,
103
00:06:43 --> 00:06:49
we know that the inverse is the
same as the transpose.
104
00:06:49 --> 00:06:56
So here is the beautiful --
there is the -- the great
105
00:06:56 --> 00:06:59
description,
the factorization of a
106
00:06:59 --> 00:07:01
symmetric matrix.
107
00:07:01 --> 00:07:04
And this is,
like, one of the famous
108
00:07:04 --> 00:07:09
theorems of linear algebra,
that if I have a symmetric
109
00:07:09 --> 00:07:12
matrix, it can be factored in
this form.
110
00:07:12 --> 00:07:18
An orthogonal matrix times
diagonal times the transpose of
111
00:07:18 --> 00:07:19
that orthogonal matrix.
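That factorization A = Q lambda Q transpose can be verified directly. A NumPy sketch, using an illustrative symmetric matrix of my own choosing:

```python
import numpy as np

# Spectral factorization A = Q Lambda Q^T for a symmetric matrix.
A = np.array([[4.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])
assert np.allclose(A, A.T)

lam, Q = np.linalg.eigh(A)      # lam: real eigenvalues, Q: orthonormal eigenvectors
Lambda = np.diag(lam)

# Q is orthogonal, so Q inverse is the same as Q transpose...
assert np.allclose(Q.T @ Q, np.eye(3))
# ...and A is rebuilt as Q Lambda Q^T.
assert np.allclose(A, Q @ Lambda @ Q.T)
```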
112
00:07:19 --> 00:07:23
And, of course,
everybody immediately says yes,
113
00:07:23 --> 00:07:27
and if this is possible,
then that's clearly symmetric,
114
00:07:27 --> 00:07:28
right?
115
00:07:28 --> 00:07:33
That -- take -- we've looked at
products of three guys like that
116
00:07:33 --> 00:07:36.56
and taken their transpose and we
got
117
00:07:36.56 --> 00:07:37
it back again.
118
00:07:37 --> 00:07:42
So do you -- do you see the
beauty of this -- of this
119
00:07:42 --> 00:07:43
factorization,
then?
120
00:07:43 --> 00:07:48
It -- it completely displays
the eigenvalues and eigenvectors
121
00:07:48 --> 00:07:53
the symmetry of the -- of the
whole thing, because -- that
122
00:07:53 --> 00:07:57
product, Q times lambda times Q
transpose,
123
00:07:57 --> 00:08:00
if I transpose it,
it -- this comes in this
124
00:08:00 --> 00:08:04
position and we get that matrix
back again.
125
00:08:04 --> 00:08:08.04
So that's -- in mathematics,
that's called the spectral
126
00:08:08.04 --> 00:08:08
theorem.
127
00:08:08 --> 00:08:12
Spectrum is the set of
eigenvalues of a matrix.
128
00:08:12 --> 00:08:15
Spec- it somehow comes from the
idea
129
00:08:15 --> 00:08:21
of the spectrum of light as a
combination of pure things --
130
00:08:21 --> 00:08:27
where our matrix is broken down
into pure eigenvalues and
131
00:08:27 --> 00:08:33
eigenvectors -- in mechanics
it's often called the principal
132
00:08:33 --> 00:08:34
axis theorem.
133
00:08:34 --> 00:08:36
It's very useful.
134
00:08:36 --> 00:08:42
It means that if you have --
we'll see it geometrically.
135
00:08:42 --> 00:08:47.37
It means that if I have some
material -- if I look at the
136
00:08:47.37 --> 00:08:51.84
right axis, it becomes diagonal,
it becomes -- the -- the
137
00:08:51.84 --> 00:08:54
directions don't couple
together.
138
00:08:54 --> 00:08:54
Okay.
139
00:08:54 --> 00:08:59
So that's -- that -- that's
what to remember from -- from
140
00:08:59 --> 00:09:01.1
this lecture.
141
00:09:01.1 --> 00:09:06
Now, I would like to say why
are the eigenvalues real?
142
00:09:06 --> 00:09:08
Can I do that?
143
00:09:08 --> 00:09:14
So -- so -- because that --
something useful comes out.
144
00:09:14 --> 00:09:20
So I'll just come back -- come
to that question why real
145
00:09:20 --> 00:09:21
eigenvalues?
146
00:09:21 --> 00:09:22
Okay.
147
00:09:22 --> 00:09:28
So I have to start from the
only thing we know,
148
00:09:28 --> 00:09:29.89
Ax equal lambda x.
149
00:09:29.89 --> 00:09:30
Okay.
150
00:09:30 --> 00:09:36.49
But as far as I know at this
moment, lambda could be complex.
151
00:09:36.49 --> 00:09:41
I'm going to prove it's not --
and x could be complex.
152
00:09:41 --> 00:09:47.05
In fact, for the moment,
even A could be -- we could
153
00:09:47.05 --> 00:09:50
even think, well,
what
154
00:09:50 --> 00:09:52
happens if A is complex?
155
00:09:52 --> 00:09:56.97
Well, one thing we can always
do -- this is -- this is like
156
00:09:56.97 --> 00:10:01
always -- always okay -- I can
-- if I have an equation,
157
00:10:01 --> 00:10:05
I can take the complex
conjugate of everything.
158
00:10:05 --> 00:10:09
That's -- no -- no -- so A
conjugate x conjugate equal
159
00:10:09 --> 00:10:12
lambda
conjugate x conjugate,
160
00:10:12 --> 00:10:18
it just means that everywhere
over here that there was a -- an
161
00:10:18 --> 00:10:21
i, then here I changed it to
minus i.
162
00:10:21 --> 00:10:26
That's -- that -- you know that
that step -- that conjugate
163
00:10:26 --> 00:10:32
business, that a+ib,
if I conjugate it it's a-ib.
164
00:10:32 --> 00:10:37
That's the meaning of conjugate
-- and products behave right,
165
00:10:37 --> 00:10:40.41
I can conjugate every factor.
166
00:10:40.41 --> 00:10:46.06
So I haven't done anything yet
except to say what would be true
167
00:10:46.06 --> 00:10:50
if, x -- in any case,
even if x and lambda were
168
00:10:50 --> 00:10:50.98
complex.
169
00:10:50.98 --> 00:10:55
Of course, our -- we're
speaking about real matrixes A,
170
00:10:55 --> 00:10:59
so I can take that out.
171
00:10:59 --> 00:11:04
Actually, this already tells me
something about real matrixes.
172
00:11:04 --> 00:11:09
I haven't used any assumption
of A -- A transpose yet.
173
00:11:09 --> 00:11:13
Symmetry is waiting in the
wings to be used.
174
00:11:13 --> 00:11:18
This tells me that if a real
matrix has an eigenvalue lambda
175
00:11:18 --> 00:11:23
and an eigenvector x,
it also has -- another of its
176
00:11:23 --> 00:11:27
eigenvalues is lambda bar with
eigenvector x bar.
177
00:11:27 --> 00:11:31
Real matrixes,
the eigenvalues come in lambda,
178
00:11:31 --> 00:11:35
lambda bar -- the complex
eigenvalues come in lambda and
179
00:11:35 --> 00:11:36
lambda bar pairs.
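That pairing shows up in numbers too. A quick NumPy check with the 90-degree rotation matrix mentioned earlier in the lecture:

```python
import numpy as np

# A real (but not symmetric) matrix: a 90-degree rotation.
R = np.array([[0.0, -1.0],
              [1.0,  0.0]])

lam = np.linalg.eigvals(R)

# Complex eigenvalues of a real matrix come in conjugate pairs: here i and -i.
assert np.isclose(lam[0], np.conj(lam[1]))
```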
180
00:11:36 --> 00:11:40
But, of course,
I'm aiming to show that they're
181
00:11:40 --> 00:11:42.81
not
complex at all,
182
00:11:42.81 --> 00:11:45
here, by getting symmetry in.
183
00:11:45 --> 00:11:48
So how am I going to use symmetry?
184
00:11:48 --> 00:11:54
I'm going to transpose this
equation to x bar transpose A
185
00:11:54 --> 00:11:58
transpose equals x bar transpose
lambda bar.
186
00:11:58 --> 00:12:03
That's just a number,
so I don't mind where I put that
187
00:12:03 --> 00:12:04
number.
188
00:12:04 --> 00:12:08
This is --
this is -- this is a -- then
189
00:12:08 --> 00:12:09
again okay.
190
00:12:09 --> 00:12:13
But now I'm ready to use
symmetry.
191
00:12:13 --> 00:12:17
I'm ready -- so this was all
just mechanics.
192
00:12:17 --> 00:12:22
Now -- now comes the moment to
say, okay, if the matrix is
193
00:12:22 --> 00:12:27
symmetric, then this A transpose
is the same as A.
194
00:12:27 --> 00:12:31
You see, at that moment I used
the
195
00:12:31 --> 00:12:32
assumption.
196
00:12:32 --> 00:12:35.86
Now let me finish the
discussion.
197
00:12:35.86 --> 00:12:38
Here -- here's the way I
finish.
198
00:12:38 --> 00:12:43
I look at this original
equation and I take the inner
199
00:12:43 --> 00:12:44
product.
200
00:12:44 --> 00:12:50
I multiply both sides by -- oh,
maybe I'll do it with this one.
201
00:12:50 --> 00:12:56
I take -- I multiply both sides
by x bar transpose.
202
00:12:56 --> 00:13:03
x bar transpose Ax bar equals
lambda bar x bar transpose x
203
00:13:03 --> 00:13:04
bar.
204
00:13:04 --> 00:13:05
Okay, fine.
205
00:13:05 --> 00:13:10
All right, now what's the other
one?
206
00:13:10 --> 00:13:16
Oh, for the other one I'll
probably use this guy.
207
00:13:16 --> 00:13:19
Am I happy about this?
208
00:13:19 --> 00:13:19
No.
209
00:13:19 --> 00:13:23
For some reason I'm not.
210
00:13:23 --> 00:13:27
I'm -- I want to --
if I take the inner product of
211
00:13:27 --> 00:13:32
this from the right with x bar,
I get x bar transpose Ax bar
212
00:13:32 --> 00:13:36
equals x bar transpose lambda
bar x bar.
213
00:13:36 --> 00:13:41
I- I've done something dumb,
because I've got the -- I
214
00:13:41 --> 00:13:45
haven't learned anything.
215
00:13:45 --> 00:13:49
I've got -- those two equations
are identical,
216
00:13:49 --> 00:13:54
so forgive me for doing such a
thing, but, I'll look at the
217
00:13:54 --> 00:13:58
book.
218
00:13:58 --> 00:14:03
Okay.
219
00:14:03 --> 00:14:07
So I took the dot product --
yeah, somehow, it didn't -- I
220
00:14:07 --> 00:14:12
should've taken the dot product
of this guy here with -- that's
221
00:14:12 --> 00:14:16
what I was going to do.
222
00:14:16 --> 00:14:20
x bar transpose Ax equals lambda
x bar transpose x, right?
223
00:14:20 --> 00:14:25
Okay.
224
00:14:25 --> 00:14:29
So that -- that was -- that's
fine.
225
00:14:29 --> 00:14:34
That comes directly from that,
multiplying both sides by x bar
226
00:14:34 --> 00:14:40
transpose, but now I'd like to
get -- why do I have x bars over
227
00:14:40 --> 00:14:41
there?
228
00:14:41 --> 00:14:42
Oh, yes.
229
00:14:42 --> 00:14:43
Forget this.
230
00:14:43 --> 00:14:43
Okay.
231
00:14:43 --> 00:14:45
On this one -- right.
232
00:14:45 --> 00:14:50
On this one,
I took it like that,
233
00:14:50 --> 00:14:53.46
I multiply on the right by x.
234
00:14:53.46 --> 00:14:55
That's the idea.
235
00:14:55 --> 00:14:55
Okay.
236
00:14:55 --> 00:15:00
Now why am I happier with this
situation now?
237
00:15:00 --> 00:15:02
A proof is coming here.
238
00:15:02 --> 00:15:07
Because I compare this guy with
this one.
239
00:15:07 --> 00:15:11
And they have the same left
hand side.
240
00:15:11 --> 00:15:16
So they have the same
right hand side.
241
00:15:16 --> 00:15:22
So comparing those two,
can -- I'll raise the board to
242
00:15:22 --> 00:15:28
do this comparison -- this
thing, lambda x bar transpose x
243
00:15:28 --> 00:15:33
is equal to lambda bar x bar
transpose x.
244
00:15:33 --> 00:15:34
Okay.
245
00:15:34 --> 00:15:40
And the conclusion I'm going to
reach -- am I
246
00:15:40 --> 00:15:43
on the right track here?
247
00:15:43 --> 00:15:48
The conclusion I'm going to
reach is lambda equal lambda
248
00:15:48 --> 00:15:49
bar.
249
00:15:49 --> 00:15:55
I would have to track down the
other possibility that this --
250
00:15:55 --> 00:15:59.75
this thing is zero,
but let me -- oh -- oh,
251
00:15:59.75 --> 00:16:02
yes, that's important.
252
00:16:02 --> 00:16:03
It's not zero.
253
00:16:03 --> 00:16:10
So once I know that this isn't
zero, I just cancel it and I
254
00:16:10 --> 00:16:13
learn that lambda equals lambda
bar.
255
00:16:13 --> 00:16:19.59
And so what can you -- do you
-- have you got the reasoning
256
00:16:19.59 --> 00:16:20
altogether?
257
00:16:20 --> 00:16:22
What does this tell us?
258
00:16:22 --> 00:16:27
Lambda's an eigenvalue of this
symmetric matrix.
259
00:16:27 --> 00:16:32
We've just proved that it
equaled lambda bar,
260
00:16:32 --> 00:16:37
so we have just proved that
lambda is real,
261
00:16:37 --> 00:16:37
right?
262
00:16:37 --> 00:16:42
If, if a number is equal to its
own complex conjugate,
263
00:16:42 --> 00:16:46
then there's no imaginary part
at all.
264
00:16:46 --> 00:16:48
The number is real.
265
00:16:48 --> 00:16:50
So lambda is real.
266
00:16:50 --> 00:16:50
Good.
267
00:16:50 --> 00:16:51
Good.
268
00:16:51 --> 00:16:56.5
Now, what -- but it depended on
this little
269
00:16:56.5 --> 00:17:01
expression, on knowing that
that wasn't zero,
270
00:17:01 --> 00:17:07
so that I could cancel it out
-- so can we just take a second
271
00:17:07 --> 00:17:08.6
on that one?
272
00:17:08.6 --> 00:17:12
Because it's an important
quantity.
273
00:17:12 --> 00:17:14
x bar transpose x.
274
00:17:14 --> 00:17:17
Okay, now remember,
as far as we know,
275
00:17:17 --> 00:17:19
x is complex.
276
00:17:19 --> 00:17:24
So this is --
here -- x is complex,
277
00:17:24 --> 00:17:28
x has these components,
x1, x2 down to xn.
278
00:17:28 --> 00:17:33
And x bar transpose,
well, it's transposed and it's
279
00:17:33 --> 00:17:39
conjugated, so that's x1
conjugated x2 conjugated up to
280
00:17:39 --> 00:17:40.92
xn conjugated.
281
00:17:40.92 --> 00:17:47
I'm -- I'm -- I'm really
reminding you of crucial
282
00:17:47 --> 00:17:51
facts about complex numbers
that are going to come into the
283
00:17:51 --> 00:17:54
next lecture as well as this
one.
284
00:17:54 --> 00:17:59
So w- what can you tell me
about that product -- I -- I
285
00:17:59 --> 00:18:04
guess what I'm trying to say is,
if I had a complex vector,
286
00:18:04 --> 00:18:09
this would be the quantity I
would -- I would like.
287
00:18:09 --> 00:18:10
This is the quantity I like.
288
00:18:10 --> 00:18:14
I would take the vector times
its transpose -- now what --
289
00:18:14 --> 00:18:18
what happens usually if I take a
vector -- a -- a -- x transpose
290
00:18:18 --> 00:18:18
x?
291
00:18:18 --> 00:18:21
I mean, that's a quantity we
see all the time,
292
00:18:21 --> 00:18:22
x transpose x.
293
00:18:22 --> 00:18:25
That's the length of x squared,
right?
294
00:18:25 --> 00:18:28
That's this positive length
squared,
295
00:18:28 --> 00:18:32
it's Pythagoras,
it's x1 squared plus x2 squared
296
00:18:32 --> 00:18:33
and so on.
297
00:18:33 --> 00:18:38
Now our vector's complex,
and you see the effect?
298
00:18:38 --> 00:18:41
I'm conjugating one of these
guys.
299
00:18:41 --> 00:18:44
So now when I do this
multiplication,
300
00:18:44 --> 00:18:50
I have x1 bar times x1 and x2
bar times x2 and so on.
301
00:18:50 --> 00:18:54
So this is an -- this is some
a+ib.
302
00:18:54 --> 00:18:56
And this is some a-ib.
303
00:18:56 --> 00:18:59
I mean, what's the point here?
304
00:18:59 --> 00:19:04
What's the point -- when I
multiply a number by its
305
00:19:04 --> 00:19:09
conjugate, a complex number by
its conjugate,
306
00:19:09 --> 00:19:10
what do I get?
307
00:19:10 --> 00:19:16
I get a n- the -- the imaginary
part is gone.
308
00:19:16 --> 00:19:21
When I multiply a+ib by its
conjugate, what's -- what's the
309
00:19:21 --> 00:19:25
result of that -- of each of
those separate little
310
00:19:25 --> 00:19:26
multiplications?
311
00:19:26 --> 00:19:31
There's an a squared and -- and
what -- how many -- what's -- b
312
00:19:31 --> 00:19:35
squared comes in with a plus or
a minus?
313
00:19:35 --> 00:19:39
A plus.
i times minus i is a plus b
314
00:19:39 --> 00:19:40
squared.
315
00:19:40 --> 00:19:44
And what about the imaginary
part?
316
00:19:44 --> 00:19:46
Gone, right?
317
00:19:46 --> 00:19:49
An iab and a minus iab.
318
00:19:49 --> 00:19:54
So this -- this is the right
thing to do.
319
00:19:54 --> 00:20:01
If you want a decent answer,
then multiply numbers by their
320
00:20:01 --> 00:20:04
conjugates.
321
00:20:04 --> 00:20:08
Multiply vectors by their
conjugate transposes.
322
00:20:08 --> 00:20:13
So this quantity is positive,
this quantity is positive --
323
00:20:13 --> 00:20:18
the whole thing is positive
except for the zero vector and
324
00:20:18 --> 00:20:23
that allows me to know that this
is a positive number,
325
00:20:23 --> 00:20:29
which I safely cancel out and I
reach the conclusion.
326
00:20:29 --> 00:20:32
So actually,
in this discussion here,
327
00:20:32 --> 00:20:34
I've done two things.
328
00:20:34 --> 00:20:38
I reached the conclusion
that lambda's real,
329
00:20:38 --> 00:20:40
which I wanted to do.
330
00:20:40 --> 00:20:44
But at the same time,
we sort of saw what to do if
331
00:20:44 --> 00:20:46
things were complex.
332
00:20:46 --> 00:20:52
If a vector is complex,
then it's x bar transpose x,
333
00:20:52 --> 00:20:54.73
this is its length squared.
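That quantity x bar transpose x is easy to compute. A NumPy sketch with an illustrative complex vector, showing why conjugating one factor is the right thing:

```python
import numpy as np

# For a complex vector, the right length-squared is x-bar-transpose times x.
x = np.array([1 + 2j, 3 - 1j])   # illustrative complex vector

# Plain x^T x mixes in imaginary parts and can even come out complex:
plain = x @ x
# Conjugating one factor makes every term a^2 + b^2 >= 0:
length_squared = np.conj(x) @ x          # same as np.vdot(x, x)

assert np.isclose(length_squared.imag, 0)
assert np.isclose(length_squared.real, 1 + 4 + 9 + 1)   # = 15
```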
334
00:20:54.73 --> 00:20:58
And as I said,
the next lecture Monday,
335
00:20:58 --> 00:21:04
we'll -- we'll repeat that this
is the right thing and then do
336
00:21:04 --> 00:21:09
the right thing for matrixes and
all other -- all other,
337
00:21:09 --> 00:21:12
complex possibilities.
338
00:21:12 --> 00:21:12
Okay.
339
00:21:12 --> 00:21:16
But the main point,
then, is that the eigenvalues
340
00:21:16 --> 00:21:21
of a symmetric matrix,
it just -- do you -- do --
341
00:21:21 --> 00:21:24
where did we use symmetry,
by the way?
342
00:21:24 --> 00:21:26
We used it here,
right?
343
00:21:26 --> 00:21:31
Let -- can I just -- let --
suppose A was a complex.
344
00:21:31 --> 00:21:35.7
Suppose A had been a complex
number.
345
00:21:35.7 --> 00:21:38
Could -- could I have made all
this work?
346
00:21:38 --> 00:21:42
If A was a complex number --
complex matrix,
347
00:21:42 --> 00:21:45
then here I should have written
A bar.
348
00:21:45 --> 00:21:48
I erased the bar because I
assumed A was real.
349
00:21:48 --> 00:21:51
But now let's suppose for a
moment it's not.
350
00:21:51 --> 00:21:55
Then when I took this step,
what should I have?
351
00:21:55 --> 00:21:58
What did I do
on that step?
352
00:21:58 --> 00:21:59
I transposed.
353
00:21:59 --> 00:22:02
So I should have A bar
transpose.
354
00:22:02 --> 00:22:06
In the symmetric case,
that was A, and that's what
355
00:22:06 --> 00:22:09
made everything work,
right?
356
00:22:09 --> 00:22:12
This -- this led immediately to
that.
357
00:22:12 --> 00:22:17
This one led immediately to
this when the matrix was real,
358
00:22:17 --> 00:22:21
so that didn't matter,
and it was
359
00:22:21 --> 00:22:25
symmetric, so that didn't
matter.
360
00:22:25 --> 00:22:27
Then I got A.
361
00:22:27 --> 00:22:32
But -- so now I just get to ask
you.
362
00:22:32 --> 00:22:36
Suppose the matrix had been
complex.
363
00:22:36 --> 00:22:42.53
What's the right equivalent of
sym- symmetry?
364
00:22:42.53 --> 00:22:50
So the good matrix -- so here,
let me say -- good matrixes --
365
00:22:50 --> 00:22:57
by good I mean real
lambdas and perpendicular x's.
366
00:22:57 --> 00:23:01
And tell me now,
which matrixes are good?
367
00:23:01 --> 00:23:06
If they're -- If they're real
matrixes, the good ones are
368
00:23:06 --> 00:23:11
symmetric, because then
everything went through.
369
00:23:11 --> 00:23:15
The -- so the good -- I'm
saying now
370
00:23:15 --> 00:23:16
what's good.
371
00:23:16 --> 00:23:20
This is -- this is -- these are
the good matrixes.
372
00:23:20 --> 00:23:25
They have real eigenvalues,
perpendicular eigenvectors --
373
00:23:25 --> 00:23:28
good means A equal A transpose
if real.
374
00:23:28 --> 00:23:32
Then -- then that was what --
our proof worked.
375
00:23:32 --> 00:23:36
But if A is complex,
all -- our proof will still
376
00:23:36 --> 00:23:40
work
provided A bar transpose is A.
377
00:23:40 --> 00:23:43
Do you see what I'm saying?
378
00:23:43 --> 00:23:49
I'm saying if we have complex
matrixes and we want to say are
379
00:23:49 --> 00:23:53
they -- are they as good as
symmetric matrixes,
380
00:23:53 --> 00:23:58
then we should not only
transpose the thing,
381
00:23:58 --> 00:23:59
but conjugate it.
382
00:23:59 --> 00:24:02
Those are good matrixes.
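Those good complex matrixes — A bar transpose equal to A — behave just like the real symmetric ones. A NumPy sketch with an illustrative Hermitian matrix:

```python
import numpy as np

# A "good" complex matrix: A-bar-transpose equals A (Hermitian).
A = np.array([[2.0, 1 - 1j],
              [1 + 1j, 3.0]])
assert np.allclose(np.conj(A).T, A)

lam = np.linalg.eigvals(A)

# Its eigenvalues are real, just like the real symmetric case (here 1 and 4).
assert np.allclose(lam.imag, 0)
```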
383
00:24:02 --> 00:24:07
And of course,
the most important s- the most
384
00:24:07 --> 00:24:11
important case is when they're
real, this part doesn't matter
385
00:24:11 --> 00:24:15
and I just have A equal A
transpose symmetric.
386
00:24:15 --> 00:24:18
Do you -- I -- I'll just repeat
that.
387
00:24:18 --> 00:24:21
The good matrixes,
if complex, are these.
388
00:24:21 --> 00:24:27
If real, that doesn't make any
difference so I'm just saying
389
00:24:27 --> 00:24:28
symmetric.
390
00:24:28 --> 00:24:31
And of course,
99% of examples and
391
00:24:31 --> 00:24:37.5
applications the matrixes are
real and we don't have that and
392
00:24:37.5 --> 00:24:40
then symmetric is the key
property.
393
00:24:40 --> 00:24:41
Okay.
394
00:24:41 --> 00:24:46
So that -- that's,
these main facts and now
395
00:24:46 --> 00:24:53
let me just -- let me just --
so that's this x bar transpose
396
00:24:53 --> 00:24:54
x, okay.
397
00:24:54 --> 00:24:58
So I'll just,
write it once more in this
398
00:24:58 --> 00:24:59
form.
399
00:24:59 --> 00:25:03
So perpendicular orthonormal
eigenvectors,
400
00:25:03 --> 00:25:08
real eigenvalues,
transposes of orthonormal
401
00:25:08 --> 00:25:11
eigenvectors.
402
00:25:11 --> 00:25:16
That's the symmetric case,
A equal A transpose.
403
00:25:16 --> 00:25:17
Okay.
404
00:25:17 --> 00:25:17
Good.
405
00:25:17 --> 00:25:23
Actually, I'll even take one
more step here.
406
00:25:23 --> 00:25:29
Suppose -- I -- I can break
this down to show you really
407
00:25:29 --> 00:25:34
what that says about a symmetric
matrix.
408
00:25:34 --> 00:25:38
I can break that down.
409
00:25:38 --> 00:25:42
Let me here -- here go these
eigenvectors.
410
00:25:42 --> 00:25:49
I -- here go these eigenvalues,
lambda one, lambda two and so
411
00:25:49 --> 00:25:49
on.
412
00:25:49 --> 00:25:54
Here go these eigenvectors
transposed.
413
00:25:54 --> 00:26:00.54
And what happens if I actually
do out that multiplication?
414
00:26:00.54 --> 00:26:03
Do you see what will happen?
415
00:26:03 --> 00:26:09
There's lambda one times q1
transpose.
416
00:26:09 --> 00:26:14
So the first row here is just
lambda one q1 transpose.
417
00:26:14 --> 00:26:21
If I multiply column times row
-- you remember I could do that?
418
00:26:21 --> 00:26:27
When I multiply matrixes,
I can multiply columns times
419
00:26:27 --> 00:26:28
rows?
420
00:26:28 --> 00:26:33
So when I do that,
I get lambda one and then the
421
00:26:33 --> 00:26:38
column
and then the row and then
422
00:26:38 --> 00:26:43
lambda two and then the column
and the row.
423
00:26:43 --> 00:26:48.95
Every symmetric matrix breaks
up into these pieces.
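That column-times-row breakup can be computed directly. A NumPy sketch with an illustrative 2 by 2 symmetric matrix:

```python
import numpy as np

# Break a symmetric matrix into lambda_i * q_i q_i^T pieces.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
lam, Q = np.linalg.eigh(A)

# Column times row: each piece is a rank-one matrix lambda_i q_i q_i^T.
pieces = [lam[i] * np.outer(Q[:, i], Q[:, i]) for i in range(2)]

# Their sum rebuilds A.
assert np.allclose(sum(pieces), A)
```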
424
00:26:48.95 --> 00:26:55
So these pieces have real
lambdas and they have these
425
00:26:55 --> 00:26:59
Eigen -- these orthonormal
eigenvectors.
426
00:26:59 --> 00:27:06
And, maybe you even could tell
me what kind of a matrix have I
427
00:27:06 --> 00:27:09.4
got there?
428
00:27:09.4 --> 00:27:13
Suppose I take a unit vector
times its transpose?
429
00:27:13 --> 00:27:17
So column times row,
I'm getting a matrix.
430
00:27:17 --> 00:27:20
That's a matrix with a special
name.
431
00:27:20 --> 00:27:24
What's its -- what kind of a
matrix is it?
432
00:27:24 --> 00:27:28
We've seen those matrixes,
now, in chapter four.
433
00:27:28 --> 00:27:33
It's -- it's A A transpose with a
unit vector,
434
00:27:33 --> 00:27:37
so I don't have to divide by A
transpose A.
435
00:27:37 --> 00:27:40
That matrix is a projection
matrix.
436
00:27:40 --> 00:27:43
That's a projection matrix.
437
00:27:43 --> 00:27:48
It's symmetric and if I square
it there'll be another --
438
00:27:48 --> 00:27:51.99
there'll be a q1 transpose q1,
which is one.
439
00:27:51.99 --> 00:27:56.2
So I'll get that matrix back
again.
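That projection property is one line to check. A NumPy sketch with an illustrative unit vector q:

```python
import numpy as np

# A unit vector q (illustrative) and the matrix q q^T.
q = np.array([3.0, 4.0]) / 5.0          # length 1
P = np.outer(q, q)

# P is symmetric, and squaring it gives P back: q (q^T q) q^T = q * 1 * q^T.
assert np.allclose(P, P.T)
assert np.allclose(P @ P, P)
```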
440
00:27:56.2 --> 00:28:03
Every -- so every symmetric
matrix -- every symmetric matrix
441
00:28:03 --> 00:28:10
is a combination of -- of
mutually perpendicular -- so
442
00:28:10 --> 00:28:14
perpendicular projection
matrixes.
443
00:28:14 --> 00:28:17
Projection matrixes.
444
00:28:17 --> 00:28:17
Okay.
445
00:28:17 --> 00:28:25
That's another way that people
like to think of the spectral
446
00:28:25 --> 00:28:32
theorem, that every symmetric
matrix can be
447
00:28:32 --> 00:28:33
broken up that way.
448
00:28:33 --> 00:28:37
That -- I guess at this moment
-- first I haven't done an
449
00:28:37 --> 00:28:38
example.
450
00:28:38 --> 00:28:42
I could create a symmetric
matrix, check that it's -- find
451
00:28:42 --> 00:28:45
its eigenvalues,
they would come out real,
452
00:28:45 --> 00:28:48
find its eigenvectors,
they would come out
453
00:28:48 --> 00:28:51.92
perpendicular and you would see
it
454
00:28:51.92 --> 00:28:56
in numbers, but maybe I'll
leave it here for the moment in
455
00:28:56 --> 00:28:57
letters.
456
00:28:57 --> 00:29:01
Oh, I -- maybe I will do it
with numbers,
457
00:29:01 --> 00:29:02
for this reason.
458
00:29:02 --> 00:29:06
Because there's one more
remarkable fact.
459
00:29:06 --> 00:29:10
Can I just put this further
great fact about symmetric
460
00:29:10 --> 00:29:13.9
matrixes on the board?
461
00:29:13.9 --> 00:29:18.13
When I have symmetric matrixes,
I know their eigenvalues are
462
00:29:18.13 --> 00:29:18
real.
463
00:29:18 --> 00:29:22
So then I can get interested in
the question are they positive
464
00:29:22 --> 00:29:23
or negative?
465
00:29:23 --> 00:29:26
And you remember why that's
important.
466
00:29:26 --> 00:29:29
For differential equations,
that decides between
467
00:29:29 --> 00:29:32.4
instability and stability.
468
00:29:32.4 --> 00:29:38
So I'm -- after I know they're
real, then the next question is
469
00:29:38 --> 00:29:41
are they positive,
are they negative?
470
00:29:41 --> 00:29:47
And I hate to have to compute
those eigenvalues to answer that
471
00:29:47 --> 00:29:48
question, right?
472
00:29:48 --> 00:29:53
Because computing the
eigenvalues of a symmetric
473
00:29:53 --> 00:29:59
matrix of order let's say 50 --
compute its 50 eigenvalues -- is
474
00:29:59 --> 00:30:01
a job.
475
00:30:01 --> 00:30:06
I mean, by pencil and paper
it's a lifetime's job.
476
00:30:06 --> 00:30:11.46
I mean, which -- and in fact,
a few years ago -- well,
477
00:30:11.46 --> 00:30:16
say, 20 years ago,
or 30, nobody really knew how
478
00:30:16 --> 00:30:17
to do it.
479
00:30:17 --> 00:30:21
I mean, so, like,
science was stuck on this
480
00:30:21 --> 00:30:22
problem.
481
00:30:22 --> 00:30:26
If you have a matrix of order
50 or
482
00:30:26 --> 00:30:30
100, how do you find its
eigenvalues?
483
00:30:30 --> 00:30:33
Numerically,
now, I'm just saying,
484
00:30:33 --> 00:30:38
because pencil and paper is --
we're going to run out of time
485
00:30:38 --> 00:30:42
or paper or something before we
get it.
486
00:30:42 --> 00:30:46
Well -- and you might think,
okay, get
487
00:30:46 --> 00:30:51.49
MATLAB to compute the
determinant of lambda minus A,
488
00:30:51.49 --> 00:30:55
A minus lambda I,
this polynomial of 50th degree,
489
00:30:55 --> 00:30:58
and then find the roots.
490
00:30:58 --> 00:31:01
MATLAB will do it,
but it will complain,
491
00:31:01 --> 00:31:06
because it's a very bad way to
find the eigenvalues.
492
00:31:06 --> 00:31:11
I'm sorry to be saying
this, because it's the way I
493
00:31:11 --> 00:31:12
taught you to do it,
right?
494
00:31:12 --> 00:31:15
I taught you to find the
eigenvalues by doing that
495
00:31:15 --> 00:31:19
determinant and taking the roots
of that polynomial.
496
00:31:19 --> 00:31:22
But now I'm saying,
okay, I really meant that for
497
00:31:22 --> 00:31:26
two by twos and three by threes
but I didn't mean you to do it
498
00:31:26 --> 00:31:29
on a 50 by
50 and you're not too unhappy,
499
00:31:29 --> 00:31:33
probably, because you didn't
want to do it.
500
00:31:33 --> 00:31:36
But -- good,
because it would be a very
501
00:31:36 --> 00:31:41
unstable way -- the 50 answers
that would come out would be
502
00:31:41 --> 00:31:42.56
highly unreliable.
503
00:31:42.56 --> 00:31:47
So, new ways are -- are much
better to find those 50
504
00:31:47 --> 00:31:48
eigenvalues.
505
00:31:48 --> 00:31:53
That's a -- that's a part of
numerical linear algebra.
506
00:31:53 --> 00:31:58
But here's the remarkable fact
-- that MATLAB would quite
507
00:31:58 --> 00:32:01
happily find the 50 pivots,
right?
508
00:32:01 --> 00:32:06
Now the pivots are not the same
as the eigenvalues.
509
00:32:06 --> 00:32:09.5
But here's the great thing.
510
00:32:09.5 --> 00:32:13
If I had a real matrix,
I could find those 50 pivots
511
00:32:13 --> 00:32:18
and I could see maybe 28 of them
are positive and 22 are negative
512
00:32:18 --> 00:32:18
pivots.
513
00:32:18 --> 00:32:21
And I can compute those safely
and quickly.
514
00:32:21 --> 00:32:25
And the great fact is that 28
of the eigenvalues would be
515
00:32:25 --> 00:32:29
positive and 22 would be
negative --
516
00:32:29 --> 00:32:34
that the signs of the pivots --
so this is, like -- I hope you
517
00:32:34 --> 00:32:40
think this -- this is kind of a
nice thing, that the signs of
518
00:32:40 --> 00:32:44
the pivots -- for symmetric,
I'm always talking about
519
00:32:44 --> 00:32:48.72
symmetric matrixes -- so I'm
really,
520
00:32:48.72 --> 00:32:55
like, trying to convince you
that symmetric matrixes are
521
00:32:55 --> 00:32:58
better than the rest.
522
00:32:58 --> 00:33:05
So the signs of the pivots are
the same as the signs of the
523
00:33:05 --> 00:33:06
eigenvalues.
524
00:33:06 --> 00:33:08
The same number.
525
00:33:08 --> 00:33:15
The number of pivots greater
than zero, the number of
526
00:33:15 --> 00:33:24
positive pivots is equal to the
number of positive eigenvalues.
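[Editor's note: that sign-agreement fact is easy to check numerically. A minimal sketch in plain Python (the lecture mentions MATLAB; these 2x2 helpers and the indefinite example matrix with entries 1, 4, 4, 1 are my own illustration, assuming a nonzero first pivot):]

```python
import math

def pivots_2x2(a, b, c):
    # Pivots of the symmetric matrix [[a, b], [b, c]], from
    # elimination with no row exchanges (assumes a != 0).
    return [a, c - b * b / a]

def eigenvalues_2x2(a, b, c):
    # Real eigenvalues of [[a, b], [b, c]] via trace and determinant.
    half_trace = (a + c) / 2
    disc = math.sqrt(half_trace ** 2 - (a * c - b * b))
    return [half_trace - disc, half_trace + disc]

piv = pivots_2x2(1, 4, 1)       # [1, -15.0]
eig = eigenvalues_2x2(1, 4, 1)  # [-3.0, 5.0]
# The counts of positive pivots and positive eigenvalues agree.
print(sum(p > 0 for p in piv), sum(e > 0 for e in eig))  # 1 1
```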
527
00:33:24 --> 00:33:28
So that, actually,
is a very useful -- that gives
528
00:33:28 --> 00:33:31
you a good start on a
decent way to compute
529
00:33:31 --> 00:33:35
eigenvalues, because you can
narrow them down,
530
00:33:35 --> 00:33:40
you can find out how many are
positive, how many are negative.
531
00:33:40 --> 00:33:45
Then you could shift the matrix
by seven times the identity.
532
00:33:45 --> 00:33:48
That would shift all the
eigenvalues by seven.
533
00:33:48 --> 00:33:53
Then you could take the pivots
of that matrix and you would
534
00:33:53 --> 00:33:58
know how many eigenvalues of the
original were above seven and
535
00:33:58 --> 00:33:58
below seven.
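[Editor's note: the shifting idea described here can be sketched in code. The positive pivots of A - sI count the eigenvalues above s, assuming no pivot comes out exactly zero (a zero pivot means the shift hit an eigenvalue of a leading submatrix). The function names are my own; the test matrix is the 5, 2, 2, 3 example used later in the lecture:]

```python
def pivots(A):
    # Pivots of a symmetric matrix by elimination, no row exchanges
    # (assumes every pivot encountered is nonzero).
    A = [row[:] for row in A]  # work on a copy
    n = len(A)
    out = []
    for k in range(n):
        out.append(A[k][k])
        for i in range(k + 1, n):
            m = A[i][k] / A[k][k]
            for j in range(k, n):
                A[i][j] -= m * A[k][j]
    return out

def eigs_above(A, s):
    # Number of eigenvalues of symmetric A above s
    # = number of positive pivots of A - s*I.
    n = len(A)
    shifted = [[A[i][j] - (s if i == j else 0) for j in range(n)]
               for i in range(n)]
    return sum(p > 0 for p in pivots(shifted))

A = [[5, 2], [2, 3]]     # eigenvalues 4 +/- sqrt(5), about 1.76 and 6.24
print(eigs_above(A, 0))  # 2 -- both positive
print(eigs_above(A, 7))  # 0 -- both below seven
```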
536
00:33:58 --> 00:34:02
So this -- this neat little
theorem, that,
537
00:34:02 --> 00:34:05
symmetric
matrixes have this connection
538
00:34:05 --> 00:34:10
between the -- nobody's mixing
up and thinking the pivots are
539
00:34:10 --> 00:34:14
the eigenvalues -- I mean,
the only thing I can think of
540
00:34:14 --> 00:34:18
is the product of the pivots
equals the product of the
541
00:34:18 --> 00:34:20
eigenvalues, why is that?
542
00:34:20 --> 00:34:23
So if I asked you for the
reason on that,
543
00:34:23 --> 00:34:27
why is the product
of the pivots for a symmetric
544
00:34:27 --> 00:34:31
matrix the same as the product
of the eigenvalues?
545
00:34:31 --> 00:34:34
Because they both equal the
determinant.
546
00:34:34 --> 00:34:35
Right.
547
00:34:35 --> 00:34:39
The product of the pivots gives
the determinant if no row
548
00:34:39 --> 00:34:44
exchanges, the product of the
eigenvalues always gives the
549
00:34:44 --> 00:34:45
determinant.
550
00:34:45 --> 00:34:50
So -- so the products --
but that doesn't tell you
551
00:34:50 --> 00:34:56
anything about the 50 individual
ones, which this does.
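[Editor's note: the determinant identity just mentioned is easy to verify on a small case. A sketch using the 2x2 example that appears later in the lecture (entries 5, 2, 2, 3; the arithmetic below is mine):]

```python
import math

# A = [[5, 2], [2, 3]]: pivots 5 and 11/5, eigenvalues 4 +/- sqrt(5).
pivot_product = 5 * (3 - 2 * 2 / 5)                    # 5 * 11/5
eig_product = (4 - math.sqrt(5)) * (4 + math.sqrt(5))  # 16 - 5
determinant = 5 * 3 - 2 * 2
# Both products equal the determinant, 11.
print(round(pivot_product, 9), round(eig_product, 9), determinant)
```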
552
00:34:56 --> 00:34:56
Okay.
553
00:34:56 --> 00:35:01
So that's -- those are
essential facts about symmetric
554
00:35:01 --> 00:35:02
matrixes.
555
00:35:02 --> 00:35:03
Okay.
556
00:35:03 --> 00:35:09
Now I -- I said in the -- in
the lecture description that I
557
00:35:09 --> 00:35:17
would take the last minutes
to start on positive definite
558
00:35:17 --> 00:35:27
matrixes, because we're right
there, we're ready to say what's
559
00:35:27 --> 00:35:31
a positive definite matrix?
560
00:35:31 --> 00:35:35
It's symmetric,
first of all.
561
00:35:35 --> 00:35:40
Always, I will mean
symmetric.
562
00:35:40 --> 00:35:47
So this is the --
this is the next section of the
563
00:35:47 --> 00:35:48
book.
564
00:35:48 --> 00:35:53
It's about this -- if symmetric
matrixes are good,
565
00:35:53 --> 00:35:58
which was, like,
the point of my lecture so far,
566
00:35:58 --> 00:36:02
then positive definite
matrixes are -- a
567
00:36:02 --> 00:36:05
subclass that are excellent,
okay.
568
00:36:05 --> 00:36:10
Just the greatest.
So what are they?
569
00:36:10 --> 00:36:15
They're matrixes -- they're
symmetric matrixes,
570
00:36:15 --> 00:36:19
so all their eigenvalues are
real.
571
00:36:19 --> 00:36:22
You can guess what they are.
572
00:36:22 --> 00:36:28
These are symmetric matrixes
with all -- the eigenvalues are
573
00:36:28 --> 00:36:32
-- okay, tell me what to write.
574
00:36:32 --> 00:36:36
What -- well,
it -- it's hinted,
575
00:36:36 --> 00:36:40
of course, by the name for
these things.
576
00:36:40 --> 00:36:44
All the eigenvalues are
positive.
577
00:36:44 --> 00:36:44
Okay.
578
00:36:44 --> 00:36:47
Tell me about the pivots.
579
00:36:47 --> 00:36:53
We can check the eigenvalues or
we can check the pivots.
580
00:36:53 --> 00:36:56
All the pivots are what?
581
00:36:56 --> 00:37:01
And then I'll -- then I'll
finally give an example.
582
00:37:01 --> 00:37:06
I feel awful that I have got to
this
583
00:37:06 --> 00:37:12
point in the lecture and I
haven't given you a single
584
00:37:12 --> 00:37:12
example.
585
00:37:12 --> 00:37:15
So let me give you one.
586
00:37:15 --> 00:37:17
Five, two, two, three.
587
00:37:17 --> 00:37:19
That's symmetric,
fine.
588
00:37:19 --> 00:37:23
Its eigenvalues are real,
for sure.
589
00:37:23 --> 00:37:27
But more than that,
I know the signs of those
590
00:37:27 --> 00:37:28
eigenvalues.
591
00:37:28 --> 00:37:33
And also I know the signs of
those
592
00:37:33 --> 00:37:37
pivots, so what's the deal with
the pivots?
593
00:37:37 --> 00:37:42
If the eigenvalues are
all positive and if this little
594
00:37:42 --> 00:37:48
fact is true that the pivots and
eigenvalues have the same signs,
595
00:37:48 --> 00:37:53
then this must be true -- all
the pivots are positive.
596
00:37:53 --> 00:37:56
And that's the good way to
test.
597
00:37:56 --> 00:38:02
This is the good test,
because I can -- what are the
598
00:38:02 --> 00:38:04
pivots for that matrix?
599
00:38:04 --> 00:38:07
The pivots for that matrix are
five.
600
00:38:07 --> 00:38:11
So pivots are five and what's
the second pivot?
601
00:38:11 --> 00:38:15
Have we, like,
noticed the formula for the
602
00:38:15 --> 00:38:17.85
second pivot in a matrix?
603
00:38:17.85 --> 00:38:24.55
It doesn't necessarily -- you
know, it may come out a fraction
604
00:38:24.55 --> 00:38:27
for sure, but what is that
fraction?
605
00:38:27 --> 00:38:29
Can you tell me?
606
00:38:29 --> 00:38:35
Well, here, the product of the
pivots is the determinant.
607
00:38:35 --> 00:38:38
What's the determinant of this
matrix?
608
00:38:38 --> 00:38:39
Eleven?
609
00:38:39 --> 00:38:43
So the second pivot must be
eleven over five,
610
00:38:43 --> 00:38:47
so that the product is eleven.
611
00:38:47 --> 00:38:49
They're both positive.
612
00:38:49 --> 00:38:53
Then I know that the
eigenvalues of that matrix are
613
00:38:53 --> 00:38:54
both positive.
614
00:38:54 --> 00:38:55
What are the eigenvalues?
615
00:38:55 --> 00:38:59
Well, I've got to take the
roots of -- you know,
616
00:38:59 --> 00:39:01
do I put in a minus lambda?
617
00:39:01 --> 00:39:06.8
You mentally do this -- lambda
squared minus how many lambdas?
618
00:39:06.8 --> 00:39:07
Eight?
619
00:39:07 --> 00:39:07
Right.
620
00:39:07 --> 00:39:11
Five and three,
the trace comes in there,
621
00:39:11 --> 00:39:14
plus what number comes here?
622
00:39:14 --> 00:39:19
The determinant,
the eleven, so I set that to
623
00:39:19 --> 00:39:19
zero.
624
00:39:19 --> 00:39:25
So the eigenvalues are -- let's
see, half of that is four,
625
00:39:25 --> 00:39:30
look at that positive number,
plus or minus the square root
626
00:39:30 --> 00:39:34
of
sixteen minus eleven,
627
00:39:34 --> 00:39:35.46
I think five.
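[Editor's note: the numbers just worked out on the board can be checked directly; a small sketch in plain Python rather than the lecture's MATLAB:]

```python
import math

# A = [[5, 2], [2, 3]]: trace 8, determinant 11.
trace, det = 5 + 3, 5 * 3 - 2 * 2
# Roots of lambda^2 - 8*lambda + 11 = 0: 4 +/- sqrt(16 - 11).
lam_hi = trace / 2 + math.sqrt((trace / 2) ** 2 - det)
lam_lo = trace / 2 - math.sqrt((trace / 2) ** 2 - det)
print(round(lam_hi, 3), round(lam_lo, 3))  # 6.236 1.764 -- both positive
# The pivots 5 and 11/5 are much simpler, and their signs
# already told us both eigenvalues are positive.
```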
628
00:39:35.46 --> 00:39:39
The eigenvalues -- well,
two by two they're not so
629
00:39:39 --> 00:39:43
terrible, but they're not so
perfect.
630
00:39:43 --> 00:39:45
Pivots are really simple.
631
00:39:45 --> 00:39:50
And this is a -- this is the
family of matrixes that you
632
00:39:50 --> 00:39:56
really want in differential
equations, because you know the
633
00:39:56 --> 00:40:01
signs of the eigenvalues,
so you know the stability or
634
00:40:01 --> 00:40:02
not.
635
00:40:02 --> 00:40:02
Okay.
636
00:40:02 --> 00:40:08
There's one other related fact
I can pop in here in -- in the
637
00:40:08 --> 00:40:12
time available for positive
definite matrixes.
638
00:40:12 --> 00:40:18.3
The related fact is to ask you
about determinants.
639
00:40:18.3 --> 00:40:21
So what's the determinant?
640
00:40:21 --> 00:40:27
What can you tell me if I --
remember, positive definite
641
00:40:27 --> 00:40:33
means all eigenvalues are
positive, all pivots are
642
00:40:33 --> 00:40:39
positive, so what can you tell
me about the determinant?
643
00:40:39 --> 00:40:41
It's positive,
too.
644
00:40:41 --> 00:40:46
But somehow that -- that's not
quite enough.
645
00:40:46 --> 00:40:51
Here -- here's a matrix minus
one
646
00:40:51 --> 00:40:54
and minus three on the diagonal,
what's the determinant of that
647
00:40:54 --> 00:40:55
guy?
648
00:40:55 --> 00:40:56
It's positive,
right?
649
00:40:56 --> 00:40:59
Is this a positive
definite matrix?
650
00:40:59 --> 00:41:02
Are the pivots -- what are the
pivots?
651
00:41:02 --> 00:41:03
Well, negative.
652
00:41:03 --> 00:41:05
What are the eigenvalues?
653
00:41:05 --> 00:41:09
Well, they're also the same.
654
00:41:09 --> 00:41:14
So somehow I don't just want
the determinant of the whole
655
00:41:14 --> 00:41:14
matrix.
656
00:41:14 --> 00:41:17
Here is eleven,
that's great.
657
00:41:17 --> 00:41:21
Here the determinant of the
whole matrix is three,
658
00:41:21 --> 00:41:22
that's positive.
659
00:41:22 --> 00:41:27
I also -- I've got to check,
like, little sub-determinants,
660
00:41:27 --> 00:41:30
say maybe coming down from the
left.
661
00:41:30 --> 00:41:34.9
So the one by one and the two
by two
662
00:41:34.9 --> 00:41:36
have to be positive.
663
00:41:36 --> 00:41:40
So there -- that's where I get
the all.
664
00:41:40 --> 00:41:45
All -- can I call them
sub-determinants -- are -- see,
665
00:41:45 --> 00:41:49.23
I have to -- I need to make the
thing plural.
666
00:41:49.23 --> 00:41:54.13
I need to test n things,
not just the big determinant.
667
00:41:54.13 --> 00:41:58
All sub-determinants are
positive.
668
00:41:58 --> 00:41:59
Then I'm okay.
669
00:41:59 --> 00:42:01
Then I'm okay.
670
00:42:01 --> 00:42:03
This passes the test.
671
00:42:03 --> 00:42:07
Five is positive and eleven is
positive.
672
00:42:07 --> 00:42:13
This fails the test because
that minus one there is
673
00:42:13 --> 00:42:14
negative.
674
00:42:14 --> 00:42:19
And then the big determinant is
positive three.
675
00:42:19 --> 00:42:24.67
So this fact -- you see
676
00:42:24.67 --> 00:42:28
that actually the course is,
like, coming together.
677
00:42:28 --> 00:42:31
And that's really my point now.
678
00:42:31 --> 00:42:37
In the next -- in this lecture
and particularly next Wednesday
679
00:42:37 --> 00:42:40
and Friday, the course comes
together.
680
00:42:40 --> 00:42:47
These pivots that we met in the
first week, these determinants
681
00:42:47 --> 00:42:52
that we met in the middle of
the course, these eigenvalues
682
00:42:52 --> 00:42:56
that we met most recently -- all
matrixes are square here,
683
00:42:56 --> 00:43:01
so coming together for square
matrixes means these three
684
00:43:01 --> 00:43:06
pieces come together and they
come together in that beautiful
685
00:43:06 --> 00:43:12
fact, that if -- that all the --
that if I have one of these,
686
00:43:12 --> 00:43:14
I have the others.
687
00:43:14 --> 00:43:17
That if I -- but for symmetric
matrixes.
688
00:43:17 --> 00:43:22
So that -- this will be the
positive definite section and
689
00:43:22 --> 00:43:26
then the real climax of the
course is to make everything
690
00:43:26 --> 00:43:30
come together for n by n
matrixes,
691
00:43:30 --> 00:43:35
not necessarily symmetric --
bring everything together there
692
00:43:35 --> 00:43:37
and that will be the final
thing.
693
00:43:37 --> 00:43:38
Okay.
694
00:43:38 --> 00:43:43
So have a great weekend and
don't forget symmetric matrixes.
695
00:43:43 --> 00:43:46
Thanks.