1
00:00:06 --> 00:00:07.23
Okay.
2
00:00:07.23 --> 00:00:12
This lecture is mostly about
the idea of similar matrices.
3
00:00:12 --> 00:00:17
I'm going to tell you what that
word similar means and in what
4
00:00:17 --> 00:00:20
way two matrices are called
similar.
5
00:00:20 --> 00:00:24
But before I do that,
I have a little more to say
6
00:00:24 --> 00:00:27
about positive definite
matrices.
7
00:00:27 --> 00:00:31
You can tell this is a subject
I
8
00:00:31 --> 00:00:37
think is really important and I
told you what positive definite
9
00:00:37 --> 00:00:41
meant -- it means that
this expression,
10
00:00:41 --> 00:00:45
this quadratic form,
x transpose A x is always
11
00:00:45 --> 00:00:46
positive.
12
00:00:46 --> 00:00:51
But the direct way to test it
was with eigenvalues or pivots
13
00:00:51 --> 00:00:53
or determinants.
14
00:00:53 --> 00:00:56
So --
we know what it means,
15
00:00:56 --> 00:01:01
we know how to test it,
but I didn't really say where
16
00:01:01 --> 00:01:04
positive definite matrices come
from.
17
00:01:04 --> 00:01:08
And so one thing I want to say
is that they come from least
18
00:01:08 --> 00:01:13
squares -- and all sorts of
physical problems start with a
19
00:01:13 --> 00:01:16
rectangular
matrix -- well,
20
00:01:16 --> 00:01:21
you remember in least squares
the crucial combination was A
21
00:01:21 --> 00:01:22
transpose A.
22
00:01:22 --> 00:01:26
So I want to show that that's a
positive definite matrix.
23
00:01:26 --> 00:01:30
So I'm going to
speak a little more about
24
00:01:30 --> 00:01:35
positive definite matrices,
just recapping -- so let me ask
25
00:01:35 --> 00:01:36
a question.
26
00:01:36 --> 00:01:39
It may be on the homework.
27
00:01:39 --> 00:01:42
Suppose a matrix A is positive
definite.
28
00:01:42 --> 00:01:47.38
I mean by that -- I'm
assuming it's symmetric.
29
00:01:47.38 --> 00:01:50
That's always built into the
definition.
30
00:01:50 --> 00:01:55
So we have a symmetric positive
definite matrix.
31
00:01:55 --> 00:01:57
What about its inverse?
32
00:01:57 --> 00:02:03
Is the inverse of a symmetric
positive definite matrix also
33
00:02:03 --> 00:02:06
symmetric positive definite?
34
00:02:06 --> 00:02:11
So you quickly think,
okay, what do I know about the
35
00:02:11 --> 00:02:13
pivots of the inverse matrix?
36
00:02:13 --> 00:02:14
Not much.
37
00:02:14 --> 00:02:19
What do I know about the
eigenvalues of the inverse
38
00:02:19 --> 00:02:21
matrix?
39
00:02:21 --> 00:02:22
Everything, right?
40
00:02:22 --> 00:02:27
The eigenvalues of the inverse
are one over the eigenvalues of
41
00:02:27 --> 00:02:28
the matrix.
42
00:02:28 --> 00:02:32.54
So if my matrix starts out
positive definite,
43
00:02:32.54 --> 00:02:37
then right away I know that its
inverse is positive definite,
44
00:02:37 --> 00:02:42
because those eigenvalues are
positive -- one over each
45
00:02:42 --> 00:02:45
eigenvalue is
also positive.
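A quick numerical sketch of that fact -- this is my own check in Python with NumPy, not part of the lecture, and the 2 by 2 matrix is an assumed example: the eigenvalues of A inverse are the reciprocals of the eigenvalues of A, so positive stays positive.

```python
import numpy as np

# Assumed example: a symmetric positive definite matrix.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

eig_A = np.sort(np.linalg.eigvalsh(A))                    # eigenvalues of A
eig_Ainv = np.sort(np.linalg.eigvalsh(np.linalg.inv(A)))  # eigenvalues of A^-1

# The eigenvalues of the inverse are 1 over the eigenvalues of A.
print(eig_A)      # [1. 3.]
print(eig_Ainv)   # [0.333... 1.] -- reciprocals, still positive
```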
46
00:02:45 --> 00:02:52
What if I know that a
matrix A and a matrix B are both
47
00:02:52 --> 00:02:54
positive definite?
48
00:02:54 --> 00:02:56.69
But let me ask you this.
49
00:02:56.69 --> 00:03:03
Suppose A and B are positive
definite -- what
50
00:03:03 --> 00:03:04
about A plus B?
51
00:03:04 --> 00:03:09
In some way,
you hope that that would be
52
00:03:09 --> 00:03:11
true.
53
00:03:11 --> 00:03:15
Positive definite for a
matrix is kind of like positive
54
00:03:15 --> 00:03:16
for a real number.
55
00:03:16 --> 00:03:20
But we don't know the
eigenvalues of A plus B.
56
00:03:20 --> 00:03:22
We don't know the pivots of A
plus B.
57
00:03:22 --> 00:03:26.41
So we just, like,
have to go down this list of,
58
00:03:26.41 --> 00:03:31
all right, which approach to
positive definite can we get a
59
00:03:31 --> 00:03:32.61
handle on?
60
00:03:32.61 --> 00:03:34
And this is a good one.
61
00:03:34 --> 00:03:36
This is a good one.
62
00:03:36 --> 00:03:41
How would we decide
that -- if A was like this and
63
00:03:41 --> 00:03:44
if B was like this,
then we would look at x
64
00:03:44 --> 00:03:46.46
transpose A plus B x.
65
00:03:46.46 --> 00:03:49
I'm sure this is in the
homework.
66
00:03:49 --> 00:03:52
Now -- so we have x transpose A
x
67
00:03:52 --> 00:03:57.66
bigger than zero,
x transpose B x positive for
68
00:03:57.66 --> 00:04:02
all x,
so now I ask you about this
69
00:04:02 --> 00:04:02
guy.
70
00:04:02 --> 00:04:07
And of course,
you just add that and that and
71
00:04:07 --> 00:04:09
we get what we want.
72
00:04:09 --> 00:04:14
If A and B are positive
definite, so is A plus B.
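Here is a small numerical sketch of that addition argument -- my own Python/NumPy check, with assumed random matrices manufactured to be positive definite (C transpose C plus I is a convenient recipe):

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed examples: C^T C + I guarantees symmetric positive definite.
C1 = rng.standard_normal((3, 3))
C2 = rng.standard_normal((3, 3))
A = C1.T @ C1 + np.eye(3)
B = C2.T @ C2 + np.eye(3)

x = rng.standard_normal(3)
# x^T A x > 0 and x^T B x > 0, so their sum x^T (A + B) x > 0.
print(x @ A @ x > 0, x @ B @ x > 0, x @ (A + B) @ x > 0)
```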
73
00:04:14 --> 00:04:17
So that's what I've shown.
74
00:04:17 --> 00:04:18
So is A plus B.
75
00:04:18 --> 00:04:26
Just -- be sort of ready for
all the approaches through
76
00:04:26 --> 00:04:30
eigenvalues and through this
expression.
77
00:04:30 --> 00:04:36
And now, finally,
one more thought about positive
78
00:04:36 --> 00:04:42
definite is this combination
that came up in least squares.
79
00:04:42 --> 00:04:44
Can I do that?
80
00:04:44 --> 00:04:49
So now -- now suppose A is
rectangular, m by n.
81
00:04:49 --> 00:04:54
I'm sorry that I've
used
82
00:04:54 --> 00:04:59
the same letter A for the
positive definite matrices in
83
00:04:59 --> 00:05:04
the eigenvalue chapter that I
used way back in earlier
84
00:05:04 --> 00:05:08
chapters when the matrix was
rectangular.
85
00:05:08 --> 00:05:11
Now, that matrix -- a
rectangular matrix,
86
00:05:11 --> 00:05:14.52
no way is it positive definite.
87
00:05:14.52 --> 00:05:16
It's not symmetric.
88
00:05:16 --> 00:05:20
It's not even square in
general.
89
00:05:20 --> 00:05:27
But you remember that the key
for these rectangular ones was A
90
00:05:27 --> 00:05:28
transpose A.
91
00:05:28 --> 00:05:30
That's square.
92
00:05:30 --> 00:05:32
That's symmetric.
93
00:05:32 --> 00:05:40
Those are things we knew -- we
knew back when we met this thing
94
00:05:40 --> 00:05:46
in the least squares stuff,
in the projection stuff.
95
00:05:46 --> 00:05:54
But now we know something more
-- we can ask a more important
96
00:05:54 --> 00:06:00
question, a deeper question --
is it positive definite?
97
00:06:00 --> 00:06:02
And we sort of hope so.
98
00:06:02 --> 00:06:07
Like, we might -- in
analogy with numbers,
99
00:06:07 --> 00:06:12
this is like -- sort of like
the square of a number,
100
00:06:12 --> 00:06:14
and that's positive.
101
00:06:14 --> 00:06:18
So now I want to ask the matrix
question.
102
00:06:18 --> 00:06:22
Is A transpose
A positive definite?
103
00:06:22 --> 00:06:27
Okay, now it's -- so again,
it's a rectangular A that I'm
104
00:06:27 --> 00:06:30
starting with,
but it's the combination A
105
00:06:30 --> 00:06:34
transpose A that's the square,
symmetric and hopefully
106
00:06:34 --> 00:06:36
positive definite matrix.
107
00:06:36 --> 00:06:41
So how -- how do I see that it
is positive definite,
108
00:06:41 --> 00:06:44
or at least positive
semi-definite?
109
00:06:44 --> 00:06:46
You'll see that.
110
00:06:46 --> 00:06:50.83
Well, I don't know the
eigenvalues of this product.
111
00:06:50.83 --> 00:06:54
I don't want to work with the
pivots.
112
00:06:54 --> 00:06:59
The right thing -- the right
quantity to look at is this,
113
00:06:59 --> 00:07:04.63
x transpose A transpose A x -- x
transpose times my matrix
114
00:07:04.63 --> 00:07:05
times x.
115
00:07:05 --> 00:07:10.14
I'd like to see that this thing
-- that that expression is
116
00:07:10.14 --> 00:07:11
always positive.
117
00:07:11 --> 00:07:16
I'm not doing it with numbers,
I'm doing it with symbols.
118
00:07:16 --> 00:07:20.99
Do you see -- how do I see that
that expression comes out
119
00:07:20.99 --> 00:07:21
positive?
120
00:07:21 --> 00:07:25
I'm taking a rectangular matrix
A
121
00:07:25 --> 00:07:29.03
and an A transpose -- that
gives me something square
122
00:07:29.03 --> 00:07:33
symmetric, but now I want to see
that if I multiply -- that if I
123
00:07:33 --> 00:07:37
do this -- I form this quadratic
expression that I get this
124
00:07:37 --> 00:07:41
positive thing that goes upwards
when I graph it.
125
00:07:41 --> 00:07:44.13
How do I see that that's
positive,
126
00:07:44.13 --> 00:07:47
or at least it isn't negative,
anyway?
127
00:07:47 --> 00:07:51
We'll have to,
like, spend a minute on the
128
00:07:51 --> 00:07:56
question could it be zero,
but it can't be negative.
129
00:07:56 --> 00:07:59
Why can this never be negative?
130
00:07:59 --> 00:08:04
The argument is -- like the one
key idea in so many steps in
131
00:08:04 --> 00:08:09
linear algebra -- put those
parentheses
132
00:08:09 --> 00:08:11
in a good way.
133
00:08:11 --> 00:08:18.84
Put the parentheses around Ax
and what's the first part?
134
00:08:18.84 --> 00:08:23
What's this x transpose A
transpose?
135
00:08:23 --> 00:08:26
That is (Ax) transpose.
136
00:08:26 --> 00:08:28
So what do we have?
137
00:08:28 --> 00:08:33
We have the length squared of
Ax.
138
00:08:33 --> 00:08:41
We have the row vector
Ax transpose times the column
139
00:08:41 --> 00:08:46
vector Ax -- its length squared,
certainly greater than or
140
00:08:46 --> 00:08:48
possibly equal to zero.
141
00:08:48 --> 00:08:53
So we have to deal with this
little possibility.
142
00:08:53 --> 00:08:54
Could it be equal?
143
00:08:54 --> 00:08:58.65
Well, when could the length
squared be zero?
144
00:08:58.65 --> 00:09:02
Only if the vector is zero,
right?
145
00:09:02 --> 00:09:06
That's the only vector that has
length squared zero.
146
00:09:06 --> 00:09:11
So we have -- we would like to
-- I would like to get that
147
00:09:11 --> 00:09:14.03
possibility out of there.
148
00:09:14.03 --> 00:09:17
So I want to have Ax never
be zero,
149
00:09:17 --> 00:09:20
except of course for the zero
vector.
150
00:09:20 --> 00:09:25
How do I assure that Ax is
never zero?
151
00:09:25 --> 00:09:30
In other words,
how do I show that there's no
152
00:09:30 --> 00:09:31.91
null space of A?
153
00:09:31.91 --> 00:09:37
The rank should be -- so now
remember -- what's the rank when
154
00:09:37 --> 00:09:40
there's no null space?
155
00:09:40 --> 00:09:44
By no null space,
you know what I mean.
156
00:09:44 --> 00:09:47
Only the zero vector in the
null space.
157
00:09:47 --> 00:09:53
So if I have
an 11 by 5 matrix --
158
00:09:53 --> 00:09:58
so it's got 11 rows,
5 columns, when is there no
159
00:09:58 --> 00:09:59
null space?
160
00:09:59 --> 00:10:05.47
So the columns should be
independent -- what's the rank?
161
00:10:05.47 --> 00:10:06.92
Rank n -- which is 5 here.
162
00:10:06.92 --> 00:10:11.58
Independent columns,
so then I
163
00:10:11.58 --> 00:10:14.79
conclude yes,
positive definite.
164
00:10:14.79 --> 00:10:21.35
And this was the assumption --
then A transpose A is
165
00:10:21.35 --> 00:10:27
invertible -- the least squares
equations all work fine.
166
00:10:27 --> 00:10:32
And more than that -- the
matrix is even positive
167
00:10:32 --> 00:10:33
definite.
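That whole argument can be checked numerically -- a sketch of my own in Python/NumPy, not from the lecture, using an assumed random 11 by 5 matrix (a random matrix has independent columns, rank n = 5, with probability 1):

```python
import numpy as np

rng = np.random.default_rng(1)

# Assumed example: rectangular, 11 rows, 5 columns.
A = rng.standard_normal((11, 5))
AtA = A.T @ A

assert np.allclose(AtA, AtA.T)              # square and symmetric
assert np.linalg.matrix_rank(A) == 5        # independent columns: no null space
assert np.all(np.linalg.eigvalsh(AtA) > 0)  # so A^T A is positive definite

# x^T (A^T A) x is exactly the length squared of Ax:
x = rng.standard_normal(5)
print(np.isclose(x @ AtA @ x, np.linalg.norm(A @ x) ** 2))
```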
168
00:10:33 --> 00:10:38
And I just want to say one comment
about numerical things,
169
00:10:38 --> 00:10:42
with a positive definite
matrix,
170
00:10:42 --> 00:10:45
you never have to do row
exchanges.
171
00:10:45 --> 00:10:50
You never run into unsuitably
small numbers or zeroes in the
172
00:10:50 --> 00:10:51
pivot position.
173
00:10:51 --> 00:10:55
They're the right ones --
the great matrices to compute
174
00:10:55 --> 00:10:58
with, and they're the great
matrices to study.
175
00:10:58 --> 00:11:03
So that's why I wanted to
take this first ten
176
00:11:03 --> 00:11:08
minutes -- grab the first ten
minutes away from similar
177
00:11:08 --> 00:11:13
matrices and continue this
much more with positive
178
00:11:13 --> 00:11:14
definite.
179
00:11:14 --> 00:11:19
I'm really at this point,
now, coming close to the end of
180
00:11:19 --> 00:11:22
the heart of linear algebra.
181
00:11:22 --> 00:11:28
The positive definiteness
brought everything together.
182
00:11:28 --> 00:11:33
Similar matrices,
which is coming in the rest of
183
00:11:33 --> 00:11:38
this hour, is a key topic,
and please come on Monday.
184
00:11:38 --> 00:11:43
Monday is about what's called
the SVD, singular values.
185
00:11:43 --> 00:11:48
It has become a
central fact -- a central
186
00:11:48 --> 00:11:51
part of linear algebra.
187
00:11:51 --> 00:11:55
I mean, you can come after
Monday
188
00:11:55 --> 00:12:00
also, but -- that
singular value thing
189
00:12:00 --> 00:12:03
has made it into this course.
190
00:12:03 --> 00:12:08
Ten years ago,
five years ago it wasn't in the
191
00:12:08 --> 00:12:11
course, now it has to be.
192
00:12:11 --> 00:12:11.79
Okay.
193
00:12:11.79 --> 00:12:17
So can I begin today's
lecture proper with this idea
194
00:12:17 --> 00:12:19
of similar matrices.
195
00:12:19 --> 00:12:22
This is what similar matrices
mean.
196
00:12:22 --> 00:12:25
So here -- let's start again.
197
00:12:25 --> 00:12:26
I'll write it again.
198
00:12:26 --> 00:12:28
So A and B are similar.
199
00:12:28 --> 00:12:33
A and B are -- these
matrices -- I'm no longer
200
00:12:33 --> 00:12:37
talking about symmetric
matrices,
201
00:12:37 --> 00:12:43
at least no longer
expecting symmetric matrices.
202
00:12:43 --> 00:12:48
I'm talking about two square
matrices, n by n.
203
00:12:48 --> 00:12:52.38
A and B, they're n by n
matrices.
204
00:12:52.38 --> 00:12:56
And I'm introducing this word
similar.
205
00:12:56 --> 00:13:00
So I'm going to say what does
it mean?
206
00:13:00 --> 00:13:07.16
It means that they're connected
in the way --
207
00:13:07.16 --> 00:13:14
well, in the way I've written
here, so let me rewrite it.
208
00:13:14 --> 00:13:21
That means that for some matrix
M, which has to be invertible,
209
00:13:21 --> 00:13:29
because you'll see why -- one
matrix is: take the other
210
00:13:29 --> 00:13:37
matrix, multiply on the right by
M and on the left by M inverse.
211
00:13:37 --> 00:13:42
So the question is,
why that combination?
212
00:13:42 --> 00:13:47
But part of the answer you know
already.
213
00:13:47 --> 00:13:55
You remember -- we've done this
-- we've taken a matrix A -- so
214
00:13:55 --> 00:13:59
let's do an example of similar.
215
00:13:59 --> 00:14:06
Suppose the matrix A --
suppose it has a
216
00:14:06 --> 00:14:08
full set of eigenvectors.
217
00:14:08 --> 00:14:11.93
They go in this eigenvector
matrix S.
218
00:14:11.93 --> 00:14:16
Then the main point --
the main
219
00:14:16 --> 00:14:21
calculation of the whole chapter
was to use that
220
00:14:21 --> 00:14:27
eigenvector matrix S and its
inverse comes over there to
221
00:14:27 --> 00:14:32
produce the nicest possible
matrix lambda.
222
00:14:32 --> 00:14:36
Nicest possible because it's
diagonal.
223
00:14:36 --> 00:14:43
So in our new language,
this is saying A is similar to
224
00:14:43 --> 00:14:44
lambda.
225
00:14:44 --> 00:14:50.16
A is similar to lambda,
because there is a matrix,
226
00:14:50.16 --> 00:14:57
there is
an M, and this particular M is
227
00:14:57 --> 00:15:03
this important guy,
this eigenvector
228
00:15:03 --> 00:15:04
matrix.
229
00:15:04 --> 00:15:09
But if I take a different
matrix M and I look at M inverse
230
00:15:09 --> 00:15:16
A M, the result won't come out
diagonal, but it will come out a
231
00:15:16 --> 00:15:19.37
matrix B that's similar to A.
232
00:15:19.37 --> 00:15:25.59
Do you see what I'm
doing? I'm putting
233
00:15:25.59 --> 00:15:29
these matrices into families.
234
00:15:29 --> 00:15:34
All the matrices in one
family are similar to each
235
00:15:34 --> 00:15:34
other.
236
00:15:34 --> 00:15:39
They're all -- each one in this
family is connected to each
237
00:15:39 --> 00:15:44
other one by some matrix M and
the outstanding
238
00:15:44 --> 00:15:47.54
member of the family is the
diagonal guy.
239
00:15:47.54 --> 00:15:53
I mean, that's the simplest,
neatest matrix in this family
240
00:15:53 --> 00:15:56
of all the matrices that are
similar to A,
241
00:15:56 --> 00:15:58
the best one is lambda.
242
00:15:58 --> 00:16:03.25
But there are lots of others,
because I can take different --
243
00:16:03.25 --> 00:16:06
instead of S,
I can take any old matrix M,
244
00:16:06 --> 00:16:10
any old invertible matrix,
and do it.
245
00:16:10 --> 00:16:12
I'd better do an example.
246
00:16:12 --> 00:16:12
Okay.
247
00:16:12 --> 00:16:18
Suppose I take A as the
matrix two one one two.
248
00:16:18 --> 00:16:18
Okay.
249
00:16:18 --> 00:16:23
Do you know the eigenvalue
matrix for that?
250
00:16:23 --> 00:16:29
The eigenvalues of that matrix
are -- well, three and one.
251
00:16:29 --> 00:16:35
So that -- and the eigenvectors
would be easy to find.
252
00:16:35 --> 00:16:40
So this matrix is similar to
this one.
253
00:16:40 --> 00:16:47
But my point is -- but also,
I can also take my matrix,
254
00:16:47 --> 00:16:51
two one one two,
I could multiply it by -- let's
255
00:16:51 --> 00:16:55
see, what -- I'm just going to
cook up a matrix M here.
256
00:16:55 --> 00:16:59
Let me just
invent -- one four, zero one.
257
00:16:59 --> 00:17:03
And over here I'll put M
inverse, and because I happened
258
00:17:03 --> 00:17:08
to make that triangular,
I know that its inverse is
259
00:17:08 --> 00:17:10
that, right?
260
00:17:10 --> 00:17:15
So there's M inverse A M,
that's going to produce some
261
00:17:15 --> 00:17:18
matrix -- oh,
well, I've got to do the
262
00:17:18 --> 00:17:21
multiplication,
so hang on a second,
263
00:17:21 --> 00:17:27
let -- I'll just copy that one
minus four zero one and multiply
264
00:17:27 --> 00:17:32
these guys so I'm getting two
nine one
265
00:17:32 --> 00:17:33
and six, I think.
266
00:17:33 --> 00:17:39
Can you check it as I go,
because you -- see I'm just --
267
00:17:39 --> 00:17:45
so that's two minus four,
I'm getting a minus two nine
268
00:17:45 --> 00:17:50
minus 24 is a minus 15,
my God, how did I get this?
269
00:17:50 --> 00:17:53
And that's probably one and
six.
270
00:17:53 --> 00:17:55
So there's my matrix B.
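The board arithmetic can be double-checked -- a small sketch of my own in Python/NumPy, reproducing the lecture's A and M:

```python
import numpy as np

A = np.array([[2, 1],
              [1, 2]])
M = np.array([[1, 4],
              [0, 1]])
Minv = np.array([[1, -4],
                 [0, 1]])   # inverse of the triangular M

# B = M^-1 A M, exactly the board computation.
B = Minv @ A @ M
print(B)   # rows: -2 -15 and 1 6 -- trace 4, determinant 3
```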
271
00:17:55 --> 00:18:02
And there's my matrix lambda,
there's my matrix A and my
272
00:18:02 --> 00:18:07.22
point is these are all similar
matrices.
273
00:18:07.22 --> 00:18:13
They all have something in
common, besides being just two
274
00:18:13 --> 00:18:14
by two.
275
00:18:14 --> 00:18:17
They have something in common.
276
00:18:17 --> 00:18:20
And that's -- and what is it?
277
00:18:20 --> 00:18:27
What's the point about two
matrices that are built out of
278
00:18:27 --> 00:18:32.66
-- the B is built out of M
inverse A M.
279
00:18:32.66 --> 00:18:35
What is it that A and B have in
common?
280
00:18:35 --> 00:18:39
That's the main -- now I'm
telling you the main fact about
281
00:18:39 --> 00:18:41.07
similar matrices.
282
00:18:41.07 --> 00:18:43
They have the same eigenvalues.
283
00:18:43 --> 00:18:46
This is -- this chapter is
about eigenvalues,
284
00:18:46 --> 00:18:51
and that's why we're interested
in this family of matrices that
285
00:18:51 --> 00:18:54
have the same eigenvalues.
286
00:18:54 --> 00:18:57
What are the eigenvalues in
this example?
287
00:18:57 --> 00:18:58
Lambda.
288
00:18:58 --> 00:19:01
The eigenvalues of that I could
compute.
289
00:19:01 --> 00:19:05
The eigenvalues of that I can
compute really fast.
290
00:19:05 --> 00:19:10
So the eigenvalues are three
and one -- for this for sure.
291
00:19:10 --> 00:19:16
Now -- do you see why
the eigenvalues are
292
00:19:16 --> 00:19:18.05
three and one for that one?
293
00:19:18.05 --> 00:19:21
If I tell you the eigenvalues
are three and one,
294
00:19:21 --> 00:19:25
you quickly process
the trace -- three plus one
295
00:21:25 --> 00:21:29
agrees with four -- and you
process the determinant,
296
00:19:29 --> 00:19:34
three times one -- the
determinant is three and you say
297
00:19:34 --> 00:19:35
yes, it's right.
298
00:19:35 --> 00:19:39.84
Now I'm hoping that the
eigenvalues of this thing are
299
00:19:39.84 --> 00:19:40
three and one.
300
00:19:40 --> 00:19:45
May I process the trace and the
determinant for that one?
301
00:19:45 --> 00:19:47
What's the trace here?
302
00:19:47 --> 00:19:51.21
The trace of this matrix is
four -- minus two plus six --
303
00:19:51.21 --> 00:19:53
and that's what it should be.
304
00:19:53 --> 00:19:57
What's the determinant?
Minus twelve
305
00:19:57 --> 00:19:58.97
plus fifteen is three.
306
00:19:58.97 --> 00:20:00
The determinant is three.
307
00:20:00 --> 00:20:05
The eigenvalues of that matrix
are also three and one.
308
00:20:05 --> 00:20:09
And you see I created this
matrix just like -- I just took
309
00:20:09 --> 00:20:14.13
any M, like, one that popped
into my head and computed M
310
00:20:14.13 --> 00:20:21
inverse A M, got that matrix,
it didn't look anything special
311
00:20:21 --> 00:20:30.06
but it's -- like A itself,
it has those eigenvalues three
312
00:20:30.06 --> 00:20:31
and one.
313
00:20:31 --> 00:20:38
So that's the main fact and let
me write it down.
314
00:20:38 --> 00:20:44
Similar matrices have the same
eigenvalues.
315
00:20:44 --> 00:20:50
So I'll just put that as an
important point.
316
00:20:50 --> 00:20:54
And think why.
317
00:20:54 --> 00:20:55
Why is that?
318
00:20:55 --> 00:20:59.53
So that's what that family of
matrices is.
319
00:20:59.53 --> 00:21:04
The matrices that are similar
to this A here are all the
320
00:21:04 --> 00:21:08
matrixes with eigenvalues three
and one.
321
00:21:08 --> 00:21:12
Every matrix with eigenvalues
three and one,
322
00:21:12 --> 00:21:18
there's some M that connects
this guy to the one you think
323
00:21:18 --> 00:21:20
of.
324
00:21:20 --> 00:21:23
And then of course,
the most special guy in the
325
00:21:23 --> 00:21:27.84
whole family is the diagonal one
with eigenvalues three and one
326
00:21:27.84 --> 00:21:29
sitting there on the diagonal.
327
00:21:29 --> 00:21:34
But also, I could find -- I
mean, tell me just a couple more
328
00:21:34 --> 00:21:35
members of the family.
329
00:21:35 --> 00:21:39
Another -- tell me another
matrix that has
330
00:21:39 --> 00:21:41
eigenvalues three and one.
331
00:21:41 --> 00:21:46
Well, let's see,
I -- oh, I'll just make it
332
00:21:46 --> 00:21:47
triangular.
333
00:21:47 --> 00:21:49
That's in the family.
334
00:21:49 --> 00:21:54
There is some M that
connects to this one.
335
00:21:54 --> 00:21:56
And -- and also this.
336
00:21:56 --> 00:22:01
There's some matrix M -- so
that M inverse A M comes out to
337
00:22:01 --> 00:22:03
be that.
338
00:22:03 --> 00:22:07.09
There's a whole family here.
339
00:22:07.09 --> 00:22:11
And they all share the same
eigenvalues.
340
00:22:11 --> 00:22:13
So why is that?
341
00:22:13 --> 00:22:13
Okay.
342
00:22:13 --> 00:22:20
I'm going to start -- the only
possibility is to start with Ax
343
00:22:20 --> 00:22:22
equal lambda x.
344
00:22:22 --> 00:22:27
Okay, so suppose A has the
eigenvalue lambda.
345
00:22:27 --> 00:22:34
Now I want to get B into the
picture here somehow.
346
00:22:34 --> 00:22:37
You remember B is M inverse A
M.
347
00:22:37 --> 00:22:41
Let's just remember that over
here.
348
00:22:41 --> 00:22:42
B is M inverse A M.
349
00:22:42 --> 00:22:46
And I want to see its
eigenvalues.
350
00:22:46 --> 00:22:50
How am I going to get M inverse A
M into this equation?
351
00:22:50 --> 00:22:53
Let me just sort of do it.
352
00:22:53 --> 00:22:57
I'll put an M times an M
inverse in there,
353
00:22:57 --> 00:22:58
right?
354
00:22:58 --> 00:23:02
That was -- I haven't changed
the
355
00:23:02 --> 00:23:05
left-hand side,
so I better not change the
356
00:23:05 --> 00:23:07
right-hand side.
357
00:23:07 --> 00:23:11
So everybody's okay so far,
I just put in there -- see,
358
00:23:11 --> 00:23:16
I want to get a -- so now I'll
multiply on the left by M
359
00:23:16 --> 00:23:21
inverse -- I have to do the same
to this side and that number
360
00:23:21 --> 00:23:27
lambda's just a number,
so it factors out in the front.
361
00:23:27 --> 00:23:31
So what I have here is this was
safe.
362
00:23:31 --> 00:23:34
I did the same thing to both
sides.
363
00:23:34 --> 00:23:36.52
And now I've got B.
364
00:23:36.52 --> 00:23:37
There's B.
365
00:23:37 --> 00:23:43
That's B times this vector M
inverse x is equal to lambda
366
00:23:43 --> 00:23:46
times this vector M inverse x.
367
00:23:46 --> 00:23:48.56
So what have I learned?
368
00:23:48.56 --> 00:23:54
I've learned that B times some
vector is lambda times that
369
00:23:54 --> 00:23:55
vector.
370
00:23:55 --> 00:23:59
I've learned that lambda is an
eigenvalue of B also.
371
00:23:59 --> 00:24:03
So this is it --
if lambda's an eigenvalue of
372
00:24:03 --> 00:24:08
A, then I can write it this way
and I discover that lambda's an
373
00:24:08 --> 00:24:09
eigenvalue of B.
374
00:24:09 --> 00:24:12
That's the end of the proof.
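The proof can be watched happen numerically -- my own Python/NumPy sketch, reusing the lecture's A and M: if Ax = lambda x, then B times M inverse x equals lambda times M inverse x.

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
M = np.array([[1.0, 4.0],
              [0.0, 1.0]])
B = np.linalg.inv(M) @ A @ M

lam, X = np.linalg.eig(A)          # eigenvalues and eigenvectors of A
for j in range(2):
    x = X[:, j]
    y = np.linalg.inv(M) @ x       # the mapped eigenvector M^-1 x
    # B (M^-1 x) = lambda (M^-1 x): same eigenvalue, new eigenvector.
    assert np.allclose(B @ y, lam[j] * y)

print(np.sort(lam))                # [1. 3.] -- shared by A and B
```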
375
00:24:12 --> 00:24:15
The eigenvector didn't stay the
same.
376
00:24:15 --> 00:24:20.58
Of course I don't expect the
eigenvectors to stay the same.
377
00:24:20.58 --> 00:24:25
If all the eigenvalues are the
same and all the eigenvectors
378
00:24:25 --> 00:24:29
are the same,
then probably the matrix is the
379
00:24:29 --> 00:24:29.5
same.
380
00:24:29.5 --> 00:24:34
Here the eigenvector changes --
so the
381
00:24:34 --> 00:24:38
point is then the eigenvector of
B --
382
00:24:38 --> 00:24:43
is M inverse times the
eigenvector of A.
383
00:24:43 --> 00:24:44
Okay.
384
00:24:44 --> 00:24:47
That's all that this says here.
385
00:24:47 --> 00:24:54
The eigenvector of A was x,
and so the M inverse -- similar
386
00:24:54 --> 00:25:00
matrices then have the same
eigenvalues and their
387
00:25:00 --> 00:25:04
eigenvectors are just moved
around.
388
00:25:04 --> 00:25:09
Of course,
that's what we -- that's what
389
00:25:09 --> 00:25:13
happened way back -- and the
most important similar matrixes
390
00:25:13 --> 00:25:15
are to diagonalize.
391
00:25:15 --> 00:25:18
So what was the point when we
diagonalized?
392
00:25:18 --> 00:25:21
The eigenvalues stayed the
same, of course.
393
00:25:21 --> 00:25:22
Three and one.
394
00:25:22 --> 00:25:24.73
What about the eigenvectors?
395
00:25:24.73 --> 00:25:29
The eigenvectors were whatever
they were for the matrix A,
396
00:25:29 --> 00:25:33
but then what were the
eigenvectors for the diagonal
397
00:25:33 --> 00:25:34.39
matrix?
398
00:25:34.39 --> 00:25:38.58
They're just -- what are the
eigenvectors of a diagonal
399
00:25:38.58 --> 00:25:39
matrix?
400
00:25:39 --> 00:25:41
They're just one zero and zero
one.
401
00:25:41 --> 00:25:44
So this step made the
eigenvectors nice,
402
00:25:44 --> 00:25:49
didn't change the eigenvalues,
and every time we don't change
403
00:25:49 --> 00:25:52
the eigenvalues.
404
00:25:52 --> 00:25:54
Same eigenvalues.
405
00:25:54 --> 00:25:54
Okay.
406
00:25:54 --> 00:26:00
Now -- so I've got all these
matrices in -- I've got this
407
00:26:00 --> 00:26:06
family of matrices with
eigenvalues three and one.
408
00:26:06 --> 00:26:07
Fine.
409
00:26:07 --> 00:26:09
That's a nice family.
410
00:26:09 --> 00:26:15
It's nice because those two
eigenvalues are different.
411
00:26:15 --> 00:26:21
I now have to
get into
412
00:26:21 --> 00:26:27
the less happy possibility that
the two eigenvalues could be the
413
00:26:27 --> 00:26:27
same.
414
00:26:27 --> 00:26:32
And then it's a little
trickier, because you remember
415
00:26:32 --> 00:26:36
when two eigenvalues are the
same, what's the bad
416
00:26:36 --> 00:26:37
possibility?
417
00:26:37 --> 00:26:43.16
That there might not be
a full set of eigenvectors
418
00:26:43.16 --> 00:26:47
and we
might not be able to
419
00:26:47 --> 00:26:48
diagonalize.
420
00:26:48 --> 00:26:52
So I need to discuss the bad
case.
421
00:26:52 --> 00:26:56
So the bad -- can I just say
bad?
422
00:26:56 --> 00:27:03
If lambda one equals lambda
two, then the matrix might not
423
00:27:03 --> 00:27:05
be diagonalizable.
424
00:27:05 --> 00:27:11
Suppose lambda one equals
lambda two equals four,
425
00:27:11 --> 00:27:11
say.
426
00:27:11 --> 00:27:21.11
Now if I look at the family of
matrices with eigenvalues four
427
00:27:21.11 --> 00:27:27
and four, well,
one possibility occurs to me.
428
00:27:27 --> 00:27:35
One family with eigenvalues
four and four has this matrix in
429
00:27:35 --> 00:27:39
it, four times the identity.
430
00:27:39 --> 00:27:47
Then another -- but now I want
to ask also about the matrix
431
00:27:47 --> 00:27:50
four one, zero four.
432
00:27:50 --> 00:27:56
And my point --
here's the whole point of this
433
00:27:56 --> 00:28:02.11
-- of this bad stuff,
is that this guy is not in the
434
00:28:02.11 --> 00:28:04
same family with that one.
435
00:28:04 --> 00:28:10
The family of matrices
that have eigenvalues four and
436
00:28:10 --> 00:28:13
four is two families.
437
00:28:13 --> 00:28:19
There's this total loner here
who's in a family off -- right?
438
00:28:19 --> 00:28:21
Just by himself.
439
00:28:21 --> 00:28:26
And all the others are in with
this guy.
440
00:28:26 --> 00:28:29
So the big family includes this
one.
441
00:28:29 --> 00:28:34
And it includes a whole lot of
other matrices,
442
00:28:34 --> 00:28:38
all -- in fact,
in this two by two case,
443
00:28:38 --> 00:28:44
you see what I mean --
so, when I use
444
00:28:44 --> 00:28:50
this word family -- in a
family, I mean
445
00:28:50 --> 00:28:51.52
they're similar.
446
00:28:51.52 --> 00:28:56.99
So my point is that the only
matrix that's similar to this is
447
00:28:56.99 --> 00:28:57
itself.
448
00:28:57 --> 00:29:03
The only matrix that's similar
to four times the identity is
449
00:29:03 --> 00:29:05
four times the identity.
450
00:29:05 --> 00:29:06
It's off by itself.
451
00:29:06 --> 00:29:08
Why is that?
452
00:29:08 --> 00:29:12
If this is my matrix,
four times the identity,
453
00:29:12 --> 00:29:17
and I take
it, I multiply on the right by
454
00:29:17 --> 00:29:21
any matrix M,
I multiply on the left by M
455
00:29:21 --> 00:29:23
inverse, what do I get?
456
00:29:23 --> 00:29:26
This is any M,
but what's the result?
457
00:29:26 --> 00:29:31
Well, factoring out a four,
that's just the identity matrix
458
00:29:31 --> 00:29:32
in there.
459
00:29:32 --> 00:29:37
So then the M inverse cancels
the M, so I've just got this
460
00:29:37 --> 00:29:40.71
matrix back
again.
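A one-line check of that cancellation -- my own Python/NumPy sketch; any invertible M works, and I'm reusing the triangular M from earlier as the assumed example:

```python
import numpy as np

# Assumed example M (any invertible matrix gives the same answer).
M = np.array([[1.0, 4.0],
              [0.0, 1.0]])
fourI = 4 * np.eye(2)

# M^-1 (4I) M = 4 M^-1 M = 4I: the multiples of the identity never
# leave their one-member family.
result = np.linalg.inv(M) @ fourI @ M
print(result)   # still 4 times the identity
```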
461
00:29:40.71 --> 00:29:45
So whatever the M is,
I'm not getting any more
462
00:29:45 --> 00:29:47
members of the family.
463
00:29:47 --> 00:29:53
So this is one small family,
because it only has one person.
464
00:29:53 --> 00:29:56
One matrix, excuse me.
465
00:29:56 --> 00:30:01
I think of these matrices as
people by this point,
466
00:30:01 --> 00:30:03
in 18.06.
467
00:30:03 --> 00:30:09
Okay, the other family includes
all the rest --
468
00:30:09 --> 00:30:14
all other matrices that have
eigenvalues four and four.
469
00:30:14 --> 00:30:17
This is somehow the best one in
that family.
470
00:30:17 --> 00:30:20
See, I can't make it diagonal.
471
00:30:20 --> 00:30:23.48
If I -- if it's diagonal,
it's this one.
472
00:30:23.48 --> 00:30:25
It's on its own,
by itself.
473
00:30:25 --> 00:30:29.81
So I have to think,
okay, what's the nearest I can
474
00:30:29.81 --> 00:30:31
get
to diagonal?
475
00:30:31 --> 00:30:34
But it will not be
diagonalizable.
476
00:30:34 --> 00:30:38
That -- do you know that that
matrix is not diagonalizable?
477
00:30:38 --> 00:30:42
Of course, because if it was
diagonalizable,
478
00:30:42 --> 00:30:45
it would be similar to that,
which it isn't.
479
00:30:45 --> 00:30:48
The eigenvalues of this are
four
480
00:30:48 --> 00:30:53.06
and four, but what's the catch
with that matrix?
481
00:30:53.06 --> 00:30:55
It's only got one eigenvector.
482
00:30:55 --> 00:30:59
That's a non-diagonalizable
matrix.
483
00:30:59 --> 00:31:01
Only one eigenvector.
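A small sketch (my own check, not done on the board) of why there is only one eigenvector: the rank of A minus four times the identity is one, so its null space is one-dimensional.

```python
import numpy as np

# The Jordan block [[4, 1], [0, 4]] has eigenvalue 4 twice but only one
# independent eigenvector: the null space of (A - 4I) is 1-dimensional.
A = np.array([[4.0, 1.0], [0.0, 4.0]])
rank = np.linalg.matrix_rank(A - 4 * np.eye(2))
assert rank == 1    # nullity = 2 - 1 = 1: one eigenvector only
```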
484
00:31:01 --> 00:31:04
And somehow,
if I made that one into a ten
485
00:31:04 --> 00:31:08
or to a million,
I could find an M,
486
00:31:08 --> 00:31:11
it's in the family,
it's similar.
487
00:31:11 --> 00:31:16
But the best --
so the best guy in this family
488
00:31:16 --> 00:31:17
is this one.
489
00:31:17 --> 00:31:22
And this is called the Jordan
-- so this guy Jordan picked out
490
00:31:22 --> 00:31:26
-- so he, like,
studied, these families of
491
00:31:26 --> 00:31:31
matrixes, and each family,
he picked out the nicest,
492
00:31:31 --> 00:31:32
most diagonal one.
493
00:31:32 --> 00:31:38
But not completely diagonal,
because there's nobody -- there
494
00:31:38 --> 00:31:43
isn't a diagonal matrix in this
family, so there's a one up
495
00:31:43 --> 00:31:45
there in the Jordan form.
496
00:31:45 --> 00:31:45
Okay.
497
00:31:45 --> 00:31:50.2
I think we've got to see some
more matrixes in that family.
498
00:31:50.2 --> 00:31:53
So, all right,
let me -- let's just think of
499
00:31:53 --> 00:31:57
some other matrixes whose
eigenvalues are four and four
500
00:31:57 --> 00:32:02
but they're not
four times the identity.
501
00:32:02 --> 00:32:07
So -- and I believe that --
that this -- that all the
502
00:32:07 --> 00:32:12
examples we pick up will be
similar to each other and -- do
503
00:32:12 --> 00:32:17
you see why -- in this topic of
similar matrixes,
504
00:32:17 --> 00:32:20
the climax is the Jordan form.
505
00:32:20 --> 00:32:26
So it says that every
matrix -- I'll write down what
506
00:32:26 --> 00:32:30
the Jordan form -- what Jordan
discovered.
507
00:32:30 --> 00:32:34
He found the best looking
matrix in each family.
508
00:32:34 --> 00:32:40.68
And that's -- then we've got --
then we've covered all matrixes
509
00:32:40.68 --> 00:32:44
including the non-diagonalizable
one.
510
00:32:44 --> 00:32:47
That -- that's the point,
that in some way,
511
00:32:47 --> 00:32:51
Jordan completed the
diagonalization by coming as
512
00:32:51 --> 00:32:54
near as he could,
which is his Jordan form.
513
00:32:54 --> 00:32:56.98
And therefore,
if you want to cover all
514
00:32:56.98 --> 00:33:00
matrixes, you've got to get him
in the picture.
515
00:33:00 --> 00:33:04
It used to be -- when I took
eighteen oh six,
516
00:33:04 --> 00:33:09
that was the climax of the
course, this Jordan form stuff.
517
00:33:09 --> 00:33:14
I think it's not the climax of
linear algebra anymore,
518
00:33:14 --> 00:33:20
because -- it's not easy to
find this Jordan form for a
519
00:33:20 --> 00:33:24.06
general matrix,
because it depends on these
520
00:33:24.06 --> 00:33:28
eigenvalues being exactly the
same.
521
00:33:28 --> 00:33:33
You'd have to know exactly the
eigenvalues and it -- and you'd
522
00:33:33 --> 00:33:38
have to know exactly the rank
and the slightest change in
523
00:33:38 --> 00:33:43
numbers will change those
eigenvalues, change the rank and
524
00:33:43 --> 00:33:47
therefore the whole thing is
numerically not an -- a good
525
00:33:47 --> 00:33:48
thing.
526
00:33:48 --> 00:33:53
But for algebra,
it's the right thing to
527
00:33:53 --> 00:33:55.74
understand this family.
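Here is a small numerical sketch of that fragility (my illustration, with a hypothetical perturbation size): changing one entry of the Jordan block by ten to the minus ten splits the double eigenvalue by about ten to the minus five, so the Jordan structure jumps from one two-by-two block to a diagonal matrix.

```python
import numpy as np

# Perturb the Jordan block [[4, 1], [0, 4]] by a tiny amount in the (2,1) entry.
J = np.array([[4.0, 1.0], [0.0, 4.0]])
Jp = J.copy()
Jp[1, 0] = 1e-10                 # hypothetical perturbation size
w = np.linalg.eigvals(Jp)
# The repeated eigenvalue 4 splits by roughly 2*sqrt(1e-10) = 2e-5,
# far larger than the perturbation itself.
assert abs(w[0] - w[1]) > 1e-6
```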
528
00:33:55.74 --> 00:34:02
So just tell me another matrix
-- a few more matrixes -- so
529
00:34:02 --> 00:34:05
more members of the family.
530
00:34:05 --> 00:34:10
Let me put down again what the
best one is.
531
00:34:10 --> 00:34:10
Okay.
532
00:34:10 --> 00:34:12
All right.
533
00:34:12 --> 00:34:14
Some more matrixes.
534
00:34:14 --> 00:34:18
Let's see, what am I looking for?
535
00:34:18 --> 00:34:21
I'm looking for matrixes whose
trace is what?
536
00:34:21 --> 00:34:25
So if I'm looking for more
matrixes in the family,
537
00:34:25 --> 00:34:28
they'll all have the same
eigenvalues, four and four.
538
00:34:28 --> 00:34:30
So their trace will be eight.
539
00:34:30 --> 00:34:34
So why don't I just take,
like, five and three -- I've
540
00:34:34 --> 00:34:38
got the trace right,
now the determinant
541
00:34:38 --> 00:34:39
should be what?
542
00:34:39 --> 00:34:40.17
Sixteen.
543
00:34:40.17 --> 00:34:45
So I just fix this up -- shall
I put maybe a one and a minus
544
00:34:45 --> 00:34:46
one there?
545
00:34:46 --> 00:34:46
Okay.
546
00:34:46 --> 00:34:50
There's a matrix with
eigenvalues four and four,
547
00:34:50 --> 00:34:56
because the trace is eight and
the determinant is sixteen.
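A quick numerical check of this example (diagonal entries five and three, off-diagonal one and minus one), sketched with numpy:

```python
import numpy as np

# Trace 8 and determinant 16 force the characteristic polynomial
# (lambda - 4)^2, so the eigenvalues are 4 and 4.
A = np.array([[5.0, 1.0], [-1.0, 3.0]])
assert np.isclose(np.trace(A), 8.0)
assert np.isclose(np.linalg.det(A), 16.0)
# rank(A - 4I) = 1, so there is only one eigenvector: not diagonalizable.
assert np.linalg.matrix_rank(A - 4 * np.eye(2)) == 1
```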
548
00:34:56 --> 00:35:00
And I don't think it's
diagonalizable.
549
00:35:00 --> 00:35:03
Do you know why it's not
diagonalizable?
550
00:35:03 --> 00:35:06.42
Because if it was
diagonalizable,
551
00:35:06.42 --> 00:35:09
the diagonal form would have to
be this.
552
00:35:09 --> 00:35:14
But I can't get to that form,
because whatever I do with any
553
00:35:14 --> 00:35:17
M inverse and M I stay with that
form.
554
00:35:17 --> 00:35:20
I could never get -- connect
those.
555
00:35:20 --> 00:35:24
So I can put down more members
--
556
00:35:24 --> 00:35:26.56
here -- here's another easy
one.
557
00:35:26.56 --> 00:35:30.64
I could put the four and the
four and a seventeen down there.
558
00:35:30.64 --> 00:35:32
All these matrixes are similar.
559
00:35:32 --> 00:35:36.76
If I'm -- I could find an M
that would show that that one is
560
00:35:36.76 --> 00:35:38.12
similar to that one.
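For the lower-triangular member with the seventeen, one M that exhibits the similarity (my own choice, found by solving J M = M B, not one given in the lecture) is:

```python
import numpy as np

# [[4, 0], [17, 4]] is similar to the Jordan block [[4, 1], [0, 4]]:
# the M below satisfies M^-1 J M = B.
J = np.array([[4.0, 1.0], [0.0, 4.0]])
B = np.array([[4.0, 0.0], [17.0, 4.0]])
M = np.array([[0.0, 1.0], [17.0, 0.0]])   # one M that works (hypothetical choice)
assert np.allclose(np.linalg.inv(M) @ J @ M, B)
```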
561
00:35:38.12 --> 00:35:41
And in -- you can see the
general picture is I can take
562
00:35:41 --> 00:35:45
any a and any 8-a
here and any -- oh,
563
00:35:45 --> 00:35:49.12
I don't know,
whatever you put it'd be --
564
00:35:49.12 --> 00:35:50
anyway, you can see.
565
00:35:50 --> 00:35:55
I can fill this in,
fill this in to make the trace
566
00:35:55 --> 00:36:00
equal eight, the determinant
equal sixteen, I get all that family
567
00:36:00 --> 00:36:04
of matrixes and they're all
similar.
568
00:36:04 --> 00:36:07
So we see what eigenvalues do.
569
00:36:07 --> 00:36:13
They're all similar and they
all have only one eigenvector.
570
00:36:13 --> 00:36:19
So I -- if I'm -- if you were
going to -- allow me to add to
571
00:36:19 --> 00:36:23
this picture,
they have the same lambdas and
572
00:36:23 --> 00:36:29
they also have the same number
of independent eigenvectors.
573
00:36:29 --> 00:36:35
Because if I get an eigenvector
for A, I get one for --
574
00:36:35 --> 00:36:37
for A, I get one for B also.
575
00:36:37 --> 00:36:41
So -- and same number of
eigenvectors.
576
00:36:41 --> 00:36:46
But even more than that -- even
more than that -- I mean,
577
00:36:46 --> 00:36:50
it's not enough just to count
eigenvectors.
578
00:36:50 --> 00:36:55
Yes, let me give you an example
why it's not even enough to
579
00:36:55 --> 00:36:58
count eigenvectors.
580
00:36:58 --> 00:36:59
So another example.
581
00:36:59 --> 00:37:05
So here are some matrixes --
oh, let me make them four by
582
00:37:05 --> 00:37:09
four -- okay,
here -- here's a matrix.
583
00:37:09 --> 00:37:15.11
I mean, like if you want
nightmares, think about matrixes
584
00:37:15.11 --> 00:37:16
like these.
585
00:37:16 --> 00:37:21
Uh, so a one off the diagonal
-- say a one there,
586
00:37:21 --> 00:37:26
how many -- what are the
eigenvalues of that matrix?
587
00:37:26 --> 00:37:29.54
Oh, I mean --
okay.
588
00:37:29.54 --> 00:37:34
What are the eigenvalues of
that matrix?
589
00:37:34 --> 00:37:35
Please.
590
00:37:35 --> 00:37:37
Four zeros, right?
591
00:37:37 --> 00:37:41
So we're really getting bad
matrixes now.
592
00:37:41 --> 00:37:47
So I mean, this is,
like -- Jordan was a good guy,
593
00:37:47 --> 00:37:54
but he had to think about
matrixes that all --
594
00:37:54 --> 00:37:59
that had, like -- an eigenvalue
repeated four times.
595
00:37:59 --> 00:38:02
How many eigenvectors does that
matrix have?
596
00:38:02 --> 00:38:07.92
Well, I'm -- eigenvectors will
be -- since the eigenvalue is
597
00:38:07.92 --> 00:38:11
zero, eigenvectors will be in
the null space,
598
00:38:11 --> 00:38:12.19
right?
599
00:38:12.19 --> 00:38:15
I'm -- eigenvectors have got to
be
600
00:38:15 --> 00:38:17
A x equal zero x.
601
00:38:17 --> 00:38:20
So what's the dimension of the
null space?
602
00:38:20 --> 00:38:21.27
Two.
603
00:38:21.27 --> 00:38:22
Somebody said two.
604
00:38:22 --> 00:38:24
And that's right.
605
00:38:24 --> 00:38:25.22
How -- why?
606
00:38:25.22 --> 00:38:29
Because you ask what's the rank
of that matrix,
607
00:38:29 --> 00:38:31
the rank is obviously two.
608
00:38:31 --> 00:38:35
The number of independent rows
is
609
00:38:35 --> 00:38:39
two, the number of independent
columns is two,
610
00:38:39 --> 00:38:45
the rank is two so the null --
the dimension of the null space
611
00:38:45 --> 00:38:49
is four minus two,
so it's got two eigenvectors.
612
00:38:49 --> 00:38:51.4
Two eigenvectors.
613
00:38:51.4 --> 00:38:54
Two independent eigenvectors.
614
00:38:54 --> 00:38:55
All right.
615
00:38:55 --> 00:38:59
The dimension of the null space
is two.
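Sketching this count numerically (assuming the board matrix has its two ones in the (1,2) and (2,3) positions, which matches the rank of two):

```python
import numpy as np

# 4x4 matrix with ones at positions (1,2) and (2,3), zeros elsewhere.
A = np.zeros((4, 4))
A[0, 1] = 1.0
A[1, 2] = 1.0
assert np.linalg.matrix_rank(A) == 2   # rank 2
# All eigenvalues are 0, so eigenvectors fill the null space:
# nullity = 4 - 2 = 2 independent eigenvectors.
```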
616
00:38:59 --> 00:39:04
Now, suppose I change this zero
to a seven.
617
00:39:04 --> 00:39:09
The eigenvalues are all still
zero, how -- what about -- how
618
00:39:09 --> 00:39:11
many eigenvectors?
619
00:39:11 --> 00:39:17
What's the dimension of the --
what's the rank of this matrix
620
00:39:17 --> 00:39:18
now?
621
00:39:18 --> 00:39:19
Still two, right?
622
00:39:19 --> 00:39:21
So it's okay.
623
00:39:21 --> 00:39:25
And actually,
this would be similar to the
624
00:39:25 --> 00:39:29
one
that had a zero in there.
625
00:39:29 --> 00:39:33
But it's not as beautiful,
Jordan picked this one.
626
00:39:33 --> 00:39:38
He picked -- he put ones -- we
have a one on the -- above the
627
00:39:38 --> 00:39:42
diagonal for every missing
eigenvector, and here we're
628
00:39:42 --> 00:39:45.84
missing two because we've got
two,
629
00:39:45.84 --> 00:39:51
so we've got two eigenvectors
and two are missing,
630
00:39:51 --> 00:39:55
because it's a four by four
matrix.
631
00:39:55 --> 00:40:03
Okay, now -- but I was going to
give you this second example.
632
00:40:03 --> 00:40:07.15
0 1 0 0, let me just move the
one.
633
00:40:07.15 --> 00:40:08
Oop, not there.
634
00:40:08 --> 00:40:14
Off the diagonal and zero zero
zero
635
00:40:14 --> 00:40:15
zero zero.
636
00:40:15 --> 00:40:16
Okay.
637
00:40:16 --> 00:40:19
So now tell me about this
matrix.
638
00:40:19 --> 00:40:24.18
Its eigenvalues are four zeroes
again.
639
00:40:24.18 --> 00:40:26
Its rank is two again.
640
00:40:26 --> 00:40:31
So it has two eigenvectors and
two missing.
641
00:40:31 --> 00:40:36
But the darn thing is not
similar to that one.
642
00:40:36 --> 00:40:43
A -- a count of eigenvectors
looks like these could be
643
00:40:43 --> 00:40:47
similar,
but they're not.
644
00:40:47 --> 00:40:51
Jordan -- see,
this is like -- a little three
645
00:40:51 --> 00:40:55
by three block and a little one
by one block.
646
00:40:55 --> 00:41:00
And this one is like a two by
two block and a two by two
647
00:41:00 --> 00:41:05
block, and those blocks are
called Jordan blocks.
648
00:41:05 --> 00:41:09
So let me say what is a Jordan
block?
649
00:41:09 --> 00:41:16
J block number i has -- so a
Jordan block has a repeated
650
00:41:16 --> 00:41:22
eigenvalue, lambda i,
lambda i on the diagonal.
651
00:41:22 --> 00:41:26
Zeroes below and ones above.
652
00:41:26 --> 00:41:31
So there's a block with this
guy repeated,
653
00:41:31 --> 00:41:35
but it only has one
eigenvector.
654
00:41:35 --> 00:41:42
So a Jordan block has one
eigenvector only.
655
00:41:42 --> 00:41:49.13
This one has one eigenvector,
this block has one eigenvector
656
00:41:49.13 --> 00:41:50
and we get two.
657
00:41:50 --> 00:41:56.57
This block has one eigenvector
and that block has one
658
00:41:56.57 --> 00:41:59
eigenvector and we get two.
659
00:41:59 --> 00:42:04.01
So -- but the blocks are
different sizes.
660
00:42:04.01 --> 00:42:10
And that -- it turns out Jordan
worked out -- then this is not
661
00:42:10 --> 00:42:16
similar, not
similar to this one.
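A way to see they are not similar (a check I'm adding, not one done on the board): similar matrices give the same rank for every power, and the rank of A squared already tells the three-plus-one split apart from the two-plus-two split.

```python
import numpy as np

# Same eigenvalues (0, 0, 0, 0), same count of eigenvectors (two each),
# but different Jordan block sizes: 3+1 versus 2+2.
A1 = np.zeros((4, 4)); A1[0, 1] = A1[1, 2] = 1.0   # one 3x3 block, one 1x1 block
A2 = np.zeros((4, 4)); A2[0, 1] = A2[2, 3] = 1.0   # two 2x2 blocks
# Similar matrices have equal ranks for every power; here A^2 separates them.
assert np.linalg.matrix_rank(A1 @ A1) == 1
assert np.linalg.matrix_rank(A2 @ A2) == 0
```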
662
00:42:16 --> 00:42:22
So the -- so I'm,
like, giving you the whole
663
00:42:22 --> 00:42:27
story -- well,
not the whole story,
664
00:42:27 --> 00:42:35
but the main themes of the
story -- is here's Jordan's
665
00:42:35 --> 00:42:36
theorem.
666
00:42:36 --> 00:42:44
Every square matrix A is
similar to a Jordan matrix J.
667
00:42:44 --> 00:42:48
And what's a Jordan matrix J?
668
00:42:48 --> 00:42:55
It's a matrix with these
blocks,
669
00:42:55 --> 00:43:01
block -- Jordan block number
one, Jordan block number two and
670
00:43:01 --> 00:43:02
so on.
671
00:43:02 --> 00:43:06
And let's say Jordan block
number d.
672
00:43:06 --> 00:43:12
And those Jordan blocks look
like that, so the eigenvalues
673
00:43:12 --> 00:43:18
are sitting on the diagonal,
but we've got some of these
674
00:43:18 --> 00:43:21
ones above the diagonal.
675
00:43:21 --> 00:43:29
We've got the number of -- so
the number of blocks -- the
676
00:43:29 --> 00:43:34.63
number of blocks is the number
of eigenvectors,
677
00:43:34.63 --> 00:43:39
because we get one eigenvector
per block.
678
00:43:39 --> 00:43:45
So what I'm -- so if I
summarize Jordan's idea -- start
679
00:43:45 --> 00:43:46
with any A.
680
00:43:46 --> 00:43:53
If its eigenvalues are
distinct, then what's it similar
681
00:43:53 --> 00:43:53
to?
682
00:43:53 --> 00:44:00
This is the good case.
If I start with a matrix A and
683
00:44:00 --> 00:44:04
it has different eigenvalues --
it's n eigenvalues,
684
00:44:04 --> 00:44:09
none of them are repeated,
then that's a diagonal --
685
00:44:09 --> 00:44:14
diagonalizable matrix -- the
Jordan blocks is -- has -- the
686
00:44:14 --> 00:44:17
Jordan matrix is diagonal.
687
00:44:17 --> 00:44:18
It's lambda.
688
00:44:18 --> 00:44:22
So the good case --
the good case,
689
00:44:22 --> 00:44:23.94
J is lambda.
690
00:44:23.94 --> 00:44:26
All -- there are -- d=n.
691
00:44:26 --> 00:44:31
There are n eigenvectors,
n blocks, diagonal,
692
00:44:31 --> 00:44:33
everything great.
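The good case can be sketched with any matrix having distinct eigenvalues (the two-by-two below is my own example, not one from the lecture):

```python
import numpy as np

# Distinct eigenvalues 1 and 3: n independent eigenvectors, so
# M^-1 A M is diagonal -- the Jordan form is n 1x1 blocks.
A = np.array([[1.0, 2.0], [0.0, 3.0]])   # hypothetical example, eigenvalues 1 and 3
w, V = np.linalg.eig(A)
assert np.linalg.matrix_rank(V) == 2      # eigenvector matrix is invertible
assert np.allclose(np.linalg.inv(V) @ A @ V, np.diag(w))
```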
693
00:44:33 --> 00:44:39
But Jordan covered all cases by
including these cases of
694
00:44:39 --> 00:44:44
repeated eigenvalues and missing
eigenvectors.
695
00:44:44 --> 00:44:45
Okay.
696
00:44:45 --> 00:44:48
That's a description of Jordan.
697
00:44:48 --> 00:44:53
That --
that's -- I haven't told you
698
00:44:53 --> 00:44:57
how to compute this thing,
and it isn't easy.
699
00:44:57 --> 00:45:02
Whereas the good case is the --
the good case is what 18.06 is
700
00:45:02 --> 00:45:02
about.
701
00:45:02 --> 00:45:07
The -- this case is what 18.06
was about 20 years ago.
702
00:45:07 --> 00:45:13
So you can see you probably
won't have on the final exam the
703
00:45:13 --> 00:45:19
computation of a Jordan matrix
for some horrible thing with
704
00:45:19 --> 00:45:22
four repeated eigenvalues.
705
00:45:22 --> 00:45:26
I'm not that crazy about the
Jordan form.
706
00:45:26 --> 00:45:32
But I'm very positive about
positive definite matrixes and
707
00:45:32 --> 00:45:38
about the idea that's coming
Monday, the singular value
708
00:45:38 --> 00:45:39
decomposition.
709
00:45:39 --> 00:45:46
So I'll see you on Monday,
and have a great weekend.
710
00:45:46 --> 00:45:49
Bye.