1
00:00:04 --> 00:00:09
OK, here we go with
quiz review for the third quiz
2
00:00:09 --> 00:00:12
that's coming on Friday.
3
00:00:12 --> 00:00:18
So, one key point is that the
quiz covers through chapter six.
4
00:00:18 --> 00:00:24
Chapter seven on linear
transformations will appear on
5
00:00:24 --> 00:00:29
the final exam,
but not on the quiz.
6
00:00:29 --> 00:00:33
So I won't review linear
transformations today,
7
00:00:33 --> 00:00:39
but they'll come into the full
course review on the very last
8
00:00:39 --> 00:00:40
lecture.
9
00:00:40 --> 00:00:46
So today, I'm reviewing chapter
six, and I'm going to take some
10
00:00:46 --> 00:00:52
old exams, and I'm always ready
to answer questions.
11
00:00:52 --> 00:00:55
And I thought it'd
kind of help our memories if I
12
00:00:55 --> 00:00:59
write down the main topics in
chapter six.
13
00:00:59 --> 00:01:01
So, already,
on the previous quiz,
14
00:01:01 --> 00:01:04.98
we knew how to find eigenvalues
and eigenvectors.
15
00:01:04.98 --> 00:01:09
Well, we knew how to find them
by that determinant of A minus
16
00:01:09 --> 00:01:10
lambda I equals zero.
17
00:01:10 --> 00:01:15
But, of course,
there could be shortcuts.
18
00:01:15 --> 00:01:19
There could be,
like, useful information about
19
00:01:19 --> 00:01:23
the eigenvalues that we can
speed things up with.
20
00:01:23 --> 00:01:24
OK.
21
00:01:24 --> 00:01:29
Then, the new stuff starts out
with a differential equation,
22
00:01:29 --> 00:01:32
so I'll do a problem.
23
00:01:32 --> 00:01:35
I'll do a differential equation
problem first.
24
00:01:35 --> 00:01:37
What's special about symmetric
matrices?
25
00:01:37 --> 00:01:39
Can we just say that in words?
26
00:01:39 --> 00:01:42
I'd better write it down,
though.
27
00:01:42 --> 00:01:44
What's special about symmetric
matrices?
28
00:01:44 --> 00:01:46
Their eigenvalues are real.
29
00:01:46 --> 00:01:50
The eigenvalues of a symmetric
matrix always come out real,
30
00:01:50 --> 00:01:54.37
and there always are enough
eigenvectors.
31
00:01:54.37 --> 00:01:58
Even if there are repeated
eigenvalues, there are enough
32
00:01:58 --> 00:02:00
eigenvectors,
and we can choose those
33
00:02:00 --> 00:02:03
eigenvectors to be orthogonal.
34
00:02:03 --> 00:02:07
So if A equals A transposed,
the big fact will be that we
35
00:02:07 --> 00:02:10
can diagonalize it,
and the eigenvector matrix,
36
00:02:10 --> 00:02:14
with the eigenvectors in the
columns, can be an orthogonal
37
00:02:14 --> 00:02:16
matrix.
38
00:02:16 --> 00:02:22
So we get a Q lambda Q
transpose.
39
00:02:22 --> 00:02:32
That, in three symbols,
expresses a wonderful fact,
40
00:02:32 --> 00:02:42
a fundamental fact for
symmetric matrices.
41
00:02:42 --> 00:02:42
OK.
42
00:02:42 --> 00:02:47
Then, we went beyond that fact
to ask about positive definite
43
00:02:47 --> 00:02:51
matrices, when the eigenvalues
were positive.
44
00:02:51 --> 00:02:53
I'll do an example of that.
45
00:02:53 --> 00:02:55
Now we've left symmetry.
46
00:02:55 --> 00:03:00
Similar matrices are any square
matrices, but two matrices are
47
00:03:00 --> 00:03:03
similar if they're related that
way.
48
00:03:03 --> 00:03:07
And what's the key point about
similar matrices?
49
00:03:07 --> 00:03:12
Somehow, those matrices are
representing the same thing in
50
00:03:12 --> 00:03:16
different bases,
in chapter seven language.
51
00:03:16 --> 00:03:20
In chapter six language,
what's up with these similar
52
00:03:20 --> 00:03:22.52
matrices?
53
00:03:22.52 --> 00:03:27
What's the key fact,
the key positive fact about
54
00:03:27 --> 00:03:28
similar matrices?
55
00:03:28 --> 00:03:31
They have the same eigenvalues.
56
00:03:31 --> 00:03:33
Same eigenvalues.
57
00:03:33 --> 00:03:37
So if one of them grows,
the other one grows.
58
00:03:37 --> 00:03:44
If one of them decays to zero,
the other one decays to zero.
59
00:03:44 --> 00:03:49
Powers of A will look like
powers of B, because powers of A
60
00:03:49 --> 00:03:55
and powers of B only differ by
an M inverse and an M way on the
61
00:03:55 --> 00:03:55
outside.
62
00:03:55 --> 00:04:00
So if these are similar,
then B to the k-th power is M
63
00:04:00 --> 00:04:04
inverse A to the k-th power M.
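As a quick numerical sketch of that point (my own example matrices, not from the board): similar matrices B = M⁻¹AM share eigenvalues, and B to the k-th power is M⁻¹ times A to the k-th power times M.

```python
import numpy as np

# A sketch with arbitrary example matrices: B = M^{-1} A M is similar to A.
A = np.array([[2.0, 1.0],
              [0.0, 3.0]])
M = np.array([[1.0, 2.0],
              [1.0, 1.0]])          # any invertible M
Minv = np.linalg.inv(M)
B = Minv @ A @ M

# Similar matrices have the same eigenvalues (sorted to compare)
assert np.allclose(np.sort(np.linalg.eigvals(A).real),
                   np.sort(np.linalg.eigvals(B).real))

# Powers only differ by the M inverse and M on the outside: B^k = M^{-1} A^k M
k = 5
assert np.allclose(np.linalg.matrix_power(B, k),
                   Minv @ np.linalg.matrix_power(A, k) @ M)
```

So if the powers of A grow or decay, the powers of B do the same, which is the point made above.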
64
00:04:04 --> 00:04:09
And that's why I say,
eh, this M, it does change the
65
00:04:09 --> 00:04:12
eigenvectors,
but it doesn't change the
66
00:04:12 --> 00:04:14
eigenvalues.
67
00:04:14 --> 00:04:15.59
So same lambdas.
68
00:04:15.59 --> 00:04:20
And then, finally,
I've got to review the point
69
00:04:20 --> 00:04:23.27
about the SVD,
the Singular Value
70
00:04:23.27 --> 00:04:25
Decomposition.
71
00:04:25 --> 00:04:26
OK.
72
00:04:26 --> 00:04:29
So that's what this quiz has
got to cover,
73
00:04:29 --> 00:04:34
and now I'll just take problems
from earlier exams,
74
00:04:34 --> 00:04:37
starting with a differential
equation.
75
00:04:37 --> 00:04:37.84
OK.
76
00:04:37.84 --> 00:04:40
And always ready for questions.
77
00:04:40 --> 00:04:44
So here is an exam from about
the year zero,
78
00:04:44 --> 00:04:48
and it has a three by three.
79
00:04:48 --> 00:04:53
So that was -- but it's a
pretty special-looking matrix,
80
00:04:53 --> 00:04:58
it's got zeroes on the
diagonal, it's got minus ones
81
00:04:58 --> 00:05:02
above, and it's got plus ones
like that.
82
00:05:02 --> 00:05:04
So that's the matrix A.
83
00:05:04 --> 00:05:05
OK.
84
00:05:05 --> 00:05:09
Step one is,
well, I want to solve that
85
00:05:09 --> 00:05:09
equation.
86
00:05:09 --> 00:05:14
I want to find the general
solution.
87
00:05:14 --> 00:05:18
I haven't given you a u(0)
here, so I'm looking for the
88
00:05:18 --> 00:05:22
general solution,
so now what's the form of the
89
00:05:22 --> 00:05:23
general solution?
90
00:05:23 --> 00:05:27
There are going to be three
arbitrary constants inside it,
91
00:05:27 --> 00:05:31
because those will be used to
match the initial condition.
92
00:05:31 --> 00:05:35
So the general form is u at
time t is some multiple of the
93
00:05:35 --> 00:05:38
first special solution.
94
00:05:38 --> 00:05:43
The first special solution will
be growing like e to the lambda
95
00:05:43 --> 00:05:46
one t, times its eigenvector.
96
00:05:46 --> 00:05:51
So that's a pure exponential
solution, just staying with that
97
00:05:51 --> 00:05:52
eigenvector.
98
00:05:52 --> 00:05:57.19
Of course, I haven't found,
yet, the eigenvalues and
99
00:05:57.19 --> 00:05:58
eigenvectors.
100
00:05:58 --> 00:06:02
That's, normally,
the first job.
101
00:06:02 --> 00:06:06
Now, there will be a second one,
growing like e to the lambda
102
00:06:06 --> 00:06:10
two, and a third one growing
like e to the lambda three.
103
00:06:10 --> 00:06:15
So we're all done -- well,
we haven't done anything yet,
104
00:06:15 --> 00:06:15
actually.
105
00:06:15 --> 00:06:19
I've got to find the
eigenvalues and eigenvectors,
106
00:06:19 --> 00:06:23
and then I would match u(0) by
choosing the right three
107
00:06:23 --> 00:06:25
constants.
108
00:06:25 --> 00:06:25
OK.
109
00:06:25 --> 00:06:30
So now I ask -- ask you about
the eigenvalues and
110
00:06:30 --> 00:06:35
eigenvectors,
and you look at this matrix and
111
00:06:35 --> 00:06:39
what do you see in that matrix?
112
00:06:39 --> 00:06:44
Um, well, I guess we might ask
ourselves right away,
113
00:06:44 --> 00:06:46
is it singular?
114
00:06:46 --> 00:06:49
Is it singular?
115
00:06:49 --> 00:06:51
Because, if so,
then we really have a head
116
00:06:51 --> 00:06:54
start, we know one of the
eigenvalues is zero.
117
00:06:54 --> 00:06:56
Is that matrix singular?
118
00:06:56 --> 00:06:59
Eh, I don't know,
do you take the determinant to
119
00:06:59 --> 00:06:59
find out?
120
00:06:59 --> 00:07:03
Or maybe you look at the first
row and third row and say,
121
00:07:03 --> 00:07:06
hey, the first row and third
row are just opposite signs,
122
00:07:06 --> 00:07:09
they're linearly dependent?
123
00:07:09 --> 00:07:13
The first column and third
column are dependent -- it's
124
00:07:13 --> 00:07:13
singular.
125
00:07:13 --> 00:07:15.74
So one eigenvalue is zero.
126
00:07:15.74 --> 00:07:17
Let's make that lambda one.
127
00:07:17 --> 00:07:20
Lambda one, then,
will be zero.
128
00:07:20 --> 00:07:20
OK.
129
00:07:20 --> 00:07:24
Now we've got a couple of other
eigenvalues to find,
130
00:07:24 --> 00:07:29
and, I suppose the simplest way
is to look at A minus lambda I.
131
00:07:29 --> 00:07:35
So let me just put minus lambda
in here, minus ones above,
132
00:07:35 --> 00:07:36
ones below.
133
00:07:36 --> 00:07:40
But, actually,
before I do it,
134
00:07:40 --> 00:07:44
that matrix is not symmetric,
for sure, right?
135
00:07:44 --> 00:07:49
In fact, it's the very opposite
of symmetric.
136
00:07:49 --> 00:07:55
That matrix A transpose,
how is A transpose connected to
137
00:07:55 --> 00:07:57
A?
138
00:07:57 --> 00:07:58
It's negative A.
139
00:07:58 --> 00:08:03
It's an anti-symmetric matrix,
skew-symmetric matrix.
140
00:08:03 --> 00:08:07
And we've met,
maybe, a two-by-two example of
141
00:08:07 --> 00:08:11
skew-symmetric matrices,
and let me just say,
142
00:08:11 --> 00:08:15
what's the deal with their
eigenvalues?
143
00:08:15 --> 00:08:18
They're pure imaginary.
144
00:08:18 --> 00:08:21
They'll be on the imaginary
axis, they'll be some multiple of
145
00:08:21 --> 00:08:24
i if it's an anti-symmetric,
skew-symmetric matrix.
146
00:08:24 --> 00:08:27
So I'm looking for multiples of
i, and of course,
147
00:08:27 --> 00:08:30
that's zero times I,
that's on the imaginary axis,
148
00:08:30 --> 00:08:32
but maybe I just do it out,
here.
149
00:08:32 --> 00:08:35.44
Lambda cubed --
well, maybe that's minus lambda
150
00:08:35.44 --> 00:08:38
cubed, and then a zero and a
zero.
151
00:08:38 --> 00:08:43
Zero, and then maybe I have a
plus a lambda,
152
00:08:43 --> 00:08:50
and another plus lambda,
but those go with a minus sign.
153
00:08:50 --> 00:08:55
Am I getting minus two lambda
equals zero?
154
00:08:55 --> 00:08:55.52
So.
155
00:08:55.52 --> 00:09:03.16
So I'm solving lambda cubed plus
two lambda equals zero.
156
00:09:03.16 --> 00:09:08
So one root factors out lambda,
and the rest is lambda
157
00:09:08 --> 00:09:10
squared plus two.
158
00:09:10 --> 00:09:10
OK.
159
00:09:10 --> 00:09:14
This is going the way we
expect, right?
160
00:09:14 --> 00:09:19
Because this gives the root
lambda equals zero,
161
00:09:19 --> 00:09:26
and gives the other two roots,
which are lambda equal what?
162
00:09:26 --> 00:09:31
The solutions of lambda
squared plus two equals zero
163
00:09:31 --> 00:09:35
the eigenvalues, those guys,
what are they?
164
00:09:35 --> 00:09:39
They're a multiple of i,
they're just square root of two
165
00:09:39 --> 00:09:39
i.
166
00:09:39 --> 00:09:44.91
When I set this equals to zero,
I have lambda squared equal to
167
00:09:44.91 --> 00:09:46.32
minus two, right?
168
00:09:46.32 --> 00:09:48
To make that zero?
169
00:09:48 --> 00:09:54
And the roots are square root
of two i and minus the square
170
00:09:54 --> 00:09:55
root of two i.
171
00:09:55 --> 00:09:58
So now I know what those are.
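Here is a numerical check (my sketch -- it assumes the matrix on the board is the tridiagonal one with zeros on the diagonal, minus ones above, plus ones below, which matches the eigenvalues found above): the eigenvalues come out 0 and plus or minus the square root of two times i, all on the imaginary axis.

```python
import numpy as np

# Assumed from the board: zeros on the diagonal,
# -1 on the superdiagonal, +1 on the subdiagonal.
A = np.array([[0.0, -1.0,  0.0],
              [1.0,  0.0, -1.0],
              [0.0,  1.0,  0.0]])

assert np.allclose(A.T, -A)          # skew-symmetric: A transpose is minus A
lam = np.linalg.eigvals(A)

# Every eigenvalue is purely imaginary: 0 and +/- sqrt(2) i
assert np.allclose(lam.real, 0.0)
assert np.allclose(np.sort(np.abs(lam.imag)), [0.0, np.sqrt(2), np.sqrt(2)])
```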
172
00:09:58 --> 00:10:00
I'll put those in,
now.
173
00:10:00 --> 00:10:03
e to the zero t is just a
one.
174
00:10:03 --> 00:10:04
That's just a one.
175
00:10:04 --> 00:10:10
This is square root of two i
and this is minus square root of
176
00:10:10 --> 00:10:12
two i.
177
00:10:12 --> 00:10:15
So, is the solution decaying to
zero?
178
00:10:15 --> 00:10:19
Is this a completely stable
problem where the solution is
179
00:10:19 --> 00:10:20
going to zero?
180
00:10:20 --> 00:10:21
No.
181
00:10:21 --> 00:10:25
In fact, all these things are
staying the same size.
182
00:10:25 --> 00:10:29
This thing is getting
multiplied by this number.
183
00:10:29 --> 00:10:33.35
e to the i something t,
that's a number that has
184
00:10:33.35 --> 00:10:37
magnitude one,
and sort of wanders around the
185
00:10:37 --> 00:10:39
unit circle.
186
00:10:39 --> 00:10:41
Same for this.
187
00:10:41 --> 00:10:48
So that the solution doesn't
blow up, and it doesn't go to
188
00:10:48 --> 00:10:48
zero.
189
00:10:48 --> 00:10:49
OK.
190
00:10:49 --> 00:10:55
And to find out what it
actually is, we would have to
191
00:10:55 --> 00:11:00
plug in initial conditions.
192
00:11:00 --> 00:11:03
But actually,
the next question I ask is,
193
00:11:03 --> 00:11:07
when does the solution return
to its initial value?
194
00:11:07 --> 00:11:10
I won't even say what's the
initial value.
195
00:11:10 --> 00:11:15
This is a case in which I think
this solution is periodic.
196
00:11:15 --> 00:11:17
At t equals zero,
it starts with c1,
197
00:11:17 --> 00:11:22.14
c2, and c3, and then at some
value of t, it comes back to
198
00:11:22.14 --> 00:11:23
that.
199
00:11:23 --> 00:11:29.09
So that's a very special
question. Well,
200
00:11:29.09 --> 00:11:37
let's just take three seconds,
because that special question
201
00:11:37 --> 00:11:42.64
isn't likely to be on the quiz.
202
00:11:42.64 --> 00:11:46
But it comes back to the start,
when?
203
00:11:46 --> 00:11:51
Well, whenever we have e to the
two pi i, that's one,
204
00:11:51 --> 00:11:54
and we've come back again.
205
00:11:54 --> 00:11:57
So it comes back to the start.
206
00:11:57 --> 00:12:02
It's periodic,
when this square root of two i
207
00:12:02 --> 00:12:07
-- shall I call it capital T,
for the period?
208
00:12:07 --> 00:12:12
For that particular T,
if that equals two pi i,
209
00:12:12 --> 00:12:18
then e to this thing is one,
and we've come around again.
210
00:12:18 --> 00:12:21.93
So the period is T is
determined here,
211
00:12:21.93 --> 00:12:26.31
cancel the i-s,
and T is pi times the square
212
00:12:26.31 --> 00:12:27
root of two.
213
00:12:27 --> 00:12:30
So that's pretty neat.
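The period can be seen numerically, too (a sketch, reusing the same assumed matrix from the board and scipy's matrix exponential): e to the A T is the identity when T is pi times the square root of two, so every solution comes back to its start.

```python
import numpy as np
from scipy.linalg import expm

# Assumed matrix from the board (skew-symmetric, eigenvalues 0, +/- sqrt(2) i)
A = np.array([[0.0, -1.0,  0.0],
              [1.0,  0.0, -1.0],
              [0.0,  1.0,  0.0]])

T = np.pi * np.sqrt(2)        # the period found above

# e^{AT} = I, so every solution u(t) = e^{At} u(0) returns to u(0) at t = T
assert np.allclose(expm(A * T), np.eye(3), atol=1e-10)
```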
214
00:12:30 --> 00:12:34
We get all the information
about all solutions,
215
00:12:34 --> 00:12:38.29
we haven't fixed on only one
particular solution,
216
00:12:38.29 --> 00:12:40
but it comes around again.
217
00:12:40 --> 00:12:44.99
So this was probably my first
chance to say something about
218
00:12:44.99 --> 00:12:47
the whole family of
anti-symmetric,
219
00:12:47 --> 00:12:50
skew-symmetric matrices.
220
00:12:50 --> 00:12:50
OK.
221
00:12:50 --> 00:12:55
And then, finally,
I asked, take two eigenvectors
222
00:12:55 --> 00:13:01
(again, I haven't computed the
eigenvectors) and it turns out
223
00:13:01 --> 00:13:03
they're orthogonal.
224
00:13:03 --> 00:13:04
They're orthogonal.
225
00:13:04 --> 00:13:10
The eigenvectors of a symmetric
matrix, or a skew-symmetric
226
00:13:10 --> 00:13:14
matrix, are always orthogonal.
227
00:13:14 --> 00:13:19
I guess my conscience makes me
tell you, what are all the
228
00:13:19 --> 00:13:23
matrices that have orthogonal
eigenvectors?
229
00:13:23 --> 00:13:27
And symmetric is the most
important class,
230
00:13:27 --> 00:13:31
so that's the one we've spoken
about.
231
00:13:31 --> 00:13:35
But let me just put that little
fact down, here.
232
00:13:35 --> 00:13:39
Orthogonal x-s --
eigenvectors.
233
00:13:39 --> 00:13:42
A matrix has orthogonal
eigenvectors,
234
00:13:42 --> 00:13:47
the exact condition -- it's
quite beautiful that I can tell
235
00:13:47 --> 00:13:49
you exactly when that happens.
236
00:13:49 --> 00:13:54
It happens when A times A
transpose equals A transpose
237
00:13:54 --> 00:13:54
times A.
238
00:13:54 --> 00:14:00
Any time that's the condition
for orthogonal eigenvectors.
239
00:14:00 --> 00:14:04
And because we're interested in
special families of vectors,
240
00:14:04 --> 00:14:07
tell me some special families
that fit.
241
00:14:07 --> 00:14:09
This is the whole requirement.
242
00:14:09 --> 00:14:13
That's a pretty special
requirement that most matrices
don't satisfy.
243
00:14:13 --> 00:14:17
So the average three-by-three
matrix has three eigenvectors,
244
00:14:17 --> 00:14:19
but not orthogonal.
245
00:14:19 --> 00:14:24
But if it happens to commute
with its transpose,
246
00:14:24 --> 00:14:29
then, wonderfully,
the eigenvectors are
247
00:14:29 --> 00:14:30.34
orthogonal.
248
00:14:30.34 --> 00:14:36
Now, do you see how symmetric
matrices pass this test?
249
00:14:36 --> 00:14:38
Of course.
250
00:14:38 --> 00:14:42
If A transpose equals A,
then both sides are A squared,
251
00:14:42 --> 00:14:43
we've got it.
252
00:14:43 --> 00:14:47
How do anti-symmetric matrices
pass this test?
253
00:14:47 --> 00:14:51
If A transpose equals minus A,
then we've got it again,
254
00:14:51 --> 00:14:55
because we've got minus A
squared on both sides.
255
00:14:55 --> 00:14:58.24
So that's another group.
256
00:14:58.24 --> 00:15:01
And finally,
let me ask you about our other
257
00:15:01 --> 00:15:04
favorite family,
orthogonal matrices.
258
00:15:04 --> 00:15:09
Do orthogonal matrices pass
this test, if A is a Q,
259
00:15:09 --> 00:15:13
do they pass the test for
orthogonal eigenvectors?
260
00:15:13 --> 00:15:16
Well, if A is Q,
an orthogonal matrix,
261
00:15:16 --> 00:15:18
what is Q transpose Q?
262
00:15:18 --> 00:15:19
It's I.
263
00:15:19 --> 00:15:22
And what is Q Q transpose?
264
00:15:22 --> 00:15:26
It's I, we're talking square
matrices here.
265
00:15:26 --> 00:15:29
So yes, it passes the test.
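A sketch of that test in numpy (my own example matrices): symmetric, skew-symmetric, and orthogonal matrices all commute with their transposes, while a typical matrix does not.

```python
import numpy as np

def is_normal(A, tol=1e-12):
    """A A^T = A^T A: the condition for orthogonal eigenvectors."""
    return np.allclose(A @ A.T, A.T @ A, atol=tol)

S = np.array([[2.0, 1.0], [1.0, 3.0]])      # symmetric: both sides are S^2
K = np.array([[0.0, -1.0], [1.0, 0.0]])     # skew-symmetric: both sides are -K^2
c, s = np.cos(0.3), np.sin(0.3)
Q = np.array([[c, -s], [s, c]])             # orthogonal: both sides are I

assert is_normal(S) and is_normal(K) and is_normal(Q)

# The "average" matrix fails the test
N = np.array([[1.0, 2.0], [0.0, 3.0]])
assert not is_normal(N)
```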
266
00:15:29 --> 00:15:34
So the special cases are
symmetric, anti-symmetric (I'll
267
00:15:34 --> 00:15:38
say skew-symmetric), and
orthogonal.
268
00:15:38 --> 00:15:44
Those are the three important
special classes that are in this
269
00:15:44 --> 00:15:45
family.
270
00:15:45 --> 00:15:46
OK.
271
00:15:46 --> 00:15:52
That's like a comment that
could have been made back in
272
00:15:52 --> 00:15:54
section six point four.
273
00:15:54 --> 00:15:59
OK, I can pursue the
differential equations,
274
00:15:59 --> 00:16:03
also -- this question
didn't ask you to tell me,
275
00:16:03 --> 00:16:09
how would I find this matrix
exponential, e to the At?
276
00:16:09 --> 00:16:12
So can I erase this?
277
00:16:12 --> 00:16:16.33
I'll just stay with this
same...
278
00:16:16.33 --> 00:16:19
how would I find e to the At?
279
00:16:19 --> 00:16:23.09
Because, how does that come in?
280
00:16:23.09 --> 00:16:28
That's the key matrix for a
differential equation,
281
00:16:28 --> 00:16:36
because the solution is -- the
solution is u(t) is e^(At) u(0).
282
00:16:36 --> 00:16:41
So this is like the fundamental
matrix that multiplies the given
283
00:16:41 --> 00:16:44
function and gives the answer.
284
00:16:44 --> 00:16:47
And how would we compute it if
we wanted that?
285
00:16:47 --> 00:16:52
We don't always have to find e
to the At, because I can go
286
00:16:52 --> 00:16:56
directly to the answer without
any e to the At-s,
287
00:16:56 --> 00:17:02
but hiding here is an e to the
At, and how would I compute it?
288
00:17:02 --> 00:17:05.66
Well, if A is diagonalizable.
289
00:17:05.66 --> 00:17:11
So I'm now going to put in my
usual if A can be diagonalized
290
00:17:11 --> 00:17:16
(and everybody remember that
there is an if there,
291
00:17:16 --> 00:17:21
because it might not have
enough eigenvectors) this
292
00:17:21 --> 00:17:27
example does have enough,
random matrices have enough.
293
00:17:27 --> 00:17:33
So if we can diagonalize,
then we get a nice formula for
294
00:17:33 --> 00:17:38
this, because an S comes way out
at the beginning,
295
00:17:38 --> 00:17:44
and S inverse comes way out at
the end, and we only have to
296
00:17:44 --> 00:17:48
take the exponential of lambda.
297
00:17:48 --> 00:17:52
And that's just a diagonal
matrix, so that's just e the
298
00:17:52 --> 00:17:55
lambda one t,
these guys are showing up,
299
00:17:55 --> 00:17:58
now, in e to the lambda nt.
300
00:17:58 --> 00:17:58.27
OK?
301
00:17:58.27 --> 00:18:01
That's a really quick review of
that formula.
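That formula can be sketched in numpy (my example matrix, and it assumes A is diagonalizable): build e to the At as S times e to the lambda t times S inverse, and compare against scipy's matrix exponential.

```python
import numpy as np
from scipy.linalg import expm

A = np.array([[0.0,  1.0],
              [-2.0, -3.0]])       # an example diagonalizable matrix
t = 0.7

# Diagonalize: A = S Lambda S^{-1}, S holding the eigenvectors in its columns
lam, S = np.linalg.eig(A)

# e^{At} = S e^{Lambda t} S^{-1}; e^{Lambda t} is diagonal with e^{lambda_i t}
eAt = S @ np.diag(np.exp(lam * t)) @ np.linalg.inv(S)

assert np.allclose(eAt, expm(A * t))
```

Once S and lambda are known, this last step is cheap, which is the point made above.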
302
00:18:01 --> 00:18:06
It's something we can compute
quickly if we have done the S
303
00:18:06 --> 00:18:07
and lambda part.
304
00:18:07 --> 00:18:12
If we know S and lambda,
then it's not hard to take that
305
00:18:12 --> 00:18:13
step.
306
00:18:13 --> 00:18:17
OK, that's some comments on
differential equations.
307
00:18:17 --> 00:18:21
I would like to go on to a next
question that I started here.
308
00:18:21 --> 00:18:25
And it's got several parts,
and I can just read it out.
309
00:18:25 --> 00:18:28
What we're given is a
three-by-three matrix,
310
00:18:28 --> 00:18:32
and we're told its eigenvalues,
except one of these is,
311
00:18:32 --> 00:18:35
like, we don't know,
and we're told the
312
00:18:35 --> 00:18:37
eigenvectors.
313
00:18:37 --> 00:18:41
And I want to ask you about the
matrix.
314
00:18:41 --> 00:18:41
OK.
315
00:18:41 --> 00:18:43
So, first question.
316
00:18:43 --> 00:18:46
Is the matrix diagonalizable?
317
00:18:46 --> 00:18:51.19
And I really mean for which c,
because I don't know c,
318
00:18:51.19 --> 00:18:56
so my questions will all be,
for which c -- is there a condition
319
00:18:56 --> 00:18:59
on c, does one c work?
320
00:18:59 --> 00:19:06
But your answer should tell me
all the c-s that work.
321
00:19:06 --> 00:19:13
I'm not asking for you to tell
me, well, c equal four,
322
00:19:13 --> 00:19:16
yes, that checks out.
323
00:19:16 --> 00:19:23
I want to know all the c-s that
make it diagonalizable.
324
00:19:23 --> 00:19:24
OK?
325
00:19:24 --> 00:19:28
What's the rule on
diagonalizable?
326
00:19:28 --> 00:19:32
We need enough eigenvectors,
right?
327
00:19:32 --> 00:19:38
We don't care what those
eigenvalues are,
328
00:19:38 --> 00:19:43
it's eigenvectors that count
for
329
00:19:43 --> 00:19:47.1
diagonalizable,
and we need three independent
330
00:19:47.1 --> 00:19:50
ones, and are those three guys
independent?
331
00:19:50 --> 00:19:50
Yes.
332
00:19:50 --> 00:19:53
Actually, let's look at them
for a moment.
333
00:19:53 --> 00:19:57
What do you see about those
three vectors right away?
334
00:19:57 --> 00:19:59
They're more than independent.
335
00:19:59 --> 00:20:03
Can you see why those three got
chosen?
336
00:20:03 --> 00:20:08
Because it will come up in the
next part, they're orthogonal.
337
00:20:08 --> 00:20:11
Those eigenvectors are
orthogonal.
338
00:20:11 --> 00:20:13
They're certainly independent.
339
00:20:13 --> 00:20:17
So the answer to diagonalizable
is, yes, all c,
340
00:20:17 --> 00:20:18
all c.
341
00:20:18 --> 00:20:21
Doesn't matter.
c could be a repeated guy,
342
00:20:21 --> 00:20:24
but we've got enough
eigenvectors,
343
00:20:24 --> 00:20:28
so that's what we care about.
344
00:20:28 --> 00:20:30
OK, second question.
345
00:20:30 --> 00:20:34.65
For which values of c is it
symmetric?
346
00:20:34.65 --> 00:20:38.43
OK, what's the answer to that
one?
347
00:20:38.43 --> 00:20:44
It's the same setup -- if we
know that much about it,
348
00:20:44 --> 00:20:50
we know those eigenvectors,
and we've noticed they're
349
00:20:50 --> 00:20:56
orthogonal, then which c-s will
work?
350
00:20:56 --> 00:21:02
So the eigenvalues of that
symmetric matrix have to be
351
00:21:02 --> 00:21:02
real.
352
00:21:02 --> 00:21:04
So all real c.
353
00:21:04 --> 00:21:10
If c was i, the matrix wouldn't
have been symmetric.
354
00:21:10 --> 00:21:15
But if c is a real number,
then we've got real
355
00:21:15 --> 00:21:21
eigenvalues, we've got
orthogonal eigenvectors,
356
00:21:21 --> 00:21:25.59
that matrix is symmetric.
357
00:21:25.59 --> 00:21:27
OK, positive definite.
358
00:21:27 --> 00:21:31
OK, now this is a sub-case of
symmetric, so we need c to be
359
00:21:31 --> 00:21:35
real, so we've got a symmetric
matrix, but we also want the
360
00:21:35 --> 00:21:37
thing to be positive definite.
361
00:21:37 --> 00:21:41.31
Now, we're looking at
eigenvalues, we've got a lot of
362
00:21:41.31 --> 00:21:44
tests for positive definite,
but eigenvalues,
363
00:21:44 --> 00:21:47
if we know them,
is certainly a good,
364
00:21:47 --> 00:21:49.24
quick, clean test.
365
00:21:49.24 --> 00:21:52
Could this matrix be positive
definite?
366
00:21:52 --> 00:21:53
No.
367
00:21:53 --> 00:21:56
No, because it's got an
eigenvalue zero.
368
00:21:56 --> 00:21:59
It could be positive
semi-definite,
369
00:21:59 --> 00:22:05
you know, like a consolation
prize, if c was greater or equal
370
00:22:05 --> 00:22:10
to zero, it would be positive
semi-definite.
371
00:22:10 --> 00:22:12
But it's not,
no.
372
00:22:12 --> 00:22:18.79
Semi-definite,
if I put that comment in,
373
00:22:18.79 --> 00:22:25
semi-definite,
the condition would be c
374
00:22:25 --> 00:22:29
greater or equal to zero.
375
00:22:29 --> 00:22:32
That would be all right.
376
00:22:32 --> 00:22:33
OK.
377
00:22:33 --> 00:22:34
Next part.
378
00:22:34 --> 00:22:38
Is it a Markov matrix?
379
00:22:38 --> 00:22:40
Hm.
380
00:22:40 --> 00:22:46
Could this matrix be,
if I choose the number c
381
00:22:46 --> 00:22:49
correctly, a Markov matrix?
382
00:22:49 --> 00:22:55
Well, what do we know about
Markov matrices?
383
00:22:55 --> 00:23:01
Mainly, we know something about
their eigenvalues.
384
00:23:01 --> 00:23:08
One eigenvalue is always one,
and the other eigenvalues are
385
00:23:08 --> 00:23:09
smaller.
386
00:23:09 --> 00:23:12
Not larger.
387
00:23:12 --> 00:23:15
So an eigenvalue two can't
happen.
388
00:23:15 --> 00:23:19
So the answer is,
no, that's never a
389
00:23:19 --> 00:23:20
Markov matrix.
390
00:23:20 --> 00:23:20.73
OK?
391
00:23:20.73 --> 00:23:23
And finally,
could one half of A be a
392
00:23:23 --> 00:23:25.36
projection matrix?
393
00:23:25.36 --> 00:23:29
So could this --
could this be twice a
394
00:23:29 --> 00:23:31
projection matrix?
395
00:23:31 --> 00:23:33
So let me write it this way.
396
00:23:33 --> 00:23:38
Could A over two be a
projection matrix?
397
00:23:38 --> 00:23:41
OK, what are projection
matrices?
398
00:23:41 --> 00:23:42
They're real.
399
00:23:42 --> 00:23:48
I mean, they're symmetric,
so their eigenvalues are real.
400
00:23:48 --> 00:23:52
But more than that,
we know what those eigenvalues
401
00:23:52 --> 00:23:53
have to be.
402
00:23:53 --> 00:24:00
What do the eigenvalues of a
projection matrix have to be?
403
00:24:00 --> 00:24:03
See, for any nice matrix we've
got an idea about its
404
00:24:03 --> 00:24:04
eigenvalues.
405
00:24:04 --> 00:24:07
So the eigenvalues of
projection matrices are zero and
406
00:24:07 --> 00:24:07
one.
407
00:24:07 --> 00:24:09
Zero and one,
only.
408
00:24:09 --> 00:24:12
Because P squared equals P,
let me call this matrix P,
409
00:24:12 --> 00:24:15
so P squared equals P,
so lambda squared equals
410
00:24:15 --> 00:24:19
lambda, because eigenvalues of P
squared are lambda squared,
411
00:24:19 --> 00:24:23
and we must have that,
so lambda equals zero or one.
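A quick sketch of that fact (my own example, not the exam's matrix): build a projection P = Q Q transpose onto a column space and check that P squared equals P, which forces every eigenvalue to be zero or one.

```python
import numpy as np

# Project onto the column space of B (an example basis, two columns in R^3)
B = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [0.0, 1.0]])
Q, _ = np.linalg.qr(B)           # orthonormal basis for the column space
P = Q @ Q.T                      # the projection matrix onto that space

assert np.allclose(P @ P, P)     # P^2 = P

# lambda^2 = lambda, so the eigenvalues can only be 0 or 1
lam = np.sort(np.linalg.eigvals(P).real)
assert np.allclose(lam, [0.0, 1.0, 1.0])
```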
412
00:24:23 --> 00:24:23.88
OK.
413
00:24:23.88 --> 00:24:30
Now what value of c will work
there?
414
00:24:30 --> 00:24:39
So, then, there are some values
that will work,
415
00:24:39 --> 00:24:47
and what will work?
c equals zero will work,
416
00:24:47 --> 00:24:54
or what else will work?
c equal to two.
417
00:24:54 --> 00:24:58
Because if c is two,
then when we divide by two,
418
00:24:58 --> 00:25:02
this eigenvalue of two will
drop to one, and so will the
419
00:25:02 --> 00:25:04
other one, so,
or c equal to two.
420
00:25:04 --> 00:25:08
OK, those are the guys that
will work, and it was the fact
421
00:25:08 --> 00:25:12
that those eigenvectors were
orthogonal, the fact that those
422
00:25:12 --> 00:25:16.63
eigenvectors were orthogonal
carried us a lot of the way,
423
00:25:16.63 --> 00:25:17
here.
424
00:25:17 --> 00:25:22
If they weren't orthogonal,
then symmetric would have been
425
00:25:22 --> 00:25:26
dead, positive definite would
have been dead,
426
00:25:26 --> 00:25:29
projection would have been
dead.
427
00:25:29 --> 00:25:34.08
But those eigenvectors were
orthogonal, so it came down to
428
00:25:34.08 --> 00:25:36
the eigenvalues.
429
00:25:36 --> 00:25:44
OK, that was like a chance to
review a lot of this chapter.
430
00:25:44 --> 00:25:51.87
Shall I jump to the singular
value decomposition,
431
00:25:51.87 --> 00:25:58
then, as the third
topic for the review?
432
00:25:58 --> 00:26:04
OK, so I'm going to
jump to this.
433
00:26:04 --> 00:26:04
OK.
434
00:26:04 --> 00:26:10
So this is the singular value
decomposition,
435
00:26:10 --> 00:26:14
known to everybody as the SVD.
436
00:26:14 --> 00:26:22
And that's a factorization of A
into orthogonal times diagonal
437
00:26:22 --> 00:26:24
times orthogonal.
438
00:26:24 --> 00:26:31
And we always call those U and
sigma and V transpose.
439
00:26:31 --> 00:26:32
OK.
440
00:26:32 --> 00:26:38
And the key to that -- this is
for every matrix,
441
00:26:38 --> 00:26:42
every A, every A.
442
00:26:42 --> 00:26:45
Rectangular,
doesn't matter,
443
00:26:45 --> 00:26:49
whatever, has this
decomposition.
444
00:26:49 --> 00:26:52
So it's really important.
445
00:26:52 --> 00:26:58
And the key to it is to look at
things like A transpose A.
446
00:26:58 --> 00:27:05
Can we remember what happens
with A transpose A?
447
00:27:05 --> 00:27:12
If I just transpose that I get
V sigma transpose U transpose,
448
00:27:12 --> 00:27:17
that's multiplying A,
which is U, sigma V transpose,
449
00:27:17 --> 00:27:24
and the result is V on the
outside, U transpose U is the
450
00:27:24 --> 00:27:30
identity, because it's an
orthogonal matrix.
451
00:27:30 --> 00:27:35
So I'm just left with sigma
transpose sigma in the middle,
452
00:27:35 --> 00:27:39
that's a diagonal,
possibly rectangular diagonal
453
00:27:39 --> 00:27:42
by its transpose,
so the result,
454
00:27:42 --> 00:27:46
this is orthogonal,
diagonal, orthogonal.
455
00:27:46 --> 00:27:50
So, I guess,
actually, this is the SVD for A
456
00:27:50 --> 00:27:51
transpose A.
457
00:27:51 --> 00:27:57
Here I see orthogonal,
diagonal, and orthogonal.
458
00:27:57 --> 00:27:57
Great.
459
00:27:57 --> 00:28:01.54
But a little more is happening.
460
00:28:01.54 --> 00:28:05
For A transpose A,
the difference is,
461
00:28:05 --> 00:28:09
the orthogonal guys are the
same.
462
00:28:09 --> 00:28:12
It's V and V transpose.
463
00:28:12 --> 00:28:14
What am I seeing here?
464
00:28:14 --> 00:28:20
I'm seeing the factorization
for a symmetric matrix.
465
00:28:20 --> 00:28:25
This thing is symmetric.
466
00:28:25 --> 00:28:28
So in a symmetric case,
U is the same as V.
467
00:28:28 --> 00:28:31
U is the same as V for this
symmetric matrix,
468
00:28:31 --> 00:28:34
and, of course,
we see it happening.
469
00:28:34 --> 00:28:34
OK.
470
00:28:34 --> 00:28:37
So that tells us,
right away, what V is.
471
00:28:37 --> 00:28:40
V is the eigenvector matrix for
A transpose A.
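A numpy sanity check of that (my example matrix): the squares of the singular values are the eigenvalues of A transpose A, and the columns of V are its eigenvectors.

```python
import numpy as np

A = np.array([[2.0, 2.0],
              [-1.0, 1.0]])           # an example matrix

U, sigma, Vt = np.linalg.svd(A)       # A = U Sigma V^T, sigma in decreasing order
w, X = np.linalg.eigh(A.T @ A)        # eigenvalues of A^T A, ascending order

# sigma_i^2 are exactly the eigenvalues of A transpose A
assert np.allclose(np.sort(sigma**2), w)

# Each row of Vt (column of V) is an eigenvector of A^T A
for i in range(2):
    v = Vt[i]
    assert np.allclose(A.T @ A @ v, (sigma[i]**2) * v)
```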
472
00:28:40 --> 00:28:40
OK.
473
00:28:40 --> 00:28:44
Now, if you were here when I
lectured about this topic,
474
00:28:44 --> 00:28:48
when I gave the topic on
singular value decompositions,
475
00:28:48 --> 00:28:53.22
you'll remember that I got into
trouble.
476
00:28:53.22 --> 00:29:00.65
I'm sorry to remember that
myself, but it happened.
477
00:29:00.65 --> 00:29:01
OK.
478
00:29:01 --> 00:29:03
How did it happen?
479
00:29:03 --> 00:29:10
I was in great shape for a
while, cruising along.
480
00:29:10 --> 00:29:17
So I found the eigenvectors for
A transpose A.
481
00:29:17 --> 00:29:19
Good.
482
00:29:19 --> 00:29:23
I found the singular values,
what were they?
483
00:29:23 --> 00:29:25
What were the singular values?
484
00:29:25 --> 00:29:29
The singular value number i,
or -- these are the guys in
485
00:29:29 --> 00:29:33
sigma -- this is diagonal with
the number sigma in it.
486
00:29:33 --> 00:29:37
This diagonal is sigma one,
sigma two, up to the rank,
487
00:29:37 --> 00:29:41
sigma r, those are the non-zero
ones.
488
00:29:41 --> 00:29:45
So I found those,
and what are they?
489
00:29:45 --> 00:29:47
Remind me about that?
490
00:29:47 --> 00:29:52.9
Well, here, I'm seeing them
squared, so their squares are
491
00:29:52.9 --> 00:29:56
the eigenvalues of A transpose
A.
492
00:29:56 --> 00:29:56
Good.
493
00:29:56 --> 00:30:02
So I just take the square root,
if I want the eigenvalues of A
494
00:30:02 --> 00:30:08
transpose --
If I want the sigmas and I know
495
00:30:08 --> 00:30:13
these, I take the square root,
the positive square root.
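To make that square-root rule concrete, here is a short numpy sketch — the two-by-two matrix A is my own invented example, not one from the lecture: the positive square roots of the eigenvalues of A transpose A match the singular values.

```python
import numpy as np

# Invented example matrix, just to illustrate sigma_i = sqrt(lambda_i(A^T A)).
A = np.array([[4.0, 4.0],
              [-3.0, 3.0]])

# A^T A is symmetric positive semidefinite, so its eigenvalues are real, >= 0.
lams = np.linalg.eigvalsh(A.T @ A)      # returned in ascending order
sigmas = np.sqrt(lams)[::-1]            # positive square roots, descending

# Compare with the singular values numpy computes directly.
assert np.allclose(sigmas, np.linalg.svd(A, compute_uv=False))
```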
496
00:30:13 --> 00:30:14
OK.
497
00:30:14 --> 00:30:17
Where did I run into trouble?
498
00:30:17 --> 00:30:20
Well, then, my final step was
to find U.
499
00:30:20 --> 00:30:23
And I didn't read the book.
500
00:30:23 --> 00:30:28
So, I did something that was
practically right,
501
00:30:28 --> 00:30:33
but --
well, I guess practically right
502
00:30:33 --> 00:30:35
is not quite the same.
503
00:30:35 --> 00:30:41.1
OK, so I thought,
OK, I'll look at A A transpose.
504
00:30:41.1 --> 00:30:45
What happened when I looked at
A A transpose?
505
00:30:45 --> 00:30:51
Let me just put it here,
and then I can feel it.
506
00:30:51 --> 00:30:55.14
OK, so here's A A transpose.
507
00:30:55.14 --> 00:31:00
So that's U sigma V transpose,
that's A, and then the
508
00:31:00 --> 00:31:06.49
transpose is V sigma transpose,
U transpose.
509
00:31:06.49 --> 00:31:07
Fine.
510
00:31:07 --> 00:31:12.11
And then, in the middle is the
identity again,
511
00:31:12.11 --> 00:31:14
so it looks great.
512
00:31:14 --> 00:31:18
U sigma sigma transpose,
U transpose.
513
00:31:18 --> 00:31:19
Fine.
514
00:31:19 --> 00:31:25.05
All good, and now these columns
of U are the eigenvectors,
515
00:31:25.05 --> 00:31:29
that's U is the eigenvector
matrix for this guy.
516
00:31:29 --> 00:31:32
That was correct,
so I did that fine.
517
00:31:32 --> 00:31:35
Where did something go wrong?
518
00:31:35 --> 00:31:36.62
A sign went wrong.
519
00:31:36.62 --> 00:31:41
A sign went wrong because --
and now -- now I see,
520
00:31:41 --> 00:31:44
actually, somebody told me
right after class,
521
00:31:44 --> 00:31:48
we can't tell from this
description which sign to give
522
00:31:48 --> 00:31:50.23
the eigenvectors.
523
00:31:50.23 --> 00:31:53
If these are the eigenvectors
of this matrix,
524
00:31:53 --> 00:31:57
well, if you give me an
eigenvector and I change all its
525
00:31:57 --> 00:32:01
signs, we've still got another
eigenvector.
526
00:32:01 --> 00:32:05
So what I wasn't able to
determine (and I had a
527
00:32:05 --> 00:32:09
fifty-fifty chance, and life let
me down) -- the signs I just
528
00:32:09 --> 00:32:11
happened to pick for the
eigenvectors,
529
00:32:11 --> 00:32:14
one of them I should have
reversed the sign.
530
00:32:14 --> 00:32:17
So, from this,
I can't tell whether the
531
00:32:17 --> 00:32:22
eigenvector or its negative is
the right one to use in there.
532
00:32:22 --> 00:32:28
So the right way to do it is
to first settle on the signs
533
00:32:28 --> 00:32:32.26
of the Vs --
I don't know which sign to
534
00:32:32.26 --> 00:32:34
choose, but I choose one.
535
00:32:34 --> 00:32:36
I choose one.
536
00:32:36 --> 00:32:41
And then, instead,
I should have used the rule that
537
00:32:41 --> 00:32:47
tells me what sign to choose:
that A times a V is
538
00:32:47 --> 00:32:50
sigma times the U.
539
00:32:50 --> 00:32:55
So, having decided on the V,
I multiply by A,
540
00:32:55 --> 00:33:02
I'll notice the factor sigma
coming out, and there will be a
541
00:33:02 --> 00:33:08
unit vector there,
and I now know exactly what it
542
00:33:08 --> 00:33:14
is, and not only up to a change
of sign.
543
00:33:14 --> 00:33:19
So that's the good way and,
of course, this is the main
544
00:33:19 --> 00:33:21
point about the SVD.
545
00:33:21 --> 00:33:25
That's the point that we've
diagonalized,
546
00:33:25 --> 00:33:31
that's A times the matrix of Vs
equals U times the diagonal
547
00:33:31 --> 00:33:33
matrix of sigmas.
548
00:33:33 --> 00:33:36
That's the same as that.
549
00:33:36 --> 00:33:36
OK.
550
00:33:36 --> 00:33:40
So that's, like,
correcting the wrong sign from
551
00:33:40 --> 00:33:42
that earlier lecture.
552
00:33:42 --> 00:33:48
And that would complete that,
so that's how you would compute
553
00:33:48 --> 00:33:48
the SVD.
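That sign-safe recipe — pick the Vs from A transpose A, then get each column of U from A v_i = sigma_i u_i rather than diagonalizing A A transpose separately — can be sketched in numpy. The matrix A here is an invented example, not one from the lecture.

```python
import numpy as np

A = np.array([[4.0, 4.0],
              [-3.0, 3.0]])

# Eigenvectors of A^T A give V; eigh returns eigenvalues in ascending order.
lams, V = np.linalg.eigh(A.T @ A)
order = np.argsort(lams)[::-1]          # reorder so sigma_1 >= sigma_2
lams, V = lams[order], V[:, order]
sigmas = np.sqrt(lams)

# u_i = A v_i / sigma_i: each u_i's sign is now tied to the sign chosen for v_i.
U = (A @ V) / sigmas

# The factorization holds with no sign ambiguity left over.
assert np.allclose(U @ np.diag(sigmas) @ V.T, A)
assert np.allclose(U.T @ U, np.eye(2))
```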
554
00:33:48 --> 00:33:52
Now, on the quiz,
I'm going to ask -- well,
555
00:33:52 --> 00:33:55
maybe on the final.
556
00:33:55 --> 00:33:58
So we've got quiz and final
ahead.
557
00:33:58 --> 00:34:03
Sometimes, you might be asked
to find the SVD if I give you
558
00:34:03 --> 00:34:08
the matrix -- let me come back,
now, to the main board -- or,
559
00:34:08 --> 00:34:11
I might give you the pieces.
560
00:34:11 --> 00:34:15
And I might ask you something
about the matrix.
561
00:34:15 --> 00:34:18
For example,
suppose I ask you,
562
00:34:18 --> 00:34:21
oh, let's say,
if I tell you what sigma is --
563
00:34:21 --> 00:34:23
OK.
564
00:34:23 --> 00:34:27
Let's take one example.
565
00:34:27 --> 00:34:36
Suppose sigma is -- so all
that's how we would compute
566
00:34:36 --> 00:34:37
them.
567
00:34:37 --> 00:34:43.26
But now, suppose I give you
these.
568
00:34:43.26 --> 00:34:52
Suppose I give you sigma is,
say, three two.
569
00:34:52 --> 00:34:55
And I tell you that U has a
couple of columns,
570
00:34:55 --> 00:34:57
and V has a couple of columns.
571
00:34:57 --> 00:34:58
OK.
572
00:34:58 --> 00:35:02
Those are orthogonal columns,
of course, because U and V are
573
00:35:02 --> 00:35:03
orthogonal.
574
00:35:03 --> 00:35:07
I'm just sort of,
like, getting you to think
575
00:35:07 --> 00:35:10
about the SVD,
because we only had that one
576
00:35:10 --> 00:35:12
lecture about it,
and one homework,
577
00:35:12 --> 00:35:17.44
and, what kind of a matrix have
I got here?
578
00:35:17.44 --> 00:35:21
What do I know about this
matrix?
579
00:35:21 --> 00:35:27
All I really know right now is
that its singular values,
580
00:35:27 --> 00:35:34.01
those sigmas are three and two,
and the only thing interesting
581
00:35:34.01 --> 00:35:40
that I can see in that is that
they're not zero.
582
00:35:40 --> 00:35:43
I know that this matrix is
non-singular,
583
00:35:43 --> 00:35:44
right?
584
00:35:44 --> 00:35:47
That's invertible,
I don't have any zero
585
00:35:47 --> 00:35:52
eigenvalues or zero singular
values, that's invertible,
586
00:35:52 --> 00:35:57
there's a typical SVD for a
nice two-by-two non-singular
587
00:35:57 --> 00:36:00
invertible good matrix.
588
00:36:00 --> 00:36:05
If I actually gave you a
matrix, then you'd have to find
589
00:36:05 --> 00:36:08
the Us and the Vs as we just
spoke.
590
00:36:08 --> 00:36:09
But, there.
591
00:36:09 --> 00:36:13
Now, what if the two wasn't a
two but it was -- well,
592
00:36:13 --> 00:36:18
let me make an extreme case,
here -- suppose it was minus
593
00:36:18 --> 00:36:19
five.
594
00:36:19 --> 00:36:21.43
That's wrong,
right away.
595
00:36:21.43 --> 00:36:24
That's not a singular value
decomposition,
596
00:36:24 --> 00:36:25
right?
597
00:36:25 --> 00:36:27
The singular values are not
negative.
598
00:36:27 --> 00:36:31
So that's not a singular value
decomposition,
599
00:36:31 --> 00:36:32
and forget it.
600
00:36:32 --> 00:36:32
OK.
601
00:36:32 --> 00:36:35
So let me ask you about that
one.
602
00:36:35 --> 00:36:39
What can you tell me about that
matrix?
603
00:36:39 --> 00:36:41
It's singular,
right?
604
00:36:41 --> 00:36:47
It's got a singular matrix
there in the middle,
605
00:36:47 --> 00:36:51
and, let's see,
so, OK, it's singular,
606
00:36:51 --> 00:36:55
maybe you can tell me
its rank?
607
00:36:55 --> 00:36:58
What's the rank of A?
608
00:36:58 --> 00:37:03.46
It's clearly -- somebody just
say it -- one,
609
00:37:03.46 --> 00:37:05
thanks.
610
00:37:05 --> 00:37:08
The rank is one,
so the null space,
611
00:37:08 --> 00:37:12
what's the dimension of the
null space?
612
00:37:12 --> 00:37:12.68
One.
613
00:37:12.68 --> 00:37:13
Right?
614
00:37:13 --> 00:37:18
We've got a two-by-two matrix
of rank one, so of all that
615
00:37:18 --> 00:37:23
stuff from the beginning of the
course is still with us.
616
00:37:23 --> 00:37:27
The dimensions of those
fundamental spaces are still
617
00:37:27 --> 00:37:31
central, and a basis for them.
618
00:37:31 --> 00:37:37
Now, can you tell me a vector
that's in the null space?
619
00:37:37 --> 00:37:43
And then that will be my last
point to make about the SVD.
620
00:37:43 --> 00:37:49
Can you tell me a vector that's
in the null space?
621
00:37:49 --> 00:37:55.15
So what would I multiply by and
get zero, here?
622
00:37:55.15 --> 00:37:58
I think the answer is probably
v2.
623
00:37:58 --> 00:38:03
I think probably v2 is in the
null space, because I think that
624
00:38:03 --> 00:38:08
must be the eigenvector going
with this zero eigenvalue.
625
00:38:08 --> 00:38:09
Yes.
626
00:38:09 --> 00:38:10
Have a look at that.
627
00:38:10 --> 00:38:16
And I could ask you the null
space of A transpose.
628
00:38:16 --> 00:38:19
And I could ask you the column
space.
629
00:38:19 --> 00:38:21
All that stuff.
630
00:38:21 --> 00:38:25.26
Everything is sitting there in
the SVD.
631
00:38:25.26 --> 00:38:29
The SVD takes a little more
time to compute,
632
00:38:29 --> 00:38:34
but it displays all the good
stuff about a matrix.
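Reading the rank and null space off the SVD can be checked numerically. In this sketch the orthogonal factors are arbitrary rotations of my own choosing, with singular values three and zero as in the example on the board.

```python
import numpy as np

# Build A = U Sigma V^T with sigmas 3 and 0, so A is 2x2 of rank one.
c, s = np.cos(0.3), np.sin(0.3)
U = np.array([[c, -s], [s, c]])         # some orthogonal U (a rotation)
d, t = np.cos(1.1), np.sin(1.1)
V = np.array([[d, -t], [t, d]])         # some orthogonal V (another rotation)
A = U @ np.diag([3.0, 0.0]) @ V.T

assert np.linalg.matrix_rank(A) == 1    # rank = number of nonzero sigmas
assert np.allclose(A @ V[:, 1], 0)      # v2 spans the null space of A
assert np.allclose(A.T @ U[:, 1], 0)    # u2 spans the null space of A^T
```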
633
00:38:34 --> 00:38:34
OK.
634
00:38:34 --> 00:38:38
Any question about the SVD?
635
00:38:38 --> 00:38:44
Let me keep going with further
topics.
636
00:38:44 --> 00:38:47
Now, let's see.
637
00:38:47 --> 00:38:56
Similar matrices we've talked
about, let me see if I've got
638
00:38:56 --> 00:38:58.91
another one -- OK.
639
00:38:58.91 --> 00:39:05
Here's a true false,
so we can do that,
640
00:39:05 --> 00:39:06
easily.
641
00:39:06 --> 00:39:06
So.
642
00:39:06 --> 00:39:11
Question: A is given.
643
00:39:11 --> 00:39:14
A is symmetric and orthogonal.
644
00:39:14 --> 00:39:15
OK.
645
00:39:15 --> 00:39:21
So beautiful matrices like that
don't come along every day.
646
00:39:21 --> 00:39:26
But what can we say first about
its eigenvalues?
647
00:39:26 --> 00:39:28
Actually, of course.
648
00:39:28 --> 00:39:34
Here are our two most important
classes of matrices,
649
00:39:34 --> 00:39:39
and we're looking at the
intersection.
650
00:39:39 --> 00:39:44
So those really are neat
matrices, and what can you tell
651
00:39:44 --> 00:39:48
me about what could the possible
eigenvalues be?
652
00:39:48 --> 00:39:51
Eigenvalues can be what?
653
00:39:51 --> 00:39:55.55
What do I know about the
eigenvalues of a symmetric
654
00:39:55.55 --> 00:39:56.17
matrix?
655
00:39:56.17 --> 00:39:57
Lambda is real.
656
00:39:57 --> 00:40:02
What do I know about the
eigenvalues of an orthogonal
657
00:40:02 --> 00:40:03
matrix?
658
00:40:03 --> 00:40:04
Ha.
659
00:40:04 --> 00:40:05
Maybe nothing.
660
00:40:05 --> 00:40:08.23
But, no, that can't be.
661
00:40:08.23 --> 00:40:14
What do I know about the
eigenvalues of an orthogonal
662
00:40:14 --> 00:40:14
matrix?
663
00:40:14 --> 00:40:17
Well, what feels right?
664
00:40:17 --> 00:40:23
Basing mathematics on just a
little gut instinct here,
665
00:40:23 --> 00:40:29
the eigenvalues of an
orthogonal matrix ought to have
666
00:40:29 --> 00:40:31
magnitude one.
667
00:40:31 --> 00:40:38
Orthogonal matrices are like
rotations, they're not changing
668
00:40:38 --> 00:40:44
the length, so orthogonal,
the eigenvalues have magnitude one.
669
00:40:44 --> 00:40:47.16
Let me just show you why.
670
00:40:47.16 --> 00:40:47
Why?
671
00:40:47 --> 00:40:52
So the matrix,
can I call it Q for orthogonal
672
00:40:52 --> 00:40:55
for the moment?
673
00:40:55 --> 00:40:59
If I look at Q x equal lambda
x, how do I see that this thing
674
00:40:59 --> 00:41:01
has magnitude one?
675
00:41:01 --> 00:41:03
I take the length of both
sides.
676
00:41:03 --> 00:41:06
This is taking lengths,
taking lengths,
677
00:41:06 --> 00:41:10
this is whatever the magnitude
is times the length of x.
678
00:41:10 --> 00:41:14
And what's the length of Q x if
Q is an orthogonal matrix?
679
00:41:14 --> 00:41:16
This is something you should
know.
680
00:41:16 --> 00:41:20
It's the same as the length of
x.
681
00:41:20 --> 00:41:24
Orthogonal matrices don't
change lengths.
682
00:41:24 --> 00:41:26
So the magnitude of lambda has to be one.
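A quick numerical check of that length argument — the rotation matrix Q here is just one convenient orthogonal example:

```python
import numpy as np

theta = 0.7
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

assert np.allclose(Q.T @ Q, np.eye(2))          # Q is orthogonal
eigvals = np.linalg.eigvals(Q)                  # the complex pair e^{+-i theta}
assert np.allclose(np.abs(eigvals), 1.0)        # every |lambda| is one
```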
683
00:41:26 --> 00:41:27
Right.
684
00:41:27 --> 00:41:27
OK.
685
00:41:27 --> 00:41:33
That's worth committing to
memory, that could show up
686
00:41:33 --> 00:41:33.99
again.
687
00:41:33.99 --> 00:41:34
OK.
688
00:41:34 --> 00:41:38
So what's the answer now to
this question,
689
00:41:38 --> 00:41:42
what can the eigenvalues be?
690
00:41:42 --> 00:41:48
There's only two possibilities,
and they are one and the other
691
00:41:48 --> 00:41:52
one, the other possibility is
negative one,
692
00:41:52 --> 00:41:56
right, because these have the
right magnitude,
693
00:41:56 --> 00:41:58
and they're real.
694
00:41:58 --> 00:41:58
OK.
695
00:41:58 --> 00:41:59
OK.
696
00:41:59 --> 00:42:01
True or false?
697
00:42:01 --> 00:42:04
A is sure to be positive
definite.
698
00:42:04 --> 00:42:08
Well, this is a great matrix,
but
699
00:42:08 --> 00:42:11
is it sure to be positive
definite?
700
00:42:11 --> 00:42:11
No.
701
00:42:11 --> 00:42:16.12
If it could have an eigenvalue
minus one, it wouldn't be
702
00:42:16.12 --> 00:42:17.58
positive definite.
703
00:42:17.58 --> 00:42:21
True or false,
it has no repeated eigenvalues.
704
00:42:21 --> 00:42:22
That's false,
too.
705
00:42:22 --> 00:42:27
In fact, it's going to have
repeated eigenvalues if it's as
706
00:42:27 --> 00:42:29.84
big
as three by three,
707
00:42:29.84 --> 00:42:33.6
one of these,
at least, will have to get
708
00:42:33.6 --> 00:42:34
repeated.
709
00:42:34 --> 00:42:34
Sure.
710
00:42:34 --> 00:42:37
So it's got repeated
eigenvalues, but,
711
00:42:37 --> 00:42:38
is it diagonalizable?
712
00:42:38 --> 00:42:41
It's got these many,
many, repeated eigenvalues.
713
00:42:41 --> 00:42:45
If it's fifty by fifty,
it's certainly got a lot of
714
00:42:45 --> 00:42:46
repetitions.
715
00:42:46 --> 00:42:48
Is it diagonalizable?
716
00:42:48 --> 00:42:48
Yes.
717
00:42:48 --> 00:42:53
All symmetric matrices,
all orthogonal matrices can be
718
00:42:53 --> 00:42:54
diagonalized.
719
00:42:54 --> 00:42:57
And, in fact,
the eigenvectors can even be
720
00:42:57 --> 00:42:58.67
chosen orthogonal.
721
00:42:58.67 --> 00:43:00
So it could be,
sort of, like,
722
00:43:00 --> 00:43:06
diagonalized the best way with
a Q, and not just any old S.
723
00:43:06 --> 00:43:06
OK.
724
00:43:06 --> 00:43:09
Is it non-singular?
725
00:43:09 --> 00:43:15.18
Is a symmetric orthogonal
matrix non-singular?
726
00:43:15.18 --> 00:43:15
Sure.
727
00:43:15 --> 00:43:21
Orthogonal matrices are always
non-singular.
728
00:43:21 --> 00:43:26.24
And, obviously,
we don't have any zero
729
00:43:26.24 --> 00:43:27
eigenvalues.
730
00:43:27 --> 00:43:31
Is it sure to be
diagonalizable?
731
00:43:31 --> 00:43:34
Yes.
732
00:43:34 --> 00:43:42
Now, here's a final step --
show that one-half of A plus I
733
00:43:42 --> 00:43:48
is a projection matrix -- that is,
prove one-half of A plus I is a
734
00:43:48 --> 00:43:51
projection matrix.
735
00:43:51 --> 00:43:51
OK?
736
00:43:51 --> 00:43:53.41
Let's see.
737
00:43:53.41 --> 00:43:55
What do I do?
738
00:43:55 --> 00:43:59
I could see two ways to do
this.
739
00:43:59 --> 00:44:07
I could check the properties of
a projection matrix,
740
00:44:07 --> 00:44:11
which are what?
741
00:44:11 --> 00:44:13
A projection matrix is
symmetric.
742
00:44:13 --> 00:44:18
Well, that's certainly
symmetric, because A is.
743
00:44:18 --> 00:44:20
And what's the other property?
744
00:44:20 --> 00:44:24
I should square it,
and hopefully get the same
745
00:44:24 --> 00:44:25
thing back.
746
00:44:25 --> 00:44:29
So can I do that,
square and see if I get the
747
00:44:29 --> 00:44:32
same thing back?
748
00:44:32 --> 00:44:37
So if I square it,
I'll get one-quarter of A
749
00:44:37 --> 00:44:41
squared plus two A plus I,
right?
750
00:44:41 --> 00:44:47.43
And the question is,
does that agree with the
751
00:44:47.43 --> 00:44:49
thing itself?
752
00:44:49 --> 00:44:51
One-half A plus I.
753
00:44:51 --> 00:44:51
Hm.
754
00:44:51 --> 00:44:57
I guess I'd like to know
something about A squared.
755
00:44:57 --> 00:45:00
What is A squared?
756
00:45:00 --> 00:45:04
That's our problem.
757
00:45:04 --> 00:45:05
What is A squared?
758
00:45:05 --> 00:45:11
If A is symmetric and
orthogonal, A is symmetric and
759
00:45:11 --> 00:45:12
orthogonal.
760
00:45:12 --> 00:45:15
This is what we're given,
right?
761
00:45:15 --> 00:45:19
It's symmetric,
and it's orthogonal.
762
00:45:19 --> 00:45:21
So what's A squared?
763
00:45:21 --> 00:45:21
I.
764
00:45:21 --> 00:45:26.13
A squared is I,
because A -- being symmetric and
765
00:45:26.13 --> 00:45:31
orthogonal -- equals its own inverse,
so A times A is the same as A
766
00:45:31 --> 00:45:36
times A inverse,
which is I.
767
00:45:36 --> 00:45:39
So this A squared here is I.
768
00:45:39 --> 00:45:42
And now we've got it.
769
00:45:42 --> 00:45:48
We've got two identities over
four, that's good,
770
00:45:48 --> 00:45:54
and we've got two As over four,
that's good.
771
00:45:54 --> 00:45:54
OK.
772
00:45:54 --> 00:46:02
So it turned out to be a
projection matrix safely.
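A concrete check, using the exchange matrix as one handy symmetric orthogonal A (my choice of example, not the lecture's): one-half of A plus I really is a projection.

```python
import numpy as np

A = np.array([[0.0, 1.0],
              [1.0, 0.0]])
assert np.allclose(A, A.T)                      # symmetric
assert np.allclose(A.T @ A, np.eye(2))          # orthogonal
assert np.allclose(A @ A, np.eye(2))            # hence A^2 = I

P = (A + np.eye(2)) / 2
assert np.allclose(P, P.T)                      # projections are symmetric
assert np.allclose(P @ P, P)                    # and idempotent: P^2 = P
assert np.allclose(np.linalg.eigvalsh(P), [0.0, 1.0])   # eigenvalues 0 and 1
```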
773
00:46:02 --> 00:46:07
And we could also have said,
well, what are the eigenvalues
774
00:46:07 --> 00:46:08
of this thing?
775
00:46:08 --> 00:46:12
What are the eigenvalues of a
half A plus I?
776
00:46:12 --> 00:46:16
If the eigenvalues of A are one
and minus one,
777
00:46:16 --> 00:46:19
what are the eigenvalues of A
plus I?
778
00:46:19 --> 00:46:24
Just stay with it these last
thirty seconds here.
779
00:46:24 --> 00:46:28
What if I know these
eigenvalues of A,
780
00:46:28 --> 00:46:32
and I add the identity,
the eigenvalues of A plus I are
781
00:46:32 --> 00:46:34
zero and two.
782
00:46:34 --> 00:46:39
And then when I divide by two,
the eigenvalues are zero and
783
00:46:39 --> 00:46:39
one.
784
00:46:39 --> 00:46:43
So it's symmetric,
it's got the right eigenvalues,
785
00:46:43 --> 00:46:47
it's a projection matrix.
786
00:46:47 --> 00:46:51
OK, you're seeing a lot of
stuff about eigenvalues,
787
00:46:51 --> 00:46:55
and special matrices,
and that's what the quiz is
788
00:46:55 --> 00:46:56
about.
789
00:46:56 --> 00:46:59
OK, so good luck on the quiz.