1
00:00:00 --> 00:00:01
2
00:00:01 --> 00:00:02
The following content is
provided under a Creative
3
00:00:02 --> 00:00:03
Commons license.
4
00:00:03 --> 00:00:06
Your support will help MIT
OpenCourseWare continue to
5
00:00:06 --> 00:00:10
offer high-quality educational
resources for free.
6
00:00:10 --> 00:00:13
To make a donation, or to view
additional materials from
7
00:00:13 --> 00:00:16
hundreds of MIT courses, visit
MIT OpenCourseWare
8
00:00:16 --> 00:00:20
at ocw.mit.edu.
9
00:00:20 --> 00:00:22
PROFESSOR STRANG: Actually,
two things to say
10
00:00:22 --> 00:00:23
about eigenvalues.
11
00:00:23 --> 00:00:29
One is about matrices in
general and then the second is
12
00:00:29 --> 00:00:32
to focus on our favorites,
those second derivatives
13
00:00:32 --> 00:00:37
and second differences.
14
00:00:37 --> 00:00:41
There's a lot to say about
eigenvalues but then we'll
15
00:00:41 --> 00:00:42
have the main ideas.
16
00:00:42 --> 00:00:47
So the central idea of course
is to find these special
17
00:00:47 --> 00:00:52
directions and we expect to
find n directions, n
18
00:00:52 --> 00:00:57
eigenvectors y where this n by
n matrix is acting like a
19
00:00:57 --> 00:01:03
number in each of
those directions.
20
00:01:03 --> 00:01:07
So we have this for n different
y's and each one has its
21
00:01:07 --> 00:01:09
own eigenvalue lambda.
22
00:01:09 --> 00:01:13
And of course the eig command
in MATLAB will find the
23
00:01:13 --> 00:01:15
y's and the lambdas.
24
00:01:15 --> 00:01:21
So it finds the y's and
the lambdas in a matrix.
25
00:01:21 --> 00:01:24
So that's what I'm going to
do now, straightforward.
26
00:01:24 --> 00:01:27
Any time I have n vectors,
so I have n of these y's,
27
00:01:27 --> 00:01:32
I've n y's and n lambdas.
28
00:01:32 --> 00:01:36
Well, if you give me n vectors,
I put them into the columns
29
00:01:36 --> 00:01:38
of a matrix, almost
without thinking.
30
00:01:38 --> 00:01:41
So can I just do that?
31
00:01:41 --> 00:01:44
So there is y_1, the
first eigenvector.
32
00:01:44 --> 00:01:46
That's y_2 to y_n.
33
00:01:47 --> 00:01:50
Ok, that's my
eigenvector matrix.
34
00:01:50 --> 00:01:52
Often I call it S.
35
00:01:52 --> 00:01:54
So I'll stick with that.
36
00:01:54 --> 00:01:59
S will be the
eigenvector matrix.
37
00:01:59 --> 00:02:02
Since these are eigenvectors
I'm going to multiply
38
00:02:02 --> 00:02:05
that matrix by A.
39
00:02:05 --> 00:02:07
That should bring
out the key point.
40
00:02:07 --> 00:02:16
I'm just going to repeat this
which is one at a time by
41
00:02:16 --> 00:02:18
doing them all at once.
42
00:02:18 --> 00:02:23
So what happens if I multiply a
matrix by a bunch of columns?
43
00:02:23 --> 00:02:25
Matrix multiplication
is wonderful.
44
00:02:25 --> 00:02:26
It does the right thing.
45
00:02:26 --> 00:02:30
It multiplies A times
the first column.
46
00:02:30 --> 00:02:31
So let's put that there.
47
00:02:31 --> 00:02:37
A times the first column along
to A times the last column.
48
00:02:37 --> 00:02:39
Just column by column.
49
00:02:39 --> 00:02:41
But now we recognize these.
50
00:02:41 --> 00:02:43
They're special y's.
51
00:02:43 --> 00:02:45
They're special because
they're eigenvectors.
52
00:02:45 --> 00:02:52
So this is lambda_1*y_1 along
to that column is lambda_n*y_n.
53
00:02:52 --> 00:02:56
Right?
54
00:02:56 --> 00:02:59
Now I've used the fact that
they were eigenvectors.
55
00:02:59 --> 00:03:03
And now, one final neat step of
matrix multiplication is
56
00:03:03 --> 00:03:10
to factor out this same
eigenvector matrix again and
57
00:03:10 --> 00:03:15
realize, and I'll look at it,
that it's being multiplied by
58
00:03:15 --> 00:03:23
this diagonal that's now a
diagonal matrix of eigenvalues.
59
00:03:23 --> 00:03:26
So let's just look at that
very last step here.
60
00:03:26 --> 00:03:29
Here I had the first
column was lambda_1*y_1.
61
00:03:29 --> 00:03:31
62
00:03:31 --> 00:03:35
I just want to see,
did I get that right?
63
00:03:35 --> 00:03:39
If I'm looking at the first
column where that lambda_1 is
64
00:03:39 --> 00:03:43
sitting, it's going to multiply
y_1 and it'll be all zeroes
65
00:03:43 --> 00:03:47
below so I'll have none of
the other eigenvectors.
66
00:03:47 --> 00:03:51
So I'll have lambda_1*y_1,
just what I want.
67
00:03:51 --> 00:03:55
Got a little squeezed near
the end there, but so
68
00:03:55 --> 00:03:57
let me write above.
69
00:03:57 --> 00:04:03
The result is just A times this
eigenvector matrix that I'm
70
00:04:03 --> 00:04:10
going to call S equals what?
71
00:04:10 --> 00:04:15
This is Ay=lambda*y
for all n at once.
72
00:04:15 --> 00:04:18
A times S is, what
have I got here?
73
00:04:18 --> 00:04:19
What's this?
74
00:04:19 --> 00:04:20
That's S.
75
00:04:20 --> 00:04:22
And what's the other guy?
76
00:04:22 --> 00:04:25
That's the eigenvalue matrix.
77
00:04:25 --> 00:04:27
So it's just got n numbers.
78
00:04:27 --> 00:04:30
They automatically go on
the diagonal and it gets
79
00:04:30 --> 00:04:32
called capital Lambda.
80
00:04:32 --> 00:04:37
Capital Lambda for the matrix,
little lambda for the numbers.
81
00:04:37 --> 00:04:42
So this is n, this
is all n at once.
82
00:04:42 --> 00:04:46
Straightforward.
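[The all-n-at-once identity A*S = S*Lambda described above is easy to check numerically. This is a sketch in NumPy, whose eig behaves like the MATLAB eig command mentioned in the lecture; the 3-by-3 matrix here is an assumed example, not one from the board.]

```python
import numpy as np

# An arbitrary symmetric 3x3 example matrix (assumed for illustration).
A = np.array([[2.0, 1.0, 0.0],
              [1.0, 2.0, 1.0],
              [0.0, 1.0, 2.0]])

# eig returns the eigenvalues and the eigenvector matrix S,
# with one eigenvector per column, just as in MATLAB.
lambdas, S = np.linalg.eig(A)
Lambda = np.diag(lambdas)   # eigenvalue matrix: lambdas on the diagonal

# A @ S multiplies A into each column: column j becomes lambda_j * y_j,
# which is exactly what S @ Lambda produces column by column.
assert np.allclose(A @ S, S @ Lambda)
```

[Multiplying S by the diagonal Lambda on the right scales each column of S by its own lambda, which is why the eigenvalue matrix must sit on the right.]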
83
00:04:46 --> 00:04:51
Now I'm going to assume that
I've got these n eigenvectors,
84
00:04:51 --> 00:04:55
that I've been able to find
n independent directions.
85
00:04:55 --> 00:04:57
And almost always, you can.
86
00:04:57 --> 00:05:01
For every symmetric matrix
you automatically can.
87
00:05:01 --> 00:05:06
So these y's are
independent directions.
88
00:05:06 --> 00:05:10
If those are the columns of a
matrix, yeah, here's a key
89
00:05:10 --> 00:05:12
question about matrices.
90
00:05:12 --> 00:05:17
What can I say about this
matrix S if its n columns
91
00:05:17 --> 00:05:19
are independent?
92
00:05:19 --> 00:05:22
Whatever that, you know, we
haven't focused in careful
93
00:05:22 --> 00:05:23
detail, but we have an idea.
94
00:05:23 --> 00:05:28
That means sort of none of them
are combinations of the others.
95
00:05:28 --> 00:05:30
We really have n
separate directions.
96
00:05:30 --> 00:05:33
Then that matrix is?
97
00:05:33 --> 00:05:35
Invertible.
98
00:05:35 --> 00:05:39
A matrix that's got n
columns, independent, that's
99
00:05:39 --> 00:05:41
what we're hoping for.
100
00:05:41 --> 00:05:42
That matrix has an inverse.
101
00:05:42 --> 00:05:47
We can produce, well all the
good facts about matrices.
102
00:05:47 --> 00:05:48
This is a square matrix.
103
00:05:48 --> 00:05:50
So I can invert it if you like.
104
00:05:50 --> 00:05:53
And I can write A as S lambda.
105
00:05:53 --> 00:05:56
I'm multiplying on the
right by this S inverse
106
00:05:56 --> 00:06:03
And there I have the
diagonalization of a matrix.
107
00:06:03 --> 00:06:06
The matrix has been
diagonalized.
108
00:06:06 --> 00:06:07
And what does that mean?
109
00:06:07 --> 00:06:13
Well this is, of course the
diagonal that we're headed for.
110
00:06:13 --> 00:06:18
And what it means is that if I
look at my matrix and I
111
00:06:18 --> 00:06:23
separate out the different
eigendirections, I could say,
112
00:06:23 --> 00:06:27
that the matrix in those
directions is just
113
00:06:27 --> 00:06:28
this diagonal matrix.
114
00:06:28 --> 00:06:36
So that's a short
way of saying it.
115
00:06:36 --> 00:06:40
Let me just carry
one step further.
116
00:06:40 --> 00:06:44
What would A squared be?
117
00:06:44 --> 00:06:50
Well now that I have it in this
cool form, S*lambda*S inverse,
118
00:06:50 --> 00:06:53
I would multiply two of those
together and what
119
00:06:53 --> 00:06:55
would I learn?
120
00:06:55 --> 00:06:59
If I do that multiplication
what comes out?
121
00:06:59 --> 00:07:02
First an S from here.
122
00:07:02 --> 00:07:04
And then what?
123
00:07:04 --> 00:07:05
Lambda squared.
124
00:07:05 --> 00:07:07
Why lambda squared?
125
00:07:07 --> 00:07:11
Because in the middle is the
S S inverse that's giving
126
00:07:11 --> 00:07:14
the identity matrix.
127
00:07:14 --> 00:07:17
So then the lambda multiplies
the lambda and now
128
00:07:17 --> 00:07:18
here is S inverse.
129
00:07:18 --> 00:07:21
Well A squared is S*lambda
squared*S inverse.
130
00:07:22 --> 00:07:25
What does that
tell me in words?
131
00:07:25 --> 00:07:31
That tells me that the
eigenvectors of A squared are?
132
00:07:31 --> 00:07:33
The same.
133
00:07:33 --> 00:07:34
As for A.
134
00:07:34 --> 00:07:39
And it tells me that the
eigenvalues of A squared are?
135
00:07:39 --> 00:07:41
The squares.
136
00:07:41 --> 00:07:43
So I could do this.
137
00:07:43 --> 00:07:45
Maybe I did it before,
one at a time.
138
00:07:45 --> 00:07:49
Ay=lambda*y, multiply
again by A.
139
00:07:49 --> 00:07:56
A squared*y is lambda*Ay,
but Ay is lambda*y so I'm
140
00:07:56 --> 00:07:57
up to lambda squared*y.
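[The claim just made, that A squared keeps the same eigenvectors and squares the eigenvalues, can be checked directly. A minimal sketch, assuming an arbitrary 2-by-2 symmetric example:]

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])          # assumed example; eigenvalues 3 and 1

lambdas, S = np.linalg.eig(A)
lambdas2, S2 = np.linalg.eig(A @ A)

# Eigenvalues of A^2 are the squares of the eigenvalues of A.
assert np.allclose(sorted(lambdas2), sorted(lambdas**2))

# Each eigenvector y of A is still an eigenvector of A^2,
# now with eigenvalue lambda^2.
for lam, y in zip(lambdas, S.T):
    assert np.allclose((A @ A) @ y, lam**2 * y)
```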
141
00:07:59 --> 00:08:03
You should just see that when
you've got these directions
142
00:08:03 --> 00:08:06
then your matrix
is really simple.
143
00:08:06 --> 00:08:09
Effectively it's a
diagonal matrix in
144
00:08:09 --> 00:08:11
these good directions.
145
00:08:11 --> 00:08:16
So that just shows
one way of seeing.
146
00:08:16 --> 00:08:18
And of course what
about A inverse?
147
00:08:18 --> 00:08:20
We might as well
mention A inverse.
148
00:08:20 --> 00:08:24
Suppose A is invertible.
149
00:08:24 --> 00:08:28
Then what do I learn
about A inverse?
150
00:08:28 --> 00:08:32
Can I just invert that?
151
00:08:32 --> 00:08:35
I'm just playing with that
formula, so you'll kind of,
152
00:08:35 --> 00:08:39
like, get handy with it.
153
00:08:39 --> 00:08:42
What would the inverse be if
I have three things in a
154
00:08:42 --> 00:08:45
row multiplied together?
155
00:08:45 --> 00:08:48
What's the inverse?
156
00:08:48 --> 00:08:50
So I'm going to take
the inverses in the
157
00:08:50 --> 00:08:52
opposite order, right?
158
00:08:52 --> 00:08:56
So the inverse of that
will come first.
159
00:08:56 --> 00:08:58
So what's that?
160
00:08:58 --> 00:09:00
Just S.
161
00:09:00 --> 00:09:05
The lambda in the middle gets
inverted and then the S at
162
00:09:05 --> 00:09:09
the left, its inverse
comes at the right.
163
00:09:09 --> 00:09:13
Well what do I learn from that?
164
00:09:13 --> 00:09:19
I learn that the eigenvector
matrix for A inverse is?
165
00:09:19 --> 00:09:21
Same thing, again.
166
00:09:21 --> 00:09:22
Same.
167
00:09:22 --> 00:09:24
Let me put just Same.
168
00:09:24 --> 00:09:31
What's the eigenvalue
matrix for A inverse?
169
00:09:31 --> 00:09:37
It's the inverse of this guy,
so what does it look like?
170
00:09:37 --> 00:09:39
It's got one over lambdas.
171
00:09:39 --> 00:09:42
That's all it says.
172
00:09:42 --> 00:09:47
The eigenvalues for A inverse
are just one over the
173
00:09:47 --> 00:09:49
eigenvalues for A.
174
00:09:49 --> 00:09:54
If that is so, and it can't be
difficult, we could again, we
175
00:09:54 --> 00:10:01
could prove it sort of
like, one at a time.
176
00:10:01 --> 00:10:03
This is my starting
point, always.
177
00:10:03 --> 00:10:07
How would I get to A inverse
now and recover this fact that
178
00:10:07 --> 00:10:13
the eigenvalues for the
inverse, just turn them up.
179
00:10:13 --> 00:10:17
If A has an eigenvalue
seven, A inverse will
180
00:10:17 --> 00:10:20
have an eigenvalue 1/7.
181
00:10:20 --> 00:10:23
What do I do?
182
00:10:23 --> 00:10:26
Usually multiply both sides
by something sensible.
183
00:10:26 --> 00:10:28
Right?
184
00:10:28 --> 00:10:31
What shall I multiply
both sides by?
185
00:10:31 --> 00:10:34
A inverse sounds like
a good idea, right.
186
00:10:34 --> 00:10:37
So I'm multiplying both sides
by A inverse, so that just
187
00:10:37 --> 00:10:40
leaves y and here is that
number, here is A
188
00:10:40 --> 00:10:44
inverse times y.
189
00:10:44 --> 00:10:46
Well, maybe I should
do one more thing.
190
00:10:46 --> 00:10:48
What else shall I do?
191
00:10:48 --> 00:10:52
Divide by lambda.
192
00:10:52 --> 00:10:55
Take that number lambda and put
it over here as one lambda.
193
00:10:55 --> 00:11:00
Well, just exactly what
we're looking for.
194
00:11:00 --> 00:11:06
The same y, the
same y, is an eigenvector of A
195
00:11:06 --> 00:11:10
inverse and the eigenvalue
is one over lambda.
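[The one-at-a-time argument above can be confirmed numerically: A inverse keeps the eigenvectors y and flips each eigenvalue to one over lambda. A sketch with an assumed invertible example:]

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])          # invertible: eigenvalues 3 and 1, none zero

lambdas, S = np.linalg.eig(A)
A_inv = np.linalg.inv(A)

# Same eigenvectors, reciprocal eigenvalues: A^-1 y = (1/lambda) y.
for lam, y in zip(lambdas, S.T):
    assert np.allclose(A_inv @ y, (1.0 / lam) * y)
```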
196
00:11:10 --> 00:11:14
Oh, and of course, I should
have said before I inverted
197
00:11:14 --> 00:11:19
anything, what should I have
said about the lambdas?
198
00:11:19 --> 00:11:21
Not zero.
199
00:11:21 --> 00:11:22
Right?
200
00:11:22 --> 00:11:28
A zero eigenvalue is a signal
the matrix isn't invertible.
201
00:11:28 --> 00:11:32
So that's a perfect test.
202
00:11:32 --> 00:11:35
If the matrix is invertible,
all its eigenvalues
203
00:11:35 --> 00:11:37
are not zero.
204
00:11:37 --> 00:11:40
If it's singular, it's
got a zero eigenvalue.
205
00:11:40 --> 00:11:47
If a matrix is singular, then
Ay would be 0y for some,
206
00:11:47 --> 00:11:49
there'd be a vector that
that matrix kills.
207
00:11:49 --> 00:11:52
If A is not invertible,
there's a reason for it.
208
00:11:52 --> 00:11:56
It's because it takes
some vector to zero, and
209
00:11:56 --> 00:12:01
of course, you can't
bring it back to life.
210
00:12:01 --> 00:12:03
So shall I just
put that up here?
211
00:12:03 --> 00:12:08
Lambda=0 would tell me I
have a singular matrix.
212
00:12:08 --> 00:12:12
All lambda not equal zero
would tell me I have
213
00:12:12 --> 00:12:17
an invertible matrix.
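[The test just stated, zero eigenvalue means singular, is also easy to see in code. A sketch with an assumed singular example whose second row is three times the first, so it kills a vector:]

```python
import numpy as np

# Singular by construction: row 2 = 3 * row 1.
B = np.array([[1.0, 2.0],
              [3.0, 6.0]])

lambdas = np.linalg.eigvals(B)

# One eigenvalue is zero: the signal that B is not invertible.
assert np.isclose(np.min(np.abs(lambdas)), 0.0)

# There is a vector the matrix kills: B @ (2, -1) = 0,
# so B y = 0 * y and y can't be "brought back to life".
assert np.allclose(B @ np.array([2.0, -1.0]), 0.0)
```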
214
00:12:17 --> 00:12:19
These are
straightforward facts.
215
00:12:19 --> 00:12:24
It's taken down in this
row and it's just really
216
00:12:24 --> 00:12:30
handy to have up here.
217
00:12:30 --> 00:12:33
Well now I'm ready to
move toward the specific
218
00:12:33 --> 00:12:35
matrices, our favorites.
219
00:12:35 --> 00:12:38
Now, those are symmetric.
220
00:12:38 --> 00:12:43
So maybe before I leave this
picture, we better recall
221
00:12:43 --> 00:12:46
what is special when the
matrix is symmetric.
222
00:12:46 --> 00:12:49
So that's going to
be the next thing.
223
00:12:49 --> 00:12:53
So if A is symmetric I get
some extra good things.
224
00:12:53 --> 00:12:58
So let me take instead
of A, I'll use K.
225
00:12:58 --> 00:13:02
So that'll be my letter
for the best matrices.
226
00:13:02 --> 00:13:07
So symmetric.
227
00:13:07 --> 00:13:10
So now what's the deal
with symmetric matrices?
228
00:13:10 --> 00:13:12
The eigenvalues, the lambdas.
229
00:13:12 --> 00:13:16
I'll just call them the
lambdas and the y's.
230
00:13:16 --> 00:13:20
The lambdas are, do you
remember from last time?
231
00:13:20 --> 00:13:25
If I have a symmetric matrix,
the eigenvalues are all?
232
00:13:25 --> 00:13:26
Anybody remember?
233
00:13:26 --> 00:13:28
They're all real numbers.
234
00:13:28 --> 00:13:32
You can never run into complex
eigenvalues if you start
235
00:13:32 --> 00:13:34
with a symmetric matrix.
236
00:13:34 --> 00:13:39
We didn't prove that but it's
just a few steps like those.
237
00:13:39 --> 00:13:42
And what about, most important,
what about the y's?
238
00:13:42 --> 00:13:44
The eigenvectors.
239
00:13:44 --> 00:13:48
They are, or can be chosen
to be, or whatever, anybody
240
00:13:48 --> 00:13:50
remember that fact?
241
00:13:50 --> 00:13:53
These are, like,
the golden facts.
242
00:13:53 --> 00:14:02
Every sort of bunch of matrices
reveals itself through what its
243
00:14:02 --> 00:14:05
eigenvalues are like and what
its eigenvectors are like.
244
00:14:05 --> 00:14:08
And the most important class is
symmetric and that reveals
245
00:14:08 --> 00:14:13
itself through real eigenvalues
and? orthogonal, good.
246
00:14:13 --> 00:14:18
Orthogonal eigenvectors,
orthogonal.
247
00:14:18 --> 00:14:24
And in fact, since I'm an
eigenvector, I can adjust
248
00:14:24 --> 00:14:27
its length as I like.
249
00:14:27 --> 00:14:29
Right?
250
00:14:29 --> 00:14:32
If y is an eigenvector, 11y is
an eigenvector because I would
251
00:14:32 --> 00:14:35
just multiply both sides by 11.
252
00:14:35 --> 00:14:40
That whole line of eigenvectors
is getting stretched by lambda.
253
00:14:40 --> 00:14:45
So what I want to do is
make them unit vectors.
254
00:14:45 --> 00:14:50
MATLAB will automatically
produce, eig would
255
00:14:50 --> 00:15:00
automatically give you vectors
that have been normalized to unit length.
256
00:15:00 --> 00:15:03
Here's something good.
257
00:15:03 --> 00:15:05
So what does orthogonal mean?
258
00:15:05 --> 00:15:09
That means that one of them,
the dot product of one of
259
00:15:09 --> 00:15:14
them with another one is?
260
00:15:14 --> 00:15:17
Now that's not, I didn't
do the dot product yet.
261
00:15:17 --> 00:15:21
What symbol do I have to
write on left-hand side?
262
00:15:21 --> 00:15:23
Well you could say,
just put a dot.
263
00:15:23 --> 00:15:25
Of course.
264
00:15:25 --> 00:15:34
But dots are not cool, right?
265
00:15:34 --> 00:15:37
So maybe I should say
inner product, that's the
266
00:15:37 --> 00:15:41
more upper class word.
267
00:15:41 --> 00:15:44
But I want to use transpose.
268
00:15:44 --> 00:15:45
So it's the transpose.
269
00:15:45 --> 00:15:47
That's the dot product.
270
00:15:47 --> 00:15:50
And that's the test
for perpendicular.
271
00:15:50 --> 00:15:51
So what's the answer then?
272
00:15:51 --> 00:15:55
I get a zero if i is
different from j.
273
00:15:55 --> 00:15:59
If I'm taking two different
eigenvectors and I take their
274
00:15:59 --> 00:16:02
dot product, that's what you
told me, they're orthogonal.
275
00:16:02 --> 00:16:05
And now what if i equals j?
276
00:16:05 --> 00:16:09
If I'm taking the dot
product with itself, each
277
00:16:09 --> 00:16:11
eigenvector with itself.
278
00:16:11 --> 00:16:16
So what does the dot product
of a vector with itself give?
279
00:16:16 --> 00:16:19
It'll be one because
they're normalized.
280
00:16:19 --> 00:16:21
Exactly.
281
00:16:21 --> 00:16:24
What it always gives, the dot
product of a vector with
282
00:16:24 --> 00:16:27
itself, you just realize that
that'll be y_1 squared,
283
00:16:27 --> 00:16:31
y_2 squared, it'll be
the length squared.
284
00:16:31 --> 00:16:36
And here we're making
the length to be one.
285
00:16:36 --> 00:16:41
Well once again, if I write
something down like this which
286
00:16:41 --> 00:16:44
is straightforward I want
to express it as a
287
00:16:44 --> 00:16:46
matrix statement.
288
00:16:46 --> 00:16:53
So I want to multiply,
it'll involve my good
289
00:16:53 --> 00:16:55
eigenvector matrix.
290
00:16:55 --> 00:17:00
And this will be what?
291
00:17:00 --> 00:17:03
I want to take all these
dots products at once.
292
00:17:03 --> 00:17:07
I want to take the dot product
of every y with every other y.
293
00:17:07 --> 00:17:08
Well here you go.
294
00:17:08 --> 00:17:14
Just put these guys in the
rows, now that we see that it
295
00:17:14 --> 00:17:19
really was the transpose
multiplying y, do you see
296
00:17:19 --> 00:17:21
that that's just done it?
297
00:17:21 --> 00:17:24
In fact, you'll tell me
what the answer is here.
298
00:17:24 --> 00:17:28
Don't shout it out, but let's
take it two or three entries
299
00:17:28 --> 00:17:31
and then you can shout it out.
300
00:17:31 --> 00:17:37
So what's the 1, 1 entry
here of I guess that's
301
00:17:37 --> 00:17:39
what we called S.
302
00:17:39 --> 00:17:41
And now this would
be its transpose.
303
00:17:41 --> 00:17:44
And what I'm saying is if I
take-- yeah, this is important
304
00:17:44 --> 00:17:50
because throughout this course
we're going to be taking A
305
00:17:50 --> 00:17:55
transpose A, S transpose S, Q
transpose Q, often,
306
00:17:55 --> 00:17:56
often, often.
307
00:17:56 --> 00:17:58
So here we got the
first time at it.
308
00:17:58 --> 00:18:02
So why did I put a zero
there, because it's not it.
309
00:18:02 --> 00:18:03
What is it?
310
00:18:03 --> 00:18:06
What is that first entry?
311
00:18:06 --> 00:18:07
One.
312
00:18:07 --> 00:18:11
Because that's the row times
the column, that's a one.
313
00:18:11 --> 00:18:13
And what's the
entry next to it?
314
00:18:13 --> 00:18:14
Zero.
315
00:18:14 --> 00:18:19
Right? y_1, dot product with
y_2 is, we're saying, zero.
316
00:18:19 --> 00:18:23
So what matrix have I got here?
317
00:18:23 --> 00:18:24
I've got the identity.
318
00:18:24 --> 00:18:28
Because y_2 with y_2 will
put a one there and
319
00:18:28 --> 00:18:30
all zeroes elsewhere.
320
00:18:30 --> 00:18:31
Zero, zero.
321
00:18:31 --> 00:18:34
And y_3 times y_3
will be the one.
322
00:18:34 --> 00:18:40
I get the identity.
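[The identity that just appeared on the board, S transpose times S equals I for the eigenvectors of a symmetric matrix, can be verified numerically. A sketch, assuming a small symmetric example; NumPy's eigh is the symmetric eigensolver and returns orthonormal eigenvector columns:]

```python
import numpy as np

# A small symmetric example (assumed for illustration).
K = np.array([[ 2.0, -1.0,  0.0],
              [-1.0,  2.0, -1.0],
              [ 0.0, -1.0,  2.0]])

# eigh: real eigenvalues, orthonormal eigenvector columns.
lambdas, Q = np.linalg.eigh(K)

# Every dot product y_i . y_j is 0 for i != j and 1 for i == j,
# so the whole matrix of dot products, Q^T Q, is the identity.
assert np.allclose(Q.T @ Q, np.eye(3))
```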
323
00:18:40 --> 00:18:48
So this is for
symmetric matrices.
324
00:18:48 --> 00:18:53
In general, we can't expect the
eigenvectors to be orthogonal.
325
00:18:53 --> 00:18:57
It's these special
ones that are.
326
00:18:57 --> 00:19:02
But they're so important
that we notice.
327
00:19:02 --> 00:19:07
Now so this is the eigenvector
matrix S and this
328
00:19:07 --> 00:19:09
is its transpose.
329
00:19:09 --> 00:19:12
So I'm saying that for
a symmetric matrix, S
330
00:19:12 --> 00:19:16
transpose times S is I.
331
00:19:16 --> 00:19:19
Well that's pretty important.
332
00:19:19 --> 00:19:24
In fact, that's important
enough that I'm going to give
333
00:19:24 --> 00:19:32
an extra name to S, the
eigenvector matrix when it
334
00:19:32 --> 00:19:36
comes from a symmetric matrix,
when it has a matrix with S
335
00:19:36 --> 00:19:42
transpose times S equaling the
identity is really a
336
00:19:42 --> 00:19:44
good matrix to know.
337
00:19:44 --> 00:19:50
So let's just focus
on those guys.
338
00:19:50 --> 00:19:52
I can put that up here.
339
00:19:52 --> 00:19:57
So here's a matrix.
340
00:19:57 --> 00:20:00
Can I introduce a
different letter than S?
341
00:20:00 --> 00:20:04
It just helps you to remember
that this remarkable
342
00:20:04 --> 00:20:06
property is in force.
343
00:20:06 --> 00:20:10
That we've got it.
344
00:20:10 --> 00:20:15
So I'm going to call it when K
is a symmetric matrix, I'll
345
00:20:15 --> 00:20:25
just repeat that, then its
eigenvector matrix has this S
346
00:20:25 --> 00:20:32
transpose times S, I'm
going to call it Q.
347
00:20:32 --> 00:20:36
I'm going to call the
eigenvectors, so for this
348
00:20:36 --> 00:20:46
special situation, A times--
So I'm going to call the
349
00:20:46 --> 00:20:48
eigenvector matrix Q.
350
00:20:48 --> 00:20:54
It's the S but it's worth
giving it this special notation
351
00:20:54 --> 00:21:02
to remind us that this is, so
Q is an orthogonal matrix.
352
00:21:02 --> 00:21:06
There's a name for matrices
with this important property.
353
00:21:06 --> 00:21:09
And there's a letter Q
that everybody uses.
354
00:21:09 --> 00:21:12
An orthogonal matrix.
355
00:21:12 --> 00:21:14
And what does that mean?
356
00:21:14 --> 00:21:23
Means just what we said, Q
transpose times Q is I.
357
00:21:23 --> 00:21:28
What I've done here is just
giving a special, introducing a
358
00:21:28 --> 00:21:33
special letter Q, a special
name, orthogonal matrix for
359
00:21:33 --> 00:21:38
what we found here for the
eigenvectors
360
00:21:38 --> 00:21:40
of a symmetric matrix.
361
00:21:40 --> 00:21:44
And this tells me
one thing more.
362
00:21:44 --> 00:21:46
Look what's happening here.
363
00:21:46 --> 00:21:51
Q transpose times Q is
giving the identity.
364
00:21:51 --> 00:21:54
What does that tell me
about the inverse of Q?
365
00:21:54 --> 00:22:00
That tells me here some matrix
is multiplying Q and giving I.
366
00:22:00 --> 00:22:02
So what is this matrix?
367
00:22:02 --> 00:22:06
What's another name
for this Q transpose?
368
00:22:06 --> 00:22:09
Is also Q inverse.
369
00:22:09 --> 00:22:12
Because that's what defines
the inverse matrix: it
370
00:22:12 --> 00:22:14
times Q should give I.
371
00:22:14 --> 00:22:23
So Q transpose is Q inverse.
372
00:22:23 --> 00:22:26
I'm moving along here.
373
00:22:26 --> 00:22:33
Yes, please.
374
00:22:33 --> 00:22:37
The question was, shouldn't I
call it an orthonormal matrix?
375
00:22:37 --> 00:22:40
The answer is yes, I should.
376
00:22:40 --> 00:22:42
But nobody does.
377
00:22:42 --> 00:22:42
Damnit!
378
00:22:42 --> 00:22:45
So I'm stuck with that name.
379
00:22:45 --> 00:22:47
But orthonormal is
the proper name.
380
00:22:47 --> 00:22:51
If you call it an orthonormal
matrix, I'm happy because
381
00:22:51 --> 00:22:54
that's really the right name
for that matrix, orthonormal.
382
00:22:54 --> 00:22:58
Because orthogonal would just
mean orthogonal columns but
383
00:22:58 --> 00:23:02
we've taken this extra little
step to make all
384
00:23:02 --> 00:23:03
the lengths one.
385
00:23:03 --> 00:23:07
And then that gives us
this great property.
386
00:23:07 --> 00:23:09
Q transpose is Q inverse.
387
00:23:09 --> 00:23:14
Orthogonal matrices
are like rotations.
388
00:23:14 --> 00:23:19
I better give an example
of an orthogonal matrix.
389
00:23:19 --> 00:23:20
I'll do it right under here.
390
00:23:20 --> 00:23:22
Here is an orthogonal matrix.
391
00:23:22 --> 00:23:25
So what's the point?
392
00:23:25 --> 00:23:28
It's supposed to be a unit
vector in the first column
393
00:23:28 --> 00:23:30
so I'll put cos(theta),
sin(theta).
394
00:23:31 --> 00:23:34
And now what can go in the
second column of this
395
00:23:34 --> 00:23:36
orthogonal matrix?
396
00:23:36 --> 00:23:40
It's gotta be a unit vector
again because we've normalized
397
00:23:40 --> 00:23:46
and it's gotta be, what's the
connection to the first column?
398
00:23:46 --> 00:23:48
Orthogonal, gotta
be orthogonal.
399
00:23:48 --> 00:23:52
So I just wanted to put
something here that sum of
400
00:23:52 --> 00:23:55
squares is one, so I'll
think cos(theta) and
401
00:23:55 --> 00:23:56
sin(theta) again.
402
00:23:56 --> 00:23:59
But then I've got to flip
them a little to make
403
00:23:59 --> 00:24:01
it orthogonal to this.
404
00:24:01 --> 00:24:06
So if I put minus sin(theta)
there and plus cos(theta)
405
00:24:06 --> 00:24:10
there that certainly
has length one, good.
406
00:24:10 --> 00:24:14
And the dot product, can you
do the dot product of that
407
00:24:14 --> 00:24:15
column with that column?
408
00:24:15 --> 00:24:20
It's minus sine, cosine,
plus sine, cosine, zero.
409
00:24:20 --> 00:24:24
So there is a two by two,
actually that's a fantastic
410
00:24:24 --> 00:24:28
building block out of which you
could build many orthogonal
411
00:24:28 --> 00:24:32
matrices of all sizes.
412
00:24:32 --> 00:24:38
That's a rotation by theta.
413
00:24:38 --> 00:24:41
That's a useful matrix to know.
414
00:24:41 --> 00:24:46
It takes every vector, swings
it around by an angle theta.
415
00:24:46 --> 00:24:47
What do I mean?
416
00:24:47 --> 00:24:53
I mean that Qx, Q times a
vector x rotates x by theta.
417
00:24:53 --> 00:24:55
Let me put it.
418
00:24:55 --> 00:25:01
Qx rotates whatever vector x
you give it, you multiply by Q,
419
00:25:01 --> 00:25:07
it rotates it around by theta,
it doesn't change the length.
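[The rotation matrix just written down is a quick concrete check of both properties: orthonormal columns and length preservation. A sketch, assuming an arbitrary angle and test vector:]

```python
import numpy as np

theta = 0.7                       # any angle works; assumed for illustration
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

# Orthonormal columns: Q^T Q = I, so Q^T is Q^-1.
assert np.allclose(Q.T @ Q, np.eye(2))

# Rotation by theta preserves length: |Qx| = |x| for every x.
x = np.array([3.0, 4.0])
assert np.isclose(np.linalg.norm(Q @ x), np.linalg.norm(x))
```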
420
00:25:07 --> 00:25:13
So that would be an eigenvector
matrix of a pretty
421
00:25:13 --> 00:25:19
typical two by two.
422
00:25:19 --> 00:25:22
I see as I talk about
eigenvectors, eigenvalues
423
00:25:22 --> 00:25:24
there's so much to say.
424
00:25:24 --> 00:25:29
Because everything you know
about a matrix shows up somehow
425
00:25:29 --> 00:25:31
in its eigenvectors and
eigenvalues and we're
426
00:25:31 --> 00:25:38
focusing on symmetric guys.
427
00:25:38 --> 00:25:40
What happens to this
A=S*lambda*S inverse?
428
00:25:42 --> 00:25:42
Let's write that again.
429
00:25:42 --> 00:25:45
Now we've got K.
430
00:25:45 --> 00:25:53
It's S*lambda*S inverse like
any good diagonalization but
431
00:25:53 --> 00:25:59
now I'm giving S a new
name, which is what?
432
00:25:59 --> 00:26:04
Q. because when I give K, when
I use that letter K I'm
433
00:26:04 --> 00:26:09
thinking symmetric so I'm in
this special situation
434
00:26:09 --> 00:26:10
of symmetric.
435
00:26:10 --> 00:26:12
I have the lambda, the
eigenvalue matrix, and
436
00:26:12 --> 00:26:16
here I have Q inverse.
437
00:26:16 --> 00:26:20
But there's another little
way to write it and it's
438
00:26:20 --> 00:26:24
terrifically important
in mechanics and
439
00:26:24 --> 00:26:26
dynamics, everywhere.
440
00:26:26 --> 00:26:27
It's simple now.
441
00:26:27 --> 00:26:29
We know everything.
442
00:26:29 --> 00:26:31
Q lambda what?
443
00:26:31 --> 00:26:37
Q transpose.
444
00:26:37 --> 00:26:40
Do you see the beauty
of that form?
445
00:26:40 --> 00:26:48
That's called the principal
axis theorem in mechanics.
446
00:26:48 --> 00:26:50
It's called the spectral
theorem in mathematics.
447
00:26:50 --> 00:26:55
It's diagonalization, it's
quantum mechanics, everything.
448
00:26:55 --> 00:27:00
Any time you have a symmetric
matrix there's the wonderful
449
00:27:00 --> 00:27:06
statement of how it breaks up
when you look at its
450
00:27:06 --> 00:27:10
orthonormal eigenvectors and
its real eigenvalues.
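[The spectral theorem form K = Q Lambda Q transpose can be reconstructed numerically from the eigenvalues and orthonormal eigenvectors. A sketch with an assumed symmetric 2-by-2 example:]

```python
import numpy as np

K = np.array([[ 2.0, -1.0],
              [-1.0,  2.0]])       # symmetric example (assumed)

lambdas, Q = np.linalg.eigh(K)

# Spectral theorem: K = Q * Lambda * Q^T.  Q^T replaces Q^-1 here
# because the eigenvector matrix of a symmetric matrix is orthogonal.
assert np.allclose(K, Q @ np.diag(lambdas) @ Q.T)
```

[Note how the symmetry of K reappears in the factorization: the third factor is the transpose of the first, just as in K = L D L transpose from elimination.]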
451
00:27:10 --> 00:27:17
Do you see that once again
the symmetry has reappeared
452
00:27:17 --> 00:27:19
in the three factors?
453
00:27:19 --> 00:27:23
The symmetry has reappeared in
the fact that this vector is
454
00:27:23 --> 00:27:25
the transpose of this one.
455
00:27:25 --> 00:27:34
We saw that for elimination
when these were triangular.
456
00:27:34 --> 00:27:40
That makes me remember what we
had in a different context, in
457
00:27:40 --> 00:27:46
the elimination when things
were triangular we had
458
00:27:46 --> 00:27:46
K=L*D*L transpose.
459
00:27:48 --> 00:27:55
I just squeezed that in to ask
you to sort of think of the two
460
00:27:55 --> 00:28:01
as two wonderful pieces of
linear algebra in such a
461
00:28:01 --> 00:28:04
perfect shorthand,
perfect notation.
462
00:28:04 --> 00:28:08
This was triangular times
the pivot matrix times
463
00:28:08 --> 00:28:10
the upper triangular.
464
00:28:10 --> 00:28:13
This is orthogonal times
the eigenvalue matrix
465
00:28:13 --> 00:28:17
times its transpose.
466
00:28:17 --> 00:28:20
And the key point here was
triangular and the key
467
00:28:20 --> 00:28:28
point here is orthogonal.
468
00:28:28 --> 00:28:31
That took some time,
but it had to be done.
469
00:28:31 --> 00:28:33
This is the right
way to understand.
470
00:28:33 --> 00:28:37
That the central theme, it's
a highlight of a linear
471
00:28:37 --> 00:28:42
algebra course and we
just went straight to it.
472
00:28:42 --> 00:28:50
And now what I wanted to do was
look now at the special K.
473
00:28:50 --> 00:28:56
Oh, that's an awful pun.
474
00:28:56 --> 00:29:03
The special matrices that we
have, so those are n by n.
475
00:29:03 --> 00:29:08
And as I said last time,
usually it's not very likely
476
00:29:08 --> 00:29:11
that we find all the
eigenvalues and eigenvectors
477
00:29:11 --> 00:29:16
of this family of bigger
and bigger matrices.
478
00:29:16 --> 00:29:23
So now I'm going to specialize
to my n by n matrix K equals
479
00:29:23 --> 00:29:26
twos down the diagonal,
minus ones above and
480
00:29:26 --> 00:29:30
minus ones below.
481
00:29:30 --> 00:29:32
What are the eigenvalues
of that matrix and what
482
00:29:32 --> 00:29:36
are the eigenvectors?
483
00:29:36 --> 00:29:39
How to tackle that?
484
00:29:39 --> 00:29:45
The best way is the way we've
done with the inverse and other
485
00:29:45 --> 00:29:49
ways of understanding K was to
compare it with the
486
00:29:49 --> 00:29:52
continuous problem.
487
00:29:52 --> 00:29:56
So this is a big matrix which
is a second difference
488
00:29:56 --> 00:29:59
matrix, fixed-fixed.
489
00:29:59 --> 00:30:04
Everybody remembers that the
boundary conditions associated
490
00:30:04 --> 00:30:05
with this are fixed-fixed.
491
00:30:05 --> 00:30:11
I want to ask you to look
at the corresponding
492
00:30:11 --> 00:30:13
differential equation.
493
00:30:13 --> 00:30:17
So you may not have thought
about eigenvectors of
494
00:30:17 --> 00:30:19
differential equations.
495
00:30:19 --> 00:30:22
And maybe I have to call them
eigenfunctions but the idea
496
00:30:22 --> 00:30:24
doesn't change one bit.
497
00:30:24 --> 00:30:29
So what shall I look at?
498
00:30:29 --> 00:30:32
K corresponds to what?
499
00:30:32 --> 00:30:37
Continuous differential
business, what
500
00:30:37 --> 00:30:40
derivative, what?
501
00:30:40 --> 00:30:42
So I would like to
look at Ky=lambda*y.
502
00:30:42 --> 00:30:46
503
00:30:46 --> 00:30:49
I'm looking for the y's and
lambdas and the way I'm going
504
00:30:49 --> 00:30:56
to get them is to look at,
what did you say it was?
505
00:30:56 --> 00:31:00
K, now I'm going to write down
a differential equation that's
506
00:31:00 --> 00:31:05
like this but we'll
solve it quickly.
507
00:31:05 --> 00:31:07
So what will it be?
508
00:31:07 --> 00:31:11
K is like, tell me again.
509
00:31:11 --> 00:31:17
Second derivative of y with
respect to x squared.
510
00:31:17 --> 00:31:20
And there's one more thing
you have to remember.
511
00:31:20 --> 00:31:22
Minus.
512
00:31:22 --> 00:31:25
And here we have lambda*y(x).
513
00:31:25 --> 00:31:32
514
00:31:32 --> 00:31:37
That's an eigenvalue and an
eigenfunction that we're
515
00:31:37 --> 00:31:40
looking at for this
differential equation.
516
00:31:40 --> 00:31:44
Now there's another thing
you have to remember.
517
00:31:44 --> 00:31:48
And you'll know what it
is and you'll tell me.
518
00:31:48 --> 00:31:51
I could look for
all the solutions.
519
00:31:51 --> 00:31:53
Well, let me
momentarily do that.
520
00:31:53 --> 00:32:00
What functions have minus
the second derivative is a
521
00:32:00 --> 00:32:02
multiple of the function?
522
00:32:02 --> 00:32:05
Can you just tell me a few?
523
00:32:05 --> 00:32:07
Sine and cosine.
524
00:32:07 --> 00:32:10
I mean this is a fantastic
eigenvalue problem because
525
00:32:10 --> 00:32:18
its solutions are
sines and cosines.
526
00:32:18 --> 00:32:22
And of course we could combine
them into exponential.
527
00:32:22 --> 00:32:29
We could have sine(omega*x)
or cos(omega*x) or we
528
00:32:29 --> 00:32:33
could combine those into
e^(i*omega*x), would be a
529
00:32:33 --> 00:32:36
combination of those,
or e^(-i*omega*x).
530
00:32:36 --> 00:32:39
531
00:32:39 --> 00:32:45
Those are combinations of
these, so those are not new.
532
00:32:45 --> 00:32:47
We've gotten lots
of eigenfunctions.
533
00:32:47 --> 00:32:51
Oh, for every frequency omega
this solves the equation.
534
00:32:51 --> 00:32:54
What's the eigenvalue?
535
00:32:54 --> 00:32:57
If you guess the eigenfunction
you've got the eigenvalue
536
00:32:57 --> 00:32:59
just by seeing what happens.
537
00:32:59 --> 00:33:05
So what would the
eigenvalue be?
538
00:33:05 --> 00:33:06
Tell me again.
539
00:33:06 --> 00:33:07
Omega squared.
540
00:33:07 --> 00:33:11
Because I take the second
derivative of the sine, that'll
541
00:33:11 --> 00:33:14
give me the cosine, then back to
the sine, and omega squared comes
542
00:33:14 --> 00:33:17
out, omega comes out twice.
543
00:33:17 --> 00:33:19
Comes out with a minus
sign from the cosine and
544
00:33:19 --> 00:33:25
that minus sign is just
right to make it plus.
545
00:33:25 --> 00:33:27
Lambda is omega squared.
546
00:33:27 --> 00:33:29
So omega squared.
547
00:33:29 --> 00:33:31
All the way of course.
548
00:33:31 --> 00:33:38
Those are the eigenvalues.
549
00:33:38 --> 00:33:41
All our differential examples
had something more than just
550
00:33:41 --> 00:33:43
the differential equation.
551
00:33:43 --> 00:33:47
What's the additional
thing that a differential
552
00:33:47 --> 00:33:49
equation comes with?
553
00:33:49 --> 00:33:51
Boundary conditions.
554
00:33:51 --> 00:33:53
With boundary conditions.
555
00:33:53 --> 00:33:55
Otherwise we got too many.
556
00:33:55 --> 00:33:58
I mean we don't want
all of these guys.
557
00:33:58 --> 00:34:02
What boundary conditions, if
we're thinking about K, our
558
00:34:02 --> 00:34:09
boundary conditions should
be fixed and fixed.
559
00:34:09 --> 00:34:11
So that's the full problem.
560
00:34:11 --> 00:34:17
This is part of the problem
not just an afterthought.
561
00:34:17 --> 00:34:21
Now these conditions,
that will be perfect.
562
00:34:21 --> 00:34:25
Instead of having all these
sines and cosines we're
563
00:34:25 --> 00:34:33
going to narrow down to
a family that satisfies
564
00:34:33 --> 00:34:36
the boundary conditions.
565
00:34:36 --> 00:34:39
First boundary condition is
it has to be zero at x=0.
566
00:34:40 --> 00:34:43
What does that eliminate now?
567
00:34:43 --> 00:34:45
Cosines are gone,
keeps the sines.
568
00:34:45 --> 00:34:49
Cosines are gone by that
first boundary condition.
569
00:34:49 --> 00:34:52
These are guys that are left.
570
00:34:52 --> 00:34:57
I won't deal with these at this
point because I'm down to
571
00:34:57 --> 00:35:01
sines already from one
boundary condition.
572
00:35:01 --> 00:35:06
And now, the other
boundary condition.
573
00:35:06 --> 00:35:13
The other boundary condition
has to hold at x=1; if it's going to
574
00:35:13 --> 00:35:16
work, sin(omega*x) has to be?
575
00:35:16 --> 00:35:22
Nope, what do I put now?
sin(omega), right? x is one.
576
00:35:22 --> 00:35:23
I'm plugging in here.
577
00:35:23 --> 00:35:26
I'm just plugging
in x=1 to satisfy.
578
00:35:26 --> 00:35:30
And it has to equal zero.
579
00:35:30 --> 00:35:37
So that means, that
pins down omega.
580
00:35:37 --> 00:35:39
Doesn't give me just one
of it, well tell me one
581
00:35:39 --> 00:35:43
omega that's okay then.
582
00:35:43 --> 00:35:45
The first omega that
occurs to you is?
583
00:35:45 --> 00:35:47
Pi.
584
00:35:47 --> 00:35:48
The sine comes back to zero at pi.
585
00:35:48 --> 00:35:50
So we've got one. y_1.
586
00:35:50 --> 00:35:54
Our first guy is with
omega=pi is sin(pi*x).
587
00:35:54 --> 00:36:00
588
00:36:00 --> 00:36:03
That's our fundamental mode.
589
00:36:03 --> 00:36:08
That's the number
one eigenfunction.
590
00:36:08 --> 00:36:11
And it is an eigenfunction,
it satisfies the
591
00:36:11 --> 00:36:13
boundary condition.
592
00:36:13 --> 00:36:16
Everybody would know its
picture, just one arch
593
00:36:16 --> 00:36:17
of the sine function.
594
00:36:17 --> 00:36:21
And the lambda that goes with
it, lambda_1, so this is the
595
00:36:21 --> 00:36:25
first eigenfunction, what's
the first eigenvalue?
596
00:36:25 --> 00:36:27
Pi squared, right.
597
00:36:27 --> 00:36:29
Because omega, we
took to be pi.
598
00:36:29 --> 00:36:33
So lambda_1 is pi squared.
599
00:36:33 --> 00:36:36
We've got one.
600
00:36:36 --> 00:36:40
We were able to do it
because we could solve this
601
00:36:40 --> 00:36:45
equation in an easy way.
602
00:36:45 --> 00:36:47
Ready for a second one?
603
00:36:47 --> 00:36:49
What will the next one be?
604
00:36:49 --> 00:36:54
The next eigenfunction,
whatever its frequency is,
605
00:36:54 --> 00:36:56
omega, it's got to
have sin(omega)=0.
606
00:36:57 --> 00:37:00
What's your choice?
607
00:37:00 --> 00:37:01
2pi.
608
00:37:01 --> 00:37:04
So the next one is going
to be sin(2pi*x).
609
00:37:06 --> 00:37:10
And what will be the eigenvalue
that goes with that guy?
610
00:37:10 --> 00:37:17
lambda_2 will be omega squared,
which is 2pi squared, 2pi all
611
00:37:17 --> 00:37:20
squared, so that's
four pi squared.
612
00:37:20 --> 00:37:24
You see the whole list.
613
00:37:24 --> 00:37:29
The sines with these correct
frequencies are the
614
00:37:29 --> 00:37:34
eigenfunctions of the second
derivative with fixed-fixed
615
00:37:34 --> 00:37:36
boundary conditions.
616
00:37:36 --> 00:37:39
And this is entirely typical.
617
00:37:39 --> 00:37:43
We don't have just n of them.
618
00:37:43 --> 00:37:45
The list goes on
forever, right?
619
00:37:45 --> 00:37:48
The list goes on forever
because we're talking here
620
00:37:48 --> 00:37:50
about a differential equation.
621
00:37:50 --> 00:37:54
A differential equation's
somehow like a matrix
622
00:37:54 --> 00:37:55
of infinite size.
623
00:37:55 --> 00:38:03
And somehow these sines are
the columns of the infinite
624
00:38:03 --> 00:38:06
size eigenvector matrix.
625
00:38:06 --> 00:38:09
And these numbers, pi squared,
four pi squared, nine pi
626
00:38:09 --> 00:38:15
squared, 16pi squared are the
eigenvalues of the infinite
627
00:38:15 --> 00:38:22
eigenvalue matrix.
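As a quick numerical sketch of this claim (Python with NumPy assumed; the helper `check_eigenfunction` is introduced here just for illustration), we can verify that y_k(x) = sin(k*pi*x) satisfies -y'' = (k*pi)^2 * y:

```python
import numpy as np

# Check that y_k(x) = sin(k*pi*x) satisfies -y'' = lambda_k * y
# with lambda_k = (k*pi)^2, using a central-difference second derivative.
def check_eigenfunction(k, x=0.3, h=1e-5):
    y = lambda t: np.sin(k * np.pi * t)
    ypp = (y(x + h) - 2 * y(x) + y(x - h)) / h**2  # approximate y''(x)
    return -ypp, (k * np.pi) ** 2 * y(x)

for k in (1, 2, 3):
    lhs, rhs = check_eigenfunction(k)
    assert abs(lhs - rhs) < 1e-4  # -y'' agrees with (k*pi)^2 * y
```

Each k on the infinite list gives another eigenfunction, so the loop could run as far as we like.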
628
00:38:22 --> 00:38:24
We got those answers quickly.
629
00:38:24 --> 00:38:31
And let's just mention that if
I changed to free-fixed or
630
00:38:31 --> 00:38:36
to free-free I could repeat.
631
00:38:36 --> 00:38:38
I'd get different y's.
632
00:38:38 --> 00:38:41
If I have different boundary
conditions I expect
633
00:38:41 --> 00:38:43
to get different y's.
634
00:38:43 --> 00:38:55
In fact, what would it look like if
that was y'=0 at the left end?
635
00:38:55 --> 00:38:59
What would you expect the
eigenfunctions to look like?
636
00:38:59 --> 00:39:01
They'd be cosines.
637
00:39:01 --> 00:39:02
They'd be cosines.
638
00:39:02 --> 00:39:06
And then we would have to
adjust the omegas to make
639
00:39:06 --> 00:39:11
them come out right at
the right-hand end.
640
00:39:11 --> 00:39:19
So this y(0)=0, the fixed ones
gave us sines, the free ones
641
00:39:19 --> 00:39:27
give us cosines, the periodic
ones if I had y(0)=y(1) so that
642
00:39:27 --> 00:39:32
I'm just circling around, then
I would expect these e^(ikx)'s
643
00:39:32 --> 00:39:38
-- the textbook will, so I'm in
the eigenvalue section of
644
00:39:38 --> 00:39:42
course, and the textbook
lists the answers for
645
00:39:42 --> 00:39:44
the other possibilities.
646
00:39:44 --> 00:39:46
Let's go with this one.
647
00:39:46 --> 00:39:53
Because this is the one
that corresponds to K.
648
00:39:53 --> 00:40:04
We're now ready for
the final moment.
649
00:40:04 --> 00:40:14
And it is can we guess the
eigenvectors for the matrix?
650
00:40:14 --> 00:40:18
Now I'm going back to
the matrix question.
651
00:40:18 --> 00:40:22
And as I say, normally
the answer's no.
652
00:40:22 --> 00:40:24
Who could guess?
653
00:40:24 --> 00:40:25
But you can always hope.
654
00:40:25 --> 00:40:28
You can try.
655
00:40:28 --> 00:40:33
So what will I try?
656
00:40:33 --> 00:40:36
Here, let me draw
sin(x), sin(pi*x).
657
00:40:38 --> 00:40:42
And let me remember that
my matrix K was a finite
658
00:40:42 --> 00:40:45
difference matrix.
659
00:40:45 --> 00:40:47
Let's make it four by four.
660
00:40:47 --> 00:40:57
One, two, three,
four let's say.
661
00:40:57 --> 00:41:01
What would be the best I could
hope for, for the eigenvector,
662
00:41:01 --> 00:41:02
the first eigenvector?
663
00:41:02 --> 00:41:07
I'm hoping that the first
eigenvector of K is very, very
664
00:41:07 --> 00:41:12
like the first eigenfunction in
the differential equation,
665
00:41:12 --> 00:41:16
which was this sin(pi*x),
so that's sin(pi*x).
666
00:41:17 --> 00:41:20
Well, what do you hope for?
667
00:41:20 --> 00:41:26
What shall I hope for as
the components of y_1,
668
00:41:26 --> 00:41:29
the first eigenvector?
669
00:41:29 --> 00:41:32
It's almost too good.
670
00:41:32 --> 00:41:36
And as far as I know, basically
it only happens with these
671
00:41:36 --> 00:41:38
sines and cosines example.
672
00:41:38 --> 00:41:42
These heights, I just picked
these, what I might call
673
00:41:42 --> 00:41:46
samples, of the thing.
674
00:41:46 --> 00:41:51
Those four values and of course
zero at that end and zero at
675
00:41:51 --> 00:42:00
that end, so because K, the
matrix K is building in the
676
00:42:00 --> 00:42:04
fixed-fixed, these four
heights, these four numbers,
677
00:42:04 --> 00:42:09
those four sines-- In other
words, what I hope is that for
678
00:42:09 --> 00:42:17
Ky=lambda*y, I hope that y_1,
the first eigenvector, it'll be
679
00:42:17 --> 00:42:20
sin(pi*x), but now what is x?
680
00:42:20 --> 00:42:24
So this is x here
from zero to one.
681
00:42:24 --> 00:42:27
So what's x there,
there, there and there?
682
00:42:27 --> 00:42:31
Instead of sin(pi*x), the
whole curve, I'm picking
683
00:42:31 --> 00:42:33
out those four samples.
684
00:42:33 --> 00:42:39
So it'll be the sine of,
what'll it be here?
685
00:42:39 --> 00:42:42
Pi.
686
00:42:42 --> 00:42:45
Pi divided by n+1.
687
00:42:46 --> 00:42:49
Which in my picture
would be, we'll make it
688
00:42:49 --> 00:42:51
completely explicit.
689
00:42:51 --> 00:42:53
Five.
690
00:42:53 --> 00:42:57
It's 1/5 of the way along.
691
00:42:57 --> 00:43:01
Maybe I should make these y's
into column vectors since we're
692
00:43:01 --> 00:43:03
thinking of them as columns.
693
00:43:03 --> 00:43:04
So here's y_1.
694
00:43:05 --> 00:43:06
sin(pi/5), sin(2pi/5),
sin(3pi/5), sin(4pi/5).
695
00:43:06 --> 00:43:16
696
00:43:16 --> 00:43:21
That's the first eigenvector.
697
00:43:21 --> 00:43:22
And it works.
698
00:43:22 --> 00:43:27
And you could guess
now the general one.
699
00:43:27 --> 00:43:36
Well when I say it works, I
haven't checked that it works.
700
00:43:36 --> 00:43:37
I better do that.
701
00:43:37 --> 00:43:43
But the essential point
is that it works.
702
00:43:43 --> 00:43:46
I may not even do it today.
703
00:43:46 --> 00:43:50
So, in fact, tell me the
second eigenvector.
704
00:43:50 --> 00:43:55
Or tell me the second
eigenfunction over here.
705
00:43:55 --> 00:43:57
What's the second
eigenfunction?
706
00:43:57 --> 00:44:02
Let me draw it with
this green chalk.
707
00:44:02 --> 00:44:04
So I'm going to draw y_2.
708
00:44:06 --> 00:44:08
Now what does y_2 look
like? sin(2pi*x).
709
00:44:09 --> 00:44:13
What's the new picture here?
710
00:44:13 --> 00:44:16
It goes up.
711
00:44:16 --> 00:44:18
What does it do?
712
00:44:18 --> 00:44:24
By here it's got
back, oh no, damn.
713
00:44:24 --> 00:44:27
I would've been better
with three points in the
714
00:44:27 --> 00:44:30
middle, but it's correct.
715
00:44:30 --> 00:44:32
It comes down here.
716
00:44:32 --> 00:44:33
Right?
717
00:44:33 --> 00:44:34
That's sin(2pi*x).
718
00:44:34 --> 00:44:37
719
00:44:37 --> 00:44:46
That's halfway along.
720
00:44:46 --> 00:44:48
I'll finish this guy.
721
00:44:48 --> 00:44:50
This'll be sin(2pi/5),
sin(4pi/5).
722
00:44:50 --> 00:44:56
723
00:44:56 --> 00:44:58
See I'm sampling
this same thing.
724
00:44:58 --> 00:45:04
I'm sampling 2pi*x at those
same points. sin(6pi/5)
725
00:45:04 --> 00:45:05
and sin(8pi/5).
726
00:45:05 --> 00:45:12
727
00:45:12 --> 00:45:15
Maybe let's accept
this as correct.
728
00:45:15 --> 00:45:17
It really works.
729
00:45:17 --> 00:45:19
It's the next eigenvector.
730
00:45:19 --> 00:45:23
And then there's a third one
and then there's a fourth one.
731
00:45:23 --> 00:45:28
And how many are
there? n usually.
732
00:45:28 --> 00:45:30
And in my case, what is
n in the picture I've
733
00:45:30 --> 00:45:35
drawn? n here is four.
734
00:45:35 --> 00:45:39
One, two, three, four. n is
four in that picture and that
735
00:45:39 --> 00:45:41
means that I'm dividing by n+1.
736
00:45:41 --> 00:45:45
737
00:45:45 --> 00:45:47
That's really sin(pi*h).
738
00:45:49 --> 00:45:53
You remember I used
h as the step size.
739
00:45:53 --> 00:45:58
So h is 1/5, 1/(n+1), 1/5.
740
00:45:58 --> 00:46:03
So it's sin(pi*h), sin(2pi*h),
sin(3pi*h), sin(4pi*h).
741
00:46:03 --> 00:46:08
Here's y_2: sin(2pi*h), sin(4pi*h),
sin(6pi*h), sin(8pi*h).
742
00:46:08 --> 00:46:15
743
00:46:15 --> 00:46:18
So I have two things to do.
744
00:46:18 --> 00:46:21
One is to remember what
is the remarkable
745
00:46:21 --> 00:46:23
property of these y's.
746
00:46:23 --> 00:46:26
So there's a y that
we've guessed.
747
00:46:26 --> 00:46:28
Right now you're taking my
word for it that it is
748
00:46:28 --> 00:46:32
the eigenvector and
this is the next one.
749
00:46:32 --> 00:46:34
I copied them out of
those functions.
750
00:46:34 --> 00:46:38
And just remind me, what is
it that I'm claiming to be
751
00:46:38 --> 00:46:40
true about y_1 and y_2.
752
00:46:42 --> 00:46:47
They are orthogonal,
they are orthogonal.
753
00:46:47 --> 00:46:52
Well to check that I'd have
to do some trig stuff.
754
00:46:52 --> 00:46:56
But what I was going to do
was come over here and
755
00:46:56 --> 00:47:03
say this was a symmetric
differential equation.
756
00:47:03 --> 00:47:07
We found its eigenfunctions.
757
00:47:07 --> 00:47:11
What do you think's
up with those?
758
00:47:11 --> 00:47:13
Those are orthogonal too.
759
00:47:13 --> 00:47:20
So this would be a key fact in
any sort of advanced applied
760
00:47:20 --> 00:47:25
math is that the function sin(x)
is orthogonal to sin(2x).
761
00:47:28 --> 00:47:30
That function is
orthogonal to this one.
762
00:47:30 --> 00:47:33
And actually that's what
makes the whole world
763
00:47:33 --> 00:47:36
of Fourier series work.
764
00:47:36 --> 00:47:40
So that was really
a wonderful fact.
765
00:47:40 --> 00:47:42
That this is
orthogonal to this.
766
00:47:42 --> 00:47:47
Now you may, quite reasonably,
ask what do I mean by that?
767
00:47:47 --> 00:47:52
What does it mean for two
functions to be orthogonal?
768
00:47:52 --> 00:47:53
As long as we're getting
all these parallels,
769
00:47:53 --> 00:47:55
let's get that one too.
770
00:47:55 --> 00:48:02
I claim that this function, is
orthogonal to this function.
771
00:48:02 --> 00:48:05
What does that mean?
772
00:48:05 --> 00:48:09
What should these functions
could I write dot or
773
00:48:09 --> 00:48:11
transpose or something?
774
00:48:11 --> 00:48:15
But now I'm doing
it for functions.
775
00:48:15 --> 00:48:21
I just want you to see
the complete analogy.
776
00:48:21 --> 00:48:26
So for vectors, what did I do?
777
00:48:26 --> 00:48:29
If I take a dot product I
multiply the first component
778
00:48:29 --> 00:48:32
times the first component,
second component times the
779
00:48:32 --> 00:48:33
second, so on, so on.
780
00:48:33 --> 00:48:36
Now what'll I do for functions?
781
00:48:36 --> 00:48:37
I multiply sin(pi*x) *
sin(2pi*x) at each x.
782
00:48:37 --> 00:48:42
783
00:48:42 --> 00:48:44
Of course I've got a
whole range of x's.
784
00:48:44 --> 00:48:46
And then what do I do?
785
00:48:46 --> 00:48:48
I integrate.
786
00:48:48 --> 00:48:49
I can't add.
787
00:48:49 --> 00:48:52
I integrate instead.
788
00:48:52 --> 00:48:58
So I integrate one function
sin(pi*x) against the other
789
00:48:58 --> 00:49:06
function, sin(2pi*x), dx, and I
integrate from zero to one and
790
00:49:06 --> 00:49:08
the answer comes out zero.
791
00:49:08 --> 00:49:09
The answer comes out zero.
792
00:49:09 --> 00:49:13
The sine functions
are orthogonal.
793
00:49:13 --> 00:49:15
The sines are
orthogonal functions.
794
00:49:15 --> 00:49:19
The sine vectors are
orthogonal vectors.
795
00:49:19 --> 00:49:26
I normalize to length one and
they go right into my Q.
796
00:49:26 --> 00:49:29
So if I multiply, if I did that
times that, that dot product
797
00:49:29 --> 00:49:31
would turn out to be zero.
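Both orthogonality statements can be checked in one short numerical sketch (Python with NumPy assumed; the midpoint rule stands in for the exact integral):

```python
import numpy as np

# Continuous orthogonality: the integral of sin(pi*x)*sin(2*pi*x)
# over [0,1] is zero.  Discrete orthogonality: the sampled sine
# vectors (four samples, as on the board) have dot product zero.
N = 100000
x = (np.arange(N) + 0.5) / N                 # midpoint rule on [0, 1]
integral = np.mean(np.sin(np.pi * x) * np.sin(2 * np.pi * x))
assert abs(integral) < 1e-8                  # continuous orthogonality

j = np.arange(1, 5)                          # sample points j/5
y1 = np.sin(j * np.pi / 5)
y2 = np.sin(j * 2 * np.pi / 5)
assert abs(y1 @ y2) < 1e-12                  # discrete orthogonality
```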
798
00:49:31 --> 00:49:37
If I had been a little less
ambitious and taken n to be two
799
00:49:37 --> 00:49:41
or three or something we would
have seen it completely.
800
00:49:41 --> 00:49:48
But maybe doing
it with four is okay.
801
00:49:48 --> 00:49:55
So great lecture
except for that.
802
00:49:55 --> 00:49:58
Didn't get there.
803
00:49:58 --> 00:50:03
So Wednesday's lecture is sort
of the bringing all these
804
00:50:03 --> 00:50:07
pieces together, positive
eigenvalues, positive
805
00:50:07 --> 00:50:09
pivots, positive definite.
806
00:50:09 --> 00:50:11
So come on Wednesday please.
807
00:50:11 --> 00:50:13
Come Wednesday.
808
00:50:13 --> 00:50:16
And Wednesday afternoon
I'll have the review
809
00:50:16 --> 00:50:18
session as usual.