1
00:00:00,530 --> 00:00:02,960
The following content is
provided under a Creative
2
00:00:02,960 --> 00:00:04,370
Commons license.
3
00:00:04,370 --> 00:00:07,410
Your support will help MIT
OpenCourseWare continue to
4
00:00:07,410 --> 00:00:11,060
offer high-quality educational
resources for free.
5
00:00:11,060 --> 00:00:13,960
To make a donation or view
additional materials from
6
00:00:13,960 --> 00:00:19,790
hundreds of MIT courses, visit
MIT OpenCourseWare at
7
00:00:19,790 --> 00:00:21,390
ocw.mit.edu.
8
00:00:21,390 --> 00:00:22,730
PROFESSOR: OK, so let's
get started.
9
00:00:26,335 --> 00:00:30,230
I want to talk mostly about
countable state Markov chains
10
00:00:30,230 --> 00:00:35,790
today, which is the new topic
we started on Wednesday.
11
00:00:35,790 --> 00:00:41,360
I want to talk just a little bit
about the strong law proof
12
00:00:41,360 --> 00:00:44,260
that was in the third
problem of the quiz.
13
00:00:44,260 --> 00:00:49,040
I'm not doing that because
none of you understood
14
00:00:49,040 --> 00:00:50,050
anything about it.
15
00:00:50,050 --> 00:00:53,660
I'm doing it because all of you
understood more about it
16
00:00:53,660 --> 00:00:55,680
than I thought you would.
17
00:00:55,680 --> 00:00:59,350
And in fact, I've always avoided
saying too much about
18
00:00:59,350 --> 00:01:04,010
this proof because I thought
everybody was tuning it out.
19
00:01:04,010 --> 00:01:06,710
And for the class here, it looks
like a lot of you have
20
00:01:06,710 --> 00:01:09,870
tried seriously to understand
these things.
21
00:01:09,870 --> 00:01:15,090
So I thought I would explain
that one part of the quiz in
22
00:01:15,090 --> 00:01:18,450
detail so that you'd see the
parts you're missing, and so
23
00:01:18,450 --> 00:01:22,110
that [INAUDIBLE] all these other
proofs that we have,
24
00:01:22,110 --> 00:01:24,730
talking about the strong law
and the strong law for
25
00:01:24,730 --> 00:01:27,250
renewals, and putting them
together and all of these
26
00:01:27,250 --> 00:01:30,710
things, all of them are
essentially the same.
27
00:01:30,710 --> 00:01:34,070
And it's just a matter
of figuring out
28
00:01:34,070 --> 00:01:35,580
how things fit together.
29
00:01:35,580 --> 00:01:43,090
So I wanted to talk about that
because it's clear that most
30
00:01:43,090 --> 00:01:46,560
of you understand enough about
it that it makes sense.
31
00:01:46,560 --> 00:01:50,420
The situation in the quiz, which
is very close to the
32
00:01:50,420 --> 00:01:56,190
usual queuing situation and
Little's theorem type things,
33
00:01:56,190 --> 00:01:58,690
there's a sequence
of Y sub i's.
34
00:01:58,690 --> 00:02:00,320
They're IID.
35
00:02:00,320 --> 00:02:06,120
These are the service times for
a G/G/infinity queue, and N of
36
00:02:06,120 --> 00:02:08,710
t, which is a renewal
process for the
37
00:02:08,710 --> 00:02:10,250
arrivals to the process.
38
00:02:10,250 --> 00:02:14,070
We have arrivals coming in
according to this renewal
39
00:02:14,070 --> 00:02:17,840
process, which means the
inter-arrival times, X
40
00:02:17,840 --> 00:02:20,680
sub i, were IID.
41
00:02:20,680 --> 00:02:24,450
And we want to put those
two things together.
42
00:02:24,450 --> 00:02:31,250
And we want to find out
what this limit is.
43
00:02:31,250 --> 00:02:34,190
If it's a limit, show
that it's a limit.
44
00:02:34,190 --> 00:02:38,490
And hopefully show that it's
a limit with probability 1.
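In symbols, the quantity being analyzed is presumably the accumulated service time per unit time; a sketch in the lecture's notation:

```latex
\lim_{t \to \infty} \frac{1}{t} \sum_{i=1}^{N(t,\omega)} Y_i(\omega)
\;=\; \frac{\bar{Y}}{\bar{X}}
\qquad \text{with probability 1.}
```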
45
00:02:38,490 --> 00:02:45,500
And I think a large number of
you basically realize that
46
00:02:45,500 --> 00:02:49,750
this argument consisted
of a bunch of steps.
47
00:02:49,750 --> 00:02:52,400
Some people with more
detail than others.
48
00:02:52,400 --> 00:02:57,850
But the first step, which we do
in all of these arguments,
49
00:02:57,850 --> 00:03:03,010
is to divide and multiply
by N of t of omega.
50
00:03:03,010 --> 00:03:06,010
So we're starting out, looking
at some particular sample
51
00:03:06,010 --> 00:03:11,670
path, and multiplying and
dividing by N
52
00:03:11,670 --> 00:03:13,390
of t of omega.
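Written out, the multiply-and-divide step is:

```latex
\frac{1}{t} \sum_{i=1}^{N(t,\omega)} Y_i(\omega)
\;=\; \frac{N(t,\omega)}{t} \,\cdot\, \frac{1}{N(t,\omega)} \sum_{i=1}^{N(t,\omega)} Y_i(\omega).
```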
53
00:03:13,390 --> 00:03:18,640
The next thing is to claim that
the limit of this times
54
00:03:18,640 --> 00:03:24,680
this is equal to the limit of
this times the limit of this.
55
00:03:24,680 --> 00:03:28,460
Almost no one recognized that as
a real problem, and that is
56
00:03:28,460 --> 00:03:30,840
a real problem.
57
00:03:30,840 --> 00:03:34,440
It's probably the least
obvious thing
58
00:03:34,440 --> 00:03:39,080
in this whole problem.
59
00:03:39,080 --> 00:03:41,540
I'm not saying you shouldn't
have done that, because I've
60
00:03:41,540 --> 00:03:43,220
been doing that in all
the proofs I've been
61
00:03:43,220 --> 00:03:45,500
giving you all along.
62
00:03:45,500 --> 00:03:48,500
It is sort of obvious
that that works.
63
00:03:48,500 --> 00:03:53,120
And when you're constructing a
proof, especially in a quiz
64
00:03:53,120 --> 00:03:57,440
when you don't have much time,
things which are almost
65
00:03:57,440 --> 00:04:00,300
obvious or which look obvious,
you should just go ahead and
66
00:04:00,300 --> 00:04:03,270
assume them and come back later
when you have time to
67
00:04:03,270 --> 00:04:05,860
see whether that really
makes sense.
68
00:04:05,860 --> 00:04:08,450
That's the way you
do research also.
69
00:04:08,450 --> 00:04:12,240
You don't do research by
painstakingly establishing
70
00:04:12,240 --> 00:04:15,740
every point in some
linear path.
71
00:04:15,740 --> 00:04:22,320
What you do is, as carelessly
as you can and with as much
72
00:04:22,320 --> 00:04:26,290
insight as you can, you jump
all the way to the end, you
73
00:04:26,290 --> 00:04:29,310
see where you're trying to go,
you see how to get there, and
74
00:04:29,310 --> 00:04:32,280
then you come back and you try
to figure out what each
75
00:04:32,280 --> 00:04:34,880
of the steps are.
76
00:04:34,880 --> 00:04:38,320
So this is certainly a very
reasonable way of solving this
77
00:04:38,320 --> 00:04:42,250
problem, because it looks like
this limit should be equal to
78
00:04:42,250 --> 00:04:44,340
this limit times this limit.
79
00:04:44,340 --> 00:04:50,520
The next step in the argument is
to claim that this sum, up
80
00:04:50,520 --> 00:04:54,850
to N of t of omega, over
N of t of omega, as
81
00:04:54,850 --> 00:04:56,220
t approaches infinity--
82
00:04:56,220 --> 00:04:59,290
the argument is that t
approaches infinity--
83
00:04:59,290 --> 00:05:04,500
this N of t of omega goes
through, one by one, a
84
00:05:04,500 --> 00:05:08,250
sequence, 1, 2, 3, 4,
5, and so forth.
85
00:05:08,250 --> 00:05:12,320
So this limit is equal
to that limit.
86
00:05:12,320 --> 00:05:15,100
I've never been able to figure
out whether that's obvious or
87
00:05:15,100 --> 00:05:17,200
not obvious.
88
00:05:17,200 --> 00:05:20,390
It is just on the borderline
between what's obvious and not
89
00:05:20,390 --> 00:05:23,430
obvious, so I'm going to
prove it to you.
90
00:05:23,430 --> 00:05:29,470
And then the next step is to see
that N of t of omega over
91
00:05:29,470 --> 00:05:35,260
t is equal to 1/X-bar
with probability 1.
92
00:05:35,260 --> 00:05:40,190
And this limit is equal to
Y-bar with probability 1.
93
00:05:40,190 --> 00:05:45,770
The first argument, this equal
to 1/X-bar is because of the
94
00:05:45,770 --> 00:05:47,850
strong law with renewals.
95
00:05:47,850 --> 00:05:51,010
And this one over here is
because of the strong law of
96
00:05:51,010 --> 00:05:52,000
large numbers.
97
00:05:52,000 --> 00:05:54,230
And most of you managed
to get this.
98
00:05:54,230 --> 00:05:57,200
And the whole argument assumes
that X-bar is less than
99
00:05:57,200 --> 00:06:00,730
infinity, and Y-bar is
less than infinity.
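Stated together, the two limits being invoked (assuming X-bar and Y-bar are finite) are:

```latex
\lim_{t \to \infty} \frac{N(t,\omega)}{t} = \frac{1}{\bar{X}}
\quad \text{w.p. } 1,
\qquad
\lim_{n \to \infty} \frac{1}{n} \sum_{i=1}^{n} Y_i(\omega) = \bar{Y}
\quad \text{w.p. } 1.
```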
100
00:06:00,730 --> 00:06:03,720
Now, how do you go back and
actually see that this
101
00:06:03,720 --> 00:06:04,900
actually makes sense?
102
00:06:04,900 --> 00:06:07,940
And that's what I
want to do next.
103
00:06:07,940 --> 00:06:13,720
And if you look at this, you
can't do it by starting here
104
00:06:13,720 --> 00:06:17,220
and working your way down to
there, because there's no way
105
00:06:17,220 --> 00:06:20,620
you're going to argue that this
limit is equal to this
106
00:06:20,620 --> 00:06:24,200
product of limits unless you know
something about this and
107
00:06:24,200 --> 00:06:25,960
you know something about this.
108
00:06:25,960 --> 00:06:28,340
So you really have to establish
that these things
109
00:06:28,340 --> 00:06:31,020
have limits first before
you can go back
110
00:06:31,020 --> 00:06:32,080
and establish this.
111
00:06:32,080 --> 00:06:35,890
In the same way, you have to
know something about this
112
00:06:35,890 --> 00:06:37,780
before you can establish this.
113
00:06:37,780 --> 00:06:40,840
So what you have to do, way
you've managed to have
114
00:06:40,840 --> 00:06:44,400
the insight to jump the whole
way through this thing, is to
115
00:06:44,400 --> 00:06:48,890
go back and argue each of the
points, but argue them in
116
00:06:48,890 --> 00:06:50,465
reverse order.
117
00:06:50,465 --> 00:06:53,990
And that's very often the way
you do research, and it's
118
00:06:53,990 --> 00:06:56,610
certainly the way
you do quizzes.
119
00:06:56,610 --> 00:07:02,750
So let's see where those
arguments were.
120
00:07:02,750 --> 00:07:07,670
Start out by letting A1 be the
set of omega for which this
121
00:07:07,670 --> 00:07:14,670
limit here, N of t of omega over
t, is equal to 1/X-bar.
122
00:07:14,670 --> 00:07:18,060
By the strong law for renewal
processes, the probability of
123
00:07:18,060 --> 00:07:21,000
A1 equals 1.
124
00:07:21,000 --> 00:07:24,200
This is stating this in a little
cleaner way, I think,
125
00:07:24,200 --> 00:07:28,000
than we stated the strong law
of renewals originally,
126
00:07:28,000 --> 00:07:32,120
because we started out by
jumbling together this
127
00:07:32,120 --> 00:07:34,660
statement and this statement.
128
00:07:34,660 --> 00:07:38,190
I think it's cleaner to say,
start out, there's a set of
129
00:07:38,190 --> 00:07:41,700
omega, A1, for which this
limit exists.
130
00:07:41,700 --> 00:07:45,540
And what the strong law says is
that that set of omega has
131
00:07:45,540 --> 00:07:47,000
probability 1.
132
00:07:47,000 --> 00:07:49,710
And now we have some terminology
for A1, what it
133
00:07:49,710 --> 00:07:51,210
actually means.
134
00:07:51,210 --> 00:07:53,830
It is the set of omega
for which this works.
135
00:07:53,830 --> 00:07:56,510
You never have it working
for all omega,
136
00:07:56,510 --> 00:07:58,800
only for some omega.
137
00:07:58,800 --> 00:08:03,730
Then the next step, let A2 be
the set of omega for which the
138
00:08:03,730 --> 00:08:09,400
limit as n goes to infinity of
1/n times the sum of Y sub i
139
00:08:09,400 --> 00:08:12,340
of omega, is equal to Y-bar.
140
00:08:12,340 --> 00:08:18,010
By the strong law of large
numbers, the probability of A2
141
00:08:18,010 --> 00:08:19,260
is equal to 1.
142
00:08:22,660 --> 00:08:26,180
So now we've established there's
an A1, there's an A2.
143
00:08:26,180 --> 00:08:28,390
Each of them have
probability 1.
144
00:08:28,390 --> 00:08:30,170
On A1, one limit exists.
145
00:08:30,170 --> 00:08:32,240
On A2, the other limit exists.
146
00:08:32,240 --> 00:08:34,960
And we have two sets, both
at probability 1.
147
00:08:34,960 --> 00:08:37,799
What's the probability of the
intersection of them?
148
00:08:37,799 --> 00:08:40,909
It has to be 1 also.
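That intersection step is the usual union-bound argument:

```latex
\Pr\{A_1 \cap A_2\} \;\ge\; 1 - \Pr\{A_1^{c}\} - \Pr\{A_2^{c}\} \;=\; 1 - 0 - 0 \;=\; 1.
```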
149
00:08:40,909 --> 00:08:45,570
So with that, you know that
equation three is equal to
150
00:08:45,570 --> 00:08:49,560
equation four for omega
in the sets, A1, A2.
151
00:08:49,560 --> 00:08:52,290
And also, you know that
the probability of A1,
152
00:08:52,290 --> 00:08:54,950
A2 is equal to 1.
153
00:08:54,950 --> 00:09:01,420
So we've established this part
of the argument down here.
154
00:09:01,420 --> 00:09:04,930
Now we want to go up and
establish this part of the
155
00:09:04,930 --> 00:09:10,540
argument, which as I said, I
can't convince myself that
156
00:09:10,540 --> 00:09:12,930
it's necessary or
not necessary.
157
00:09:12,930 --> 00:09:17,290
But since I can't convince
myself, I thought, in trying
158
00:09:17,290 --> 00:09:20,190
to make up solutions for the
quiz, I ought to actually
159
00:09:20,190 --> 00:09:21,770
write a proof of it.
160
00:09:21,770 --> 00:09:24,680
And I want to show you what the
proof is so that, if it's
161
00:09:24,680 --> 00:09:28,440
not obvious to you, you'll know
exactly how to do it.
162
00:09:28,440 --> 00:09:33,200
And if it is obvious, you can
maybe sort out exactly why
163
00:09:33,200 --> 00:09:34,170
it's obvious.
164
00:09:34,170 --> 00:09:36,900
So this is an epsilon delta
kind of argument.
165
00:09:36,900 --> 00:09:39,680
We assume that omega is in A2.
166
00:09:39,680 --> 00:09:41,590
That's the set for which
the strong law of
167
00:09:41,590 --> 00:09:43,700
large numbers holds.
168
00:09:43,700 --> 00:09:48,360
There exists some integer, m,
which is a function of both
169
00:09:48,360 --> 00:09:49,900
epsilon and omega.
170
00:09:49,900 --> 00:09:51,790
This is the funny thing
about all of
171
00:09:51,790 --> 00:09:53,820
these strong law arguments.
172
00:09:53,820 --> 00:09:56,290
In almost all of them,
you're dealing with
173
00:09:56,290 --> 00:09:58,170
individual sample paths.
174
00:09:58,170 --> 00:10:01,400
When you start saying something
exists as a limit,
175
00:10:01,400 --> 00:10:03,980
you're not saying that it exists
as a limit for the
176
00:10:03,980 --> 00:10:04,980
random variables.
177
00:10:04,980 --> 00:10:07,470
You're saying it exists
as a limit for a
178
00:10:07,470 --> 00:10:10,040
set of sample paths.
179
00:10:10,040 --> 00:10:14,540
And therefore, this epsilon here
that you're going to choose,
180
00:10:14,540 --> 00:10:25,090
you need some integer there,
such that this minus this is
181
00:10:25,090 --> 00:10:26,010
less than epsilon.
182
00:10:26,010 --> 00:10:28,020
I think I'm going to have
to give up on this.
183
00:10:28,020 --> 00:10:30,880
These things run out of
batteries too quickly.
184
00:10:30,880 --> 00:10:35,200
So we have that this difference
here must be less
185
00:10:35,200 --> 00:10:39,870
than epsilon if n is bigger than
that m of epsilon omega.
186
00:10:39,870 --> 00:10:41,670
That's simply what
a limit means.
187
00:10:41,670 --> 00:10:43,980
That's the definition
of a limit.
188
00:10:43,980 --> 00:10:47,440
The only way to define a limit
sensibly is to say, for all
189
00:10:47,440 --> 00:10:51,000
epsilon greater than 0, no
matter how small the epsilon
190
00:10:51,000 --> 00:10:54,500
is, you can always find an
n big enough that this
191
00:10:54,500 --> 00:10:58,270
difference here is less
than epsilon.
192
00:10:58,270 --> 00:11:04,210
Then if omega is also in A1, the
limit of N of t of omega
193
00:11:04,210 --> 00:11:05,790
has to be equal to infinity.
194
00:11:05,790 --> 00:11:08,640
If you want to, you can just
say, we proved in class that
195
00:11:08,640 --> 00:11:11,820
the limit of N of t of omega is
equal to infinity with
196
00:11:11,820 --> 00:11:15,000
probability 1, and introduce
another set, A3.
197
00:11:15,000 --> 00:11:16,450
That's if you want to do it
all with probability 1.
198
00:11:16,450 --> 00:11:18,620
But let's do it this way.
199
00:11:18,620 --> 00:11:21,950
And then there has to be a t,
which is also a function of
200
00:11:21,950 --> 00:11:24,910
epsilon and omega, such
that N of t and
201
00:11:24,910 --> 00:11:27,100
omega is greater than--
202
00:11:27,100 --> 00:11:29,680
that's an integer, by the way.
203
00:11:29,680 --> 00:11:33,060
That's greater than or equal to
m of epsilon of omega for
204
00:11:33,060 --> 00:11:35,680
all t, which is greater
than or equal to t
205
00:11:35,680 --> 00:11:37,660
of epsilon of omega.
206
00:11:37,660 --> 00:11:41,370
That says that this difference
here is less than epsilon.
207
00:11:41,370 --> 00:11:44,700
And that's true for all epsilon
greater than 0.
208
00:11:44,700 --> 00:11:50,650
And that says that, in fact,
this limit has to exist.
209
00:11:50,650 --> 00:11:55,300
This limit over here is equal
to Y-bar with probability 1.
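Collected in one place, the epsilon argument just described runs as follows (a sketch in the lecture's notation):

```latex
\begin{aligned}
\omega \in A_2: \quad & \Bigl|\tfrac{1}{n}\textstyle\sum_{i=1}^{n} Y_i(\omega) - \bar{Y}\Bigr| < \epsilon
  \quad \text{for all } n \ge m(\epsilon,\omega), \\
\omega \in A_1: \quad & N(t,\omega) \ge m(\epsilon,\omega)
  \quad \text{for all } t \ge T(\epsilon,\omega), \\
\text{so} \quad & \Bigl|\tfrac{1}{N(t,\omega)}\textstyle\sum_{i=1}^{N(t,\omega)} Y_i(\omega) - \bar{Y}\Bigr| < \epsilon
  \quad \text{for all } t \ge T(\epsilon,\omega).
\end{aligned}
```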
210
00:11:55,300 --> 00:12:03,770
So that's what we were trying to
prove here, that this limit
211
00:12:03,770 --> 00:12:05,040
is the same as this limit.
212
00:12:05,040 --> 00:12:07,490
So we found out what
this limit is.
213
00:12:07,490 --> 00:12:10,350
We found out that it exists with
probability 1, namely on
214
00:12:10,350 --> 00:12:11,760
the set A2.
215
00:12:11,760 --> 00:12:15,180
This is equal to this, not
necessarily on A2,
216
00:12:15,180 --> 00:12:17,070
but on A1 and A2.
217
00:12:17,070 --> 00:12:18,800
So we got into there.
218
00:12:18,800 --> 00:12:22,820
Now how do we get the fact that
this limit times this
219
00:12:22,820 --> 00:12:24,820
limit is equal to the limit
of these.
220
00:12:24,820 --> 00:12:27,400
Now we have a chance of
proceeding, because we've
221
00:12:27,400 --> 00:12:31,520
actually shown that this limit
exists on some set with
222
00:12:31,520 --> 00:12:34,730
probability 1, this limit
exists on some set with
223
00:12:34,730 --> 00:12:36,020
probability 1.
224
00:12:36,020 --> 00:12:41,090
So we can look at that set and
say, for omega in that set,
225
00:12:41,090 --> 00:12:43,560
this limit exists and
this limit exists.
226
00:12:43,560 --> 00:12:47,570
Those limits are nonzero and
they're non-infinite.
227
00:12:47,570 --> 00:12:51,010
The important thing is that
they're non-infinite.
228
00:12:51,010 --> 00:12:57,920
And we move on from there,
and we do that carefully.
229
00:12:57,920 --> 00:13:01,220
And again, I'm not suggesting
that I expect any of you to do
230
00:13:01,220 --> 00:13:02,190
this on the quiz.
231
00:13:02,190 --> 00:13:05,230
I would have been amazed
if you had.
232
00:13:05,230 --> 00:13:09,130
It took me quite a while to sort
it out, because all these
233
00:13:09,130 --> 00:13:10,380
things are tangled together.
234
00:13:14,310 --> 00:13:15,480
Where am I?
235
00:13:15,480 --> 00:13:17,420
I want to be in the
next slide.
236
00:13:17,420 --> 00:13:18,440
Ah, there we go.
237
00:13:18,440 --> 00:13:20,870
Finally, we can interchange the
limit of a product of two
238
00:13:20,870 --> 00:13:21,980
functions--
239
00:13:21,980 --> 00:13:23,970
say, f of t, g of t--
240
00:13:23,970 --> 00:13:25,460
with the product
of the limits.
241
00:13:25,460 --> 00:13:26,820
Can we do that?
242
00:13:26,820 --> 00:13:29,930
If the two functions each have
finite limits, as the
243
00:13:29,930 --> 00:13:34,310
functions of interest do for
omega in A1, A2, then the
244
00:13:34,310 --> 00:13:35,850
answer is yes.
245
00:13:35,850 --> 00:13:39,700
And if you look at any book
on analysis, I'm sure that
246
00:13:39,700 --> 00:13:44,060
theorem is somewhere in the
first couple of chapters.
247
00:13:44,060 --> 00:13:48,050
But anyway, if you're the kind
of person like I am who would
248
00:13:48,050 --> 00:13:51,230
rather sort something out for
yourself rather than look it
249
00:13:51,230 --> 00:13:55,220
up, there's a trick involved
in doing it.
250
00:13:55,220 --> 00:13:57,330
It's this equality right here.
251
00:13:57,330 --> 00:13:59,940
f of t times g of t minus ab.
252
00:13:59,940 --> 00:14:04,010
What you want to do is somehow
make that look like f of t
253
00:14:04,010 --> 00:14:06,490
minus a, which you have
some control over, and
254
00:14:06,490 --> 00:14:08,090
g of t minus b.
255
00:14:08,090 --> 00:14:13,340
So the identity is this is equal
to f of t minus a times
256
00:14:13,340 --> 00:14:15,000
g of t minus b--
257
00:14:15,000 --> 00:14:16,730
we have control over that--
258
00:14:16,730 --> 00:14:22,680
plus a times g of t minus b,
plus b times f of t minus a.
259
00:14:22,680 --> 00:14:25,860
And you multiply and add all
those things together and you
260
00:14:25,860 --> 00:14:28,710
see that that is just
an identity.
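The identity in question is:

```latex
f(t)\,g(t) - ab
= \bigl(f(t) - a\bigr)\bigl(g(t) - b\bigr) + a\bigl(g(t) - b\bigr) + b\bigl(f(t) - a\bigr).
```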
261
00:14:28,710 --> 00:14:33,940
And therefore the magnitude of
f of t times g of t minus ab
262
00:14:33,940 --> 00:14:36,280
is less than or equal to this.
263
00:14:36,280 --> 00:14:39,420
And then you go through all the
epsilon delta stuff again.
264
00:14:39,420 --> 00:14:43,416
For any epsilon greater than
0, you choose a t such that
265
00:14:43,416 --> 00:14:46,500
this is less than or equal
to epsilon for t
266
00:14:46,500 --> 00:14:48,560
greater than the t epsilon.
267
00:14:48,560 --> 00:14:51,880
This is less than or equal to
epsilon for t greater than or
268
00:14:51,880 --> 00:14:53,530
equal to t epsilon.
269
00:14:53,530 --> 00:14:55,900
And then this difference here
is less than or equal to
270
00:14:55,900 --> 00:14:58,400
epsilon squared plus this.
271
00:14:58,400 --> 00:15:03,050
And with a little extra fiddling
around, that shows
272
00:15:03,050 --> 00:15:08,710
you have that f of t, g of t
minus ab approaches 0 as t
273
00:15:08,710 --> 00:15:09,310
gets large.
274
00:15:09,310 --> 00:15:11,820
So that's the whole thing.
275
00:15:11,820 --> 00:15:15,290
Now, let me reemphasize
again, I did not
276
00:15:15,290 --> 00:15:17,650
expect you to do that.
277
00:15:17,650 --> 00:15:20,970
I did not expect you to know how
to do analysis arguments
278
00:15:20,970 --> 00:15:22,890
like that, because
analysis is not a
279
00:15:22,890 --> 00:15:26,000
prerequisite for the course.
280
00:15:26,000 --> 00:15:28,770
I do want to show you that the
kinds of things we've been
281
00:15:28,770 --> 00:15:31,520
doing are not, in fact,
impossible.
282
00:15:31,520 --> 00:15:34,320
If you trace them out from
beginning to end and put in
283
00:15:34,320 --> 00:15:38,180
every little detail in them.
284
00:15:38,180 --> 00:15:40,700
If you have to go through
these kinds of arguments
285
00:15:40,700 --> 00:15:44,220
again, you will in fact know
how to make it precise and
286
00:15:44,220 --> 00:15:48,000
know how to put all
those details in.
287
00:15:48,000 --> 00:15:52,700
Let's go back to countable
state Markov chains.
288
00:15:52,700 --> 00:15:56,510
As we've said, two states are
in the same class if they
289
00:15:56,510 --> 00:15:57,780
communicate.
290
00:15:57,780 --> 00:16:01,410
It's the same definition as
for finite state chains.
291
00:16:01,410 --> 00:16:05,110
They communicate if there's some
path by which you can get
292
00:16:05,110 --> 00:16:07,940
from i to j, and there's some
path from which you can get
293
00:16:07,940 --> 00:16:09,100
from j to i.
294
00:16:09,100 --> 00:16:14,850
You don't have to get there in
one step, but you can get
295
00:16:14,850 --> 00:16:16,710
there in some finite
number of steps.
296
00:16:16,710 --> 00:16:21,950
So that's the definition of
two states communicating.
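In symbols, with P sub ij super n denoting the n-step transition probability:

```latex
i \leftrightarrow j
\quad \Longleftrightarrow \quad
P_{ij}^{m} > 0 \ \text{for some } m \ge 1
\ \text{ and } \
P_{ji}^{l} > 0 \ \text{for some } l \ge 1.
```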
297
00:16:21,950 --> 00:16:26,020
The theorem that we sort of
proved last time is that all
298
00:16:26,020 --> 00:16:27,930
states in the same class are
299
00:16:27,930 --> 00:16:30,130
recurrent or all are transient.
300
00:16:30,130 --> 00:16:33,190
That's the same as the theorem
we have for finite state
301
00:16:33,190 --> 00:16:35,060
Markov chains.
302
00:16:35,060 --> 00:16:37,970
It's just a little hard
to establish here.
303
00:16:37,970 --> 00:16:44,010
The argument is that you assume
that j is recurrent.
304
00:16:44,010 --> 00:16:47,510
If j is recurrent,
then the sum has
305
00:16:47,510 --> 00:16:49,420
to be equal to infinity.
306
00:16:49,420 --> 00:16:50,940
How do you interpret
that sum there?
307
00:16:50,940 --> 00:16:53,780
What is it?
308
00:16:53,780 --> 00:17:03,050
P sub jj, super n, is the
probability that you will be
309
00:17:03,050 --> 00:17:06,540
in state j at time n given
that you're in
310
00:17:06,540 --> 00:17:08,510
state j at time 0.
311
00:17:08,510 --> 00:17:12,069
So what we're doing is we're
starting out in time 0.
312
00:17:12,069 --> 00:17:16,450
This quantity here is the
probability that we'll be in
313
00:17:16,450 --> 00:17:19,569
state j at time n.
314
00:17:19,569 --> 00:17:24,900
Since you either are or you're
not, this is also equal
315
00:17:24,900 --> 00:17:30,830
to the expected value of the
indicator of state j at time n, given
316
00:17:30,830 --> 00:17:32,770
state j at time 0.
317
00:17:32,770 --> 00:17:36,010
So when you add all these up,
you're adding expectations.
318
00:17:36,010 --> 00:17:39,960
So this quantity here is simply
the expected number of
319
00:17:39,960 --> 00:17:46,340
recurrences to state j from time
1 up to time infinity.
320
00:17:46,340 --> 00:17:49,410
And that number of recurrences
is equal to infinity.
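Summing the indicator expectations gives the expected number of returns (interchanging sum and expectation is fine here because the terms are nonnegative):

```latex
\sum_{n=1}^{\infty} P_{jj}^{n}
= \sum_{n=1}^{\infty} \mathrm{E}\bigl[\mathbb{1}\{X_n = j\} \mid X_0 = j\bigr]
= \mathrm{E}\Bigl[\textstyle\sum_{n=1}^{\infty} \mathbb{1}\{X_n = j\} \Bigm| X_0 = j\Bigr]
= \infty.
```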
321
00:17:49,410 --> 00:17:53,360
You remember we argued last time
that the probability of
322
00:17:53,360 --> 00:17:57,380
one recurrence had to be equal
to 1 if it was recurrent.
323
00:17:57,380 --> 00:18:01,400
If you got back to j once in
finite time, you're going to
324
00:18:01,400 --> 00:18:03,680
get back again in a
finite time again.
325
00:18:03,680 --> 00:18:05,860
You're going to get back
again in finite time.
326
00:18:05,860 --> 00:18:08,950
It might take a very, very long
time, but it's finite,
327
00:18:08,950 --> 00:18:12,290
and you have an infinite number
of returns as time goes
328
00:18:12,290 --> 00:18:14,470
to infinity.
329
00:18:14,470 --> 00:18:19,390
So that is the consequence
of j being recurrent.
330
00:18:19,390 --> 00:18:24,220
For any i such that j and i
communicate, there's some path
331
00:18:24,220 --> 00:18:28,000
of some length m such that the
probability of going from
332
00:18:28,000 --> 00:18:32,210
state i to state j in m steps
is greater than 0.
333
00:18:32,210 --> 00:18:35,070
That's the meaning
of communicate.
334
00:18:35,070 --> 00:18:39,394
And there's some l and
some P sub ji super l.
335
00:18:42,180 --> 00:18:44,260
Oh, for some l.
336
00:18:44,260 --> 00:18:48,810
And there's some way of getting
back from j to i.
337
00:18:48,810 --> 00:18:52,680
So what you're doing is going
from state i to state j, and
338
00:18:52,680 --> 00:18:54,780
there is some path
for doing that.
339
00:18:54,780 --> 00:18:58,440
You're wobbling around,
returning to state j,
340
00:18:58,440 --> 00:19:00,580
returning to state j,
maybe returning to
341
00:19:00,580 --> 00:19:02,270
state i along the way.
342
00:19:02,270 --> 00:19:03,550
That's part of it.
343
00:19:03,550 --> 00:19:07,960
And eventually there's
some path for going
344
00:19:07,960 --> 00:19:11,480
from j back to i again.
345
00:19:11,480 --> 00:19:16,360
So this sum here is greater
than or equal.
346
00:19:16,360 --> 00:19:19,745
And now all I'm doing is summing
up the paths which in
347
00:19:19,745 --> 00:19:24,010
m steps go from i to j, and
those paths which in the final
348
00:19:24,010 --> 00:19:26,280
l steps go from j back to i.
349
00:19:26,280 --> 00:19:29,390
And they do whatever they
want to in between.
350
00:19:29,390 --> 00:19:33,830
So I'm summing over the number
of times they are in between.
351
00:19:33,830 --> 00:19:38,750
And this sum here is
summing over P sub jj super k.
352
00:19:38,750 --> 00:19:43,680
And that sum is infinite,
so this sum is infinite.
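The chain of inequalities being described, with m the length of a path from i to j and l the length of a path back from j to i, is:

```latex
\sum_{k=1}^{\infty} P_{ii}^{\,m+k+l}
\;\ge\;
P_{ij}^{\,m} \Bigl( \sum_{k=1}^{\infty} P_{jj}^{\,k} \Bigr) P_{ji}^{\,l}
\;=\; \infty.
```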
353
00:19:43,680 --> 00:19:49,570
So that shows that if j is
recurrent, then i is recurrent
354
00:19:49,570 --> 00:19:52,760
also for any i in
the same class.
355
00:19:52,760 --> 00:19:57,870
And you can do the same thing
reversing i and j, obviously.
356
00:19:57,870 --> 00:20:00,580
And if that's true for all
classes that are very
357
00:20:00,580 --> 00:20:03,910
recurrent, all law classes that
are transient have to be
358
00:20:03,910 --> 00:20:06,460
in the same class also, because
a state is either
359
00:20:06,460 --> 00:20:07,710
transient or it's recurrent.
360
00:20:10,310 --> 00:20:12,990
If a state j is recurrent,
then the
361
00:20:12,990 --> 00:20:16,840
recurrence time, T sub jj.
362
00:20:16,840 --> 00:20:21,800
When you read this chapter or
read my notes, I apologize
363
00:20:21,800 --> 00:20:25,260
because there's a huge
confusion here.
364
00:20:25,260 --> 00:20:28,290
And the confusion comes from
the fact that there's an
365
00:20:28,290 --> 00:20:30,790
extraordinary amount
of notation here.
366
00:20:30,790 --> 00:20:33,290
We're dealing with all the
notation of finite-state
367
00:20:33,290 --> 00:20:34,460
Markov chains.
368
00:20:34,460 --> 00:20:38,350
We're dealing with all the
notation of renewal processes.
369
00:20:38,350 --> 00:20:41,380
And we're jumping back and forth
between theorems for one
370
00:20:41,380 --> 00:20:42,980
and theorems for the other.
371
00:20:42,980 --> 00:20:45,560
And then we're inventing
a lot of new notation.
372
00:20:45,560 --> 00:20:50,100
And I have to rewrite
that section.
373
00:20:50,100 --> 00:20:53,330
But anyway, the results are all
correct as far as I know.
374
00:20:58,710 --> 00:21:01,000
I mean, all of you can remember
notation much better
375
00:21:01,000 --> 00:21:01,860
than I can.
376
00:21:01,860 --> 00:21:04,450
So if I can remember this
notation, you can also.
377
00:21:04,450 --> 00:21:05,830
Let me put it that way.
378
00:21:05,830 --> 00:21:07,900
So I can't feel too
sorry for you.
379
00:21:07,900 --> 00:21:10,780
I want to rewrite it because I'm
feeling sorry for myself
380
00:21:10,780 --> 00:21:14,290
every year I go through
this and try to re-understand
381
00:21:14,290 --> 00:21:17,070
it again, and I find it
very hard to do it.
382
00:21:17,070 --> 00:21:20,190
So I'm going to rewrite
it and get rid of
383
00:21:20,190 --> 00:21:23,120
some of that notation.
384
00:21:23,120 --> 00:21:27,070
We've already seen that if you
have a chain like this, which
385
00:21:27,070 --> 00:21:30,230
is simply the Markov chain
corresponding to Bernoulli
386
00:21:30,230 --> 00:21:35,120
trials, if it's Bernoulli trials
with p equals 1/2, you
387
00:21:35,120 --> 00:21:37,530
move up a probability
1/2, you move down
388
00:21:37,530 --> 00:21:39,250
with probability 1/2.
389
00:21:39,250 --> 00:21:42,280
As we said, you eventually
disperse.
390
00:21:42,280 --> 00:21:44,960
And as you disperse, the
probability of being in any
391
00:21:44,960 --> 00:21:48,760
one of these states goes to 0.
392
00:21:48,760 --> 00:21:55,800
And what that means is that the
individual probabilities
393
00:21:55,800 --> 00:21:58,260
of the states are going to 0.
394
00:21:58,260 --> 00:22:02,340
You can also see, not so easily,
that you're eventually
395
00:22:02,340 --> 00:22:05,560
going to return to each state
with probability 1.
396
00:22:05,560 --> 00:22:08,650
And I'm sorry I didn't give
that definition first.
397
00:22:08,650 --> 00:22:10,260
We gave it last time.
398
00:22:10,260 --> 00:22:14,000
If the expected value of the
renewal time is less than
399
00:22:14,000 --> 00:22:16,780
infinity, then j is positive
recurrent.
400
00:22:19,380 --> 00:22:24,260
If T sub jj, the recurrence
time, is a random variable but
401
00:22:24,260 --> 00:22:28,460
it has infinite expectation,
then j is null recurrent.
402
00:22:28,460 --> 00:22:32,160
And finally, if none of those
things happen, j is transient.
403
00:22:32,160 --> 00:22:34,820
So that we went through
last time.
404
00:22:34,820 --> 00:22:40,880
And for p equals 1/2, in
both of these situations, the
405
00:22:40,880 --> 00:22:45,270
probability of being in any
state is going to 0.
406
00:22:45,270 --> 00:22:50,460
The expected time of returning
is going to infinity.
407
00:22:50,460 --> 00:22:55,530
But with probability 1, you
will return eventually.
408
00:22:55,530 --> 00:22:59,880
So in both of these cases, these
are both examples of null
409
00:22:59,880 --> 00:23:01,130
recurrence.
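As a quick numerical illustration of null recurrence, here is a simulation sketch (not from the lecture; the sample sizes, horizons, and helper names are choices made for this example): a symmetric random walk returns to its starting state with probability 1, yet the average return time keeps growing as you watch longer.

```python
import random

def first_return_time(horizon, rng):
    """Step at which a symmetric random walk first returns to 0,
    or None if it has not returned within `horizon` steps."""
    pos = 0
    for t in range(1, horizon + 1):
        pos += rng.choice((-1, 1))
        if pos == 0:
            return t
    return None

def return_stats(n_walks, horizon, seed=0):
    """Fraction of walks that return within `horizon`, and the mean
    first-return time among those that do."""
    rng = random.Random(seed)
    times = [first_return_time(horizon, rng) for _ in range(n_walks)]
    returned = [t for t in times if t is not None]
    return len(returned) / n_walks, sum(returned) / len(returned)

# Null recurrence in miniature: the return fraction creeps toward 1,
# but the conditional mean return time grows without bound.
for horizon in (100, 1000, 10000):
    frac, mean_time = return_stats(2000, horizon)
    print(horizon, round(frac, 3), round(mean_time, 1))
```

Contrast with a positive-recurrent chain, where the mean return time would settle down to a finite limit instead of growing with the horizon.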
410
00:23:03,850 --> 00:23:11,220
Let's say more about positive
recurrence and null recurrence.
411
00:23:11,220 --> 00:23:16,720
Suppose, first, that i and j
are both recurrent and they
412
00:23:16,720 --> 00:23:18,400
both communicate with
each other.
413
00:23:18,400 --> 00:23:21,660
In other words, there's a path
from i to j, there's a path
414
00:23:21,660 --> 00:23:24,260
from j to i.
415
00:23:24,260 --> 00:23:28,310
And I want to look at
the renewal process
416
00:23:28,310 --> 00:23:30,620
of returns to j.
417
00:23:30,620 --> 00:23:34,440
You've sorted out by now, I
think, that recurrence means
418
00:23:34,440 --> 00:23:37,070
exactly what you
think it means.
419
00:23:37,070 --> 00:23:42,110
A recurrence means, starting
from a state j, there's a
420
00:23:42,110 --> 00:23:46,570
recurrence to j if eventually
you come back to j.
421
00:23:46,570 --> 00:23:49,760
And this random variable, the
recurrence random variable,
422
00:23:49,760 --> 00:23:53,290
is the amount of time it takes
you to get back to j once
423
00:23:53,290 --> 00:23:54,150
you've been in j.
424
00:23:54,150 --> 00:23:57,730
That's a random variable.
425
00:23:57,730 --> 00:24:03,070
So let's look at the renewal
process, starting in j, of
426
00:24:03,070 --> 00:24:05,430
returning to j eventually.
427
00:24:05,430 --> 00:24:07,990
This is one of the
things that makes
428
00:24:07,990 --> 00:24:09,910
this whole study awkward.
429
00:24:09,910 --> 00:24:15,020
We have renewal processes when
we start at j and we bob back
430
00:24:15,020 --> 00:24:17,480
to j at various periods
of time.
431
00:24:17,480 --> 00:24:21,390
If we start in i and we're
interested in returns to j,
432
00:24:21,390 --> 00:24:25,590
then we have something called
a delayed renewal process.
433
00:24:25,590 --> 00:24:28,710
All the theorems about
renewals apply there.
434
00:24:28,710 --> 00:24:31,235
It's a little harder to
see what's going on.
435
00:24:31,235 --> 00:24:33,890
It's in the end of
chapter four.
436
00:24:33,890 --> 00:24:36,970
You should have read it,
at least quickly.
437
00:24:36,970 --> 00:24:40,870
But we're going to avoid those
theorems and instead go
438
00:24:40,870 --> 00:24:44,080
directly using the theorems
of renewal processes.
439
00:24:44,080 --> 00:24:48,180
But there's still places where
the transitions are awkward.
440
00:24:48,180 --> 00:24:50,850
So I can warn you about that.
441
00:24:50,850 --> 00:24:56,690
But the renewal reward theorem,
if I look at this
442
00:24:56,690 --> 00:25:00,690
renewal process, I get a
renewal every time I
443
00:25:00,690 --> 00:25:03,160
return to state j.
444
00:25:03,160 --> 00:25:07,480
But in that renewal process of
returns to state j, what I'm
445
00:25:07,480 --> 00:25:11,980
really interested in is returns
to state i, because
446
00:25:11,980 --> 00:25:15,590
what I'm trying to do here is
relate how often do you go to
447
00:25:15,590 --> 00:25:19,030
state i with how often
do you go to state j?
448
00:25:19,030 --> 00:25:21,560
So we have a little bit of a
symmetry in it, because we're
449
00:25:21,560 --> 00:25:23,880
starting in state j,
because that gives
450
00:25:23,880 --> 00:25:25,320
us a renewal process.
451
00:25:25,320 --> 00:25:28,780
But now we have this renewal
reward process, where we give
452
00:25:28,780 --> 00:25:34,260
ourselves a reward of 1 every
time we hit state i.
453
00:25:34,260 --> 00:25:38,200
And we have a renewal every
time we hit state j.
454
00:25:38,200 --> 00:25:40,250
So how does that renewal
process work?
455
00:25:40,250 --> 00:25:43,810
Well, it's a renewal process
just like every other one
456
00:25:43,810 --> 00:25:45,210
we've studied.
457
00:25:45,210 --> 00:25:50,200
It has this peculiar feature
here: it is a discrete-time
458
00:25:50,200 --> 00:25:52,000
renewal process.
459
00:25:52,000 --> 00:25:55,930
And with discrete time renewal
processes, as we've seen, you
460
00:25:55,930 --> 00:26:00,070
can save yourself a lot of
aggravation by only looking at
461
00:26:00,070 --> 00:26:01,230
these discrete times.
462
00:26:01,230 --> 00:26:04,690
Namely, you only look
at integer times.
463
00:26:04,690 --> 00:26:07,940
And now when you only look
at integer times--
464
00:26:07,940 --> 00:26:10,840
well, whether you look at
integer times or not--
465
00:26:10,840 --> 00:26:14,340
this is the fundamental theorem
of renewal rewards.
466
00:26:14,340 --> 00:26:18,100
If you look at the limit as t
goes to infinity, there's the
467
00:26:18,100 --> 00:26:20,420
integral of the rewards
you pick up.
468
00:26:20,420 --> 00:26:23,810
For this discrete case, this
is just a summation of the
469
00:26:23,810 --> 00:26:25,550
rewards that you get.
470
00:26:25,550 --> 00:26:28,950
This summation here by the
theorem is equal to the
471
00:26:28,950 --> 00:26:33,140
expected number of rewards
within one renewal period.
472
00:26:33,140 --> 00:26:37,570
Namely, this is the expected
number of recurrences of state
473
00:26:37,570 --> 00:26:40,100
i per state j.
474
00:26:40,100 --> 00:26:46,090
So in between each occurrence
of state j, what's the
475
00:26:46,090 --> 00:26:48,520
expected number of
i's that I hit?
476
00:26:48,520 --> 00:26:50,330
And that's the number
that it is.
477
00:26:50,330 --> 00:26:53,510
We don't know what that number
is, but we could calculate it
478
00:26:53,510 --> 00:26:54,540
if we wanted to.
479
00:26:54,540 --> 00:26:57,490
It's not a limit of anything.
480
00:26:57,490 --> 00:27:01,580
Well, it's a sort of a limit,
but not very much of a limit
481
00:27:01,580 --> 00:27:03,110
that's well defined.
482
00:27:03,110 --> 00:27:06,150
And the theorem says that this
integral is equal to that
483
00:27:06,150 --> 00:27:10,040
expected value divided by
the expected recurrence
484
00:27:10,040 --> 00:27:13,290
time of T sub jj.
485
00:27:13,290 --> 00:27:18,160
Now, we argue that this is,
in fact, the number of
486
00:27:18,160 --> 00:27:20,210
occurrences of state i.
487
00:27:20,210 --> 00:27:21,980
So it's in the limit.
488
00:27:21,980 --> 00:27:28,620
It's 1 over the expected
value of the recurrence
489
00:27:28,620 --> 00:27:30,580
time to state i.
490
00:27:30,580 --> 00:27:35,180
So what this says is the 1 over
the recurrence time to
491
00:27:35,180 --> 00:27:39,320
state i is equal to the expected
number of recurrences
492
00:27:39,320 --> 00:27:43,410
to state i per state j divided
by the expected
493
00:27:43,410 --> 00:27:45,170
recurrence time of state j.
494
00:27:45,170 --> 00:27:49,070
If you think about that for a
minute, it's something which
495
00:27:49,070 --> 00:27:51,400
is intuitively obvious.
496
00:27:51,400 --> 00:27:55,060
I mean, you look at these long
sequences of things.
497
00:27:55,060 --> 00:27:57,780
You keep hitting j's every
once in a while.
498
00:27:57,780 --> 00:28:00,760
And then what you do,
is you count all of
499
00:28:00,760 --> 00:28:02,620
the i's that occur.
500
00:28:02,620 --> 00:28:05,620
So now looking at it this way,
you're going to count the
501
00:28:05,620 --> 00:28:09,340
number of i's that occur
in between each j.
502
00:28:09,340 --> 00:28:11,450
You can't have a simultaneous
i and j.
503
00:28:11,450 --> 00:28:12,960
The state is i or j.
504
00:28:16,010 --> 00:28:19,250
So for each recurrence period,
you count the number of i's
505
00:28:19,250 --> 00:28:20,250
that occur.
506
00:28:20,250 --> 00:28:24,640
And what this is then saying,
is the expected time between
507
00:28:24,640 --> 00:28:30,920
i's is equal to the expected
time between j's divided by--
508
00:28:30,920 --> 00:28:34,200
if I turn this equation upside
down, the expected time
509
00:28:34,200 --> 00:28:38,090
between i's is equal to the
expected time between j's
510
00:28:38,090 --> 00:28:43,170
divided by the expected
number of i's per j.
511
00:28:43,170 --> 00:28:45,610
What else would you expect?
512
00:28:45,610 --> 00:28:48,390
It has to be that way, right?
513
00:28:48,390 --> 00:28:50,890
But this says that it
indeed, is that way.
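This renewal-reward relation — the expected time between i's equals the expected time between j's divided by the expected number of i's per j-cycle — can be sanity-checked by simulation. A sketch using a small illustrative 3-state transition matrix of my own choosing (not from the lecture); its stationary probabilities work out to pi = (52/157, 57/157, 48/157), so E[T_00] = 157/52 ≈ 3.02 and E[T_11] = 157/57 ≈ 2.75.

```python
import random

# Illustrative irreducible 3-state chain (rows are transition probabilities).
P = [[0.1, 0.6, 0.3],
     [0.4, 0.2, 0.4],
     [0.5, 0.3, 0.2]]
rng = random.Random(7)

def step(s):
    """Sample the next state from row s of P."""
    u, acc = rng.random(), 0.0
    for nxt, pr in enumerate(P[s]):
        acc += pr
        if u < acc:
            return nxt
    return len(P[s]) - 1

i, j = 0, 1
state, n_steps = j, 200_000
j_returns = i_visits = 0
for _ in range(n_steps):
    state = step(state)
    if state == j:
        j_returns += 1        # a renewal: back in state j
    elif state == i:
        i_visits += 1         # a unit reward: a visit to i

mean_Tjj = n_steps / j_returns        # estimates E[T_jj] = 157/57
i_per_cycle = i_visits / j_returns    # estimates E[# of i's per j-cycle]
mean_Tii = mean_Tjj / i_per_cycle     # renewal-reward: estimates E[T_ii]
```

Both estimates land within about one percent of the exact values, which is the "it has to be that way" intuition made concrete.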
514
00:28:50,890 --> 00:28:54,100
Mathematics is sometimes
confusing with countable-state
515
00:28:54,100 --> 00:28:58,050
chains as we've seen.
516
00:28:58,050 --> 00:29:04,890
OK, so the theorem then says for
i and j recurrent, either
517
00:29:04,890 --> 00:29:09,240
both are positive-recurrent or
both are null-recurrent.
518
00:29:09,240 --> 00:29:11,610
So this is adding to the
theorem we had earlier.
519
00:29:11,610 --> 00:29:18,330
The theorem we had earlier says
that all states within a
520
00:29:18,330 --> 00:29:22,270
class are either recurrent
or they're transient.
521
00:29:22,270 --> 00:29:25,860
This now divides the ones that
are recurrent into two
522
00:29:25,860 --> 00:29:29,020
subsets, those that are
null-recurrent and those that
523
00:29:29,020 --> 00:29:30,630
are positive-recurrent.
524
00:29:30,630 --> 00:29:34,940
It says that for states within
a class, either all of them
525
00:29:34,940 --> 00:29:39,130
are recurrent or all
of them are--
526
00:29:39,130 --> 00:29:41,530
all of them are
positive-recurrent or all of
527
00:29:41,530 --> 00:29:45,290
them are null-recurrent.
528
00:29:45,290 --> 00:29:49,610
And this theorem shows it
because this theorem says
529
00:29:49,610 --> 00:29:53,980
there has to be an expected
number of occurrences of state
530
00:29:53,980 --> 00:29:56,790
i between each occurrence
of state j.
531
00:29:56,790 --> 00:29:57,810
Why is that?
532
00:29:57,810 --> 00:30:01,340
Because there has to be
a path from i to j.
533
00:30:01,340 --> 00:30:03,710
And there has to be a path that
doesn't go through i.
534
00:30:03,710 --> 00:30:07,890
Because if you have a path from
i that goes back to i and
535
00:30:07,890 --> 00:30:12,070
then off to j, there's also
this path from i to j.
536
00:30:12,070 --> 00:30:14,550
So there's a path from i to j.
537
00:30:14,550 --> 00:30:18,450
There's a path from j to i that
does not go through i.
538
00:30:18,450 --> 00:30:22,690
That has positive probability
because paths are only defined
539
00:30:22,690 --> 00:30:26,300
over transitions with positive
probability.
540
00:30:26,300 --> 00:30:30,990
So this quantity is always
positive if you're talking
541
00:30:30,990 --> 00:30:33,520
about two states in
the same class.
542
00:30:33,520 --> 00:30:39,050
So what this relationship says,
along with the fact that
543
00:30:39,050 --> 00:30:42,430
it's a very nice and convenient
relationship--
544
00:30:42,430 --> 00:30:44,590
I almost put it in the
quiz for finite
545
00:30:44,590 --> 00:30:46,570
state Markov chains.
546
00:30:46,570 --> 00:30:49,860
And you can be happy I didn't,
because proving it takes a
547
00:30:49,860 --> 00:30:56,100
little more agility than
what one might
548
00:30:56,100 --> 00:30:57,350
expect at this point.
549
00:31:00,500 --> 00:31:03,660
The theorem then says that if i
and j are recurrent, either
550
00:31:03,660 --> 00:31:07,290
both are positive-recurrent or
both are null-recurrent, what
551
00:31:07,290 --> 00:31:11,040
the overall theorem then says
is that for every class of
552
00:31:11,040 --> 00:31:15,270
states, either all of them are
transient, all of them are
553
00:31:15,270 --> 00:31:18,540
null-recurrent, or all of them
are positive-recurrent.
554
00:31:18,540 --> 00:31:21,550
And that's sort of a convenient
relationship.
555
00:31:21,550 --> 00:31:25,790
You can't have some states
that you never get to.
556
00:31:25,790 --> 00:31:29,810
Or you only get to with an
infinite recurrence time in a
557
00:31:29,810 --> 00:31:33,500
class and others that you keep
coming back to all the time.
558
00:31:33,500 --> 00:31:37,100
If there's a path from one to
the other, then they have to
559
00:31:37,100 --> 00:31:38,960
work the same way.
560
00:31:38,960 --> 00:31:41,330
This sort of makes it
obvious why that is.
561
00:31:47,410 --> 00:31:51,264
And this is too sensitive.
562
00:31:51,264 --> 00:31:52,590
OK.
563
00:31:52,590 --> 00:31:57,010
OK, so now we want to look
at steady state for
564
00:31:57,010 --> 00:31:58,490
a positive-recurrent chain.
565
00:31:58,490 --> 00:32:00,780
Do you remember that when we
looked at finite-state
566
00:32:00,780 --> 00:32:04,910
Markov chains, we did all this
classification stuff, and then
567
00:32:04,910 --> 00:32:08,440
we went into all this
matrix stuff?
568
00:32:08,440 --> 00:32:12,370
And the outcome of the matrix
stuff, the most important
569
00:32:12,370 --> 00:32:15,980
things, were that there
is a steady state.
570
00:32:15,980 --> 00:32:21,130
There's always a set of
probabilities such that if you
571
00:32:21,130 --> 00:32:24,280
start the chain in those
probabilities, the chain stays
572
00:32:24,280 --> 00:32:25,910
in those probabilities.
573
00:32:25,910 --> 00:32:33,590
There's always a set of pi sub
i's, which are probabilities.
574
00:32:33,590 --> 00:32:36,020
They all sum to 1.
575
00:32:36,020 --> 00:32:39,020
They're all non-negative.
576
00:32:39,020 --> 00:32:42,390
And each of them satisfy the
relationship, the probability
577
00:32:42,390 --> 00:32:47,220
that you're in state j at time
t is equal to the probability
578
00:32:47,220 --> 00:32:50,470
that you're in state i at
time t minus 1 times the
579
00:32:50,470 --> 00:32:53,160
probability of going
from state i to j.
580
00:32:53,160 --> 00:32:57,780
This is completely familiar
from finite-state chains.
581
00:32:57,780 --> 00:33:00,060
And this is exactly
the same for
582
00:33:00,060 --> 00:33:02,090
countable-state Markov chains.
583
00:33:02,090 --> 00:33:07,140
The only question is, it's now
not at all sure that that
584
00:33:07,140 --> 00:33:09,820
equation has a solution
anymore.
585
00:33:09,820 --> 00:33:12,490
And unfortunately, you can't
use matrix theory to prove
586
00:33:12,490 --> 00:33:14,200
that it has a solution.
587
00:33:14,200 --> 00:33:17,830
So we have to find some
other way of doing it.
588
00:33:17,830 --> 00:33:21,750
So we look in our toolbox,
which we developed
589
00:33:21,750 --> 00:33:23,230
throughout the term.
590
00:33:23,230 --> 00:33:25,450
And there's only one obvious
thing to try, and
591
00:33:25,450 --> 00:33:26,970
it's renewal theory.
592
00:33:26,970 --> 00:33:30,860
So we use renewal theory.
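For one concrete countable-state chain, the steady-state equations can be written down and solved explicitly. A sketch, assuming a birth-death chain on {0, 1, 2, ...} that moves up with probability p from every state and down with probability q = 1 - p, with a self-loop at 0 (my choice of boundary behavior, not necessarily the lecture's). For p < 1/2 the geometric distribution pi_i = (1 - rho) * rho**i with rho = p/q solves pi_j = sum_i pi_i P_ij:

```python
p = 0.3
q = 1.0 - p
rho = p / q                   # < 1 here, so the chain is positive-recurrent
pi = [(1 - rho) * rho**i for i in range(200)]   # candidate steady state

def P(i, j):
    """Transition probability of the birth-death chain with a self-loop at 0."""
    if i == 0:
        return 1 - p if j == 0 else (p if j == 1 else 0.0)
    if j == i + 1:
        return p
    if j == i - 1:
        return q
    return 0.0

# Verify pi_j = sum_i pi_i * P(i, j) for the first few states.
residual = max(abs(pi[j] - sum(pi[i] * P(i, j) for i in range(len(pi))))
               for j in range(5))
total = sum(pi)               # should be 1 up to the truncated geometric tail
```

For p >= 1/2 no such normalizable solution exists, which is exactly the "the equations may have no solution" situation the lecture warns about.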
593
00:33:30,860 --> 00:33:33,750
We then want to have one other
definition, which you'll see
594
00:33:33,750 --> 00:33:37,330
throughout the rest of the term
and every time you start
595
00:33:37,330 --> 00:33:39,460
reading about Markov chains.
596
00:33:39,460 --> 00:33:43,160
When you read about Markov
chains in queuing kinds of
597
00:33:43,160 --> 00:33:46,900
situations, which are the kinds
of things that occur all
598
00:33:46,900 --> 00:33:50,490
over the place, almost all of
those Markov chains are
599
00:33:50,490 --> 00:33:53,430
countable-state Markov chains.
600
00:33:53,430 --> 00:33:57,860
And therefore, you need a
convenient word to talk about
601
00:33:57,860 --> 00:34:02,870
a class of states where all of
the states in that class
602
00:34:02,870 --> 00:34:04,540
communicate with each other.
603
00:34:04,540 --> 00:34:07,870
And irreducible is the
definition that we use.
604
00:34:07,870 --> 00:34:12,010
An irreducible Markov chain is
a Markov chain in which all
605
00:34:12,010 --> 00:34:15,500
pairs of states communicate
with each other.
606
00:34:15,500 --> 00:34:19,440
And before, when we were talking
about finite-state
607
00:34:19,440 --> 00:34:24,030
Markov chains, if all states
communicated with each other,
608
00:34:24,030 --> 00:34:25,170
then they were recurrent.
609
00:34:25,170 --> 00:34:28,719
You had a recurrent Markov
chain, end of story.
610
00:34:28,719 --> 00:34:32,690
Now we've seen that you can have
a Markov chain where all
611
00:34:32,690 --> 00:34:35,130
the states communicate
with each other.
612
00:34:35,130 --> 00:34:38,980
We just had these
two examples--
613
00:34:38,980 --> 00:34:41,830
these two examples here
where they all
614
00:34:41,830 --> 00:34:43,449
communicate with each other.
615
00:34:43,449 --> 00:34:47,980
But depending on what p and q
are, they're either
616
00:34:47,980 --> 00:34:52,440
transient, or they're
positive-recurrent, or they're
617
00:34:52,440 --> 00:34:53,070
null-recurrent.
618
00:34:53,070 --> 00:34:57,150
The first one can't even be
positive-recurrent, but it can
619
00:34:57,150 --> 00:34:58,660
be recurrent.
620
00:34:58,660 --> 00:35:04,940
And the bottom one can also
be positive-recurrent.
621
00:35:04,940 --> 00:35:10,180
So any Markov chain where all
the states communicate with
622
00:35:10,180 --> 00:35:10,830
each other--
623
00:35:10,830 --> 00:35:13,490
there's a path from everything
to everything else--
624
00:35:13,490 --> 00:35:18,650
which is the usual situation,
is called an irreducible
625
00:35:18,650 --> 00:35:20,110
Markov chain.
626
00:35:20,110 --> 00:35:23,680
An irreducible chain can now be
positive-recurrent,
627
00:35:23,680 --> 00:35:25,400
null-recurrent, or transient.
628
00:35:25,400 --> 00:35:29,170
All the states in an irreducible
Markov chain have
629
00:35:29,170 --> 00:35:32,610
to be transient, or all
of them have to be
630
00:35:32,610 --> 00:35:35,160
positive-recurrent, or all
of them have to be
631
00:35:35,160 --> 00:35:36,910
null-recurrent.
632
00:35:36,910 --> 00:35:39,310
You can't share these
qualities over
633
00:35:39,310 --> 00:35:41,340
an irreducible chain.
634
00:35:41,340 --> 00:35:43,290
That's what this last
theorem just said.
635
00:35:47,750 --> 00:35:52,510
OK, so if a steady
state exists--
636
00:35:52,510 --> 00:35:56,560
namely if the solution to those
equations exist, and if
637
00:35:56,560 --> 00:36:01,180
the probability that X sub 0
equals i is equal to pi i.
638
00:36:01,180 --> 00:36:05,230
And incidentally, in the version
that got handed out,
639
00:36:05,230 --> 00:36:08,910
that equation there was
a little bit garbled.
640
00:36:08,910 --> 00:36:10,090
That one.
641
00:36:10,090 --> 00:36:14,510
Said the probability that X sub
0 was equal to pi i, which
642
00:36:14,510 --> 00:36:17,180
doesn't make any sense.
643
00:36:17,180 --> 00:36:20,170
If a steady-state exists
and you start out in
644
00:36:20,170 --> 00:36:21,090
steady-state--
645
00:36:21,090 --> 00:36:26,560
namely, the starting state X
sub 0 is in state i with
646
00:36:26,560 --> 00:36:30,550
probability pi sub i for
every i, this is the same
647
00:36:30,550 --> 00:36:31,410
trick we played for
648
00:36:31,410 --> 00:36:33,440
finite-state Markov chains.
649
00:36:33,440 --> 00:36:36,980
As we go through this, I will
try to explain what's the same
650
00:36:36,980 --> 00:36:38,170
and what's different.
651
00:36:38,170 --> 00:36:39,640
And this is completely
the same.
652
00:36:39,640 --> 00:36:41,875
So there's nothing new here.
653
00:36:45,420 --> 00:36:49,660
Then, this situation of being in
steady-state persists from
654
00:36:49,660 --> 00:36:52,350
one unit of time to the next.
655
00:36:52,350 --> 00:36:55,500
Namely, if you start out in
steady-state, then the
656
00:36:55,500 --> 00:37:05,110
probability that X sub 1 is
equal to j is equal to the sum
657
00:37:05,110 --> 00:37:07,280
over i of pi sub i.
658
00:37:07,280 --> 00:37:10,590
That's the probability that
X sub 0 is equal to i.
659
00:37:10,590 --> 00:37:14,440
Times P sub i j, which by the
steady-state equations, is
660
00:37:14,440 --> 00:37:15,920
equal to pi sub j.
661
00:37:15,920 --> 00:37:18,180
So you start out in
steady-state.
662
00:37:18,180 --> 00:37:21,790
After one transition, you're
in steady-state again.
663
00:37:21,790 --> 00:37:24,050
You're in steady-state
at time 1.
664
00:37:24,050 --> 00:37:26,140
Guess what, you're
in steady-state at time 2.
665
00:37:26,140 --> 00:37:27,750
You're in steady-state again.
666
00:37:27,750 --> 00:37:31,160
And you stay in steady-state
forever.
667
00:37:31,160 --> 00:37:35,990
So when you iterate, the
probability that you're in
668
00:37:35,990 --> 00:37:40,440
state j at time X sub n is
equal to pi sub j also.
669
00:37:40,440 --> 00:37:44,130
This is assuming that you
started out in steady-state.
670
00:37:44,130 --> 00:37:47,290
So again, we need some
new notation here.
671
00:37:47,290 --> 00:37:53,760
Let's let N sub j tilde of t be
the number of visits to j in
672
00:37:53,760 --> 00:37:58,020
the period 0 to t starting
in steady-state.
673
00:37:58,020 --> 00:38:03,820
Namely, if you start in state j,
we get a renewal process to
674
00:38:03,820 --> 00:38:06,660
talk about the returns
to state j.
675
00:38:06,660 --> 00:38:11,830
If we start in steady-state,
then this first return to
676
00:38:11,830 --> 00:38:21,210
state j is going to have a
different set of probabilities
677
00:38:21,210 --> 00:38:24,120
than all subsequent returns
to state j.
678
00:38:24,120 --> 00:38:34,880
So N sub j tilde of t is now
not a renewal process, but a
679
00:38:34,880 --> 00:38:37,390
delayed renewal process.
680
00:38:37,390 --> 00:38:39,660
So we have to deal with it
a little bit differently.
681
00:38:39,660 --> 00:38:43,610
But it's a very nice thing
because for all t, the
682
00:38:43,610 --> 00:38:50,620
expected number of returns to
state j over t transitions is
683
00:38:50,620 --> 00:38:53,360
equal to n times pi sub j.
684
00:38:53,360 --> 00:38:59,280
Pi sub j is the probability that
you will be in state j at
685
00:38:59,280 --> 00:39:00,920
any time n.
686
00:39:00,920 --> 00:39:03,050
And it stays the same
for every n.
687
00:39:03,050 --> 00:39:07,530
So if we look at the expected
number of times we hit state
688
00:39:07,530 --> 00:39:11,750
j, it's exactly equal
to n times pi sub j.
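Starting in steady state, this identity — the expected number of visits to j in n steps is exactly n times pi sub j — needs no limits, and it can be checked exactly by propagating the distribution rather than simulating. The 3-state matrix below is an illustrative choice of mine; its stationary distribution is pi = (52/157, 57/157, 48/157), and exact rational arithmetic shows the equality holds with no error at all.

```python
from fractions import Fraction as F

# Illustrative 3-state chain and its exact stationary distribution.
P = [[F(1, 10), F(6, 10), F(3, 10)],
     [F(4, 10), F(2, 10), F(4, 10)],
     [F(5, 10), F(3, 10), F(2, 10)]]
pi = [F(52, 157), F(57, 157), F(48, 157)]

j, n = 1, 50
dist = pi[:]                       # start in steady state
expected_visits = F(0)
for _ in range(n):
    # One step of the chain: dist stays equal to pi at every time.
    dist = [sum(dist[i] * P[i][k] for i in range(3)) for k in range(3)]
    expected_visits += dist[j]     # accumulate P(X_t = j), which is pi_j

assert expected_visits == n * pi[j]   # exactly n * pi_j, no limit needed
```

This is the "nice and clean all the time" property: the ensemble average is exact at every finite horizon, unlike generic renewal quantities.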
689
00:39:11,750 --> 00:39:14,470
And again, here's this
awkward thing about
690
00:39:14,470 --> 00:39:16,510
renewals and Markov.
691
00:39:16,510 --> 00:39:17,410
Yes?
692
00:39:17,410 --> 00:39:20,110
AUDIENCE: So is that sort of
like an ensemble average--
693
00:39:20,110 --> 00:39:20,560
PROFESSOR: Yes.
694
00:39:20,560 --> 00:39:22,520
AUDIENCE: Or is the time
average [INAUDIBLE]?
695
00:39:22,520 --> 00:39:24,000
PROFESSOR: Well, it's an
ensemble average and it's a
696
00:39:24,000 --> 00:39:25,970
time average.
697
00:39:25,970 --> 00:39:28,330
But the thing we're working
with here is the
698
00:39:28,330 --> 00:39:30,150
fact there's a time--
699
00:39:30,150 --> 00:39:33,370
is the fact that it's an
ensemble average, yes.
700
00:39:33,370 --> 00:39:35,790
But it's convenient
because it's an
701
00:39:35,790 --> 00:39:38,290
exact ensemble average.
702
00:39:38,290 --> 00:39:42,450
Usually, with renewal processes,
things are ugly
703
00:39:42,450 --> 00:39:45,970
until you start getting
into the limit zone.
704
00:39:45,970 --> 00:39:49,880
Here, everything is nice
and clean all the time.
705
00:39:49,880 --> 00:39:52,900
So we start out in steady-state
and we get this
706
00:39:52,900 --> 00:39:53,790
beautiful result.
707
00:39:53,790 --> 00:39:55,420
It's starting in steady-state.
708
00:39:55,420 --> 00:40:00,765
The expected number of visits
to state j by time n--
709
00:40:00,765 --> 00:40:03,380
oh, this is interesting.
710
00:40:03,380 --> 00:40:08,915
That t there should
be n obviously.
711
00:40:15,790 --> 00:40:19,580
Well, since we have t's
everywhere else, that n there
712
00:40:19,580 --> 00:40:22,360
should probably be t also.
713
00:40:22,360 --> 00:40:25,670
So you can fix it whichever
way you want.
714
00:40:25,670 --> 00:40:28,580
n's and t's are the same.
715
00:40:28,580 --> 00:40:32,185
I mean, for the purposes of this
lecture, let all t's be
716
00:40:32,185 --> 00:40:33,640
n's and let all n's be t's.
717
00:40:39,810 --> 00:40:41,420
This works for some things.
718
00:40:41,420 --> 00:40:44,670
This starts in steady state,
stays in steady state.
719
00:40:44,670 --> 00:40:47,970
It doesn't work for renewals
because it's a delayed renewal
720
00:40:47,970 --> 00:40:51,800
process, so you can't talk
about a renewal process
721
00:40:51,800 --> 00:40:55,130
starting in state j, because you
don't know that it starts
722
00:40:55,130 --> 00:40:57,520
in state j.
723
00:40:57,520 --> 00:41:00,810
So sometimes we want
to deal with this.
724
00:41:00,810 --> 00:41:02,750
Sometimes we want to
deal with this.
725
00:41:02,750 --> 00:41:06,650
This is the number of returns
to j by time t, starting in state j.
726
00:41:06,650 --> 00:41:14,150
This is the number of returns
to state j over 0 to t if we
727
00:41:14,150 --> 00:41:16,730
start in steady-state.
728
00:41:16,730 --> 00:41:20,200
Here's a useful hack, which you
can use a lot of the time.
729
00:41:23,480 --> 00:41:28,320
Look at what N sub
i j of t is.
730
00:41:28,320 --> 00:41:31,600
It's the number of times
you hit state j
731
00:41:31,600 --> 00:41:33,800
starting in state i.
732
00:41:33,800 --> 00:41:39,370
So let's look at it as you go
for while, you hit state j for
733
00:41:39,370 --> 00:41:41,240
the first time.
734
00:41:41,240 --> 00:41:45,120
After hitting state j for the
first time, you then go
735
00:41:45,120 --> 00:41:47,800
through a number of repetitions
of state j.
736
00:41:47,800 --> 00:41:51,460
But after that first time you
hit state j, you have a
737
00:41:51,460 --> 00:41:53,350
renewal process starting then.
738
00:41:53,350 --> 00:41:57,260
In other words, you have a
delayed renewal process up to
739
00:41:57,260 --> 00:41:58,590
the first renewal.
740
00:41:58,590 --> 00:42:00,600
After that, you have
all the statistics
741
00:42:00,600 --> 00:42:03,060
of a renewal process.
742
00:42:03,060 --> 00:42:08,720
So the idea then is N
sub i j of t is 1.
743
00:42:08,720 --> 00:42:13,180
Counts 1 for the first visit
to j, if there are any.
744
00:42:13,180 --> 00:42:17,700
Plus, N sub j j of t minus
1 for all the subsequent
745
00:42:17,700 --> 00:42:19,995
recurrences from j to j.
746
00:42:19,995 --> 00:42:23,330
Thus, when you look at the
expected values of this, the
747
00:42:23,330 --> 00:42:30,280
expected value of N sub i j of
t is less than or equal to 1
748
00:42:30,280 --> 00:42:34,720
for this first recurrence, for
this first visit, plus the
749
00:42:34,720 --> 00:42:40,140
expected value of N sub j j of
some number smaller than t.
750
00:42:40,140 --> 00:42:44,430
But N sub j j of
t grows with t.
751
00:42:44,430 --> 00:42:46,980
It's a number of visits
over some interval.
752
00:42:46,980 --> 00:42:49,710
And as the interval gets bigger
and bigger, the number
753
00:42:49,710 --> 00:42:52,380
of visits can't shrink.
754
00:42:52,380 --> 00:42:55,920
So you just put the t there
to make it an upper bound.
755
00:42:55,920 --> 00:43:00,810
And then, when you look at
starting in steady-state, what
756
00:43:00,810 --> 00:43:08,080
you get is the sum over all
starting states pi sub i of
757
00:43:08,080 --> 00:43:12,550
the expected value of
N sub i j of t.
758
00:43:12,550 --> 00:43:16,120
And this is less than or equal
to 1 plus the expected value
759
00:43:16,120 --> 00:43:19,030
of N sub j j of t also.
760
00:43:19,030 --> 00:43:26,485
So this says you can always get
from N tilde of t to N sub
761
00:43:26,485 --> 00:43:31,220
j j of t, by just giving
up this term 1
762
00:43:31,220 --> 00:43:33,250
here as an upper bound.
763
00:43:37,740 --> 00:43:40,150
If you don't like that proof--
764
00:43:40,150 --> 00:43:42,150
and it's not really a proof.
765
00:43:42,150 --> 00:43:45,730
If you try to make it a proof,
it gets kind of ugly.
766
00:43:45,730 --> 00:43:50,710
It's part of the proof of
theorem 4 in the text, which
767
00:43:50,710 --> 00:43:52,150
is even more ugly.
768
00:43:52,150 --> 00:43:56,790
Because it's mathematically
clean with equations, but you
769
00:43:56,790 --> 00:43:59,750
don't get any idea of why it's
true from looking at it.
770
00:43:59,750 --> 00:44:02,420
This you know why it's true from
looking at it, but you're
771
00:44:02,420 --> 00:44:05,590
not quite sure that it satisfies
the equations that
772
00:44:05,590 --> 00:44:07,200
you would like.
773
00:44:07,200 --> 00:44:10,540
I am trying to move you from
being totally dependent on
774
00:44:10,540 --> 00:44:15,950
equations to being more
dependent on ideas like this,
775
00:44:15,950 --> 00:44:17,990
where you can see
what's going on.
776
00:44:17,990 --> 00:44:22,200
But I'm also urging you, after
you see what's going on, to
777
00:44:22,200 --> 00:44:25,500
have a way to put the equations
in to see that
778
00:44:25,500 --> 00:44:28,170
you're absolutely
right with it.
779
00:44:28,170 --> 00:44:32,100
OK, now, we come to the major
theorem of countable-state
780
00:44:32,100 --> 00:44:32,980
Markov chains.
781
00:44:32,980 --> 00:44:35,920
It's sort of the crucial
thing that everything
782
00:44:35,920 --> 00:44:37,340
else is based on.
783
00:44:37,340 --> 00:44:43,350
I mean, everything beyond
what we've already done.
784
00:44:43,350 --> 00:44:46,370
For any irreducible
Markov chain--
785
00:44:46,370 --> 00:44:49,810
in other words, for any Markov
chain where all the states
786
00:44:49,810 --> 00:44:55,100
communicate with each other,
the steady-state equations
787
00:44:55,100 --> 00:44:59,560
have a solution if and only
if the states are
788
00:44:59,560 --> 00:45:00,640
positive-recurrent.
789
00:45:00,640 --> 00:45:03,650
Now, remember, either all the
states are positive-recurrent
790
00:45:03,650 --> 00:45:04,710
or none of them are.
791
00:45:04,710 --> 00:45:07,750
So there's nothing
confusing there.
792
00:45:07,750 --> 00:45:10,610
If all the states are
positive-recurrent, then there
793
00:45:10,610 --> 00:45:12,460
is a steady-state solution.
794
00:45:12,460 --> 00:45:16,190
There is a solution to
those equations.
795
00:45:16,190 --> 00:45:22,090
And if the set of states are
transient, or null-recurrent,
796
00:45:22,090 --> 00:45:24,880
then there isn't a solution
to all those equations.
797
00:45:24,880 --> 00:45:31,040
If a solution exists, then the
probability, the steady-state
798
00:45:31,040 --> 00:45:35,980
probability of state i is 1 over
the mean recurrence time
799
00:45:35,980 --> 00:45:36,960
to state i.
800
00:45:36,960 --> 00:45:40,570
This is a relationship that we
established by using renewal
801
00:45:40,570 --> 00:45:43,580
theory for finite-state
Markov chains.
802
00:45:43,580 --> 00:45:45,315
We're just coming
back to it here.
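The identity pi_i = 1 over the mean recurrence time can be checked by simulation on a concrete countable-state chain. A sketch, assuming a birth-death chain with up-probability p from every state and a self-loop at 0 (illustrative assumptions of mine, chosen so that pi_i = (1 - rho) * rho**i with rho = p/(1 - p)). With p = 0.3, pi_0 = 1 - rho = 4/7, so the mean recurrence time of state 0 should come out near 7/4 = 1.75:

```python
import random

p = 0.3
rho = p / (1 - p)
rng = random.Random(42)

def return_time():
    """Steps until the chain, started in state 0, is next in state 0
    (the self-loop at 0 counts as a return of length 1)."""
    state = 1 if rng.random() < p else 0
    steps = 1
    while state != 0:
        state += 1 if rng.random() < p else -1
        steps += 1
    return steps

n = 100_000
mean_T00 = sum(return_time() for _ in range(n)) / n
predicted = 1.0 / (1.0 - rho)      # 1/pi_0, which is 7/4 for p = 0.3
```

The sample mean and 1/pi_0 agree to a couple of decimal places, in line with the theorem.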
803
00:45:48,180 --> 00:45:52,050
One thing which is important
here is that pi sub i is
804
00:45:52,050 --> 00:45:54,040
greater than 0 for all i.
805
00:45:54,040 --> 00:45:56,690
This is a property we
had for finite-state
806
00:45:56,690 --> 00:45:58,780
Markov chains also.
807
00:45:58,780 --> 00:46:01,340
But it's a good deal more
surprising here.
808
00:46:01,340 --> 00:46:04,520
When you have a countable number
of states, saying that
809
00:46:04,520 --> 00:46:09,490
every one of them has a positive
probability is--
810
00:46:09,490 --> 00:46:12,100
I don't think it's entirely
intuitive.
811
00:46:12,100 --> 00:46:13,700
If you think about it
for a long time,
812
00:46:13,700 --> 00:46:15,810
it's sort of intuitive.
813
00:46:15,810 --> 00:46:18,560
But it's the kind of intuitive
thing that really pushes your
814
00:46:18,560 --> 00:46:23,360
intuition into understanding
what's going on.
815
00:46:23,360 --> 00:46:31,910
So let's give a proof of this,
of the only if part.
816
00:46:31,910 --> 00:46:33,660
And I will warn you
about reading the
817
00:46:33,660 --> 00:46:35,770
proof in the notes.
818
00:46:35,770 --> 00:46:39,470
It's ugly because it just goes
through a bunch of logical
819
00:46:39,470 --> 00:46:41,500
relationships and equations.
820
00:46:41,500 --> 00:46:45,070
You have no idea of where
it's going or why.
821
00:46:45,070 --> 00:46:47,210
And finally, at the
end it says, QED.
822
00:46:49,760 --> 00:46:50,430
I went through it.
823
00:46:50,430 --> 00:46:51,450
It's correct.
824
00:46:51,450 --> 00:46:54,790
But damned if I know why.
825
00:46:54,790 --> 00:46:59,560
And so, anyway, that has
to be rewritten.
826
00:46:59,560 --> 00:47:02,040
But, anyway, here's the proof.
827
00:47:02,040 --> 00:47:06,140
Start out by assuming that the
steady-state equations exist.
828
00:47:06,140 --> 00:47:07,390
We want to show
positive-recurrence.
829
00:47:10,210 --> 00:47:14,110
Pick any j and any t.
830
00:47:14,110 --> 00:47:17,140
Pick any state and any time.
831
00:47:17,140 --> 00:47:24,540
pi sub j times t is equal to the
expected value of N sub j
832
00:47:24,540 --> 00:47:26,050
tilde of t.
833
00:47:26,050 --> 00:47:27,890
That we showed for any
Markov chain at all.
834
00:47:27,890 --> 00:47:30,650
If you start out in
steady-state, you stay in
835
00:47:30,650 --> 00:47:31,850
steady-state.
836
00:47:31,850 --> 00:47:34,720
So under the assumption that
we're in steady-state--
837
00:47:37,900 --> 00:47:40,250
under the assumption that we
start out in steady-state, we
838
00:47:40,250 --> 00:47:42,370
stay in steady-state.
839
00:47:42,370 --> 00:47:48,520
This pi sub j times t has to be
the expected value of the
840
00:47:48,520 --> 00:47:54,120
number of recurrences to state
j over t time units.
841
00:47:54,120 --> 00:48:00,170
And what we showed on
the last slide--
842
00:48:00,170 --> 00:48:04,050
you must have realized I was
doing this for some reason.
843
00:48:04,050 --> 00:48:07,990
This is less than or equal to 1
plus the expected number of
844
00:48:07,990 --> 00:48:09,630
recurrences of state j.
845
00:48:14,410 --> 00:48:18,540
So pi sub j is less than or
equal to 1 over t times 1 plus
846
00:48:18,540 --> 00:48:22,340
this expected number of
recurrences of state j.
847
00:48:22,340 --> 00:48:26,870
And if we go to the limit as t
goes to infinity, this 1 over
848
00:48:26,870 --> 00:48:29,680
t dribbles away to
nothingness.
849
00:48:29,680 --> 00:48:33,610
So this is less than or equal to
the limit of expected value
850
00:48:33,610 --> 00:48:36,300
of N sub j j of t over t.
851
00:48:36,300 --> 00:48:38,350
What is that?
852
00:48:38,350 --> 00:48:42,210
That's the expected number,
long-term rate of
853
00:48:42,210 --> 00:48:44,600
visits to state j.
854
00:48:44,600 --> 00:48:49,670
It's what we've shown as equal
to 1 over the expected renewal
855
00:48:49,670 --> 00:48:53,193
time of state j.
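The relation just stated, pi sub j equals 1 over the expected recurrence time of state j, is easy to check numerically. Here is a minimal sketch for a two-state chain; the transition probabilities a and b are made-up illustrative values, not anything from the lecture.

```python
# Sanity check (illustrative numbers): pi_j = 1 / E[T_jj] for a
# two-state chain with transition matrix P = [[1-a, a], [b, 1-b]].
a, b = 0.3, 0.2

# Stationary distribution: solve pi P = pi with pi0 + pi1 = 1.
pi0 = b / (a + b)
pi1 = a / (a + b)

# Expected recurrence time of state 0 by first-step analysis:
# from state 1, the expected time to reach 0 is 1/b, so
# E[T_00] = (1 - a) * 1 + a * (1 + 1/b) = 1 + a/b.
E_T00 = 1 + a / b

assert abs(pi0 * E_T00 - 1.0) < 1e-12   # pi_0 = 1 / E[T_00]
assert abs(pi0 + pi1 - 1.0) < 1e-12
```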
856
00:48:53,193 --> 00:48:59,350
Now, if the sum of the pi sub
j's is equal to 1, remember
857
00:48:59,350 --> 00:49:03,930
what happens when you sum a
countable set of numbers.
858
00:49:03,930 --> 00:49:07,940
If all of them are 0, then no
matter how many of them you
859
00:49:07,940 --> 00:49:09,960
sum, you have 0.
860
00:49:09,960 --> 00:49:13,100
And when you go to the limit,
you still have 0.
861
00:49:13,100 --> 00:49:16,550
So when you sum a countable
set of non-negative
862
00:49:16,550 --> 00:49:18,020
numbers, you have
to have a limit.
863
00:49:20,990 --> 00:49:22,240
Because it's non-decreasing.
864
00:49:24,720 --> 00:49:27,010
And that sum is equal to 1.
865
00:49:27,010 --> 00:49:29,670
Then somewhere along the line,
you've got to find the
866
00:49:29,670 --> 00:49:32,060
positive probability.
867
00:49:32,060 --> 00:49:33,180
One of the [INAUDIBLE]
868
00:49:33,180 --> 00:49:34,430
has to be positive.
869
00:49:37,300 --> 00:49:40,610
I mean, this is almost an
amusing proof because you work
870
00:49:40,610 --> 00:49:44,250
so hard to prove that one
of them is positive.
871
00:49:44,250 --> 00:49:47,580
And then, almost for free, you
get the fact that all of them
872
00:49:47,580 --> 00:49:50,790
have to be positive.
873
00:49:50,790 --> 00:49:53,940
So some pi j is greater
than 0.
874
00:49:53,940 --> 00:49:58,240
If pi j is less than or equal
to this, then the limit as t
875
00:49:58,240 --> 00:50:02,570
approaches infinity of the
expected value of N sub j j of
876
00:50:02,570 --> 00:50:08,740
t over t is greater than 0 for
that j, which says j has to be
877
00:50:08,740 --> 00:50:10,930
positive-recurrent.
878
00:50:10,930 --> 00:50:15,270
Which says all the states have
to be positive-recurrent
879
00:50:15,270 --> 00:50:17,110
because we've already
shown that.
880
00:50:17,110 --> 00:50:19,910
So all the states are
positive-recurrent.
881
00:50:19,910 --> 00:50:23,390
Then you still have to show that
this inequality here is
882
00:50:23,390 --> 00:50:27,070
equality, and you've got to do
that by playing around with
883
00:50:27,070 --> 00:50:28,805
summing up these things.
884
00:50:33,660 --> 00:50:35,670
Something has been left
out; we have to sum
885
00:50:35,670 --> 00:50:37,150
those up over j.
886
00:50:37,150 --> 00:50:38,250
And that's another mess.
887
00:50:38,250 --> 00:50:40,210
I'm not going to do
it here in class.
888
00:50:40,210 --> 00:50:42,030
But just sort of see
why this happened.
889
00:50:42,030 --> 00:50:42,512
Yeah?
890
00:50:42,512 --> 00:50:43,762
AUDIENCE: [INAUDIBLE].
891
00:50:46,368 --> 00:50:49,260
Why do you have to show
the equality?
892
00:50:49,260 --> 00:50:51,630
PROFESSOR: Why do I have
to show the equality?
893
00:50:51,630 --> 00:50:57,320
Because if I want to show that
all of the pi sub i's are
894
00:50:57,320 --> 00:51:01,190
positive, how do I show that?
895
00:51:01,190 --> 00:51:03,050
All I've done is started
out with an arbitrary--
896
00:51:03,050 --> 00:51:05,550
oh, I've started out with
an arbitrary j and
897
00:51:05,550 --> 00:51:08,830
an arbitrary t.
898
00:51:08,830 --> 00:51:12,830
Because I got the fact that this
was positive-recurrent by
899
00:51:12,830 --> 00:51:14,740
arguing that at least
one of the pi sub
900
00:51:14,740 --> 00:51:16,430
j's had to be positive.
901
00:51:16,430 --> 00:51:18,060
From this I can argue
that they're all
902
00:51:18,060 --> 00:51:22,690
positive-recurrent, which tells
me that this number is
903
00:51:22,690 --> 00:51:24,070
greater than 0.
904
00:51:24,070 --> 00:51:29,430
But that doesn't show me that
this number is greater than 0.
905
00:51:29,430 --> 00:51:30,570
But it is.
906
00:51:30,570 --> 00:51:31,380
I mean, it's all right.
907
00:51:31,380 --> 00:51:33,600
It all works out.
908
00:51:33,600 --> 00:51:38,080
But not quite in such a simple
way as you would hope.
909
00:51:38,080 --> 00:51:43,290
OK, so now let's go back to
what we called birth-death
910
00:51:43,290 --> 00:51:50,770
chains, but look at a slightly
more general version of them.
911
00:51:50,770 --> 00:51:53,630
These are things that you--
912
00:51:53,630 --> 00:51:56,710
I mean, queuing theory is
built on these things.
913
00:51:56,710 --> 00:51:59,260
Everything in queuing theory.
914
00:51:59,260 --> 00:52:02,900
Or not everything, but all the
things that come from a
915
00:52:02,900 --> 00:52:06,090
Poisson kind of background.
916
00:52:06,090 --> 00:52:12,880
All of these somehow look at
the birth-death chains.
917
00:52:12,880 --> 00:52:17,110
And the way a birth-death
chain works is you have
918
00:52:17,110 --> 00:52:19,670
arbitrary self-loops.
919
00:52:19,670 --> 00:52:22,780
You have positive probabilities
going from each
920
00:52:22,780 --> 00:52:25,760
state to the next state up.
921
00:52:25,760 --> 00:52:30,460
You have positive probabilities
going from the
922
00:52:30,460 --> 00:52:33,390
higher state to the
lower state.
923
00:52:33,390 --> 00:52:35,830
All transitions are
limited from--
924
00:52:35,830 --> 00:52:41,560
i can only go to i plus 1,
or i, or i minus 1.
925
00:52:41,560 --> 00:52:42,760
You can't make big jumps.
926
00:52:42,760 --> 00:52:46,340
You can only make jumps
of one step.
927
00:52:46,340 --> 00:52:49,540
And other than that, it's
completely general.
928
00:52:49,540 --> 00:52:52,485
OK, now we go through an
interesting argument.
929
00:52:55,210 --> 00:53:00,160
We look at an arbitrary
state i.
930
00:53:00,160 --> 00:53:08,080
And for this arbitrary state i,
like i equals 2, we look at
931
00:53:08,080 --> 00:53:11,860
the number of transitions
that go from 2 to 3.
932
00:53:11,860 --> 00:53:15,440
And the number transitions that
go from 3 to 2 for any
933
00:53:15,440 --> 00:53:18,010
old sample path whatsoever.
934
00:53:18,010 --> 00:53:20,370
And for any sample path,
the number of
935
00:53:20,370 --> 00:53:22,830
transitions that go up--
936
00:53:22,830 --> 00:53:27,430
if we start down there, before
you can come back,
937
00:53:27,430 --> 00:53:28,770
you've got to go up.
938
00:53:28,770 --> 00:53:33,120
So if you're on that side, you
have one more up transition
939
00:53:33,120 --> 00:53:34,880
than you have down transition.
940
00:53:34,880 --> 00:53:37,900
If you're on that side, you
have the same number of up
941
00:53:37,900 --> 00:53:41,470
transitions and down
transitions.
942
00:53:41,470 --> 00:53:44,870
So that as you look over a
longer and longer time, the
943
00:53:44,870 --> 00:53:49,280
number of up transitions is
effectively the same as the
944
00:53:49,280 --> 00:53:50,560
number of down transitions.
945
00:53:53,540 --> 00:53:58,420
If you have a steady-state, pi
sub i is the fraction of time
946
00:53:58,420 --> 00:54:00,270
you're in state i.
947
00:54:00,270 --> 00:54:08,340
pi sub i times p sub i is the
fraction of time you're going
948
00:54:08,340 --> 00:54:12,860
from state i to state
i plus 1.
949
00:54:12,860 --> 00:54:19,720
And pi sub i plus 1 times q
sub i plus 1 is the fraction
950
00:54:19,720 --> 00:54:22,310
of time you're going
from state i plus 1
951
00:54:22,310 --> 00:54:23,680
down to state i.
952
00:54:23,680 --> 00:54:32,160
What we've just argued by the
fact that sample path averages
953
00:54:32,160 --> 00:54:37,940
and ensemble averages have to
be equal is that pi sub i
954
00:54:37,940 --> 00:54:42,750
times p sub i is equal to
pi sub i plus 1 times
955
00:54:42,750 --> 00:54:44,000
q sub i plus 1.
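The sample-path claim behind this equation, that the number of up crossings and down crossings of any cut can differ by at most 1, can be checked by direct simulation. The five-state chain below, with its particular p and q, is purely illustrative.

```python
import random

# Sample-path check of the cut argument for a birth-death chain:
# across the cut between state `cut` and `cut + 1`, upward and
# downward crossing counts never differ by more than 1.
random.seed(1)
p, q = 0.4, 0.5          # up / down probabilities; self-loop gets the rest
state, cut = 0, 2
up = down = 0
for _ in range(10000):
    r = random.random()
    if r < p and state < 4:          # up transition (blocked at the top)
        if state == cut:
            up += 1
        state += 1
    elif r < p + q and state > 0:    # down transition (blocked at 0)
        if state == cut + 1:
            down += 1
        state -= 1
assert abs(up - down) <= 1
```

Starting below the cut, the path must go up across the cut before it can come back down, so the invariant holds at every step, not just at the end.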
956
00:54:47,150 --> 00:54:50,400
In the next slide, I will
talk about whether to
957
00:54:50,400 --> 00:54:52,290
believe that or not.
958
00:54:52,290 --> 00:54:55,810
For the moment, let's
say we believe it.
959
00:54:55,810 --> 00:55:00,520
And from this equation, we
see that the steady-state
960
00:55:00,520 --> 00:55:04,960
probability of i plus 1 is
equal to the steady-state
961
00:55:04,960 --> 00:55:10,430
probability of i times p sub
i over q sub i plus 1.
962
00:55:10,430 --> 00:55:14,460
It says that the steady-state
probability of each pi is
963
00:55:14,460 --> 00:55:19,020
determined by the steady-state
probability of the state
964
00:55:19,020 --> 00:55:20,000
underneath it.
965
00:55:20,000 --> 00:55:21,410
So you just go up.
966
00:55:21,410 --> 00:55:24,100
You can calculate the
steady-state of each, the
967
00:55:24,100 --> 00:55:27,890
probability of each if you know
the probability of the
968
00:55:27,890 --> 00:55:30,000
state below it.
969
00:55:30,000 --> 00:55:36,300
So if you recurse on this, pi
sub i plus 1 is equal to pi
970
00:55:36,300 --> 00:55:41,470
sub i times this ratio is equal
to pi sub i minus 1
971
00:55:41,470 --> 00:55:47,170
times this ratio times p sub i
minus 1 over q sub i is equal
972
00:55:47,170 --> 00:55:52,390
to pi sub i minus 2 times
this triple of things.
973
00:55:52,390 --> 00:55:56,676
It tells you that what you want
to do is define row sub i
974
00:55:56,676 --> 00:56:01,580
as the ratio of these two
probabilities, namely rho sub i,
975
00:56:01,580 --> 00:56:06,700
for any state i, is the ratio
of that probability to that
976
00:56:06,700 --> 00:56:08,310
probability.
977
00:56:08,310 --> 00:56:15,820
And this equation then turns
into pi sub i plus 1 equals pi
978
00:56:15,820 --> 00:56:18,620
sub i times rho sub i.
979
00:56:18,620 --> 00:56:20,530
If you put all those
things together--
980
00:56:20,530 --> 00:56:23,490
if you just paste them one after
the other, the way I was
981
00:56:23,490 --> 00:56:24,710
suggesting--
982
00:56:24,710 --> 00:56:30,480
what you get is pi sub i is
equal to pi sub 0 times this
983
00:56:30,480 --> 00:56:32,230
product of terms.
984
00:56:32,230 --> 00:56:35,930
The product of terms looks
a little ugly.
985
00:56:35,930 --> 00:56:38,270
Why don't I care about
that very much?
986
00:56:38,270 --> 00:56:41,590
Well, because usually, when you
have a chain like this,
987
00:56:41,590 --> 00:56:44,930
all the Ps are the same and
all the Qs are the same--
988
00:56:44,930 --> 00:56:49,110
or all the Ps are the same for
some point beyond someplace,
989
00:56:49,110 --> 00:56:52,240
they're are different
before that.
990
00:56:52,240 --> 00:56:54,670
There's always some structure
to make life easy for you.
991
00:56:58,150 --> 00:56:58,950
Oh, that's my computer.
992
00:56:58,950 --> 00:57:00,450
It's telling me what
time it is.
993
00:57:00,450 --> 00:57:03,310
I'm sorry.
994
00:57:03,310 --> 00:57:03,740
OK.
995
00:57:03,740 --> 00:57:07,080
So pi sub i is this.
996
00:57:07,080 --> 00:57:10,070
We then have to calculate
pi sub 0.
997
00:57:10,070 --> 00:57:14,810
Pi sub 0 is then 1 divided
by this, because the sum of all the
998
00:57:14,810 --> 00:57:19,730
probabilities is pi sub 0 times
all those other things.
999
00:57:19,730 --> 00:57:23,490
It's 1 plus the sum here.
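The whole recipe, build pi from the rhos and then normalize by this sum, can be written out in a few lines. In this minimal sketch the particular p sub i and q sub i values are made up for illustration; at the end we verify that the resulting pi really satisfies the steady-state equations pi P = pi.

```python
# Steady state of a finite birth-death chain from rho_i = p_i / q_{i+1}.
n = 5
p = [0.4, 0.3, 0.3, 0.2, 0.0]    # up-probabilities p_0..p_4 (p_4 = 0 at the top)
q = [0.0, 0.5, 0.4, 0.4, 0.3]    # down-probabilities q_0..q_4 (q_0 = 0 at the bottom)

# pi_i proportional to prod_{j < i} rho_j; normalize so the pi sum to 1.
weights = [1.0]
for i in range(n - 1):
    weights.append(weights[-1] * p[i] / q[i + 1])
total = sum(weights)
pi = [w / total for w in weights]

# Transition matrix: the self-loop absorbs whatever probability is left.
P = [[0.0] * n for _ in range(n)]
for i in range(n):
    if i + 1 < n:
        P[i][i + 1] = p[i]
    if i > 0:
        P[i][i - 1] = q[i]
    P[i][i] = 1.0 - p[i] - q[i]

# Check that pi satisfies the steady-state equations.
for j in range(n):
    assert abs(sum(pi[i] * P[i][j] for i in range(n)) - pi[j]) < 1e-12
```

Note that the self-loop probabilities enter P but cancel out of the check, which is exactly the point made below about the solution depending only on the rhos.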
1000
00:57:23,490 --> 00:57:29,680
And now if you don't believe
what I did here, and I don't
1001
00:57:29,680 --> 00:57:32,710
blame you for being a little
bit skeptical.
1002
00:57:32,710 --> 00:57:39,160
If you don't believe this, then
you look at this and you
1003
00:57:39,160 --> 00:57:43,150
say, OK, I can now go back and
look at the steady state
1004
00:57:43,150 --> 00:57:47,080
equations themselves and I can
plug this into the steady
1005
00:57:47,080 --> 00:57:49,240
state equations themselves.
1006
00:57:49,240 --> 00:57:53,020
And you will immediately see
that this solution satisfies
1007
00:57:53,020 --> 00:57:55,450
the steady state equations.
1008
00:57:55,450 --> 00:57:57,540
OK.
1009
00:57:57,540 --> 00:57:59,310
Oh, damn.
1010
00:57:59,310 --> 00:58:00,560
Excuse my language.
1011
00:58:03,930 --> 00:58:05,180
OK.
1012
00:58:08,330 --> 00:58:12,370
So we have our birth-death
chain with all these
1013
00:58:12,370 --> 00:58:14,150
transitions here.
1014
00:58:14,150 --> 00:58:17,710
We have our solution to it.
1015
00:58:17,710 --> 00:58:23,500
Note that the solution is only
a function of these rhos.
1016
00:58:23,500 --> 00:58:28,150
It's only a function of the
ratio of p sub i to
1017
00:58:28,150 --> 00:58:29,790
q sub i plus 1.
1018
00:58:29,790 --> 00:58:33,560
It doesn't depend on those
self loops at all.
1019
00:58:33,560 --> 00:58:34,810
Isn't that peculiar?
1020
00:58:37,400 --> 00:58:41,606
Completely independent of what
those self loops are.
1021
00:58:41,606 --> 00:58:44,470
Well, you'll see later that it's
not totally independent
1022
00:58:44,470 --> 00:58:46,685
of it, but it's essentially
independent of it.
1023
00:58:50,050 --> 00:58:54,930
And you think about that for a
while and suddenly it's not
1024
00:58:54,930 --> 00:58:59,810
that confusing because those
equations have come from
1025
00:58:59,810 --> 00:59:03,770
looking at up transitions
and down transitions.
1026
00:59:03,770 --> 00:59:07,720
By looking at an up transition
and a down transition at one
1027
00:59:07,720 --> 00:59:12,100
place here, it tells you
something about the fraction
1028
00:59:12,100 --> 00:59:14,750
of time you're over there and
the fraction of time you're
1029
00:59:14,750 --> 00:59:17,070
down there if you know what
these steady state
1030
00:59:17,070 --> 00:59:18,940
probabilities are.
1031
00:59:18,940 --> 00:59:21,830
So if you think about it for a
bit, you realize that these
1032
00:59:21,830 --> 00:59:26,060
steady state probabilities
cannot depend that strongly on
1033
00:59:26,060 --> 00:59:27,270
what those self loops are.
1034
00:59:27,270 --> 00:59:30,901
So this all sort
of makes sense.
1035
00:59:30,901 --> 00:59:34,580
The next thing is the expression
for pi 0--
1036
00:59:34,580 --> 00:59:36,360
namely this thing here--
1037
00:59:36,360 --> 00:59:37,780
is a product of these terms.
1038
00:59:40,460 --> 00:59:44,360
If it converges, the
chain is positive recurrent
1039
00:59:44,360 --> 00:59:47,650
because there is a solution to
the steady state equation.
1040
00:59:47,650 --> 00:59:50,750
It converges if the
rho sub i's are
1041
00:59:50,750 --> 00:59:54,870
asymptotically less than 1.
1042
00:59:54,870 --> 00:59:57,900
So for example, if
the row sub i's--
1043
00:59:57,900 --> 01:00:00,390
beyond i equals 100--
1044
01:00:00,390 --> 01:00:05,500
are bounded by, say, 0.9, then
these terms have to go to 0
1045
01:00:05,500 --> 01:00:11,420
rapidly after i equals 100 and
this product has to converge.
1046
01:00:11,420 --> 01:00:15,380
I say essentially here because of
all these particular cases where
1047
01:00:15,380 --> 01:00:19,520
the rho sub i's are very close
to 1, and they're converging
1048
01:00:19,520 --> 01:00:23,190
very slowly to 1
and who knows.
1049
01:00:23,190 --> 01:00:26,130
But for most of the things we
do, these rho sub i's are
1050
01:00:26,130 --> 01:00:29,640
strictly less than
1 as you move up.
1051
01:00:29,640 --> 01:00:33,140
And it says that you have
to have steady state
1052
01:00:33,140 --> 01:00:34,600
probabilities.
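For the common special case where all the rhos equal a single constant rho less than 1, the product telescopes and pi sub i is geometric. A quick sketch; the value rho = 0.9 is an arbitrary choice.

```python
# Constant rho < 1: pi_i = (1 - rho) * rho**i, a geometric distribution.
rho = 0.9
pi = [(1 - rho) * rho**i for i in range(500)]

# The tail beyond 500 terms is negligibly small for rho = 0.9.
assert abs(sum(pi) - 1.0) < 1e-12
# Each state's probability is rho times the one below it.
assert all(abs(pi[i + 1] - rho * pi[i]) < 1e-15 for i in range(499))
```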
1053
01:00:34,600 --> 01:00:42,310
So for most birth-death chains,
it's almost immediate
1054
01:00:42,310 --> 01:00:46,280
to establish whether it's
recurrent, positive recurrent,
1055
01:00:46,280 --> 01:00:48,410
or not positive recurrent.
1056
01:00:48,410 --> 01:00:51,270
And we'll talk more about that
when we get into Markov
1057
01:00:51,270 --> 01:00:56,470
processes, but that's enough
of it for now.
1058
01:00:56,470 --> 01:00:59,200
Comment on methodology.
1059
01:00:59,200 --> 01:01:02,750
We could check the renewal
results carefully, because
1060
01:01:02,750 --> 01:01:05,820
what we're doing here is
assuming something rather
1061
01:01:05,820 --> 01:01:11,590
peculiar about time averages
and ensemble averages.
1062
01:01:11,590 --> 01:01:14,840
And sometimes you have to worry
about those things, but
1063
01:01:14,840 --> 01:01:17,990
here, we don't have to worry
about it because we have this
1064
01:01:17,990 --> 01:01:20,940
major theorem which tells
us if steady state
1065
01:01:20,940 --> 01:01:22,720
probabilities exist--
1066
01:01:22,720 --> 01:01:25,390
and they exist because they
satisfy these equations--
1067
01:01:25,390 --> 01:01:27,800
then you have positive
recurrence.
1068
01:01:27,800 --> 01:01:32,510
So it says the methodology to
use is not to get involved in
1069
01:01:32,510 --> 01:01:34,750
any deep theory, but just
to see if these
1070
01:01:34,750 --> 01:01:36,750
equations are satisfied.
1071
01:01:36,750 --> 01:01:41,070
Again, good mathematicians
are lazy--
1072
01:01:41,070 --> 01:01:43,870
good engineers are
even lazier.
1073
01:01:43,870 --> 01:01:47,110
That's my motto of the day.
1074
01:01:47,110 --> 01:01:50,180
And finally, birth-death
chains are going to be
1075
01:01:50,180 --> 01:01:54,770
particularly useful in queuing
where the births are arrivals
1076
01:01:54,770 --> 01:01:56,233
and the deaths are departures.
1077
01:02:00,160 --> 01:02:00,670
OK.
1078
01:02:00,670 --> 01:02:02,980
Now we come to reversibility.
1079
01:02:02,980 --> 01:02:06,930
I'm glad we're coming to that
towards the end of the lecture
1080
01:02:06,930 --> 01:02:12,400
because reversibility is
something which I don't think
1081
01:02:12,400 --> 01:02:15,340
any of you guys even--
1082
01:02:15,340 --> 01:02:17,500
and I think this is a
pretty smart class--
1083
01:02:17,500 --> 01:02:20,570
but I've never seen anybody who
understands reversibility
1084
01:02:20,570 --> 01:02:22,850
the first time they
think about it.
1085
01:02:22,850 --> 01:02:29,140
It's a very peculiar concept and
the results coming from it
1086
01:02:29,140 --> 01:02:34,490
are peculiar, and we will have
to live with it for a while.
1087
01:02:34,490 --> 01:02:40,250
But let's start out with
the easy things--
1088
01:02:40,250 --> 01:02:44,810
just with a definition of
what a Markov chain is.
1089
01:02:44,810 --> 01:02:49,480
This top equation here says
that the probability of a
1090
01:02:49,480 --> 01:02:52,620
whole bunch of states--
1091
01:02:52,620 --> 01:02:57,790
X sub n plus k down to X sub n
plus 1, given the state at time
1092
01:02:57,790 --> 01:03:00,900
n, down to the state at time 0.
1093
01:03:00,900 --> 01:03:04,200
Because of the Markov condition,
that has to be
1094
01:03:04,200 --> 01:03:07,190
equal to the probability
of these terms
1095
01:03:07,190 --> 01:03:09,580
just given X sub n.
1096
01:03:09,580 --> 01:03:12,890
Namely, if you know what X sub
n is, for the future, you
1097
01:03:12,890 --> 01:03:16,060
don't have to know what any of
those previous states are.
1098
01:03:16,060 --> 01:03:19,980
You get that directly from
where we started with the
1099
01:03:19,980 --> 01:03:23,040
Markov chains-- the probability
of X sub n plus 1,
1100
01:03:23,040 --> 01:03:25,400
given all this stuff, and
then you just add the
1101
01:03:25,400 --> 01:03:27,260
other things onto it.
1102
01:03:27,260 --> 01:03:34,530
Now, if you define A plus as any
event which is defined in
1103
01:03:34,530 --> 01:03:40,660
terms of X sub n plus 1, X of n
plus 2, and so forth up, and
1104
01:03:40,660 --> 01:03:44,670
if you define A minus as
anything which is a function
1105
01:03:44,670 --> 01:03:51,410
of X sub n minus 1, X sub n
minus 2, down to X sub 0, then
1106
01:03:51,410 --> 01:03:54,130
what this equation
says is that the
1107
01:03:54,130 --> 01:03:57,180
probability of any A plus--
1108
01:03:57,180 --> 01:03:59,900
given X sub n and A minus--
1109
01:03:59,900 --> 01:04:06,080
is equal to the probability
of A plus given X sub n.
1110
01:04:06,080 --> 01:04:08,140
And this hasn't gotten
hard yet.
1111
01:04:08,140 --> 01:04:10,770
If you think this is
hard, just wait.
1112
01:04:16,230 --> 01:04:19,670
If we now multiply this by
the probability of A
1113
01:04:19,670 --> 01:04:21,800
minus given X sub n--
1114
01:04:21,800 --> 01:04:26,130
and what I'm trying to get at
is, how do you reason about
1115
01:04:26,130 --> 01:04:28,560
the probabilities of
earlier states
1116
01:04:28,560 --> 01:04:30,440
given the present state?
1117
01:04:30,440 --> 01:04:33,790
We're used to proceeding
in time.
1118
01:04:33,790 --> 01:04:36,930
We're used to looking at
the past for telling
1119
01:04:36,930 --> 01:04:39,300
what the future is.
1120
01:04:39,300 --> 01:04:41,540
And every once in a while, you
want to look at the future
1121
01:04:41,540 --> 01:04:44,036
and predict what the
past had to be.
1122
01:04:44,036 --> 01:04:48,470
It's probably more important to
talk about the future given
1123
01:04:48,470 --> 01:04:50,585
the past, because sometimes
you don't know
1124
01:04:50,585 --> 01:04:51,450
what the future is.
1125
01:04:51,450 --> 01:04:55,210
But mathematically, you
have to sort that out.
1126
01:04:55,210 --> 01:05:02,620
So if we multiply this equation
by the probability of
1127
01:05:02,620 --> 01:05:05,350
A minus, given X sub n,
we don't know what
1128
01:05:05,350 --> 01:05:08,950
that is, but it exists.
1129
01:05:08,950 --> 01:05:11,370
It's a defined conditional
probability.
1130
01:05:11,370 --> 01:05:14,460
Then what we get is the
probability of A plus and A
1131
01:05:14,460 --> 01:05:18,710
minus, given X sub n, is equal
to the probability of A plus,
1132
01:05:18,710 --> 01:05:21,410
given X sub n, times the
probability of A
1133
01:05:21,410 --> 01:05:24,060
minus, given X sub n.
1134
01:05:24,060 --> 01:05:28,940
So that the probability of the
future and the past, given
1135
01:05:28,940 --> 01:05:32,350
what's happening now, is equal
to the probability of the
1136
01:05:32,350 --> 01:05:35,970
future, given what's happening
now, times the probability the
1137
01:05:35,970 --> 01:05:38,760
past, given what's
happening now.
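This factorization, past and future conditionally independent given the present, can be verified by brute-force enumeration on a tiny chain. The initial distribution and transition matrix below are arbitrary illustrative numbers.

```python
from itertools import product

# Check P(X0, X2 | X1) = P(X0 | X1) * P(X2 | X1) on a two-state chain.
mu = [0.7, 0.3]                  # initial distribution (illustrative)
P = [[0.8, 0.2], [0.4, 0.6]]     # transition matrix (illustrative)

def joint(x0, x1, x2):
    """Joint probability of the path (x0, x1, x2)."""
    return mu[x0] * P[x0][x1] * P[x1][x2]

for x1 in (0, 1):
    px1 = sum(joint(a, x1, b) for a, b in product((0, 1), repeat=2))
    for x0, x2 in product((0, 1), repeat=2):
        lhs = joint(x0, x1, x2) / px1                          # P(X0, X2 | X1)
        past = sum(joint(x0, x1, b) for b in (0, 1)) / px1     # P(X0 | X1)
        future = sum(joint(a, x1, x2) for a in (0, 1)) / px1   # P(X2 | X1)
        assert abs(lhs - past * future) < 1e-12
```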
1138
01:05:38,760 --> 01:05:41,160
Which may be a more interesting
way of looking at
1139
01:05:41,160 --> 01:05:45,560
past and future and present
than this totally
1140
01:05:45,560 --> 01:05:46,710
asymmetric way here.
1141
01:05:46,710 --> 01:05:50,720
This is a nice, symmetric
way of looking at it.
1142
01:05:50,720 --> 01:05:54,810
And as soon as you see that this
has to be true, then you
1143
01:05:54,810 --> 01:05:58,680
can turn around and write this
the opposite way, and you see
1144
01:05:58,680 --> 01:06:02,365
that the probability of A minus,
given X sub n, and A
1145
01:06:02,365 --> 01:06:04,570
plus is equal to the
probability of A
1146
01:06:04,570 --> 01:06:08,670
minus given X sub n.
1147
01:06:08,670 --> 01:06:12,120
Which says that the probability
of the past, given
1148
01:06:12,120 --> 01:06:15,970
X sub n and the future, is equal
to the probability of
1149
01:06:15,970 --> 01:06:19,560
the past just given X sub n.
1150
01:06:19,560 --> 01:06:24,530
You can go from past to
or you can go from
1151
01:06:24,530 --> 01:06:26,780
future to past.
1152
01:06:26,780 --> 01:06:33,020
And incidentally, if you people
have trouble trying to
1153
01:06:33,020 --> 01:06:35,530
think of the past and
the future as
1154
01:06:35,530 --> 01:06:38,210
being symmetric animals--
1155
01:06:38,210 --> 01:06:39,460
and I do too--
1156
01:06:41,810 --> 01:06:47,090
everything we do with time can
also be done on a line going
1157
01:06:47,090 --> 01:06:49,770
from left to right, or it
can be done on a line
1158
01:06:49,770 --> 01:06:51,210
going from bottom up.
1159
01:06:51,210 --> 01:06:55,150
Going from bottom up, it's
hard to say that this is
1160
01:06:55,150 --> 01:06:56,430
symmetric to this.
1161
01:06:56,430 --> 01:07:00,950
If you look at it on a line
going from left to right, it's
1162
01:07:00,950 --> 01:07:04,950
kind of easy to see that this
is symmetric between left to
1163
01:07:04,950 --> 01:07:07,290
right and right to left.
1164
01:07:07,290 --> 01:07:10,900
So every time you get confused
about these arguments, put
1165
01:07:10,900 --> 01:07:15,180
them on a line and argue right
to left and left to right
1166
01:07:15,180 --> 01:07:18,670
instead of earlier and later.
1167
01:07:18,670 --> 01:07:23,080
Because mathematically, it's
the same thing, but it's
1168
01:07:23,080 --> 01:07:25,950
easier to see these
symmetries.
1169
01:07:25,950 --> 01:07:30,400
And now, if you think of A minus
as being X sub n minus
1170
01:07:30,400 --> 01:07:35,060
1, and you think of A plus as
being X sub n plus 1, X sub n
1171
01:07:35,060 --> 01:07:38,920
plus 2 and so forth up, what
this equation says is that the
1172
01:07:38,920 --> 01:07:43,820
probability of the last state
in the past, given the state
1173
01:07:43,820 --> 01:07:47,180
now and everything in the
future, is equal to the
1174
01:07:47,180 --> 01:07:52,690
probability of the last state
in the past given X sub n.
1175
01:07:52,690 --> 01:07:54,570
Now, this isn't reversibility.
1176
01:07:54,570 --> 01:07:58,350
I'm not saying that these
are special process.
1177
01:07:58,350 --> 01:08:02,640
This is true for any Markov
chain in the world.
1178
01:08:02,640 --> 01:08:05,470
These relationships
are always true.
1179
01:08:05,470 --> 01:08:12,750
This is one reason why many
people view this as the real
1180
01:08:12,750 --> 01:08:17,310
Markov condition, as opposed to
any of these other things.
1181
01:08:17,310 --> 01:08:22,010
They say that three events
have a Markov condition
1182
01:08:22,010 --> 01:08:25,720
between them if there's
one of them which is
1183
01:08:25,720 --> 01:08:27,899
in between the other.
1184
01:08:27,899 --> 01:08:31,590
Where you can say that the
probability of the left one,
1185
01:08:31,590 --> 01:08:34,560
given the middle one, times
the right one, given the
1186
01:08:34,560 --> 01:08:39,250
middle one, is equal to the
probability of the left and
1187
01:08:39,250 --> 01:08:40,550
the right given the middle.
1188
01:08:40,550 --> 01:08:45,859
It says that the past and the
future, given the present, are
1189
01:08:45,859 --> 01:08:48,439
independent of each other.
1190
01:08:48,439 --> 01:08:51,180
It says that as soon as you
know what the present is,
1191
01:08:51,180 --> 01:08:53,250
everything down there
is independent of
1192
01:08:53,250 --> 01:08:56,100
everything up there.
1193
01:08:56,100 --> 01:08:57,465
That's a pretty powerful
condition.
1194
01:09:00,490 --> 01:09:04,390
And you'll see that we can do an
awful lot with it, so it's
1195
01:09:04,390 --> 01:09:06,420
going to be important.
1196
01:09:06,420 --> 01:09:06,770
OK.
1197
01:09:06,770 --> 01:09:11,760
So let's go on with that.
1198
01:09:11,760 --> 01:09:13,109
By Bayes rule--
1199
01:09:13,109 --> 01:09:17,439
and incidentally, this is why
Bayes got into so much trouble
1200
01:09:17,439 --> 01:09:20,229
with the other statisticians
in the world.
1201
01:09:20,229 --> 01:09:22,910
Because the other statisticians
in the world
1202
01:09:22,910 --> 01:09:28,870
really got emotionally upset at
the idea of talking about
1203
01:09:28,870 --> 01:09:31,630
the past given the future.
1204
01:09:31,630 --> 01:09:36,410
That was almost an attack on
their religion as well as all
1205
01:09:36,410 --> 01:09:39,410
the mathematics they knew and
everything else they knew.
1206
01:09:39,410 --> 01:09:44,420
It was really hitting them below
the belt, so to speak.
1207
01:09:44,420 --> 01:09:47,939
So they didn't like this.
1208
01:09:47,939 --> 01:09:51,600
But now, we've recognized that
Bayes' Law is just the
1209
01:09:51,600 --> 01:09:56,540
consequence of the axioms of
probability, and there's
1210
01:09:56,540 --> 01:09:58,330
nothing strange about it.
1211
01:09:58,330 --> 01:10:00,930
You write down these conditional
probabilities and
1212
01:10:00,930 --> 01:10:04,150
that's sitting there,
facing you.
1213
01:10:04,150 --> 01:10:08,950
But what it says here is that
the probability of the state
1214
01:10:08,950 --> 01:10:13,890
at time n minus 1, given the
state of time n, is equal to
1215
01:10:13,890 --> 01:10:17,280
the probability of the state
of time n, given n minus 1,
1216
01:10:17,280 --> 01:10:21,860
times the probability of X
n minus 1 divided by the
1217
01:10:21,860 --> 01:10:23,260
probability of X n.
1218
01:10:23,260 --> 01:10:26,450
In other words, you put this
over in this side, and it says
1219
01:10:26,450 --> 01:10:30,310
the probability of X n times the
probability of X n minus 1
1220
01:10:30,310 --> 01:10:33,870
given X n is that probability
up there.
1221
01:10:33,870 --> 01:10:37,780
It says that the probability
of A given B times the
1222
01:10:37,780 --> 01:10:40,870
probability of B is equal to the
probability of A times the
1223
01:10:40,870 --> 01:10:45,920
probability of B given A. And
that's just the definition of
1224
01:10:45,920 --> 01:10:49,930
a conditional probability,
nothing more.
1225
01:10:49,930 --> 01:10:50,630
OK.
1226
01:10:50,630 --> 01:10:54,950
If the forward chain is in
a steady state, then the
1227
01:10:54,950 --> 01:10:59,610
probability that X sub n minus
1 equals j, given X sub n
1228
01:10:59,610 --> 01:11:06,670
equals i, is P sub j i times pi sub
j divided by pi sub i.
1229
01:11:06,670 --> 01:11:10,510
These probabilities become
just probabilities which
1230
01:11:10,510 --> 01:11:13,990
depend on i but not on n.
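A quick numerical sketch of these backward transition probabilities, P star sub ij equal to P sub ji times pi sub j over pi sub i, on an illustrative two-state chain in steady state:

```python
# Backward chain for a two-state chain in steady state.
P = [[0.9, 0.1], [0.3, 0.7]]     # forward transition matrix (illustrative)
pi = [0.75, 0.25]                # stationary: 0.75*0.9 + 0.25*0.3 = 0.75

# P*_{ij} = P(X_{n-1} = j | X_n = i) = P_{ji} * pi_j / pi_i.
Pstar = [[P[j][i] * pi[j] / pi[i] for j in range(2)] for i in range(2)]

# Each backward row is a genuine probability distribution...
for row in Pstar:
    assert abs(sum(row) - 1.0) < 1e-12
# ...and pi is stationary for the backward chain as well.
for j in range(2):
    assert abs(sum(pi[i] * Pstar[i][j] for i in range(2)) - pi[j]) < 1e-12
```

The row sums work out because sum over j of P sub ji times pi sub j is just (pi P) sub i, which equals pi sub i in steady state.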
1231
01:11:13,990 --> 01:11:17,550
Now what's going on here is
when you look at this
1232
01:11:17,550 --> 01:11:24,370
equation, it looks peculiar
because normally with a Markov
1233
01:11:24,370 --> 01:11:29,360
chain, we start out at time 0
with some assumed probability
1234
01:11:29,360 --> 01:11:31,020
distribution.
1235
01:11:31,020 --> 01:11:34,340
And as soon as you start out
with some assumed probability
1236
01:11:34,340 --> 01:11:42,890
distribution at time 0 and you
start talking about the past
1237
01:11:42,890 --> 01:11:47,830
condition on the future,
it gets very sticky.
1238
01:11:47,830 --> 01:11:53,550
Because when you talk about
the past condition on the
1239
01:11:53,550 --> 01:11:57,170
future, you can only go back to
time equals 0, and you know
1240
01:11:57,170 --> 01:11:59,650
what's happening down there
because you have some
1241
01:11:59,650 --> 01:12:04,080
established probabilities
at 0.
1242
01:12:04,080 --> 01:12:10,330
So what it says is in this
equation here, it says that
1243
01:12:10,330 --> 01:12:14,950
the Markov chain, defined
by this rule--
1244
01:12:14,950 --> 01:12:16,590
I guess I ought to back
and look at the
1245
01:12:16,590 --> 01:12:18,160
previous slide for that.
1246
01:12:22,900 --> 01:12:28,050
This is saying the probability
of the state at time n minus
1247
01:12:28,050 --> 01:12:32,300
1, conditional on the entire
future, is equal to the
1248
01:12:32,300 --> 01:12:36,020
probability of X sub n minus
1 just given X sub n.
1249
01:12:36,020 --> 01:12:39,340
This is the Markov condition,
but it's the Markov condition
1250
01:12:39,340 --> 01:12:40,610
turned around.
1251
01:12:40,610 --> 01:12:45,160
Usually we talk about the next
state given the previous state
1252
01:12:45,160 --> 01:12:46,770
and everything before that.
1253
01:12:46,770 --> 01:12:49,740
Here, we're talking about
the previous state given
1254
01:12:49,740 --> 01:12:51,720
everything after that.
1255
01:12:51,720 --> 01:12:55,820
So this really is the Markov
condition on what we might
1256
01:12:55,820 --> 01:12:58,060
view as a backward chain.
1257
01:12:58,060 --> 01:13:05,040
But to be a Markov chain, these
transition probabilities
1258
01:13:05,040 --> 01:13:08,230
have to be independent of n.
1259
01:13:08,230 --> 01:13:11,360
The transition probabilities
are not going to be
1260
01:13:11,360 --> 01:13:16,690
independent of n if you have
these arbitrary probabilities
1261
01:13:16,690 --> 01:13:19,295
at time 0 lousing
everything up.
1262
01:13:19,295 --> 01:13:22,200
So you can get around
this in two ways.
1263
01:13:22,200 --> 01:13:25,490
One way to get around it is
to say let's restrict our
1264
01:13:25,490 --> 01:13:31,390
attention to positive recurrent
processes which are
1265
01:13:31,390 --> 01:13:33,660
starting out in a
steady state.
1266
01:13:33,660 --> 01:13:36,740
And if we start out in a steady
state, then these
1267
01:13:36,740 --> 01:13:38,030
probabilities here--
1268
01:13:41,180 --> 01:13:44,400
looking at these probabilities
here--
1269
01:13:44,400 --> 01:13:47,270
if you go from here down to
here, you'll find out that
1270
01:13:47,270 --> 01:13:49,720
this does not depend on n.
1271
01:13:49,720 --> 01:13:53,740
And if you have an initial state
which is something other
1272
01:13:53,740 --> 01:13:57,800
than steady state, then these
will depend on n.
1273
01:13:57,800 --> 01:14:03,340
Let me put this down in
the next chain up.
1274
01:14:03,340 --> 01:14:08,780
The probability of X sub n minus
1 given X sub n is going
1275
01:14:08,780 --> 01:14:14,410
to be independent of n if this
is independent of n, which it
1276
01:14:14,410 --> 01:14:18,070
is, because we have a
homogeneous Markov chain.
1277
01:14:18,070 --> 01:14:21,490
And this is independent of n and
this is independent of n.
1278
01:14:21,490 --> 01:14:27,380
Now, this will just be pi sub
i, the probability that X
1279
01:14:27,380 --> 01:14:29,730
sub n minus 1 is equal to i.
1280
01:14:29,730 --> 01:14:33,540
And this will be pi sub i, the
probability that X sub n is
1281
01:14:33,540 --> 01:14:34,430
equal to i.
1282
01:14:34,430 --> 01:14:40,810
So this and this will be
independent of n if in fact we
1283
01:14:40,810 --> 01:14:42,730
start out in a steady state.
1284
01:14:42,730 --> 01:14:44,580
Otherwise, it won't be.
1285
01:14:44,580 --> 01:14:49,500
So what we're doing here is we
normally think of a Markov
1286
01:14:49,500 --> 01:14:54,050
chain starting out at time 0
because how else can you get
1287
01:14:54,050 --> 01:14:55,950
it started?
1288
01:14:55,950 --> 01:15:00,310
And we think of it in forward
time, and then we say, well,
1289
01:15:00,310 --> 01:15:02,580
we want to make it homogeneous,
because we want
1290
01:15:02,580 --> 01:15:06,100
to make it always do the same
thing in the future otherwise
1291
01:15:06,100 --> 01:15:09,290
it doesn't really look much
like a Markov chain.
1292
01:15:09,290 --> 01:15:12,610
So what we're saying is that
this backward chain-- we have
1293
01:15:12,610 --> 01:15:15,680
backward probabilities
defined now--
1294
01:15:15,680 --> 01:15:20,870
the backward probabilities are
homogeneous if the forward
1295
01:15:20,870 --> 01:15:23,520
probabilities start
in a steady state.
1296
01:15:23,520 --> 01:15:26,600
You could probably make a
similar statement but say the
1297
01:15:26,600 --> 01:15:30,290
forward probabilities are
homogeneous if the backward
1298
01:15:30,290 --> 01:15:32,070
probabilities start
in a steady state.
1299
01:15:32,070 --> 01:15:34,510
But I don't know when you're
going to start.
1300
01:15:34,510 --> 01:15:38,160
You're going to have to start it
sometime in the future, and
1301
01:15:38,160 --> 01:15:40,880
that gets too philosophical
to understand.
1302
01:15:40,880 --> 01:15:42,130
OK.
1303
01:15:43,980 --> 01:15:46,730
If we think of the chain as
starting in a steady state at
1304
01:15:46,730 --> 01:15:50,150
time minus infinity, these are
also the equations of the
1305
01:15:50,150 --> 01:15:52,250
homogeneous Markov chain.
1306
01:15:52,250 --> 01:15:54,930
We can start at time minus
infinity wherever we want to--
1307
01:15:54,930 --> 01:15:56,550
it doesn't make any
difference--
1308
01:15:56,550 --> 01:15:59,780
because by the time we get to
state 0, we will be in steady
1309
01:15:59,780 --> 01:16:03,380
state, and the whole range of
where we want to look at
1310
01:16:03,380 --> 01:16:06,020
things will be in
steady state.
1311
01:16:06,020 --> 01:16:07,360
OK.
1312
01:16:07,360 --> 01:16:12,480
So aside from this issue about
starting at 0 and steady state
1313
01:16:12,480 --> 01:16:17,420
and things like that, what we've
really shown here is
1314
01:16:17,420 --> 01:16:21,810
that you can look at a Markov
chain either going forward or
1315
01:16:21,810 --> 01:16:23,840
going backward.
1316
01:16:23,840 --> 01:16:28,180
Or look at it going rightward
or going leftward.
1317
01:16:28,180 --> 01:16:31,510
And that's really pretty
important.
1318
01:16:31,510 --> 01:16:32,210
OK.
1319
01:16:32,210 --> 01:16:34,380
That still doesn't say anything
about it being
1320
01:16:34,380 --> 01:16:36,960
reversible.
1321
01:16:36,960 --> 01:16:39,250
What reversibility is--
1322
01:16:39,250 --> 01:16:42,050
it comes from looking at
this equation here.
1323
01:16:42,050 --> 01:16:45,010
This says what the transition
probabilities are, going
1324
01:16:45,010 --> 01:16:49,570
backwards, and this
is the transition
1325
01:16:49,570 --> 01:16:51,450
probabilities going forward.
1326
01:16:51,450 --> 01:16:53,140
These are the steady state
probabilities.
1327
01:16:55,730 --> 01:17:05,540
And if we define P star of ji
as a backward transition
1328
01:17:05,540 --> 01:17:06,320
probability--
1329
01:17:06,320 --> 01:17:10,750
namely, the probability that at
this time we're in state j--
1330
01:17:10,750 --> 01:17:14,430
given that in this next time,
which to us, is the previous
1331
01:17:14,430 --> 01:17:18,510
time, we're in state i, is the
probability of going in a
1332
01:17:18,510 --> 01:17:21,270
backward direction
from j to i.
1333
01:17:24,970 --> 01:17:29,105
This gets into whether this is
P star of ij or P star of ji.
1334
01:17:29,105 --> 01:17:32,770
But I did check it carefully,
so it has to be right.
1335
01:17:32,770 --> 01:17:39,400
So anyway, when you substitute
this in for this, the
1336
01:17:39,400 --> 01:17:44,630
conditions that you get is pi
sub i times P star of ij is
1337
01:17:44,630 --> 01:17:48,040
equal to pi j times P of ji.
1338
01:17:48,040 --> 01:17:50,515
These are the same equations
that we had for
1339
01:17:50,515 --> 01:17:52,380
a birth-death chain.
1340
01:17:52,380 --> 01:17:55,490
But now, we're not talking
about birth-death chains.
1341
01:17:55,490 --> 01:17:59,710
Now we're talking about
any old chain.
1342
01:17:59,710 --> 01:18:00,533
Yeah?
1343
01:18:00,533 --> 01:18:04,397
AUDIENCE: Doesn't this only
make sense for positive
1344
01:18:04,397 --> 01:18:09,196
recurrent chains?
1345
01:18:09,196 --> 01:18:11,591
PROFESSOR: Yes.
1346
01:18:11,591 --> 01:18:12,070
Sorry.
1347
01:18:12,070 --> 01:18:15,450
I should keep emphasizing that,
because it only makes
1348
01:18:15,450 --> 01:18:18,450
sense when you can define the
steady state probabilities.
1349
01:18:18,450 --> 01:18:19,850
Yes.
1350
01:18:19,850 --> 01:18:24,130
The steady state probabilities
are necessary in order to even
1351
01:18:24,130 --> 01:18:26,720
define this P star of ji.
1352
01:18:26,720 --> 01:18:31,070
But once you have that steady
state condition, and once you
1353
01:18:31,070 --> 01:18:33,770
know what the steady state
probabilities are, then you
1354
01:18:33,770 --> 01:18:35,970
can calculate backward
probabilities, you can
1355
01:18:35,970 --> 01:18:39,540
calculate forward probabilities,
and this is a
1356
01:18:39,540 --> 01:18:42,300
very simple relationship
that they satisfy.
1357
01:18:42,300 --> 01:18:47,260
It makes sense because this
is a normal form.
1358
01:18:47,260 --> 01:18:51,420
You look at state transition
probabilities and you look at
1359
01:18:51,420 --> 01:18:53,960
the probability of being in
one state and then the
1360
01:18:53,960 --> 01:18:57,250
probability of going
to the next state.
1361
01:18:57,250 --> 01:18:59,680
And the question is: is the
next state back there
1362
01:18:59,680 --> 01:19:01,020
or is it over there?
1363
01:19:01,020 --> 01:19:06,130
And if it's a star, then it
means it's back there.
1364
01:19:06,130 --> 01:19:11,670
And then we define a chain as
being reversible if P star of
1365
01:19:11,670 --> 01:19:18,370
ij is equal to P sub ij,
for all i and all j.
1366
01:19:18,370 --> 01:19:21,340
And what that means is that
all birth-death chains are
1367
01:19:21,340 --> 01:19:22,590
reversible.
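[Editor's note: the claim that all birth-death chains are reversible can be seen concretely. A minimal sketch with assumed up and down probabilities p and q and a 5-state truncation, all illustrative rather than from the lecture: detailed balance pi sub i times P of i, i+1 equals pi sub i+1 times P of i+1, i forces the backward probabilities P star to coincide with the forward ones.]

```python
import numpy as np

# A 5-state birth-death chain with assumed probabilities (illustrative):
# up with probability p, down with probability q, self-loop otherwise.
p, q, n = 0.3, 0.5, 5
P = np.zeros((n, n))
for i in range(n):
    if i < n - 1:
        P[i, i + 1] = p
    if i > 0:
        P[i, i - 1] = q
    P[i, i] = 1.0 - P[i].sum()

# Detailed balance pi_i * P_{i,i+1} = pi_{i+1} * P_{i+1,i} gives the
# recursion pi_{i+1} = pi_i * (p / q); then normalize.
pi = np.array([(p / q) ** i for i in range(n)])
pi /= pi.sum()

# Backward transition probabilities P*_{ij} = pi_j * P_{ji} / pi_i.
# For a birth-death chain they come out equal to the forward ones:
# the chain is reversible.
P_star = (P.T * pi[None, :]) / pi[:, None]
```

Checking P_star against P entry by entry is exactly the statement that P star of ij equals P sub ij for all i and j.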
1368
01:19:24,640 --> 01:19:26,570
And now let me show you
what that means.
1369
01:19:29,970 --> 01:19:34,290
If we look at arrivals and
departures for a birth-death
1370
01:19:34,290 --> 01:19:38,610
chain, sometimes you go in a
self loop, so you don't go up
1371
01:19:38,610 --> 01:19:40,340
and you don't go down.
1372
01:19:40,340 --> 01:19:43,350
Other times you either
go down or you go up.
1373
01:19:43,350 --> 01:19:44,940
We have arrivals coming in.
1374
01:19:44,940 --> 01:19:49,250
Arrivals correspond to upward
transitions, departures
1375
01:19:49,250 --> 01:19:52,930
correspond to downward
transitions so that when you
1376
01:19:52,930 --> 01:19:58,020
look at it in a normal way,
you start out at time 0.
1377
01:19:58,020 --> 01:19:59,710
You're in state 0.
1378
01:19:59,710 --> 01:20:02,800
You have an arrival.
1379
01:20:02,800 --> 01:20:04,850
Nothing happens for a while.
1380
01:20:04,850 --> 01:20:07,970
You have another arrival, but
this time, you have a
1381
01:20:07,970 --> 01:20:12,110
departure, you have another
departure, and you wind up in
1382
01:20:12,110 --> 01:20:13,360
state 0 again.
1383
01:20:16,390 --> 01:20:21,010
As far as states are concerned,
you go from
1384
01:20:21,010 --> 01:20:23,440
state 0 to state 1.
1385
01:20:23,440 --> 01:20:25,460
You stay in state 1.
1386
01:20:25,460 --> 01:20:27,360
In other words, this is the
difference between arrivals
1387
01:20:27,360 --> 01:20:27,510
and departures.
1388
01:20:27,510 --> 01:20:29,570
This is what the state is.
1389
01:20:29,570 --> 01:20:31,440
You stay in state 1.
1390
01:20:31,440 --> 01:20:34,440
Then you go up and you get
another arrival, you get a
1391
01:20:34,440 --> 01:20:37,160
departure, and then you
get a departure,
1392
01:20:37,160 --> 01:20:38,790
according to this chain.
1393
01:20:38,790 --> 01:20:43,070
Now, let's look at it
coming in this way.
1394
01:20:43,070 --> 01:20:45,480
When we look at it
coming backwards
1395
01:20:45,480 --> 01:20:47,970
in time, what happens?
1396
01:20:47,970 --> 01:20:51,450
We're going along here,
we're in state 0,
1397
01:20:51,450 --> 01:20:55,070
suddenly we move up.
1398
01:20:55,070 --> 01:21:01,130
If we want to view this as a
backward moving Markov chain,
1399
01:21:01,130 --> 01:21:04,660
this corresponds to an
arrival of something.
1400
01:21:04,660 --> 01:21:07,470
This corresponds to
another arrival.
1401
01:21:07,470 --> 01:21:09,710
This corresponds
to a departure.
1402
01:21:09,710 --> 01:21:12,720
We go along here with nothing
else happening, we get another
1403
01:21:12,720 --> 01:21:16,180
departure, and there we are back
in a steady state again.
1404
01:21:19,210 --> 01:21:23,280
And for any birth-death
chain, we can do this.
1405
01:21:23,280 --> 01:21:29,120
Because any birth-death chain
we can look at as an arrival
1406
01:21:29,120 --> 01:21:31,100
and departure process.
1407
01:21:31,100 --> 01:21:34,140
We have arrivals, we
have departures--
1408
01:21:34,140 --> 01:21:35,900
we might have states
that go negative.
1409
01:21:35,900 --> 01:21:39,450
That would be rather awkward,
but we can have that.
1410
01:21:39,450 --> 01:21:45,330
But now, if we know that these
steady state probabilities
1411
01:21:45,330 --> 01:21:48,510
govern the probability of
arrivals and the probabilities
1412
01:21:48,510 --> 01:21:53,620
of departures, if we know that
we have reversibility, then
1413
01:21:53,620 --> 01:21:57,360
these events here have the
same probability as these
1414
01:21:57,360 --> 01:21:59,620
events here.
1415
01:21:59,620 --> 01:22:02,800
It means that when we look at
things going from right back
1416
01:22:02,800 --> 01:22:07,140
to left, the
things that we viewed as
1417
01:22:07,140 --> 01:22:11,170
departures here look
like arrivals.
1418
01:22:11,170 --> 01:22:13,730
And what we're going to do next
time is use that to prove
1419
01:22:13,730 --> 01:22:19,950
Burke's Theorem, which says,
using this idea, that if you
1420
01:22:19,950 --> 01:22:25,210
look at the process of
departures in a birth-death
1421
01:22:25,210 --> 01:22:29,130
chain where arrivals are all
with probability P and
1422
01:22:29,130 --> 01:22:32,590
departures are all with
probability Q, then you get
1423
01:22:32,590 --> 01:22:37,600
this nice set of probabilities
for arrivals and departures.
1424
01:22:37,600 --> 01:22:41,380
Arrivals are independent
of everything else--
1425
01:22:41,380 --> 01:22:43,960
same probability at every
unit of time.
1426
01:22:43,960 --> 01:22:45,950
Departures are the same way.
1427
01:22:45,950 --> 01:22:49,680
But when you're looking from
left to right, you can only
1428
01:22:49,680 --> 01:22:54,000
get departures when your state
is greater than 0.
1429
01:22:54,000 --> 01:22:58,680
When you're coming in looking
this way, these things that
1430
01:22:58,680 --> 01:23:03,220
looked like departures before
are looking like arrivals.
1431
01:23:03,220 --> 01:23:08,240
These arrivals form a Bernoulli
process, and the
1432
01:23:08,240 --> 01:23:13,110
Bernoulli process says that
given the future, the
1433
01:23:13,110 --> 01:23:17,630
probability of a departure,
at any instant of time, is
1434
01:23:17,630 --> 01:23:18,995
independent of everything
in the future.
1435
01:23:23,120 --> 01:23:25,920
Now, that is not intuitive.
1436
01:23:25,920 --> 01:23:28,960
If you think it's intuitive,
go back and think again.
1437
01:23:28,960 --> 01:23:30,550
Because it's not.
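[Editor's note: the flavor of Burke's theorem previewed here can at least be sanity-checked by simulation. A minimal sketch with an assumed slot structure and illustrative values p = 0.3, q = 0.5, none of which come from the lecture: in steady state the long-run departure rate must match the arrival rate p, even though departures can only occur when the state is positive.]

```python
import random

# A toy sampled-time birth-death (queueing) chain; p, q, and the slot
# structure are assumptions for illustration.  In each slot an arrival
# occurs with probability p; otherwise, if the state is positive, a
# departure occurs with probability q.  Stability needs (1 - p) * q > p.
random.seed(1)
p, q = 0.3, 0.5
state = 0
departures = 0
steps = 200_000
for _ in range(steps):
    if random.random() < p:
        state += 1          # upward transition: an arrival
    elif state > 0 and random.random() < q:
        state -= 1          # downward transition: a departure
        departures += 1

# Flow balance in steady state: every arrival eventually departs, so
# the empirical departure rate approaches the arrival rate p.
print(departures / steps)
```

The simulation shows the rate statement only; the stronger claim in the lecture, that the reversed departure process is itself Bernoulli and independent of the future, is what gets proved via reversibility next time.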
1438
01:23:30,550 --> 01:23:33,350
But anyway, I'm going
to stop here.
1439
01:23:33,350 --> 01:23:35,400
You have this to mull over.
1440
01:23:35,400 --> 01:23:38,500
We will try to sort it out
a little more next time.
1441
01:23:38,500 --> 01:23:40,880
This is something that's going
to take your cooperation to
1442
01:23:40,880 --> 01:23:42,130
sort it out, also.