1
00:00:00,000 --> 00:00:00,550
2
00:00:00,550 --> 00:00:01,400
Hey, everyone.
3
00:00:01,400 --> 00:00:02,730
Welcome back.
4
00:00:02,730 --> 00:00:04,970
Today, we're going to do another
fun problem that has
5
00:00:04,970 --> 00:00:07,830
to do with a random number
of coin flips.
6
00:00:07,830 --> 00:00:11,340
So the experiment we're going
to run is as follows.
7
00:00:11,340 --> 00:00:14,300
We're given a fair six-sided
die, and we roll it.
8
00:00:14,300 --> 00:00:17,240
And then we take a fair coin,
and we flip it the number of
9
00:00:17,240 --> 00:00:19,640
times indicated by the die.
10
00:00:19,640 --> 00:00:23,520
That is to say, if I roll a four
on my die, then I flip
11
00:00:23,520 --> 00:00:25,700
the coin four times.
12
00:00:25,700 --> 00:00:29,160
And then we're interested in
some statistics regarding the
13
00:00:29,160 --> 00:00:31,820
number of heads that show
up in our sequence.
14
00:00:31,820 --> 00:00:34,730
In particular, we want to
compute the expectation and
15
00:00:34,730 --> 00:00:38,460
the variance of the number
of heads that we see.
16
00:00:38,460 --> 00:00:41,310
So the first step of this
problem is to translate the
17
00:00:41,310 --> 00:00:42,660
English to the math.
18
00:00:42,660 --> 00:00:45,540
So we have to define
some notation.
19
00:00:45,540 --> 00:00:48,250
I went ahead and did
that for us.
20
00:00:48,250 --> 00:00:52,170
I defined n to be the outcome
of the die roll.
21
00:00:52,170 --> 00:00:54,830
Now, since we flip the coin the
number of times shown by
22
00:00:54,830 --> 00:00:57,830
the die roll, n is equivalently
the number of
23
00:00:57,830 --> 00:00:59,840
flips that we perform.
24
00:00:59,840 --> 00:01:02,990
And n, of course, is a random
variable, and I've written its
25
00:01:02,990 --> 00:01:04,599
PMF up here.
26
00:01:04,599 --> 00:01:08,100
So p_N of n is just the PMF of a
discrete uniform random variable
27
00:01:08,100 --> 00:01:11,610
between 1 and 6, because we're
told that the die has six
28
00:01:11,610 --> 00:01:14,790
sides and that it's fair.
29
00:01:14,790 --> 00:01:17,990
Now, I also defined h
to be the number of
30
00:01:17,990 --> 00:01:19,440
heads that we see.
31
00:01:19,440 --> 00:01:22,000
So that's the quantity
of interest.
32
00:01:22,000 --> 00:01:24,600
And it turns out that Bernoulli
random variables
33
00:01:24,600 --> 00:01:27,260
will be very helpful to
us in this problem.
34
00:01:27,260 --> 00:01:32,820
So I defined x sub i as
1 if the ith flip is
35
00:01:32,820 --> 00:01:35,010
heads, and 0 otherwise.
36
00:01:35,010 --> 00:01:37,440
And what we're going to do now
is, we're going to use these x
37
00:01:37,440 --> 00:01:41,920
sub i's to come up with
an expression for h.
38
00:01:41,920 --> 00:01:45,010
So if you want to count the
number of heads, one possible
39
00:01:45,010 --> 00:01:49,990
thing you could do is start with
0 and then look at the
40
00:01:49,990 --> 00:01:51,470
first coin flip.
41
00:01:51,470 --> 00:01:54,990
If it's heads, you add 1 to 0,
which I'm going to call your
42
00:01:54,990 --> 00:01:56,270
running sum.
43
00:01:56,270 --> 00:02:00,020
If the first flip is
tails, you add 0.
44
00:02:00,020 --> 00:02:03,230
And similarly, after that, after
every trial, if you see
45
00:02:03,230 --> 00:02:05,270
heads, you add 1 to
your running sum.
46
00:02:05,270 --> 00:02:07,590
If you see a tails, you add 0.
47
00:02:07,590 --> 00:02:11,440
And in that way, we can
precisely compute h.
48
00:02:11,440 --> 00:02:17,090
So the mathematical statement of
what I just said is that h
49
00:02:17,090 --> 00:02:23,650
is equal to x1 plus x2
plus x3, all the way
50
00:02:23,650 --> 00:02:27,280
through x sub n.
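[The running-sum idea above translates directly into a few lines of Python. This sketch is not from the video; it just simulates one run of the experiment with illustrative variable names.]

```python
import random

random.seed(1)

# One run of the experiment: roll a fair die, then flip a fair coin
# that many times, counting heads as a running sum of 0/1 indicators.
n = random.randint(1, 6)                      # die roll = number of flips
x = [random.randint(0, 1) for _ in range(n)]  # x[i] = 1 if flip i is heads
h = sum(x)                                    # h = x1 + x2 + ... + xn

print(n, h)
```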
51
00:02:27,280 --> 00:02:32,170
So now, we are interested
in computing e of h, the
52
00:02:32,170 --> 00:02:34,150
expectation of h.
53
00:02:34,150 --> 00:02:39,280
So your knee-jerk reaction might
be to say, oh, well, by
54
00:02:39,280 --> 00:02:42,760
linearity of expectation,
we know that this is an
55
00:02:42,760 --> 00:02:47,710
expectation of x1, et cetera
through the expectation of xn.
56
00:02:47,710 --> 00:02:51,000
But in this case, you would
actually be wrong.
57
00:02:51,000 --> 00:02:52,710
Don't do that.
58
00:02:52,710 --> 00:02:56,110
And the reason that this is not
going to work for us is
59
00:02:56,110 --> 00:03:00,250
because we're dealing
with a random
60
00:03:00,250 --> 00:03:02,600
number of random variables.
61
00:03:02,600 --> 00:03:05,050
So each xi is a random
variable.
62
00:03:05,050 --> 00:03:07,600
And we have capital n of them.
63
00:03:07,600 --> 00:03:10,240
But capital n is a
random variable.
64
00:03:10,240 --> 00:03:12,720
It denotes the outcome
of our die roll.
65
00:03:12,720 --> 00:03:17,100
So we actually cannot just
take the sum of these
66
00:03:17,100 --> 00:03:18,420
expectations.
67
00:03:18,420 --> 00:03:21,790
Instead, we're going to have
to condition on n and use
68
00:03:21,790 --> 00:03:24,080
iterated expectation.
69
00:03:24,080 --> 00:03:29,630
So this is the mathematical
statement of what I just said.
70
00:03:29,630 --> 00:03:34,390
And the reason why this works
is because conditioning on n
71
00:03:34,390 --> 00:03:36,970
will take us to the case that
we already know how to deal
72
00:03:36,970 --> 00:03:40,540
with, where we have a known
number of random variables.
73
00:03:40,540 --> 00:03:43,460
And of course, iterated
expectations holds, as you saw
74
00:03:43,460 --> 00:03:45,360
in lecture.
75
00:03:45,360 --> 00:03:48,310
I will briefly mention here that
the formula we're going
76
00:03:48,310 --> 00:03:50,620
to derive is derived
in the book.
77
00:03:50,620 --> 00:03:52,270
And it was probably derived
in lecture.
78
00:03:52,270 --> 00:03:54,850
So if you want, you can just
go to that formula
79
00:03:54,850 --> 00:03:55,790
immediately.
80
00:03:55,790 --> 00:03:58,730
But I think the derivation of
the formula that we need is
81
00:03:58,730 --> 00:04:00,760
quick and is helpful.
82
00:04:00,760 --> 00:04:02,010
So I'm going to go through
it quickly.
83
00:04:02,010 --> 00:04:07,060
84
00:04:07,060 --> 00:04:08,440
Let's do it over here.
85
00:04:08,440 --> 00:04:11,650
Plugging in our running
sum for h, we get this
86
00:04:11,650 --> 00:04:12,670
expression--
87
00:04:12,670 --> 00:04:20,010
x1 plus x2 et cetera plus
xn, conditioned on n.
88
00:04:20,010 --> 00:04:24,940
And this, of course,
is n times the
89
00:04:24,940 --> 00:04:27,830
expectation of x sub i.
90
00:04:27,830 --> 00:04:30,890
So again, I'm going through
this quickly,
91
00:04:30,890 --> 00:04:32,420
because it's in the book.
92
00:04:32,420 --> 00:04:37,560
But this step holds, because
each of these xi's has the
93
00:04:37,560 --> 00:04:38,410
same statistics.
94
00:04:38,410 --> 00:04:41,900
They're all Bernoulli with
parameter of 1/2, because our
95
00:04:41,900 --> 00:04:43,840
coin is fair.
96
00:04:43,840 --> 00:04:49,660
And so I used x sub i to say it
doesn't really matter which
97
00:04:49,660 --> 00:04:52,317
integer you pick for i, because
the expectation of xi
98
00:04:52,317 --> 00:04:55,470
is the same for all i.
99
00:04:55,470 --> 00:05:00,360
So this now, the expectation
of x sub i, this is just a
100
00:05:00,360 --> 00:05:03,020
number, it's just some constant,
so you can pull it
101
00:05:03,020 --> 00:05:04,860
out of the expectation.
102
00:05:04,860 --> 00:05:08,310
So you get the expectation
of x sub i times the
103
00:05:08,310 --> 00:05:10,970
expectation of n.
104
00:05:10,970 --> 00:05:15,540
So I gave away the answer
to this a second ago.
105
00:05:15,540 --> 00:05:19,360
But x sub i is just a Bernoulli
random variable with
106
00:05:19,360 --> 00:05:22,400
parameter of success of 1/2.
107
00:05:22,400 --> 00:05:25,130
And we know already that the
expectation of such a random
108
00:05:25,130 --> 00:05:28,680
variable is just p, or 1/2.
109
00:05:28,680 --> 00:05:32,860
So this is 1/2 times
expectation of n.
110
00:05:32,860 --> 00:05:34,720
And now n we know is a discrete
111
00:05:34,720 --> 00:05:36,900
uniform random variable.
112
00:05:36,900 --> 00:05:40,510
And there's a formula that I'm
going to use, which hopefully
113
00:05:40,510 --> 00:05:42,270
some of you may remember.
114
00:05:42,270 --> 00:05:46,930
If you have a discrete uniform
random variable that takes on
115
00:05:46,930 --> 00:05:49,605
values between a and b--
116
00:05:49,605 --> 00:05:53,350
117
00:05:53,350 --> 00:05:54,640
let's use w--
118
00:05:54,640 --> 00:05:57,960
if you call this random variable
w, then we have that
119
00:05:57,960 --> 00:06:04,890
the variance of w is equal to b
minus a times b minus a plus
120
00:06:04,890 --> 00:06:07,310
2 divided by 12.
121
00:06:07,310 --> 00:06:08,620
So that's the variance.
122
00:06:08,620 --> 00:06:10,630
We don't actually need the
variance, but we will need
123
00:06:10,630 --> 00:06:11,570
this later.
124
00:06:11,570 --> 00:06:15,470
And the expectation of w--
125
00:06:15,470 --> 00:06:17,520
actually, let's just
do it up here right
126
00:06:17,520 --> 00:06:19,000
ahead for this problem.
127
00:06:19,000 --> 00:06:21,930
Because we have a discrete
uniform random variable, the
128
00:06:21,930 --> 00:06:23,980
expectation is just
the middle.
129
00:06:23,980 --> 00:06:28,780
So you agree hopefully that the
middle is right at 3.5,
130
00:06:28,780 --> 00:06:31,020
which is also 7/2.
131
00:06:31,020 --> 00:06:34,330
So this is times 7/2, which
is equal to 7/4.
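[As a quick check of the arithmetic just derived, not part of the video: exact computation of E[h] = E[x_i] * E[n] using Python's fractions module.]

```python
from fractions import Fraction

# E[X_i] for a fair coin: Bernoulli with p = 1/2
e_x = Fraction(1, 2)

# E[N] for a fair die, discrete uniform on {1, ..., 6}: (a + b) / 2
a, b = 1, 6
e_n = Fraction(a + b, 2)  # 7/2

# Iterated expectation: E[H] = E[E[H | N]] = E[N * E[X_i]] = E[X_i] * E[N]
e_h = e_x * e_n
print(e_h)  # 7/4
```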
132
00:06:34,330 --> 00:06:37,140
133
00:06:37,140 --> 00:06:39,550
So we are done with
part of part a.
134
00:06:39,550 --> 00:06:42,915
I'm going to write this answer
over here, so I can erase.
135
00:06:42,915 --> 00:06:47,260
136
00:06:47,260 --> 00:06:49,810
And we're going to do something
very similar to
137
00:06:49,810 --> 00:06:51,930
compute the variance.
138
00:06:51,930 --> 00:06:54,810
To compute the variance,
we are going to also
139
00:06:54,810 --> 00:06:55,840
condition on n.
140
00:06:55,840 --> 00:06:58,320
So we get rid of this source
of randomness.
141
00:06:58,320 --> 00:07:00,830
And then we're going to use law
of total variance, which
142
00:07:00,830 --> 00:07:04,050
you've also seen in lecture.
143
00:07:04,050 --> 00:07:06,800
And again, the formula
for this variance is
144
00:07:06,800 --> 00:07:08,030
derived in the book.
145
00:07:08,030 --> 00:07:10,770
So I'm going to go through
it quickly.
146
00:07:10,770 --> 00:07:13,690
But make sure you understand
this derivation, because it
147
00:07:13,690 --> 00:07:17,350
exercises a lot of stuff
we taught you.
148
00:07:17,350 --> 00:07:22,840
So this, just using law of
total variance, is the
149
00:07:22,840 --> 00:07:27,880
variance of expectation of h
given n, plus the expectation
150
00:07:27,880 --> 00:07:32,540
of the variance of h given n.
151
00:07:32,540 --> 00:07:37,470
And now, plugging in
this running sum
152
00:07:37,470 --> 00:07:42,520
for h, you get this.
153
00:07:42,520 --> 00:07:43,490
It's a mouthful to write.
154
00:07:43,490 --> 00:07:46,860
Bear with me.
155
00:07:46,860 --> 00:07:51,780
x1 through xn given n--
156
00:07:51,780 --> 00:07:53,020
so I didn't do anything fancy.
157
00:07:53,020 --> 00:07:56,040
I just plugged this into here.
158
00:07:56,040 --> 00:07:58,680
So this term is similar
to what we saw
159
00:07:58,680 --> 00:08:01,230
in a previous problem.
160
00:08:01,230 --> 00:08:04,490
By linearity of expectation and
due to the fact that all
161
00:08:04,490 --> 00:08:07,190
of the xi's are distributed in
the same way, they have the
162
00:08:07,190 --> 00:08:11,130
same expectation, this
becomes n times the
163
00:08:11,130 --> 00:08:14,940
expectation of x sub i.
164
00:08:14,940 --> 00:08:17,800
And let's do this
term over here.
165
00:08:17,800 --> 00:08:18,680
This term--
166
00:08:18,680 --> 00:08:22,860
well, conditioned on
n, this n is known.
167
00:08:22,860 --> 00:08:27,650
So we essentially have
a finite known sum of
168
00:08:27,650 --> 00:08:29,830
independent random variables.
169
00:08:29,830 --> 00:08:32,490
We know that the variance of
a sum of independent random
170
00:08:32,490 --> 00:08:35,409
variables is the sum
of the variances.
171
00:08:35,409 --> 00:08:38,909
So this is the variance of x1
plus the variance of x2 et
172
00:08:38,909 --> 00:08:42,620
cetera, plus the
variance of xn.
173
00:08:42,620 --> 00:08:45,980
And furthermore, again, because
all of these xi's have
174
00:08:45,980 --> 00:08:48,780
the same distribution, the
variance is the same.
175
00:08:48,780 --> 00:08:52,710
So we can actually write this
as n times the variance of x
176
00:08:52,710 --> 00:08:55,120
sub i, where x sub i
just corresponds
177
00:08:55,120 --> 00:08:56,250
to one of the trials.
178
00:08:56,250 --> 00:08:58,290
It doesn't matter which one,
because they all have the same
179
00:08:58,290 --> 00:09:00,430
variance and expectation.
180
00:09:00,430 --> 00:09:03,720
So now, we're almost
home free.
181
00:09:03,720 --> 00:09:06,000
This is just some scalar.
182
00:09:06,000 --> 00:09:08,300
So we can take it out of
the variance, but we
183
00:09:08,300 --> 00:09:09,840
have to square it.
184
00:09:09,840 --> 00:09:14,100
So this becomes expectation
of xi squared times
185
00:09:14,100 --> 00:09:16,260
the variance of n.
186
00:09:16,260 --> 00:09:19,480
And then this variance is also
just a scalar, so we can take
187
00:09:19,480 --> 00:09:20,440
it outside.
188
00:09:20,440 --> 00:09:25,940
So then we get variance of x sub
i times expectation of n.
189
00:09:25,940 --> 00:09:29,920
Now, we know that the
expectation of x sub i is just
190
00:09:29,920 --> 00:09:32,810
the probability of success,
which is 1/2.
191
00:09:32,810 --> 00:09:37,430
So we have 1/2 squared, or 1/4,
times the variance of n.
192
00:09:37,430 --> 00:09:40,950
So that's where this formula
comes in handy.
193
00:09:40,950 --> 00:09:45,170
b is equal to 6, a
is equal to 1.
194
00:09:45,170 --> 00:09:50,210
So we get that the variance
of n is equal to 5 times--
195
00:09:50,210 --> 00:09:52,360
and then 5 plus 2 is 7--
196
00:09:52,360 --> 00:09:54,160
divided by 12.
197
00:09:54,160 --> 00:09:56,540
So this is just a formula from
the book that you guys
198
00:09:56,540 --> 00:09:57,400
hopefully remember.
199
00:09:57,400 --> 00:09:59,510
So we get 35/12.
200
00:09:59,510 --> 00:10:02,370
201
00:10:02,370 --> 00:10:05,940
And then the variance of xi,
we know the variance of a
202
00:10:05,940 --> 00:10:09,370
Bernoulli random variable is
just p times 1 minus p.
203
00:10:09,370 --> 00:10:14,270
So in our case, that's 1/2
times 1/2, which is 1/4.
204
00:10:14,270 --> 00:10:15,400
So we get 1/4.
205
00:10:15,400 --> 00:10:18,280
And then the expectation of n,
we remember from our previous
206
00:10:18,280 --> 00:10:22,070
computation, is just 7/2.
207
00:10:22,070 --> 00:10:25,990
So I will let you guys do this
arithmetic on your own time.
208
00:10:25,990 --> 00:10:27,900
But the answer comes
out to be 77/48.
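[Again as a check on the arithmetic, not part of the video: the law-of-total-variance formula assembled exactly with fractions, using the discrete uniform variance formula from earlier.]

```python
from fractions import Fraction

# Components from the derivation (fair coin, fair six-sided die)
e_x = Fraction(1, 2)                         # E[X_i] = p = 1/2
var_x = Fraction(1, 2) * Fraction(1, 2)      # Var(X_i) = p(1 - p) = 1/4
a, b = 1, 6
e_n = Fraction(a + b, 2)                     # E[N] = 7/2
var_n = Fraction((b - a) * (b - a + 2), 12)  # Var(N) = 5 * 7 / 12 = 35/12

# Law of total variance:
# Var(H) = (E[X_i])^2 * Var(N) + Var(X_i) * E[N]
var_h = e_x**2 * var_n + var_x * e_n
print(var_h)  # 77/48
```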
209
00:10:27,900 --> 00:10:30,760
210
00:10:30,760 --> 00:10:34,020
So I will go ahead and put
our answer over here--
211
00:10:34,020 --> 00:10:36,880
77/48--
212
00:10:36,880 --> 00:10:38,130
so that I can erase.
213
00:10:38,130 --> 00:10:40,490
214
00:10:40,490 --> 00:10:43,050
So I want you guys to
start thinking about
215
00:10:43,050 --> 00:10:45,520
part b while I erase.
216
00:10:45,520 --> 00:10:49,330
Essentially, you do the same
experiment that we did in part
217
00:10:49,330 --> 00:10:54,840
a, except now we use two
dice instead of one.
218
00:10:54,840 --> 00:10:58,690
So in part b, just to repeat,
you now have two dice.
219
00:10:58,690 --> 00:11:00,510
You roll them.
220
00:11:00,510 --> 00:11:01,600
You look at the outcome.
221
00:11:01,600 --> 00:11:05,220
If you have an outcome of four
on one die and six on another
222
00:11:05,220 --> 00:11:08,040
die, then you flip the
coin 10 times.
223
00:11:08,040 --> 00:11:09,880
So it's the same exact
experiment.
224
00:11:09,880 --> 00:11:11,920
We're interested in the number
of heads; we want the
225
00:11:11,920 --> 00:11:14,400
expectation and the variance.
226
00:11:14,400 --> 00:11:17,040
But this step is now a
little bit different.
227
00:11:17,040 --> 00:11:20,380
228
00:11:20,380 --> 00:11:24,560
Again, let's approach this by
defining some notation first.
229
00:11:24,560 --> 00:11:30,780
Now, I want to let n1 be the
outcome of the first die.
230
00:11:30,780 --> 00:11:35,200
231
00:11:35,200 --> 00:11:39,450
And then you can let n2 be the
outcome of the second die.
232
00:11:39,450 --> 00:11:45,010
233
00:11:45,010 --> 00:11:46,900
And we'll start with
just that.
234
00:11:46,900 --> 00:11:51,240
So one way you could approach
this problem is say, OK, if n1
235
00:11:51,240 --> 00:11:54,690
is the outcome of my first die
and n2 is the outcome of my
236
00:11:54,690 --> 00:11:57,720
second die, then the number of
coin flips that I'm going to
237
00:11:57,720 --> 00:11:59,250
make is n1 plus n2.
238
00:11:59,250 --> 00:12:03,460
239
00:12:03,460 --> 00:12:05,650
This is the total coin flips.
240
00:12:05,650 --> 00:12:08,570
241
00:12:08,570 --> 00:12:12,830
So you could just repeat the
same exact math that we did in
242
00:12:12,830 --> 00:12:17,610
part a, except everywhere that
you see an n, you replace that
243
00:12:17,610 --> 00:12:20,850
n with n1 plus n2.
244
00:12:20,850 --> 00:12:23,730
So that will get you to your
answer, but it will require
245
00:12:23,730 --> 00:12:25,400
slightly more work.
246
00:12:25,400 --> 00:12:27,590
We're going to think about
this problem slightly
247
00:12:27,590 --> 00:12:29,510
differently.
248
00:12:29,510 --> 00:12:33,430
So the way we are thinking about
it just now, we roll two
249
00:12:33,430 --> 00:12:35,040
dice at the same time.
250
00:12:35,040 --> 00:12:38,630
We add the results
of the die rolls.
251
00:12:38,630 --> 00:12:43,560
And then we flip the coin
that number of times.
252
00:12:43,560 --> 00:12:48,540
But another way you can think
about this is, you roll one
253
00:12:48,540 --> 00:12:51,380
die, and then you flip the coin
the number of times shown
254
00:12:51,380 --> 00:12:53,990
by that die and count
the number of heads.
255
00:12:53,990 --> 00:12:57,410
And then you take the second
die and you roll it.
256
00:12:57,410 --> 00:13:02,960
And then you flip the coin that
many more times and count
257
00:13:02,960 --> 00:13:05,600
the number of heads
after that.
258
00:13:05,600 --> 00:13:16,140
So you could define h1 to be
the number of heads in the first
259
00:13:16,140 --> 00:13:18,035
n1 coin flips.
260
00:13:18,035 --> 00:13:21,670
261
00:13:21,670 --> 00:13:29,940
And you could just let h2 be
the number of heads in the
262
00:13:29,940 --> 00:13:34,620
last n2 coin flips.
263
00:13:34,620 --> 00:13:38,320
So hopefully that terminology
is not confusing you.
264
00:13:38,320 --> 00:13:42,990
Essentially, what I'm saying
is, n1 plus n2 means you'll
265
00:13:42,990 --> 00:13:51,082
have n1 flips, followed by
n2 flips, for a total
266
00:13:51,082 --> 00:13:53,330
of n1 plus n2 flips.
267
00:13:53,330 --> 00:13:56,300
And then within the first n1
flips, you can get some number
268
00:13:56,300 --> 00:13:59,390
of heads, which we're
calling h1.
269
00:13:59,390 --> 00:14:02,200
And in the last n2 flips, you
can get some number of heads,
270
00:14:02,200 --> 00:14:04,340
which is h2.
271
00:14:04,340 --> 00:14:08,400
So the total number of heads
that we get at the end--
272
00:14:08,400 --> 00:14:10,351
I'm going to call it h star--
273
00:14:10,351 --> 00:14:14,590
is equal to h1 plus h2.
274
00:14:14,590 --> 00:14:17,370
And what part b is really
asking us for is the
275
00:14:17,370 --> 00:14:22,200
expectation of h star and
the variance of h star.
276
00:14:22,200 --> 00:14:26,406
But here's where something
really beautiful happens.
277
00:14:26,406 --> 00:14:31,220
h1 and h2 are independent,
and they are
278
00:14:31,220 --> 00:14:33,000
statistically the same.
279
00:14:33,000 --> 00:14:36,760
So the reason why they're
independent is because--
280
00:14:36,760 --> 00:14:40,200
well, first of all, all of our
coin flips are independent.
281
00:14:40,200 --> 00:14:44,560
And they're statistically the
same, because the experiment
282
00:14:44,560 --> 00:14:45,650
is exactly the same.
283
00:14:45,650 --> 00:14:47,110
And everything's independent.
284
00:14:47,110 --> 00:14:52,940
So instead of imagining one
person rolling two dice and
285
00:14:52,940 --> 00:14:55,390
then summing the outcomes and
flipping a coin that many
286
00:14:55,390 --> 00:14:58,670
times and counting heads, you
can imagine one person takes
287
00:14:58,670 --> 00:15:00,720
one die and goes
into one room.
288
00:15:00,720 --> 00:15:02,710
A second person takes a
second die and goes
289
00:15:02,710 --> 00:15:04,500
into another room.
290
00:15:04,500 --> 00:15:06,100
They run their experiments.
291
00:15:06,100 --> 00:15:08,520
Then they report back
to a third person
292
00:15:08,520 --> 00:15:09,930
the number of heads.
293
00:15:09,930 --> 00:15:13,460
And that person adds them
together to get h star.
294
00:15:13,460 --> 00:15:16,660
And in that scenario, everything
is very clearly
295
00:15:16,660 --> 00:15:18,560
independent.
296
00:15:18,560 --> 00:15:21,330
So the expectation of h star--
297
00:15:21,330 --> 00:15:23,560
you actually don't need
independence for this part,
298
00:15:23,560 --> 00:15:26,580
because linearity of expectation
always holds.
299
00:15:26,580 --> 00:15:31,450
But you get the expectation of
h1 plus the expectation of h2.
300
00:15:31,450 --> 00:15:35,200
And because these guys are
statistically equivalent, this
301
00:15:35,200 --> 00:15:39,700
is just two times the
expectation of h.
302
00:15:39,700 --> 00:15:42,840
And the expectation of h we
calculated in part a.
303
00:15:42,840 --> 00:15:47,760
So this is 2 times 7 over 4.
304
00:15:47,760 --> 00:15:49,670
Now, for the variance,
here's where the
305
00:15:49,670 --> 00:15:50,930
independence comes in.
306
00:15:50,930 --> 00:15:54,238
I'm actually going to write this
somewhere where I don't
307
00:15:54,238 --> 00:15:55,200
have to bend over.
308
00:15:55,200 --> 00:16:00,690
So the variance of h star is
equal to the variance of h1
309
00:16:00,690 --> 00:16:03,820
plus the variance of
h2 by independence.
310
00:16:03,820 --> 00:16:07,020
And that's equal to 2 times
the variance of h, because
311
00:16:07,020 --> 00:16:09,740
they are statistically
the same.
312
00:16:09,740 --> 00:16:12,810
And the variance of h
we computed already.
313
00:16:12,810 --> 00:16:18,650
So this is just 2 times
77 over 48.
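[A short Monte Carlo sketch, not from the video, that makes the doubling conclusion easy to check numerically; the function name is just illustrative.]

```python
import random

random.seed(0)

def heads_after_two_dice_flips():
    """One trial of part (b): roll two fair dice, flip a fair coin
    that many times total, and return the number of heads."""
    n = random.randint(1, 6) + random.randint(1, 6)
    return sum(random.random() < 0.5 for _ in range(n))

trials = [heads_after_two_dice_flips() for _ in range(100_000)]
mean = sum(trials) / len(trials)
var = sum((t - mean) ** 2 for t in trials) / len(trials)

print(round(mean, 2))  # close to 2 * 7/4  = 3.5
print(round(var, 2))   # close to 2 * 77/48 = 77/24, about 3.21
```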
314
00:16:18,650 --> 00:16:23,390
So the succinct answer to part
b is that both the mean and
315
00:16:23,390 --> 00:16:28,830
the variance double from part
a. So hopefully you guys
316
00:16:28,830 --> 00:16:30,060
enjoyed this problem.
317
00:16:30,060 --> 00:16:32,510
We covered a bunch of things.
318
00:16:32,510 --> 00:16:37,870
So we saw how to deal with
having a random number of
319
00:16:37,870 --> 00:16:38,830
random variables.
320
00:16:38,830 --> 00:16:41,840
Usually we have a fixed number
of random variables.
321
00:16:41,840 --> 00:16:44,670
In this problem, the number of
random variables we were
322
00:16:44,670 --> 00:16:47,350
adding together was
itself random.
323
00:16:47,350 --> 00:16:49,410
So to handle that, we
conditioned on n.
324
00:16:49,410 --> 00:16:53,110
And to compute expectation, we
use iterated expectation.
325
00:16:53,110 --> 00:16:57,450
To compute variance, we used
law of total variance.
326
00:16:57,450 --> 00:17:03,020
And then in part b, we were
just a little bit clever.
327
00:17:03,020 --> 00:17:07,270
We thought about how can we
reinterpret this experiment to
328
00:17:07,270 --> 00:17:08,859
reduce computation.
329
00:17:08,859 --> 00:17:12,869
And we realized that part b is
essentially two independent
330
00:17:12,869 --> 00:17:14,150
trials of part a.
331
00:17:14,150 --> 00:17:15,900
So both the mean and the
variance should double.
332
00:17:15,900 --> 00:17:19,567