WEBVTT
00:00:02.830 --> 00:00:04.030
Welcome back.
00:00:04.030 --> 00:00:07.370
So now we're going to finish
the rest of this problem.
00:00:07.370 --> 00:00:12.730
For part e, we've calculated
what the MAP and LMS
00:00:12.730 --> 00:00:13.840
estimators are.
00:00:13.840 --> 00:00:16.600
And now we're going to calculate
what the conditional
00:00:16.600 --> 00:00:17.750
mean squared error is.
00:00:17.750 --> 00:00:23.130
So it's a way to measure how
good these estimators are.
00:00:23.130 --> 00:00:25.740
So let's start out
generically.
00:00:25.740 --> 00:00:32.530
For any estimator theta hat,
the conditional MSE is--
00:00:32.530 --> 00:00:33.940
conditional mean
squared error--
00:00:37.810 --> 00:00:39.060
is equal to this.
00:00:42.610 --> 00:00:50.750
It's the expectation of the estimator
minus the actual value, squared,
00:00:50.750 --> 00:00:55.450
conditioned on X being equal
to some little x.
00:00:55.450 --> 00:00:56.870
So the mean squared error.
00:00:56.870 --> 00:01:00.470
So you take the error, which is
the difference between your
00:01:00.470 --> 00:01:02.530
estimator and the true
value, square it, and
00:01:02.530 --> 00:01:03.570
then take the mean.
00:01:03.570 --> 00:01:07.190
And it's conditioned on the
observed value of x.
00:01:07.190 --> 00:01:11.260
Or, conditioned on the
data that you get.
00:01:11.260 --> 00:01:15.200
So to calculate this, we use
our standard definition of
00:01:15.200 --> 00:01:16.730
what conditional expectation
would be.
00:01:16.730 --> 00:01:23.850
So it's theta hat minus
theta, quantity squared.
00:01:23.850 --> 00:01:33.780
And we weight that by the
appropriate conditional PDF,
00:01:33.780 --> 00:01:36.460
which in this case would
be the posterior.
00:01:36.460 --> 00:01:41.120
And we integrate this from x--
00:01:41.120 --> 00:01:44.450
from theta equals x
to theta equals 1.
00:01:44.450 --> 00:01:50.160
Now, we can go through some
algebra and this will tell us
00:01:50.160 --> 00:01:55.460
that this is theta hat squared
minus 2 theta hat theta
00:01:55.460 --> 00:01:57.410
plus theta squared.
00:01:57.410 --> 00:02:02.210
And this posterior we know from
before is 1 over theta
00:02:02.210 --> 00:02:08.669
times absolute value
of log x d theta.
00:02:08.669 --> 00:02:15.540
And when we do out this
integral, it's going to be--
00:02:15.540 --> 00:02:18.070
we can split up in to three
different terms.
00:02:18.070 --> 00:02:21.585
So there's theta hat squared
times this and
00:02:21.585 --> 00:02:22.280
you integrate it.
00:02:22.280 --> 00:02:26.210
But in fact, this is just
a conditional density.
00:02:26.210 --> 00:02:30.990
When you integrate it from x to
1, this will just integrate
00:02:30.990 --> 00:02:34.940
to 1 because it is
a valid density.
00:02:34.940 --> 00:02:39.410
So the first term is just
theta hat squared.
00:02:39.410 --> 00:02:43.360
Now, for the second term, you can
pull out the 2 theta hat
00:02:43.360 --> 00:02:51.320
and integrate theta times 1
over theta times absolute
00:02:51.320 --> 00:02:59.500
value of log of x d
theta from x to 1.
00:02:59.500 --> 00:03:04.430
And then the last one is
integral of theta squared 1
00:03:04.430 --> 00:03:13.130
over theta absolute value of
log x d theta from x to 1.
00:03:13.130 --> 00:03:16.660
OK, so we can do some more--
00:03:16.660 --> 00:03:24.040
with some more calculus, we
get this final answer.
00:03:24.040 --> 00:03:28.530
So this will integrate to
1 minus x over absolute
00:03:28.530 --> 00:03:31.600
value of log x.
00:03:31.600 --> 00:03:38.900
And this will integrate to 1
minus x squared over 2 times
00:03:38.900 --> 00:03:43.280
absolute value of log x.
00:03:43.280 --> 00:03:48.460
So this tells us for any generic
estimate theta hat,
00:03:48.460 --> 00:03:51.100
this would be what the
conditional mean
00:03:51.100 --> 00:03:52.590
squared error would be.
00:03:52.590 --> 00:03:58.150
Now, let's calculate what it
actually is for the specific
00:03:58.150 --> 00:04:00.030
estimates that we
came up with.
00:04:00.030 --> 00:04:04.120
So for the MAP rule, the
estimate of theta hat is just
00:04:04.120 --> 00:04:06.220
equal to x.
00:04:06.220 --> 00:04:09.180
So when we plug that into
this, we get that the
00:04:09.180 --> 00:04:19.430
conditional MSE is just equal to
x squared minus 2x times 1 minus
00:04:19.430 --> 00:04:30.920
x over absolute value of log x plus 1
minus x squared over 2 times
00:04:30.920 --> 00:04:35.290
absolute value of log of x.
00:04:35.290 --> 00:04:41.880
And for the LMS estimate,
remember this was equal to--
00:04:41.880 --> 00:04:48.500
theta hat was 1 minus x over
absolute value of log x.
00:04:48.500 --> 00:04:52.080
And so when you plug this
particular theta hat into this
00:04:52.080 --> 00:04:54.570
formula, what you get is that
the conditional mean squared
00:04:54.570 --> 00:05:01.580
error is equal to 1 minus x
squared over 2 times absolute
00:05:01.580 --> 00:05:07.990
value of log of x minus
1 minus x over log
00:05:07.990 --> 00:05:12.480
of x quantity squared.
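To compare the two rules without a plot, we can code the two conditional MSE expressions and evaluate them on a few sample values of x (a sketch; the grid of x values is arbitrary):

```python
import math

def mse_map(x):
    # conditional MSE of the MAP estimate theta_hat = x
    L = abs(math.log(x))
    return x**2 - 2 * x * (1 - x) / L + (1 - x**2) / (2 * L)

def mse_lms(x):
    # conditional MSE of the LMS estimate theta_hat = (1 - x) / |log x|
    L = abs(math.log(x))
    return (1 - x**2) / (2 * L) - ((1 - x) / L) ** 2

for x in [0.1, 0.3, 0.5, 0.7, 0.9]:
    # LMS minimizes the conditional MSE, so it never does worse than MAP
    print(x, mse_map(x), mse_lms(x))
```

The gap between the two expressions works out to exactly (x minus the posterior mean) squared, which is why the LMS curve sits below the MAP curve for every x.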
00:05:12.480 --> 00:05:17.830
So these two expressions tell
us what the mean squared error
00:05:17.830 --> 00:05:21.290
is for the MAP rule
and the LMS rule.
00:05:21.290 --> 00:05:23.740
And it's kind of hard to
actually interpret exactly
00:05:23.740 --> 00:05:27.130
which one is better based on
just these expressions.
00:05:27.130 --> 00:05:33.980
So it's helpful to plot out
what the conditional mean
00:05:33.980 --> 00:05:36.440
squared error is.
00:05:36.440 --> 00:05:38.130
So we're plotting against x.
00:05:38.130 --> 00:05:41.030
For each possible actual
data that we observe--
00:05:41.030 --> 00:05:42.845
data point that we observe,
what is the
00:05:42.845 --> 00:05:44.100
mean squared error?
00:05:44.100 --> 00:05:48.980
So let's do the MAP
rule first.
00:05:48.980 --> 00:05:51.000
The MAP rule would look
something like this.
00:05:57.070 --> 00:06:02.810
And it turns out that the LMS
rule is better, and it will
00:06:02.810 --> 00:06:11.980
look like this dotted line
here on the bottom.
00:06:11.980 --> 00:06:16.850
And so it turns out that if your
metric for how good your
00:06:16.850 --> 00:06:21.220
estimate is is the conditional
mean squared error, then LMS
00:06:21.220 --> 00:06:22.540
is better than MAP.
00:06:22.540 --> 00:06:28.650
And this is true because LMS
is actually designed to
00:06:28.650 --> 00:06:31.790
minimize what this
mean squared error is.
00:06:31.790 --> 00:06:35.865
And so in this case, the LMS
estimator should have a better
00:06:35.865 --> 00:06:40.680
mean squared error than
the MAP estimator.
00:06:40.680 --> 00:06:47.440
OK, now the last part of the
problem, we calculate one more
00:06:47.440 --> 00:06:53.360
type of estimator, which is
the linear LMS estimator.
00:06:53.360 --> 00:06:59.410
So notice that the LMS estimator
was this one.
00:06:59.410 --> 00:07:02.820
It was 1 minus x over absolute
value of log of x.
00:07:02.820 --> 00:07:08.160
And this is not linear in x,
which means sometimes it's
00:07:08.160 --> 00:07:10.560
difficult to calculate.
00:07:10.560 --> 00:07:16.160
And so what we do is we try to
come up with a linear form
00:07:16.160 --> 00:07:19.800
of this, something that is like
ax plus b, where a and b
00:07:19.800 --> 00:07:23.070
are some constant numbers.
00:07:23.070 --> 00:07:27.480
But that also does well in terms
of having a small mean
00:07:27.480 --> 00:07:28.780
squared error.
00:07:28.780 --> 00:07:33.860
And so we know from the class
that in order to calculate the
00:07:33.860 --> 00:07:42.290
linear LMS estimator, we
just need to calculate
00:07:42.290 --> 00:07:43.930
a few different parts.
00:07:43.930 --> 00:07:48.870
So it's equal to the expectation
of the parameter
00:07:48.870 --> 00:07:58.910
plus the covariance of theta and
x over the variance of x
00:07:58.910 --> 00:08:02.850
times quantity x minus expectation
of x.
00:08:06.980 --> 00:08:08.560
Now, in order to do this,
we just need to
00:08:08.560 --> 00:08:09.730
calculate four things.
00:08:09.730 --> 00:08:12.530
We need the expectation of
theta, the covariance, the
00:08:12.530 --> 00:08:15.725
variance, and the expectation
of x.
00:08:15.725 --> 00:08:18.840
OK, so let's calculate what
these things are.
00:08:18.840 --> 00:08:22.120
Expectation of theta.
00:08:22.120 --> 00:08:24.350
We know that theta is uniformly
distributed
00:08:24.350 --> 00:08:25.490
between 0 and 1.
00:08:25.490 --> 00:08:28.710
And so the expectation
of theta is the
00:08:28.710 --> 00:08:29.560
easiest one to calculate.
00:08:29.560 --> 00:08:31.610
It's just 1/2.
00:08:31.610 --> 00:08:35.590
What about the expectation
of x?
00:08:35.590 --> 00:08:38.280
Well, expectation of x is a
little bit more complicated.
00:08:38.280 --> 00:08:41.909
But remember, like in previous
problems, it's helpful when
00:08:41.909 --> 00:08:47.090
you have a hierarchy of
randomness to try to use the
00:08:47.090 --> 00:08:49.720
law of iterated expectations.
00:08:49.720 --> 00:08:54.960
So the delay, which
is x, is random.
00:08:54.960 --> 00:08:57.270
But its randomness depends
on the actual
00:08:57.270 --> 00:08:59.790
distribution, which is theta.
00:08:59.790 --> 00:09:01.100
Which itself is random.
00:09:01.100 --> 00:09:05.420
And so let's try to condition
on theta and see
00:09:05.420 --> 00:09:06.940
if that helps us.
00:09:06.940 --> 00:09:10.590
OK, so if we knew what theta
was, then what is the
00:09:10.590 --> 00:09:12.040
expectation of x?
00:09:12.040 --> 00:09:14.540
Well, we know that given
theta, x is uniformly
00:09:14.540 --> 00:09:16.300
distributed between
0 and theta.
00:09:16.300 --> 00:09:21.230
And so the mean would be
just theta over 2.
00:09:21.230 --> 00:09:26.580
And so this would just be
expectation of theta over 2.
00:09:26.580 --> 00:09:30.490
And we know this is just 1/2
times the expectation of
00:09:30.490 --> 00:09:32.150
theta, which is 1/2.
00:09:32.150 --> 00:09:33.930
So this is just 1/4.
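We can check this with a tiny simulation of the two-stage model (the sample size and seed here are arbitrary choices for illustration):

```python
import random

random.seed(0)

n = 200_000
total = 0.0
for _ in range(n):
    theta = random.random()           # Theta ~ Uniform(0, 1)
    x = random.uniform(0.0, theta)    # X | Theta ~ Uniform(0, Theta)
    total += x

print(total / n)   # should be close to E[X] = 1/4
```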
00:09:36.510 --> 00:09:41.690
Now, let's calculate
the variance of x.
00:09:41.690 --> 00:09:47.180
The variance of x takes some
more work because we need to
00:09:47.180 --> 00:09:51.000
use the law of total variance,
which is this.
00:09:51.000 --> 00:09:56.140
That the variance of x is
equal to the expectation of
00:09:56.140 --> 00:09:59.630
the conditional variance plus
the variance of the
00:09:59.630 --> 00:10:01.460
conditional expectation.
00:10:05.510 --> 00:10:07.540
Let's see if we can figure
out what these
00:10:07.540 --> 00:10:08.270
different parts are.
00:10:08.270 --> 00:10:12.015
What is the conditional variance
of x given theta?
00:10:12.015 --> 00:10:16.070
Well, given theta, we know x
is uniformly distributed
00:10:16.070 --> 00:10:17.740
between 0 and theta.
00:10:17.740 --> 00:10:24.820
And remember, for a uniform
distribution of width c, the
00:10:24.820 --> 00:10:28.680
variance of that uniform
distribution is just c
00:10:28.680 --> 00:10:31.330
squared over 12.
00:10:31.330 --> 00:10:33.960
And so in this case, what is
the width of this uniform
00:10:33.960 --> 00:10:34.550
distribution?
00:10:34.550 --> 00:10:36.730
Well, it's uniformly distributed
between 0 and
00:10:36.730 --> 00:10:38.820
theta, so the width is theta.
00:10:38.820 --> 00:10:42.535
So this variance should be
theta squared over 12.
00:10:45.190 --> 00:10:48.540
OK, what about the expectation
of x given theta?
00:10:48.540 --> 00:10:50.830
Well, we already argued earlier
that the expectation
00:10:50.830 --> 00:10:56.530
of x given theta is
just theta over 2.
00:10:56.530 --> 00:10:58.150
So now let's fill in the rest.
00:10:58.150 --> 00:11:02.390
What's the expectation of
theta squared over 12?
00:11:02.390 --> 00:11:07.760
Well, that takes a little bit
more work because this is
00:11:07.760 --> 00:11:09.810
just-- you can think
of it as 1/12.
00:11:09.810 --> 00:11:11.160
You can pull the 1/12
out and multiply by the
00:11:11.160 --> 00:11:13.140
expectation of theta squared.
00:11:13.140 --> 00:11:16.340
Well, the expectation of theta
squared we can calculate from
00:11:16.340 --> 00:11:21.300
the variance of theta plus
the expectation of
00:11:21.300 --> 00:11:24.730
theta quantity squared.
00:11:24.730 --> 00:11:27.690
Because that is just the
definition of variance.
00:11:27.690 --> 00:11:31.920
Variance is equal to expectation
of theta squared
00:11:31.920 --> 00:11:35.440
minus expectation of theta
quantity squared.
00:11:35.440 --> 00:11:37.560
So we've just reversed
the formula.
00:11:37.560 --> 00:11:41.180
Now, the second half is the
variance of theta over 2.
00:11:41.180 --> 00:11:43.000
Well, remember when you pull
out a constant from a
00:11:43.000 --> 00:11:44.920
variance, you have
to square it.
00:11:44.920 --> 00:11:49.370
So this is just equal to 1/4
times the variance of theta.
00:11:53.040 --> 00:11:54.740
Well, what is the variance
of theta?
00:11:54.740 --> 00:11:58.460
The variance of theta is
the variance of uniform
00:11:58.460 --> 00:11:59.600
between 0 and 1.
00:11:59.600 --> 00:12:00.810
So the width is 1.
00:12:00.810 --> 00:12:03.100
So you get 1 squared over 12.
00:12:03.100 --> 00:12:05.760
And the variance is 1/12.
00:12:05.760 --> 00:12:06.860
What is the mean of theta?
00:12:06.860 --> 00:12:08.990
It's 1/2, and when you square
that, you get 1/4.
00:12:12.350 --> 00:12:13.900
Finally for here, the
variance of theta
00:12:13.900 --> 00:12:15.240
like we said, is 1/12.
00:12:15.240 --> 00:12:18.060
So you get 1/12.
00:12:18.060 --> 00:12:21.050
And now, when you combine all
these, you get that the
00:12:21.050 --> 00:12:29.780
variance ends up being 7/144.
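The same bookkeeping can be done in exact arithmetic; here is a sketch using Python's fractions module:

```python
from fractions import Fraction as F

var_theta = F(1, 12)                     # Uniform(0,1): width^2 / 12
mean_theta = F(1, 2)
e_theta_sq = var_theta + mean_theta**2   # E[Theta^2] = 1/12 + 1/4 = 1/3

# law of total variance:
# var(X) = E[var(X | Theta)] + var(E[X | Theta])
#        = E[Theta^2 / 12]   + var(Theta / 2)
var_x = e_theta_sq / 12 + var_theta / 4
print(var_x)   # 7/144
```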
00:12:29.780 --> 00:12:31.340
Now we have almost everything.
00:12:31.340 --> 00:12:32.520
The last thing we need
to calculate is
00:12:32.520 --> 00:12:34.030
this covariance term.
00:12:34.030 --> 00:12:39.640
What is the covariance
of theta and x?
00:12:39.640 --> 00:12:43.480
Well, the covariance we know is
just the expectation of the
00:12:43.480 --> 00:12:48.570
product of theta and x minus
the product of the
00:12:48.570 --> 00:12:50.420
expectations.
00:12:50.420 --> 00:12:55.980
So the expectation of x times
the expectation of theta.
00:12:55.980 --> 00:12:58.490
All right, so we already know
what expectation of theta is.
00:12:58.490 --> 00:12:59.180
That's 1/2.
00:12:59.180 --> 00:13:01.030
And expectation of x was 1/4.
00:13:01.030 --> 00:13:03.020
So the only thing that we don't
know is expectation of
00:13:03.020 --> 00:13:05.230
the product of the two.
00:13:05.230 --> 00:13:11.230
So once again, let's try to
use iterated expectations.
00:13:11.230 --> 00:13:20.010
So let's calculate this as
the expectation of this
00:13:20.010 --> 00:13:22.730
conditional expectation.
00:13:22.730 --> 00:13:25.180
So we, again, condition
on theta.
00:13:25.180 --> 00:13:29.590
And minus the expectation
of theta, which is 1/2.
00:13:29.590 --> 00:13:34.100
Times 1/4, which is the
expectation of x.
00:13:34.100 --> 00:13:36.485
Now, what is this conditional
expectation?
00:13:39.730 --> 00:13:42.490
Well, the expectation
of theta--
00:13:42.490 --> 00:13:46.280
if you know what theta is, then
the expectation of theta
00:13:46.280 --> 00:13:47.450
is just theta.
00:13:47.450 --> 00:13:49.770
You already know what it is, so
you know for sure that the
00:13:49.770 --> 00:13:52.000
expectation is just
equal to theta.
00:13:52.000 --> 00:13:55.430
And what is the expectation
of x given theta?
00:13:55.430 --> 00:13:57.700
Well, the expectation of x given
theta we already said
00:13:57.700 --> 00:13:59.210
was theta over 2.
00:13:59.210 --> 00:14:03.220
So what you get is this entire
expression is just going to be
00:14:03.220 --> 00:14:10.070
equal to theta times theta
over 2, or expectation of
00:14:10.070 --> 00:14:15.120
theta squared over
2 minus 1/8.
00:14:15.120 --> 00:14:18.700
Now, what is the expectation
of theta squared over 2?
00:14:18.700 --> 00:14:20.650
Well, we know that--
00:14:20.650 --> 00:14:23.170
we already calculated out
what expectation of
00:14:23.170 --> 00:14:25.440
theta squared is.
00:14:25.440 --> 00:14:30.360
So we know that expectation
of theta squared
00:14:30.360 --> 00:14:33.070
is 1/12 plus 1/4.
00:14:33.070 --> 00:14:38.140
So what we get is 1/2
times 1/12 plus 1/4, which
00:14:38.140 --> 00:14:42.950
is 1/3, minus 1/8.
00:14:42.950 --> 00:14:48.110
So the answer is 1/6 minus
1/8, which is 1/24.
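The covariance chain can also be verified in exact arithmetic (again a sketch with the fractions module):

```python
from fractions import Fraction as F

e_theta = F(1, 2)
e_x = F(1, 4)
e_theta_sq = F(1, 12) + F(1, 4)   # var(Theta) + E[Theta]^2 = 1/3

# iterated expectations: E[Theta * X] = E[Theta * E[X | Theta]] = E[Theta^2 / 2]
e_theta_x = e_theta_sq / 2        # 1/6

cov = e_theta_x - e_theta * e_x   # 1/6 - 1/8 = 1/24
print(cov)   # 1/24
```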
00:14:54.410 --> 00:14:57.630
Now, let's actually plug
this in and figure out
00:14:57.630 --> 00:14:59.380
what this value is.
00:14:59.380 --> 00:15:01.860
So when you get everything--
00:15:04.480 --> 00:15:11.750
when you combine everything,
you get that the
00:15:11.750 --> 00:15:13.370
LMS estimator is--
00:15:13.370 --> 00:15:18.230
the linear LMS estimator
is going to be--
00:15:22.600 --> 00:15:23.860
expectation of theta is 1/2.
00:15:27.360 --> 00:15:31.090
The covariance is 1/24.
00:15:31.090 --> 00:15:33.790
The variance is 7/144.
00:15:33.790 --> 00:15:43.830
And when you divide that, it's
equal to 6/7 times quantity x minus 1/4
00:15:43.830 --> 00:15:46.940
because expectation
of x is 1/4.
00:15:46.940 --> 00:15:50.610
And you can simplify this a
little bit and get that this
00:15:50.610 --> 00:15:55.160
is equal to 6/7 times
x plus 2/7.
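Putting the four pieces together in exact arithmetic confirms these coefficients (a sketch with the fractions module):

```python
from fractions import Fraction as F

e_theta, e_x = F(1, 2), F(1, 4)
cov, var_x = F(1, 24), F(7, 144)

slope = cov / var_x                 # (1/24) / (7/144) = 6/7
intercept = e_theta - slope * e_x   # 1/2 - (6/7)(1/4) = 2/7

def linear_lms(x):
    # theta_hat = E[Theta] + cov(Theta, X)/var(X) * (x - E[X]) = (6/7) x + 2/7
    return slope * x + intercept

print(slope, intercept, linear_lms(F(1)))   # at x = 1 the estimate is 8/7 > 1
```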
00:15:58.580 --> 00:16:02.760
So now we have three different
types of estimators.
00:16:02.760 --> 00:16:05.670
The MAP estimator,
which is this.
00:16:05.670 --> 00:16:06.900
Notice that it's kind
of complicated.
00:16:06.900 --> 00:16:07.950
You have x squared terms.
00:16:07.950 --> 00:16:10.150
You have more x squared terms.
00:16:10.150 --> 00:16:13.980
And you have absolute
value of log of x.
00:16:13.980 --> 00:16:17.590
And then you have the LMS, which
is, again, nonlinear.
00:16:17.590 --> 00:16:20.010
And now you have something
that looks very simple--
00:16:20.010 --> 00:16:20.490
much simpler.
00:16:20.490 --> 00:16:24.420
It's just 6/7 x plus 2/7.
00:16:24.420 --> 00:16:27.560
And that is the linear
LMS estimator.
00:16:27.560 --> 00:16:34.200
And it turns out that you can,
again, plot these to see what
00:16:34.200 --> 00:16:35.270
this one looks like.
00:16:35.270 --> 00:16:44.220
So here is our original plot
of x and theta hat.
00:16:44.220 --> 00:16:45.470
So the MAP estimator--
00:16:50.230 --> 00:16:53.500
sorry, the MAP estimator was
just theta hat equals x.
00:16:53.500 --> 00:16:57.600
This was the mean squared error
of the MAP estimator.
00:16:57.600 --> 00:17:02.490
So the MAP estimator is just
this diagonal straight line.
00:17:02.490 --> 00:17:05.950
The LMS estimator looked
like this.
00:17:05.950 --> 00:17:10.079
And it turns out that the linear
LMS estimator will look
00:17:10.079 --> 00:17:17.520
something like this.
00:17:17.520 --> 00:17:20.800
So it is fairly close to
the LMS estimator, but
00:17:20.800 --> 00:17:22.180
not quite the same.
00:17:22.180 --> 00:17:25.740
And note especially that,
depending on what x is, if x
00:17:25.740 --> 00:17:28.380
is fairly close to 1, you
might actually get an estimate
00:17:28.380 --> 00:17:31.010
of theta that's greater
than 1.
00:17:31.010 --> 00:17:34.040
So for example, if you observe
that Julian is actually an
00:17:34.040 --> 00:17:37.610
hour late, then x is 1 and your
estimate of theta from
00:17:37.610 --> 00:17:40.310
the linear LMS estimator
would be 8/7, which is
00:17:40.310 --> 00:17:43.400
greater than 1.
00:17:43.400 --> 00:17:49.240
That doesn't quite make sense
because we know that theta is
00:17:49.240 --> 00:17:51.230
bounded to be only
between 0 and 1.
00:17:51.230 --> 00:17:54.350
So you shouldn't get an estimate
of theta that's
00:17:54.350 --> 00:17:55.420
greater than 1.
00:17:55.420 --> 00:17:58.670
And that's one of the side
effects of having the linear
00:17:58.670 --> 00:17:59.480
LMS estimator.
00:17:59.480 --> 00:18:02.942
So sometimes you will get
an estimate that doesn't
00:18:02.942 --> 00:18:05.430
quite make sense.
00:18:05.430 --> 00:18:09.590
But what you get in exchange
for that sacrifice is a
00:18:09.590 --> 00:18:13.540
simple form of the estimator
that's linear.
00:18:13.540 --> 00:18:16.070
And now, let's actually
consider what
00:18:16.070 --> 00:18:18.530
the performance is.
00:18:18.530 --> 00:18:22.790
And it turns out that the
performance in terms of the
00:18:22.790 --> 00:18:26.880
conditional mean squared error
is actually fairly close to
00:18:26.880 --> 00:18:28.570
that of the LMS estimator.
00:18:28.570 --> 00:18:30.620
So it looks like this.
00:18:30.620 --> 00:18:33.810
Pretty close, pretty close,
until you get close to 1.
00:18:33.810 --> 00:18:36.080
In which case, it does worse.
00:18:36.080 --> 00:18:39.810
And it does worse precisely
because it will come up with
00:18:39.810 --> 00:18:42.150
estimates of theta which
are greater than 1,
00:18:42.150 --> 00:18:44.390
which are too large.
00:18:44.390 --> 00:18:48.410
But otherwise, it does pretty
well with an estimator that is
00:18:48.410 --> 00:18:52.800
much simpler in form than
the LMS estimator.
00:18:52.800 --> 00:18:56.940
So in this problem, which had
several parts, we actually
00:18:56.940 --> 00:18:59.740
went through, basically, all
the different concepts and
00:18:59.740 --> 00:19:03.340
tools within Chapter Eight
for Bayesian inference.
00:19:03.340 --> 00:19:08.750
We talked about the prior, the
posterior, calculating the
00:19:08.750 --> 00:19:10.060
posterior using
Bayes' rule.
00:19:10.060 --> 00:19:12.170
We calculated the
MAP estimator.
00:19:12.170 --> 00:19:14.630
We calculated the
LMS estimator.
00:19:14.630 --> 00:19:17.430
From those, we calculated
the mean squared error for
00:19:17.430 --> 00:19:19.790
each one of those and
compared the two.
00:19:19.790 --> 00:19:23.310
And then, we looked at the
linear LMS estimator as
00:19:23.310 --> 00:19:26.930
another example and calculated
what that estimator is, along
00:19:26.930 --> 00:19:30.560
with the mean squared error
for that and compared all
00:19:30.560 --> 00:19:32.140
three of these.
00:19:32.140 --> 00:19:34.800
So I hope that was a good review
problem for Chapter
00:19:34.800 --> 00:19:36.320
Eight, and we'll see
you next time.