WEBVTT

00:00:09.744 --> 00:00:11.160
FRANK ACKERMAN:
First about issues

00:00:11.160 --> 00:00:14.340
of cost benefit analysis,
if we're doing it,

00:00:14.340 --> 00:00:17.500
and then come back to
the issues that Lee

00:00:17.500 --> 00:00:20.394
outlined about what's wrong
with cost benefit analysis.

00:00:20.394 --> 00:00:21.810
Because whether
or not you believe

00:00:21.810 --> 00:00:24.930
something is wrong with it,
many decisions are made with it.

00:00:24.930 --> 00:00:30.690
So, well first of all, I'm
impressed at what everyone

00:00:30.690 --> 00:00:34.290
has done here and
particularly people

00:00:34.290 --> 00:00:37.890
doing this level of
presentation cold, and the level

00:00:37.890 --> 00:00:41.650
of impersonation of
participants in the debate

00:00:41.650 --> 00:00:46.160
is very impressive.

00:00:46.160 --> 00:00:48.330
You seem to have grasped
most of the things

00:00:48.330 --> 00:00:50.460
that I would end up talking
about in cost benefit

00:00:50.460 --> 00:00:53.610
analysis in an initial
discussion of it, so let

00:00:53.610 --> 00:00:58.920
me try to move on to some of
the more difficult or advanced

00:00:58.920 --> 00:01:03.420
issues that I think are
touched on indirectly.

00:01:03.420 --> 00:01:05.970
The first speaker
mentioned the willingness

00:01:05.970 --> 00:01:09.120
to pay or willingness to accept.

00:01:09.120 --> 00:01:12.410
These are two terms that
sound almost the same.

00:01:12.410 --> 00:01:15.090
They turn out to be extremely
different in practice.

00:01:15.090 --> 00:01:17.280
And economists who have
written about methods

00:01:17.280 --> 00:01:22.121
have pretty much come to
a consensus on willingness

00:01:22.121 --> 00:01:22.620
to pay.

00:01:22.620 --> 00:01:25.260
I'm not sure that
it's necessarily

00:01:25.260 --> 00:01:26.520
for that great reason.

00:01:26.520 --> 00:01:28.860
One of the things said is
that if people feel strongly

00:01:28.860 --> 00:01:31.410
about the environment, they
might have ridiculously high

00:01:31.410 --> 00:01:33.760
willingness to accept values,
in that they wouldn't accept

00:01:33.760 --> 00:01:36.270
damage unless you
offered them an absurdly

00:01:36.270 --> 00:01:37.630
large amount of money.

00:01:37.630 --> 00:01:42.360
But I mean, that might be
data rather than a problem

00:01:42.360 --> 00:01:45.190
in the analysis.

00:01:45.190 --> 00:01:50.940
Nonetheless, the practice has
settled on willingness to pay.

00:01:50.940 --> 00:01:52.710
I think there's a
question about what

00:01:52.710 --> 00:01:57.570
do the numbers mean if the
costs and benefits are not

00:01:57.570 --> 00:01:59.100
comparably monetized.

00:01:59.100 --> 00:02:02.730
I think lurking behind
the whole discussion

00:02:02.730 --> 00:02:05.040
is the assumption
that the cost benefit

00:02:05.040 --> 00:02:08.570
results are meaningful because
they are comparably complete.

00:02:08.570 --> 00:02:10.679
And that if they are
not comparably complete,

00:02:10.679 --> 00:02:13.681
if for instance, the
accounting of cost

00:02:13.681 --> 00:02:15.930
is much more complete than
the accounting of benefits,

00:02:15.930 --> 00:02:19.920
then you have at
best, a lower bound

00:02:19.920 --> 00:02:26.040
rather than a point estimate
of the exact value.

00:02:26.040 --> 00:02:30.060
So, you know, that's
also essentially

00:02:30.060 --> 00:02:31.460
never recognized.

00:02:31.460 --> 00:02:34.290
Sometimes there's
a throwaway qualification

00:02:34.290 --> 00:02:38.930
about what the unquantified
benefits might mean.

00:02:38.930 --> 00:02:42.020
The question about how economic
benefits of job creation

00:02:42.020 --> 00:02:45.650
are handled is a
separate puzzle.

00:02:45.650 --> 00:02:50.210
This depends on the
macroeconomic theories

00:02:50.210 --> 00:02:51.200
that one subscribes to.

00:02:51.200 --> 00:02:54.890
Cost benefit analysis is
sometimes, but not necessarily

00:02:54.890 --> 00:02:57.820
embedded in a
theory that assumes

00:02:57.820 --> 00:03:01.160
free markets reach a state
of full employment more or less

00:03:01.160 --> 00:03:02.310
all the time.

00:03:02.310 --> 00:03:05.210
Computable general equilibrium
models essentially

00:03:05.210 --> 00:03:09.860
conceal this assumption
behind waves of mathematics,

00:03:09.860 --> 00:03:13.190
but assume that
labor markets clear.

00:03:13.190 --> 00:03:14.495
Well, guess what?

00:03:14.495 --> 00:03:17.390
If labor markets clear,
you don't create net jobs

00:03:17.390 --> 00:03:20.560
by putting people to work.

00:03:20.560 --> 00:03:22.430
There are essentially
no policy makers

00:03:22.430 --> 00:03:25.660
in the country who actually
act as if they believe this.

00:03:25.660 --> 00:03:29.470
I mean, "are we creating local
jobs?" is a central question

00:03:29.470 --> 00:03:31.790
for every policymaker.

00:03:31.790 --> 00:03:33.530
So in that sense,
the calculation

00:03:33.530 --> 00:03:36.470
is correct relative
to what people assume.

00:03:36.470 --> 00:03:41.528
But not necessarily correct
relative to the theories.

00:03:41.528 --> 00:03:47.200
The question
of whether it's worth doing this.

00:03:47.200 --> 00:03:49.850
You know, does cost benefit
analysis on its own terms show

00:03:49.850 --> 00:03:52.550
that it's worth
remediating versus what

00:03:52.550 --> 00:03:56.570
does it show about development
are, as I think people noted,

00:03:56.570 --> 00:03:58.400
two separate questions.

00:03:58.400 --> 00:04:01.220
And what it showed,
you know, is it worth

00:04:01.220 --> 00:04:04.162
remediating is a question that
calls for cost benefit analysis.

00:04:04.162 --> 00:04:05.870
What should be developed
there strikes me

00:04:05.870 --> 00:04:07.790
as probably a more
straightforward

00:04:07.790 --> 00:04:11.760
financial calculation
about development.

00:04:11.760 --> 00:04:13.610
Unless it has
environmental impacts

00:04:13.610 --> 00:04:15.040
we haven't talked about.

00:04:15.040 --> 00:04:19.540
Also, the questions
of who

00:04:19.540 --> 00:04:23.200
should pay versus who benefits
are, again, separate.

00:04:23.200 --> 00:04:26.410
Cost benefit analysis
identifies, again

00:04:26.410 --> 00:04:30.160
in its own terms, is it worth
it for society to do it,

00:04:30.160 --> 00:04:31.670
not who should pay.

00:04:31.670 --> 00:04:34.150
Should the people who benefit
the most from development

00:04:34.150 --> 00:04:37.880
pay for it, is a policy question
about distribution of benefits.

00:04:37.880 --> 00:04:39.804
If you were building
low income housing,

00:04:39.804 --> 00:04:42.220
you would never suggest that
the people who benefited most

00:04:42.220 --> 00:04:44.530
from low income housing
should pay the cost.

00:04:44.530 --> 00:04:48.850
That's not the point
of low income housing.

00:04:48.850 --> 00:04:54.430
The discounting question I think
applies in particular to health

00:04:54.430 --> 00:04:56.860
costs, one of the
other debates--

00:04:56.860 --> 00:04:59.340
Which my co-author, Lisa
Heinzerling particularly

00:04:59.340 --> 00:05:02.530
has highlighted, is that
if you have diseases

00:05:02.530 --> 00:05:04.930
that have a long latency
period, as you might

00:05:04.930 --> 00:05:06.790
well have with Superfund pollution.

00:05:06.790 --> 00:05:10.640
Things that will show up 20,
30, 40 years after exposure,

00:05:10.640 --> 00:05:13.630
do you discount them from the
time when the disease appears

00:05:13.630 --> 00:05:16.842
or from the time when the
risk, the exposure occurs?

00:05:16.842 --> 00:05:18.300
At a high discount
rate, this could

00:05:18.300 --> 00:05:20.020
make a very large difference.

00:05:20.020 --> 00:05:22.330
The government
practice has drifted

00:05:22.330 --> 00:05:25.010
toward the more conservative
approach of discounting

00:05:25.010 --> 00:05:26.260
from when the disease appears.

00:05:26.260 --> 00:05:29.200
But the discourse of
risk that is involved

00:05:29.200 --> 00:05:32.070
seems to point to discounting
from the time when

00:05:32.070 --> 00:05:35.560
the exposure happened, which
makes the harms look much larger.

00:05:35.560 --> 00:05:37.620
LEE: Just expand on that,
because people I think

00:05:37.620 --> 00:05:40.622
were a little fuzzy about
discounting in the first place.

00:05:40.622 --> 00:05:42.080
AUDIENCE: I have
your slides if you

00:05:42.080 --> 00:05:45.220
want one of those particular
slides brought up,

00:05:45.220 --> 00:05:46.615
your discounting slides.

00:05:50.800 --> 00:05:53.570
FRANK ACKERMAN: Well, let
me try just saying it once.

00:05:53.570 --> 00:05:59.530
So discounting applies to cases
where the costs and benefits

00:05:59.530 --> 00:06:00.670
happen in different years.

00:06:00.670 --> 00:06:03.970
No one is indifferent between
whether costs or benefits

00:06:03.970 --> 00:06:07.750
happen now or 10 years from now.

00:06:07.750 --> 00:06:09.670
And so trying to
express everything

00:06:09.670 --> 00:06:12.250
as an equivalent present value,
the farther in the future

00:06:12.250 --> 00:06:17.960
it is, generally the less it
seems like it's worth today.
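
The present-value arithmetic being described can be sketched as follows. The figures here ($100, a 5% rate, a 10-year horizon) are illustrative assumptions, not numbers from the talk:

```python
# Present value: what a future amount is worth today at a given discount rate.
# All numbers below are illustrative assumptions, not figures from the talk.

def present_value(amount: float, rate: float, years: float) -> float:
    """Discount a future amount back to today at a constant annual rate."""
    return amount / (1 + rate) ** years

# $100 received 10 years from now, at a 5% annual discount rate:
print(round(present_value(100, 0.05, 10), 2))  # 61.39
```

The farther out the payment, the smaller the present value, which is exactly the "worth less today" intuition in the explanation above.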

00:06:17.960 --> 00:06:21.400
So if costs and benefits
happen at very different times,

00:06:21.400 --> 00:06:25.390
as they do in many environmental
problems, the rate at which we

00:06:25.390 --> 00:06:26.490
discount the future matters enormously.

00:06:26.490 --> 00:06:26.990
Right?

00:06:26.990 --> 00:06:30.880
We can all agree that getting
paid far in the future

00:06:30.880 --> 00:06:33.160
is worth less than getting
paid the same amount today.

00:06:33.160 --> 00:06:35.140
But how much less is it worth?

00:06:35.140 --> 00:06:38.120
What's the discount rate?

00:06:38.120 --> 00:06:42.100
That will
affect, essentially,

00:06:42.100 --> 00:06:45.730
the trade-off, the
price ratio, between the future

00:06:45.730 --> 00:06:46.910
and the present.

00:06:46.910 --> 00:06:48.670
And so the
higher the discount

00:06:48.670 --> 00:06:50.710
rate, the more
unfavorable that is.

00:06:50.710 --> 00:06:55.360
So one of the issues with the
kind of toxic health hazards

00:06:55.360 --> 00:06:57.820
that arise in this
scenario in particular,

00:06:57.820 --> 00:07:01.810
is that you can be
exposed to them today

00:07:01.810 --> 00:07:05.240
and, you know, many
cancers famously have

00:07:05.240 --> 00:07:08.080
a very long latency
period before there's

00:07:08.080 --> 00:07:10.190
any detectable disease.

00:07:10.190 --> 00:07:13.150
But they come from exposures,
childhood exposures,

00:07:13.150 --> 00:07:16.510
exposures decades earlier.

00:07:16.510 --> 00:07:19.510
People who
immigrate internationally

00:07:19.510 --> 00:07:22.210
have the cancer patterns of
the country they lived in

00:07:22.210 --> 00:07:25.060
before they were 20 generally.

00:07:25.060 --> 00:07:27.270
So even cancers that occur
late in life stem from early exposures.

00:07:27.270 --> 00:07:30.040
So in that case, if you're
discounting the future

00:07:30.040 --> 00:07:34.510
at a high rate, you're exposed
today, you show signs of cancer

00:07:34.510 --> 00:07:36.160
30 years from now.

00:07:36.160 --> 00:07:39.490
Should we treat that as a harm
that was done to you today

00:07:39.490 --> 00:07:40.630
when you were exposed?

00:07:40.630 --> 00:07:43.180
Or a harm that was done to
you 30 years from now when you

00:07:43.180 --> 00:07:45.350
had cancer?

00:07:45.350 --> 00:07:48.579
At a high enough discount rate,
those would be very different.
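
The exposure-versus-onset difference described here can be made concrete with assumed numbers; the $7 million harm, 5% discount rate, and 30-year latency below are illustrative, not figures from the talk:

```python
# Two conventions for discounting a latent harm. All numbers are
# illustrative assumptions ($7M harm, 5% rate, 30-year latency).

harm = 7_000_000   # assumed monetized value of the health harm, dollars
rate = 0.05        # assumed annual discount rate
latency = 30       # assumed years between exposure and onset of disease

# Convention 1: discount from when the disease appears --
# the 30-year delay shrinks the present value substantially.
pv_from_onset = harm / (1 + rate) ** latency

# Convention 2: treat the harm as occurring at exposure --
# no discounting for the latency period.
pv_from_exposure = harm

print(round(pv_from_onset))  # about 1,620,000
print(pv_from_exposure)      # 7000000
```

At this assumed rate the same harm counts for less than a quarter as much under the onset convention, which is the "very large difference" at issue.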

00:07:48.579 --> 00:07:51.453
AUDIENCE: For discounting,
does that only happen when you,

00:07:51.453 --> 00:07:53.848
when the costs and
benefits are only looked

00:07:53.848 --> 00:07:56.243
at for present day people?

00:07:56.243 --> 00:07:59.188
Like if you include future
people in your accounting,

00:07:59.188 --> 00:08:01.079
you no longer have discounting?

00:08:01.079 --> 00:08:01.870
FRANK ACKERMAN: No.

00:08:01.870 --> 00:08:03.640
Then you really
have discounting.

00:08:03.640 --> 00:08:05.950
Because there's no
way for future people

00:08:05.950 --> 00:08:07.640
to be at the table
and make decisions.

00:08:07.640 --> 00:08:12.610
So I mean the decisions are
being made by today's people.

00:08:12.610 --> 00:08:14.170
You don't know what
the future wants.

00:08:14.170 --> 00:08:17.830
You don't know what the future's
willingness to pay will be.

00:08:17.830 --> 00:08:19.770
So we're making
decisions about what we

00:08:19.770 --> 00:08:21.420
think those things are worth.

00:08:21.420 --> 00:08:26.260
And the farther in
the future they are,

00:08:26.260 --> 00:08:30.168
the more we discount
them at a positive rate.

00:08:30.168 --> 00:08:32.490
AUDIENCE: I don't know if we
have time to clarify this.

00:08:32.490 --> 00:08:34.222
I'm wondering why.
I think we can

00:08:34.222 --> 00:08:36.825
make a reasonable assumption,
for air pollution, about

00:08:36.825 --> 00:08:38.806
how much it matters to us.

00:08:38.806 --> 00:08:42.409
The people in the future also
don't want air pollution.

00:08:42.409 --> 00:08:45.620
They relate to air pollution at
least similarly to us, right?

00:08:45.620 --> 00:08:48.480
There wouldn't be
discounting in that case?

00:08:48.480 --> 00:08:49.980
FRANK ACKERMAN:
Well, I mean, people

00:08:49.980 --> 00:08:51.396
who have thought
a lot about this,

00:08:51.396 --> 00:08:53.430
and again, Mark Sagoff,
who Lee mentioned,

00:08:53.430 --> 00:08:55.200
is a philosopher
who's looked at some

00:08:55.200 --> 00:08:57.060
of this, who essentially
concluded we're

00:08:57.060 --> 00:08:58.635
going to create the future.

00:08:58.635 --> 00:09:01.830
If we preserve wild nature
and act like we value it,

00:09:01.830 --> 00:09:04.380
we'll probably have
descendants who value that.

00:09:04.380 --> 00:09:08.460
If we create a world that's
all paved and has strip malls

00:09:08.460 --> 00:09:10.230
and excellent video
games, we'll probably

00:09:10.230 --> 00:09:13.530
create descendants
who value that.

00:09:13.530 --> 00:09:16.590
So it's like, there's
a circularity in that

00:09:16.590 --> 00:09:18.690
what we do today will
actually create the future's

00:09:18.690 --> 00:09:20.130
preferences.

00:09:20.130 --> 00:09:24.630
So there is actually no way to
reason from a hypothesis

00:09:24.630 --> 00:09:28.110
about what the future prefers,
because not only do they not

00:09:28.110 --> 00:09:31.596
exist yet, but we
will create them.

00:09:31.596 --> 00:09:34.190
LEE: We have two
minutes [INAUDIBLE].

00:09:34.190 --> 00:09:35.850
FRANK ACKERMAN: Oh, my God.

00:09:39.530 --> 00:09:41.550
So what do you do
with this refusal

00:09:41.550 --> 00:09:44.885
of cost benefit analysis
for such excellent reasons

00:09:44.885 --> 00:09:48.390
as Lee outlines?

00:09:48.390 --> 00:09:51.030
You know, I think again,
separating in this story,

00:09:51.030 --> 00:09:53.460
separating the cleanup
from the development,

00:09:53.460 --> 00:09:57.570
there might be a stronger
case for the cleanup

00:09:57.570 --> 00:09:59.330
and a weaker one
for development--

00:09:59.330 --> 00:10:03.000
The more you're thinking about
these non-monetized values.

00:10:03.000 --> 00:10:05.310
I think that I've
come to the conclusion

00:10:05.310 --> 00:10:08.530
that despite the validity
of all those critiques

00:10:08.530 --> 00:10:10.770
and the importance of
saying them every time

00:10:10.770 --> 00:10:12.690
you get a chance, that if

00:10:12.690 --> 00:10:14.940
you have more than six
minutes to talk about one

00:10:14.940 --> 00:10:17.220
of these things, you
then have to go

00:10:17.220 --> 00:10:21.390
on and say, using the prevailing
values, what you would get.

00:10:21.390 --> 00:10:26.460
Try to avoid endorsing
them as a sign of,

00:10:26.460 --> 00:10:29.200
you know, yes, we think this
is the greatest idea ever,

00:10:29.200 --> 00:10:30.375
but saying--

00:10:30.375 --> 00:10:32.520
I've ended up
saying using values

00:10:32.520 --> 00:10:35.130
that have become conventional,
here's what you would conclude.

00:10:35.130 --> 00:10:37.752
So in this case,
$35 million damage

00:10:37.752 --> 00:10:39.210
is not very large
if you think it's

00:10:39.210 --> 00:10:40.620
going to kill a few people.

00:10:40.620 --> 00:10:44.210
Because values of life in
the $6 to $8 million range

00:10:44.210 --> 00:10:46.265
have become conventional.

00:10:46.265 --> 00:10:47.640
If we had more
time, I could tell

00:10:47.640 --> 00:10:51.530
you how absurd the basis for the
$6 to $8 million per life is,

00:10:51.530 --> 00:10:53.170
and the paradoxes
that come from that.

00:10:53.170 --> 00:10:58.860
But given that that has become
semi standard in the policy

00:10:58.860 --> 00:11:05.280
discourse, a policy that saves
a few lives predictably

00:11:05.280 --> 00:11:08.970
is clearly worth $35
million to society

00:11:08.970 --> 00:11:10.840
in conventional
cost benefit terms.

00:11:10.840 --> 00:11:15.630
So that kind of
hold-your-nose,

00:11:15.630 --> 00:11:18.480
go-with-the-lesser-evil move,
which American politics is

00:11:18.480 --> 00:11:21.290
so full of, occurs here too.

00:11:21.290 --> 00:11:24.270
LEE: Would that last
sentence, in a sense,

00:11:24.270 --> 00:11:27.480
be adequate to just
say to the governor,

00:11:27.480 --> 00:11:31.790
I think you don't need to spend
money on an elaborate analysis,

00:11:31.790 --> 00:11:35.710
even using the prevailing values
of $6 to $7 million per person.

00:11:35.710 --> 00:11:39.180
We undoubtedly would
save six or seven lives.

00:11:39.180 --> 00:11:42.940
Maybe 60 or 70, maybe 600 or
700 in the course of this.

00:11:42.940 --> 00:11:46.084
So we don't even need
to do any further analysis.

00:11:46.084 --> 00:11:47.625
FRANK ACKERMAN: I
think that's right.

00:11:47.625 --> 00:11:51.150
If it's clear that it
saves a lot of lives,

00:11:51.150 --> 00:11:54.176
and air pollution
often kills people.

00:11:54.176 --> 00:11:55.800
So the things that
reduce air pollution

00:11:55.800 --> 00:11:58.830
are particularly
successful in this.

00:11:58.830 --> 00:12:00.750
There are a handful
of these values

00:12:00.750 --> 00:12:02.220
that have become standard.

00:12:02.220 --> 00:12:04.950
There's been an argument
that you shouldn't

00:12:04.950 --> 00:12:09.420
allow other values, there's
been a very partisan discourse

00:12:09.420 --> 00:12:10.830
about which values are allowed.

00:12:10.830 --> 00:12:13.440
But the handful of values
which have been allowed,

00:12:13.440 --> 00:12:17.410
create strong arguments
for some policies.

00:12:17.410 --> 00:12:19.230
Superfund ones are actually
trickier than air pollution

00:12:19.230 --> 00:12:21.940
for demonstrating the
number of deaths.

00:12:21.940 --> 00:12:24.373
Air pollution so
clearly kills people

00:12:24.373 --> 00:12:26.706
that it just doesn't have a
chance in these cost benefit

00:12:26.706 --> 00:12:28.895
settings.

00:12:28.895 --> 00:12:30.770
LEE: We will stop out
of respect for the fact

00:12:30.770 --> 00:12:32.700
that you have to
study more economics.

00:12:32.700 --> 00:12:35.770
Frank, thank you so much.

00:12:35.770 --> 00:12:39.120
[APPLAUSE]