FRANK ACKERMAN: First, about issues of cost-benefit analysis if we're doing it, and then I'll come back to the issues that Lee outlined about what's wrong with cost-benefit analysis. Because whether or not you believe something is wrong with it, many decisions are made with it.

Well, first of all, I'm impressed at what everyone has done here. In particular, doing this level of presentation cold, and this level of impersonation of participants in the debate, is very impressive. You seem to have grasped most of the things I would end up talking about in an initial discussion of cost-benefit analysis, so let me try to move on to some of the more difficult or advanced issues that I think were touched on indirectly.

The first speaker mentioned willingness to pay versus willingness to accept. The two sound almost the same, but they turn out to be extremely different in practice. Economists who have written about methods have pretty much come to a consensus on willingness to pay. I'm not sure it's necessarily for a great reason. One of the things said is that because people feel strongly about the environment, they might have ridiculously high willingness-to-accept values: they wouldn't accept damage unless you offered them an absurdly large amount of money. But that might be data rather than a problem in the analysis. Nonetheless, the practice has settled on willingness to pay.

There's also a question about what the numbers mean if the costs and benefits are not comparably monetized. Lurking behind the whole discussion is the assumption that cost-benefit results are meaningful because they are comparably complete. If they are not comparably complete, if, for instance, the accounting of costs is much more complete than the accounting of benefits, then you have at best a lower bound rather than a point estimate of the right answer. That's essentially never recognized.
Sometimes there's a throwaway qualification about what the unquantified benefits might mean.

The question of how the economic benefits of job creation are handled is a separate puzzle. It depends on the macroeconomic theories one subscribes to. Cost-benefit analysis is sometimes, but not necessarily, embedded in a theory that assumes free markets reach a state of full employment more or less all the time. Computable general equilibrium models essentially conceal this assumption behind waves of mathematics, but they assume that labor markets clear. Well, guess what? If labor markets clear, you don't create net jobs by putting people to work. There are essentially no policymakers in the country who actually act as if they believe this. "Are we creating local jobs?" is a central question for every policymaker. So in that sense, the calculation is correct relative to what people assume, but not necessarily correct relative to the theories.

Then there's the question of whether it's worth doing this at all. Does cost-benefit analysis, on its own terms, show that it's worth remediating, and what does it show about development? Those are, as I think people noted, two separate questions. Whether it's worth remediating is a question that takes cost-benefit analysis. What should be developed there strikes me as probably a more straightforward financial calculation, unless the development has environmental impacts we haven't talked about.

Also, who should pay versus who benefits are again separate questions. Cost-benefit analysis identifies, again on its own terms, whether it's worth it for society to do something, not who should pay. Whether the people who benefit most from development should pay for it is a policy question about the distribution of benefits. If you were building low-income housing, you would never suggest that the people who benefited most from it should pay the cost. That's not the point of low-income housing.
The discounting question, I think, applies in particular to health costs. One of the other debates, which my co-author Lisa Heinzerling has particularly highlighted, concerns diseases that have a long latency period, as you might well see with Superfund pollution. For harms that will show up 20, 30, 40 years after exposure, do you discount them from the time when the disease appears, or from the time when the risk, the exposure, occurs? At a high discount rate, this could make a very large difference. Government practice has drifted toward the more conservative approach of discounting from when the disease appears. But the discourse of risk that's involved seems to point to discounting from the time when the exposure happened, which makes the harms look much larger.

LEE: Just expand on that, because I think people were a little fuzzy about discounting in the first place.

AUDIENCE: I have your slides if you want one of your discounting slides brought up.

FRANK ACKERMAN: Well, let me try just saying it once. Discounting applies to cases where the costs and benefits happen in different years. No one is indifferent between costs or benefits happening now and happening 10 years from now. So we try to express everything as an equivalent present value, and the farther in the future something is, generally the less it seems to be worth today. If costs and benefits happen at very different times, as they do in many environmental problems, the rate at which we discount the future matters enormously. We can all agree that getting paid far in the future is worth less than getting paid the same amount today. But how much less is it worth? What's the discount rate? That rate effectively sets the trade-off, the price ratio, between the future and the present. And the higher the discount rate, the more unfavorable that trade-off is for the future.
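[A minimal sketch of the present-value arithmetic described above. The 5% rate, the $7 million harm, and the 30-year latency are illustrative assumptions, not figures from the talk.]

    # Present value: PV = FV / (1 + r)^t, where r is the annual discount
    # rate and t is the number of years until the cost or benefit occurs.
    def present_value(future_value, rate, years):
        return future_value / (1 + rate) ** years

    harm = 7_000_000   # hypothetical harm, valued at $7 million
    rate = 0.05        # assumed 5% annual discount rate
    latency = 30       # disease appears 30 years after exposure

    # Counted from the time of exposure (today), there is no discounting:
    print(present_value(harm, rate, 0))        # 7,000,000
    # Counted from when the disease appears, 30 years out, the same harm
    # shrinks to less than a quarter of its face value:
    print(present_value(harm, rate, latency))  # about 1,620,000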
FRANK ACKERMAN: So one of the issues with the kind of toxic health hazards that arise in this scenario in particular is that you can be exposed to them today, and many cancers famously have a very long latency period before there's any detectable disease. They come from exposures decades earlier, childhood exposures. People who immigrate internationally generally have the cancer patterns of the country they lived in before they were 20. So even cancers that occur late in life trace back to early exposures. In that case, if you're discounting the future at a high rate, and you're exposed today but show signs of cancer 30 years from now, should we treat that as a harm done to you today when you were exposed, or a harm done to you 30 years from now when you had cancer? At a high enough discount rate, those would be very different.

AUDIENCE: For discounting, does that only happen when the costs and benefits are only looked at for present-day people? Like, if you include future people in your accounting, do you no longer have discounting?

FRANK ACKERMAN: No, then you really have discounting. Because there's no way for future people to be at the table and make decisions. The decisions are being made by today's people. You don't know what the future wants. You don't know what the future's willingness to pay will be. So we're making decisions about what we think those things are worth, and the farther in the future they are, the more we discount them at a positive rate.

AUDIENCE: I don't know if we have time to clarify this, but I'm wondering why. I think we can make a reasonable assumption about air pollution, about how much it matters to us. The people in the future also don't want air pollution. They relate to air pollution at least similarly to us, right? So there wouldn't be discounting in that case?
FRANK ACKERMAN: Well, people who have thought a lot about this, and again, Mark Sagoff, whom Lee mentioned, is a philosopher who's looked at some of this, have essentially concluded that we're going to create the future. If we preserve wild nature and act like we value it, we'll probably have descendants who value that. If we create a world that's all paved and has strip malls and excellent video games, we'll probably create descendants who value that. So there's a circularity: what we do today will actually create the future's preferences. There's no way to reason from a hypothesis about what the future prefers, because not only do future people not exist yet, but we will create them.

LEE: We have two minutes [INAUDIBLE].

FRANK ACKERMAN: Oh, my God. So what do you do with this rejection of cost-benefit analysis, for such excellent reasons as Lee outlined? Again, separating the cleanup from the development in this story, there might be a stronger case for the cleanup and a weaker one for the development, the more you're thinking about these non-monetized values. I've come to the conclusion that despite the validity of all those critiques, and the importance of stating them every time you get a chance, if you have more than six minutes to talk about one of these things, you then have to go on and say what you would get using the prevailing values. Try to avoid endorsing them, avoid signaling that, yes, we think this is the greatest idea ever. I've ended up saying: using values that have become conventional, here's what you would conclude. So in this case, $35 million in damage is not very large if you think the pollution is going to kill a few people, because values of life in the $6 to $8 million range have become conventional.
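[A back-of-the-envelope version of that comparison, assuming the conventional per-life values cited in the talk; the count of six lives saved is purely hypothetical.]

    # All figures in dollars. The $6-8 million range is from the talk;
    # the midpoint and the six-lives count are illustrative assumptions.
    value_per_life = 7_000_000    # midpoint of the conventional range
    lives_saved = 6               # hypothetical count of lives saved
    benefit = lives_saved * value_per_life
    cost = 35_000_000             # the $35 million damage figure
    print(benefit, benefit > cost)  # 42000000 True: worth it in conventional terms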
FRANK ACKERMAN: If we had more time, I could tell you how absurd the basis for the $6 to $8 million per life is, and the paradoxes that come from it. But given that it has become semi-standard in the policy discourse, a policy that predictably saves a few lives is clearly worth $35 million to society in conventional cost-benefit terms. So the kind of hold-your-nose-and-go-with-the-lesser-evil choice that American politics is so full of occurs here too.

LEE: Would that last sentence, in a sense, be adequate to just say to the governor: I think you don't need to spend money on an elaborate analysis. Even using the prevailing values of $6 to $7 million per person, we undoubtedly would save six or seven lives, maybe 60 or 70, maybe 600 or 700, in the course of this. So we don't even need any further analysis.

FRANK ACKERMAN: I think that's right, if it's clear that it saves a lot of lives. And air pollution often kills people, so the things that reduce air pollution are particularly successful in this framework. There are a handful of these values that have become standard. There's been an argument that you shouldn't allow other values; there's been a very partisan discourse about which values are allowed. But the handful of values that have been allowed create strong arguments for some policies. Superfund cases are actually trickier than air pollution when it comes to demonstrating the number of deaths. Air pollution so clearly kills people that it just doesn't have a chance in these cost-benefit settings.

LEE: We will stop out of respect for the fact that you have to study more economics. Frank, thank you so much.

[APPLAUSE]