PROFESSOR: I--

LAWRENCE SUSSKIND: I think, collectively, I did not hear a full-fledged justification of the precautionary principle. And I think that focusing on the claim that its application would produce the greatest good for the greatest number of people is probably manifestly wrong, because there are plenty of instances where being precautionary means that you look out for subgroups, or for sub-effects which are, in fact, very unlikely.

In a world of certainty, one can argue that utilitarian ethics applies. If you ban a chemical, you're going to have a certain cost and a certain benefit. But as soon as the cost becomes uncertain, or the benefit becomes uncertain, the picture changes. The benefit side is tied to risk assessment, because the benefit of controlling the chemical is derived from doing some sort of risk assessment, which has high uncertainties in it. And that's not the only thing that has uncertainties; costs have uncertainties, too. So as soon as you depart from a well-defined benefit and a well-defined cost, utilitarian ethics is no longer what matters. What matters is how you value the mistakes that you could make.

A type I error occurs when you fail to regulate something, and it later turns out to be worse than you thought, or out at the tails of the distribution you had in mind. A type II error is where you regulate, things turn out to be not as bad as you thought, and you have imposed a higher cost than was necessary. But there is also a type III error, which is that you're working on the wrong problem. And I think that's the point you all really missed: when you ban a chemical, that isn't all that happens. What happens is that you come face to face with a substitute, with an alternative technology, with alternative ways to meet the public's needs. So a straight cost-benefit analysis, even one with uncertainty, is not a complete analysis unless it includes the alternatives that you have.
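[To make the error taxonomy concrete, here is a minimal sketch in Python. Every probability and cost below is hypothetical, chosen only to illustrate the bookkeeping: the type I error risk sits on the "allow" row, the type II error risk on the "ban" row, and the type III point is that an omitted substitute can change the answer entirely.]

    # Minimal sketch of the type I / II / III framing; all numbers invented.

    p_harmful = 0.2  # subjective probability the chemical is seriously harmful

    # Social cost of each action in each state of the world (arbitrary units).
    costs = {
        "allow": {"harmful": 100.0, "benign": 0.0},   # type I risk: fail to regulate
        "ban":   {"harmful": 10.0,  "benign": 10.0},  # type II risk: over-regulate
    }

    def expected_cost(action):
        c = costs[action]
        return p_harmful * c["harmful"] + (1 - p_harmful) * c["benign"]

    for action in list(costs):
        print(action, expected_cost(action))   # allow: 20.0, ban: 10.0

    # The type III error: the option set was never just {allow, ban}.
    # A (hypothetical) substitute technology can dominate both.
    costs["substitute"] = {"harmful": 5.0, "benign": 5.0}
    print("substitute", expected_cost("substitute"))   # 5.0

[The point of the sketch is only that ranking "allow" against "ban" is beside the point once the substitute row exists.]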
LAWRENCE SUSSKIND: And you can make anything look good depending upon how you value life or value the environment. So the precautionary principle, properly applied, which is to say when you have great uncertainties, has to ask the question: what are the alternatives to the action that you've decided to promote? Are there alternatives? Are there substitutes?

I'll give you one more comment. I know the work of Cass Sunstein well. I graduated from [INAUDIBLE], the same law school. I think he is a poor scholar. He does not understand--

SPEAKER 1: You heard it here.

LAWRENCE SUSSKIND: He does not really understand what this whole issue is about, and he has never really understood it. For example, people say: suppose you regulate vinyl chloride. You put worker controls on vinyl chloride, and the workers now earn a smaller salary because the vinyl chloride producer can't afford to pay them. So what do they do? They quit their health clubs. They buy less lean hamburger. And the health impact of that, Sunstein argues, is worse than the hazard you regulated, because you made the workers poorer, and now they eat fatty hamburger and don't exercise.

But that doesn't take into account the fact that when you regulate vinyl chloride, something replaces it; let's say polyethylene. Those industries increase production and increase the wages of their workers, who now join health clubs and buy lean hamburger. If you're really going to take this calculus to the extreme, you can see how ridiculous it is, because you can't really determine the net effect. But what you can determine, and what regulation shows when it's stringent, is that you don't just get marginal changes in production. You may see serious shifts in the nature of industrial policy, and you shift to nonchemical alternatives. For example, the banning of CFCs didn't just supply one propellant that was slightly better; people started to use pump cans. If you sell less product, that's too bad for the producer.
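[The "richer is safer" bookkeeping being criticized here can be written out in a few lines. This is a toy sketch with invented numbers, meant only to show that counting the income loss in the regulated industry while omitting the mirror-image gain in the substitute industry flips the sign of the conclusion.]

    # Toy version of the calculus above; every number is invented.

    # Health effects of regulating vinyl chloride, by channel
    # (positive = health improved, arbitrary units).
    direct_benefit_of_control = 4.0   # workers no longer exposed
    income_loss_vc_workers = -5.0     # poorer VC workers, worse diet/exercise

    # Counting only these two channels, regulation looks net-negative:
    print(direct_benefit_of_control + income_loss_vc_workers)    # -1.0

    # But production shifts to a substitute (say, polyethylene), whose
    # workers see roughly the mirror-image income gain:
    income_gain_substitute_workers = 5.0
    print(direct_benefit_of_control
          + income_loss_vc_workers
          + income_gain_substitute_workers)                      # 4.0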
LAWRENCE SUSSKIND: But you don't need an aerosol propellant to deliver deodorant and those sorts of products. So if the analysis is narrowly defined, you work on the wrong problem, and you ignore how you feel about error avoidance. That's where the precautionary principle comes in.

SPEAKER 1: Beautiful.

AUDIENCE: --go along with it, and bring up the point that I think there is a middle ground, where for some things we should look into alternatives and keep them in the back seat, in case we find that whatever precautionary measure we took didn't work out. I think it's really dangerous to just ignore the other alternatives. Like the Susskind article said, going with that one action leads to unintended things, when the alternatives might have been a benefit.

LAWRENCE SUSSKIND: Take the global warming situation. This middle ground, I think, is a fiction. You either have to regard the probability of serious global impacts as within the realm of possibility, or you don't. Even with people who aren't certain about the science, you ask them the question: would planning for rising sea level, planning for floods and droughts, be a bad thing to do? And the answer invariably comes back: no, it would be a good thing to do. And if you act in a timely fashion, even if you turn out to be wrong, you've lost little; imagine what would happen if you turned out to be right and the coastal cities were disappearing [INAUDIBLE].

I think the precautionary principle has been criticized by people who have an economic dog in that fight: it's too expensive, and their particular industry will disappear. But the alternative is, generally, to have other people bear the cost. So who bears the cost, and who reaps the benefit, of the decided policy? That's very important. And the distributional effects of going one way or the other are really serious. If you decide to put a carbon tax on oil, let me tell you, the petroleum industry will survive. If you don't put a carbon tax on at all, it's not clear that the planet will survive.
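[The "even if you turn out to be wrong" argument is, in decision-theoretic terms, a claim about asymmetric regret. Here is a minimal sketch with invented payoffs showing why the precautionary choice can be defensible even when the probability of the bad state is disputed.]

    # Sketch of the asymmetry argument; payoffs are invented.

    # States: climate impacts turn out "severe" or "mild".
    # Actions: "prepare" (plan for sea level rise, floods, droughts) or "wait".
    costs = {
        "prepare": {"severe": 10.0, "mild": 10.0},    # adaptation spending either way
        "wait":    {"severe": 1000.0, "mild": 0.0},   # coastal losses if severe
    }

    states = ["severe", "mild"]

    def max_regret(action):
        # Regret in a state = this action's cost minus the best action's cost there.
        return max(costs[action][s] - min(costs[a][s] for a in costs)
                   for s in states)

    for action in costs:
        print(action, "worst case:", max(costs[action].values()),
              "max regret:", max_regret(action))
    # prepare: worst case 10, max regret 10; wait: worst case 1000, max regret 990.
    # Minimizing maximum regret picks "prepare" without settling the probability.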
LAWRENCE SUSSKIND: So you ask whose ox is being gored, how certain we are that we are producing an irreversible effect on these people, and whether you want to keep your options open. So I reject the idea that what you should be trying to do is find a middle ground.

LAWRENCE SUSSKIND: But I want to turn the dial, as a last note, one more tick beyond what Nick just said. I believe, to take the climate change example, that if you say, "let's take this seriously because it might happen," then a whole series of second and third order decisions can get made that, in fact, wouldn't get made otherwise. It's related to the point you were making first. And that will create all kinds of benefits, because even if people don't act in order to create those benefits, they'll end up creating them anyway. But they never would have searched; they never would have tried; they never would have looked there.

So when we talk to communities about taking various moves to manage potential climate risks, you do it in a way that creates all kinds of other second and third order benefits now. Instead of just looking at what the costs are and arguing that the long-term benefits outweigh the short-term costs, my argument is: why don't you try to do something that deals with those risks and also creates all kinds of short-term benefits? That becomes a design consideration in formulating the policy. And if you can't do it, you can't do it. But I would argue that the problem with the precautionary principle is that it doesn't link the decision not to do something to the question of how to use the time that decision buys. I told you the story about the central Arctic fisheries question. You can use a period of time to do more scientific investigation that might in fact lead us to understand better ways of using those resources, as opposed to: oh my god, we're scared, a bad thing could happen. It might only be a 1% chance, but, oh my god, if it happens, horrible, horrible things will affect us in the future. So let's not do it.
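[The Arctic fisheries point is, in effect, a value-of-information argument: a moratorium buys time in which research can resolve the uncertainty before an irreversible choice is made. A rough sketch, with hypothetical probabilities and payoffs, assuming for simplicity that research fully reveals the true state:]

    # Rough value-of-information sketch for the moratorium point.
    # All probabilities and payoffs are hypothetical.

    p_collapse = 0.3        # chance that fishing would collapse the stock
    v_fishing = 100.0       # value of opening the fishery
    loss_collapse = -400.0  # loss if the stock collapses

    # Decide now, without further science:
    ev_open_now = p_collapse * loss_collapse + (1 - p_collapse) * v_fishing
    ev_moratorium = 0.0     # forgo the catch for now
    print(ev_open_now, ev_moratorium)   # -50.0 vs 0.0 -> don't open

    # Decide after research that reveals the true state: open only
    # if the stock turns out to be robust, otherwise stay out.
    ev_with_info = p_collapse * 0.0 + (1 - p_collapse) * v_fishing
    print(ev_with_info)                 # 70.0

    # The research period is worth up to 70 - 0 = 70 here: time bought
    # for science, not just fear of a bad outcome.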
LAWRENCE SUSSKIND: But the precautionary principle has been linked to alternatives analysis. Nowadays, nobody fails to ask: what are the alternatives? And by the way, the big failure: Bush and Cheney decided, "we're going to get Saddam before he gets us." He was operating under the precautionary principle. Look at what the application of that precautionary principle brought us.

SPEAKER 1: If you only say "let's look at alternatives," that's not the same as "let's search for alternatives"--

LAWRENCE SUSSKIND: Right.

SPEAKER 1: --that both reduce the risk and create a variety of alternatives.

LAWRENCE SUSSKIND: And develop those alternatives.

SPEAKER 1: And let's be clear about trying to distribute the benefits of those alternatives in a progressive way. You could add that, too. Anyway. Thank you both for coming.