[SQUEAKING] [RUSTLING] [CLICKING]

FRANK SCHILBACH: All right. I'm going to get started. This is Lecture 23, about policy with behavioral agents, the last lecture of 14.13, for better or worse.

OK. So what's the plan for today? We're going to talk about paternalism: what is paternalism, reasons for paternalism, why it might be popular or not. We're going to talk about a specific example of what was coined libertarian paternalism -- I'll explain more -- which is the Save More Tomorrow plan, and what's good about this specific example. And we'll discuss that. And finally, we'll talk about some market solutions. One solution that's often mentioned is to say, well, why can't the market solve these kinds of issues? And we're going to talk about why that may or may not be a good idea.

So then, let me just ask very simply to start with: from the classes that you've taken in economics and otherwise, what are some reasons for government policies? It doesn't mean that you necessarily endorse those reasons. But what are the reasons why we might want the government to engage in economic or other sorts of government policies?

So one reason might be that people don't quite know what's good for them, and so we can potentially help them. That's, of course, what we're going to talk about today. Let me rephrase my question. In neoclassical economics, what are reasons for the government to engage? The behavioral ones, of course, we're going to call internalities, which is essentially that people misoptimize in certain ways. But what are the neoclassical reasons for the government to engage?

So this would be things like externalities. This could be individuals causing externalities. Externalities, essentially, is a fancy word for people causing some positive or negative effects on others that they might not be able to internalize. This could be smokers, for example: smoking might cause health effects on others. It could also be farms polluting the environment, or polluting water or some other resources of other people, or polluting the air, and so on, and so forth, which might be bad for others' health. And to the extent that the firm or others are not taking that into account, that will lead to socially suboptimal outcomes. Now the government would say, well, let's tax that, because we want to make sure that people are taking into account these externalities. And with the optimal tax schedule, potentially, these can get back to the [INAUDIBLE] potentially.
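To make that logic concrete, here is a minimal sketch of the Pigouvian argument, assuming a quantity q with private benefit u(q), constant marginal private cost c, and constant marginal external damage e on others (the functional forms are illustrative assumptions):

\[ u'(q_{\text{private}}) = c \quad \text{(private optimum)}, \qquad u'(q^{*}) = c + e \quad \text{(social optimum)}. \]

A corrective tax t = e changes the private first-order condition to

\[ u'(q) = c + t = c + e, \]

so the private choice coincides with q*: the decision-maker now internalizes the damage she imposes on others.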
The usual theorems about free markets essentially say that the free market will achieve efficiency, potentially, under some conditions. But what the market will typically not achieve is some form of equity. And to the extent that we are interested in that, you might want to engage in social security, insurance programs, unemployment insurance, and so on, and so forth.

So if you have lots of competition, prices will often fall down to marginal costs. And that's good for consumers because stuff is cheap. If there's [INAUDIBLE] only one firm, the monopolist in the [INAUDIBLE] case will charge above marginal cost. So prices will be higher, fewer things will be sold, and people will be less happy about that. And then there's antitrust -- in 14.20, as you say, exactly, Nancy Rose and others are working on that -- antitrust commissions, et cetera, trying to essentially make sure there's sufficient competition in markets, because we think that's good for welfare.

Then there are other sorts of market failures. For example, public goods, parks, and so on: the government might want to provide those, because in general, people might under-provide them.
It could also be other things, like information. [INAUDIBLE] information markets tend to fail, the reason being that, when I come up with an idea that's useful for a lot of people, I might not be able to internalize the rents from that, and I'm therefore under-investing in R&D, and so on.

So now I think we have all the things I have on my list. There's macroeconomic policy, as Tom just mentioned. The Federal Reserve might try to [INAUDIBLE] its interest rate, and there's fiscal stimulus. Competition policy, that's what Evan just mentioned. Then there's just redistribution, social insurance. I think that's what Justin said on unemployment insurance, social security, and so on. And then there might be externalities -- that's what Natalie said -- and particular taxes, and other sorts of market failures, such as innovation. That's what Jose, as well, said.

What we're going to focus on now is none of these things. Getting back to what we said at the very beginning, it's internalities. That's not a traditional reason for governments to engage. And if you went back, like, 20 years, there would be no such urge to use such considerations. But more recently, in part, perhaps, as a consequence of the rise of behavioral economics, people are much more interested in internalities.

So what are internalities? It's essentially similar to [INAUDIBLE]. The coinage is very similar. Instead of causing damage to, or not sufficiently internalizing effects on, other people, these are now effects on your future self, essentially. So the consumer is not fully [INAUDIBLE] the costs and/or benefits she imposes on her future potential self. And if the government is now confident that that's the case, well, then the government might want to intervene.
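As a minimal sketch, using the quasi-hyperbolic beta-delta model from earlier in the course (the parameterization is an illustrative assumption): suppose consuming a good gives immediate utility v but imposes a future health cost h on your future self, with present bias \(\beta < 1\) and \(\delta \approx 1\). Today's self consumes whenever

\[ v > \beta h, \]

while the long-run criterion is \(v > h\). The wedge \((1-\beta)h\) is the internality: the part of the cost to the future self that today's self ignores. Exactly parallel to the externality sketch above, a corrective tax \(t = (1-\beta)h\) restores the long-run rule, since

\[ v - t > \beta h \iff v > \beta h + (1-\beta)h = h. \]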
Now, what's paternalism? Paternalism is essentially a version of that. There's a very convoluted definition from the Merriam-Webster dictionary, which I'm not going to read to you. But it is one definition. And there's a better one from David Laibson, which is "An attempt to influence or control people's conduct for their own good. In other words, when the motivation for the intervention is not about externalities." So it's to say, if we try to affect people's behavior for their own good in some way, that assumes, of course, that people are not fully optimizing in certain ways -- not when it comes to their effects on others, [INAUDIBLE] the effects on themselves, [INAUDIBLE]. OK? [INAUDIBLE] There was a question?

Now, when is paternalism warranted? Because [INAUDIBLE] going through the list of topics that we discussed in the class. And it's important to understand that not all of these topics are, in fact, warranting intervention. For example, if you think people have certain risk preferences, reference-dependent preferences, or social preferences, that's not a bias or problem that we need to fix. People care about others, and that's just what their utility function is like. And if people optimize accordingly, that's not necessarily a bias or a problem.

Of course, if there are problems with self-control, problems with people with [INAUDIBLE] inconsistent preferences, we might want to help people. That assumes, of course, that we know which self is the one we want to support.

There's lots of space here, when we think about non-standard beliefs, for people to make mistakes, or for governments and others to intervene. This would be limited attention, also things like memory -- people just forget stuff. Learning failures, where people just have wrong information. Overoptimism, overconfidence, projection and attribution bias: these are all essentially cases where people make mistakes in some ways. And to the extent that they make mistakes, we might want to intervene.

Importantly, and we discussed this already, motivated beliefs are not necessarily a reason, or we're going to be very careful with that.
That is, as we said previously when we talked about health issues, for example: if somebody does not want to know their health status because they want to feel happy, or make themselves think that they are happy -- they don't want to know, because they want to have a few years of their lives in which they can [INAUDIBLE] themselves into thinking they're healthy and happy -- it's not obvious that they're making a mistake. In a sense, they could be making a mistake, because potentially there could be health investments or health behaviors that they could engage in to improve their health. But it could also just be that, look, there's actually not much one can do. And so now, if we intervene, we might actually make things worse, because this is somebody who actually doesn't even want to know what's going on. Often, that's actually the case. So I'm just flagging that, for motivated beliefs, when people derive utility from beliefs, things get tricky quite easily.

And then we talked about non-standard decision-making, gender discrimination. There's a clear case for intervention if people don't get treated the way they should, or any other discrimination. And then we talked about these other aspects here. Of course, defaults, frames, and nudges are not necessarily reasons to intervene. These are more like instruments that we can use to intervene. You can set defaults, or frames, or nudges in certain ways that help people make different choices, in ways that are potentially fairly cheap and also don't necessarily distort everyone's behavior. I'm going to get back to that in a second.

And when it comes to poverty and mental health, these are more aspects where we say, if poverty affects people's decision-making, it causes behavioral biases, or low productivity, and so on, then things like cash transfers [INAUDIBLE] might help. I'm not sure that counts as paternalism.
That's more, like, support programs. And then for things like happiness and mental health, here you would think about whether people are misoptimizing in certain ways -- they underuse psychological services, say -- or whether we think they could be experimenting more, doing more things to improve their own happiness. Perhaps there are some ways in which you could push people to experiment more and try to figure that out.

Notice that I'm careful here in my wording. I'm saying "pushing people to experiment and trying things out," as opposed to pushing people to do certain things. For example, suppose we said people should be working more, or working less. It's very hard for the government to actually understand what's good and bad. But it could be that people are maybe working too much. And you could push them towards thinking about it, and then see: OK, they want to experiment and learn more. Would they perhaps be happier by doing so? Any questions on this?

So then, there are different forms of paternalism that you can think about. There's hard paternalism and there's soft paternalism. What's hard paternalism? Essentially forcing choices on people or changing prices significantly: mandating choices or procedures, outlawing products, and also taxes. So you can think of taxes as changing prices. You can make things cheaper. You can make things more expensive for people. You can subsidize things. You can outlaw [INAUDIBLE].

Hard paternalism is when you're really significantly changing people's optimization problem in pretty harsh ways, by essentially changing prices, potentially making some options infinitely expensive. Right? It's just to say, look, I'm just not allowing you to buy certain goods. You can think of the price as just being infinitely expensive for those goods.
So that's hard paternalism. Now, in contrast, there are other forms: soft paternalism. There are two versions of that. There's what's called libertarian paternalism, which Thaler and Sunstein are known for, which is essentially policies that constructively influence behavior while preserving, or nearly preserving, freedom of choice. That's essentially the nudges that we talked about previously. Essentially, they're saying, you can choose whatever you want, but I'm trying to influence your behavior in ways that push you towards some options or some outcomes that we think are better than others. Right? Notice that the government, or the policymaker, needs to be very clear on what's good and what's bad. The idea is: here's some good behavior that we have identified; let's now try to push you in that direction. But since we're kind of unsure what people want, we're not trying to exclude some options, because you might make things worse for some people; instead, we just try to push you towards that.

Very much related to that is what's called asymmetric paternalism. This is a term from Camerer et al., for policies that help people who make mistakes while interfering minimally with people who behave optimally. That is to say, you want to be careful. If we see people doing certain things, you want to be careful not to make things worse for some people.

Suppose a share of the population, like half, are behavioral agents who make a bunch of mistakes, and the other half of the population doesn't make mistakes at all. Now suppose some people are smoking and some people are not smoking in each of these groups. You might say, well, the smokers are making mistakes -- the behavioral smokers are making mistakes. So let's help them. So we're going to tax smoking. Right?
Now, what's going to happen is, well, you might just improve some outcomes for those people. And that might be good. But there will be other people who just like smoking. And they're just happy, very happy to smoke. And they know that there are health risks, but they decide to smoke anyway. If you now start to tax smoking, you're going to distort their choices and make them worse off.

So in particular, asymmetric paternalism is trying to avoid that by essentially saying, look, we will try to come up with some policies to make things better for some people. But we want to be very careful not to make things worse for others -- in particular, people who are already optimizing anyway. OK?

And libertarian paternalism is a version of that, or it's just very closely related. Which is essentially to say, if you preserve or nearly preserve freedom of choice, then it often follows that you're not going to make things worse for people who are optimizing anyway. Right? Because if somebody is optimizing anyway, and I'm just essentially providing these small nudges one way or the other, then often I'm not going to make things worse for them. That's not necessarily always the case. But it's pretty closely related.

I'm saying there could be some people who just smoke -- they have, like, five cigarettes a day, or one cigarette a day. And suppose there are two types of people. One type of people are essentially making mistakes. They have internalities in the sense that they misunderstand the consequences of smoking for their future self. These are people who make mistakes. Now, those people we might want to help by increasing the price of smoking, by taxing it, or in other ways. OK? But there could be other people who are fully aware of the consequences of smoking for their future health. Those are optimizing. And in their optimization, they essentially decide that smoking is optimal for them. Those people might be made worse off by cigarette taxes. Because essentially, what those taxes do, like any other taxes, is distort people's behaviors. And then you have a situation where we have potentially made some people better off, but other people are worse off. And a lot of people would then be upset or unhappy about this. Because they say, look, those people who screw up their choices, who don't know what they're doing -- we're trying to help them. But now that comes at the cost of others who are actually optimizing. And in some sense, that gets you in trouble. Because there will be unhappy people on the other side of it, and potentially quite a few of them.
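One can write this trade-off as a simple sketch; the notation and numbers here are illustrative assumptions. Let p be the share of behavioral agents, B the average gain a policy creates for each of them, and C the average loss it imposes on each rational agent. A rough welfare criterion is

\[ p \cdot B - (1-p) \cdot C > 0. \]

With \(p = 0.5\), a smoking tax with \(B = 100\) and \(C = 80\) passes on net (\(0.5 \cdot 100 - 0.5 \cdot 80 = 10\)) but creates real losers, while a nudge with \(B = 30\) and \(C \approx 0\) yields \(15\) with essentially nobody made worse off. Asymmetric paternalism is the case where C is close to zero by design.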
All right. So what are the arguments against paternalism? Or, why is paternalism bad? You might say that all sounds pretty good. But now, if you try to actually implement this, what's bad about it?

So one typical argument is to say, well, people make mistakes -- but the government is made out of people. So how do we know that the people who run the government actually know better than people choosing for themselves? And if the government is also making mistakes, then, by choosing for others, it's not clear that it makes things better. It could make things worse. In particular, if you think that the government doesn't have enough information. [INAUDIBLE] I know what's best for me, and so, therefore, I'm best equipped to make decisions for myself. In particular, if there's heterogeneity, a policy might make some people better off, but that might come at the cost of some others. And then it's not clear that, on average or overall, it's making things better.
[INAUDIBLE] very important [INAUDIBLE] in particular that comes up a lot in 14.75, for example, which is Political Economy. Because while we might want to help people, might want to give people money, the government -- the village government or whatever -- gets captured by elites [INAUDIBLE] keep the money for themselves. And so the more resources and power they get, the more opportunity there is for misuse, and corruption, and so on, and so forth.

So one class of arguments is to say, well, even if the government wanted to do the very best for people, if the government wanted [INAUDIBLE] for its citizens, that would be hard to do, because it doesn't know how to do that. There are inherent problems with that because of heterogeneity, knowledge issues, and so on, and so forth.

And then there's a second type of argument, which Chad just mentioned, which is, well, who's to say that the government is actually benevolent and actually wants people to be better off? Maybe they just want to get rich, and so on, and so forth. And so now, if you let them make choices, that opens the door for corruption and misuse of money, and so on, and so forth.

And then there's a slippery slope argument. For some things, it's pretty clear that we can make things better, or at least make some people better off without hurting others. And in some cases you say, let's have the government do that -- set policies that help people. By the way, it doesn't have to be necessarily the government [INAUDIBLE] company [INAUDIBLE]. But then, [INAUDIBLE] once we start with some things, they're going to do more and more things. And at some point, they'll reach a point where the government actually doesn't know what it's doing.
And then it makes things worse. So I have a bunch of different arguments here. I think we mentioned quite a few of them. There are informational issues -- not understanding preferences, people optimizing anyway. As the comments said, or as we discussed previously, some people's choices might be distorted. So these are often issues of heterogeneity. Then there's: the government is terrible and wastes money anyway; the government does not have people's best interests in mind; there's regulatory capture.

And there's also what Chad mentioned, freedom of choice. There are some arguments that freedom of choice matters per se. That's a little trickier to grasp. But you say, look, it's just that I want to choose for myself, and I don't want the government to tell me anything. And that's important by itself. This is not about the outcome. This is about the freedom of choice mattering itself. Then, similarly, there are arguments that people just don't like hard paternalism. They just don't like to be told that certain options are excluded, for various reasons -- perhaps partially because they don't want the government to be powerful.

And then, finally, there are some practical problems, which is: once we agree that we want some paternalistic policies, which policies should we choose? And how do we know which is better than another? I'm going to get back to that in a bit as well. There are also practical implementation problems, which is to say, even if the government wanted to do the right things, the government is often inefficient, and so on, and so forth. So it's too costly to do, and practically, it's just hard to do certain things.

Now, what are the arguments in favor? I'm going to speed things up a bit. The arguments in favor are often very simple. People make mistakes, people screw up, and we can potentially help them screw up less.
People often also don't necessarily like to make choices for themselves. It could be just because they don't like doing it. It could also just be costly to do: there are some activities that are just hard. And I'd rather have somebody regulate this for me, as opposed to me having to make all of these choices for myself. An example would be, suppose there are insurance plans or some forms of retirement plans, and so on. If there are some options that are just terrible, that would be exploiting me and making me worse off, I could probably figure that out if you gave me these plans and I spent, like, a day studying them. But you know, I have better stuff to do. And I'd rather have somebody else screen out the bad plans and then offer me the choices that try not to exploit me, and then maybe make some choices from a limited set. That's not to say I couldn't do it. It's just not efficient for me to deal with these things on my own.

And then finally, and this is, I think, one of the most important arguments, there are a lot of soft paternalism regimes where people can opt out and are potentially unaffected. So when you go back to the previous concerns that people might have, these are often arguments against hard paternalism. These are often things about the government wasting money, and so on, and so forth. If this is about sending reminders, or phrasing letters in one way versus another, or putting apples next to the counter versus candy bars, this is not about the government taking over or restricting people's free choice. It's rather to say, we're just trying to make people better off in some simple ways. And so a lot of these criticisms or arguments go away once you go into soft paternalism territory.

And so then, at the end of the day, I think it's often hard to make arguments for hard paternalism.
Because you need to be very confident [INAUDIBLE], right? It's much easier to argue for soft paternalism: simple policies that are not expensive, that arguably are not hurting a lot of people, and that make at least some people better off.

And I think -- sorry, this is a little hard to see, maybe -- sometimes it's actually quite hard to figure out whether people are making mistakes. This is, for example, a survey from Gallup that looks at Americans, comparing the percentage who call themselves very or somewhat overweight versus the percentage who are overweight or obese by standards from the CDC and so on. And here you see, essentially, that people tend to [INAUDIBLE] more obese. Because while the US tends to be quite high [INAUDIBLE] increasing, people's perception of that might not necessarily track it. Now, in some ways, who are we to say that people should change their behaviors, or that they're making mistakes? So in some situations, at least, it's quite tricky to figure out what's a mistake and what's not, and what we should do or not. And again, you get into tricky issues in particular when it comes to different selves and self-control. Again, who are we to say that somebody should lose weight, or change their behavior in certain ways, when perhaps that's not in their best interest, or that's not what they're doing? Maybe they're fully optimizing. How would we know that that's not the case?

Now, what are some forms of paternalism that are quite popular? There are, in fact, lots of examples of popular paternalism. There's Social Security. There are health care programs for retirees, including Medicare, which tends to be quite popular. There are often restrictions of investment menus in 401(k) savings plans, which is precisely what I was just saying: bad plans and dominated plans are often just excluded.
There are consumer safety regulations: the FDA says essentially that certain drugs cannot be offered. You cannot take certain drugs; the government essentially just sort of [INAUDIBLE] if they're not safe. There's mandatory education. In many places, you have to go to school. You can't just say, my kid is learning in the forest -- that's not an option. That often has to do with protection of minors in some ways.

Then there are cigarette taxes, or sin taxes, as they're often called. That's also hard paternalism -- notice that here, you're changing people's prices. In many places, prostitution and polygamy are banned. These are hard bans: you're just not allowed to marry three spouses. Mandating face masks, interestingly -- that's hard paternalism, as you have, for example, in Cambridge. I think the punishment is $300 if you go outside and don't wear a face mask. I'm pretty sure in Cambridge that's quite popular. Notice, of course, that's not true in all states, and so on. So there might be some policies that are very popular in some places, but not at all in others.

There's also some soft paternalism that tends to be quite popular. In some cases it's not clear whether it's hard versus soft paternalism, or what exactly the definition is. Things like banning junk food in school vending machines: you can call that hard paternalism because you take away options, but of course, it doesn't really prevent you from bringing candy to school anyway. So whether you call that hard or soft paternalism, I don't really care. But that tends to be quite a popular policy. Calorie disclosure laws essentially just say, OK, you have to put out some information for customers -- information which could be easily available anyway; you could just look it up online or the like. And things like default enrollment in 401(k) [INAUDIBLE] all tend to be quite popular.
And for some goods, you know, there are huge taxes. For example, cigarette taxes tend to be extremely high. This is from 2019. Notice that there's huge variation across places. If you look at some of the highest here, such as Connecticut and Massachusetts, it's $3, $4 per pack of cigarettes. In addition, there's also about $1 from the federal government. Then there are some other places, including North Dakota, or North Carolina, and so on, where cigarette taxes are extremely low. That's quite interesting, because it says: here's a policy where, depending on your view of how people make choices, governments make very different decisions. And presumably that has to do with freedom of choice, or the popularity of that choice itself in the specific state. Of course, it also has to do with, potentially, the party, and so on.

Similarly, there's also a lot of variation [INAUDIBLE] state alcohol taxes. In fact, they vary hugely. In particular, you find some states where the taxes per gallon are $20, $30, and some other states where they're essentially close to 0. Notice that there are also a bunch of other "blue laws," as they're referred to, which are things like: on certain days, you're not allowed to sell alcohol, and so on. Now, that's kind of interesting, because those are additional restrictions, which tend to be quite inefficient in various ways. That is to say, some economists would tend to think that if there's certain behavior that we don't want people to engage in, we should tax it. Instead, you could essentially outlaw it in certain ways. That tends to be quite inefficient, because often, people find ways to get it anyway, plus the government does not get any taxes or revenue from that.

Now, what are some unpopular forms of paternalism, or things that people don't like? That's exactly it. One of them is helmets and seatbelts. There, often, there's no externality.
This is essentially just something to protect yourself. They tend to be quite unpopular in many places, and it's not quite clear why that is. Often, it's sort of a thing that's not cool, or there are some social norms where people think it's not a thing you want to do. So explaining to my 14-, 15-year-old niece that she should wear a helmet was very hard. She would say, yes, yes, but then ride her bike around the corner and take it off, and so on.

So that seems to be something about personal trade-offs between risk and reward, where the government, or some people, might just not value the benefits of looking cool or whatever sufficiently, or as much as the individual does. And then people tend to be unhappy about it. Of course, the alternative way of putting this is that the individual is potentially underestimating the risk, or doesn't take into account how bad it would be to actually have a bicycle accident or the like. But that's essentially something about people disagreeing with the government, or with their parents, or the like, about the risk versus reward and the costs and benefits of certain actions. And there seems to be disagreement. You could mandate that, but that leads to unhappiness, because there's precisely that disagreement.

Now, wearing masks is in some sense quite similar. But one thing is quite different here: there are externalities, right? The externalities here are that certain people benefit a lot from having their shops open, or being able to sell things, and others are affected a lot. In particular, the elderly might be disproportionately affected -- the number of people who will die will clearly rise if people are not wearing masks, or if shops are open, if people are running around outside, or meeting, and so on, and so forth. But there are others for whom that's really costly,
because their businesses are not open and the like. Now, there could be entirely rational issues, where you might say, well, people just don't care about certain other groups of society, and therefore you want to have your shop open and you don't want to be told otherwise. There could also be mistakes, in the sense that people just don't necessarily understand that there are these externalities. They might say, well, I'm not affected, and I'll be just fine. And it might be, actually, that the risk for themselves is not that high. But of course, there are large externalities on others that they might not appreciate or take into account.

Exactly. So, as Chad and Carmen are saying, speed limits are similar to that. Because it's sort of, like, I'm driving, but there's [INAUDIBLE] in certain accidents. And so why is that? How do we think about [INAUDIBLE] say?

So I mentioned here, this is an example: here we have [INAUDIBLE] are quite popular. Why is that popular? And why is making certain drugs illegal unpopular? What's the problem there? When you do an economic analysis of that, [INAUDIBLE] you can think this might be an externality. The externalities might actually be much lower than for smoking or drinking, because drunk driving and so on is very lethal. There might also be some externalities from smoking weed. But if the externalities [INAUDIBLE] are as for drinking, well, then there could be [INAUDIBLE]. But, again, for internalities, usually the solution would be just the typical sin taxes, and so on. You can say, well, do you want to tax it? Sure. But then you at least would get some revenue. You would increase the price. You would get some revenue for the government, plus you would lower people's consumption.
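Here is a minimal sketch of that tax-versus-ban comparison; the demand numbers are illustrative assumptions. Suppose consumption at the current price is \(q = 100\) units, and a tax t cuts it to \(q(t) = 50\). The tax yields revenue

\[ R = t \cdot q(t) = 50t, \]

while still halving consumption. A ban targets \(q = 0\), but if a share \(s = 0.4\) of demand is served illegally anyway, consumption is still 40 units, revenue is 0, and the government additionally pays enforcement and incarceration costs. So for a given reduction in consumption, the tax tends to dominate the ban whenever evasion is substantial.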
778 00:33:49,510 --> 00:33:51,010 By making it illegal, essentially we 779 00:33:51,010 --> 00:33:53,630 create a large prisoner population, 780 00:33:53,630 --> 00:33:57,770 and so on, which surely is [INAUDIBLE] society. 781 00:33:57,770 --> 00:34:00,610 So once you sort of have this policy [INAUDIBLE], 782 00:34:00,610 --> 00:34:02,260 there's actually not a lot of reason 783 00:34:02,260 --> 00:34:05,890 for outlawing it-- these sort of tend 784 00:34:05,890 --> 00:34:10,732 to be gut reactions or some sort of judgments 785 00:34:10,732 --> 00:34:11,940 that were made at some point. 786 00:34:11,940 --> 00:34:14,290 And there's some lobby arguing in favor of alcohol 787 00:34:14,290 --> 00:34:17,530 and maybe less of a lobby arguing for other drugs. 788 00:34:17,530 --> 00:34:20,469 But as we said, the sentiment has very much changed. 789 00:34:20,469 --> 00:34:22,719 And it seems to me, sooner or later, pretty much 790 00:34:22,719 --> 00:34:27,530 all states are moving towards legalization, 791 00:34:27,530 --> 00:34:31,969 and that leads to, essentially, [INAUDIBLE] government 792 00:34:31,969 --> 00:34:35,335 revenues as well. 793 00:34:35,335 --> 00:34:37,460 So I think one argument perhaps that one could make 794 00:34:37,460 --> 00:34:40,370 is sort of these gateway-drug arguments, for which 795 00:34:40,370 --> 00:34:43,260 I don't actually think there's that much evidence. 796 00:34:43,260 --> 00:34:46,940 But you say, well, actually, marijuana might not 797 00:34:46,940 --> 00:34:48,679 be so bad by itself, but perhaps it 798 00:34:48,679 --> 00:34:51,080 leads to other [? worries. ?] It's not 799 00:34:51,080 --> 00:34:54,560 obvious at all that legalization of that-- so A, 800 00:34:54,560 --> 00:34:57,139 it's not obvious that that's necessarily true, but B, 801 00:34:57,139 --> 00:35:00,530 it's not obvious that making it legal versus illegal changes 802 00:35:00,530 --> 00:35:01,520 that rationale. 803 00:35:01,520 --> 00:35:04,370 Because when it's illegal, lots of people 804 00:35:04,370 --> 00:35:05,510 are using drugs anyway. 805 00:35:05,510 --> 00:35:07,302 And maybe it makes it even more [INAUDIBLE] 806 00:35:07,302 --> 00:35:10,290 and maybe more dangerous, potentially. 807 00:35:10,290 --> 00:35:12,020 But I think it's important to realize 808 00:35:12,020 --> 00:35:14,990 that some of these policies are very much, in some ways, 809 00:35:14,990 --> 00:35:19,250 arbitrary, based on some legacy or some lobbies 810 00:35:19,250 --> 00:35:20,670 and so on and so forth. 811 00:35:20,670 --> 00:35:22,695 And what's important here is then 812 00:35:22,695 --> 00:35:26,540 that the popularity itself then also will 813 00:35:26,540 --> 00:35:27,973 affect policies in some way. 814 00:35:27,973 --> 00:35:29,390 So like, [INAUDIBLE], as you say, 815 00:35:29,390 --> 00:35:31,820 the public sentiment has shifted in various [INAUDIBLE] 816 00:35:31,820 --> 00:35:35,960 in some ways, stuff will become legal as a result of that. 817 00:35:35,960 --> 00:35:38,270 And once you do it in some states and things seem fine 818 00:35:38,270 --> 00:35:40,100 and the states make a lot of money from it, 819 00:35:40,100 --> 00:35:44,940 a lot of other states eventually will follow suit, in part 820 00:35:44,940 --> 00:35:49,190 because it's also much harder to enforce laws 821 00:35:49,190 --> 00:35:50,900 if you have a bunch of surrounding states 822 00:35:50,900 --> 00:35:52,985 where you can buy stuff legally.
823 00:35:52,985 --> 00:35:54,860 And that makes it, of course, trickier, then, 824 00:35:54,860 --> 00:35:58,280 to enforce the law anyway. 825 00:35:58,280 --> 00:36:01,700 I have some other examples of [INAUDIBLE] 826 00:36:01,700 --> 00:36:08,120 unpopular paternalism-- junk food bans, junk food taxes. 827 00:36:08,120 --> 00:36:10,530 And some of these things, again, are changing over time. 828 00:36:10,530 --> 00:36:15,170 So it may well be that things are different 829 00:36:15,170 --> 00:36:17,000 in 10, 20 years from now. 830 00:36:17,000 --> 00:36:21,800 Gambling laws, pornography laws, other kinds of sin taxes. 831 00:36:21,800 --> 00:36:23,850 Mandating face masks-- again, in some states, 832 00:36:23,850 --> 00:36:25,350 that's just really, really unpopular. 833 00:36:25,350 --> 00:36:27,770 And people are on the streets protesting 834 00:36:27,770 --> 00:36:33,290 against it, which is quite interesting [INAUDIBLE] 835 00:36:33,290 --> 00:36:34,190 might be. 836 00:36:34,190 --> 00:36:38,480 But often it is the case that there are some forms of-- 837 00:36:38,480 --> 00:36:42,380 that there's real economic interest behind that. 838 00:36:42,380 --> 00:36:44,180 And perhaps, [? so ?] again, that 839 00:36:44,180 --> 00:36:46,190 could be very much fully rational, 840 00:36:46,190 --> 00:36:48,260 and it could also be that people misunderstand 841 00:36:48,260 --> 00:36:52,010 the externalities that are at play. 842 00:36:52,010 --> 00:36:54,890 There's some other kinds of hard paternalism that are 843 00:36:54,890 --> 00:36:56,990 also kind of unpopular and also not a particularly 844 00:36:56,990 --> 00:36:59,000 efficient thing to do, which is things 845 00:36:59,000 --> 00:37:01,593 like the ban on 16-ounce sodas, which 846 00:37:01,593 --> 00:37:04,010 is kind of like a weird thing to do if you think about it, 847 00:37:04,010 --> 00:37:08,690 because you can just buy several smaller bottles 848 00:37:08,690 --> 00:37:14,470 or the like, so it tends to be quite inefficient overall. 849 00:37:14,470 --> 00:37:16,940 Here's a sign for seatbelts. 850 00:37:19,970 --> 00:37:22,910 Now, OK, so here's legalization, 851 00:37:22,910 --> 00:37:26,120 where there's also massive variation across states. 852 00:37:26,120 --> 00:37:28,160 Now, there's some nudges. 853 00:37:28,160 --> 00:37:30,380 And some policies have massive impacts. 854 00:37:30,380 --> 00:37:31,950 And we studied at least some of this. 855 00:37:31,950 --> 00:37:35,380 For example, this was automatic enrollment. 856 00:37:35,380 --> 00:37:38,780 There is essentially a huge swing towards that. 857 00:37:38,780 --> 00:37:43,100 Essentially lots of companies and so on are now using that. 858 00:37:43,100 --> 00:37:48,200 And the fraction of people who are participating in retirement 859 00:37:48,200 --> 00:37:51,140 plans is way higher than the fraction of people when 860 00:37:51,140 --> 00:37:52,700 there's voluntary [INAUDIBLE]. 861 00:37:52,700 --> 00:37:57,380 There are other quite simple policies, which 862 00:37:57,380 --> 00:38:03,625 are things trying to get people to appear in court. 863 00:38:03,625 --> 00:38:10,830 So essentially lots of people who 864 00:38:10,830 --> 00:38:14,820 have to appear in court for, like, initial court hearings 865 00:38:14,820 --> 00:38:16,650 tend to not show up at those hearings. 866 00:38:16,650 --> 00:38:18,720 And as you can imagine, that's not a great idea.
867 00:38:18,720 --> 00:38:25,020 Because that essentially leads to really bad consequences. 868 00:38:25,020 --> 00:38:26,760 Now, what you can essentially just do 869 00:38:26,760 --> 00:38:29,763 is try very simple policies. 870 00:38:29,763 --> 00:38:32,055 And there's a very nice paper by [INAUDIBLE] and others 871 00:38:32,055 --> 00:38:33,120 that looks at this. 872 00:38:33,120 --> 00:38:35,370 You can sort of change the forms. 873 00:38:35,370 --> 00:38:37,350 You can phrase things differently. 874 00:38:37,350 --> 00:38:41,200 That essentially tries to nudge people in the right direction. 875 00:38:41,200 --> 00:38:43,735 You can also do things like text message reminders. 876 00:38:43,735 --> 00:38:46,110 And the text message literally just says, your court date 877 00:38:46,110 --> 00:38:49,200 is on May 22 at 9:00 AM. 878 00:38:49,200 --> 00:38:51,000 Please show up. 879 00:38:51,000 --> 00:38:52,830 And you remind people to do that, 880 00:38:52,830 --> 00:38:55,260 and what you get essentially is 881 00:38:55,260 --> 00:38:58,560 large decreases in non-appearances-- 882 00:38:58,560 --> 00:39:02,220 that is, the fraction of people 883 00:39:02,220 --> 00:39:08,040 who don't show up in court, to their own court hearing, 884 00:39:08,040 --> 00:39:12,530 decreases a lot by using very, very simple policies. 885 00:39:12,530 --> 00:39:15,360 Once you do those kinds of policies, presumably, 886 00:39:15,360 --> 00:39:18,030 [INAUDIBLE] not much to say against this policy 887 00:39:18,030 --> 00:39:22,215 in the sense that you might say, well, so A, 888 00:39:22,215 --> 00:39:24,600 you're not restricting anybody's choices. 889 00:39:24,600 --> 00:39:26,580 You're also not really making anybody worse 890 00:39:26,580 --> 00:39:28,980 off by sending some text message reminders. 891 00:39:28,980 --> 00:39:31,015 There might be some small inconveniences 892 00:39:31,015 --> 00:39:32,640 of receiving these text messages, which 893 00:39:32,640 --> 00:39:35,130 are kind of annoying and spam for some people. 894 00:39:35,130 --> 00:39:38,340 But overall, nobody's really made worse off by this. 895 00:39:38,340 --> 00:39:40,240 It's a very cheap thing to do. 896 00:39:40,240 --> 00:39:41,700 So there's not a lot of arguments 897 00:39:41,700 --> 00:39:43,735 against such policies, where you say, look, 898 00:39:43,735 --> 00:39:45,610 here's a policy that's extremely cheap to do. 899 00:39:45,610 --> 00:39:46,860 Some people are benefiting. 900 00:39:46,860 --> 00:39:48,990 Presumably showing up in court is a good thing-- 901 00:39:48,990 --> 00:39:50,670 for your own court hearing. 902 00:39:50,670 --> 00:39:53,430 And it's a cheap thing to do. 903 00:39:53,430 --> 00:39:56,700 And the benefits are potentially large. 904 00:39:56,700 --> 00:39:59,070 Because now we prevent some people 905 00:39:59,070 --> 00:40:01,140 from essentially ending up in jail 906 00:40:01,140 --> 00:40:04,770 for too long or for unnecessary time 907 00:40:04,770 --> 00:40:06,870 because they failed to appear at their hearings. 908 00:40:09,782 --> 00:40:11,490 But so then I think it's also important-- 909 00:40:11,490 --> 00:40:14,032 so there's some examples that essentially show large effects, 910 00:40:14,032 --> 00:40:16,320 including automatic enrollment, including things 911 00:40:16,320 --> 00:40:19,740 like text messages or other interventions 912 00:40:19,740 --> 00:40:22,350 to induce court appearances.
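[To make the cost-benefit logic of the reminder policy concrete, here is a minimal back-of-envelope sketch in Python. Every number in it-- the cost per text, the baseline failure-to-appear rate, the size of the effect, the downstream cost per missed hearing-- is a made-up placeholder for illustration, not an estimate from the paper discussed in lecture.]

# Back-of-envelope cost-benefit of court-date text reminders.
# All numbers below are hypothetical placeholders.

n_defendants = 10_000
cost_per_sms = 0.02     # dollars per reminder sent (assumed)
effect = 0.08           # assumed reduction in failure-to-appear, 8 pp
cost_per_fta = 1_000    # assumed downstream cost (warrants, jail days)

program_cost = n_defendants * cost_per_sms
avoided_ftas = n_defendants * effect
savings = avoided_ftas * cost_per_fta

print(f"Program cost:  ${program_cost:,.0f}")
print(f"Avoided FTAs:  {avoided_ftas:,.0f}")
print(f"Gross savings: ${savings:,.0f}")
print(f"Benefit/cost:  {savings / program_cost:,.1f}x")

[Under these assumed numbers the benefits exceed the costs by orders of magnitude, which is the sense in which such a policy is hard to argue against.]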
913 00:40:22,350 --> 00:40:25,230 But it's important to understand that not everything always 914 00:40:25,230 --> 00:40:25,770 works. 915 00:40:25,770 --> 00:40:27,750 What you see often, or what I've shown 916 00:40:27,750 --> 00:40:32,530 you, are examples of successful interventions-- stuff that works. 917 00:40:32,530 --> 00:40:35,000 But there are some that just don't work at all. 918 00:40:35,000 --> 00:40:37,510 So here's an example of a clinical trial, which is called 919 00:40:37,510 --> 00:40:41,020 the HeartStrong randomized clinical trial, 920 00:40:41,020 --> 00:40:42,978 which gave people essentially a kitchen 921 00:40:42,978 --> 00:40:45,270 sink of things that we think are all good things to do. 922 00:40:45,270 --> 00:40:47,070 This is stuff like [? reminders, ?] 923 00:40:47,070 --> 00:40:49,380 financial incentives, social support, 924 00:40:49,380 --> 00:40:52,560 and so on and so forth, for people, essentially, 925 00:40:52,560 --> 00:40:55,230 who had heart attacks, by giving them 926 00:40:55,230 --> 00:40:58,920 wireless pill bottles, lottery-based incentives, and again, 927 00:40:58,920 --> 00:41:04,020 social support, having friends and family to help them. 928 00:41:04,020 --> 00:41:07,820 And essentially that did exactly nothing for people 929 00:41:07,820 --> 00:41:09,850 and had no effect whatsoever. 930 00:41:09,850 --> 00:41:12,810 So it's not to say that we think everything works, always, 931 00:41:12,810 --> 00:41:15,480 and we should just always nudge people and provide incentives 932 00:41:15,480 --> 00:41:16,810 and so on and so forth. 933 00:41:16,810 --> 00:41:20,265 Something might work in one setting, but there's 934 00:41:20,265 --> 00:41:22,140 no reason to think that it will always work. 935 00:41:22,140 --> 00:41:26,830 In some situations, this might be helpful, and in others not. 936 00:41:26,830 --> 00:41:30,160 More concerningly, there's also a very recent paper 937 00:41:30,160 --> 00:41:34,060 from Stefano DellaVigna that just came out like a week ago 938 00:41:34,060 --> 00:41:37,605 that looks at academic studies and nudge units. 939 00:41:37,605 --> 00:41:38,980 That's to say, what they're trying 940 00:41:38,980 --> 00:41:41,650 to do is look at treatment effects in academic studies 941 00:41:41,650 --> 00:41:44,050 that are published, which are papers 942 00:41:44,050 --> 00:41:48,400 like the stuff that we saw on default effects 943 00:41:48,400 --> 00:41:52,135 and so on-- Carroll et al is the 401(k) savings one, 944 00:41:52,135 --> 00:41:56,200 here is the study on FAFSA assistance that I showed you, 945 00:41:56,200 --> 00:42:00,210 and there's one changing the order of items 946 00:42:00,210 --> 00:42:02,440 in a buffet line for healthier food consumption. 947 00:42:02,440 --> 00:42:04,380 So there's a bunch of papers that usually you 948 00:42:04,380 --> 00:42:06,630 would teach in a typical class. 949 00:42:06,630 --> 00:42:09,900 And what you see here is, on the y-axis, 950 00:42:09,900 --> 00:42:12,300 you can see treatment effects in percentage points. 951 00:42:12,300 --> 00:42:15,150 On the x-axis here you see the control group take-up, 952 00:42:15,150 --> 00:42:17,100 which is to say, if you don't do anything, 953 00:42:17,100 --> 00:42:18,990 what does the control group do? 954 00:42:18,990 --> 00:42:20,787 And what you see is pretty large effects 955 00:42:20,787 --> 00:42:22,620 in many of these studies that are published.
956 00:42:22,620 --> 00:42:24,910 There are some questions about whether this is U-shaped or inverse-U 957 00:42:24,910 --> 00:42:25,680 or whatever. 958 00:42:25,680 --> 00:42:27,373 Let's ignore that for a second. 959 00:42:27,373 --> 00:42:29,790 But overall what you see is pretty large treatment effects 960 00:42:29,790 --> 00:42:30,810 in these situations. 961 00:42:33,430 --> 00:42:38,260 And this is like 26 trials and 71 different nudges here. 962 00:42:38,260 --> 00:42:41,200 When you instead look at nudge units-- so 963 00:42:41,200 --> 00:42:42,100 what are nudge units? 964 00:42:42,100 --> 00:42:43,600 These are essentially, for example, 965 00:42:43,600 --> 00:42:49,010 the UK government has what's called the BIT, the Behavioral 966 00:42:49,010 --> 00:42:52,570 Insights Team, which essentially is a team inside the UK 967 00:42:52,570 --> 00:42:55,900 government that tries to support or nudge people 968 00:42:55,900 --> 00:42:57,030 to make better choices. 969 00:42:57,030 --> 00:43:00,340 And that's very much coming from the nudge movement, 970 00:43:00,340 --> 00:43:03,160 from Thaler and so on, who sort of advised the UK 971 00:43:03,160 --> 00:43:05,080 government to set this up. 972 00:43:05,080 --> 00:43:06,708 And they're now nudging away. 973 00:43:06,708 --> 00:43:09,250 Because the government is really happy about cheap stuff that 974 00:43:09,250 --> 00:43:12,960 doesn't cost very much but potentially has large benefits. 975 00:43:12,960 --> 00:43:17,810 Now what they then also do, often, 976 00:43:17,810 --> 00:43:20,840 is they evaluate their interventions, in part 977 00:43:20,840 --> 00:43:23,390 because they try to figure out what works best. 978 00:43:23,390 --> 00:43:25,190 Now, when you look at these interventions, 979 00:43:25,190 --> 00:43:28,730 you see essentially that the treatment effects 980 00:43:28,730 --> 00:43:30,650 in those kinds of interventions tend 981 00:43:30,650 --> 00:43:32,108 to be much smaller than the ones on the left. 982 00:43:32,108 --> 00:43:33,608 If you look at the left, there's a bunch 983 00:43:33,608 --> 00:43:36,050 of stuff, a massive bunch of points that are pretty high, 984 00:43:36,050 --> 00:43:37,378 above 10%, 20%. 985 00:43:37,378 --> 00:43:39,170 When you look to the right, essentially you 986 00:43:39,170 --> 00:43:42,470 see almost everything is between 0% and 10%. 987 00:43:42,470 --> 00:43:45,380 A lot of it seems to be not statistically significantly 988 00:43:45,380 --> 00:43:48,110 different from 0. 989 00:43:48,110 --> 00:43:51,050 So that should give us some pause in the sense that we 990 00:43:51,050 --> 00:43:54,620 should be careful in necessarily recommending a bunch of stuff 991 00:43:54,620 --> 00:43:56,750 to government and saying we should always nudge 992 00:43:56,750 --> 00:43:59,300 and so on and so forth. 993 00:43:59,300 --> 00:44:00,330 What's going on here? 994 00:44:00,330 --> 00:44:03,318 Well, some of this could be about publication bias. 995 00:44:03,318 --> 00:44:05,735 Essentially there's a bunch of studies that are being run, 996 00:44:05,735 --> 00:44:07,520 and what's being published is essentially 997 00:44:07,520 --> 00:44:11,052 stuff that's highly significant and flashy and important.
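[To see how publication bias alone could generate a gap like this, here is a minimal simulation sketch in Python. The numbers-- a true effect of 2 percentage points, a standard error of 2-- are made up for illustration; they are not taken from the DellaVigna paper. Every simulated trial is an unbiased estimate of the same small true effect, but only statistically significant results get "published."]

import numpy as np

rng = np.random.default_rng(0)

true_effect = 2.0   # true effect in percentage points (assumed)
se = 2.0            # standard error of each trial's estimate (assumed)
n_trials = 10_000

estimates = rng.normal(true_effect, se, n_trials)
# Keep only results significant at the 5% level, as a crude
# stand-in for the publication filter.
published = estimates[np.abs(estimates) / se > 1.96]

print(f"True effect:           {true_effect:.1f} pp")
print(f"Mean published effect: {published.mean():.1f} pp")

[Under these assumptions the mean published effect comes out around twice the true effect, even though no individual study is biased-- the selection on significance does all the work.]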
998 00:44:11,052 --> 00:44:13,010 There could be also issues with implementation: 999 00:44:13,010 --> 00:44:14,960 maybe what's done by researchers 1000 00:44:14,960 --> 00:44:17,970 is sort of implemented better, potentially, 1001 00:44:17,970 --> 00:44:20,010 and maybe what's done at scale is 1002 00:44:20,010 --> 00:44:23,050 just harder to do in nudge units and harder to do carefully. 1003 00:44:23,050 --> 00:44:25,068 And then the effects might be lower. 1004 00:44:25,068 --> 00:44:26,610 That's not to say that governments shouldn't 1005 00:44:26,610 --> 00:44:30,120 nudge, in the sense that, if there are effects 1006 00:44:30,120 --> 00:44:33,240 on the order of magnitude of something like 5 1007 00:44:33,240 --> 00:44:35,970 percentage points, and it's really cheap to do, 1008 00:44:35,970 --> 00:44:37,710 you should probably do it anyway. 1009 00:44:37,710 --> 00:44:40,002 But just the same, nudges are not the magic bullet 1010 00:44:40,002 --> 00:44:42,043 that changes everything, such that the government 1011 00:44:42,043 --> 00:44:43,313 should always engage in them. 1012 00:44:43,313 --> 00:44:44,730 And it's quite important, in fact, 1013 00:44:44,730 --> 00:44:46,230 for governments, or any of these policy units-- 1014 00:44:46,230 --> 00:44:48,910 and firms, by the way, do the exact same thing-- 1015 00:44:48,910 --> 00:44:51,323 to try and figure out what works 1016 00:44:51,323 --> 00:44:53,740 and what doesn't, and not 1017 00:44:53,740 --> 00:44:57,130 just rely on the academic evidence, which might be 1018 00:44:57,130 --> 00:45:00,010 more proof-of-concept rather than something that 1019 00:45:00,010 --> 00:45:02,200 evaluates cost-effectiveness. 1020 00:45:02,200 --> 00:45:03,460 Any questions on this? 1021 00:45:12,420 --> 00:45:15,090 Now, another question you might ask-- 1022 00:45:15,090 --> 00:45:18,907 and you 1023 00:45:18,907 --> 00:45:20,490 might have seen these kinds of letters 1024 00:45:20,490 --> 00:45:23,220 before; this one is a little blurry-- is whether everybody wants to be nudged. 1025 00:45:23,220 --> 00:45:27,600 And so there's a very famous study by Allcott 1026 00:45:27,600 --> 00:45:31,740 and others that looked at energy comparisons. 1027 00:45:31,740 --> 00:45:35,090 It's a study done with Opower, the company, that essentially 1028 00:45:35,090 --> 00:45:37,340 tries to reduce energy usage. 1029 00:45:37,340 --> 00:45:39,050 And the way this works is what you do 1030 00:45:39,050 --> 00:45:42,840 is you send people these kinds of letters. 1031 00:45:42,840 --> 00:45:48,180 And I'm getting these letters too, that sort of say, here's 1032 00:45:48,180 --> 00:45:51,930 your energy consumption in the last month, the last year. 1033 00:45:51,930 --> 00:45:54,383 Here is all of your neighbors'. 1034 00:45:54,383 --> 00:45:55,425 So it's your comparison. 1035 00:45:55,425 --> 00:45:58,480 You see the blue bar is like how much did you use 1036 00:45:58,480 --> 00:46:01,230 and the gray bar is like how much 1037 00:46:01,230 --> 00:46:02,772 did all of your neighbors use. 1038 00:46:02,772 --> 00:46:04,230 And then there's the green bar that 1039 00:46:04,230 --> 00:46:07,390 sort of says how much do efficient neighbors use. 1040 00:46:07,390 --> 00:46:10,960 And then there's smileys, where you have two smileys if you're 1041 00:46:10,960 --> 00:46:12,850 doing great, one smiley if you're doing good, 1042 00:46:12,850 --> 00:46:14,800 like this fellow here.
1043 00:46:14,800 --> 00:46:16,330 And then there's no smiley, I guess, 1044 00:46:16,330 --> 00:46:19,180 if you're not doing well. 1045 00:46:19,180 --> 00:46:21,010 Sometimes there's also like a sad smiley 1046 00:46:21,010 --> 00:46:23,790 if you're using lots of energy and so on. 1047 00:46:23,790 --> 00:46:26,520 And so then essentially what this is trying to do 1048 00:46:26,520 --> 00:46:29,100 is trying to provide some social comparison here. 1049 00:46:29,100 --> 00:46:30,840 It's a very cheap policy to do. 1050 00:46:30,840 --> 00:46:35,670 In particular, when you're sending energy bills anyway, 1051 00:46:35,670 --> 00:46:38,440 it's very cheap to add this at the top of a bill 1052 00:46:38,440 --> 00:46:40,440 that you're sending people anyway. 1053 00:46:40,440 --> 00:46:42,300 And now you're adding this social comparison 1054 00:46:42,300 --> 00:46:45,763 to try to nudge people to use less energy. 1055 00:46:45,763 --> 00:46:47,930 There's a question of why the company might do that. 1056 00:46:47,930 --> 00:46:50,670 But that's a separate issue. 1057 00:46:50,670 --> 00:46:56,790 And what you see here is, when you look at treatment effects, 1058 00:46:56,790 --> 00:46:58,980 there tend to be clear treatment effects-- 1059 00:46:58,980 --> 00:47:02,370 at least some treatment effects: providing 1060 00:47:02,370 --> 00:47:07,710 those kinds of information reduces overall energy 1061 00:47:07,710 --> 00:47:11,450 consumption in the RCTs that people have run. 1062 00:47:11,450 --> 00:47:14,750 Now, one question you might ask is, well, shouldn't we 1063 00:47:14,750 --> 00:47:15,620 then always do this? 1064 00:47:15,620 --> 00:47:17,520 So what's the reason not to do that? 1065 00:47:17,520 --> 00:47:22,780 And it turns out [INAUDIBLE] not everybody wants to be nudged. 1066 00:47:22,780 --> 00:47:25,300 So here's an intervention where now, again, 1067 00:47:25,300 --> 00:47:28,030 Allcott and a co-author have now, in fact, 1068 00:47:28,030 --> 00:47:32,800 asked people, would you like to receive these letters, 1069 00:47:32,800 --> 00:47:34,540 would you like to receive these kinds 1070 00:47:34,540 --> 00:47:35,950 of social comparisons? 1071 00:47:35,950 --> 00:47:40,210 And what they ask is people's willingness 1072 00:47:40,210 --> 00:47:41,140 to pay for it. 1073 00:47:41,140 --> 00:47:44,890 So would you like to have a dollar, versus these reports? 1074 00:47:44,890 --> 00:47:47,380 So a dollar and no reports versus 1075 00:47:47,380 --> 00:47:48,860 just receiving the reports. 1076 00:47:48,860 --> 00:47:50,920 What is people's willingness to pay? 1077 00:47:50,920 --> 00:47:52,450 But they also ask people's willingness 1078 00:47:52,450 --> 00:47:54,530 to pay not to receive these reports. 1079 00:47:54,530 --> 00:47:57,450 That is to say, 1080 00:47:57,450 --> 00:48:06,460 I would ask you, you receive zero dollars and no reports, 1081 00:48:06,460 --> 00:48:08,320 or you receive a dollar and the reports. 1082 00:48:08,320 --> 00:48:11,050 And some people turn the dollar down, which essentially 1083 00:48:11,050 --> 00:48:14,650 implies that they'd rather have a dollar less and no reports. 1084 00:48:14,650 --> 00:48:17,980 They're willing to pay not to receive reports. 1085 00:48:17,980 --> 00:48:19,762 And what you see here on the left 1086 00:48:19,762 --> 00:48:21,220 is essentially people's willingness 1087 00:48:21,220 --> 00:48:22,520 to pay for home reports.
1088 00:48:22,520 --> 00:48:24,020 So there are quite a few people here 1089 00:48:24,020 --> 00:48:25,940 on the right who have positive willingness to pay. 1090 00:48:25,940 --> 00:48:28,070 They think it's kind of useful, it's kind of interesting. 1091 00:48:28,070 --> 00:48:29,987 Maybe it helps them reduce energy consumption. 1092 00:48:29,987 --> 00:48:32,230 Maybe that motivates them in certain ways. 1093 00:48:32,230 --> 00:48:34,390 But some people are just saying, essentially, 1094 00:48:34,390 --> 00:48:36,587 I do not want these reports. 1095 00:48:36,587 --> 00:48:38,420 And the reason being that, essentially, it's 1096 00:48:38,420 --> 00:48:41,870 kind of just annoying to get nagged all the time. 1097 00:48:41,870 --> 00:48:45,270 Maybe in some cases you just don't have a lot of choice. 1098 00:48:45,270 --> 00:48:48,140 And in some cases maybe you just like to use lots of energy 1099 00:48:48,140 --> 00:48:49,230 for whatever reason. 1100 00:48:49,230 --> 00:48:51,230 And it's really annoying to be told all the time 1101 00:48:51,230 --> 00:48:53,525 that your neighbor is doing better than you are. 1102 00:48:53,525 --> 00:48:56,210 It makes people uncomfortable and makes them 1103 00:48:56,210 --> 00:48:58,327 unhappy and so on and so forth. 1104 00:48:58,327 --> 00:48:59,660 Maybe it makes them feel guilty. 1105 00:48:59,660 --> 00:49:01,520 Who knows? 1106 00:49:01,520 --> 00:49:03,950 Maybe they just don't want to think about it. 1107 00:49:03,950 --> 00:49:06,320 But importantly, there are some people who 1108 00:49:06,320 --> 00:49:08,570 just do not want to be nudged. 1109 00:49:08,570 --> 00:49:10,520 And I think that's important. 1110 00:49:10,520 --> 00:49:13,400 Because when you think back to the libertarian 1111 00:49:13,400 --> 00:49:16,520 paternalism or soft paternalism criteria-- 1112 00:49:16,520 --> 00:49:19,910 the asymmetric paternalism that we talked about-- well, 1113 00:49:19,910 --> 00:49:21,740 what asymmetric paternalism would say 1114 00:49:21,740 --> 00:49:24,710 is we should not make anybody worse off. 1115 00:49:24,710 --> 00:49:26,240 So now if we send all these letters 1116 00:49:26,240 --> 00:49:29,360 and actually make people uncomfortable, 1117 00:49:29,360 --> 00:49:32,840 then in fact we are potentially making people worse off. 1118 00:49:32,840 --> 00:49:36,790 And you can think about some other types of interventions. 1119 00:49:36,790 --> 00:49:39,620 If you [INAUDIBLE] letters from MIT all the time 1120 00:49:39,620 --> 00:49:42,644 about how you are doing compared to classmates, 1121 00:49:42,644 --> 00:49:47,960 in terms of study effort, in terms of problem sets, 1122 00:49:47,960 --> 00:49:52,970 [INAUDIBLE] and so on, that would likely 1123 00:49:52,970 --> 00:49:56,120 increase study effort in some ways and would perhaps increase 1124 00:49:56,120 --> 00:49:57,530 learning in some ways. 1125 00:49:57,530 --> 00:49:59,990 But it's not clear at all that we would increase welfare. 1126 00:49:59,990 --> 00:50:01,740 Because maybe everybody would be even more 1127 00:50:01,740 --> 00:50:03,590 stressed and everybody would sort of 1128 00:50:03,590 --> 00:50:07,730 be less happy and so on and so forth, at least in some cases. 1129 00:50:12,220 --> 00:50:15,310 Yeah, actually, so Jenna is asking a question 1130 00:50:15,310 --> 00:50:17,140 about who's unhappy about this.
1131 00:50:24,020 --> 00:50:27,490 So the asymmetric paternalism is just 1132 00:50:27,490 --> 00:50:34,630 to say, suppose there are some people who are just very 1133 00:50:34,630 --> 00:50:37,270 happy with their choice about using a lot of energy, 1134 00:50:37,270 --> 00:50:39,800 and they just don't want to be nudged. 1135 00:50:39,800 --> 00:50:41,850 They're fully rationally choosing this. 1136 00:50:41,850 --> 00:50:43,870 Maybe they know, deep down in their heart, 1137 00:50:43,870 --> 00:50:47,440 that it's bad to ruin the planet and climate and so on and so 1138 00:50:47,440 --> 00:50:48,140 forth. 1139 00:50:48,140 --> 00:50:49,765 But they really made a choice that they 1140 00:50:49,765 --> 00:50:51,130 want to use lots of energy. 1141 00:50:51,130 --> 00:50:52,750 Those people are not the people who 1142 00:50:52,750 --> 00:50:54,550 want to receive any letters. 1143 00:50:54,550 --> 00:50:58,582 So asymmetric paternalism would say we should just 1144 00:50:58,582 --> 00:50:59,290 leave them alone. 1145 00:50:59,290 --> 00:51:01,930 They should just be doing whatever they want to do. 1146 00:51:01,930 --> 00:51:04,510 And us sending them letters to make them feel bad, 1147 00:51:04,510 --> 00:51:06,520 whether they don't change their behavior anyway 1148 00:51:06,520 --> 00:51:08,440 or even if they would change their behavior, 1149 00:51:08,440 --> 00:51:09,502 that's not for us to say. 1150 00:51:09,502 --> 00:51:10,960 And we should just leave them alone, 1151 00:51:10,960 --> 00:51:13,422 because we make them worse off by sending these letters-- 1152 00:51:13,422 --> 00:51:15,880 by revealed preference, they are willing to pay 1153 00:51:15,880 --> 00:51:18,420 not to receive these letters. 1154 00:51:18,420 --> 00:51:21,510 Now, there could be some situations with people 1155 00:51:21,510 --> 00:51:25,060 who have potentially large treatment effects-- 1156 00:51:25,060 --> 00:51:26,790 these are people who use lots of energy 1157 00:51:26,790 --> 00:51:28,470 and just don't know about it. 1158 00:51:28,470 --> 00:51:30,630 And those, you want to sort of send 1159 00:51:30,630 --> 00:51:32,070 letters and push. 1160 00:51:32,070 --> 00:51:34,940 But I think the whole point of the asymmetric paternalism 1161 00:51:34,940 --> 00:51:37,480 is to say we shouldn't make anybody worse off. 1162 00:51:37,480 --> 00:51:40,350 So if somebody reveals to us, by their preference, which 1163 00:51:40,350 --> 00:51:42,660 is what these fellows are doing here, that they really 1164 00:51:42,660 --> 00:51:44,370 don't want to receive these letters, 1165 00:51:44,370 --> 00:51:48,120 we should respect that. 1166 00:51:48,120 --> 00:51:50,850 And then one potential solution here 1167 00:51:50,850 --> 00:51:53,560 is to say, well, why don't we just let people opt out, 1168 00:51:53,560 --> 00:51:55,810 where you just say, OK, you send people these letters, 1169 00:51:55,810 --> 00:51:57,435 and then they have to click on something 1170 00:51:57,435 --> 00:51:58,980 to opt in or out. 1171 00:51:58,980 --> 00:52:00,630 And then maybe people who really don't 1172 00:52:00,630 --> 00:52:05,070 want to receive these letters, we can essentially just let 1173 00:52:05,070 --> 00:52:06,030 them be.
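[As a sketch of why the opt-out option matters for welfare here, consider a minimal calculation in Python. The willingness-to-pay distribution is entirely made up-- it is not the Allcott data-- and, to keep it simple, it sets aside the value of the energy savings themselves and only counts how people feel about receiving the reports:]

import numpy as np

rng = np.random.default_rng(1)

# Hypothetical distribution of willingness to pay (dollars) for the
# home energy reports: slightly positive on average, with a real
# negative tail of people who hate being nagged.
wtp = rng.normal(1.0, 3.0, 100_000)

send_to_all = wtp.sum()             # everyone gets the reports
with_opt_out = wtp[wtp > 0].sum()   # only positive-WTP people keep them

print(f"Consumer surplus, send to all: ${send_to_all:,.0f}")
print(f"Consumer surplus, opt-out:     ${with_opt_out:,.0f}")

[Under these assumptions, the opt-out regime removes exactly the negative-WTP losses, so nobody is made worse off by the nudge-- which is precisely the asymmetric-paternalism criterion.]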
1174 00:52:06,030 --> 00:52:07,920 There's no judgment here in terms 1175 00:52:07,920 --> 00:52:11,600 of is that socially what we want to do 1176 00:52:11,600 --> 00:52:13,300 and is this a good or a bad thing, 1177 00:52:13,300 --> 00:52:16,920 this is just to say we should respect people's preferences 1178 00:52:16,920 --> 00:52:19,105 in certain ways. 1179 00:52:19,105 --> 00:52:19,980 Does that make sense? 1180 00:52:24,530 --> 00:52:27,260 Now the next question you might ask 1181 00:52:27,260 --> 00:52:30,410 is, well, what form should paternalism take? 1182 00:52:30,410 --> 00:52:32,690 And this is important, 1183 00:52:32,690 --> 00:52:36,980 because if you take 401(k) savings policies, 1184 00:52:36,980 --> 00:52:38,980 you can come up with lots of different options 1185 00:52:38,980 --> 00:52:43,380 with which to design them. 1186 00:52:43,380 --> 00:52:45,510 You could have active decisions. 1187 00:52:45,510 --> 00:52:47,940 You can have defaults for participation. 1188 00:52:47,940 --> 00:52:51,280 You can have defaults for asset allocation. 1189 00:52:51,280 --> 00:52:55,130 You could have a personal financial planner sitting down with you. 1190 00:52:55,130 --> 00:52:57,110 You might have-- this is, 1191 00:52:57,110 --> 00:52:58,860 I guess, hard paternalism-- a minimum 1192 00:52:58,860 --> 00:53:01,710 non-zero 401(k) savings rate, which just forces people 1193 00:53:01,710 --> 00:53:02,730 to save. 1194 00:53:02,730 --> 00:53:06,210 And it's important to say it's not just about paternalism 1195 00:53:06,210 --> 00:53:07,170 versus non-paternalism. 1196 00:53:07,170 --> 00:53:10,830 Once you get into the paternalism realm, 1197 00:53:10,830 --> 00:53:12,570 it's actually very hard to figure out 1198 00:53:12,570 --> 00:53:14,010 what we should do because there are 1199 00:53:14,010 --> 00:53:15,093 so many different options. 1200 00:53:15,093 --> 00:53:19,100 And it's hard to tell which one is better than the other. 1201 00:53:19,100 --> 00:53:21,500 More generally, even, it's actually quite hard 1202 00:53:21,500 --> 00:53:24,680 to come up with any non-paternalism option. 1203 00:53:24,680 --> 00:53:28,547 That is to say, think of all the options that are there. 1204 00:53:28,547 --> 00:53:30,630 In some sense, once you think that defaults affect 1205 00:53:30,630 --> 00:53:33,930 people's decisions, then a zero-savings default 1206 00:53:33,930 --> 00:53:35,280 is also a form of paternalism. 1207 00:53:35,280 --> 00:53:37,740 In some sense you choose something for people, 1208 00:53:37,740 --> 00:53:39,930 and that's going to push them towards not 1209 00:53:39,930 --> 00:53:42,240 saving. [INAUDIBLE] there's nothing special 1210 00:53:42,240 --> 00:53:47,340 about the zero default-- [INAUDIBLE] you could also have a 10% savings default. 1211 00:53:47,340 --> 00:53:50,040 And so it's important to understand, 1212 00:53:50,040 --> 00:53:51,850 regardless of what option you choose, 1213 00:53:51,850 --> 00:53:54,660 you're going to affect people's choices and their behaviors 1214 00:53:54,660 --> 00:53:56,020 in certain ways. 1215 00:53:56,020 --> 00:54:00,930 And so that means, essentially, that you 1216 00:54:00,930 --> 00:54:07,080 have to take a stance on how you affect people's behavior. 1217 00:54:07,080 --> 00:54:10,940 In a way, just saying we're just choosing a zero savings default 1218 00:54:10,940 --> 00:54:13,350 is in some sense a cop-out.
1219 00:54:13,350 --> 00:54:15,750 Because even the zero savings default 1220 00:54:15,750 --> 00:54:18,880 is affecting people's behavior quite a bit. 1221 00:54:18,880 --> 00:54:22,320 So in a way, the discussion should rather be about what is 1222 00:54:22,320 --> 00:54:26,610 the optimal way or the optimal paternalism-- soft paternalism, 1223 00:54:26,610 --> 00:54:27,390 I would argue-- 1224 00:54:27,390 --> 00:54:30,480 rather than saying paternalism versus not. 1225 00:54:30,480 --> 00:54:32,685 Because regardless of what option you choose, 1226 00:54:32,685 --> 00:54:34,950 you're going to affect behaviors in certain ways, 1227 00:54:34,950 --> 00:54:37,410 because precisely the way you provide information 1228 00:54:37,410 --> 00:54:39,720 to people, the defaults, the frames, et cetera, 1229 00:54:39,720 --> 00:54:42,640 affects people's choices. 1230 00:54:42,640 --> 00:54:43,560 Does that make sense? 1231 00:54:46,200 --> 00:54:47,040 OK. 1232 00:54:47,040 --> 00:54:51,690 So here's a very quick example of a useful and helpful 1233 00:54:51,690 --> 00:54:54,060 form of libertarian paternalism 1234 00:54:54,060 --> 00:54:56,720 that has been quite successful. 1235 00:54:56,720 --> 00:55:00,270 So Thaler and Benartzi were asked 1236 00:55:00,270 --> 00:55:04,410 to design a 401(k) savings plan for a company that wanted to increase 1237 00:55:04,410 --> 00:55:06,570 its employees' savings. 1238 00:55:06,570 --> 00:55:09,060 So the company hired a financial advisor. 1239 00:55:09,060 --> 00:55:11,070 The employees could sit down with the advisor 1240 00:55:11,070 --> 00:55:13,650 to evaluate their financial situation and make a plan. 1241 00:55:13,650 --> 00:55:17,040 So for those who agreed to talk to the advisor, the advisor-- 1242 00:55:17,040 --> 00:55:18,390 he, I guess, in this case-- 1243 00:55:18,390 --> 00:55:19,860 recommended a savings rate. 1244 00:55:19,860 --> 00:55:21,270 Some agreed to implement this. 1245 00:55:21,270 --> 00:55:23,410 And then Benartzi and Thaler got the leftovers. 1246 00:55:23,410 --> 00:55:26,790 So that's arguably a group that's negatively 1247 00:55:26,790 --> 00:55:30,840 selected with respect to saving a lot. 1248 00:55:30,840 --> 00:55:34,680 Now, what's then the SMarT-- the Save More Tomorrow-- plan? 1249 00:55:34,680 --> 00:55:37,110 Employees were then approached about increasing 1250 00:55:37,110 --> 00:55:40,110 their contribution rate a considerable time 1251 00:55:40,110 --> 00:55:42,000 before the next scheduled pay raise. 1252 00:55:42,000 --> 00:55:44,250 That's to say, before the next scheduled pay raise, 1253 00:55:44,250 --> 00:55:46,740 they were essentially asked whether they 1254 00:55:46,740 --> 00:55:49,050 would like to join voluntarily, and if they 1255 00:55:49,050 --> 00:55:53,010 did, their contribution to the plan 1256 00:55:53,010 --> 00:55:54,540 was going to be increased beginning 1257 00:55:54,540 --> 00:55:57,120 with the first paycheck after the raise. 1258 00:55:57,120 --> 00:55:59,350 So that's to say, I tell you, in December, you're 1259 00:55:59,350 --> 00:56:00,870 going to get a pay raise-- would you 1260 00:56:00,870 --> 00:56:04,020 like to increase your contribution then, 1261 00:56:04,020 --> 00:56:06,600 once you receive that pay raise? 1262 00:56:06,600 --> 00:56:11,700 And so then you could choose a preset maximum.
1263 00:56:11,700 --> 00:56:14,850 And then what would happen is the contribution rate 1264 00:56:14,850 --> 00:56:17,760 would automatically increase over time with each pay raise 1265 00:56:17,760 --> 00:56:19,300 that you would receive. 1266 00:56:19,300 --> 00:56:22,560 Suppose your pay raise is, say, 3%. 1267 00:56:22,560 --> 00:56:24,240 What you would get then is 1268 00:56:24,240 --> 00:56:25,475 only a 2% increase 1269 00:56:25,475 --> 00:56:29,935 in your pay, and 1 percentage 1270 00:56:29,935 --> 00:56:31,560 point of your pay would then 1271 00:56:31,560 --> 00:56:34,140 go towards the retirement contribution. 1272 00:56:34,140 --> 00:56:35,970 Importantly, the employee can opt out 1273 00:56:35,970 --> 00:56:38,610 of the plan at any point in time by essentially just making 1274 00:56:38,610 --> 00:56:41,530 a phone call or the like. 1275 00:56:41,530 --> 00:56:46,540 Now, why is this plan libertarian paternalism? 1276 00:56:46,540 --> 00:56:49,430 Well, because of aspects 2 and 4. 1277 00:56:49,430 --> 00:56:50,055 It's voluntary. 1278 00:56:50,055 --> 00:56:51,560 So you can do whatever you want. 1279 00:56:51,560 --> 00:56:54,010 You don't have to choose this at all. 1280 00:56:54,010 --> 00:56:55,570 You can essentially decline it. 1281 00:56:55,570 --> 00:56:58,130 You can also opt out at any point in time. 1282 00:56:58,130 --> 00:57:01,242 So arguably nobody is made worse off. 1283 00:57:01,242 --> 00:57:02,950 Because essentially what we're doing here 1284 00:57:02,950 --> 00:57:06,440 is increasing people's options by saying, 1285 00:57:06,440 --> 00:57:08,990 here's an option for you to increase your saving 1286 00:57:08,990 --> 00:57:11,150 automatically over time. 1287 00:57:11,150 --> 00:57:14,660 And you can leave it whenever you would like to. 1288 00:57:14,660 --> 00:57:18,200 So arguably nobody should be worse off by this. 1289 00:57:18,200 --> 00:57:22,380 And potentially some people are made better off quite a bit. 1290 00:57:22,380 --> 00:57:23,990 And that's exactly what you see here. 1291 00:57:23,990 --> 00:57:26,930 Essentially, when you look at the people who 1292 00:57:26,930 --> 00:57:28,860 joined the SMarT plan-- 1293 00:57:28,860 --> 00:57:31,690 notice that this is not random here. 1294 00:57:31,690 --> 00:57:34,630 So there are some concerns about identification. 1295 00:57:34,630 --> 00:57:38,103 But setting that aside, if you look at the people who, again, 1296 00:57:38,103 --> 00:57:39,520 arguably, are negatively selected, 1297 00:57:39,520 --> 00:57:44,280 these are people who declined the financial consultant, 1298 00:57:44,280 --> 00:57:46,970 they essentially increased their savings dramatically 1299 00:57:46,970 --> 00:57:49,970 as a consequence of this SMarT plan. 1300 00:57:49,970 --> 00:57:53,150 And now the SMarT plan has been introduced or implemented 1301 00:57:53,150 --> 00:57:55,377 in quite a few places. 1302 00:57:55,377 --> 00:57:56,960 But anyway, that's not the point here. 1303 00:57:56,960 --> 00:57:59,300 I want to say what's good about the plan is, 1304 00:57:59,300 --> 00:58:03,110 well, it's a smart plan not just because of the acronym but also 1305 00:58:03,110 --> 00:58:06,200 because it's using behavioral principles-- stuff 1306 00:58:06,200 --> 00:58:09,750 that you learned in class-- quite carefully. 1307 00:58:09,750 --> 00:58:11,195 So what are these principles? 1308 00:58:11,195 --> 00:58:13,760 So why is this a useful plan to use?
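[Before we go through the behavioral principles, here is a minimal sketch in Python of the escalation mechanics just described. The starting salary, starting contribution rate, raise size, escalation step, and cap are all assumed numbers, chosen only to illustrate the rule:]

# Minimal sketch of the Save More Tomorrow escalation mechanics.
# All numbers (salary, rates, raise size, cap) are assumptions.

salary = 50_000.0
contribution_rate = 0.035   # starting 401(k) contribution rate (assumed)
cap = 0.10                  # preset maximum chosen by the employee (assumed)
raise_pct = 0.03            # annual pay raise (assumed)
step = 0.01                 # 1 percentage point per raise goes to savings

for year in range(1, 6):
    salary *= 1 + raise_pct                    # the raise arrives first...
    if contribution_rate < cap:                # ...then the rate steps up
        contribution_rate = min(contribution_rate + step, cap)
    take_home = salary * (1 - contribution_rate)
    print(f"Year {year}: salary ${salary:,.0f}, "
          f"saving {contribution_rate:.1%}, take-home ${take_home:,.0f}")

# Note that take-home pay still rises every single year, so the
# employee never experiences a nominal pay cut.

[That last comment is the key design feature: the saving increase is always carved out of a raise, never out of the current paycheck.]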
1309 00:58:13,760 --> 00:58:17,210 And I'm going to show you again what the plan is. 1310 00:58:17,210 --> 00:58:18,620 What's clever about this plan? 1311 00:58:18,620 --> 00:58:20,432 What is it using carefully? 1312 00:58:36,265 --> 00:58:38,890 By the way, I don't know if you can hear this background noise. 1313 00:58:38,890 --> 00:58:42,910 My neighbor has engaged in various home improvement 1314 00:58:42,910 --> 00:58:45,817 projects since the quarantine. 1315 00:58:48,499 --> 00:58:53,175 I have so far refrained from complaining about it 1316 00:58:53,175 --> 00:58:55,300 because I don't know what's going on in their lives 1317 00:58:55,300 --> 00:58:55,800 otherwise. 1318 00:58:55,800 --> 00:58:59,680 And maybe that's a way of dealing with stress 1319 00:58:59,680 --> 00:59:02,670 and so on, if you can hear this. 1320 00:59:02,670 --> 00:59:07,620 Once you opt in, the default is for the contribution rate 1321 00:59:07,620 --> 00:59:08,560 to increase. 1322 00:59:08,560 --> 00:59:11,430 So the default is essentially working towards increasing 1323 00:59:11,430 --> 00:59:12,510 people's savings. 1324 00:59:12,510 --> 00:59:14,018 So there's two things going on here. 1325 00:59:14,018 --> 00:59:16,560 One is present bias, when you say, well, yeah, in the future, 1326 00:59:16,560 --> 00:59:17,693 I want to save more. 1327 00:59:17,693 --> 00:59:19,110 So precisely that's kind of great. 1328 00:59:19,110 --> 00:59:22,690 Because now the future self will be virtuous and so on. 1329 00:59:22,690 --> 00:59:25,000 So if people have present bias, that will help. 1330 00:59:25,000 --> 00:59:27,660 And then there's also the part that's 1331 00:59:27,660 --> 00:59:30,060 about paycheck increases. 1332 00:59:30,060 --> 00:59:32,190 Remember, people tend to be loss-averse. 1333 00:59:32,190 --> 00:59:35,460 So what this plan is also accomplishing is this: 1334 00:59:35,460 --> 00:59:37,230 if I ask you right now, today, 1335 00:59:37,230 --> 00:59:39,850 would you like to increase your savings rate, well, not only 1336 00:59:39,850 --> 00:59:41,490 are you present-biased, but you're also 1337 00:59:41,490 --> 00:59:43,180 going to have a cut in your paycheck. 1338 00:59:43,180 --> 00:59:45,315 And people hate cuts in their paychecks. 1339 00:59:45,315 --> 00:59:47,190 So instead, it's saying, well, in the future, 1340 00:59:47,190 --> 00:59:49,470 your raise will be lower. 1341 00:59:49,470 --> 00:59:54,300 And people are less averse to reducing those increases 1342 00:59:54,300 --> 00:59:57,057 compared to experiencing pay cuts. 1343 00:59:57,057 --> 00:59:58,640 So that's, I think, what we have here. 1344 00:59:58,640 --> 01:00:00,750 So essentially present bias is helping. 1345 01:00:00,750 --> 01:00:02,580 Then there's reference dependence. 1346 01:00:02,580 --> 01:00:05,475 And then there are defaults, combined with 1347 01:00:05,475 --> 01:00:08,190 some form of present bias: precisely as we said 1348 01:00:08,190 --> 01:00:09,990 previously, people are not switching out 1349 01:00:09,990 --> 01:00:11,290 of those kinds of plans. 1350 01:00:11,290 --> 01:00:13,172 So if you think it's a good thing 1351 01:00:13,172 --> 01:00:14,880 to increase people's savings, here that's 1352 01:00:14,880 --> 01:00:16,410 [INAUDIBLE], because people might actually 1353 01:00:16,410 --> 01:00:17,970 procrastinate switching out of it 1354 01:00:17,970 --> 01:00:19,990 once they're in in the first place. 1355 01:00:19,990 --> 01:00:22,680 So that seems to be a great example.
1356 01:00:22,680 --> 01:00:29,830 Because not only is it preserving people's free choice 1357 01:00:29,830 --> 01:00:33,640 and not making people worse off, but also it's 1358 01:00:33,640 --> 01:00:38,008 using, in a smart way, behavioral principles. 1359 01:00:38,008 --> 01:00:39,550 Let me briefly talk about the market. 1360 01:00:39,550 --> 01:00:41,925 And we talked about this a little bit already previously, 1361 01:00:41,925 --> 01:00:43,810 but I want to make this explicit. 1362 01:00:43,810 --> 01:00:48,130 So one argument that often comes in particular out 1363 01:00:48,130 --> 01:00:52,677 of [INAUDIBLE] is, well, if people have 1364 01:00:52,677 --> 01:00:54,510 self-control problems or other issues, 1365 01:00:54,510 --> 01:00:57,170 will the market not come up with some goods or some solutions 1366 01:00:57,170 --> 01:00:58,430 that will help people? 1367 01:00:58,430 --> 01:01:00,950 After all, there's money to be made 1368 01:01:00,950 --> 01:01:02,690 in improving people's choices. 1369 01:01:02,690 --> 01:01:04,880 So if self-control problems or the like 1370 01:01:04,880 --> 01:01:06,770 really are important, why is nobody 1371 01:01:06,770 --> 01:01:09,410 coming up with a market solution, a great product that 1372 01:01:09,410 --> 01:01:09,980 helps people? 1373 01:01:09,980 --> 01:01:13,220 And then they're going to make a lot of money from doing so. 1374 01:01:13,220 --> 01:01:16,250 So particularly if consumers are sophisticated 1375 01:01:16,250 --> 01:01:18,830 and demand ways to change their consumption, 1376 01:01:18,830 --> 01:01:21,980 there will be somebody who provides that good. 1377 01:01:21,980 --> 01:01:24,410 And if the welfare effects of those kinds 1378 01:01:24,410 --> 01:01:28,735 of self-control or other distortions are large, 1379 01:01:28,735 --> 01:01:30,610 then you can make a lot of money from helping 1380 01:01:30,610 --> 01:01:32,920 people improve their behavior. 1381 01:01:32,920 --> 01:01:38,990 So what's problematic with this line of argument? 1382 01:01:38,990 --> 01:01:41,625 Well, the market is precisely part of the problem. 1383 01:01:41,625 --> 01:01:43,250 So on the one hand-- and 1384 01:01:43,250 --> 01:01:45,950 this example is from quite a while ago; 1385 01:01:45,950 --> 01:01:50,010 right now this happens in more sophisticated ways, 1386 01:01:50,010 --> 01:01:51,900 and there's lots of targeting going on-- 1387 01:01:51,900 --> 01:01:54,453 if you Google impulsive credit-card spending, what 1388 01:01:54,453 --> 01:01:56,120 you're going to get is a bunch of advice 1389 01:01:56,120 --> 01:01:57,225 to help you with that. 1390 01:01:57,225 --> 01:01:59,600 But at the same time, you also get a bunch of credit card 1391 01:01:59,600 --> 01:02:02,090 offers, or you're going to get targeted eventually 1392 01:02:02,090 --> 01:02:05,690 by companies that try to precisely exploit you. 1393 01:02:05,690 --> 01:02:07,670 So on the one hand, there will be 1394 01:02:07,670 --> 01:02:09,920 a side of the market who will potentially help you, 1395 01:02:09,920 --> 01:02:11,810 but there's another side of the market who 1396 01:02:11,810 --> 01:02:13,520 will want to exploit you. 1397 01:02:13,520 --> 01:02:15,740 And importantly, what the market will do 1398 01:02:15,740 --> 01:02:18,470 is firms will try to target naive people. 1399 01:02:18,470 --> 01:02:20,660 And you might remember this from lectures 5 and 6.
1400 01:02:20,660 --> 01:02:22,550 But essentially firms precisely try 1401 01:02:22,550 --> 01:02:26,750 to find people who are perhaps less sophisticated, financially 1402 01:02:26,750 --> 01:02:27,870 or in other ways. 1403 01:02:27,870 --> 01:02:30,560 So the firms are precisely trying to target naive people. 1404 01:02:30,560 --> 01:02:32,930 Because those are the people you can exploit. 1405 01:02:32,930 --> 01:02:36,440 Moreover, the naive are the ones who will think, 1406 01:02:36,440 --> 01:02:39,290 well, I don't have any problems, so I don't actually 1407 01:02:39,290 --> 01:02:42,240 need any goods or any sort of products that might help me. 1408 01:02:42,240 --> 01:02:44,150 So not only are they being targeted 1409 01:02:44,150 --> 01:02:47,450 by people who try to screw them over, 1410 01:02:47,450 --> 01:02:51,290 but also they actually are reluctant to seek help, 1411 01:02:51,290 --> 01:02:55,790 because precisely they're naive and they think it's not 1412 01:02:55,790 --> 01:02:57,510 even necessary to seek help. 1413 01:02:57,510 --> 01:03:00,360 So even a firm who wants to help has trouble making money 1414 01:03:00,360 --> 01:03:00,860 with this. 1415 01:03:00,860 --> 01:03:02,990 Because people, if they're naive, 1416 01:03:02,990 --> 01:03:07,950 will not necessarily understand that help is required. 1417 01:03:07,950 --> 01:03:11,030 And that's precisely the tension that we see in many markets, 1418 01:03:11,030 --> 01:03:14,088 where there's some firms who might want 1419 01:03:14,088 --> 01:03:16,130 to help and support-- in this case, self-control, 1420 01:03:16,130 --> 01:03:18,740 but it's also true for other issues. 1421 01:03:18,740 --> 01:03:21,110 But there are, on the other side, firms 1422 01:03:21,110 --> 01:03:24,140 that try to break down consumers' self-control 1423 01:03:24,140 --> 01:03:28,010 precisely because they can make a bunch of money from that. 1424 01:03:28,010 --> 01:03:32,730 And so the second part-- this severely limits 1425 01:03:32,730 --> 01:03:34,680 the market's ability to provide self-control-- 1426 01:03:34,680 --> 01:03:36,810 because it essentially [INAUDIBLE]. 1427 01:03:36,810 --> 01:03:40,200 Now, the government of course can try to address this. 1428 01:03:40,200 --> 01:03:42,220 And this is what behavioral IO would be about. 1429 01:03:42,220 --> 01:03:44,310 Behavioral IO-- 1430 01:03:44,310 --> 01:03:45,810 Industrial Organization-- would 1431 01:03:45,810 --> 01:03:48,180 be about saying there are behavioral consumers, and there are 1432 01:03:48,180 --> 01:03:50,460 firms who try to exploit those consumers. 1433 01:03:50,460 --> 01:03:53,130 Now the government is trying, 1434 01:03:53,130 --> 01:03:56,520 by some form of regulation, to help customers: 1435 01:03:56,520 --> 01:04:00,180 in particular, things like the Consumer Financial Protection 1436 01:04:00,180 --> 01:04:04,260 Bureau and so on trying to help consumers not 1437 01:04:04,260 --> 01:04:07,860 get exploited, by banning certain advertising, by capping 1438 01:04:07,860 --> 01:04:09,680 interest rates, and so on and so forth, 1439 01:04:09,680 --> 01:04:14,590 but also by, for example, mandating disclosure 1440 01:04:14,590 --> 01:04:17,640 and so on and so forth.
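[As a rough sketch of why firms profit from exactly the naive consumers-- in the spirit of the pricing models from lectures 5 and 6, with entirely made-up numbers-- consider a contract with a cheap up-front price and a back-loaded fee:]

# Stylized sketch of a firm pricing against naive consumers.
# All numbers are made up for illustration.

signup_bonus = 20.0      # the firm pays you $20 up front (assumed)
late_fee = 35.0          # back-loaded fee per incident (assumed)
true_incidents = 4       # how often the consumer actually incurs the fee
believed_incidents = 1   # how often a naive consumer *thinks* she will

perceived_cost = believed_incidents * late_fee - signup_bonus  # looks cheap
actual_cost = true_incidents * late_fee - signup_bonus         # is not

print(f"Perceived net cost to the naive consumer: ${perceived_cost:.0f}")
print(f"Actual net cost paid:                     ${actual_cost:.0f}")
print(f"Extra profit from naivete:                ${actual_cost - perceived_cost:.0f}")

[A sophisticated consumer foresees the fees and declines or avoids the contract, so under these assumptions the firm's profits come disproportionately from the naive-- which is why a firm selling genuine help to the naive struggles: its target customers don't believe they need it.]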
1441 01:04:17,640 --> 01:04:22,640 Now I'm going to say one final thing about nudges, which is, 1442 01:04:22,640 --> 01:04:24,363 on the one hand, we talk about nudges 1443 01:04:24,363 --> 01:04:26,030 in the sense of saying that, look, there's 1444 01:04:26,030 --> 01:04:28,560 some behavior that we're trying to improve, 1445 01:04:28,560 --> 01:04:31,040 which would be, like, suppose people keep forgetting 1446 01:04:31,040 --> 01:04:32,720 certain things, to take their medication 1447 01:04:32,720 --> 01:04:34,070 or to appear in court. 1448 01:04:34,070 --> 01:04:39,860 Let's send reminders to improve behaviors. 1449 01:04:39,860 --> 01:04:42,920 We can set defaults in certain ways and so on and so forth, 1450 01:04:42,920 --> 01:04:47,540 which is policies to improve good behaviors, assuming 1451 01:04:47,540 --> 01:04:49,850 that the person who makes those choices 1452 01:04:49,850 --> 01:04:52,980 is benevolent in certain ways. 1453 01:04:52,980 --> 01:04:54,920 Now, on the other hand, of course, 1454 01:04:54,920 --> 01:04:57,080 firms can use the same techniques 1455 01:04:57,080 --> 01:05:00,810 or the same approaches to do the exact opposite 1456 01:05:00,810 --> 01:05:01,790 and so harm people. 1457 01:05:01,790 --> 01:05:06,470 And this is what Thaler and Sunstein called sludges, which 1458 01:05:06,470 --> 01:05:10,430 essentially is using these nudges in bad ways, 1459 01:05:10,430 --> 01:05:14,520 trying to push people in wrong directions. 1460 01:05:14,520 --> 01:05:17,450 So an example of that would be offering rebates on a product 1461 01:05:17,450 --> 01:05:22,010 and then requiring customers to mail in forms-- 1462 01:05:22,010 --> 01:05:24,140 and you may have encountered these 1463 01:05:24,140 --> 01:05:26,240 and wondered why you have to mail in this form. 1464 01:05:26,240 --> 01:05:27,410 And this is kind of tedious. 1465 01:05:27,410 --> 01:05:29,504 And then you end up not doing it. 1466 01:05:29,504 --> 01:05:32,570 [INAUDIBLE] want you to think, oh, 1467 01:05:32,570 --> 01:05:35,600 I'm getting this great discount, and then to actually never send it 1468 01:05:35,600 --> 01:05:37,850 in, because then they don't have to actually pay you. 1469 01:05:37,850 --> 01:05:40,623 They can sell you the good, but then they 1470 01:05:40,623 --> 01:05:43,040 don't have to give you the discount at the end of the day. 1471 01:05:43,040 --> 01:05:46,520 So there's a bunch of types of cases, 1472 01:05:46,520 --> 01:05:49,330 both in the private sector but also in the public sector, 1473 01:05:49,330 --> 01:05:51,320 such as voter registration. 1474 01:05:51,320 --> 01:05:53,420 When you think about it, why is voter registration 1475 01:05:53,420 --> 01:05:54,915 harder in certain places? 1476 01:05:54,915 --> 01:05:57,290 Well, the reason why it's hard is because certain parties 1477 01:05:57,290 --> 01:05:59,930 and groups do not want certain types of groups of people 1478 01:05:59,930 --> 01:06:00,860 to vote. 1479 01:06:00,860 --> 01:06:03,140 And those are essentially deliberate ways 1480 01:06:03,140 --> 01:06:06,740 of making it hard for people to make choices, which isn't really 1481 01:06:06,740 --> 01:06:08,990 good for them or for society. 1482 01:06:08,990 --> 01:06:12,710 And that's what Thaler and Sunstein would call sludges.
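[The rebate example boils down to simple arithmetic: the advertised discount is only paid to the fraction of buyers who push through the sludge, so the firm's expected cost is well below the sticker rebate. A sketch with hypothetical numbers:]

# Expected cost to the firm of a mail-in rebate.
# Both numbers below are hypothetical.

advertised_rebate = 50.0   # what the ad promises (assumed)
redemption_rate = 0.30     # fraction of buyers who actually mail the form (assumed)

expected_cost = advertised_rebate * redemption_rate
print(f"Advertised discount:        ${advertised_rebate:.0f}")
print(f"Expected cost per buyer:    ${expected_cost:.0f}")

[Under these assumed numbers, the firm gets the sales boost of a $50 discount while paying, on average, $15 per buyer-- the sludge does the rest.]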
1483 01:06:12,710 --> 01:06:14,360 And then here it's important to realize 1484 01:06:14,360 --> 01:06:17,270 that identifying and eliminating 1485 01:06:17,270 --> 01:06:20,090 such sludges-- which essentially are certain ways that 1486 01:06:20,090 --> 01:06:22,670 make it hard for people to make good choices for themselves 1487 01:06:22,670 --> 01:06:25,580 and for society-- can be just as 1488 01:06:25,580 --> 01:06:28,310 important as implementing nudges 1489 01:06:28,310 --> 01:06:30,640 to improve people's behavior in certain ways. 1490 01:06:30,640 --> 01:06:32,830 So it's not just about taking behavior 1491 01:06:32,830 --> 01:06:35,000 where people, on their own, screw up in certain ways 1492 01:06:35,000 --> 01:06:38,270 and trying to improve it and trying to help them, 1493 01:06:38,270 --> 01:06:43,460 but also about helping people who are being misdirected in certain ways 1494 01:06:43,460 --> 01:06:46,640 and exploited. 1495 01:06:46,640 --> 01:06:48,950 And in many cases, in particular with companies, 1496 01:06:48,950 --> 01:06:54,122 identifying those kinds of sludges and calling companies out 1497 01:06:54,122 --> 01:06:55,580 gets you pretty far, in some sense, 1498 01:06:55,580 --> 01:06:57,320 on social media and the like. 1499 01:06:57,320 --> 01:06:59,450 Once called out, companies often back down 1500 01:06:59,450 --> 01:07:01,400 and try to reduce the sludges because they 1501 01:07:01,400 --> 01:07:04,280 don't want to have a bad reputation for exploiting 1502 01:07:04,280 --> 01:07:05,600 customers. 1503 01:07:05,600 --> 01:07:08,520 OK, so let me sort of summarize the lecture and then briefly, 1504 01:07:08,520 --> 01:07:10,670 in two minutes, the whole class. 1505 01:07:10,670 --> 01:07:13,940 So some forms of, in particular, soft paternalism 1506 01:07:13,940 --> 01:07:15,470 can unambiguously improve welfare. 1507 01:07:15,470 --> 01:07:17,220 And we should be quite excited about that. 1508 01:07:17,220 --> 01:07:18,530 And they can help quite a bit. 1509 01:07:18,530 --> 01:07:21,110 Now, on the other hand, nudges can also make things worse, 1510 01:07:21,110 --> 01:07:22,527 particularly if the government 1511 01:07:22,527 --> 01:07:25,040 doesn't know what's good for people, so we 1512 01:07:25,040 --> 01:07:26,870 want to be careful in nudging. 1513 01:07:26,870 --> 01:07:29,780 And there's some examples [INAUDIBLE] in class. 1514 01:07:29,780 --> 01:07:33,640 But there's also some examples of nudges actually backfiring. 1515 01:07:33,640 --> 01:07:36,330 Now, in addition to that, some people dislike being nudged. 1516 01:07:36,330 --> 01:07:38,710 And we should respect that and take that into account. 1517 01:07:38,710 --> 01:07:41,950 We try to maximize utility, remember, as opposed 1518 01:07:41,950 --> 01:07:46,690 to certain behaviors or take-up of certain outcomes. 1519 01:07:46,690 --> 01:07:49,030 And you want to be careful not to make people worse off. 1520 01:07:49,030 --> 01:07:51,670 And helping people opt out of certain nudges 1521 01:07:51,670 --> 01:07:54,670 or certain [? interventions ?] should be quite helpful. 1522 01:07:54,670 --> 01:07:57,650 In addition, the market does not solve everything. 1523 01:07:57,650 --> 01:08:00,310 So we should not have faith in the market solving things. 1524 01:08:00,310 --> 01:08:04,340 In particular, the market might make things worse.
1525 01:08:04,340 --> 01:08:08,170 And finally, reducing sludges can go a long way 1526 01:08:08,170 --> 01:08:11,050 in improving behavior overall. 1527 01:08:11,050 --> 01:08:14,150 Now, the big-picture summary of this class: 1528 01:08:14,150 --> 01:08:17,660 what is it, in one slide, that I want you to take away and have 1529 01:08:17,660 --> 01:08:18,160 learned? 1530 01:08:18,160 --> 01:08:19,785 Of course, things are more complicated. 1531 01:08:19,785 --> 01:08:22,370 But let me give you a very brief summary. 1532 01:08:22,370 --> 01:08:24,744 So one, psychological considerations 1533 01:08:24,744 --> 01:08:27,032 can be quite important in a lot of economic choices 1534 01:08:27,032 --> 01:08:27,615 and behaviors. 1535 01:08:27,615 --> 01:08:29,859 And we should take them very seriously. 1536 01:08:29,859 --> 01:08:32,109 For you, personally, I think that means, in some ways, 1537 01:08:32,109 --> 01:08:35,020 understanding your own biases, mistakes, and issues better 1538 01:08:35,020 --> 01:08:37,270 and trying to improve your decision-making. 1539 01:08:37,270 --> 01:08:38,810 That seems quite important. 1540 01:08:38,810 --> 01:08:41,750 That's true for short-run or simple considerations, 1541 01:08:41,750 --> 01:08:43,510 such as how to deal with procrastination 1542 01:08:43,510 --> 01:08:45,160 and exercising and so on. 1543 01:08:45,160 --> 01:08:47,957 It's also true for long-run considerations, like which 1544 01:08:47,957 --> 01:08:50,290 jobs you want to have, which friends you want to keep, 1545 01:08:50,290 --> 01:08:51,457 and so on and so forth. 1546 01:08:51,457 --> 01:08:52,874 Those could make you [? both happy ?] 1547 01:08:52,874 --> 01:08:55,000 or unhappy in the future. 1548 01:08:55,000 --> 01:08:59,260 And you want to be quite mindful about that. 1549 01:08:59,260 --> 01:09:02,859 Small changes, or thinking about things for a while, 1550 01:09:02,859 --> 01:09:04,370 can make big differences. 1551 01:09:04,370 --> 01:09:08,170 So I very much encourage you to introspect 1552 01:09:08,170 --> 01:09:10,960 and be more mindful in your choices. 1553 01:09:10,960 --> 01:09:13,990 As I said last time, experimenting, seeking advice, 1554 01:09:13,990 --> 01:09:15,970 including psychological support and so on, 1555 01:09:15,970 --> 01:09:17,823 seem to be quite helpful. 1556 01:09:17,823 --> 01:09:19,990 At the same time, you also shouldn't stress too much. 1557 01:09:19,990 --> 01:09:25,210 Just try to be happy and don't be too harsh on yourself. 1558 01:09:25,210 --> 01:09:27,817 Often, people are doing quite well already overall. 1559 01:09:27,817 --> 01:09:29,859 It's also worth understanding that sometimes it's 1560 01:09:29,859 --> 01:09:31,359 actually easier to help your friends 1561 01:09:31,359 --> 01:09:33,020 or others than yourself. 1562 01:09:33,020 --> 01:09:34,810 It's often easier to see, in others, 1563 01:09:34,810 --> 01:09:37,960 how they're misoptimizing and how to improve their behaviors. 1564 01:09:37,960 --> 01:09:40,359 So in some ways, helping others might, 1565 01:09:40,359 --> 01:09:43,390 in fact, be one of the things you take away 1566 01:09:43,390 --> 01:09:44,979 from this class.
1567 01:09:44,979 --> 01:09:46,840 And then, finally, you want to think about 1568 01:09:46,840 --> 01:09:49,060 psychological considerations in particular when designing teams, 1569 01:09:49,060 --> 01:09:50,979 working with others, incentives, products, 1570 01:09:50,979 --> 01:09:53,470 and so on. If you try to design an app or a website 1571 01:09:53,470 --> 01:09:57,010 or [INAUDIBLE], thinking about psychological considerations 1572 01:09:57,010 --> 01:09:59,830 and designing things [INAUDIBLE] is quite important, 1573 01:09:59,830 --> 01:10:05,220 and you should [INAUDIBLE] take those things into account. 1574 01:10:05,220 --> 01:10:06,880 Somebody is asking about books to read. 1575 01:10:06,880 --> 01:10:08,443 So I just added them here. 1576 01:10:08,443 --> 01:10:10,110 I was planning to say a little bit more, 1577 01:10:10,110 --> 01:10:12,840 but I'm out of time, so I'm just leaving this here. 1578 01:10:12,840 --> 01:10:15,900 I'll try to add a couple of sentences for each [INAUDIBLE]. 1579 01:10:15,900 --> 01:10:19,770 There are some quite amazing books that you can read. 1580 01:10:19,770 --> 01:10:22,940 In particular, for example, Cialdini's '93 book Influence 1581 01:10:22,940 --> 01:10:24,800 is an amazing book to read. 1582 01:10:24,800 --> 01:10:26,760 Then Ariely has really nice books, 1583 01:10:26,760 --> 01:10:28,280 including Predictably Irrational. 1584 01:10:28,280 --> 01:10:31,515 And some of these are more basic than others. 1585 01:10:31,515 --> 01:10:33,350 But if you're interested in poverty, 1586 01:10:33,350 --> 01:10:35,060 Mullainathan and Shafir's book Scarcity 1587 01:10:35,060 --> 01:10:37,880 is really an amazing read. 1588 01:10:37,880 --> 01:10:39,710 And then Danny Kahneman's Thinking, 1589 01:10:39,710 --> 01:10:45,328 Fast and Slow is also extremely insightful and interesting. 1590 01:10:45,328 --> 01:10:47,870 Of course, there are other books. For example, Dick Thaler writes 1591 01:10:47,870 --> 01:10:50,390 about the history of behavioral economics, 1592 01:10:50,390 --> 01:10:53,960 and Michael Lewis wrote The Undoing Project, which 1593 01:10:53,960 --> 01:10:56,210 is about Kahneman and Tversky's friendship 1594 01:10:56,210 --> 01:10:58,670 and how they essentially came, 1595 01:10:58,670 --> 01:11:02,660 in some really important ways, to revolutionize 1596 01:11:02,660 --> 01:11:03,890 some parts of economics. 1597 01:11:03,890 --> 01:11:05,810 So there are lots of things to read. 1598 01:11:05,810 --> 01:11:08,102 If you're interested in any of these, just let me know. 1599 01:11:08,102 --> 01:11:10,310 I'm very happy to chat. 1600 01:11:10,310 --> 01:11:12,380 So that's pretty much all I have to say. 1601 01:11:12,380 --> 01:11:14,330 I had lots of fun teaching this class. 1602 01:11:14,330 --> 01:11:20,180 I know this semester was tricky and difficult for many of us. 1603 01:11:20,180 --> 01:11:22,890 Every time I teach this class, I'm surprised how much I learn. 1604 01:11:22,890 --> 01:11:25,580 I hope you learned some useful things too. 1605 01:11:25,580 --> 01:11:27,240 I guess this is from last time. 1606 01:11:27,240 --> 01:11:29,330 So there are no course evaluations this year. 1607 01:11:29,330 --> 01:11:32,660 We might do an informal one to just [INAUDIBLE] 1608 01:11:32,660 --> 01:11:35,330 try to learn more about what you thought.
1609 01:11:35,330 --> 01:11:38,750 We already did an informal evaluation, 1610 01:11:38,750 --> 01:11:41,320 but that was assuming that there would be actual evaluations 1611 01:11:41,320 --> 01:11:42,360 this semester. 1612 01:11:42,360 --> 01:11:44,602 Unfortunately, MIT has canceled them this year, 1613 01:11:44,602 --> 01:11:46,310 I think to be mindful of some people who 1614 01:11:46,310 --> 01:11:50,810 are particularly affected by the quarantine and so on. 1615 01:11:50,810 --> 01:11:54,010 Come to my office hours if you have questions. 1616 01:11:54,010 --> 01:11:55,860 I have some UROP opportunities. 1617 01:11:55,860 --> 01:11:58,190 Some of you have already applied for those. 1618 01:11:58,190 --> 01:12:00,210 There's been quite a bit of interest. 1619 01:12:00,210 --> 01:12:03,920 So if you have already applied, I'll get back to you soon. 1620 01:12:03,920 --> 01:12:07,095 But in any case, come and talk to me if you're interested. 1621 01:12:07,095 --> 01:12:09,470 I hope you learn more economics in the future, [INAUDIBLE] 1622 01:12:09,470 --> 01:12:11,300 help you take psychological considerations 1623 01:12:11,300 --> 01:12:14,240 in economic issues quite seriously. 1624 01:12:14,240 --> 01:12:16,390 So thank you very much.