[SQUEAKING] [RUSTLING] [CLICKING]

FRANK SCHILBACH: Let me get started on lecture 20. I'm going to finish up with what we discussed last time, which is lecture 19, and then we're going to talk about lecture 20, which is about malleability and inaccessibility of preferences. For now, we're going to talk a little bit more about defaults and frames, and nudges in particular.

Where we left off last time was default effects. We talked about those in particular in retirement savings. A default is essentially the option that applies if you do nothing, for example what happens if you do nothing in a retirement savings account. Setting a certain default option can have very powerful effects on people's behavior in a domain that's really important, in this case savings, and in a domain where traditional economic tools have not gotten very far: matching contributions, various types of financial education, and so on just don't do very much, are expensive, and struggle to change behavior. Using defaults, in contrast, is really powerful and can cause large changes in behavior, and large changes in outcomes, even years later.

Let me talk a little bit about what the optimal default decision regime is. That's getting into tricky territory. We talked about this a little last time, and we'll talk about it again in the last lecture, lecture 23, when we talk about policy. So when do you think about active choice versus a default, and what is active choice? Active choice, to remind you, is when somebody is asked at the beginning, or asked directly: you have to make a choice, and I'm more or less forcing you to make it. [INAUDIBLE] the company hires somebody, and in the hiring package there is a form that you need to fill out, and otherwise you cannot really start [INAUDIBLE] company. Sometimes that's not really enforceable, but effectively everybody makes some choice.
That's active choice, versus a default, which is just: if you do nothing, you either get zero retirement savings or some positive amount. Now, if people are really different, then active choice is a great thing. If people really have different preferences, we shouldn't just push them in one direction or the other; they should choose, and everybody chooses on their own what's best for them. In contrast, if you think most people are the same anyway, and in addition people don't really know what's best for them, then it might be better to just provide one default [INAUDIBLE] that's best for most people. Most people will stick with it anyway, because these defaults tend to stick. And if people don't know what they want or what's good for them, then letting them choose actively might make things worse anyway. That's the fundamental trade-off, and people have come down on different sides of it. It depends on your view about the sophistication and so on of the customers who are [INAUDIBLE].

One big and important issue with defaults, and with nudges in general, which I'm going to talk about in a second, is that we want to make sure we don't make people worse off by pushing them in certain directions. For example, you might say, why not set the default to 5% or 10% for retirement savings? But then you worry about people potentially over-saving in their retirement accounts while holding lots of credit card debt, where the interest rate on the credit card debt might be really large, and people get in trouble. Or people then withdraw prematurely, and so on, which is [INAUDIBLE] potentially worse for them. We talked very briefly about the SMarT (Save More Tomorrow) plan, which, again, I'm going to get back to in lecture 23 as well.
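To make the active-choice-versus-default trade-off above concrete, here is a minimal simulation sketch. Every number in it is a made-up illustration, not something from the lecture: workers have heterogeneous ideal savings rates, active choosers pick their rate with some error, and under a default regime most people stick with a single rate.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Hypothetical ideal savings rates; the spread controls how heterogeneous people are.
ideal = rng.normal(loc=0.08, scale=0.04, size=n).clip(0.0, 0.20)

def avg_loss(chosen, ideal):
    """Average quadratic welfare loss from saving at `chosen` instead of one's ideal rate."""
    return ((chosen - ideal) ** 2).mean()

# Active choice: everyone picks a rate, but with error (people don't fully know what's best).
active = (ideal + rng.normal(0.0, 0.03, size=n)).clip(0.0, 0.20)

# Default regime: a single default rate that most people stick with;
# the rest opt out and choose (noisily) on their own.
default_rate = 0.06
sticks = rng.random(n) < 0.85
under_default = np.where(sticks, default_rate, active)

print("avg loss, active choice:", avg_loss(active, ideal))
print("avg loss, default:      ", avg_loss(under_default, ideal))
```

In this sketch, widening the spread of ideal rates tends to favor active choice, while larger choice errors, or a default close to most people's ideal rate, tend to favor the default, which is exactly the trade-off described above.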
Now, I also talked briefly about how default effects are quite powerful in other settings. For example, organ donation. In some places, organ donation works by explicit consent: you have to opt in, and otherwise you will not be an organ donor. In other countries, and this is from a while ago, things are different: there is what's called presumed consent, where you have to opt out, and if you do not opt out, you're presumed to consent to organ [INAUDIBLE]. We see huge differences across otherwise quite similar countries, which shows you that the default here can have really large effects. There are other examples of default effects, in particular when it comes to voter registration or green energy choices. In some cases, in particular voter registration, it's actually quite hard to come up with arguments against setting the default a certain way. Getting voters registered when they get, say, a driver's license, or when they turn 18, seems like a very straightforward thing to do. There, defaults can really remove barriers and inefficiencies, in a domain where we should agree that everybody should be able to vote if they would like to, with some exceptions perhaps. So if you can default people into getting registered to vote, that seems like an unambiguously good thing. Default effects, then, can be really, really powerful.

Now, let's talk a little bit about nudges more generally. We're going to talk about this again in lecture 23 when it comes to policy, but let me define the issue and be clear about it. You might have come across the very famous book called Nudge, by Richard Thaler, who recently won the Nobel Prize in economics, and Cass Sunstein.

I'll give you the definition by Cass Sunstein, who says a nudge is a feature of the social environment that affects people's choices without imposing coercion or any kind of material incentive. A default is a clear case of that, where essentially you just say, OK, if you don't do anything, I'm going to pick a choice for you, but you can choose in whichever way you want, and there is essentially no material incentive. I'm just changing your choice environment, or what people sometimes call the choice architecture. And there are other versions of that: simplification, information disclosure. Again, these are all things that are usually available, and that people could figure out on their own, but we make it easier for them. One of the big mantras [INAUDIBLE] about what's the main thing we should do to improve people's behavior: Dick Thaler would often say, make it easy for people. People are often, and he said that about himself, people are lazy, and they make simple choices. If you make it easy for them to make the right choice, they will be more likely to make choices that are good for them.

There are some other tools, like warnings and reminders. Notice that for a reminder, you already have the information; somebody gave it to you previously. So really it's only useful if you have memory issues. There's no material incentive or [INAUDIBLE]. Then there's the use of social norms. That would be things like: I tell you all of your friends did x; now, would you like to do x or y? Or I can send you letters saying your neighbors used so much energy, and you use more energy than your neighbor, and then I put a sad face next to it, or something. You might feel bad about it because of social norms and reduce your energy usage, which people have in fact done. You can also increase ease and convenience.
Increasing ease and convenience is exactly what [INAUDIBLE], exactly what Dick Thaler would say. Those are things like: I can put Snickers bars next to the counter at the cashier, or I can put apples next to it. People are more likely to buy apples when it's easy or simple, or when they're reminded of eating apples, and they're more likely to eat Snickers bars if [INAUDIBLE] to them. To the extent that you want people to make what one might think of as good choices, making it easy for them to make the good choice, without restricting anybody's choice set or coercing them in any way, is considered a nudge. And then there are also things about the framing of choices, as in gains versus losses, in terms of their contracts. Any questions on this?

OK. Then let me briefly give you, and again we're going to get back to all of this in lecture 23, a couple of examples of nudges in cases where it's pretty clear that the nudge is a good idea. One thing that we have talked about quite a bit already, when it comes to procrastination, self-control problems, present bias, or [? losing focus, ?] is the health domain, where individuals and society often have aligned goals. Individuals often want behavior change: they want to improve their diets, increase physical activity, stop smoking, get vaccinated, use less energy, and so on. And often there are societal costs of these behaviors, or of the lack of these behaviors. Sometimes there are externalities from smoking, or from getting or not getting vaccinated. Sometimes there are [INAUDIBLE] health care costs and the like from obesity and so on, where it's bad for everybody if the population is sick. So these goals are aligned: the social planner, or the government, might want people to improve their diet, or might want to reduce smoking, and so on.

But individuals, as we saw previously, often fail to follow through. Education and information interventions are often ineffective. Even price interventions, in the sense of increasing prices or incentivizing people, are often ineffective; they don't always work, plus they're quite expensive. So one natural question you might ask is: can we use nudges to align intentions and actions? The reason this is really popular, in particular among governments, is that it's really cheap to do. Sending reminders, or providing information in one way or another, is essentially free. If governments don't have a lot of money, that's an easy thing to do, as opposed to paying people, which is quite expensive in many cases.

So here's an example of an essentially free intervention, about flu shot communications. This is a study by Katy Milkman and co-authors from 2011. The goal of the intervention is to get people to sign up for, and actually get, flu shots. The control group got an informational mailing; I'll show you this in a second. One treatment group got the same, plus an encouragement to make a date plan. And a second treatment group got the same, plus an encouragement to make a date plan and a time plan. Let me show you what this looks like. Here's the control condition: this is just an informational mailing. It says that the company, this is a company trying to do this, is holding a free flu shot clinic. Companies very much want their workers to get flu shots, because that reduces sick days. If everybody gets their flu shot, not only are they healthier, which is probably good for them, they're less at risk of dying and of infecting each other, or just being sick, but it's also good for the company, because the company gets workers to actually show up more and have fewer sick days. So that's the control condition.
It looks like a perfectly reasonable letter, which says, here are the days on which you can sign up, and you are informed about the dates and times of the workplace flu clinic. Now, the first treatment condition is the date plan condition, which invites people to choose a concrete date for getting a flu vaccine. The rest of the informational [? mailer ?] is exactly the same. Essentially, that's just asking people: pick a month, a day, and a day of the week, and write it down. Presumably, that also encourages people to put it into their calendar or the like, and acts as some form of reminder. Notice, there's no incentive here. There are no financial incentives, no coercion, and so on. It's just saying: why don't you pick a day right now. Have a look, check your calendar, and figure it out. Make a plan for this. The second treatment condition is the date and time plan condition, which says: why don't you also pick a time, which is even more concrete. That would be Monday, October 26, at 9:00 AM, or whenever I'm going to go. Presumably, that also encourages people to put it into their calendar.

Now, what you get in this experiment, and it is a very simple experiment, is the following. When you just send the letter, 33% actually get the flu shot. When you also send a date plan, 34%. And when you send a date plan and a time plan, it's 37%. Now, these effects are a little bit underwhelming in the sense that they're kind of small: 1.6 percentage points and 4.2 percentage points. In relative terms that's not a huge effect. But notice what's beautiful here: the costs are essentially zero.
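As a side note, here is the back-of-the-envelope arithmetic behind those statements in a short sketch; the figures are just the rounded numbers quoted in the lecture, not the exact estimates from the paper.

```python
# Rounded figures as quoted in the lecture, not exact estimates from Milkman et al. (2011).
control_rate = 0.33            # flu shot take-up with the plain informational letter
effects_pp = {
    "date plan": 0.016,        # treatment effects in percentage points
    "date + time plan": 0.042,
}

for name, effect in effects_pp.items():
    treated_rate = control_rate + effect
    relative_increase = effect / control_rate
    print(f"{name}: {treated_rate:.1%} take-up "
          f"(+{effect * 100:.1f} pp, about {relative_increase:.0%} relative to control)")

# Since the extra text on the mailing costs essentially nothing, even a modest
# relative increase translates into a very low cost per additional flu shot.
```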
So in some sense, if you value additional flu shot adherence, if you value more people getting flu shots, then a 4.2 percentage point increase, which is roughly a 12% to 13% increase relative to the control group, is quite a bit of an increase for something that's essentially free. The cost-effectiveness of this is, of course, very, very high, because it's essentially free; maybe it cost somebody a few minutes to write this down or design the sheet. But it costs exactly the same to send the sheet here on the left versus the sheet here on the right. So that's an example of a win-win situation, where we make things more effective, it's not really hurting anybody, and it's cheap, essentially free, to do. Everybody is happy, and it's uncontroversial, in the sense that nobody would argue this is a bad thing. Of course, some people might have concerns about the flu shots themselves, though as far as I know there isn't any actual evidence for those concerns. But this is an example where we say, look, this is very cheap to do, so presumably you should pick the right mailing, or the right design of the mailing, to encourage people to do desirable things, at least from the company's perspective, while also preserving people's freedom, not spending any money, and not coercing anybody. Any questions on this?

OK. Here's another example of a similar intervention, which is signing up for FAFSA. This is a paper by Bettinger et al. from 2009, where people were provided free additional assistance in completing and filing applications for college financial aid, the FAFSA. And it really increased not only FAFSA completion, but also college enrollment. There are different conditions: a control group, versus just providing people information only, versus [INAUDIBLE] information that you could get some FAFSA help and so on. But it seems that what you really need is the additional assistance in completing and filing this application. Just having somebody walk you through the filing has a huge impact. There's a bit of a question whether that's really just a nudge or a more powerful intervention, but it's a really minor intervention: some people just need some help filling out this form. And it has pretty large effects on FAFSA completion, and a reasonably large effect on college enrollment. You might think this is a hugely important choice, and if you can get people to make that really big change by doing something very simple, helping them complete and file a simple form, something they could have done anyway, that's pretty remarkable.

And here again, this is a relatively cheap thing to do, and it's essentially equivalent to the impact of several thousand dollars of education subsidy. If you wanted to pay people to go to college, to subsidize it, which you might want to do for other reasons anyway, one key issue is that not only is it perhaps not particularly effective, but you're also going to pay a lot for inframarginal people: you're going to pay the subsidy to a lot of people who would have gone to college anyway. If you have a low budget, that's really tough to do, and the cost-effectiveness tends to be very low, because you have a low impact but really high costs. You can read more about this in the link that I had. But the bottom line here is, again, you can make very small changes that have pretty large effects.
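To see why the subsidy's cost-effectiveness can be so low compared to cheap assistance, here is a minimal back-of-the-envelope sketch. Every number in it is hypothetical, chosen only to illustrate the logic of paying inframarginal enrollees; none of it comes from Bettinger et al. (2009).

```python
# Hypothetical cost-effectiveness comparison; all numbers are made up for illustration.
n = 1_000                  # families contacted
baseline = 0.30            # enrollment rate with no intervention

# Cheap filing assistance: small per-family cost, modest enrollment effect.
assist_cost_per_family = 20
assist_effect_pp = 0.08
assist_cost_per_enrollee = (assist_cost_per_family * n) / (assist_effect_pp * n)

# A tuition subsidy: paid to every enrollee, including the inframarginal students
# who would have enrolled anyway, with a smaller marginal effect on enrollment.
subsidy_per_student = 3_000
subsidy_effect_pp = 0.03
subsidy_total_cost = subsidy_per_student * (baseline + subsidy_effect_pp) * n
subsidy_cost_per_enrollee = subsidy_total_cost / (subsidy_effect_pp * n)

print(f"assistance: ${assist_cost_per_enrollee:,.0f} per additional enrollee")
print(f"subsidy:    ${subsidy_cost_per_enrollee:,.0f} per additional enrollee")
```

The point is just that the subsidy is paid to everyone who enrolls, so the cost per additional enrollee blows up, while the assistance is cheap per family and only needs a modest effect to look very cost effective.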
There are other types of examples that governments use. Another example would be sending out letters to get people to pay their taxes. These are essentially reminders, often appealing to people's prosociality in various ways. The government sends you letters, and there are different ways in which you can write these letters. You could say: you should pay your taxes; or, everybody is paying their taxes; or, good citizens pay their taxes; and so on. There's a whole industry of nudge units where people try to figure out the best way of doing that, and writing the letters in different ways can have a pretty large effect on people's behavior. Governments love this, because it's a very cheap thing to do, and it can have a large impact and change behavior quite a bit. In this case, the tax case, it can increase revenue by quite a bit without costing the government very much, because often they [INAUDIBLE] send some letters anyway, just in more or less effective ways.

When you think about the intervention that I have here on the slide, the FAFSA one, the effects almost surely will be persistent, in the sense that this is a one-shot choice, whether you send your kid to college or not. Once you increase college enrollment, which might have some other issues, there are going to be persistent effects, because once kids go to college, that has effects on their behavior and so on. Now, for some other decisions, like paying taxes, where every year you get this letter from the government telling you that you should pay your taxes, maybe the first time it works, maybe the second time as well, but at some point you're like, whatever, I don't care. That's a great question, and I don't know whether we have lots of evidence on it or not.

I think there's some research on this that I'm just not sure I'm aware of. I think governments, or people, would say, well, in some sense we don't care that much about it. Of course we care, but it doesn't change the fact that in the first year, we know it works pretty well, and some things work better than others. Many governments send some letters or some forms of mailers anyway, so what you want to do is make sure you optimize at least the first or second time you contact people. And that might have some persistent effects by itself, even if you don't re-contact them, because once you start paying your taxes, maybe it becomes a habit, and people keep doing it anyway. But even if it's not persistent, governments would say, well, if this gets people to pay taxes at least once or twice more, that's worth the money spent. [INAUDIBLE] somebody essentially needs to design these letters; there are templates you can use. So it's very cheap to do and has pretty large effects, at least in the short run.

I think [INAUDIBLE] is correct, which is that some effects are just about information, versus some things that are about making you feel bad, or social norms. For the information things, if you think people just forget stuff, for example if people forget to get their flu shot and so on, you should probably send reminders and get them to do that. If it's about social norms, or something that makes you feel bad in some way, maybe those effects are more likely to go away once you do it 17 times, because at some point people are like, yeah, whatever, it's the same letter I always get, and I didn't pay my taxes last time, so I'm not going to pay them now either.

It's additional assistance in completing and filing applications. So essentially, the financial aid is always available.
It's just that who takes advantage of financial aid depends on how easy you make it for people, and whether you support them, provide some support, in completing and filing the applications. There's also [INAUDIBLE] with that, where essentially, once you do your taxes, it already prefills the forms for you, which is even easier. But essentially, this falls under the category of making things easy for people, and that can change behavior quite a bit. The key part of nudges, though, and let me get back to the definition again, is that there are no financial incentives. This part here, "without imposing coercion," means without paying any kind of material incentive, or at least only very, very small incentives; there's a bit of a question of how to define that. But once I pay you $1,000 to go to college, or subsidize you, that's not a nudge anymore. Or once I take certain options away from you, [INAUDIBLE] eliminating choices from your choice set, again, that's not a nudge; that's something else, that's changing your choice set. Nudges are explicitly about keeping people's choices fully free and available, but making it easier for them to make a certain choice, presumably a choice that's desired in some way by the social planner, or in line with people's own plans for their future.

So anyway, there's a large set of examples of different nudges, and they can be fairly effective. Of course, not for everybody, and the fraction of people who are swayed by these nudges is not huge, in part because, if you think about it, people need to be marginal in some way. If people really, really don't want to go to college, because it's too expensive for them or for whatever other reasons, or if people really want to go to college anyway, the nudge will not change that behavior; whether somebody fills out the form or not doesn't really matter. But if somebody is on the fence, maybe I should send the kid to college or not, but now [INAUDIBLE] the form and then they forget about it, and so on and so forth, for those kinds of people there will potentially be large effects. But then, by definition, in some ways, the fraction of people who are actually marginal will not be huge. Still, since the costs tend to be extremely low, it tends to be very cost-effective to do this.

Now, again, previewing a little bit of what comes in the policy lecture: minor interventions can have large effects, and in some cases nudges can achieve unambiguous improvements. If you send people reminders when they forget to get the flu shot, and now they get the flu shot, or if people don't take their medication and you remind them, or provide information, or some other form of [? telling ?] people to take their medication, it's pretty clear that that's making people better off. But there are also a bunch of challenges in other situations, where it's not at all obvious which nudge to choose. Are we making everybody better off? Are some people made worse off? For example, should everybody save for retirement? Are we pushing people to save too much? Should everybody go to college? Maybe going to college is more suitable for some people than for others. Often it's a huge financial expense for the family, and if the job prospects are not necessarily better compared to not going to college, then it's not clear that one should do that. There's also some evidence, or some concern, about nudges making people feel bad. If you get these letters that say you're destroying the environment, with [INAUDIBLE] all these sad faces, and you're worse than your neighbors, people might just feel bad about it, and we should put some weight on that.

If you send these letters to 1,000 households, and 50 households or 20 households change their behavior in certain ways, but 300 households feel bad about it because they feel uncomfortable getting these nudges, or you get all these phone calls and spam about the environment from sending too many letters, we should put some weight on that. Then there are some trade-offs; it's not entirely free, and there are costs and benefits. There are also some questions about which self we should respect. We talked about this a lot before: if one self wants to be very virtuous and exercise a lot and so on, and the other one wants to sit on the couch and watch TV, or if one self wants to smoke and the other one does not, it's not obvious that we should respect the long-run self over the short-run self that just wants to enjoy itself. So there are tricky issues [INAUDIBLE] whether we are actually making people better off or not. We'll get back to these issues in the last lecture, which is about policy. Any questions on this lecture?

OK, this is now lecture 20, about malleability and inaccessibility of preferences. This is, in a way, a more radical deviation from neoclassical economics. So far, we have always said people know their preferences. People have certain beliefs, and their beliefs might be wrong, and so on, but crucially, people knew what they were doing, and they were deliberately making certain choices. Their preferences perhaps included present bias, or social preferences, or reference dependence, or the like, but these preferences were given and fixed, and people knew what those preferences were. Now we're going to deviate from that and look at [INAUDIBLE] evidence that, A, these preferences might be malleable, and B, people might not even understand why they want what they want.
The preferences are quite mysterious to people, and it's quite easy, in some ways, to manipulate people without them even understanding it or knowing about it. So I'll tell you about some very fascinating psychology work. The paper is called "Telling More Than We Can Know," by Nisbett and Wilson, which is a classic paper in social psychology. Then we're going to talk a little bit about willingness to pay, and then about the paper by Ariely et al. on coherent arbitrariness, which you were supposed to read for today.

OK. So let's step back a little bit and talk about the Nisbett and Wilson paper. There are many questions you might have about the cognitive processes underlying our choices, evaluations, judgments, and behaviors. You might wonder, why do you like a certain person versus not? What do you like about them? Why do you like this person and not another person? Why are you friends with this person versus somebody else? Why do you like math versus English, and so on and so forth? How did you solve a certain problem when you were asked to solve some problems? How did you come up with a solution? Why did you take this job, or why did you take this class versus another class? You might say, well, I like this one and not that one; I really like computer science versus math versus economics. But when you then ask [INAUDIBLE] yourself or others why exactly you like that one versus something else, you'll realize quickly that people often don't have a good answer to those questions. And Nisbett and Wilson's fairly provocative paper says, essentially, we have no idea where these preferences are coming from, and we're just making stuff up. So let me give you some examples of that. The first example is what's called Maier's two-string problem, from 1931.

This is [INAUDIBLE] one of these classic psychology experiments. The experiment works as follows. Two cords are hung from the ceiling of a lab that contains many objects, such as poles, ring stands, clamps, pliers, and extension cords. Subjects are told that their task is to tie the two ends of the cords together. So there are two cords hanging, one on each side of the room, and subjects are supposed to tie them together. The problem is that the cords are placed so far apart that subjects can't reach one cord while holding onto the other. You can't just take one cord and try to reach the other, because it's too far away. There are all these different objects in the room, and subjects try to figure out how to do this. They usually come up with one or two attempted solutions that are not really solutions, because they don't work: they get the extension cord, they use the pliers, the clamp, some ring or whatever, and it just doesn't work. They keep trying, but it doesn't work out, and then they're told to do it another way. Then Maier, the experimenter, walks into the room at some point, or walks around, and accidentally, this is of course on purpose, puts the cords in motion. Subjects then very quickly, within the next 45 seconds, figure out the solution. Of course, the reason they figure it out is that the cords have now been put in motion, and that is the solution: put the cords in motion, and once they're swinging, you can reach them both and tie them together. I have a video here. I'm a little traumatized from last time, when it didn't work out, so I'm not sure [INAUDIBLE].

Now I don't even see it, so maybe let's just skip it. You can watch the video; it's the first two and a half minutes of the video. [INAUDIBLE] see anything. But there's a video that demonstrates this task, so we can just skip it here. Afterwards, people are asked, well, how did you come up with the idea of using a pendulum? And people then have all these explanations: it just dawned on me; it was the only thing left; I thought about all these different things, and in the end I came up with this other solution, which is using the strings as a pendulum; or, I just realized the cord would swing if I [INAUDIBLE] the weight on it. The solution, in the end, is that you put the pliers on one of the cords as a weight and swing it, then you grab the other cord, and then you can tie them together. One Harvard psychology faculty subject, in fact, had the following explanation: "Having exhausted everything else, the next thing was to swing it. I thought of the situation of swinging across a river. I had an imagery of monkeys swinging from trees. This imagery appeared simultaneously with the solution. The idea appeared complete." Now of course, the problem here is that that's all bullshit. The reason people come up with the solution is, as I told you, that Maier was walking in and accidentally putting the cords in motion. There are differences in timing: sometimes he would come in earlier, sometimes later, and every time the experimenter walked in, people would figure it out shortly afterwards, and then come up with these elaborate explanations. And the reason we know that it's the experimenter coming in is that if the experimenter does not accidentally come in and put the cords in motion, then people essentially do not figure it out.
770 00:33:04,580 --> 00:33:07,290 So we know the true causal effect here 771 00:33:07,290 --> 00:33:09,840 is the experimenter coming in and suggesting 772 00:33:09,840 --> 00:33:11,340 a solution to people. 773 00:33:11,340 --> 00:33:14,940 But people then make up all these stories, 774 00:33:14,940 --> 00:33:16,410 including like the monkeys swinging 775 00:33:16,410 --> 00:33:18,990 from trees, how they came up with a solution, which 776 00:33:18,990 --> 00:33:21,340 is clearly not how they came up with the solution. 777 00:33:21,340 --> 00:33:23,215 The reason why they came up with the solution 778 00:33:23,215 --> 00:33:26,980 is because Maier touched the strings. 779 00:33:26,980 --> 00:33:27,790 OK. 780 00:33:27,790 --> 00:33:30,010 Now there's a second example here, 781 00:33:30,010 --> 00:33:31,760 or three examples in total. 782 00:33:31,760 --> 00:33:33,490 So let me tell you about example number-- 783 00:33:33,490 --> 00:33:38,910 or let me ask first, are there any questions about this study? 784 00:33:38,910 --> 00:33:41,990 By the way, you should lower your hands, 785 00:33:41,990 --> 00:33:43,670 if you asked [INAUDIBLE] previously, 786 00:33:43,670 --> 00:33:46,770 like [? Brian ?] and [INAUDIBLE].. 787 00:33:46,770 --> 00:33:48,610 Unless you have additional questions. 788 00:33:53,510 --> 00:33:57,380 Let me tell you the second study now. 789 00:33:57,380 --> 00:34:01,460 This is a study by Latane and Darley, 790 00:34:01,460 --> 00:34:04,400 which is, I told you a little bit 791 00:34:04,400 --> 00:34:08,150 about this [INAUDIBLE] these are the impacts of bystanders 792 00:34:08,150 --> 00:34:10,460 and witnesses on helping behaviors. 793 00:34:10,460 --> 00:34:12,989 I told you about the story of the Good Samaritan before. 794 00:34:12,989 --> 00:34:15,670 This is sort of a similar study. 795 00:34:15,670 --> 00:34:17,980 And this is essentially a situation 796 00:34:17,980 --> 00:34:20,260 that's, again, created by social psychologists. 797 00:34:20,260 --> 00:34:22,239 It's not a real situation, but people are 798 00:34:22,239 --> 00:34:24,409 meant to think that it's real. 799 00:34:24,409 --> 00:34:27,550 And so here, the more people overhearing someone 800 00:34:27,550 --> 00:34:29,139 in another room having what sounds 801 00:34:29,139 --> 00:34:32,440 like an epileptic seizure, the lower the probability 802 00:34:32,440 --> 00:34:35,280 that any given individual will rush to help. 803 00:34:35,280 --> 00:34:38,699 And you get similar results for individuals' reaction 804 00:34:38,699 --> 00:34:40,358 to dangerous-looking smoke coming out 805 00:34:40,358 --> 00:34:41,400 of the ceiling of a room. 806 00:34:41,400 --> 00:34:43,108 So like there's people in the experiment. 807 00:34:43,108 --> 00:34:45,630 And they have smoke coming out in a room. 808 00:34:45,630 --> 00:34:48,060 And the more people that are there, the less likely 809 00:34:48,060 --> 00:34:50,520 any given person is, perhaps because they're free-riding, 810 00:34:50,520 --> 00:34:53,620 to actually do something about it. 811 00:34:53,620 --> 00:34:55,425 And this looks kind of very dangerous. 812 00:34:55,425 --> 00:34:57,300 But nobody does anything, because essentially 813 00:34:57,300 --> 00:34:59,550 other people are also not doing anything. 814 00:34:59,550 --> 00:35:01,470 Now when you then ask people afterwards, 815 00:35:01,470 --> 00:35:05,430 you know, what influenced your choices? 
816 00:35:05,430 --> 00:35:10,290 And essentially, when you ask them directly and tactfully 817 00:35:10,290 --> 00:35:13,540 and bluntly, you always get the same answer, that essentially, 818 00:35:13,540 --> 00:35:15,900 they think their behavior had not been influenced 819 00:35:15,900 --> 00:35:17,355 by the other people present. 820 00:35:17,355 --> 00:35:19,230 They think you're like-- for whatever reason, 821 00:35:19,230 --> 00:35:21,157 they didn't do anything, or maybe they 822 00:35:21,157 --> 00:35:23,490 thought it wasn't so bad, or there wasn't really danger, 823 00:35:23,490 --> 00:35:26,050 or whatever. 824 00:35:26,050 --> 00:35:28,280 But of course, these are randomized experiments. 825 00:35:28,280 --> 00:35:30,030 We know that some subjects were influenced 826 00:35:30,030 --> 00:35:32,155 by the presence of other people, because precisely, 827 00:35:32,155 --> 00:35:33,960 the experiment was creating that variation, 828 00:35:33,960 --> 00:35:36,090 and it affected people's behavior. 829 00:35:36,090 --> 00:35:39,060 [INAUDIBLE] people essentially just don't understand that, 830 00:35:39,060 --> 00:35:41,610 and sort of then come up with other explanations 831 00:35:41,610 --> 00:35:46,990 of why they do or did what they, in fact, did. 832 00:35:46,990 --> 00:35:49,690 And number three is sort of very similar. 833 00:35:49,690 --> 00:35:54,890 These are called erroneous reports about position effects. 834 00:35:54,890 --> 00:35:58,840 These are studies where certain items 835 00:35:58,840 --> 00:36:04,330 are positioned in different ways, to get people to-- 836 00:36:04,330 --> 00:36:06,680 when people were evaluating them. 837 00:36:06,680 --> 00:36:12,310 So passerbys were asked to evaluate some clothing. 838 00:36:12,310 --> 00:36:16,630 And they were asked about the quality and the preferences-- 839 00:36:16,630 --> 00:36:18,460 the quality of certain goods and people's 840 00:36:18,460 --> 00:36:20,760 preferences between different goods. 841 00:36:20,760 --> 00:36:23,042 And the way this experiment was set up-- 842 00:36:23,042 --> 00:36:25,500 and this is sort of essentially like a marketing experiment 843 00:36:25,500 --> 00:36:26,280 if you want-- 844 00:36:26,280 --> 00:36:29,910 essentially, there's a pronounced left-to-right 845 00:36:29,910 --> 00:36:30,990 position effect. 846 00:36:30,990 --> 00:36:33,510 The rightmost object was heavily over-chosen 847 00:36:33,510 --> 00:36:34,600 in these experiments. 848 00:36:34,600 --> 00:36:36,990 And that's in some sense irrelevant why that's the case. 849 00:36:36,990 --> 00:36:39,540 But essentially, from previous experiments, 850 00:36:39,540 --> 00:36:41,790 we know that essentially the rightmost object 851 00:36:41,790 --> 00:36:45,300 is most likely to be chosen, perhaps because people 852 00:36:45,300 --> 00:36:46,320 look at it first. 853 00:36:46,320 --> 00:36:49,305 [INAUDIBLE] look at it last and so on, it doesn't matter. 854 00:36:49,305 --> 00:36:50,680 What's important for our purposes 855 00:36:50,680 --> 00:36:53,700 is that the rightmost object is most heavily chosen. 856 00:36:53,700 --> 00:36:56,880 People like that object the best in these experiments. 857 00:36:56,880 --> 00:36:59,190 And then there's going to be randomization 858 00:36:59,190 --> 00:37:02,070 in the positioning of experiments. 859 00:37:02,070 --> 00:37:03,630 And people are then asked, well, why 860 00:37:03,630 --> 00:37:06,480 did you choose this sweater versus another? 
861 00:37:06,480 --> 00:37:09,810 And no subject whatsoever ever mentions 862 00:37:09,810 --> 00:37:14,040 the position of the article in the room, 863 00:37:14,040 --> 00:37:15,720 and virtually all subjects denied 864 00:37:15,720 --> 00:37:18,270 it was, when asked directly about the possible effect 865 00:37:18,270 --> 00:37:19,802 of the position of the article. 866 00:37:19,802 --> 00:37:21,760 And people come up with all these explanations. 867 00:37:21,760 --> 00:37:24,030 It's like, green is really always my [INAUDIBLE] favorite 868 00:37:24,030 --> 00:37:24,540 color. 869 00:37:24,540 --> 00:37:26,250 And this sweater is really fluffy. 870 00:37:26,250 --> 00:37:27,000 And this and that. 871 00:37:27,000 --> 00:37:30,580 And I really like it for reason x and y. 872 00:37:30,580 --> 00:37:33,090 Of course that's true for some people, 873 00:37:33,090 --> 00:37:34,860 but we know that at least some people must 874 00:37:34,860 --> 00:37:36,390 be swayed by these position effects, 875 00:37:36,390 --> 00:37:39,302 because we set up an experiment to precisely do that. 876 00:37:39,302 --> 00:37:41,010 And then that essentially swayed behavior 877 00:37:41,010 --> 00:37:42,880 in a very predictable way. 878 00:37:42,880 --> 00:37:44,790 So essentially, there are these determinants 879 00:37:44,790 --> 00:37:46,800 on people's preferences of behavior, 880 00:37:46,800 --> 00:37:51,500 that people do not seem to understand. 881 00:37:51,500 --> 00:37:52,010 OK. 882 00:37:52,010 --> 00:37:54,527 So let me summarize what did we learn. 883 00:37:54,527 --> 00:37:57,110 So there are many instances in which subjects have no idea why 884 00:37:57,110 --> 00:37:59,130 they choose what they choose. 885 00:37:59,130 --> 00:38:00,455 And then people-- 886 00:38:00,455 --> 00:38:02,520 [INAUDIBLE] can read this more in this paper, 887 00:38:02,520 --> 00:38:05,010 which I think is a beautiful paper-- 888 00:38:05,010 --> 00:38:07,940 people appear to make up stories that 889 00:38:07,940 --> 00:38:11,450 are based on their a priori, implicit causal theories. 890 00:38:11,450 --> 00:38:13,160 What I mean by that, essentially, they 891 00:38:13,160 --> 00:38:16,770 have some theory of why they do certain things. 892 00:38:16,770 --> 00:38:18,860 And when you ask them, then, OK, well, why 893 00:38:18,860 --> 00:38:20,420 did you do what you just did? 894 00:38:20,420 --> 00:38:22,190 People will sort of come up with some ways 895 00:38:22,190 --> 00:38:24,980 of justifying [INAUDIBLE] in some ways 896 00:38:24,980 --> 00:38:27,020 their behavior, based on some theories 897 00:38:27,020 --> 00:38:28,850 that they had about themselves. 898 00:38:28,850 --> 00:38:29,735 And sort of-- 899 00:38:29,735 --> 00:38:31,820 I always like fluffy sweaters, and therefore you 900 00:38:31,820 --> 00:38:32,820 choose the sweater. 901 00:38:32,820 --> 00:38:34,820 Of course, it happens to be the only [INAUDIBLE] 902 00:38:34,820 --> 00:38:38,270 sweater if it's [INAUDIBLE] on the right versus on the left. 903 00:38:38,270 --> 00:38:40,790 But people then sort of have their ways in which they then 904 00:38:40,790 --> 00:38:44,510 explain what they do based on essentially things that they 905 00:38:44,510 --> 00:38:47,950 sort of essentially make up. 906 00:38:47,950 --> 00:38:50,831 Any questions on this before I move on? 907 00:38:57,900 --> 00:39:02,010 And I encourage you to read the paper, more for fun 908 00:39:02,010 --> 00:39:03,265 than for anything. 
909 00:39:03,265 --> 00:39:04,890 It makes you kind of wonder quite a lot 910 00:39:04,890 --> 00:39:07,680 in why you do what you do, and why you prefer 911 00:39:07,680 --> 00:39:11,540 certain things versus others. 912 00:39:11,540 --> 00:39:14,180 So let's briefly talk about two experimental design 913 00:39:14,180 --> 00:39:17,630 tools, which will be useful for the coherent arbitrariness 914 00:39:17,630 --> 00:39:18,642 paper. 915 00:39:18,642 --> 00:39:20,600 One is what's called the strategy method, which 916 00:39:20,600 --> 00:39:22,292 you already came across a little bit. 917 00:39:22,292 --> 00:39:23,750 And the second one is what's called 918 00:39:23,750 --> 00:39:26,750 the BDM, the Becker-DeGroot-Marschak 919 00:39:26,750 --> 00:39:28,850 procedure for eliciting valuations. 920 00:39:28,850 --> 00:39:31,010 I'll try to be quick. 921 00:39:31,010 --> 00:39:38,240 So we're often interested in behavior in rare contingencies. 922 00:39:38,240 --> 00:39:40,220 So often, we may ask the question, 923 00:39:40,220 --> 00:39:43,370 how would people behave in many different contingencies? 924 00:39:43,370 --> 00:39:44,840 Some of them are often quite rare, 925 00:39:44,840 --> 00:39:47,730 that don't happen very often. 926 00:39:47,730 --> 00:39:50,070 And why do we care about such contingencies? 927 00:39:50,070 --> 00:39:51,780 Well, sometimes we just care about them 928 00:39:51,780 --> 00:39:55,200 like [INAUDIBLE] it's inherently important 929 00:39:55,200 --> 00:39:59,640 how people behave or prevent certain contingencies, 930 00:39:59,640 --> 00:40:01,995 disasters, earthquakes, droughts, et cetera. 931 00:40:01,995 --> 00:40:03,370 We care a lot about these things, 932 00:40:03,370 --> 00:40:04,890 even if they're sort of rarely happening. 933 00:40:04,890 --> 00:40:06,432 So we kind of want to know how people 934 00:40:06,432 --> 00:40:09,510 behave in certain situations. 935 00:40:09,510 --> 00:40:10,980 But events and rare contingencies 936 00:40:10,980 --> 00:40:15,390 also can affect, on top of that, events in likely contingencies. 937 00:40:15,390 --> 00:40:16,660 Here's a simple example. 938 00:40:16,660 --> 00:40:19,173 If your roommates think you'll punch them in the face 939 00:40:19,173 --> 00:40:21,090 if they borrow your stuff without asking them, 940 00:40:21,090 --> 00:40:23,130 they will not do it. 941 00:40:23,130 --> 00:40:26,310 Of course, hopefully, you are not punching your roommates, 942 00:40:26,310 --> 00:40:30,010 and I very much do not want to encourage you to do so. 943 00:40:30,010 --> 00:40:33,270 But here, the key part here is that punching 944 00:40:33,270 --> 00:40:38,790 is rare but important, precisely because you might do so in case 945 00:40:38,790 --> 00:40:41,910 they misbehave, they will not borrow your stuff 946 00:40:41,910 --> 00:40:44,320 or steal your stuff in the first place. 947 00:40:44,320 --> 00:40:46,410 So essentially, sort of these [? equilibrium ?] 948 00:40:46,410 --> 00:40:49,512 or rare contingencies might be quite important, 949 00:40:49,512 --> 00:40:51,720 not because we think they actually happen very often, 950 00:40:51,720 --> 00:40:56,540 but they sort of discipline behavior in other cases. 951 00:40:56,540 --> 00:40:58,970 Now, what's [? known ?] as strategy method? 
952 00:40:58,970 --> 00:41:02,480 Well, the strategy method helps you 953 00:41:02,480 --> 00:41:06,890 to elicit behavior in many potentially rare circumstances 954 00:41:06,890 --> 00:41:08,780 by asking subjects what they would 955 00:41:08,780 --> 00:41:13,290 do with a choice implemented if the circumstances arises. 956 00:41:13,290 --> 00:41:16,160 That is to say, I'm asking you-- for many different cases, 957 00:41:16,160 --> 00:41:18,990 I'm going to implement one of those cases. 958 00:41:18,990 --> 00:41:24,320 And that's going to allow me to be [INAUDIBLE] [? compatible ?] 959 00:41:24,320 --> 00:41:25,860 in various ways. 960 00:41:25,860 --> 00:41:30,870 So since the decision does count if the contingency occurs, 961 00:41:30,870 --> 00:41:33,180 subjects have an incentive to choose correctly 962 00:41:33,180 --> 00:41:35,100 for each contingency. 963 00:41:35,100 --> 00:41:39,660 That is to say, I could say, suppose this is what we're 964 00:41:39,660 --> 00:41:41,580 doing in class in some sense. 965 00:41:41,580 --> 00:41:43,140 We talked about social preferences. 966 00:41:43,140 --> 00:41:46,080 I said, you know, I'm asking you to make a choice. 967 00:41:46,080 --> 00:41:50,140 I'm going to only pick one of you to implement that choice. 968 00:41:50,140 --> 00:41:57,145 And you have the incentives now to answer truthfully, 969 00:41:57,145 --> 00:41:59,520 because it could always be the case that your choice will 970 00:41:59,520 --> 00:42:01,420 be implemented. 971 00:42:01,420 --> 00:42:05,190 And so that allows then the experimenter, me in that case, 972 00:42:05,190 --> 00:42:09,030 to generate a lot of data from one simple experiment, where 973 00:42:09,030 --> 00:42:10,800 essentially, you can give a subject 974 00:42:10,800 --> 00:42:12,900 many different decisions, and then say, 975 00:42:12,900 --> 00:42:14,760 only one of your choices will actually 976 00:42:14,760 --> 00:42:16,020 be implemented for sure. 977 00:42:16,020 --> 00:42:18,400 Or you can ask many people the same question and say, 978 00:42:18,400 --> 00:42:21,630 [INAUDIBLE] only one of you, or one of your choices 979 00:42:21,630 --> 00:42:23,140 will be implemented. 980 00:42:23,140 --> 00:42:27,480 And so then that's essentially incentive-compatible, 981 00:42:27,480 --> 00:42:30,420 because it can always be that your choice is the one, or one 982 00:42:30,420 --> 00:42:32,850 of the choices is the one that counts. 983 00:42:32,850 --> 00:42:35,680 And when you look at sort of experimental evidence, 984 00:42:35,680 --> 00:42:38,370 it seems to suggest that the strategy methods are asking, 985 00:42:38,370 --> 00:42:41,372 essentially, if I can ask you-- so there's experiments where 986 00:42:41,372 --> 00:42:43,830 I can ask you 100 different questions, and only one of them 987 00:42:43,830 --> 00:42:46,650 will be implemented, versus a different group would 988 00:42:46,650 --> 00:42:49,110 be asked-- a randomized group would be asked only one 989 00:42:49,110 --> 00:42:49,630 question. 990 00:42:49,630 --> 00:42:52,350 It turns out that people actually answer in experiments 991 00:42:52,350 --> 00:42:57,250 these questions pretty similarly in both of these cases. 992 00:42:57,250 --> 00:42:59,160 So that's to say the strategy method seems 993 00:42:59,160 --> 00:43:04,424 to elicit individuals' true preferences pretty well. 994 00:43:04,424 --> 00:43:05,103 Sorry. 995 00:43:05,103 --> 00:43:06,020 [INAUDIBLE] messed up. 996 00:43:06,020 --> 00:43:07,252 But I think that's fine. 
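To make the strategy method concrete, here is a minimal sketch in Python (not from the lecture; the contingency labels, the choice labels, and the example decision rule are made up purely for illustration). The mechanics are the point: the subject states a choice for every contingency up front, and only afterwards is one contingency drawn at random and the stated choice for it carried out, so every answer is potentially payoff-relevant.

import random

# Hypothetical contingencies the subject must answer for in advance.
contingencies = ["partner_sent_0", "partner_sent_5", "partner_sent_10"]

def elicit_full_strategy(ask):
    # Ask the subject, before anything is realized, what they would do
    # in each and every contingency.
    return {c: ask(c) for c in contingencies}

def implement(strategy):
    # Draw one contingency at random and carry out only the stated choice
    # for that contingency; because any answer could be the one that counts,
    # the subject has an incentive to answer each question truthfully.
    realized = random.choice(contingencies)
    return realized, strategy[realized]

# Example run, with a made-up rule standing in for a real subject's answers:
stated = elicit_full_strategy(lambda c: "return_half" if c != "partner_sent_0" else "keep_all")
print(implement(stated))

The same logic covers the classroom version described above: you can ask one subject many questions, or many subjects the same question, and randomly implement only one choice while preserving the incentive to answer each question as if it were for real.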
997 00:43:07,252 --> 00:43:08,960 So what's now the Becker-DeGroot-Marschak 998 00:43:08,960 --> 00:43:09,620 procedure? 999 00:43:09,620 --> 00:43:11,910 It's essentially a version of that, 1000 00:43:11,910 --> 00:43:17,825 which looks something like this, which is, people are asked-- 1001 00:43:17,825 --> 00:43:19,200 [INAUDIBLE] so what the goal here 1002 00:43:19,200 --> 00:43:22,523 is to understand people's willingness to pay for a good. 1003 00:43:22,523 --> 00:43:24,440 So subjects are told that a price for the good 1004 00:43:24,440 --> 00:43:25,730 will be randomly selected. 1005 00:43:25,730 --> 00:43:29,000 And the price goes from like $0.50 to like, say, $10. 1006 00:43:29,000 --> 00:43:31,400 For each of these prices, the person is asked, 1007 00:43:31,400 --> 00:43:35,780 do you prefer buy versus not-buy in these options. 1008 00:43:35,780 --> 00:43:38,943 And so then, I ask you to fill out the entire form. 1009 00:43:38,943 --> 00:43:41,360 Tell me, for each of these prices, what do you want to do? 1010 00:43:41,360 --> 00:43:43,070 If the price is $3, what do you want? 1011 00:43:43,070 --> 00:43:44,690 If the price is $1, what do you want? 1012 00:43:44,690 --> 00:43:47,030 If the price is $10, what do you want? 1013 00:43:47,030 --> 00:43:50,870 And then I'm going to afterwards then pick a price and say, OK. 1014 00:43:50,870 --> 00:43:52,378 Now the price is $3.50. 1015 00:43:52,378 --> 00:43:54,920 And now I'm going to just look at like, what did you actually 1016 00:43:54,920 --> 00:43:56,870 say, and whatever your choice was then 1017 00:43:56,870 --> 00:44:03,440 for that specific price is going to be implemented. 1018 00:44:03,440 --> 00:44:06,180 Is that clear? 1019 00:44:06,180 --> 00:44:09,270 Now, just to be clear what the problem is that we're 1020 00:44:09,270 --> 00:44:12,930 trying to overcome here-- 1021 00:44:12,930 --> 00:44:17,210 suppose I'm trying to sell you a mug. 1022 00:44:17,210 --> 00:44:18,758 Or suppose I have this mug here. 1023 00:44:18,758 --> 00:44:20,550 I'm saying, would you like to buy this mug? 1024 00:44:20,550 --> 00:44:23,220 It's a very beautiful mug with lots of trees on them. 1025 00:44:23,220 --> 00:44:24,720 I'd like to sell it to you. 1026 00:44:24,720 --> 00:44:27,950 And I'm eliciting your willingness to pay. 1027 00:44:27,950 --> 00:44:30,470 What's the problem that arises from just 1028 00:44:30,470 --> 00:44:33,240 asking you this directly, without the BDM procedure? 1029 00:44:33,240 --> 00:44:37,548 What problem-- what is the BDM procedure sort of helping with? 1030 00:44:37,548 --> 00:44:39,090 So if you go to a marketplace, or try 1031 00:44:39,090 --> 00:44:41,820 to bargain with me on this mug, which again, 1032 00:44:41,820 --> 00:44:45,210 it's a beautiful mug-- 1033 00:44:45,210 --> 00:44:47,850 you might want to shade your valuation. 1034 00:44:47,850 --> 00:44:50,640 You might be willing to pay $5, but you're 1035 00:44:50,640 --> 00:44:54,030 going to say $2, because you are hoping to get a bargain 1036 00:44:54,030 --> 00:44:56,340 and get [INAUDIBLE] a cheap price from me. 1037 00:44:56,340 --> 00:44:57,090 Right? 1038 00:44:57,090 --> 00:44:59,430 And the key part here is that you are essentially 1039 00:44:59,430 --> 00:45:03,900 hoping that your willingness to pay whatever you are offering 1040 00:45:03,900 --> 00:45:07,380 me will essentially change whatever price I'm 1041 00:45:07,380 --> 00:45:09,210 offering the mug to you. 
1042 00:45:09,210 --> 00:45:12,240 Notice that that's not the case here in the BDM procedure. 1043 00:45:12,240 --> 00:45:15,600 In the BDM procedure, whatever you say 1044 00:45:15,600 --> 00:45:19,103 is independent of the actual price that's implemented. 1045 00:45:19,103 --> 00:45:20,520 I'm saying, essentially, I'm going 1046 00:45:20,520 --> 00:45:23,490 to randomly pick one of those 10 or 20 prices here. 1047 00:45:23,490 --> 00:45:26,700 And you're going to tell me, for each of those prices, what you 1048 00:45:26,700 --> 00:45:28,710 want, either buy or not buy. 1049 00:45:28,710 --> 00:45:31,538 And your choices are-- since I'm randomizing afterwards, 1050 00:45:31,538 --> 00:45:32,955 which price selection you're going 1051 00:45:32,955 --> 00:45:38,810 to pick, your choices are irrelevant for which 1052 00:45:38,810 --> 00:45:43,247 of the actual prices are selected. 1053 00:45:43,247 --> 00:45:45,080 And so now essentially get around this issue 1054 00:45:45,080 --> 00:45:47,480 about people shading or essentially underreporting 1055 00:45:47,480 --> 00:45:49,220 their willingness to pay. 1056 00:45:49,220 --> 00:45:52,130 Precisely that's what makes it incentive-compatible. 1057 00:45:52,130 --> 00:45:54,770 And that's essentially what's called the BDM mechanism. 1058 00:45:54,770 --> 00:45:59,155 That's what many economists use in many, many experiments. 1059 00:45:59,155 --> 00:46:00,530 There's another version of what's 1060 00:46:00,530 --> 00:46:07,102 called the BDM which is a more straightforward version of it. 1061 00:46:07,102 --> 00:46:08,810 Notice here you have to ask 20 questions. 1062 00:46:08,810 --> 00:46:10,977 Ask would you like to pay-- would you like to buy it 1063 00:46:10,977 --> 00:46:14,910 for $0.50, $1.00, $1.50, $2.00, and so on and so forth. 1064 00:46:14,910 --> 00:46:20,210 Another version of this is, I'm asking you straight up what's 1065 00:46:20,210 --> 00:46:21,770 your willingness to pay. 1066 00:46:21,770 --> 00:46:25,100 And the bid is then compared to a price determined 1067 00:46:25,100 --> 00:46:26,750 by a random number generator. 1068 00:46:26,750 --> 00:46:28,540 I'm just saying, like, tell me what your willingness to pay 1069 00:46:28,540 --> 00:46:29,210 is. 1070 00:46:29,210 --> 00:46:32,220 Then I'm going to do like a random number generator. 1071 00:46:32,220 --> 00:46:35,030 If the subject's bid is greater than the price that's 1072 00:46:35,030 --> 00:46:37,670 generated by the random number generator, 1073 00:46:37,670 --> 00:46:40,430 he or she pays the price, not the announced willingness 1074 00:46:40,430 --> 00:46:43,430 to pay, and receives the item being auctioned. 1075 00:46:43,430 --> 00:46:45,470 If the subject's bid is lower than the price, 1076 00:46:45,470 --> 00:46:48,740 he or she pays nothing and receives nothing. 1077 00:46:48,740 --> 00:46:53,210 And so now here again, the final price the person must pay 1078 00:46:53,210 --> 00:46:55,670 is independent of what the person indicated 1079 00:46:55,670 --> 00:46:57,380 as your willingness to pay, which 1080 00:46:57,380 --> 00:47:00,265 essentially solves the incentive compatibility issue. 1081 00:47:00,265 --> 00:47:01,640 Think of like this version that I 1082 00:47:01,640 --> 00:47:04,670 have on this line as just a more efficient way of eliciting 1083 00:47:04,670 --> 00:47:06,720 the question that's on the previous slide. 
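Here is a minimal Python sketch of the two BDM variants just described (illustrative only; the price grid and the example valuation of $5 are made-up numbers). In both variants the drawn price does not depend on anything the subject said, which is exactly why truthfully reporting whether, or how much, you value the mug is the best the subject can do.

import random

PRICES = [0.50 * k for k in range(1, 21)]  # hypothetical grid: $0.50, $1.00, ..., $10.00

def bdm_price_list(buy_at):
    # Variant 1: the subject states buy / not-buy at every price on the list;
    # one price is then drawn at random and the stated choice at that price
    # is implemented. The draw is independent of the answers.
    choices = {p: buy_at(p) for p in PRICES}
    p = random.choice(PRICES)
    return ("buys at", p) if choices[p] else ("does not buy at", p)

def bdm_bid(stated_wtp):
    # Variant 2: the subject states a willingness to pay once; a price is
    # drawn at random; the subject buys at the drawn price (not at the bid)
    # if and only if the bid is at least as high as the drawn price.
    p = random.choice(PRICES)
    return ("buys at", p) if stated_wtp >= p else ("does not buy at", p)

# A subject whose true value for the mug is $5 can do no better than truth-telling:
print(bdm_price_list(lambda p: p <= 5.00))
print(bdm_bid(5.00))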
1084 00:47:06,720 --> 00:47:08,960 This is essentially exactly the same thing, 1085 00:47:08,960 --> 00:47:11,370 except for that it's a more efficient thing to do. 1086 00:47:11,370 --> 00:47:13,700 The problem with this version, often, is comprehension. 1087 00:47:13,700 --> 00:47:14,533 People get confused. 1088 00:47:14,533 --> 00:47:17,120 Or people tend to still try to bargain and give 1089 00:47:17,120 --> 00:47:18,780 like lower willingness to pay. 1090 00:47:18,780 --> 00:47:21,308 So sometimes, this version works better 1091 00:47:21,308 --> 00:47:23,100 because it's much easier, much transparent, 1092 00:47:23,100 --> 00:47:25,350 when we say, here's a price, do want to buy it or not, 1093 00:47:25,350 --> 00:47:27,200 versus and so on. 1094 00:47:27,200 --> 00:47:29,822 While here it's people are sort of [INAUDIBLE] because they're 1095 00:47:29,822 --> 00:47:31,280 used to, essentially, bargain, they 1096 00:47:31,280 --> 00:47:35,350 tend to sort of under-report at least sometimes. 1097 00:47:35,350 --> 00:47:36,430 Any questions on this? 1098 00:47:45,920 --> 00:47:49,940 So now I'm also going to skip this video. 1099 00:47:49,940 --> 00:47:50,805 It's a fun video. 1100 00:47:50,805 --> 00:47:52,805 Who knows the story of Tom Sawyer and the fence? 1101 00:47:59,130 --> 00:48:02,250 I think it's in the readings, no? 1102 00:48:02,250 --> 00:48:04,180 Tom Sawyer was misbehaving. 1103 00:48:04,180 --> 00:48:06,880 He was essentially punished to paint the fence. 1104 00:48:06,880 --> 00:48:08,750 He really didn't want to do it. 1105 00:48:08,750 --> 00:48:10,980 But then his friend comes along, and essentially, 1106 00:48:10,980 --> 00:48:12,730 he sort of tricks his friend into thinking 1107 00:48:12,730 --> 00:48:15,115 this is really like such a fun task, 1108 00:48:15,115 --> 00:48:16,870 and it ends up being essentially, 1109 00:48:16,870 --> 00:48:21,190 the friend even paying him for painting the fence, 1110 00:48:21,190 --> 00:48:23,380 and he not having to do it. 1111 00:48:23,380 --> 00:48:25,040 And the point of the story, of course, 1112 00:48:25,040 --> 00:48:30,080 is that people's willingness to pay is malleable in the sense, 1113 00:48:30,080 --> 00:48:33,340 like, even to the extent that people don't even 1114 00:48:33,340 --> 00:48:35,500 know whether they're willing to pay for something, 1115 00:48:35,500 --> 00:48:39,140 or you have to pay them something to do it. 1116 00:48:39,140 --> 00:48:41,140 And so essentially, people's preferences 1117 00:48:41,140 --> 00:48:46,180 are inherently malleable by the way they're being marketed, 1118 00:48:46,180 --> 00:48:49,732 or the way they appear to them, or being sold to them. 1119 00:48:49,732 --> 00:48:53,520 You can watch the video in the slides if you download it. 1120 00:48:53,520 --> 00:48:56,340 For whatever reason this is not working on Zoom, 1121 00:48:56,340 --> 00:48:57,730 at least for now. 1122 00:48:57,730 --> 00:49:01,940 So stepping back a little bit-- so overall, 1123 00:49:01,940 --> 00:49:03,990 so far we talked about two key components 1124 00:49:03,990 --> 00:49:05,250 of individual decision making. 1125 00:49:05,250 --> 00:49:08,340 Utility functions-- what people want and what they care about, 1126 00:49:08,340 --> 00:49:10,350 and beliefs-- how people perceive themselves 1127 00:49:10,350 --> 00:49:12,250 and patterns in the world. 1128 00:49:12,250 --> 00:49:13,960 And so understanding these, of course, 1129 00:49:13,960 --> 00:49:17,580 is important because that determines people's choices. 
1130 00:49:17,580 --> 00:49:19,770 Now so far, we have always sort of pretended 1131 00:49:19,770 --> 00:49:22,740 and said people's preferences and beliefs are always 1132 00:49:22,740 --> 00:49:23,670 sharply-- 1133 00:49:23,670 --> 00:49:26,400 people are always sharply aware of what they want and believe 1134 00:49:26,400 --> 00:49:28,770 that costs [INAUDIBLE]. 1135 00:49:28,770 --> 00:49:30,790 For example, a homeowner might have 1136 00:49:30,790 --> 00:49:32,760 reference-dependent preferences, but they know 1137 00:49:32,760 --> 00:49:34,907 what their preferences are. 1138 00:49:34,907 --> 00:49:36,990 There might be some issues about how the reference 1139 00:49:36,990 --> 00:49:39,600 point is determined, but the preference is the function. 1140 00:49:39,600 --> 00:49:42,467 The functional form is always fixed, and it's known. 1141 00:49:42,467 --> 00:49:44,550 A person might have the wrong theory of the world, 1142 00:49:44,550 --> 00:49:46,860 but she always has some beliefs in mind, what 1143 00:49:46,860 --> 00:49:49,020 she uses to make those choices. 1144 00:49:49,020 --> 00:49:51,420 Or a smoker might act suboptimally, 1145 00:49:51,420 --> 00:49:53,730 but he always has a fully specific strategy 1146 00:49:53,730 --> 00:49:56,400 in mind for all his current and future decisions. 1147 00:49:56,400 --> 00:49:58,020 So people-- the smoker might sort of 1148 00:49:58,020 --> 00:49:59,940 have some issues with present bias, 1149 00:49:59,940 --> 00:50:02,400 or some wrong beliefs in some ways or the other. 1150 00:50:02,400 --> 00:50:04,440 But the utility function was always 1151 00:50:04,440 --> 00:50:08,160 assumed to be fixed and known. 1152 00:50:08,160 --> 00:50:10,510 Now we're going to deviate from that. 1153 00:50:10,510 --> 00:50:14,610 If you think about it like, what are your preferences, 1154 00:50:14,610 --> 00:50:16,980 and what do you like and what do you not like-- 1155 00:50:16,980 --> 00:50:20,610 in many cases, we actually don't know. 1156 00:50:20,610 --> 00:50:23,640 And in many cases, people sort of just 1157 00:50:23,640 --> 00:50:26,560 make things up as they go along. 1158 00:50:26,560 --> 00:50:29,090 So that's to say, like, people are essentially 1159 00:50:29,090 --> 00:50:30,630 making some choices. 1160 00:50:30,630 --> 00:50:32,795 And this one of the examples that I showed you 1161 00:50:32,795 --> 00:50:34,170 from earlier-- the three examples 1162 00:50:34,170 --> 00:50:36,780 that I showed you first, where people are just 1163 00:50:36,780 --> 00:50:39,127 induced to make certain choices one way or the other, 1164 00:50:39,127 --> 00:50:40,710 depending on their choice environment, 1165 00:50:40,710 --> 00:50:42,210 depending on their social influence. 1166 00:50:42,210 --> 00:50:45,480 In Tom Sawyer's case, what he tells his friend. 1167 00:50:45,480 --> 00:50:48,420 His friend is now willing to pay to paint a fence, which 1168 00:50:48,420 --> 00:50:49,840 actually is a punishment. 1169 00:50:49,840 --> 00:50:54,730 So people's preferences seem inherently malleable. 1170 00:50:54,730 --> 00:50:58,890 And so that's what this paper about coherent arbitrariness 1171 00:50:58,890 --> 00:51:01,180 is very much about. 
1172 00:51:01,180 --> 00:51:07,050 And so if you think about it, in some ways, almost all 1173 00:51:07,050 --> 00:51:11,730 of economics, and any class that you have taken, the professor-- 1174 00:51:11,730 --> 00:51:15,540 or in any book, people will write down the utility function 1175 00:51:15,540 --> 00:51:17,980 as if that's some truth that we know about the world. 1176 00:51:17,980 --> 00:51:20,610 And this is what the utility function looks like. 1177 00:51:20,610 --> 00:51:22,658 But if you think about it, in many cases, 1178 00:51:22,658 --> 00:51:24,450 actually, we don't know what the utility 1179 00:51:24,450 --> 00:51:24,930 function is [INAUDIBLE]. 1180 00:51:24,930 --> 00:51:26,597 If somebody asked [INAUDIBLE] particular 1181 00:51:26,597 --> 00:51:30,240 about new goods, who knows what our preferences actually are. 1182 00:51:30,240 --> 00:51:33,700 So for example, if you say, you're trying to buy a monitor, 1183 00:51:33,700 --> 00:51:36,120 and there's like a 30 inch monitor versus a 24 inch 1184 00:51:36,120 --> 00:51:40,140 monitor, who knows how much you're willing to pay for that. 1185 00:51:40,140 --> 00:51:43,770 Maybe $50, maybe $100, maybe $20, maybe nothing. 1186 00:51:43,770 --> 00:51:47,460 It's very hard to sort of figure this out. 1187 00:51:47,460 --> 00:51:50,190 And sort of when asked to make these kinds of decisions, 1188 00:51:50,190 --> 00:51:52,877 people tend to sort of construct their preferences 1189 00:51:52,877 --> 00:51:53,460 on the spot. 1190 00:51:53,460 --> 00:51:56,010 They essentially make stuff up. 1191 00:51:56,010 --> 00:51:59,167 And so because of that, because people 1192 00:51:59,167 --> 00:52:00,750 are sort of fundamentally in some ways 1193 00:52:00,750 --> 00:52:04,980 unsure about their preferences, this construction 1194 00:52:04,980 --> 00:52:08,070 of their valuation is very much easily manipulated 1195 00:52:08,070 --> 00:52:12,600 by, often, cues that should be really irrelevant. 1196 00:52:12,600 --> 00:52:16,470 And so one example that I showed you a little bit-- 1197 00:52:16,470 --> 00:52:19,560 I think in the second class, or from the survey, 1198 00:52:19,560 --> 00:52:20,910 is what's called anchoring. 1199 00:52:20,910 --> 00:52:23,795 This is also in the Ariely paper, which essentially 1200 00:52:23,795 --> 00:52:25,170 is, what they were doing is, they 1201 00:52:25,170 --> 00:52:30,420 were asking people about their willingness 1202 00:52:30,420 --> 00:52:32,460 to pay an amount given by the last two digits of their Social Security number. 1203 00:52:32,460 --> 00:52:34,043 And then afterwards, their willingness 1204 00:52:34,043 --> 00:52:38,310 to pay for different things, about like, wine, design books, 1205 00:52:38,310 --> 00:52:39,735 chocolates, and so on. 1206 00:52:39,735 --> 00:52:41,610 Notice that these are somewhat unusual things 1207 00:52:41,610 --> 00:52:44,070 that people wouldn't necessarily buy every day. 1208 00:52:44,070 --> 00:52:45,930 And essentially what you see is people 1209 00:52:45,930 --> 00:52:48,690 who are in the highest quintile of their Social Security 1210 00:52:48,690 --> 00:52:49,350 number-- 1211 00:52:49,350 --> 00:52:51,960 this is like quintile number five-- 1212 00:52:51,960 --> 00:52:53,760 have much higher willingness to pay. 1213 00:52:53,760 --> 00:52:55,590 And this is also all incentive-compatible 1214 00:52:55,590 --> 00:52:57,840 [INAUDIBLE] methods essentially of using 1215 00:52:57,840 --> 00:53:00,300 some form of BDM methods. 
1216 00:53:00,300 --> 00:53:03,750 People's willingness to pay is way higher when their Social 1217 00:53:03,750 --> 00:53:04,890 Security number-- 1218 00:53:04,890 --> 00:53:08,460 the last digits of their Social Security number are higher. 1219 00:53:08,460 --> 00:53:10,290 Now of course, that shouldn't be the case. 1220 00:53:10,290 --> 00:53:13,410 Your Social Security number has nothing to do with your tastes 1221 00:53:13,410 --> 00:53:14,700 for wine. 1222 00:53:14,700 --> 00:53:17,280 Because that's [INAUDIBLE] explicitly random. 1223 00:53:17,280 --> 00:53:20,070 Yet, people are easily manipulable. 1224 00:53:20,070 --> 00:53:23,610 And the difference in sort of valuations is like, [? huge. ?] 1225 00:53:23,610 --> 00:53:26,340 If you happen to have a low Social Security number, 1226 00:53:26,340 --> 00:53:33,050 or the last digit are low, then you are willing to pay $11.73. 1227 00:53:33,050 --> 00:53:35,900 If it's high, you're willing to pay like three times or even 1228 00:53:35,900 --> 00:53:37,250 more as much. 1229 00:53:37,250 --> 00:53:41,630 So the huge differences show that people are pretty easily 1230 00:53:41,630 --> 00:53:42,620 manipulable. 1231 00:53:42,620 --> 00:53:45,580 Now notice that these questions that I asked here 1232 00:53:45,580 --> 00:53:49,040 are sort of somewhat unusual sort of items. 1233 00:53:49,040 --> 00:53:50,540 This is not asking like how much are 1234 00:53:50,540 --> 00:53:52,490 you willing to pay for food at the food truck, 1235 00:53:52,490 --> 00:53:57,660 or whatever stuff that you do every day, or for a pizza. 1236 00:53:57,660 --> 00:53:59,780 Because there, people kind of know already 1237 00:53:59,780 --> 00:54:01,200 how much they're paying anyway. 1238 00:54:01,200 --> 00:54:03,090 And if I just ask you something else that's 1239 00:54:03,090 --> 00:54:05,090 different from the market price, people probably 1240 00:54:05,090 --> 00:54:09,650 would not want to be paying like $30 for it, 1241 00:54:09,650 --> 00:54:14,010 even if their Social Security number tends to be quite high. 1242 00:54:14,010 --> 00:54:17,120 Now, Ariely et al go further with this. 1243 00:54:17,120 --> 00:54:21,413 In some sense they have sort of more radical deviations. 1244 00:54:21,413 --> 00:54:23,330 So they elicit people's willingness to accept. 1245 00:54:23,330 --> 00:54:26,450 And this is how much they have to be 1246 00:54:26,450 --> 00:54:29,540 paid to endure an unpleasant sound 1247 00:54:29,540 --> 00:54:32,940 for a different length of time. 1248 00:54:32,940 --> 00:54:36,800 Now, why did they pick like an unpleasant sound? 1249 00:54:36,800 --> 00:54:38,630 In part, because they could provide people 1250 00:54:38,630 --> 00:54:39,890 like a sample of it. 1251 00:54:39,890 --> 00:54:43,040 In part because people don't have any experience with that. 1252 00:54:43,040 --> 00:54:45,330 It's completely unclear how much you are 1253 00:54:45,330 --> 00:54:47,090 supposed to be willing to pay-- 1254 00:54:47,090 --> 00:54:50,060 are willing to accept for listening to a sound. 1255 00:54:50,060 --> 00:54:52,103 So there's no price or market price for it. 1256 00:54:52,103 --> 00:54:54,020 And people sort of had to sort of in some ways 1257 00:54:54,020 --> 00:54:56,250 rely on their own preferences for that. 1258 00:54:56,250 --> 00:54:58,800 And then they know it's very easy to change the quantity. 1259 00:54:58,800 --> 00:55:01,320 You can do like 10 seconds, 30 seconds, 50 seconds. 
1260 00:55:01,320 --> 00:55:05,320 It's very easy to manipulate that. 1261 00:55:05,320 --> 00:55:07,030 Now what is the procedure? 1262 00:55:07,030 --> 00:55:12,320 Subjects were listening to a 30 second sample of the noise. 1263 00:55:12,320 --> 00:55:16,390 Then they were asked whether they hypothetically 1264 00:55:16,390 --> 00:55:19,210 would be willing to listen to the noise for another 30 1265 00:55:19,210 --> 00:55:20,970 seconds for x cents. 1266 00:55:20,970 --> 00:55:22,720 And then they were asked their willingness 1267 00:55:22,720 --> 00:55:27,700 to accept for 10, 30 and 60 seconds afterwards. 1268 00:55:27,700 --> 00:55:32,170 Notice that number two is only a hypothetical question, 1269 00:55:32,170 --> 00:55:35,870 and is essentially entirely irrelevant for number three. 1270 00:55:35,870 --> 00:55:36,370 Sorry. 1271 00:55:36,370 --> 00:55:39,040 This is supposed to say number three, not number one. 1272 00:55:39,040 --> 00:55:42,200 So number two here is like completely irrelevant. 1273 00:55:42,200 --> 00:55:45,100 This should really have no effect whatsoever 1274 00:55:45,100 --> 00:55:48,910 on your willingness to accept for the 10, 30 and 60 1275 00:55:48,910 --> 00:55:49,670 seconds. 1276 00:55:49,670 --> 00:55:51,580 Again, this is just a hypothetical question 1277 00:55:51,580 --> 00:55:53,890 that really does not matter, and is not 1278 00:55:53,890 --> 00:55:57,440 going to be implemented at all. 1279 00:55:57,440 --> 00:55:59,050 And then the experimental variation 1280 00:55:59,050 --> 00:56:04,390 here is that x was varied across subjects. 1281 00:56:04,390 --> 00:56:05,650 Now, why would x matter? 1282 00:56:05,650 --> 00:56:07,310 What's going on here? 1283 00:56:07,310 --> 00:56:10,380 What's the experiment trying to do? 1284 00:56:10,380 --> 00:56:13,592 People are very unsure about what their valuation is. 1285 00:56:13,592 --> 00:56:15,300 So now what they do is, essentially, they 1286 00:56:15,300 --> 00:56:20,580 use the x-- the amount that's offered as like an anchor. 1287 00:56:20,580 --> 00:56:24,570 The same way as in, usually in markets, when you go to a store 1288 00:56:24,570 --> 00:56:27,000 and look at how much do certain things 1289 00:56:27,000 --> 00:56:29,700 cost, and you don't know the quality of the underlying 1290 00:56:29,700 --> 00:56:33,060 items, often people sort of try to infer quality or lack 1291 00:56:33,060 --> 00:56:35,190 of quality from the price. 1292 00:56:35,190 --> 00:56:37,380 And if we sort of say, if I'm telling you 1293 00:56:37,380 --> 00:56:39,210 I'm going to pay you like $100 for this, 1294 00:56:39,210 --> 00:56:40,630 it must be really, really painful. 1295 00:56:40,630 --> 00:56:43,440 So people try to sort of infer in some ways something 1296 00:56:43,440 --> 00:56:44,340 from that. 1297 00:56:44,340 --> 00:56:47,920 Now notice that this is explicitly hypothetical. 1298 00:56:47,920 --> 00:56:48,420 [SOUND] 1299 00:56:48,420 --> 00:56:48,990 And so on. 1300 00:56:48,990 --> 00:56:50,700 And so really it shouldn't matter. 1301 00:56:50,700 --> 00:56:52,200 If people were sure about like-- 1302 00:56:52,200 --> 00:56:53,580 I just played you the sound. 1303 00:56:53,580 --> 00:56:55,580 So you should be able to tell me how much you 1304 00:56:55,580 --> 00:56:56,620 like it or dislike it. 1305 00:56:56,620 --> 00:56:58,240 So this should really be irrelevant. 
1306 00:56:58,240 --> 00:57:00,490 But people seem to sort of essentially just not really 1307 00:57:00,490 --> 00:57:02,640 know what's appropriate, or how much they're 1308 00:57:02,640 --> 00:57:05,820 willing to accept or not to do that. 1309 00:57:05,820 --> 00:57:08,600 And x essentially is sort of then anchoring [INAUDIBLE].. 1310 00:57:08,600 --> 00:57:09,100 OK. 1311 00:57:09,100 --> 00:57:11,560 So now we get these somewhat messy graphs. 1312 00:57:11,560 --> 00:57:15,410 Can somebody explain to me what this graph shows? 1313 00:57:15,410 --> 00:57:17,920 What do we find? 1314 00:57:17,920 --> 00:57:20,950 These are [INAUDIBLE] certain subjects. 1315 00:57:20,950 --> 00:57:24,080 And then there's like 10 seconds, 30 seconds, 1316 00:57:24,080 --> 00:57:25,420 and 60 seconds here. 1317 00:57:25,420 --> 00:57:26,950 So people are always-- 1318 00:57:26,950 --> 00:57:29,470 for each of these lines, on average, at least, 1319 00:57:29,470 --> 00:57:31,600 willingness to accept goes up, in the way 1320 00:57:31,600 --> 00:57:34,250 you expect, in the sense of, for more time, people 1321 00:57:34,250 --> 00:57:35,890 are asking for more. 1322 00:57:35,890 --> 00:57:37,760 That's very reasonable. 1323 00:57:37,760 --> 00:57:40,360 So 30 seconds, presumably, are worse than 10 seconds. 1324 00:57:40,360 --> 00:57:43,240 And 60 seconds are worse than 10 seconds. 1325 00:57:43,240 --> 00:57:46,120 But the levels seem to be completely arbitrary. 1326 00:57:46,120 --> 00:57:48,430 Essentially, giving people a high anchor 1327 00:57:48,430 --> 00:57:52,570 increases the levels by a lot for each duration. 1328 00:57:52,570 --> 00:57:55,130 And giving people a low anchor decreases their willingness 1329 00:57:55,130 --> 00:57:57,255 to accept at least a little bit compared to no anchor, 1330 00:57:57,255 --> 00:57:59,890 and surely compared to the [? high anchor. ?] 1331 00:57:59,890 --> 00:58:00,702 That is to say-- 1332 00:58:00,702 --> 00:58:02,660 and this is sort of where this term comes from, 1333 00:58:02,660 --> 00:58:06,130 coherent arbitrariness, which is, essentially, people 1334 00:58:06,130 --> 00:58:08,230 seem to be coherent in the sense of like, 1335 00:58:08,230 --> 00:58:11,890 once you fix a certain level, based on that level, 1336 00:58:11,890 --> 00:58:14,570 if you ask them, OK, if you tell me about 10 seconds 1337 00:58:14,570 --> 00:58:17,020 and I ask you about 30 seconds, about 60 seconds, 1338 00:58:17,020 --> 00:58:19,900 this demand curve or supply curve, if you want, 1339 00:58:19,900 --> 00:58:21,400 looks pretty reasonable. 1340 00:58:21,400 --> 00:58:23,620 But the actual level to start with 1341 00:58:23,620 --> 00:58:27,520 is completely arbitrary, because I can essentially 1342 00:58:27,520 --> 00:58:29,380 manipulate you by quite a bit. 1343 00:58:29,380 --> 00:58:32,090 Look at the differences in magnitudes. 1344 00:58:32,090 --> 00:58:33,460 This is like 50 versus 30. 1345 00:58:33,460 --> 00:58:34,840 That's almost like twice as much, 1346 00:58:34,840 --> 00:58:37,937 depending on just this [INAUDIBLE] anchor. 1347 00:58:37,937 --> 00:58:39,770 And there's sort of different rounds in that 1348 00:58:39,770 --> 00:58:41,853 that you can sort of do this one way or the other. 1349 00:58:41,853 --> 00:58:45,010 And this is like essentially increasing or decreasing, 1350 00:58:45,010 --> 00:58:48,360 starting in 10 seconds versus-- and then going up. 1351 00:58:48,360 --> 00:58:52,170 People's willingness to accept goes up the longer it is. 
1352 00:58:52,170 --> 00:58:54,420 Or if you start with 60 seconds and go down, 1353 00:58:54,420 --> 00:58:56,220 essentially, people's willingness to accept 1354 00:58:56,220 --> 00:58:57,880 goes down the shorter it is. 1355 00:58:57,880 --> 00:58:59,800 So the direction is very much coherent. 1356 00:58:59,800 --> 00:59:01,770 The direction of essentially this sort 1357 00:59:01,770 --> 00:59:05,340 of change in terms of relative to duration 1358 00:59:05,340 --> 00:59:06,720 is very much coherent. 1359 00:59:06,720 --> 00:59:09,930 The level seems very much arbitrary. 1360 00:59:09,930 --> 00:59:14,370 And so this is where the title of the paper comes from. 1361 00:59:14,370 --> 00:59:17,880 Arbitrariness, essentially, people's willingness to accept 1362 00:59:17,880 --> 00:59:20,390 depended strongly on x. 1363 00:59:20,390 --> 00:59:22,140 For x equals 50, the willingness to accept 1364 00:59:22,140 --> 00:59:25,350 is like about $0.59 on average. 1365 00:59:25,350 --> 00:59:29,040 For x equals 10, it's only $0.40. 1366 00:59:29,040 --> 00:59:32,760 But it's also coherent in very sensible ways. 1367 00:59:32,760 --> 00:59:34,770 Their willingness to accept is highly 1368 00:59:34,770 --> 00:59:36,390 sensitive to the duration in the-- 1369 00:59:36,390 --> 00:59:38,670 very much in the expected direction. 1370 00:59:38,670 --> 00:59:43,640 Longer is always perceived to be more painful. 1371 00:59:43,640 --> 00:59:44,180 OK. 1372 00:59:44,180 --> 00:59:45,430 So how do we think about this? 1373 00:59:45,430 --> 00:59:50,612 Well, so one, preferences can be influenced by irrelevant cues. 1374 00:59:50,612 --> 00:59:52,820 For instance, [INAUDIBLE] arbitrary initial question, 1375 00:59:52,820 --> 00:59:54,862 or the Social Security number, or whatever that's 1376 00:59:54,862 --> 00:59:56,490 being elicited to start with. 1377 00:59:56,490 --> 01:00:00,440 But once people have stated a preference, 1378 01:00:00,440 --> 01:00:03,080 related preferences, they're like, essentially, 1379 01:00:03,080 --> 01:00:07,010 surrounding preferences are consistent in a sense of like, 1380 01:00:07,010 --> 01:00:10,380 if you think listening to the sound is painful, 1381 01:00:10,380 --> 01:00:14,910 asking you to listen to it twice as long will be more painful. 1382 01:00:14,910 --> 01:00:16,460 Therefore, I have to pay you more 1383 01:00:16,460 --> 01:00:21,860 to do that compared to what you said in the first place. 1384 01:00:21,860 --> 01:00:23,350 Any questions or is that clear? 1385 01:00:25,980 --> 01:00:27,490 OK. 1386 01:00:27,490 --> 01:00:29,530 So now there are some concerns that x 1387 01:00:29,530 --> 01:00:32,800 might be viewed as a hint from the experimenters 1388 01:00:32,800 --> 01:00:34,390 about how bad the sound is. 1389 01:00:34,390 --> 01:00:36,730 But you know, people just listened to the sound 1390 01:00:36,730 --> 01:00:38,200 for 30 seconds to start with. 1391 01:00:38,200 --> 01:00:40,100 Like, I gave you the sound for 30 seconds. 1392 01:00:40,100 --> 01:00:41,930 I told you exactly what it is. 1393 01:00:41,930 --> 01:00:47,120 So in some sense, that seems like not really a concern 1394 01:00:47,120 --> 01:00:47,630 anyway. 
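Before turning to the remaining concerns, here is a minimal simulation sketch of the "arbitrary level, coherent slope" pattern just described (illustrative only; the anchor values, the scaling with duration, and the noise are made-up numbers, not the paper's estimates). Each simulated subject has no firm level, so the level gets pulled toward the irrelevant anchor x, but every subject scales their stated willingness to accept coherently with duration.

import random

def simulated_wta(anchor_cents):
    # No firm level: the subject's level is pulled toward the (irrelevant) anchor...
    level = anchor_cents * random.uniform(0.6, 1.4)
    # ...but the answers are coherent: longer durations get proportionally higher WTA.
    return {d: round(level * d / 30.0, 1) for d in (10, 30, 60)}

for anchor in (10, 50):  # low vs. high hypothetical anchor, in cents
    sample = [simulated_wta(anchor) for _ in range(1000)]
    means = {d: round(sum(s[d] for s in sample) / len(sample), 1) for d in (10, 30, 60)}
    print("anchor =", anchor, "cents -> mean WTA by duration (cents):", means)

# Levels track the anchor (arbitrary), while within each anchor condition
# WTA rises with duration (coherent), mirroring the pattern in the graphs.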
1395 01:00:47,630 --> 01:00:49,250 But there's also-- the very end of this experiment, 1396 01:00:49,250 --> 01:00:51,440 where x was generated by the last two digits 1397 01:00:51,440 --> 01:00:55,130 of the subject's Social Security number, 1398 01:00:55,130 --> 01:00:58,190 and explicitly so, and still there was a correlation between 1399 01:00:58,190 --> 01:01:00,680 essentially x and subjects' willingness to accept, 1400 01:01:00,680 --> 01:01:03,740 which really shouldn't be in any other-- 1401 01:01:03,740 --> 01:01:06,070 in the absence of such anchoring effects. 1402 01:01:06,070 --> 01:01:07,280 Are stakes too low? 1403 01:01:07,280 --> 01:01:09,740 I told you about like, $0.30, and $0.50. 1404 01:01:09,740 --> 01:01:11,592 Maybe people just don't care, and so on. 1405 01:01:11,592 --> 01:01:13,550 But there's also another experiment [INAUDIBLE] 1406 01:01:13,550 --> 01:01:14,390 10-fold stakes. 1407 01:01:14,390 --> 01:01:16,350 And they got essentially the same results. 1408 01:01:20,110 --> 01:01:22,950 So this evidence now is very much consistent with the idea 1409 01:01:22,950 --> 01:01:25,380 that subjects are searching for their preferences. 1410 01:01:25,380 --> 01:01:27,240 They don't quite know their true willingness 1411 01:01:27,240 --> 01:01:30,940 to accept for the sound, and this is essentially arbitrary. 1412 01:01:30,940 --> 01:01:33,000 But they know their willingness to accept 1413 01:01:33,000 --> 01:01:35,370 should relate to each other in a coherent way. 1414 01:01:35,370 --> 01:01:38,310 And in fact, once you fix the level in some ways, 1415 01:01:38,310 --> 01:01:42,150 they are in fact coherent, or essentially they're 1416 01:01:42,150 --> 01:01:43,320 sort of making sense. 1417 01:01:43,320 --> 01:01:45,240 They're [INAUDIBLE] consistent with that, 1418 01:01:45,240 --> 01:01:48,060 once you fix a certain level. 1419 01:01:48,060 --> 01:01:52,930 Now this is now getting back to Tom Sawyer. 1420 01:01:52,930 --> 01:01:56,300 Now, the paper goes-- so this essentially is sort of saying, 1421 01:01:56,300 --> 01:01:59,630 your level of willingness to accept for a certain good 1422 01:01:59,630 --> 01:02:02,510 that's unpleasant is malleable. 1423 01:02:02,510 --> 01:02:04,130 Essentially, I can manipulate you 1424 01:02:04,130 --> 01:02:08,960 to ask for a lower or a higher price, 1425 01:02:08,960 --> 01:02:11,450 depending on some irrelevant question that I ask you, 1426 01:02:11,450 --> 01:02:14,990 for a given thing that is perceived to be unpleasant. 1427 01:02:14,990 --> 01:02:19,010 Similarly, I can manipulate you to pay more or less 1428 01:02:19,010 --> 01:02:21,530 for a certain good that's perceived 1429 01:02:21,530 --> 01:02:23,750 to be a good, something that you want. 1430 01:02:23,750 --> 01:02:25,880 So the stuff on the Social Security number 1431 01:02:25,880 --> 01:02:29,130 that I showed you previously, going back, 1432 01:02:29,130 --> 01:02:30,390 these are all kind of things-- 1433 01:02:30,390 --> 01:02:32,760 presumably, there's some use for these things. 1434 01:02:32,760 --> 01:02:35,550 Belgian chocolates are supposed to be delicious. 1435 01:02:35,550 --> 01:02:37,230 Red wine, even if you don't like wine, 1436 01:02:37,230 --> 01:02:39,190 you can sell it to somebody else or something. 1437 01:02:39,190 --> 01:02:42,600 These are all things that seem to be worth willing 1438 01:02:42,600 --> 01:02:45,210 to pay for something. 
1439 01:02:45,210 --> 01:02:47,460 And now essentially, given that you 1440 01:02:47,460 --> 01:02:50,430 are willing to pay something, or some positive amount for it, 1441 01:02:50,430 --> 01:02:53,580 I can manipulate now-- or the experimenters, in this case, 1442 01:02:53,580 --> 01:02:58,230 can manipulate people into higher or lower willingness 1443 01:02:58,230 --> 01:02:59,690 to pay. 1444 01:02:59,690 --> 01:03:02,650 And so that's true for positive or negative things. 1445 01:03:02,650 --> 01:03:05,680 Now, what's amazing about the Tom Sawyer story, 1446 01:03:05,680 --> 01:03:08,540 however, isn't the-- 1447 01:03:08,540 --> 01:03:09,820 it's ain't that work? 1448 01:03:09,820 --> 01:03:12,340 It's sort of like, that's the expression he sort of says 1449 01:03:12,340 --> 01:03:13,870 towards the end. 1450 01:03:13,870 --> 01:03:16,690 The amazing part here is that not only 1451 01:03:16,690 --> 01:03:20,770 is Tom Sawyer able to manipulate his friend's willingness 1452 01:03:20,770 --> 01:03:25,090 to accept, or willingness to pay in one direction or-- 1453 01:03:25,090 --> 01:03:27,670 increase or decrease it, but he's even able 1454 01:03:27,670 --> 01:03:30,155 to flip it from something that he 1455 01:03:30,155 --> 01:03:32,530 hates doing, that's essentially something that you really 1456 01:03:32,530 --> 01:03:34,510 do not want to do, where essentially one has 1457 01:03:34,510 --> 01:03:37,840 to pay somebody to do it, but instead, his friend 1458 01:03:37,840 --> 01:03:41,135 is willing to pay Tom Sawyer for being able to do it, right? 1459 01:03:41,135 --> 01:03:42,760 The friend, now, instead of like having 1460 01:03:42,760 --> 01:03:45,260 the friend to pay for it, you can say, I have to pay you $10 1461 01:03:45,260 --> 01:03:48,190 to do it, the friend is very happy to say, Tom, 1462 01:03:48,190 --> 01:03:49,838 I'm going to pay you some amount. 1463 01:03:49,838 --> 01:03:51,880 So I think he gives him some apples or some candy 1464 01:03:51,880 --> 01:03:55,570 or whatever, so that he can actually do it himself. 1465 01:03:55,570 --> 01:03:57,670 So essentially, what the manipulation here does 1466 01:03:57,670 --> 01:04:00,280 is not only changing the level for something 1467 01:04:00,280 --> 01:04:02,890 that's good or bad, but it's flipping the sign 1468 01:04:02,890 --> 01:04:06,130 from willingness to accept to willingness to pay, which 1469 01:04:06,130 --> 01:04:08,387 is a more radical deviation. 1470 01:04:08,387 --> 01:04:10,470 You might sort of think we know what's good for us 1471 01:04:10,470 --> 01:04:11,630 and what's bad for us. 1472 01:04:11,630 --> 01:04:13,630 But it seems to be that what Tom Sawyer is doing 1473 01:04:13,630 --> 01:04:17,764 is manipulating sort of the social perception of the item 1474 01:04:17,764 --> 01:04:20,380 or the activity, and that sort of flips essentially 1475 01:04:20,380 --> 01:04:22,930 the sign of the item. 1476 01:04:22,930 --> 01:04:27,880 So then Ariely now does that in a very beautiful experiment 1477 01:04:27,880 --> 01:04:28,420 in class. 1478 01:04:28,420 --> 01:04:31,180 And I always have wanted to sort of replicate this, 1479 01:04:31,180 --> 01:04:34,000 but I'm not sure I should. 1480 01:04:34,000 --> 01:04:37,510 So what he does is, or at the time, this is with MBAs-- 1481 01:04:37,510 --> 01:04:42,340 where he did a poetry reading from Walt Whitman's 1482 01:04:42,340 --> 01:04:44,020 Leaves of Grass. 
1483 01:04:44,020 --> 01:04:47,725 And there, he has an experiment, where half of the class 1484 01:04:47,725 --> 01:04:49,588 is asked hypothetically-- 1485 01:04:49,588 --> 01:04:51,130 again, hypothetically, whether they'd 1486 01:04:51,130 --> 01:04:53,590 be willing to pay $10 to listen to Ariely 1487 01:04:53,590 --> 01:04:55,450 recite poetry for 10 minutes. 1488 01:04:55,450 --> 01:04:59,440 So he's like, OK, in class, some of you 1489 01:04:59,440 --> 01:05:01,120 are able to listen to this. 1490 01:05:01,120 --> 01:05:02,320 But it could be like-- 1491 01:05:02,320 --> 01:05:05,230 I think it's outside of class, because otherwise [INAUDIBLE].. 1492 01:05:05,230 --> 01:05:09,460 And are you willing to pay $10 for attending this poetry 1493 01:05:09,460 --> 01:05:10,220 reading? 1494 01:05:10,220 --> 01:05:12,220 And I think actually, Ariely is a very good sort 1495 01:05:12,220 --> 01:05:13,120 of story reader. 1496 01:05:13,120 --> 01:05:15,040 So it might actually be fun to listen to it. 1497 01:05:15,040 --> 01:05:17,050 But some of you know to ask hypothetically, 1498 01:05:17,050 --> 01:05:19,690 again, are you willing to pay $10 for it? 1499 01:05:19,690 --> 01:05:21,700 The other half then are asked hypothetically 1500 01:05:21,700 --> 01:05:25,660 are they're willing to accept $10 to listen to Ariely recite 1501 01:05:25,660 --> 01:05:26,920 the poetry for 10 minutes. 1502 01:05:26,920 --> 01:05:30,190 That's to say, it sounds like it's pretty painful 1503 01:05:30,190 --> 01:05:31,570 to listen to Ariely. 1504 01:05:31,570 --> 01:05:33,430 He's going to pay you $10 to do it. 1505 01:05:33,430 --> 01:05:36,920 And are you willing to do that? 1506 01:05:36,920 --> 01:05:39,110 Notice that here is only hypothetical. 1507 01:05:39,110 --> 01:05:40,700 These are all hypothetical choices 1508 01:05:40,700 --> 01:05:42,455 that will not be implemented. 1509 01:05:42,455 --> 01:05:44,330 So these hypothetical questions should really 1510 01:05:44,330 --> 01:05:47,810 have no effect whatsoever on what's asked afterwards. 1511 01:05:47,810 --> 01:05:50,510 Because afterwards, people are in fact indicating 1512 01:05:50,510 --> 01:05:53,210 their monetary valuations for one, three or six 1513 01:05:53,210 --> 01:05:56,230 minutes of poetry reading. 1514 01:05:56,230 --> 01:05:59,340 Now you can already guess what's happening here is here, 1515 01:05:59,340 --> 01:06:00,780 the first condition is essentially 1516 01:06:00,780 --> 01:06:02,460 asking for people's willingness to pay. 1517 01:06:02,460 --> 01:06:03,930 That's essentially selling the item 1518 01:06:03,930 --> 01:06:06,180 as like a good, that's something that you really want. 1519 01:06:06,180 --> 01:06:09,060 And you're sort of essentially encouraging people 1520 01:06:09,060 --> 01:06:11,880 to say, well, you should be willing to pay 1521 01:06:11,880 --> 01:06:16,440 something for it, even as much as like $10 for 10 minutes. 1522 01:06:16,440 --> 01:06:18,270 And that'll [INAUDIBLE] encourage people, 1523 01:06:18,270 --> 01:06:22,140 since they don't quite know, listening to your professor 1524 01:06:22,140 --> 01:06:24,480 is like doing some weird poetry could be really great, 1525 01:06:24,480 --> 01:06:27,060 or could be really bad, they don't quite know. 1526 01:06:27,060 --> 01:06:30,270 But here, essentially, they're manipulated into paying for it, 1527 01:06:30,270 --> 01:06:34,440 while here they're manipulated into being paid for it, 1528 01:06:34,440 --> 01:06:37,890 or asking for money to be paid for it. 
1529 01:06:37,890 --> 01:06:43,765 So now, what they then get is people's willingness to pay 1530 01:06:43,765 --> 01:06:45,390 in each of the two conditions. 1531 01:06:45,390 --> 01:06:47,160 In the first condition, where 1532 01:06:47,160 --> 01:06:51,870 they were given the hypothetical $10-for-10-minutes payment question, 1533 01:06:51,870 --> 01:06:56,260 the willingness to pay is positive and goes up with more minutes. 1534 01:06:56,260 --> 01:06:59,110 In the other condition, it's negative and goes down. 1535 01:06:59,110 --> 01:07:03,100 So what we see essentially is that not only is it 1536 01:07:03,100 --> 01:07:06,760 the case that when you ask people 1537 01:07:06,760 --> 01:07:08,200 how much they are willing to pay, 1538 01:07:08,200 --> 01:07:10,890 people are in fact afterwards willing to pay for it. 1539 01:07:10,890 --> 01:07:13,120 But notice that everybody could have just said zero 1540 01:07:13,120 --> 01:07:14,650 if they didn't want to pay. 1541 01:07:14,650 --> 01:07:18,190 But in addition, once you sort of 1542 01:07:18,190 --> 01:07:21,490 fix their willingness to pay, the demand curve 1543 01:07:21,490 --> 01:07:24,790 is very much sensible, in a sense. 1544 01:07:24,790 --> 01:07:27,195 Once you're willing to pay some amount for one minute, 1545 01:07:27,195 --> 01:07:28,570 you're going to be willing to pay 1546 01:07:28,570 --> 01:07:31,240 more money for more minutes. 1547 01:07:31,240 --> 01:07:33,280 Essentially, once we define this good to be 1548 01:07:33,280 --> 01:07:35,830 a good for you, or this item or this activity 1549 01:07:35,830 --> 01:07:37,570 to be a good thing for you, people 1550 01:07:37,570 --> 01:07:39,190 say more must be better, and sort of 1551 01:07:39,190 --> 01:07:42,020 are willing to pay more for more minutes. 1552 01:07:42,020 --> 01:07:47,320 On the other hand, once the item is determined to be a bad, 1553 01:07:47,320 --> 01:07:49,450 in the sense that this is really a bad activity-- 1554 01:07:49,450 --> 01:07:52,000 listening to your professor reciting 1555 01:07:52,000 --> 01:07:55,993 poetry is really awful, and you have to be paid for that-- 1556 01:07:55,993 --> 01:07:57,910 then they sort of say, well, one minute is bad, 1557 01:07:57,910 --> 01:07:59,680 but six minutes surely is really bad. 1558 01:07:59,680 --> 01:08:02,410 And you have to pay me a lot more for six minutes than 1559 01:08:02,410 --> 01:08:04,270 for one minute. 1560 01:08:04,270 --> 01:08:05,920 And that's essentially saying subjects 1561 01:08:05,920 --> 01:08:09,430 don't know whether this reading is good or bad. 1562 01:08:09,430 --> 01:08:11,180 But they do know, essentially, either way, 1563 01:08:11,180 --> 01:08:13,180 more requires more money. 1564 01:08:13,180 --> 01:08:16,000 And that's essentially exactly the coherent arbitrariness, 1565 01:08:16,000 --> 01:08:19,180 where essentially the level or even 1566 01:08:19,180 --> 01:08:20,680 the sign is really unclear. 1567 01:08:20,680 --> 01:08:22,180 But once the sign is sort of fixed, 1568 01:08:22,180 --> 01:08:26,340 people behave in pretty reasonable, 1569 01:08:26,340 --> 01:08:27,670 coherent ways.
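To make the coherent arbitrariness pattern concrete, here is a minimal illustrative simulation in Python. It is only a sketch: the group sizes, the per-minute valuation range, and the uniform noise are made-up assumptions, not data or code from the lecture or from the poetry experiment itself. The point is simply that an arbitrary initial framing can pin down the sign and rough level of valuations, while valuations within each framing still scale sensibly with the number of minutes.

import random
import statistics

def simulate_subject(framing, rng):
    # Hypothetical per-minute valuation in dollars: its magnitude is noisy
    # ("arbitrary"), and its sign is pinned down by the initial framing.
    per_minute = rng.uniform(0.2, 1.5)
    sign = 1 if framing == "pay" else -1
    # Positive values = willingness to pay; negative = payment demanded (WTA).
    return {minutes: sign * per_minute * minutes for minutes in (1, 3, 6)}

rng = random.Random(0)
pay_group = [simulate_subject("pay", rng) for _ in range(100)]
accept_group = [simulate_subject("accept", rng) for _ in range(100)]

for minutes in (1, 3, 6):
    wtp = statistics.mean(s[minutes] for s in pay_group)
    wta = statistics.mean(s[minutes] for s in accept_group)
    print(f"{minutes} min: pay framing {wtp:+.2f}, accept framing {wta:+.2f}")

# Typical output: valuations in the pay framing are positive and increase with
# minutes; in the accept framing they are negative and decrease (more minutes
# require more compensation) -- arbitrary level and sign, coherent slope.

Of course, this sketch just encodes the hypothesis rather than testing it; the actual experiment elicits this pattern from real subjects' stated valuations.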
1570 01:08:27,670 --> 01:08:29,406 Now, I actually don't remember-- I should look it up; 1571 01:08:29,406 --> 01:08:33,569 it's surely in the paper-- whether it's possible in this condition 1572 01:08:33,569 --> 01:08:34,439 to actually have-- 1573 01:08:34,439 --> 01:08:36,147 [INAUDIBLE] so the [INAUDIBLE] experiment 1574 01:08:36,147 --> 01:08:39,060 would surely always ask you over a whole range, where 1575 01:08:39,060 --> 01:08:43,540 you can always say, between, say, minus $6 or minus $10 1576 01:08:43,540 --> 01:08:44,790 and plus $10, 1577 01:08:44,790 --> 01:08:47,569 now tell me your willingness to accept versus willingness 1578 01:08:47,569 --> 01:08:48,069 to pay. 1579 01:08:48,069 --> 01:08:49,944 So if it's positive, it's willingness to pay. 1580 01:08:49,944 --> 01:08:52,020 If it's negative, it's willingness to accept. 1581 01:08:52,020 --> 01:08:53,189 I think that's what's-- 1582 01:08:53,189 --> 01:08:54,689 so surely, that's how the experiment 1583 01:08:54,689 --> 01:08:56,010 should have been done. 1584 01:08:56,010 --> 01:08:58,838 I would like to think and hope that's also 1585 01:08:58,838 --> 01:08:59,880 how it was actually done. 1586 01:08:59,880 --> 01:09:01,080 I don't remember. 1587 01:09:01,080 --> 01:09:02,543 This was quite a while ago. 1588 01:09:02,543 --> 01:09:04,210 But it's surely in the paper, so I don't 1589 01:09:04,210 --> 01:09:05,609 know if anybody remembers. 1590 01:09:05,609 --> 01:09:07,312 But I think that's exactly right. 1591 01:09:07,312 --> 01:09:09,270 In some sense, the experiment is very much sort 1592 01:09:09,270 --> 01:09:11,340 of manipulating people, in the sense that 1593 01:09:11,340 --> 01:09:16,319 the prompt here really gets you into thinking, OK, you're 1594 01:09:16,319 --> 01:09:17,463 supposed to pay for this. 1595 01:09:17,463 --> 01:09:19,380 Now it's kind of a little bit of a weird thing 1596 01:09:19,380 --> 01:09:21,630 to then say, I'm willing to accept. 1597 01:09:21,630 --> 01:09:26,189 The professor says, are you willing to pay $10 for it? 1598 01:09:26,189 --> 01:09:27,689 And then asks, well, how much are 1599 01:09:27,689 --> 01:09:30,390 you willing to pay for, like, six minutes? 1600 01:09:30,390 --> 01:09:32,580 And then answering, well, you have to pay me like $5, 1601 01:09:32,580 --> 01:09:35,340 is a bit of a weird thing to say-- and the other way around. 1602 01:09:35,340 --> 01:09:38,390 So it's very much, I think, a setup 1603 01:09:38,390 --> 01:09:40,950 to sort of generate the effects that they're looking for. 1604 01:09:40,950 --> 01:09:43,719 I think the underlying-- so I think the underlying essence-- 1605 01:09:43,719 --> 01:09:46,399 and there are some other experiments that show somewhat 1606 01:09:46,399 --> 01:09:47,580 similar results-- 1607 01:09:47,580 --> 01:09:50,600 the underlying essence is right: particularly 1608 01:09:50,600 --> 01:09:53,240 for unknown goods, people's preferences 1609 01:09:53,240 --> 01:09:54,770 are very much malleable. 1610 01:09:54,770 --> 01:09:57,182 And people just don't know what they are. 1611 01:09:57,182 --> 01:09:58,640 And there are also some other things. 1612 01:09:58,640 --> 01:10:01,970 For example, if you think about social activities, 1613 01:10:01,970 --> 01:10:05,090 where maybe your friends like them, or they don't like them-- 1614 01:10:05,090 --> 01:10:07,880 and this is exactly 1615 01:10:07,880 --> 01:10:10,490 the Tom Sawyer example-- people are very much sort 1616 01:10:10,490 --> 01:10:12,020 of malleable.
1617 01:10:12,020 --> 01:10:14,660 And you can vastly shape what people 1618 01:10:14,660 --> 01:10:18,080 want versus not in many cases. 1619 01:10:18,080 --> 01:10:20,730 For example, 1620 01:10:20,730 --> 01:10:23,060 there's some very nice work by Loewenstein 1621 01:10:23,060 --> 01:10:27,020 and coauthors that looks at education. 1622 01:10:27,020 --> 01:10:30,730 And sort of, do students want to study or not? 1623 01:10:30,730 --> 01:10:34,130 Do you want to be a nerd in class or not? 1624 01:10:34,130 --> 01:10:37,550 And so it depends very much on your environment as a kid, 1625 01:10:37,550 --> 01:10:40,310 when you grow up. If all of your friends 1626 01:10:40,310 --> 01:10:42,830 are essentially not working hard, and sort of 1627 01:10:42,830 --> 01:10:45,740 want to be cool, and do not want to study, 1628 01:10:45,740 --> 01:10:48,380 and it's not a cool thing in your environment to study, 1629 01:10:48,380 --> 01:10:50,450 people might just not want that. 1630 01:10:50,450 --> 01:10:53,742 But on the other hand, if you have five nerd friends who 1631 01:10:53,742 --> 01:10:55,700 are all working really hard, or if you're 1632 01:10:55,700 --> 01:10:57,990 running around MIT, studying is a cool thing. 1633 01:10:57,990 --> 01:11:00,050 And everybody sort of is working hard. 1634 01:11:00,050 --> 01:11:04,890 And that suddenly becomes very much a desirable activity. 1635 01:11:04,890 --> 01:11:06,560 So I think, more generally, while this 1636 01:11:06,560 --> 01:11:08,660 is a bit of a contrived experiment, 1637 01:11:08,660 --> 01:11:11,900 surely people's preferences are very much 1638 01:11:11,900 --> 01:11:16,310 malleable through their social environments, 1639 01:11:16,310 --> 01:11:19,790 and what they think others think, 1640 01:11:19,790 --> 01:11:21,380 and how they perceive it. 1641 01:11:21,380 --> 01:11:23,035 And that can be [INAUDIBLE] shaped by-- 1642 01:11:26,470 --> 01:11:29,090 manipulated in certain ways, and potentially affected. 1643 01:11:29,090 --> 01:11:31,520 That could very much be a policy angle, 1644 01:11:31,520 --> 01:11:33,710 if you wanted to change people's behavior profoundly 1645 01:11:33,710 --> 01:11:38,340 without spending much money, in fact. 1646 01:11:38,340 --> 01:11:41,600 Let me-- I think that's mostly what I have to say. 1647 01:11:41,600 --> 01:11:44,240 Let me summarize, and then tell you about next week 1648 01:11:44,240 --> 01:11:46,170 for a second. 1649 01:11:46,170 --> 01:11:48,110 So we asked the question of whether people 1650 01:11:48,110 --> 01:11:49,340 have stable preferences. 1651 01:11:49,340 --> 01:11:50,870 And it seems to me that people don't 1652 01:11:50,870 --> 01:11:53,450 have clear preferences for goods and experiences, 1653 01:11:53,450 --> 01:11:56,330 and sort of construct their preferences on the spot. 1654 01:11:56,330 --> 01:12:00,020 Rather, they're influenced by environmental cues 1655 01:12:00,020 --> 01:12:03,050 in a way that doesn't necessarily 1656 01:12:03,050 --> 01:12:05,870 reflect the true utility from the good or experience. 1657 01:12:05,870 --> 01:12:07,610 And that's very much also what companies 1658 01:12:07,610 --> 01:12:08,527 are doing [INAUDIBLE]. 1659 01:12:08,527 --> 01:12:12,230 And sometimes, some items are made really, really expensive. 1660 01:12:12,230 --> 01:12:13,820 Like, why is this thing expensive?
1661 01:12:13,820 --> 01:12:16,640 And somehow companies are able to create 1662 01:12:16,640 --> 01:12:19,580 some fads, or some way of making things desirable, by just making 1663 01:12:19,580 --> 01:12:23,095 them expensive and making people want them in that way. 1664 01:12:23,095 --> 01:12:24,470 That very much relies on the fact 1665 01:12:24,470 --> 01:12:26,643 that people are in fact sort of malleable. 1666 01:12:26,643 --> 01:12:28,310 And you can sort of introspect and think 1667 01:12:28,310 --> 01:12:31,400 about what things in the world make you generally happy. 1668 01:12:31,400 --> 01:12:32,827 What things do you really like? 1669 01:12:32,827 --> 01:12:34,910 And which things are more things where, 1670 01:12:34,910 --> 01:12:36,860 well, other people like them, and you kind of 1671 01:12:36,860 --> 01:12:39,193 get manipulated, or sort of tricked, 1672 01:12:39,193 --> 01:12:40,920 into liking them. 1673 01:12:40,920 --> 01:12:42,410 And in some sense-- and we'll talk 1674 01:12:42,410 --> 01:12:44,420 about this a little bit next week, when we discuss happiness-- 1675 01:12:44,420 --> 01:12:46,712 it's important to try to figure that out, and to try 1676 01:12:46,712 --> 01:12:49,910 to perhaps not be as much influenced by others, 1677 01:12:49,910 --> 01:12:53,960 and rather to figure out what you generally and truly like. 1678 01:12:53,960 --> 01:12:56,390 So there's a nice series of experiments 1679 01:12:56,390 --> 01:12:59,510 that demonstrate this coherent arbitrariness 1680 01:12:59,510 --> 01:13:02,480 with very clean variation, 1681 01:13:02,480 --> 01:13:06,080 in somewhat contrived contexts. 1682 01:13:06,080 --> 01:13:08,900 So you might wonder then, does it matter in the real world? 1683 01:13:08,900 --> 01:13:11,772 Well, it's perhaps less important in settings 1684 01:13:11,772 --> 01:13:12,980 where people have experience. 1685 01:13:12,980 --> 01:13:14,897 Again, if I had done the same experiments 1686 01:13:14,897 --> 01:13:17,488 with, say, pizza, you know how much you're 1687 01:13:17,488 --> 01:13:18,530 willing to pay for pizza. 1688 01:13:18,530 --> 01:13:19,613 You know the market price, 1689 01:13:19,613 --> 01:13:21,260 and so on. 1690 01:13:21,260 --> 01:13:23,810 And so there, probably, your willingness to pay 1691 01:13:23,810 --> 01:13:26,990 and your preferences are very much set. 1692 01:13:26,990 --> 01:13:29,630 But in some other cases, in particular in new 1693 01:13:29,630 --> 01:13:32,310 environments, [INAUDIBLE] sort of new-- 1694 01:13:32,310 --> 01:13:34,057 when preferences are shaped-- 1695 01:13:34,057 --> 01:13:36,140 for example, think about maybe back when you first 1696 01:13:36,140 --> 01:13:40,610 started at MIT, sort of seeing lots of other students, 1697 01:13:40,610 --> 01:13:42,890 when norms are shaped, people's preferences 1698 01:13:42,890 --> 01:13:46,820 are very much malleable and can be influenced profoundly. 1699 01:13:46,820 --> 01:13:52,370 Now, there's not so much actual field evidence 1700 01:13:52,370 --> 01:13:55,663 in high-stakes settings with this sort of clean nature.
1701 01:13:55,663 --> 01:13:57,080 Having said that, the example of, 1702 01:13:57,080 --> 01:13:58,970 for example, environmental-- sorry, 1703 01:13:58,970 --> 01:14:01,982 educational choices is perhaps the most compelling one, 1704 01:14:01,982 --> 01:14:03,440 because that's really a high-stakes 1705 01:14:03,440 --> 01:14:05,160 setting that matters a lot. 1706 01:14:05,160 --> 01:14:08,450 But people's preferences, or their choices, 1707 01:14:08,450 --> 01:14:10,940 are very much malleable. 1708 01:14:10,940 --> 01:14:14,920 That's all I have to say on coherent arbitrariness. 1709 01:14:14,920 --> 01:14:19,370 So next time, Monday, we're going to talk about poverty. 1710 01:14:19,370 --> 01:14:21,895 Please read the paper by Mani et al. 1711 01:14:21,895 --> 01:14:23,270 And then on Wednesday we're going 1712 01:14:23,270 --> 01:14:25,460 to talk about happiness and mental health. 1713 01:14:25,460 --> 01:14:30,290 Now, I promised you a guest lecturer. 1714 01:14:30,290 --> 01:14:32,990 And so, you know, I was thinking about what's 1715 01:14:32,990 --> 01:14:33,810 a good thing to do. 1716 01:14:33,810 --> 01:14:37,080 And you may have heard about this at UC Berkeley: 1717 01:14:37,080 --> 01:14:40,560 they have llamas come to campus to make students happy, 1718 01:14:40,560 --> 01:14:42,370 and de-stress them in some ways. 1719 01:14:42,370 --> 01:14:45,623 Now, of course, I can't bring any real llamas. 1720 01:14:45,623 --> 01:14:47,040 But what I found-- and you may have 1721 01:14:47,040 --> 01:14:48,460 seen this on The Daily Show-- 1722 01:14:48,460 --> 01:14:50,210 is this thing called Goat To Meeting, 1723 01:14:50,210 --> 01:14:53,280 where you can essentially get a goat or a llama 1724 01:14:53,280 --> 01:14:56,040 to come to your meeting for 10 minutes. 1725 01:14:56,040 --> 01:15:00,600 And I did ask for a llama to arrive. 1726 01:15:00,600 --> 01:15:02,340 But there's apparently a chance that it 1727 01:15:02,340 --> 01:15:03,990 might be a goat or a cow. 1728 01:15:03,990 --> 01:15:08,490 So we'll have a visitor at the end of the lecture. 1729 01:15:08,490 --> 01:15:14,280 I think I asked for 2:20, or between 2:20 and 2:30, 1730 01:15:14,280 --> 01:15:17,400 at which point we'll have a guest lecturer that will hopefully 1731 01:15:17,400 --> 01:15:20,640 either tell us about happiness or maybe make some of you, 1732 01:15:20,640 --> 01:15:23,370 or at least myself, happy. 1733 01:15:23,370 --> 01:15:27,040 That's all I have for today. 1734 01:15:27,040 --> 01:15:30,740 I'm happy to answer any questions that you might have.