1 00:00:00,000 --> 00:00:01,952 [SQUEAKING] 2 00:00:01,952 --> 00:00:04,392 [RUSTLING] 3 00:00:04,392 --> 00:00:07,320 [CLICKING] 4 00:00:10,113 --> 00:00:12,030 FRANK SCHILBACH: So I'm going to briefly recap 5 00:00:12,030 --> 00:00:13,740 what we discussed, and then sort of talk 6 00:00:13,740 --> 00:00:17,580 about a number of different applications, ranging from, 7 00:00:17,580 --> 00:00:20,280 like, work, exercising, credit cards, savings behavior, 8 00:00:20,280 --> 00:00:23,040 drinking, smoking, fertilizer use, and so on. 9 00:00:23,040 --> 00:00:26,670 And then, that'll spill into the next lecture, 10 00:00:26,670 --> 00:00:28,290 or go into next lecture as well. 11 00:00:28,290 --> 00:00:29,790 I'm going to sort of summarize where we are. 12 00:00:29,790 --> 00:00:31,170 What do we know about time preferences? 13 00:00:31,170 --> 00:00:32,250 What have you learned? 14 00:00:32,250 --> 00:00:34,542 What's useful from what have you learned for the world, 15 00:00:34,542 --> 00:00:38,800 and what things might be still up for investigation? 16 00:00:38,800 --> 00:00:40,770 OK, so what happened so far is, like, 17 00:00:40,770 --> 00:00:45,030 I showed you a simple model of exponential discounting. 18 00:00:45,030 --> 00:00:48,840 Again, that's the workhorse model of discounting. 19 00:00:48,840 --> 00:00:51,090 In economics, it's one of the most successful and most 20 00:00:51,090 --> 00:00:53,520 important models that people have written down. 21 00:00:53,520 --> 00:00:56,580 In economics, the Solow model, et cetera, like, many sort 22 00:00:56,580 --> 00:00:59,170 of different important models, thinking about long-run growth 23 00:00:59,170 --> 00:01:02,680 and so on, use this particular model. 24 00:01:02,680 --> 00:01:05,580 So it's been tremendously important and successful. 25 00:01:05,580 --> 00:01:07,950 It has different implications that we discussed 26 00:01:07,950 --> 00:01:09,930 at length, both in class, and lecture, 27 00:01:09,930 --> 00:01:12,870 and in recitation, which is constant discounting, 28 00:01:12,870 --> 00:01:15,330 dynamic consistency, and no demand for commitment. 29 00:01:15,330 --> 00:01:17,880 We discussed some evidence that shows 30 00:01:17,880 --> 00:01:20,520 that these assumptions-- these implications-- 31 00:01:20,520 --> 00:01:21,760 are not warranted. 32 00:01:21,760 --> 00:01:23,698 And then we talked about a different version 33 00:01:23,698 --> 00:01:25,740 of this model, or like an extension, if you want, 34 00:01:25,740 --> 00:01:28,073 of that model, which is the quasi-hyperbolic discounting 35 00:01:28,073 --> 00:01:31,380 model, which adds an additional parameter that measures 36 00:01:31,380 --> 00:01:33,720 people's present bias or present focus, 37 00:01:33,720 --> 00:01:36,810 as people call it sometimes these days, which allows us 38 00:01:36,810 --> 00:01:39,120 to be more flexible and be able to look 39 00:01:39,120 --> 00:01:43,590 at short-run and long-run discounting in the same model, 40 00:01:43,590 --> 00:01:45,990 as in, like, there is one parameter, beta, that measures 41 00:01:45,990 --> 00:01:50,070 people's short-run discount factor, and another parameter, 42 00:01:50,070 --> 00:01:51,982 delta, which is close to one off, 43 00:01:51,982 --> 00:01:53,940 and that measures the long-run discount factor. 44 00:01:53,940 --> 00:01:55,650 That makes the model more flexible, 45 00:01:55,650 --> 00:01:59,220 and we're able to explain some phenomena that might 46 00:01:59,220 --> 00:02:02,270 be hard to explain otherwise. 
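To make the recap concrete, here is a minimal numerical sketch in Python of the beta-delta idea just described; it is not from the lecture, and the parameter values (beta = 0.6, delta = 0.99 per day, rewards of $10 and $15) are made up purely for illustration.

# Minimal sketch of beta-delta (quasi-hyperbolic) discounting; beta = 1 recovers
# the standard exponential model. All numbers are illustrative, not from the lecture.
def present_value(reward, periods_away, beta, delta=0.99):
    """Value today of a reward arriving `periods_away` periods (days) from now."""
    if periods_away == 0:
        return reward                                  # the present is not discounted
    return beta * (delta ** periods_away) * reward     # beta hits every future period equally

for beta in (1.0, 0.6):                                # exponential vs. present-biased
    soon = (present_value(10, 0, beta), present_value(15, 1, beta))       # $10 today vs. $15 tomorrow
    later = (present_value(10, 365, beta), present_value(15, 366, beta))  # same trade-off a year out
    print(beta, soon, later)
# With beta = 1, the larger-later reward wins in both comparisons (dynamic consistency).
# With beta = 0.6, $10 today beats roughly $8.9 for tomorrow, yet the day-366 reward
# still beats the day-365 one: a preference reversal once the delay becomes immediate.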
47 00:02:02,270 --> 00:02:05,880 Any questions on that so far, or last time, or the like? 48 00:02:11,230 --> 00:02:14,770 OK, so then next, we talked about sophistication versus 49 00:02:14,770 --> 00:02:15,910 naiveté. 50 00:02:15,910 --> 00:02:18,230 So this is the issue that we discussed before, 51 00:02:18,230 --> 00:02:22,030 which is present bias creates time inconsistency, right? 52 00:02:22,030 --> 00:02:24,370 When thinking about the future, we want to be patient. 53 00:02:24,370 --> 00:02:26,620 When the time actually comes, when the future actually 54 00:02:26,620 --> 00:02:29,230 arrives, we are impatient. 55 00:02:29,230 --> 00:02:31,690 So then the key question you might ask is, like, well, 56 00:02:31,690 --> 00:02:34,450 do people understand the time inconsistency? 57 00:02:34,450 --> 00:02:37,000 We talked about two different extreme assumptions or versions 58 00:02:37,000 --> 00:02:37,500 of this. 59 00:02:37,500 --> 00:02:40,720 One is full naiveté, which is the idea that the person does 60 00:02:40,720 --> 00:02:43,330 not realize that she will change her mind. 61 00:02:43,330 --> 00:02:45,670 When thinking about the future, she 62 00:02:45,670 --> 00:02:48,310 thinks she's going to follow through on her favorite plan. 63 00:02:48,310 --> 00:02:50,530 When the future comes, she will be patient. 64 00:02:50,530 --> 00:02:54,010 But then of course, the future arrives and surprises happen. 65 00:02:54,010 --> 00:02:57,230 And the person is surprised by their own present bias. 66 00:02:57,230 --> 00:03:00,370 And sort of there is false optimism about future patience, 67 00:03:00,370 --> 00:03:01,920 and sort of the-- 68 00:03:01,920 --> 00:03:03,670 over and over again, the person might say, 69 00:03:03,670 --> 00:03:06,880 this time is different. 70 00:03:06,880 --> 00:03:10,480 Second extreme assumption is full sophistication. 71 00:03:10,480 --> 00:03:11,740 That is perfect foresight. 72 00:03:11,740 --> 00:03:15,130 The person actually understands their beta perfectly well, 73 00:03:15,130 --> 00:03:17,710 and understands that she, in the future, 74 00:03:17,710 --> 00:03:19,510 might not stick to the plans that she has, 75 00:03:19,510 --> 00:03:21,340 and might change her mind. 76 00:03:21,340 --> 00:03:24,610 So she does sort of her best, given the future self's 77 00:03:24,610 --> 00:03:27,800 anticipated changes in behavior. 78 00:03:27,800 --> 00:03:29,560 So taking into account as a constraint what 79 00:03:29,560 --> 00:03:31,850 the person will do in the future, 80 00:03:31,850 --> 00:03:34,900 the person optimizes that way. 81 00:03:34,900 --> 00:03:37,000 There are no surprises about future present bias. 82 00:03:37,000 --> 00:03:39,820 The person has rational expectations. 83 00:03:39,820 --> 00:03:44,260 Those are the two broad extreme assumptions that we discussed. 84 00:03:44,260 --> 00:03:46,330 Now, how can we tell-- how can we actually tell-- 85 00:03:46,330 --> 00:03:50,890 so if you wanted to know about your friend, 86 00:03:50,890 --> 00:03:54,100 and try to understand, is this person naive or sophisticated, 87 00:03:54,100 --> 00:03:56,907 how can we actually tell whether that's the case? 88 00:03:56,907 --> 00:03:57,490 What do we do? 89 00:04:07,460 --> 00:04:09,968 What data could you collect from your friend or from people 90 00:04:09,968 --> 00:04:10,510 that you see? 91 00:04:14,320 --> 00:04:15,220 Yes? 92 00:04:15,220 --> 00:04:16,887 AUDIENCE: Just giving them [INAUDIBLE]..
93 00:04:16,887 --> 00:04:19,466 Trying to see, like, what they procrastinate [INAUDIBLE].. 94 00:04:22,223 --> 00:04:24,390 FRANK SCHILBACH: And what exactly would you collect? 95 00:04:28,640 --> 00:04:31,792 AUDIENCE: I guess if you could, asking them 96 00:04:31,792 --> 00:04:33,500 what they think that they're going to do, 97 00:04:33,500 --> 00:04:35,642 and then seeing what they actually do. 98 00:04:35,642 --> 00:04:37,600 FRANK SCHILBACH: Right, so one thing you can do 99 00:04:37,600 --> 00:04:39,190 is collect the person's beliefs. 100 00:04:39,190 --> 00:04:41,690 So you can ask them, what are you going to do in the future? 101 00:04:41,690 --> 00:04:43,660 And if the person mispredicts what they're going to be 102 00:04:43,660 --> 00:04:45,580 in the future-- in particular, if the person thinks 103 00:04:45,580 --> 00:04:47,913 they're going to be more patient in the future than they 104 00:04:47,913 --> 00:04:50,500 actually are, so if they think their beta in the future is 105 00:04:50,500 --> 00:04:55,780 higher than it actually is, that would suggest at least some 106 00:04:55,780 --> 00:04:58,870 naiveté, right? 107 00:04:58,870 --> 00:05:03,770 What else could we elicit? 108 00:05:03,770 --> 00:05:04,937 Yes? 109 00:05:04,937 --> 00:05:06,770 AUDIENCE: There are some choices that people 110 00:05:06,770 --> 00:05:08,600 make that wouldn't make sense if they're not 111 00:05:08,600 --> 00:05:09,433 being sophisticated. 112 00:05:09,433 --> 00:05:12,439 So for example, if we see them restricting their choices 113 00:05:12,439 --> 00:05:14,897 in some way that we would expect a sophisticated person-- 114 00:05:14,897 --> 00:05:17,230 like that would make sense if they're [INAUDIBLE] naive. 115 00:05:17,230 --> 00:05:18,490 FRANK SCHILBACH: Exactly, so if you offered them 116 00:05:18,490 --> 00:05:21,340 commitment devices, and we said, here's a commitment device. 117 00:05:21,340 --> 00:05:22,840 You can change your future behavior 118 00:05:22,840 --> 00:05:25,600 in certain ways that make certain behaviors in the future 119 00:05:25,600 --> 00:05:26,990 more expensive. 120 00:05:26,990 --> 00:05:29,950 So for example, you might tell your friend, 121 00:05:29,950 --> 00:05:31,540 or your friend might offer to you, 122 00:05:31,540 --> 00:05:34,972 if I don't do the problem set until Friday, 5:00 PM, 123 00:05:34,972 --> 00:05:36,430 because you want to have fun Friday 124 00:05:36,430 --> 00:05:39,880 night, I'm going to pay you $100. 125 00:05:39,880 --> 00:05:44,080 Now, if you make that choice, or if somebody offers you 126 00:05:44,080 --> 00:05:46,480 that option to pay them $100 in case 127 00:05:46,480 --> 00:05:50,350 you haven't done the problem set by Friday 5:00 PM, that choice 128 00:05:50,350 --> 00:05:53,410 doesn't make any sense if the person is 129 00:05:53,410 --> 00:05:54,880 an exponential discounter. 130 00:05:54,880 --> 00:05:57,520 That choice only makes sense if you are present biased 131 00:05:57,520 --> 00:05:59,950 or if you have self-control problems in some way. 132 00:05:59,950 --> 00:06:02,240 And you must be sophisticated in some sense. 133 00:06:02,240 --> 00:06:05,200 So it must indicate some form of sophistication. 134 00:06:05,200 --> 00:06:08,740 To be clear, it doesn't indicate perfect sophistication. 135 00:06:08,740 --> 00:06:11,110 You might be only partially sophisticated. 136 00:06:11,110 --> 00:06:14,410 But at least it indicates some form of sophistication.
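As a minimal sketch of the logic just described, the Python below checks whether Friday's self finishes the problem set, with and without the $100 penalty, for an exponential discounter and for a present-biased one; the effort cost of 6 and later benefit of 8 are made-up numbers for illustration, not from the lecture.

# Illustrative only: effort cost, later benefit, and the $100 stake are made-up numbers.
EFFORT_COST, LATER_BENEFIT, PENALTY = 6.0, 8.0, 100.0

def friday_self_works(beta, penalty):
    """Friday's self works iff working beats skipping, given present bias beta."""
    work = -EFFORT_COST + beta * LATER_BENEFIT   # pay the effort now, get the benefit later
    skip = -penalty                              # the penalty hits (roughly) right away
    return work > skip

for beta in (1.0, 0.5):                          # exponential vs. present-biased
    print(beta, friday_self_works(beta, 0.0), friday_self_works(beta, PENALTY))
# beta = 1.0: works with or without the bet, so accepting the bet adds nothing but risk.
# beta = 0.5: skips without the bet but works with it, so only a (partly) sophisticated
# present-biased person, who anticipates the skipping, has a reason to take on such a bet.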
137 00:06:14,410 --> 00:06:18,580 I'm going to talk about partial sophistication in a bit. 138 00:06:18,580 --> 00:06:20,980 The same awareness issue does not 139 00:06:20,980 --> 00:06:23,530 arise with exponential discounting. 140 00:06:23,530 --> 00:06:24,140 Why is that? 141 00:06:24,140 --> 00:06:25,540 So in some sense, if you thought about it, like, 142 00:06:25,540 --> 00:06:28,300 if you have looked at like the exponential discounting model, 143 00:06:28,300 --> 00:06:30,370 there's nothing about sophistication and naiveté. 144 00:06:30,370 --> 00:06:32,037 There's no parameter that measures that. 145 00:06:32,037 --> 00:06:33,870 And why is that? 146 00:06:33,870 --> 00:06:36,270 Why do we need like a delta hat, or whatever? 147 00:06:39,540 --> 00:06:40,040 Yes? 148 00:06:40,040 --> 00:06:44,347 AUDIENCE: [INAUDIBLE] 149 00:06:44,347 --> 00:06:45,680 FRANK SCHILBACH: Right, exactly. 150 00:06:45,680 --> 00:06:47,925 So the whole issue only arises because there's 151 00:06:47,925 --> 00:06:48,800 time inconsistency. 152 00:06:48,800 --> 00:06:51,170 The future self wants different things 153 00:06:51,170 --> 00:06:52,520 than the current self does. 154 00:06:52,520 --> 00:06:55,760 In the exponential discounting model, there's no such issue. 155 00:06:55,760 --> 00:06:57,390 There's no time inconsistency. 156 00:06:57,390 --> 00:06:59,030 So the future self will always do 157 00:06:59,030 --> 00:07:00,860 what the current self actually wants to do 158 00:07:00,860 --> 00:07:02,372 unless circumstances change. 159 00:07:02,372 --> 00:07:04,580 So we don't need any parameter that sort of looks at, 160 00:07:04,580 --> 00:07:06,200 like, how much is the future self 161 00:07:06,200 --> 00:07:08,720 deviating, because that's not even 162 00:07:08,720 --> 00:07:11,120 an issue in the first place. 163 00:07:11,120 --> 00:07:15,380 OK, so then we talked about sort of extreme assumptions. 164 00:07:15,380 --> 00:07:17,930 So these are extreme assumptions on the two ends. 165 00:07:17,930 --> 00:07:19,340 One is, like, full naiveté. 166 00:07:19,340 --> 00:07:20,330 You're entirely naive. 167 00:07:20,330 --> 00:07:24,308 You just cannot imagine that your beta in the future will be 168 00:07:24,308 --> 00:07:25,100 different from one. 169 00:07:25,100 --> 00:07:26,840 That's full naiveté. 170 00:07:26,840 --> 00:07:29,310 And then there is the other extreme assumption, 171 00:07:29,310 --> 00:07:31,880 which is full sophistication, which is beta hat 172 00:07:31,880 --> 00:07:32,912 equals beta, right? 173 00:07:32,912 --> 00:07:35,120 Essentially, it's like you understand entirely what's 174 00:07:35,120 --> 00:07:36,950 going on with your future beta. 175 00:07:36,950 --> 00:07:41,750 But of course, there's a whole range in between from beta to 1 176 00:07:41,750 --> 00:07:44,930 that beta hat could take. 177 00:07:44,930 --> 00:07:48,110 And that's what we refer to as partial naiveté. 178 00:07:48,110 --> 00:07:52,280 That is to say beta hat measures the beliefs about future beta. 179 00:07:52,280 --> 00:07:55,670 The extreme cases are useful to think about. 180 00:07:55,670 --> 00:07:58,225 But presumably, the truth is somewhere in between. 181 00:07:58,225 --> 00:07:59,600 So the intermediate case might be 182 00:07:59,600 --> 00:08:01,370 sort of the most relevant one, which 183 00:08:01,370 --> 00:08:05,515 is beta hat is in between beta and 1.
184 00:08:05,515 --> 00:08:06,890 So that is to say, the individual 185 00:08:06,890 --> 00:08:09,230 understands that they will experience present bias 186 00:08:09,230 --> 00:08:09,840 in the future. 187 00:08:09,840 --> 00:08:12,530 So I know that I have some present bias in the future. 188 00:08:12,530 --> 00:08:16,400 As an example, say, my beta might be 0.6. 189 00:08:16,400 --> 00:08:18,500 I might think-- I understand to some degree 190 00:08:18,500 --> 00:08:20,210 that my future beta is not 1. 191 00:08:20,210 --> 00:08:22,620 But I might think it's like 0.8 or the like. 192 00:08:22,620 --> 00:08:25,880 So I understand that I will be present biased in the future, 193 00:08:25,880 --> 00:08:30,990 but I underestimate the degree of present bias in the future. 194 00:08:30,990 --> 00:08:35,240 And so if I'm partially naive-- and this is what Maya was 195 00:08:35,240 --> 00:08:36,580 saying earlier-- 196 00:08:36,580 --> 00:08:39,620 I might demand commitment devices anyway. 197 00:08:39,620 --> 00:08:43,230 I might understand I have a problem in the future. 198 00:08:43,230 --> 00:08:45,180 So I want some commitment devices. 199 00:08:45,180 --> 00:08:46,370 But I could also overcommit. 200 00:08:46,370 --> 00:08:48,140 I could demand commitment devices that 201 00:08:48,140 --> 00:08:50,270 are actually not useful for me. 202 00:08:50,270 --> 00:08:50,775 Why is that? 203 00:08:50,775 --> 00:08:52,400 Because I sort of underestimate how bad 204 00:08:52,400 --> 00:08:53,540 my self-control problem is. 205 00:08:53,540 --> 00:08:55,373 I understand there's a self-control problem. 206 00:08:55,373 --> 00:08:57,860 Somebody offers me a commitment device that's fairly weak. 207 00:08:57,860 --> 00:08:59,660 I say great, that's going to help me. 208 00:08:59,660 --> 00:09:01,550 That's going to help me follow through. 209 00:09:01,550 --> 00:09:03,500 But then, surprise, my self-control problem 210 00:09:03,500 --> 00:09:05,762 is actually worse than I anticipated. 211 00:09:05,762 --> 00:09:07,220 And then I have a commitment device 212 00:09:07,220 --> 00:09:09,170 and I'm actually failing with that commitment device, 213 00:09:09,170 --> 00:09:10,837 because the self-control problem happens 214 00:09:10,837 --> 00:09:13,940 to be worse than anticipated. 215 00:09:13,940 --> 00:09:15,020 Any questions on this? 216 00:09:21,900 --> 00:09:25,460 OK, we're going to talk tomorrow about, like-- 217 00:09:25,460 --> 00:09:28,282 a little bit about solving problems with partial naiveté. 218 00:09:28,282 --> 00:09:30,740 We talked about solving problems with full naiveté and full 219 00:09:30,740 --> 00:09:31,702 sophistication. 220 00:09:31,702 --> 00:09:33,410 Partial naiveté is a little bit trickier, 221 00:09:33,410 --> 00:09:35,327 because it sort of requires iterating forwards 222 00:09:35,327 --> 00:09:36,450 and backwards. 223 00:09:36,450 --> 00:09:39,340 We'll talk about this briefly tomorrow. 224 00:09:39,340 --> 00:09:41,650 OK, so now, demand for commitment, we already 225 00:09:41,650 --> 00:09:42,850 talked about this before. 226 00:09:42,850 --> 00:09:44,750 Here is sort of a formal definition. 227 00:09:44,750 --> 00:09:47,710 It's defined as an arrangement entered into by an agent who 228 00:09:47,710 --> 00:09:49,450 restricts his or her future choice set 229 00:09:49,450 --> 00:09:52,300 by making certain choices more expensive, perhaps infinitely 230 00:09:52,300 --> 00:09:53,162 expensive.
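To connect the partial-naiveté example above (a true beta of 0.6 but a believed beta hat of 0.8) to this definition, here is a minimal Python sketch of how a partially naive person can pick a commitment penalty that is too weak; the temptation payoff of 10 and delayed cost of 11 are made-up numbers, not from the lecture.

# Illustrative only: the agent sizes the penalty using the believed beta hat,
# but the future self then acts with the true beta, so the device can fail.
TEMPTATION_NOW, FUTURE_COST = 10.0, 11.0     # immediate payoff vs. delayed cost of indulging
BETA_TRUE, BETA_HAT = 0.6, 0.8

def indulges(beta, penalty):
    """The future self indulges iff the immediate payoff net of the penalty beats the discounted future cost."""
    return TEMPTATION_NOW - penalty > beta * FUTURE_COST

# Smallest penalty the agent *believes* is needed, computed with beta hat:
believed_penalty = TEMPTATION_NOW - BETA_HAT * FUTURE_COST   # = 1.2 here

print(indulges(BETA_HAT, believed_penalty))    # False: the device looks sufficient ex ante
print(indulges(BETA_TRUE, believed_penalty))   # True: too weak once the true beta of 0.6 applies

A fully sophisticated person (beta hat equal to beta) would size the penalty off the true beta instead, and the device would actually bind.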
231 00:09:53,162 --> 00:09:55,120 That's to say, at the margin, you might sort of 232 00:09:55,120 --> 00:09:59,290 pay for something that might restrict your choices 233 00:09:59,290 --> 00:10:01,690 or make your choices in the future more expensive. 234 00:10:01,690 --> 00:10:03,500 If it's not available at all, you 235 00:10:03,500 --> 00:10:06,050 can think of this as, like, the price is infinite, right? 236 00:10:06,050 --> 00:10:08,530 So I don't want to eat donuts tomorrow. 237 00:10:08,530 --> 00:10:10,080 I can make donuts more expensive-- 238 00:10:10,080 --> 00:10:11,080 more and more expensive. 239 00:10:11,080 --> 00:10:12,580 If it's infinitely expensive, donuts 240 00:10:12,580 --> 00:10:16,280 are just not available to me. 241 00:10:16,280 --> 00:10:18,550 So now of course, as we said before, 242 00:10:18,550 --> 00:10:21,250 time-inconsistent preferences, the selves differ. 243 00:10:21,250 --> 00:10:23,140 And this is, again, repetition. 244 00:10:23,140 --> 00:10:24,880 Selves differ between what you want today 245 00:10:24,880 --> 00:10:25,960 versus in the future. 246 00:10:25,960 --> 00:10:28,600 You might sort of worry about misbehaving in the future. 247 00:10:28,600 --> 00:10:30,940 If you understand that, you might 248 00:10:30,940 --> 00:10:33,820 want to discipline your future self by demanding a commitment 249 00:10:33,820 --> 00:10:34,960 device. 250 00:10:34,960 --> 00:10:38,785 OK, so who of you is using commitment devices, or can 251 00:10:38,785 --> 00:10:40,660 you give me an example of a commitment device 252 00:10:40,660 --> 00:10:43,700 that you have used in the past, successful or not? 253 00:10:43,700 --> 00:10:44,576 Yes? 254 00:10:44,576 --> 00:10:47,034 AUDIENCE: I have an app on my phone where I can set a timer 255 00:10:47,034 --> 00:10:49,570 [INAUDIBLE]. 256 00:10:49,570 --> 00:10:50,820 FRANK SCHILBACH: Does it work? 257 00:10:50,820 --> 00:10:53,020 AUDIENCE: Yes. 258 00:10:53,020 --> 00:10:54,950 FRANK SCHILBACH: I should try that. 259 00:10:54,950 --> 00:10:56,060 Yes? 260 00:10:56,060 --> 00:10:59,020 AUDIENCE: I have a browser extension 261 00:10:59,020 --> 00:11:03,022 that blocks certain sites at certain times of the day. 262 00:11:03,022 --> 00:11:04,814 FRANK SCHILBACH: Does it work for you, too? 263 00:11:04,814 --> 00:11:06,087 AUDIENCE: Somewhat. 264 00:11:06,087 --> 00:11:07,670 FRANK SCHILBACH: Why does it not work? 265 00:11:07,670 --> 00:11:11,870 AUDIENCE: Because it's way too easy to just turn it off. 266 00:11:11,870 --> 00:11:13,858 FRANK SCHILBACH: Right, exactly. 267 00:11:13,858 --> 00:11:15,650 So that's an example of a commitment device 268 00:11:15,650 --> 00:11:17,150 that's sort of partial in some ways. 269 00:11:17,150 --> 00:11:18,980 It's not sort of strong enough. 270 00:11:18,980 --> 00:11:20,610 So either you can sort of substitute 271 00:11:20,610 --> 00:11:23,480 to, like, Firefox or whatever, or another browser, you 272 00:11:23,480 --> 00:11:25,940 can substitute your phone, you can substitute your friend's 273 00:11:25,940 --> 00:11:28,100 phone even, and so on. 274 00:11:28,100 --> 00:11:31,370 Or you might just actually be able to turn it off yourself. 275 00:11:31,370 --> 00:11:32,360 But you can circumvent it. 276 00:11:32,360 --> 00:11:36,440 I think there's some apps that have sort of options that don't 277 00:11:36,440 --> 00:11:37,860 allow you to do that at all. 278 00:11:37,860 --> 00:11:40,380 But you guys are, like, a lot of CS majors.
279 00:11:40,380 --> 00:11:43,170 So maybe you can get around that as well. 280 00:11:43,170 --> 00:11:44,180 Any other examples? 281 00:11:44,180 --> 00:11:45,290 Yes? 282 00:11:45,290 --> 00:11:48,290 AUDIENCE: When you go shopping, buying more vegetables 283 00:11:48,290 --> 00:11:50,340 or something so that you feel like, oh, I 284 00:11:50,340 --> 00:11:54,710 have to eat them, otherwise I just wasted my time and money. 285 00:11:54,710 --> 00:11:56,960 FRANK SCHILBACH: So what do you do actually then? 286 00:11:56,960 --> 00:11:59,030 So what's your commitment device? 287 00:11:59,030 --> 00:12:00,950 AUDIENCE: I guess it's like making the choice 288 00:12:00,950 --> 00:12:03,240 to eat something else, like, more expensive. 289 00:12:03,240 --> 00:12:05,532 Because I'd have to, like, go back to the grocery store 290 00:12:05,532 --> 00:12:07,130 and I'd have to pay for it. 291 00:12:07,130 --> 00:12:09,530 FRANK SCHILBACH: But how do you commit then? 292 00:12:09,530 --> 00:12:11,300 Or, like, what's restricting your options? 293 00:12:11,300 --> 00:12:13,615 AUDIENCE: The fact that I'm lazy and I 294 00:12:13,615 --> 00:12:16,860 don't want to go somewhere to get other food. 295 00:12:16,860 --> 00:12:19,210 FRANK SCHILBACH: But, like, so you 296 00:12:19,210 --> 00:12:21,710 know that you're going to do that in the future potentially. 297 00:12:21,710 --> 00:12:24,810 So now, how do you avoid or change your future behavior? 298 00:12:24,810 --> 00:12:27,080 Can you sort of incentivize yourself? 299 00:12:27,080 --> 00:12:30,080 Or can you make sure that-- another version of that 300 00:12:30,080 --> 00:12:31,940 would be, like, so what people often 301 00:12:31,940 --> 00:12:35,780 do is when they buy, for example, potato 302 00:12:35,780 --> 00:12:38,990 chips or the like, they buy really small bags, in part 303 00:12:38,990 --> 00:12:41,900 sort of knowing that if they buy a big bag, which actually would 304 00:12:41,900 --> 00:12:43,837 be cheaper, they would just eat it all. 305 00:12:43,837 --> 00:12:45,920 So then, you have, like, only very small portions. 306 00:12:45,920 --> 00:12:47,660 And then if you want another one, 307 00:12:47,660 --> 00:12:50,060 you have to sort of go to the store and buy more. 308 00:12:50,060 --> 00:12:51,560 So that's like a version-- it's sort 309 00:12:51,560 --> 00:12:53,420 of a version of a commitment device, where essentially, you 310 00:12:53,420 --> 00:12:55,820 commit yourself to, if you want to eat more potato 311 00:12:55,820 --> 00:12:57,560 chips in the future, you have to go back 312 00:12:57,560 --> 00:12:59,227 to the store as opposed to having, like, 313 00:12:59,227 --> 00:13:02,430 a big bag that you can just consume in one session. 314 00:13:02,430 --> 00:13:03,350 Yeah. 315 00:13:03,350 --> 00:13:05,450 Any other examples? 316 00:13:05,450 --> 00:13:06,045 Yes? 317 00:13:06,045 --> 00:13:09,180 AUDIENCE: Carrying cash instead of card. 318 00:13:09,180 --> 00:13:12,742 Like, if you [INAUDIBLE] cash on you, you won't overspend. 319 00:13:12,742 --> 00:13:13,700 FRANK SCHILBACH: I see. 320 00:13:13,700 --> 00:13:15,680 And so that's interesting. 321 00:13:15,680 --> 00:13:16,340 Yeah, exactly. 322 00:13:16,340 --> 00:13:18,023 So, like, credit, it's interesting. 323 00:13:18,023 --> 00:13:19,940 Because in some sense, in some other settings, 324 00:13:19,940 --> 00:13:21,677 carrying cash is sort of not helping. 
325 00:13:21,677 --> 00:13:24,260 But you're saying, like, instead of having a credit card where 326 00:13:24,260 --> 00:13:27,290 you can spend, usually, as much as you like, depending on what 327 00:13:27,290 --> 00:13:29,090 your credit limit is, you might say, 328 00:13:29,090 --> 00:13:32,180 I'm going to go out with $100, and, like, once I've 329 00:13:32,180 --> 00:13:34,160 spent the $100, I'm not going to spend more. 330 00:13:34,160 --> 00:13:36,620 I have to sort of go back home or the like. 331 00:13:36,620 --> 00:13:39,390 And then, I might sort of not give in to temptations. 332 00:13:39,390 --> 00:13:41,780 The reason I was hesitating a little bit 333 00:13:41,780 --> 00:13:43,580 is, like, in developing countries, 334 00:13:43,580 --> 00:13:47,120 in part, in some settings where I work, lots of people 335 00:13:47,120 --> 00:13:49,310 have lots of cash on hand from their work. 336 00:13:49,310 --> 00:13:51,500 For example, cycle rickshaw drivers, 337 00:13:51,500 --> 00:13:54,048 they would have a lot of cash on hand 338 00:13:54,048 --> 00:13:55,340 because they get, like, trips-- 339 00:13:55,340 --> 00:13:57,043 they get paid for every single trip. 340 00:13:57,043 --> 00:13:58,710 And then they have lots of cash on hand. 341 00:13:58,710 --> 00:14:01,002 It's actually quite bad, because then they can spend it 342 00:14:01,002 --> 00:14:03,230 on lots of things any day. 343 00:14:03,230 --> 00:14:07,070 And having it in a less liquid form, or their savings account 344 00:14:07,070 --> 00:14:10,160 or the like, would be a different form of commitment 345 00:14:10,160 --> 00:14:12,320 device that might help them. 346 00:14:12,320 --> 00:14:14,780 There was more-- yes? 347 00:14:14,780 --> 00:14:16,670 AUDIENCE: There's a show, Nathan For You, 348 00:14:16,670 --> 00:14:18,972 where he wants to help people lose weight. 349 00:14:18,972 --> 00:14:21,810 So he takes a picture of them-- like a very embarrassing 350 00:14:21,810 --> 00:14:22,760 picture-- 351 00:14:22,760 --> 00:14:25,376 and gets a notarized letter that gets sent out in two weeks 352 00:14:25,376 --> 00:14:27,200 if they don't lose five pounds. 353 00:14:27,200 --> 00:14:28,855 FRANK SCHILBACH: And does it work? 354 00:14:28,855 --> 00:14:29,480 AUDIENCE: Yeah. 355 00:14:29,480 --> 00:14:30,673 [LAUGHTER] 356 00:14:30,673 --> 00:14:32,090 FRANK SCHILBACH: Wow, interesting. 357 00:14:32,090 --> 00:14:32,870 I've not heard of this. 358 00:14:32,870 --> 00:14:33,440 What is it called? 359 00:14:33,440 --> 00:14:34,470 AUDIENCE: That's how embarrassing the picture is. 360 00:14:34,470 --> 00:14:35,990 FRANK SCHILBACH: I see. 361 00:14:35,990 --> 00:14:37,520 Yeah, you can see, like-- 362 00:14:37,520 --> 00:14:40,970 yeah, it depends a little bit, I guess-- 363 00:14:40,970 --> 00:14:45,140 yeah, so when asked does it work, does it work on average, 364 00:14:45,140 --> 00:14:47,282 or what's being shown, or how costly 365 00:14:47,282 --> 00:14:48,740 is it to fail-- but it sounds like, 366 00:14:48,740 --> 00:14:52,044 at least for some people, that's, in fact, effective. 367 00:14:52,044 --> 00:14:52,545 Yeah? 368 00:14:52,545 --> 00:14:54,170 AUDIENCE: There are some accounts where 369 00:14:54,170 --> 00:14:56,240 you can put money, and it's a bit harder 370 00:14:56,240 --> 00:14:58,490 to take that money out [INAUDIBLE].. 371 00:14:58,490 --> 00:15:03,110 So that's kind of [INAUDIBLE] and you can take it out 372 00:15:03,110 --> 00:15:03,610 [INAUDIBLE]. 373 00:15:03,610 --> 00:15:03,860 FRANK SCHILBACH: Right.
374 00:15:03,860 --> 00:15:05,735 AUDIENCE: If you really need it [INAUDIBLE].. 375 00:15:05,735 --> 00:15:08,420 FRANK SCHILBACH: Right, in fact, a lot of retirement accounts 376 00:15:08,420 --> 00:15:12,480 in the US, in many places, have penalties for early withdrawal. 377 00:15:12,480 --> 00:15:16,940 It's a lot of, like, many employers that offer 401(k) 378 00:15:16,940 --> 00:15:19,310 and other savings vehicles-- 379 00:15:19,310 --> 00:15:22,730 essentially, that's like a retirement sort of-- 380 00:15:22,730 --> 00:15:25,070 tax-deferred retirement savings often sort of subsidized 381 00:15:25,070 --> 00:15:26,090 by the employer. 382 00:15:26,090 --> 00:15:29,720 But often, one condition is that you have to pay a 10%, 383 00:15:29,720 --> 00:15:31,385 or something, penalty to withdraw it. 384 00:15:31,385 --> 00:15:33,260 Similarly, I think for some savings accounts, 385 00:15:33,260 --> 00:15:34,370 that's sort of similar. 386 00:15:34,370 --> 00:15:39,320 And the idea is very much, like, helping you resist temptations 387 00:15:39,320 --> 00:15:41,240 to change your plan. 388 00:15:41,240 --> 00:15:43,940 So you're going to plan for saving for retirement 389 00:15:43,940 --> 00:15:44,990 or something else. 390 00:15:44,990 --> 00:15:47,690 But in fact, you might sort of withdraw early 391 00:15:47,690 --> 00:15:48,860 if you're tempted. 392 00:15:48,860 --> 00:15:50,900 And then usually, the penalty is only 393 00:15:50,900 --> 00:15:52,693 something like 10% or the like. 394 00:15:52,693 --> 00:15:55,110 Because when people have actual shocks, like health shocks 395 00:15:55,110 --> 00:15:57,480 or other issues, they want to be able to take out 396 00:15:57,480 --> 00:15:59,892 the money at some penalty. 397 00:15:59,892 --> 00:16:01,350 So they don't want to have it to be 398 00:16:01,350 --> 00:16:05,790 entirely liquid, because that would be bad for them. 399 00:16:05,790 --> 00:16:07,230 Any other example? 400 00:16:07,230 --> 00:16:08,850 Yes? 401 00:16:08,850 --> 00:16:11,540 AUDIENCE: I have an alarm clock on my phone 402 00:16:11,540 --> 00:16:13,705 where I have to take a picture of something. 403 00:16:13,705 --> 00:16:15,330 So I'll get up, and go to the bathroom, 404 00:16:15,330 --> 00:16:17,245 and take a picture of the thing. 405 00:16:17,245 --> 00:16:18,495 FRANK SCHILBACH: Does it work? 406 00:16:18,495 --> 00:16:20,745 AUDIENCE: No, I take the picture and I go back to bed. 407 00:16:20,745 --> 00:16:23,460 FRANK SCHILBACH: I see, fair enough. 408 00:16:23,460 --> 00:16:24,920 Does it work sometimes? 409 00:16:24,920 --> 00:16:25,920 AUDIENCE: Yeah, kind of. 410 00:16:25,920 --> 00:16:27,463 FRANK SCHILBACH: Sometimes, I see. 411 00:16:27,463 --> 00:16:29,130 Yeah, so there the issue, in some sense, 412 00:16:29,130 --> 00:16:31,110 is, like, if it doesn't work, then 413 00:16:31,110 --> 00:16:32,700 you're kind of worse off than before, 414 00:16:32,700 --> 00:16:36,730 because you slept worse than you would have otherwise. 415 00:16:36,730 --> 00:16:40,240 And you know, there's no benefit of you getting actually up. 416 00:16:40,240 --> 00:16:41,100 Yeah. 417 00:16:41,100 --> 00:16:42,300 One more? 418 00:16:42,300 --> 00:16:42,878 Yes? 419 00:16:42,878 --> 00:16:45,045 AUDIENCE: Would, like, societally implied commitment 420 00:16:45,045 --> 00:16:46,587 about-- it's like, No Shave November, 421 00:16:46,587 --> 00:16:48,788 would some sort of variance on that work? 422 00:16:48,788 --> 00:16:50,205 FRANK SCHILBACH: Of what, exactly? 
423 00:16:50,205 --> 00:16:52,710 AUDIENCE: Like No Shave November or variants of this, 424 00:16:52,710 --> 00:16:55,230 would that imply a commitment device? 425 00:16:55,230 --> 00:16:58,210 FRANK SCHILBACH: Yeah, I mean, I think in some sense, 426 00:16:58,210 --> 00:16:59,280 there's often-- 427 00:16:59,280 --> 00:17:01,530 some commitment devices are essentially sort of public 428 00:17:01,530 --> 00:17:04,388 attention or the like, where people are announcing one way 429 00:17:04,388 --> 00:17:06,930 or the other publicly-- which is kind of what you were saying 430 00:17:06,930 --> 00:17:08,310 earlier-- 431 00:17:08,310 --> 00:17:12,766 where society, in some sense, has some influence where people 432 00:17:12,766 --> 00:17:14,849 would say they would publicly declare that they're 433 00:17:14,849 --> 00:17:15,724 going to lose weight. 434 00:17:15,724 --> 00:17:18,460 They would publicly declare that they're going to save money, 435 00:17:18,460 --> 00:17:18,970 and so on. 436 00:17:18,970 --> 00:17:20,595 And then, there would be social shaming 437 00:17:20,595 --> 00:17:22,700 in some way or the other to do that. 438 00:17:22,700 --> 00:17:24,670 Now, what you want-- so in some sense, 439 00:17:24,670 --> 00:17:26,910 if it's just society imposing things on people, 440 00:17:26,910 --> 00:17:27,690 that's not enough. 441 00:17:27,690 --> 00:17:29,130 What you want is a person actively 442 00:17:29,130 --> 00:17:31,770 sort of making some announcement or some statements 443 00:17:31,770 --> 00:17:35,970 with a goal of making it more costly not to follow through. 444 00:17:35,970 --> 00:17:39,390 OK, so as I said before, demand for commitment 445 00:17:39,390 --> 00:17:42,120 requires at least some partial sophistication. 446 00:17:42,120 --> 00:17:43,615 I have some examples for you here. 447 00:17:43,615 --> 00:17:44,490 One is, like, StickK. 448 00:17:44,490 --> 00:17:45,600 I don't know if you have any of-- has 449 00:17:45,600 --> 00:17:46,558 any of you used StickK? 450 00:17:50,490 --> 00:17:51,090 No? 451 00:17:51,090 --> 00:17:55,200 OK, so StickK is a website founded by an academic, Dean 452 00:17:55,200 --> 00:17:56,730 Karlan and co-authors. 453 00:17:56,730 --> 00:17:59,610 What StickK does is, it's a commitment device 454 00:17:59,610 --> 00:18:02,700 that works, in some sense, in the way I was saying before. 455 00:18:02,700 --> 00:18:06,825 For example, if you want to of commit to certain behaviors, 456 00:18:06,825 --> 00:18:09,030 for example, if I wanted to finish a paper 457 00:18:09,030 --> 00:18:12,180 draft by Friday night, I would have to find a referee. 458 00:18:12,180 --> 00:18:14,580 I would, for example, ask Aaron to be my referee. 459 00:18:14,580 --> 00:18:16,890 And I would say, Aaron, I'm going to give you 460 00:18:16,890 --> 00:18:18,840 my credit card information. 461 00:18:18,840 --> 00:18:23,850 And if I don't finish the paper by Friday night, 462 00:18:23,850 --> 00:18:27,570 I'm going to pay $100 either to Aaron or to some charity-- 463 00:18:27,570 --> 00:18:31,540 could be anti charity, could be the pro-smoking, pro-whatever 464 00:18:31,540 --> 00:18:33,150 society. 465 00:18:33,150 --> 00:18:35,410 The money is going to be gone. 466 00:18:35,410 --> 00:18:36,870 And so then, Friday night comes. 467 00:18:36,870 --> 00:18:41,208 Aaron is the referee who can then decide, or will 468 00:18:41,208 --> 00:18:43,500 decide, then, whether I have actually followed through. 469 00:18:43,500 --> 00:18:44,730 Did I actually write a draft? 
470 00:18:44,730 --> 00:18:47,220 Is there an actual paper that's done? 471 00:18:47,220 --> 00:18:49,750 And if it's done, then nothing happens. 472 00:18:49,750 --> 00:18:51,030 I don't have to pay any money. 473 00:18:51,030 --> 00:18:53,520 If it's not done, the money is gone. 474 00:18:53,520 --> 00:18:56,940 And since the credit card information is on the website, 475 00:18:56,940 --> 00:18:59,430 they can actually just withdraw that money. 476 00:18:59,430 --> 00:19:02,617 What's the problem with this commitment device, potentially? 477 00:19:05,250 --> 00:19:05,750 Yes? 478 00:19:05,750 --> 00:19:07,950 AUDIENCE: The referee not wanting to enforce it? 479 00:19:07,950 --> 00:19:09,200 FRANK SCHILBACH: Yes, exactly. 480 00:19:09,200 --> 00:19:12,110 So, like, I tried to do this a few times in school. 481 00:19:12,110 --> 00:19:13,423 And my friend was my referee. 482 00:19:13,423 --> 00:19:14,840 And then, Friday night would come. 483 00:19:14,840 --> 00:19:17,298 And then I would sort of just convince my friend to, like-- 484 00:19:17,298 --> 00:19:21,823 I'll pay you dinner with if you let me get through. 485 00:19:21,823 --> 00:19:23,240 So you need a very good friend who 486 00:19:23,240 --> 00:19:24,698 is actually sort of then insisting, 487 00:19:24,698 --> 00:19:26,600 and sort of credible, potentially. 488 00:19:26,600 --> 00:19:29,150 Or your enemy could be your referee, if you wanted, 489 00:19:29,150 --> 00:19:31,260 who is happy sort of then to take your money. 490 00:19:31,260 --> 00:19:32,510 But I think it's worth trying. 491 00:19:32,510 --> 00:19:34,343 You can try it out and see whether it works. 492 00:19:34,343 --> 00:19:37,820 I'd love to hear some thoughts on whether it works or not. 493 00:19:37,820 --> 00:19:42,020 There's Clocky which I think is coming out of MIT. 494 00:19:42,020 --> 00:19:43,700 It's an alarm clock that runs away, 495 00:19:43,700 --> 00:19:48,365 which is kind of similar to what you mentioned earlier. 496 00:19:48,365 --> 00:19:50,990 There's also Tocky, which is the alarm clock that jumps around, 497 00:19:50,990 --> 00:19:52,682 and then you have to sort of find it. 498 00:19:52,682 --> 00:19:54,068 [LAUGHTER] 499 00:19:54,068 --> 00:19:56,690 There are some forms of-- 500 00:19:56,690 --> 00:19:58,532 this is a fake alarm clock-- 501 00:19:58,532 --> 00:20:00,740 I don't think it actually exists-- where essentially, 502 00:20:00,740 --> 00:20:03,860 if you don't get up and press a button on your alarm, 503 00:20:03,860 --> 00:20:05,630 you essentially give money to charity 504 00:20:05,630 --> 00:20:08,720 or anti-charities if you want. 505 00:20:08,720 --> 00:20:11,340 There's a thing called Antabuse, which is quite interesting, 506 00:20:11,340 --> 00:20:14,720 which is a drug that essentially interferes 507 00:20:14,720 --> 00:20:17,250 with the metabolism of alcohol. 508 00:20:17,250 --> 00:20:19,250 So this is some people have trouble metabolizing 509 00:20:19,250 --> 00:20:20,178 alcohol anyway. 510 00:20:20,178 --> 00:20:21,470 So when they drink, they flush. 511 00:20:21,470 --> 00:20:25,080 They turn-- they get headaches. 512 00:20:25,080 --> 00:20:27,390 They have to throw up, and feel unwell, and so on. 513 00:20:27,390 --> 00:20:29,510 So for them, drinking is very unpleasant as it is. 514 00:20:29,510 --> 00:20:31,010 It turns out, there's a drug that 515 00:20:31,010 --> 00:20:36,590 can actually sort of reproduce those kinds of behaviors 516 00:20:36,590 --> 00:20:38,150 or feelings. 
517 00:20:38,150 --> 00:20:40,480 And it's essentially-- you can take it. 518 00:20:40,480 --> 00:20:42,650 And then for the next 24 to 48 hours, 519 00:20:42,650 --> 00:20:44,548 you will not be able to metabolize alcohol. 520 00:20:44,548 --> 00:20:45,965 There's also versions of that that 521 00:20:45,965 --> 00:20:48,113 are sort of implants for a month or the like. 522 00:20:48,113 --> 00:20:50,030 But essentially, they make alcohol consumption 523 00:20:50,030 --> 00:20:50,930 very unpleasant. 524 00:20:50,930 --> 00:20:53,680 And this is sort of meant to help people 525 00:20:53,680 --> 00:20:55,880 who have problems with alcohol consumption, 526 00:20:55,880 --> 00:20:58,370 that if you don't want to drink tomorrow, 527 00:20:58,370 --> 00:21:02,510 or in two days from now, or the like, or the next month, 528 00:21:02,510 --> 00:21:06,240 you can take it and then commit to not doing that. 529 00:21:06,240 --> 00:21:09,710 And then alcohol consumption becomes very unpleasant 530 00:21:09,710 --> 00:21:11,780 when consuming alcohol anyway. 531 00:21:11,780 --> 00:21:13,310 It's a little bit dangerous. 532 00:21:13,310 --> 00:21:15,950 Or it can be quite dangerous if people then drink anyway, 533 00:21:15,950 --> 00:21:17,840 because there's sort of an alcohol antabuse 534 00:21:17,840 --> 00:21:21,210 sort of reaction that's quite dangerous. 535 00:21:21,210 --> 00:21:23,300 So it hasn't been particularly successful, 536 00:21:23,300 --> 00:21:26,450 in part also because people sort of fall off the bandwagon. 537 00:21:26,450 --> 00:21:29,060 So if you essentially take that daily, or, like, 538 00:21:29,060 --> 00:21:31,100 every second day, what happens then 539 00:21:31,100 --> 00:21:33,317 is that people start wanting to drink. 540 00:21:33,317 --> 00:21:35,150 And then they just stop taking the Antabuse. 541 00:21:35,150 --> 00:21:35,960 And then they have to sort of just 542 00:21:35,960 --> 00:21:37,910 wait it out for 24 hours or the like 543 00:21:37,910 --> 00:21:39,930 until the drug is out of their system. 544 00:21:39,930 --> 00:21:42,177 And then they drink anyway. 545 00:21:42,177 --> 00:21:44,510 But in principle, it's like the ideal commitment device. 546 00:21:44,510 --> 00:21:46,343 Because essentially, the only reason for you 547 00:21:46,343 --> 00:21:51,740 to take this drug would be to reduce your future drinking. 548 00:21:51,740 --> 00:21:54,560 There's also sort of various versions of self-control 549 00:21:54,560 --> 00:21:58,615 devices which essentially either restrict certain websites that 550 00:21:58,615 --> 00:21:59,240 you can go to-- 551 00:21:59,240 --> 00:22:01,580 you can restrict yourself-- or it restricts them 552 00:22:01,580 --> 00:22:03,730 to certain hours, or you can sort of give yourself 553 00:22:03,730 --> 00:22:06,680 a budget for a number of hours that you 554 00:22:06,680 --> 00:22:09,260 can use certain websites for. 555 00:22:09,260 --> 00:22:10,715 Any questions? 556 00:22:16,760 --> 00:22:19,700 So I also have a video of commitment devices, 557 00:22:19,700 --> 00:22:22,280 which I'm not sure whether the audio actually works. 558 00:22:22,280 --> 00:22:23,460 So we have to try this out. 559 00:22:23,460 --> 00:22:26,510 If not, I'll show it to you tomorrow. 560 00:22:26,510 --> 00:22:29,750 OK, so as you can see, I think we can learn 561 00:22:29,750 --> 00:22:31,580 a lot from watching movies. 562 00:22:31,580 --> 00:22:32,240 And I love TV. 563 00:22:32,240 --> 00:22:34,690 So what can we learn from this? 
564 00:22:34,690 --> 00:22:37,190 I think that it can actually-- we can learn a lot about sort 565 00:22:37,190 --> 00:22:39,260 of commitment devices, and whether they work, 566 00:22:39,260 --> 00:22:41,290 and why they work, or why they might not work. 567 00:22:41,290 --> 00:22:42,950 So what did what did we learn? 568 00:22:45,720 --> 00:22:47,002 Yes? 569 00:22:47,002 --> 00:22:50,404 AUDIENCE: [INAUDIBLE] reversible commitment devices are not 570 00:22:50,404 --> 00:22:52,445 as effective. 571 00:22:52,445 --> 00:22:54,070 FRANK SCHILBACH: Right, so just making, 572 00:22:54,070 --> 00:22:56,910 like, future choices more expensive might not be enough, 573 00:22:56,910 --> 00:22:57,410 right? 574 00:22:57,410 --> 00:22:59,860 I might sort of say I might have self-control problems. 575 00:22:59,860 --> 00:23:02,545 Now, I'm going to increase the price of my future behavior. 576 00:23:02,545 --> 00:23:04,420 But that doesn't mean that it actually works. 577 00:23:04,420 --> 00:23:05,500 It might just be more costly. 578 00:23:05,500 --> 00:23:07,458 Now, I have to climb up the ladder or whatever. 579 00:23:07,458 --> 00:23:11,702 But then I might still eat the cookie, and that's not helping. 580 00:23:11,702 --> 00:23:12,660 What else did we learn? 581 00:23:17,620 --> 00:23:18,358 Yeah? 582 00:23:18,358 --> 00:23:20,400 AUDIENCE: But I think you have to find a balance. 583 00:23:20,400 --> 00:23:22,240 Because if you make the penalty too high, 584 00:23:22,240 --> 00:23:24,540 you're not going to use the commitment device. 585 00:23:24,540 --> 00:23:25,300 FRANK SCHILBACH: Right, exactly. 586 00:23:25,300 --> 00:23:27,370 So what they did, essentially, they made the penalty-- 587 00:23:27,370 --> 00:23:28,787 or they essentially made the price 588 00:23:28,787 --> 00:23:29,950 of cookies infinitely high. 589 00:23:29,950 --> 00:23:31,750 They just sort of gave them away to the birds. 590 00:23:31,750 --> 00:23:34,083 Now they don't have any cookies at all, and sort of now, 591 00:23:34,083 --> 00:23:36,080 they're going to eat no cookies whatsoever, 592 00:23:36,080 --> 00:23:40,150 which is kind of not what they wanted either, right? 593 00:23:40,150 --> 00:23:43,060 AUDIENCE: But I think another lesson is that once 594 00:23:43,060 --> 00:23:46,625 you've made one of these commitment devices 595 00:23:46,625 --> 00:23:50,680 so costly that you actually just transfer it 596 00:23:50,680 --> 00:23:52,250 to another temptation [INAUDIBLE].. 597 00:23:52,250 --> 00:23:53,500 FRANK SCHILBACH: Yes, exactly. 598 00:23:53,500 --> 00:23:54,730 That's what I was saying earlier. 599 00:23:54,730 --> 00:23:56,260 You might sort of restrict your browser. 600 00:23:56,260 --> 00:23:58,343 You might sort of, like, shut down Chrome, or even 601 00:23:58,343 --> 00:23:59,605 your entire laptop. 602 00:23:59,605 --> 00:24:01,480 But then, if you start surfing on your phone, 603 00:24:01,480 --> 00:24:02,737 that's not really helping. 604 00:24:02,737 --> 00:24:04,570 So in some sense there's, like, substitution 605 00:24:04,570 --> 00:24:08,110 to different potential either sort of goods, or devices, 606 00:24:08,110 --> 00:24:10,840 and so on, or technologies that might sort of undo 607 00:24:10,840 --> 00:24:12,310 your commitment device. 608 00:24:12,310 --> 00:24:15,040 So they need to be foolproof in some sense, in the sense of, 609 00:24:15,040 --> 00:24:17,920 like, being able to not substitute to other things. 
610 00:24:24,140 --> 00:24:27,530 OK, so first, substitution across temptation goods 611 00:24:27,530 --> 00:24:29,360 can mitigate the usefulness of commitment. 612 00:24:29,360 --> 00:24:31,882 If, essentially, you have other vices or other things 613 00:24:31,882 --> 00:24:33,590 you're going to do instead, then it's not 614 00:24:33,590 --> 00:24:36,560 really helpful to shut down one thing only. 615 00:24:36,560 --> 00:24:40,160 So you have to think about, if I don't do the behavior that I'm 616 00:24:40,160 --> 00:24:42,620 trying to prevent, is there sort of other behaviors 617 00:24:42,620 --> 00:24:46,020 that am I going to substitute towards, 618 00:24:46,020 --> 00:24:49,740 and then it's not going to be helpful either? 619 00:24:49,740 --> 00:24:52,880 Now, it's also helpful to think about what is it actually-- 620 00:24:52,880 --> 00:24:55,670 what is it actually required for a commitment device 621 00:24:55,670 --> 00:24:57,253 to be helpful to a person? 622 00:24:57,253 --> 00:24:59,420 So first, kind of obviously in some ways, the person 623 00:24:59,420 --> 00:25:01,800 needs to have some self-control problem in the first place. 624 00:25:01,800 --> 00:25:04,342 This needs to be something that, in the future, some behavior 625 00:25:04,342 --> 00:25:05,362 you want to change. 626 00:25:05,362 --> 00:25:07,570 Second, there needs to be some sophistication, right? 627 00:25:07,570 --> 00:25:09,020 Like Frog and Toad, they kind of know 628 00:25:09,020 --> 00:25:10,812 that they're going to eat too many cookies. 629 00:25:10,812 --> 00:25:12,380 They need to sort of understand that. 630 00:25:12,380 --> 00:25:15,240 Third, and this is what was said earlier, 631 00:25:15,240 --> 00:25:17,240 the commitment device needs to be effective. 632 00:25:17,240 --> 00:25:19,490 We need to have something that's sort of strong enough 633 00:25:19,490 --> 00:25:24,860 to be able to help us overcome our self-control problems. 634 00:25:24,860 --> 00:25:26,563 And then fourth, it needs to be-- 635 00:25:26,563 --> 00:25:27,980 the person needs to actually think 636 00:25:27,980 --> 00:25:29,733 that the device is effective. 637 00:25:29,733 --> 00:25:31,400 If I sort of don't have faith in myself, 638 00:25:31,400 --> 00:25:35,240 I think this is not useful, then I'm never going to take it up. 639 00:25:35,240 --> 00:25:39,170 Or conversely, I guess, there's issues potentially with, like, 640 00:25:39,170 --> 00:25:40,160 naiveté. 641 00:25:40,160 --> 00:25:42,050 There's two things that naiveté could do. 642 00:25:42,050 --> 00:25:43,980 So one part in naiveté-- 643 00:25:43,980 --> 00:25:45,990 what I was already talking about before-- 644 00:25:45,990 --> 00:25:48,317 which is to say, I might be naive in a sense 645 00:25:48,317 --> 00:25:49,400 I'm saying, like, I know-- 646 00:25:49,400 --> 00:25:51,442 I don't have-- if I'm fully naive, I might think, 647 00:25:51,442 --> 00:25:52,942 I don't have a self-control problem. 648 00:25:52,942 --> 00:25:53,790 It's not really bad. 649 00:25:53,790 --> 00:25:55,070 So I don't really need a commitment device 650 00:25:55,070 --> 00:25:56,300 in the first place. 651 00:25:56,300 --> 00:25:59,160 That's one version where I sort of under demand commitment 652 00:25:59,160 --> 00:26:00,890 by saying I don't actually need it. 653 00:26:00,890 --> 00:26:03,472 It's not helpful for me anyway. 654 00:26:03,472 --> 00:26:04,680 In the future, I will behave. 655 00:26:04,680 --> 00:26:06,290 So I might not demand commitment. 
656 00:26:06,290 --> 00:26:08,450 Another version is to say, I might actually 657 00:26:08,450 --> 00:26:11,720 demand commitment devices where I overcommit myself in a sense, 658 00:26:11,720 --> 00:26:14,270 like, I think the commitment device is actually helpful 659 00:26:14,270 --> 00:26:15,920 when in fact it is not. 660 00:26:15,920 --> 00:26:17,630 So Frog and Toad were not doing that. 661 00:26:17,630 --> 00:26:19,463 In some sense, they sort of like understood, 662 00:26:19,463 --> 00:26:22,250 to some degree at least, pretty well what they're 663 00:26:22,250 --> 00:26:23,430 going to do in the future. 664 00:26:23,430 --> 00:26:25,730 I think they sort of underestimated the substitution 665 00:26:25,730 --> 00:26:27,438 in this case, in the sense of, like, they 666 00:26:27,438 --> 00:26:31,680 thought giving the cookies to the birds would be helpful. 667 00:26:31,680 --> 00:26:34,240 But in fact, then they sort of underestimated 668 00:26:34,240 --> 00:26:35,360 the substitution. 669 00:26:35,360 --> 00:26:37,040 And in the end, you know, they're going to be worse off. 670 00:26:37,040 --> 00:26:39,415 Because now, he's going to make the cake anyway, and then 671 00:26:39,415 --> 00:26:40,190 eat a lot of cake. 672 00:26:40,190 --> 00:26:43,850 He could have as well just eaten the cookies in the first place. 673 00:26:43,850 --> 00:26:44,870 Any questions on this? 674 00:26:50,430 --> 00:26:54,740 OK, so now, let me sort of turn toward some academic papers, 675 00:26:54,740 --> 00:26:57,990 and sort of actual empirical setups. 676 00:26:57,990 --> 00:27:00,800 So this is the paper that you, I hope, all read, 677 00:27:00,800 --> 00:27:02,780 which is Ariely and Wertenbroch. 678 00:27:02,780 --> 00:27:05,540 Dan Ariely was actually a professor here at Sloan, 679 00:27:05,540 --> 00:27:09,350 and did a bunch of experiments, mostly with MIT students 680 00:27:09,350 --> 00:27:10,990 at Sloan-- 681 00:27:10,990 --> 00:27:12,380 MBAs. 682 00:27:12,380 --> 00:27:16,310 And so these are 51 executives at Sloan. 683 00:27:16,310 --> 00:27:18,410 These are sort of highly incentivized individuals, 684 00:27:18,410 --> 00:27:20,180 in the sense that they take classes, 685 00:27:20,180 --> 00:27:21,920 and if they don't-- if they fail a class, 686 00:27:21,920 --> 00:27:24,170 if they do badly in a class, they have to actually pay 687 00:27:24,170 --> 00:27:24,960 for it themselves. 688 00:27:24,960 --> 00:27:27,773 So they really are incentivized to do well. 689 00:27:27,773 --> 00:27:29,690 They had to submit three papers in their class, 690 00:27:29,690 --> 00:27:31,970 with a 1% grade penalty for late submissions, which 691 00:27:31,970 --> 00:27:34,200 is quite a bit. 692 00:27:34,200 --> 00:27:35,330 There were two groups. 693 00:27:35,330 --> 00:27:37,220 Group A had evenly-spaced deadlines. 694 00:27:37,220 --> 00:27:40,520 Group B had the option to set their own deadlines. 695 00:27:40,520 --> 00:27:46,440 And now, the first sort of result 696 00:27:46,440 --> 00:27:48,860 here is that there was demand for commitment in the sense 697 00:27:48,860 --> 00:27:52,970 that, like, 68% of people chose deadlines 698 00:27:52,970 --> 00:27:54,230 prior to the last week. 699 00:27:54,230 --> 00:27:55,730 So when given the option of when you 700 00:27:55,730 --> 00:27:58,188 want to set your deadlines over the course of the semester, 701 00:27:58,188 --> 00:28:02,540 68% of people chose deadlines that were not the last week.
702 00:28:02,540 --> 00:28:04,316 Why is that demand for commitment? 703 00:28:07,910 --> 00:28:08,410 Yes? 704 00:28:08,410 --> 00:28:09,548 AUDIENCE: More choices as to when 705 00:28:09,548 --> 00:28:11,830 to do the paper, just to make a deadline at the end. 706 00:28:11,830 --> 00:28:13,270 FRANK SCHILBACH: Exactly, like, you-- there's 707 00:28:13,270 --> 00:28:15,728 no reason, unless you have sort of issues with self-control 708 00:28:15,728 --> 00:28:19,360 and the like, you just make it essentially costly 709 00:28:19,360 --> 00:28:20,710 for you to submit late. 710 00:28:20,710 --> 00:28:23,110 If you are worried about submitting everything late, 711 00:28:23,110 --> 00:28:26,050 that's a good thing for you to do, and it might help you. 712 00:28:26,050 --> 00:28:29,380 Again, notice that it requires some sophistication. 713 00:28:29,380 --> 00:28:32,200 You need to sort of understand your future behavior 714 00:28:32,200 --> 00:28:32,883 to do that. 715 00:28:32,883 --> 00:28:34,300 Now, what they find is, like, they 716 00:28:34,300 --> 00:28:36,940 find no late submissions, which is interesting by itself. 717 00:28:36,940 --> 00:28:40,090 But in particular, they find that group A has higher grades 718 00:28:40,090 --> 00:28:41,680 than group B, OK? 719 00:28:41,680 --> 00:28:45,820 Group A is the group that has evenly-spaced deadlines, 720 00:28:45,820 --> 00:28:48,280 compared to group B that has the option 721 00:28:48,280 --> 00:28:50,497 to set their own deadlines. 722 00:28:50,497 --> 00:28:52,330 It is consistent with self-control problems, 723 00:28:52,330 --> 00:28:55,420 in the sense that people set their own deadlines that 724 00:28:55,420 --> 00:28:57,130 are early. 725 00:28:57,130 --> 00:28:58,570 What else is it consistent with? 726 00:28:58,570 --> 00:29:00,678 What did we learn from that? 727 00:29:00,678 --> 00:29:02,970 What do we learn from the fact that group A does better 728 00:29:02,970 --> 00:29:03,570 than group B? 729 00:29:07,460 --> 00:29:08,223 Yes? 730 00:29:08,223 --> 00:29:11,630 AUDIENCE: That the commitment device wasn't strong enough? 731 00:29:11,630 --> 00:29:14,420 FRANK SCHILBACH: Yeah, so in some sense, the commitment 732 00:29:14,420 --> 00:29:17,150 devices that people chose for themselves was not 733 00:29:17,150 --> 00:29:19,340 strong enough, or, like, weaker than 734 00:29:19,340 --> 00:29:21,080 the evenly-spaced deadlines. 735 00:29:21,080 --> 00:29:23,813 Now, notice that evenly-spaced deadlines are 736 00:29:23,813 --> 00:29:24,980 in the choice set of people. 737 00:29:24,980 --> 00:29:27,355 Like, if you choose your own deadlines, you can just say, 738 00:29:27,355 --> 00:29:29,327 I'm choosing evenly-spaced deadlines. 739 00:29:29,327 --> 00:29:31,910 And you should do as well-- if you're perfectly sophisticated, 740 00:29:31,910 --> 00:29:34,760 you would do as well as the group A. 741 00:29:34,760 --> 00:29:36,140 But the fact that the groups-- 742 00:29:36,140 --> 00:29:39,050 group B is doing worse sort of tells us that people are not 743 00:29:39,050 --> 00:29:40,453 optimally setting deadlines. 744 00:29:40,453 --> 00:29:42,620 Either they don't-- there's a fraction of people who 745 00:29:42,620 --> 00:29:43,910 don't choose any deadlines. 746 00:29:43,910 --> 00:29:46,910 There's at least 32% of people who sort of say, 747 00:29:46,910 --> 00:29:48,330 my deadlines are all at the end. 748 00:29:48,330 --> 00:29:49,910 So maybe these are naive people. 
749 00:29:49,910 --> 00:29:51,710 Or even among the people who set deadlines, 750 00:29:51,710 --> 00:29:54,470 they might choose deadlines set further back in the semester, 751 00:29:54,470 --> 00:29:57,200 and they do worse than in the evenly-spaced deadlines. 752 00:29:57,200 --> 00:29:58,310 So what did we learn? 753 00:29:58,310 --> 00:29:59,940 So people do set early deadlines-- 754 00:29:59,940 --> 00:30:01,080 quite a few of them. 755 00:30:01,080 --> 00:30:03,500 So we learn about people are at least partially-- 756 00:30:03,500 --> 00:30:05,062 or some people are sophisticated. 757 00:30:05,062 --> 00:30:06,770 But they're only partially sophisticated. 758 00:30:06,770 --> 00:30:08,990 They don't set deadlines optimally, 759 00:30:08,990 --> 00:30:11,240 which is sort of consistent with people being naive, 760 00:30:11,240 --> 00:30:12,200 at least in some way. 761 00:30:12,200 --> 00:30:12,700 Yeah? 762 00:30:12,700 --> 00:30:14,310 AUDIENCE: I was wondering, does it 763 00:30:14,310 --> 00:30:18,780 make sense to try to find the difference between not 764 00:30:18,780 --> 00:30:24,500 adequately measuring your commitment [INAUDIBLE] 765 00:30:24,500 --> 00:30:26,860 estimating how long it would take you to do the paper? 766 00:30:26,860 --> 00:30:28,860 FRANK SCHILBACH: Right, that's a great question. 767 00:30:28,860 --> 00:30:31,100 So the question is, so there's two things that 768 00:30:31,100 --> 00:30:33,350 are perhaps uncertain when you think about the future. 769 00:30:33,350 --> 00:30:36,130 One is your present bias-- your preference parameter 770 00:30:36,130 --> 00:30:39,370 about how much work you're going to put 771 00:30:39,370 --> 00:30:41,050 in today versus tomorrow, and how much 772 00:30:41,050 --> 00:30:42,440 you're going to procrastinate. 773 00:30:42,440 --> 00:30:45,148 A second question is about your beliefs about your effort 774 00:30:45,148 --> 00:30:46,690 costs in some way or the other, which 775 00:30:46,690 --> 00:30:50,410 is how costly is it for you to actually do the problem set, 776 00:30:50,410 --> 00:30:51,790 or write the paper, and so on. 777 00:30:51,790 --> 00:30:54,187 And you might underestimate how tedious it is to do. 778 00:30:54,187 --> 00:30:56,020 There's sort of a large literature on what's 779 00:30:56,020 --> 00:30:57,270 called the "planning fallacy." 780 00:30:57,270 --> 00:30:59,110 People always plan too many things, 781 00:30:59,110 --> 00:31:03,560 and think whatever they plan takes shorter than it actually 782 00:31:03,560 --> 00:31:04,060 does. 783 00:31:04,060 --> 00:31:05,830 Or put differently, things always 784 00:31:05,830 --> 00:31:08,822 take longer than people think when they make plans. 785 00:31:08,822 --> 00:31:10,780 And the odd thing is that also is the case even 786 00:31:10,780 --> 00:31:11,990 if you take that into account. 787 00:31:11,990 --> 00:31:13,930 So in some sense, even if you're aware of the planning fallacy, 788 00:31:13,930 --> 00:31:15,820 and you plan, and you know that people are going to-- things 789 00:31:15,820 --> 00:31:17,950 are going to take longer, somewhat oddly enough, 790 00:31:17,950 --> 00:31:19,810 people are even taking longer even 791 00:31:19,810 --> 00:31:21,410 if you take that into account. 792 00:31:21,410 --> 00:31:23,440 But anyway, so the question is kind of 793 00:31:23,440 --> 00:31:27,950 like how can we separate that. 794 00:31:27,950 --> 00:31:30,872 I think, from this evidence, we cannot separate this. 
795 00:31:30,872 --> 00:31:32,830 There are sort of like cleaner experiments that 796 00:31:32,830 --> 00:31:34,600 are more careful about this. 797 00:31:34,600 --> 00:31:36,970 In some sense, if it were just about 798 00:31:36,970 --> 00:31:39,010 like underestimating effort costs, 799 00:31:39,010 --> 00:31:41,800 then you would not set your own deadlines necessarily. 800 00:31:41,800 --> 00:31:43,780 Like, the setting early deadlines 801 00:31:43,780 --> 00:31:46,537 is consistent with self-control issues. 802 00:31:46,537 --> 00:31:48,370 But just from the performance, you're right. 803 00:31:48,370 --> 00:31:51,280 That could just be people underestimate effort costs as 804 00:31:51,280 --> 00:31:52,340 well. 805 00:31:52,340 --> 00:31:55,810 In general, I think this is hard to separate in many situations. 806 00:31:55,810 --> 00:31:57,670 The key part about the self-control problems 807 00:31:57,670 --> 00:32:00,670 about the time-inconsistent is a demand for commitment 808 00:32:00,670 --> 00:32:02,140 which sort of reveals that people 809 00:32:02,140 --> 00:32:05,530 have some form of self-control problems related 810 00:32:05,530 --> 00:32:07,690 to time preferences. 811 00:32:07,690 --> 00:32:10,570 Any other question? 812 00:32:10,570 --> 00:32:13,600 So now, there's one concern is the two sessions are not 813 00:32:13,600 --> 00:32:14,440 randomly assigned. 814 00:32:14,440 --> 00:32:15,898 Why is that a problem, potentially? 815 00:32:15,898 --> 00:32:19,060 So we have, like, 51 executives. 816 00:32:19,060 --> 00:32:22,930 25 or something-- 26 are in one and 25 are another, 817 00:32:22,930 --> 00:32:25,180 but they're not randomly assigned 818 00:32:25,180 --> 00:32:27,310 across the two sessions. 819 00:32:27,310 --> 00:32:27,810 Yes? 820 00:32:27,810 --> 00:32:29,762 AUDIENCE: Well, if you're just [INAUDIBLE] 821 00:32:29,762 --> 00:32:34,043 you might want to pick the class with the [INAUDIBLE].. 822 00:32:34,043 --> 00:32:36,460 FRANK SCHILBACH: Right, so there could be selection people 823 00:32:36,460 --> 00:32:38,110 sort of switch sessions. 824 00:32:38,110 --> 00:32:39,250 I am not sure recitations-- 825 00:32:39,250 --> 00:32:42,760 I'm not sure that's actually possible. 826 00:32:42,760 --> 00:32:45,130 But setting that-- so that's surely one problem, 827 00:32:45,130 --> 00:32:47,800 like, selection into the different groups. 828 00:32:47,800 --> 00:32:51,190 Setting that aside, what other problem do we have? 829 00:32:55,180 --> 00:32:56,450 Yes? 830 00:32:56,450 --> 00:32:57,950 AUDIENCE: [INAUDIBLE] something like 831 00:32:57,950 --> 00:32:59,556 if the two sessions are at different times, 832 00:32:59,556 --> 00:33:01,202 and there's some advanced class that we 833 00:33:01,202 --> 00:33:03,252 can see as one session but not the other, 834 00:33:03,252 --> 00:33:05,435 you could end up with more advanced students 835 00:33:05,435 --> 00:33:08,820 in [INAUDIBLE] section, which could affect how good they 836 00:33:08,820 --> 00:33:09,960 are at writing [INAUDIBLE]. 837 00:33:09,960 --> 00:33:12,300 FRANK SCHILBACH: Exactly, there could just be other-- 838 00:33:12,300 --> 00:33:14,050 sorry, this is a version of what you were saying earlier. 839 00:33:14,050 --> 00:33:15,383 So I misunderstood a little bit. 
840 00:33:15,383 --> 00:33:17,910 Exactly, there could be, like, people are-- 841 00:33:17,910 --> 00:33:20,430 for other reasons than the actual treatment, 842 00:33:20,430 --> 00:33:21,715 people might-- 843 00:33:21,715 --> 00:33:22,590 one session is early. 844 00:33:22,590 --> 00:33:23,208 One is late. 845 00:33:23,208 --> 00:33:24,000 One is on Thursday. 846 00:33:24,000 --> 00:33:25,380 One is on Friday. 847 00:33:25,380 --> 00:33:26,387 One has a better TA. 848 00:33:26,387 --> 00:33:27,720 One has a worse TA, or whatever. 849 00:33:27,720 --> 00:33:29,845 People might sort of select into different-- so the 850 00:33:29,845 --> 00:33:34,050 characteristics that people have when they select into 851 00:33:34,050 --> 00:33:36,120 or when they end up in session A versus session B 852 00:33:36,120 --> 00:33:37,510 might be different. 853 00:33:37,510 --> 00:33:39,630 So what you compare is essentially different types 854 00:33:39,630 --> 00:33:41,340 of people. 855 00:33:41,340 --> 00:33:43,440 Second, there's only two sessions or sections. 856 00:33:43,440 --> 00:33:45,480 That's essentially an issue of, like, 857 00:33:45,480 --> 00:33:46,852 sample size and statistics. 858 00:33:46,852 --> 00:33:48,810 They're going to talk to one of the recitations 859 00:33:48,810 --> 00:33:50,320 about that in more detail. 860 00:33:50,320 --> 00:33:54,270 But essentially, the bottom line is that if you randomize, add-- 861 00:33:54,270 --> 00:33:56,700 or if you were to randomize, which they haven't even 862 00:33:56,700 --> 00:33:58,830 done here, but if you were randomize 863 00:33:58,830 --> 00:34:01,620 at the level of a section, then if you only have, 864 00:34:01,620 --> 00:34:04,530 like, two sections, that's essentially not enough. 865 00:34:04,530 --> 00:34:07,110 Because you might face correlated shocks. 866 00:34:07,110 --> 00:34:07,652 What is that? 867 00:34:07,652 --> 00:34:09,152 For example, it might be like one TA 868 00:34:09,152 --> 00:34:10,239 is better than the other. 869 00:34:10,239 --> 00:34:12,840 And one class, A or B, the group might 870 00:34:12,840 --> 00:34:15,449 do much better than the other, not because of the deadline 871 00:34:15,449 --> 00:34:17,620 policy, but just because of the TA. 872 00:34:17,620 --> 00:34:19,770 Or again, it could be that, like, the recitation is 873 00:34:19,770 --> 00:34:21,978 early in the morning, and people don't pay attention, 874 00:34:21,978 --> 00:34:22,889 or whatever. 875 00:34:22,889 --> 00:34:25,250 There might be many other things happening and so on. 876 00:34:25,250 --> 00:34:27,090 So in some sense, the sample size 877 00:34:27,090 --> 00:34:28,980 is A, too small to start with. 878 00:34:28,980 --> 00:34:31,409 50 people is very small for an experiment. 879 00:34:31,409 --> 00:34:34,650 Two, if you randomize or if you have only, like, two treatment 880 00:34:34,650 --> 00:34:39,570 groups, you need more clusters, as people call it-- 881 00:34:39,570 --> 00:34:45,389 more sort of items of randomization overall. 882 00:34:45,389 --> 00:34:47,435 So now, there's a second experiment 883 00:34:47,435 --> 00:34:49,560 that they did sort of to deal with the first issue. 884 00:34:49,560 --> 00:34:53,100 This was sort of like, again, not a huge sample, 885 00:34:53,100 --> 00:34:54,780 but somewhat better. 886 00:34:54,780 --> 00:34:55,889 It's 60 people. 887 00:34:55,889 --> 00:35:00,190 And now, it's randomized into three treatment groups. 888 00:35:00,190 --> 00:35:01,920 One is evenly-spaced deadlines. 
889 00:35:01,920 --> 00:35:04,260 B is, like, no deadlines. 890 00:35:04,260 --> 00:35:06,400 Three is, like, self-imposed deadlines. 891 00:35:06,400 --> 00:35:09,540 This is now a proofreading exercise over 21 days. 892 00:35:09,540 --> 00:35:11,560 This is now randomized at the individual level, 893 00:35:11,560 --> 00:35:12,700 so it's not in groups anymore. 894 00:35:12,700 --> 00:35:14,850 So we get a little bit away from this cluster issue 895 00:35:14,850 --> 00:35:16,450 that we had before. 896 00:35:16,450 --> 00:35:18,180 Now, what does an exponential discounter 897 00:35:18,180 --> 00:35:20,940 do in terms of performance? 898 00:35:20,940 --> 00:35:23,660 What do we predict is going to happen? 899 00:35:23,660 --> 00:35:25,680 So like, a person with beta equals 1, 900 00:35:25,680 --> 00:35:27,342 what do we say is going to happen 901 00:35:27,342 --> 00:35:28,800 to the performance of these groups? 902 00:35:33,580 --> 00:35:35,905 AUDIENCE: It should be the same? 903 00:35:35,905 --> 00:35:36,780 FRANK SCHILBACH: Yes. 904 00:35:36,780 --> 00:35:37,150 Which one? 905 00:35:37,150 --> 00:35:37,650 All of them? 906 00:35:45,766 --> 00:35:49,250 AUDIENCE: I'm not sure. 907 00:35:49,250 --> 00:35:51,950 FRANK SCHILBACH: So let's start with the self-imposed deadlines 908 00:35:51,950 --> 00:35:54,442 and the no deadlines. 909 00:35:54,442 --> 00:35:56,650 What kinds of deadlines is the exponential discounter 910 00:35:56,650 --> 00:35:57,994 going to set? 911 00:35:57,994 --> 00:35:58,619 AUDIENCE: None? 912 00:35:58,619 --> 00:36:00,161 FRANK SCHILBACH: None, somebody said? 913 00:36:00,161 --> 00:36:01,290 AUDIENCE: [INAUDIBLE] 914 00:36:01,290 --> 00:36:01,980 FRANK SCHILBACH: Yes, and-- 915 00:36:01,980 --> 00:36:03,300 AUDIENCE: I don't think they need the deadline. 916 00:36:03,300 --> 00:36:04,900 FRANK SCHILBACH: Yeah, like, why would you set a deadline? 917 00:36:04,900 --> 00:36:05,490 You don't need a deadline. 918 00:36:05,490 --> 00:36:07,800 You're going to just do it whenever it's best to do. 919 00:36:07,800 --> 00:36:09,630 There might be some-- you might get sick 920 00:36:09,630 --> 00:36:11,670 or, like, some other shocks might appear 921 00:36:11,670 --> 00:36:13,060 in the course of the semester. 922 00:36:13,060 --> 00:36:15,090 So deadlines would just restrict you. 923 00:36:15,090 --> 00:36:18,210 So you will not self-impose any deadlines. 924 00:36:18,210 --> 00:36:20,700 So early deadlines essentially just limit flexibility. 925 00:36:20,700 --> 00:36:22,140 That's not helpful to do. 926 00:36:22,140 --> 00:36:25,320 You don't need to sort of self-impose any deadlines. 927 00:36:25,320 --> 00:36:28,920 And then, you might sort of do weakly worse-- 928 00:36:28,920 --> 00:36:31,050 in group A, you might do a little bit worse, 929 00:36:31,050 --> 00:36:34,163 because essentially, shocks might happen or the like. 930 00:36:34,163 --> 00:36:35,580 There's no reason to believe that, 931 00:36:35,580 --> 00:36:38,100 like, if you restrict your options, you would do better. 932 00:36:38,100 --> 00:36:41,070 Group A might happen to do as well as C and B. But if anything, 933 00:36:41,070 --> 00:36:42,930 they are going to be weakly worse, 934 00:36:42,930 --> 00:36:47,670 in part because people get sick around deadlines or the like. 935 00:36:47,670 --> 00:36:48,450 OK? 936 00:36:48,450 --> 00:36:53,340 So what about a sophisticated person with beta hat 937 00:36:53,340 --> 00:36:54,600 equals beta smaller than 1? 938 00:37:07,590 --> 00:37:08,420 Yes?
939 00:37:08,420 --> 00:37:10,054 AUDIENCE: Self-imposed deadlines should 940 00:37:10,054 --> 00:37:12,290 equal evenly-spaced deadlines? 941 00:37:12,290 --> 00:37:15,148 FRANK SCHILBACH: Can you explain more? 942 00:37:15,148 --> 00:37:18,040 AUDIENCE: [INAUDIBLE] 943 00:37:21,753 --> 00:37:23,670 FRANK SCHILBACH: So the evenly-spaced deadline 944 00:37:23,670 --> 00:37:26,720 will do better or worse than no deadlines, to start with? 945 00:37:26,720 --> 00:37:28,130 AUDIENCE: They would do better? 946 00:37:28,130 --> 00:37:30,222 FRANK SCHILBACH: And why is that? 947 00:37:30,222 --> 00:37:35,389 AUDIENCE: Because they-- if they have present bias, 948 00:37:35,389 --> 00:37:36,514 then they'll procrastinate. 949 00:37:36,514 --> 00:37:39,420 So then they would have three things due at the same time. 950 00:37:39,420 --> 00:37:40,590 FRANK SCHILBACH: Exactly. 951 00:37:40,590 --> 00:37:42,000 Exactly, so that should help. 952 00:37:42,000 --> 00:37:44,250 And then the self-imposed deadlines, there, 953 00:37:44,250 --> 00:37:45,830 you can optimally set the deadlines, 954 00:37:45,830 --> 00:37:48,150 so you could probably do at least weakly better than 955 00:37:48,150 --> 00:37:50,640 the evenly-spaced ones, because you might also take 956 00:37:50,640 --> 00:37:53,280 into account some constraints that you have in a semester, 957 00:37:53,280 --> 00:37:56,100 like your family is visiting or the like, 958 00:37:56,100 --> 00:37:57,630 that might be sort of-- 959 00:37:57,630 --> 00:37:59,640 so evenly-spaced deadlines might not 960 00:37:59,640 --> 00:38:01,050 be optimal because of that. 961 00:38:01,050 --> 00:38:03,508 If you self-impose them and you're perfectly sophisticated, 962 00:38:03,508 --> 00:38:05,490 you can take that into account. 963 00:38:05,490 --> 00:38:07,380 So deadlines can help. 964 00:38:07,380 --> 00:38:10,590 And flexible deadlines tend to be preferable, 965 00:38:10,590 --> 00:38:11,970 because you can take into account 966 00:38:11,970 --> 00:38:16,320 your individual specific costs that you might face. 967 00:38:16,320 --> 00:38:18,120 OK, what about the fully naive person? 968 00:38:25,800 --> 00:38:26,580 Yes? 969 00:38:26,580 --> 00:38:28,878 AUDIENCE: So no deadlines and self-imposed deadlines 970 00:38:28,878 --> 00:38:29,670 should be the same. 971 00:38:29,670 --> 00:38:31,362 Because if you're fully naive, you 972 00:38:31,362 --> 00:38:33,570 don't think you need that much [INAUDIBLE] commitment 973 00:38:33,570 --> 00:38:35,237 devices, so you just set them to the end 974 00:38:35,237 --> 00:38:37,060 as if you were exponential. 975 00:38:37,060 --> 00:38:39,630 And evenly-spaced should be weakly better, 976 00:38:39,630 --> 00:38:43,163 because even if it's not fully optimal, 977 00:38:43,163 --> 00:38:44,830 it should stop some of the [INAUDIBLE].. 978 00:38:44,830 --> 00:38:45,872 FRANK SCHILBACH: Correct. 979 00:38:45,872 --> 00:38:48,170 Correct, so as before, the evenly-spaced deadlines, 980 00:38:48,170 --> 00:38:50,290 since the person has beta smaller-- or some people 981 00:38:50,290 --> 00:38:53,530 have beta smaller than 1, the evenly-spaced deadlines likely 982 00:38:53,530 --> 00:38:55,313 do better, or help people do better, 983 00:38:55,313 --> 00:38:57,730 unless they sort of miss a lot of deadlines anyway because 984 00:38:57,730 --> 00:38:58,522 of procrastination.
985 00:38:58,522 --> 00:39:01,150 But assuming the deadlines are sort of helping people space 986 00:39:01,150 --> 00:39:04,180 things out better, they're going to do better 987 00:39:04,180 --> 00:39:06,160 than the no deadlines case. 988 00:39:06,160 --> 00:39:07,950 And then, the fully naive person says, 989 00:39:07,950 --> 00:39:09,200 why would I set any deadlines? 990 00:39:09,200 --> 00:39:10,060 I don't need any deadline. 991 00:39:10,060 --> 00:39:11,007 Flexibility is good. 992 00:39:11,007 --> 00:39:12,340 I'm going to do it early anyway. 993 00:39:12,340 --> 00:39:14,450 Of course, that's not going to happen. 994 00:39:14,450 --> 00:39:16,840 But then essentially, B is doing the same as C, 995 00:39:16,840 --> 00:39:21,070 because the person does not set any deadlines at all, OK? 996 00:39:21,070 --> 00:39:22,990 So now, what about the partially naive person? 997 00:39:37,730 --> 00:39:38,840 Yes? 998 00:39:38,840 --> 00:39:41,390 AUDIENCE: A is better than C which is better than B? 999 00:39:41,390 --> 00:39:43,160 FRANK SCHILBACH: Can you say more? 1000 00:39:43,160 --> 00:39:47,300 AUDIENCE: Sure, so I mean, I guess 1001 00:39:47,300 --> 00:39:49,460 if they're sufficiently sophisticated, then 1002 00:39:49,460 --> 00:39:53,330 it's possible that C is better than A, which is better than B. 1003 00:39:53,330 --> 00:39:56,173 But if they're mostly naive, then they 1004 00:39:56,173 --> 00:39:59,820 will do some kind of deadline imposition 1005 00:39:59,820 --> 00:40:04,150 which is not the absence of deadlines in its entirety, 1006 00:40:04,150 --> 00:40:07,080 but it might not be as good as [INAUDIBLE].. 1007 00:40:07,080 --> 00:40:09,240 FRANK SCHILBACH: So I guess the answer is, like, 1008 00:40:09,240 --> 00:40:10,560 to some degree, it depends. 1009 00:40:10,560 --> 00:40:13,060 So we still have, like, A is better than B. 1010 00:40:13,060 --> 00:40:15,240 So if people have beta smaller than 1, 1011 00:40:15,240 --> 00:40:16,860 the evenly-spaced deadlines still, 1012 00:40:16,860 --> 00:40:20,070 as before, are going to do somewhat better 1013 00:40:20,070 --> 00:40:21,570 than the no deadlines. 1014 00:40:21,570 --> 00:40:23,580 Now, for the self-imposed deadlines, 1015 00:40:23,580 --> 00:40:25,375 essentially two things can happen. 1016 00:40:25,375 --> 00:40:27,000 One is, like, the person might actually 1017 00:40:27,000 --> 00:40:29,260 set some deadlines that happen to be actually helpful. 1018 00:40:29,260 --> 00:40:31,260 If they're sufficiently sophisticated, 1019 00:40:31,260 --> 00:40:34,170 the deadlines might actually help at least some people. 1020 00:40:34,170 --> 00:40:36,983 But for some people, they might set overambitious deadlines 1021 00:40:36,983 --> 00:40:39,150 and say, oh, I'm going to get it done all next week. 1022 00:40:39,150 --> 00:40:41,100 Well, it turns out that's actually not true. 1023 00:40:41,100 --> 00:40:42,810 And then they sort of set too-early deadlines 1024 00:40:42,810 --> 00:40:44,880 and then have to rush, or they miss the deadline, or the like. 1025 00:40:44,880 --> 00:40:46,390 They might actually do worse. 1026 00:40:46,390 --> 00:40:49,680 So there, we have essentially no clear predictions. 1027 00:40:49,680 --> 00:40:50,970 Some commitment should help. 1028 00:40:50,970 --> 00:40:53,850 So C could be better than B. But individuals might also 1029 00:40:53,850 --> 00:40:54,450 overcommit. 1030 00:40:54,450 --> 00:40:56,530 So C might actually be worse than B.
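To keep the cases straight before looking at the results, here is a compact restatement, in code form, of the performance predictions just discussed for group A (evenly-spaced deadlines), B (no deadlines), and C (self-imposed deadlines); this is only a summary of the reasoning above, not an empirical claim.

# Predictions as discussed above, one entry per type of agent
predictions = {
    "exponential (beta = 1)": "sets no deadlines; A weakly worse than B and C",
    "sophisticated (beta_hat = beta < 1)": "C weakly better than A, and both better than B",
    "fully naive (beta_hat = 1)": "sets no deadlines; A better than B, and B equals C",
    "partially naive (beta < beta_hat < 1)": "A better than B; C ambiguous, could beat or trail B",
}
for agent_type, ranking in predictions.items():
    print(agent_type, "->", ranking)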
1031 00:40:56,530 --> 00:40:59,280 So that's-- essentially, there's no empirical-- 1032 00:40:59,280 --> 00:41:01,320 clear empirical prediction. 1033 00:41:01,320 --> 00:41:04,500 Any questions on what I showed you here? 1034 00:41:04,500 --> 00:41:05,764 Yes? 1035 00:41:05,764 --> 00:41:14,373 AUDIENCE: [INAUDIBLE] 1036 00:41:14,373 --> 00:41:15,290 FRANK SCHILBACH: Yeah. 1037 00:41:15,290 --> 00:41:23,412 AUDIENCE: [INAUDIBLE] 1038 00:41:23,412 --> 00:41:24,870 FRANK SCHILBACH: So you mean, like, 1039 00:41:24,870 --> 00:41:26,760 a subsample of people who happen to choose 1040 00:41:26,760 --> 00:41:29,100 evenly-spaced deadlines? 1041 00:41:29,100 --> 00:41:29,917 Yeah. 1042 00:41:29,917 --> 00:41:33,326 AUDIENCE: So the concerns [INAUDIBLE]---- 1043 00:41:33,326 --> 00:41:39,657 so I guess I'm just not sure [INAUDIBLE] could that possibly 1044 00:41:39,657 --> 00:41:43,513 [INAUDIBLE] leave the concerns that you [INAUDIBLE].. 1045 00:41:43,513 --> 00:41:45,930 FRANK SCHILBACH: Yeah, so the issue there is essentially-- 1046 00:41:45,930 --> 00:41:48,097 so one thing they were doing in the paper is to say, 1047 00:41:48,097 --> 00:41:50,700 let's look at the people who set evenly-spaced deadlines, 1048 00:41:50,700 --> 00:41:52,290 and say, when they-- 1049 00:41:52,290 --> 00:41:55,320 how are they doing compared to people in the control group, 1050 00:41:55,320 --> 00:41:56,220 or the like? 1051 00:41:56,220 --> 00:41:58,200 The problem with that is, like, these people 1052 00:41:58,200 --> 00:41:59,670 are selected in certain ways. 1053 00:41:59,670 --> 00:42:02,430 Like, if I sort of just-- 1054 00:42:02,430 --> 00:42:05,320 this is a subset of people that's non-randomly selected. 1055 00:42:05,320 --> 00:42:07,070 We know that they chose certain deadlines. 1056 00:42:07,070 --> 00:42:09,112 So in some sense, we know that they're different, 1057 00:42:09,112 --> 00:42:11,620 in some ways, compared to the average person in the group. 1058 00:42:11,620 --> 00:42:12,780 And these could be people who are, like, 1059 00:42:12,780 --> 00:42:14,580 more productive, less productive, et cetera, and so 1060 00:42:14,580 --> 00:42:15,203 on. 1061 00:42:15,203 --> 00:42:16,620 So the clean design, how you would 1062 00:42:16,620 --> 00:42:18,495 do that is essentially, what you would do is, 1063 00:42:18,495 --> 00:42:23,400 you would ask people to-- 1064 00:42:23,400 --> 00:42:25,170 ask for their deadline choices, and then 1065 00:42:25,170 --> 00:42:27,960 randomize them into different deadline regimens. 1066 00:42:27,960 --> 00:42:29,638 So either they get their choice or not. 1067 00:42:29,638 --> 00:42:31,680 So you can actually estimate the treatment effect 1068 00:42:31,680 --> 00:42:33,097 on those kinds of people, and then 1069 00:42:33,097 --> 00:42:35,730 look at are evenly-spaced deadlines-- 1070 00:42:35,730 --> 00:42:38,490 are those treatment effects on those people worse or not? 1071 00:42:38,490 --> 00:42:41,280 But I worry that in that kind of analysis, 1072 00:42:41,280 --> 00:42:43,500 what's non-random here-- or what they're choosing 1073 00:42:43,500 --> 00:42:46,290 as a non-random subset and compare that with a control 1074 00:42:46,290 --> 00:42:46,830 group-- 1075 00:42:46,830 --> 00:42:49,038 they're sort of-- that's correlated with other stuff. 1076 00:42:49,038 --> 00:42:55,080 And that's not necessarily getting your causal effect. 
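Here is a minimal sketch of the cleaner design described above, with made-up subject labels: elicit each person's preferred deadline schedule first, and only then randomize which schedule is actually imposed, so that outcomes can be compared among people who made the same choice.

import random
random.seed(0)  # toy example, reproducible

# Hypothetical stated preferences, collected before any randomization
stated_choice = {"s1": "evenly_spaced", "s2": "all_at_end",
                 "s3": "front_loaded", "s4": "evenly_spaced"}

assignment = {}
for subject, choice in stated_choice.items():
    # The imposed schedule is random, conditional on the recorded choice
    assignment[subject] = random.choice([choice, "evenly_spaced", "no_deadlines"])
print(assignment)

Because the realized schedule is random conditional on the stated choice, comparing outcomes across assignments within, say, the subjects who asked for evenly-spaced deadlines gives a treatment effect for that self-selected group, which the simple comparison in the paper cannot deliver.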
1077 00:42:55,080 --> 00:43:01,650 Having said that, when you look at the evenly-spaced deadlines, 1078 00:43:01,650 --> 00:43:03,660 they could sort of make things worse in some ways, 1079 00:43:03,660 --> 00:43:05,573 in the sense of, like, even for people 1080 00:43:05,573 --> 00:43:06,990 with beta smaller than 1, it could 1081 00:43:06,990 --> 00:43:08,280 be that people miss the deadlines 1082 00:43:08,280 --> 00:43:10,655 because they're procrastinating, and so on, and so forth. 1083 00:43:10,655 --> 00:43:12,533 Or they had some shocks and the like, 1084 00:43:12,533 --> 00:43:14,700 so it's not necessarily obvious that they do better. 1085 00:43:14,700 --> 00:43:17,400 That's sort of all under some assumptions. 1086 00:43:21,000 --> 00:43:22,424 Any other questions? 1087 00:43:27,640 --> 00:43:29,670 OK, so now, what we find is, like, 1088 00:43:29,670 --> 00:43:32,730 A is doing better than C is doing better than B. 1089 00:43:32,730 --> 00:43:35,970 So this is essentially true for errors detected. 1090 00:43:35,970 --> 00:43:37,650 It's true for delays in submission. 1091 00:43:37,650 --> 00:43:39,480 It's also true for people's earnings 1092 00:43:39,480 --> 00:43:41,558 in the entire exercise. 1093 00:43:41,558 --> 00:43:43,350 So now, what you have is, like, A is better 1094 00:43:43,350 --> 00:43:46,530 than C is better than B. So what is that consistent with? 1095 00:43:46,530 --> 00:43:49,470 Well, it seems like it's consistent with some partial 1096 00:43:49,470 --> 00:43:50,430 naiveté. 1097 00:43:50,430 --> 00:43:55,947 In some sense, admittedly, the prediction of partial naiveté 1098 00:43:55,947 --> 00:43:57,780 is a little bit loose in a sense that, like, 1099 00:43:57,780 --> 00:44:02,190 all we're saying is A is better than B. And you know, 1100 00:44:02,190 --> 00:44:04,840 then the comparison with C could go either way. 1101 00:44:04,840 --> 00:44:07,680 So sure, it's consistent with partial naiveté, 1102 00:44:07,680 --> 00:44:10,050 but it's also only suggestive in some sense. 1103 00:44:10,050 --> 00:44:13,210 Because there's no clear prediction in that case. 1104 00:44:13,210 --> 00:44:15,770 However, we can reject-- sort of like the other cases-- 1105 00:44:15,770 --> 00:44:17,520 we can reject the exponential discounter, 1106 00:44:17,520 --> 00:44:20,940 the fully sophisticated present-biased person, 1107 00:44:20,940 --> 00:44:23,580 and the fully naive present-biased person, 1108 00:44:23,580 --> 00:44:26,200 because those predictions are not borne out. 1109 00:44:26,200 --> 00:44:32,370 So the only case that really is left is the partial naiveté 1110 00:44:32,370 --> 00:44:32,880 one. 1111 00:44:32,880 --> 00:44:35,230 So let me-- sort of to summarize what we saw, 1112 00:44:35,230 --> 00:44:37,830 so the result is that deadline setting improves performance, 1113 00:44:37,830 --> 00:44:40,740 that is evidence of some present bias-- essentially 1114 00:44:40,740 --> 00:44:42,960 beta being smaller than 1. 1115 00:44:42,960 --> 00:44:46,720 It's also consistent with partial naiveté. 1116 00:44:46,720 --> 00:44:48,840 People set deadlines, which is consistent. 1117 00:44:48,840 --> 00:44:50,580 They set early deadlines, which shows 1118 00:44:50,580 --> 00:44:54,360 some demand for commitment, some form of sophistication.
1119 00:44:54,360 --> 00:44:57,360 Then, result 2 says deadline setting is suboptimal, which 1120 00:44:57,360 --> 00:45:00,250 sort of suggests that beta is smaller than beta hat, 1121 00:45:00,250 --> 00:45:03,000 or beta hat is bigger than beta, which means that essentially, 1122 00:45:03,000 --> 00:45:07,470 people seem to underestimate the self-control problems that they 1123 00:45:07,470 --> 00:45:08,610 have in the future. 1124 00:45:08,610 --> 00:45:10,110 So that's, sort of broadly speaking, 1125 00:45:10,110 --> 00:45:16,650 some support of the beta delta model with partial naiveté. 1126 00:45:16,650 --> 00:45:17,900 Any questions on this paper? 1127 00:45:26,700 --> 00:45:27,450 OK. 1128 00:45:27,450 --> 00:45:29,670 Yes, sorry, go ahead. 1129 00:45:29,670 --> 00:45:32,043 AUDIENCE: I was confused by what you meant by partial 1130 00:45:32,043 --> 00:45:33,460 sophistication or partial naiveté. 1131 00:45:33,460 --> 00:45:35,876 FRANK SCHILBACH: So this is what I was discussing earlier, 1132 00:45:35,876 --> 00:45:37,735 partial naivete being-- meaning that-- 1133 00:45:37,735 --> 00:45:39,110 let me just go back for a second. 1134 00:45:45,844 --> 00:45:47,710 This is [INAUDIBLE] discussing here earlier, 1135 00:45:47,710 --> 00:45:49,600 partial naiveté meaning that-- 1136 00:45:49,600 --> 00:45:51,380 so we discussed two cases before, 1137 00:45:51,380 --> 00:45:54,700 which was full naiveté or full sophistication-- 1138 00:45:54,700 --> 00:45:57,670 full naiveté meaning that beta hat equals 1, 1139 00:45:57,670 --> 00:46:01,900 full sophistication meaning that beta hat equals beta. 1140 00:46:01,900 --> 00:46:06,610 And now, if you sort of take the case in between, which is-- 1141 00:46:06,610 --> 00:46:09,040 where you see-- which I have here at the bottom, which 1142 00:46:09,040 --> 00:46:13,180 is beta hat is in the middle of beta and 1, 1143 00:46:13,180 --> 00:46:15,160 or in between beta and 1. 1144 00:46:15,160 --> 00:46:18,220 So essentially, you understand that your beta in the future 1145 00:46:18,220 --> 00:46:19,660 is not 1. 1146 00:46:19,660 --> 00:46:24,920 But you sort of underestimate how small it is. 1147 00:46:24,920 --> 00:46:28,150 So as I said before, my beta might be 0.6. 1148 00:46:28,150 --> 00:46:31,060 I might understand my future beta is not 1. 1149 00:46:31,060 --> 00:46:32,920 So I understand it's smaller than 1. 1150 00:46:32,920 --> 00:46:35,530 But I might think it's, like, 0.8 or 0.9, which 1151 00:46:35,530 --> 00:46:39,430 is to say I overestimate my future beta to some degree. 1152 00:46:39,430 --> 00:46:41,142 While I get it right that it's not 1, 1153 00:46:41,142 --> 00:46:42,850 I might sort of get it wrong in the sense 1154 00:46:42,850 --> 00:46:44,200 of saying it's not 0.6. 1155 00:46:44,200 --> 00:46:47,360 In fact, I might think it's higher than that. 1156 00:46:47,360 --> 00:46:47,860 OK. 1157 00:46:54,450 --> 00:46:55,710 OK, let me go back to-- 1158 00:46:55,710 --> 00:47:00,517 OK, so in some sense, these are nice experiments. 1159 00:47:00,517 --> 00:47:02,850 They're useful experiments to help us sort of understand 1160 00:47:02,850 --> 00:47:04,260 is there evidence for this model, 1161 00:47:04,260 --> 00:47:07,920 is there evidence for some of the predictions that we have? 
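To put the beta and beta hat numbers mentioned a moment ago (a true beta of 0.6, a believed beta of 0.8 or 0.9) into a concrete calculation, here is a minimal sketch of the standard "doing it once" logic under quasi-hyperbolic discounting; the effort costs are hypothetical, delta is set to 1, and none of this is taken from the paper. The task has to be done on one of four days, and it gets costlier the longer you wait.

COSTS = [4.0, 6.0, 9.0, 14.0]   # hypothetical effort cost of doing the task on day 1, 2, 3, 4

def sophisticate_day(beta, costs, start):
    """Backward induction: the day a sophisticated agent with present bias beta
    does the task, given that the days before `start` have already passed."""
    do_day = len(costs)                         # the last-day self has no choice
    for t in range(len(costs) - 1, start - 1, -1):
        if costs[t - 1] <= beta * costs[do_day - 1]:
            do_day = t                          # the day-t self prefers to do it now
    return do_day

def actual_day(beta, beta_hat, costs):
    """The day a (beta, beta_hat) agent actually does the task: each day she
    compares acting now with her belief about what happens if she waits,
    treating her future selves as sophisticates with present bias beta_hat."""
    for t in range(1, len(costs)):
        believed_day = sophisticate_day(beta_hat, costs, start=t + 1)
        if costs[t - 1] <= beta * costs[believed_day - 1]:
            return t
    return len(costs)                           # otherwise it only gets done on the last day

for label, b, b_hat in [("exponential", 1.0, 1.0), ("sophisticated", 0.6, 0.6),
                        ("partially naive", 0.6, 0.9), ("fully naive", 0.6, 1.0)]:
    print(label, "-> does the task on day", actual_day(b, b_hat, COSTS))

With these numbers the exponential type works on day 1 and the sophisticate on day 2, while the fully naive and the partially naive types both wait until day 4, each morning believing they will do it tomorrow. That is exactly the pattern the deadline experiments are probing.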
1162 00:47:07,920 --> 00:47:10,020 But it's still fairly contrived in the sense of, 1163 00:47:10,020 --> 00:47:12,227 like, it's kind of like-- especially 1164 00:47:12,227 --> 00:47:14,310 in the second experiment-- it's kind of like a lab 1165 00:47:14,310 --> 00:47:15,540 experiment and the like. 1166 00:47:15,540 --> 00:47:18,270 Really, what we want to know is, in the real world, 1167 00:47:18,270 --> 00:47:20,700 are commitment devices A, helpful in some sense? 1168 00:47:20,700 --> 00:47:21,930 Do people want them? 1169 00:47:21,930 --> 00:47:23,460 And B, are they actually improving 1170 00:47:23,460 --> 00:47:24,750 behaviors in certain ways? 1171 00:47:24,750 --> 00:47:26,340 Should a firm use commitment devices? 1172 00:47:26,340 --> 00:47:30,182 This is actually helpful for improving firm performance. 1173 00:47:30,182 --> 00:47:32,390 So there's a very nice paper by Supreet Kaur, Sendhil 1174 00:47:32,390 --> 00:47:33,765 Mullainathan, and Michael Kremer, 1175 00:47:33,765 --> 00:47:36,570 who sort of investigate this question using 1176 00:47:36,570 --> 00:47:38,650 data entry workers in India. 1177 00:47:38,650 --> 00:47:41,970 So this is a full-time job for people. 1178 00:47:41,970 --> 00:47:45,190 This is the primary source of earnings for these workers. 1179 00:47:45,190 --> 00:47:48,030 These are people who are sort of recruited as data entry workers 1180 00:47:48,030 --> 00:47:51,930 for the course of 13 months-- so over a year-- 1181 00:47:51,930 --> 00:47:53,812 as a typical data entry worker. 1182 00:47:53,812 --> 00:47:55,770 So in some sense, there was a job ad that said, 1183 00:47:55,770 --> 00:47:56,940 we're looking for a data entry worker. 1184 00:47:56,940 --> 00:47:58,350 We have data to be entered. 1185 00:47:58,350 --> 00:48:00,060 People would answer the job ad and then 1186 00:48:00,060 --> 00:48:04,530 sort of start working to do data entry. 1187 00:48:04,530 --> 00:48:08,610 Output is measured, as it is usual done in many data entry 1188 00:48:08,610 --> 00:48:11,910 companies, by measuring the number of accurate 1189 00:48:11,910 --> 00:48:13,920 fields entered in a day. 1190 00:48:13,920 --> 00:48:17,250 This is measured by doing dual entry, as in to say the data 1191 00:48:17,250 --> 00:48:17,970 is entered twice. 1192 00:48:17,970 --> 00:48:19,860 It's then matched with the other person. 1193 00:48:19,860 --> 00:48:23,140 And the number of errors or discrepancies 1194 00:48:23,140 --> 00:48:25,390 are then defined as an error rate. 1195 00:48:25,390 --> 00:48:27,870 And if the error rate-- 1196 00:48:27,870 --> 00:48:29,950 you're matched to different people over time, 1197 00:48:29,950 --> 00:48:31,755 so I can sort of back out your error rate-- 1198 00:48:31,755 --> 00:48:33,397 how good or bad you are. 1199 00:48:33,397 --> 00:48:34,980 So people are essentially incentivized 1200 00:48:34,980 --> 00:48:37,440 to do to work fast, and incentivized 1201 00:48:37,440 --> 00:48:40,560 to enter stuff correctly. 1202 00:48:40,560 --> 00:48:44,280 Workers are paid a piece rate using the weekly paycheck that 1203 00:48:44,280 --> 00:48:45,900 paid once a week. 1204 00:48:45,900 --> 00:48:47,280 There's no restrictions on hours. 1205 00:48:47,280 --> 00:48:48,863 People can show up whenever they like. 1206 00:48:48,863 --> 00:48:52,240 They can stay for as many hours as they like on any given day. 1207 00:48:52,240 --> 00:48:53,820 There are no penalties for absences. 
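As a small aside on the output measure just described, here is a sketch of how dual entry pins down an error rate; the entries are made up and purely illustrative.

# Two workers independently enter the same record; disagreements flag errors
entry_a = ["1974", "Kisumu", "132", "sugarcane", "4.5"]
entry_b = ["1974", "Kisumu", "137", "sugarcane", "4.5"]

mismatches = sum(a != b for a, b in zip(entry_a, entry_b))
error_rate = mismatches / len(entry_a)
print(error_rate)   # 0.2 -> one of the five fields disagrees

Matching each worker against several different partners over time is what lets the firm attribute the disagreements and back out an individual error rate.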
1208 00:48:53,820 --> 00:48:56,070 So you can just-- if you don't show up on a given day, 1209 00:48:56,070 --> 00:48:57,207 nothing happens. 1210 00:48:57,207 --> 00:48:59,040 This is what the data entry task looks like. 1211 00:48:59,040 --> 00:49:00,210 There's essentially some data. 1212 00:49:00,210 --> 00:49:01,377 This is kind of interesting. 1213 00:49:01,377 --> 00:49:02,730 This is data coming from Kenya. 1214 00:49:02,730 --> 00:49:04,480 So Michael Kremer and Sendhil Mullainathan 1215 00:49:04,480 --> 00:49:06,480 were working on some Kenyan data where 1216 00:49:06,480 --> 00:49:10,410 they had a huge amount of data from Kenyan sugarcane farmers 1217 00:49:10,410 --> 00:49:14,070 over the course of 30 years-- some historical records 1218 00:49:14,070 --> 00:49:15,030 that they found. 1219 00:49:15,030 --> 00:49:16,390 They wanted them to get entered. 1220 00:49:16,390 --> 00:49:17,940 So like, how do we get them entered? 1221 00:49:17,940 --> 00:49:19,568 Well, India is the place to do that. 1222 00:49:19,568 --> 00:49:20,610 So they got them entered. 1223 00:49:20,610 --> 00:49:23,520 And then, when they hired lots of people to enter the data, 1224 00:49:23,520 --> 00:49:26,250 they thought, well, why not run an experiment 1225 00:49:26,250 --> 00:49:29,768 to learn about self-control among workers while doing that? 1226 00:49:29,768 --> 00:49:32,310 So what they would do is they would show these scanned images 1227 00:49:32,310 --> 00:49:33,520 of these pages. 1228 00:49:33,520 --> 00:49:36,990 And then there's like a computer software, pretty rudimentary, 1229 00:49:36,990 --> 00:49:41,860 where you can sort of enter the fields in your data set. 1230 00:49:41,860 --> 00:49:46,200 Now, the commitment device is a dominated contract 1231 00:49:46,200 --> 00:49:47,350 that looks as follows. 1232 00:49:47,350 --> 00:49:49,230 So the-- as I told you before, workers 1233 00:49:49,230 --> 00:49:51,030 are paid a piece rate w. 1234 00:49:51,030 --> 00:49:52,560 It's a linear piece rate. 1235 00:49:52,560 --> 00:49:55,600 That's the control contract for their production. 1236 00:49:55,600 --> 00:49:58,020 So for production, think of production as the number of 1237 00:49:58,020 --> 00:49:59,850 correct entries that you produce. 1238 00:49:59,850 --> 00:50:01,550 For each correct entry that you make, 1239 00:50:01,550 --> 00:50:04,050 you're going to get paid w. 1240 00:50:04,050 --> 00:50:06,960 Now, the dominated contract looks like this-- 1241 00:50:06,960 --> 00:50:08,860 this is the red one that you see here-- 1242 00:50:08,860 --> 00:50:12,180 which is until a target T. We'll talk 1243 00:50:12,180 --> 00:50:13,530 about the target in a second. 1244 00:50:13,530 --> 00:50:17,010 Until the target T, you are paid w over 2, which 1245 00:50:17,010 --> 00:50:18,870 is like half as much as before. 1246 00:50:18,870 --> 00:50:21,630 And as soon as you reach T, the target, 1247 00:50:21,630 --> 00:50:24,450 you're going to be paid w for the entire production 1248 00:50:24,450 --> 00:50:25,440 that you make. 1249 00:50:25,440 --> 00:50:26,895 Why is this a commitment contract? 1250 00:50:29,940 --> 00:50:32,630 So if I ask you to choose your target T, 1251 00:50:32,630 --> 00:50:34,130 if you choose a positive target, why 1252 00:50:34,130 --> 00:50:35,450 is this a commitment contract? 1253 00:50:35,450 --> 00:50:35,690 Yes? 1254 00:50:35,690 --> 00:50:37,315 AUDIENCE: So if you go less, then you're 1255 00:50:37,315 --> 00:50:40,460 penalized for it by making less money [INAUDIBLE]..
1256 00:50:40,460 --> 00:50:41,508 FRANK SCHILBACH: Right. 1257 00:50:41,508 --> 00:50:42,050 That's right. 1258 00:50:42,050 --> 00:50:46,230 So there's no value of choosing this contract. 1259 00:50:46,230 --> 00:50:48,860 So if you are an exponential discounter, 1260 00:50:48,860 --> 00:50:51,140 you might just choose T equals 0, 1261 00:50:51,140 --> 00:50:53,570 the reason being that there's no point in choosing 1262 00:50:53,570 --> 00:50:54,515 a high target. 1263 00:50:54,515 --> 00:50:55,890 Who knows what's going to happen? 1264 00:50:55,890 --> 00:50:56,810 You might get a headache. 1265 00:50:56,810 --> 00:50:57,860 Your kid might get sick. 1266 00:50:57,860 --> 00:50:59,450 The computer might not work, or whatever. 1267 00:50:59,450 --> 00:51:00,908 For whatever reason, there might be 1268 00:51:00,908 --> 00:51:04,070 shocks, uncertainty that might make you not reach your target. 1269 00:51:04,070 --> 00:51:06,530 And so if you choose a positive target, the only thing you 1270 00:51:06,530 --> 00:51:08,405 can do from this positive target is, you see, 1271 00:51:08,405 --> 00:51:09,740 you can do only worse, right? 1272 00:51:09,740 --> 00:51:13,890 It's a weakly dominated contract in terms of your payments, 1273 00:51:13,890 --> 00:51:16,290 so there's no reason to actually do that. 1274 00:51:16,290 --> 00:51:18,430 But why might the worker actually choose it anyway? 1275 00:51:22,960 --> 00:51:24,122 Yes? 1276 00:51:24,122 --> 00:51:27,400 AUDIENCE: Because earlier, you specified 1277 00:51:27,400 --> 00:51:29,332 that the workers can come and go as they 1278 00:51:29,332 --> 00:51:30,540 please once they get to work. 1279 00:51:30,540 --> 00:51:32,520 And so there's no real set schedule. 1280 00:51:32,520 --> 00:51:36,040 So [INAUDIBLE] or think that they know themselves, and think 1281 00:51:36,040 --> 00:51:38,470 that they'll push off all the work until the very 1282 00:51:38,470 --> 00:51:39,970 last minute, et cetera, et cetera. 1283 00:51:39,970 --> 00:51:42,290 And so they might want the commitment device 1284 00:51:42,290 --> 00:51:44,883 to incentivize themselves to finish their work [INAUDIBLE].. 1285 00:51:44,883 --> 00:51:46,800 FRANK SCHILBACH: Exactly, I might sort of just 1286 00:51:46,800 --> 00:51:49,575 be slightly below T or the like. 1287 00:51:49,575 --> 00:51:52,320 I might sort of be tempted to go home. 1288 00:51:52,320 --> 00:51:54,210 You know, my friends call or whatever, 1289 00:51:54,210 --> 00:51:55,690 and I might just not want to do it. 1290 00:51:55,690 --> 00:51:56,940 It might be just a tedious day. 1291 00:51:56,940 --> 00:51:57,898 It might be really hot. 1292 00:51:57,898 --> 00:51:59,730 I had really goals to work a lot, 1293 00:51:59,730 --> 00:52:02,970 and I want to incentivize my future self 1294 00:52:02,970 --> 00:52:04,980 to reach certain targets. 1295 00:52:04,980 --> 00:52:06,030 That's exactly right. 1296 00:52:06,030 --> 00:52:08,640 So I think this is-- 1297 00:52:08,640 --> 00:52:10,230 we've already said all of that. 1298 00:52:10,230 --> 00:52:11,825 Workers can choose the T in advance. 1299 00:52:11,825 --> 00:52:13,950 They can choose-- essentially, on the previous day, 1300 00:52:13,950 --> 00:52:16,800 they can choose T equals 0. 1301 00:52:16,800 --> 00:52:19,590 They also have randomization of paydays 1302 00:52:19,590 --> 00:52:21,750 to be able to look at payday effects. 1303 00:52:21,750 --> 00:52:23,140 What do I mean by that? 1304 00:52:23,140 --> 00:52:26,200 I told you, like, workers are paid weekly. 
1305 00:52:26,200 --> 00:52:27,660 And so they randomize the paydays. 1306 00:52:27,660 --> 00:52:29,993 What they do is they randomize the paydays to be Tuesdays, 1307 00:52:29,993 --> 00:52:32,543 Thursdays, or Saturdays so that you can allow 1308 00:52:32,543 --> 00:52:33,960 for day-of-the-week fixed effects. 1309 00:52:33,960 --> 00:52:35,760 That's kind of [INAUDIBLE] interesting. 1310 00:52:35,760 --> 00:52:37,830 But essentially, you can look at, 1311 00:52:37,830 --> 00:52:39,882 when a worker is paid in a given week, 1312 00:52:39,882 --> 00:52:42,090 are they more or less productive on that day compared 1313 00:52:42,090 --> 00:52:44,560 to the previous days, and so on and so forth. 1314 00:52:44,560 --> 00:52:46,480 What does the exponential discounting model 1315 00:52:46,480 --> 00:52:50,100 predict about paydays or payday effects? 1316 00:52:50,100 --> 00:52:51,900 Should you work harder on a payday? 1317 00:53:05,590 --> 00:53:06,560 Yes? 1318 00:53:06,560 --> 00:53:09,177 AUDIENCE: No, because you're time-consistent. 1319 00:53:09,177 --> 00:53:10,510 FRANK SCHILBACH: Right, exactly. 1320 00:53:10,510 --> 00:53:13,210 Like, your delta is 0.95 or the like. 1321 00:53:13,210 --> 00:53:16,780 It doesn't really matter-- the daily discounting should 1322 00:53:16,780 --> 00:53:18,460 be essentially-- there should 1323 00:53:18,460 --> 00:53:20,080 be essentially close to no discounting 1324 00:53:20,080 --> 00:53:21,310 between different days. 1325 00:53:21,310 --> 00:53:24,250 You might have a yearly delta of 0.95 or the like. 1326 00:53:24,250 --> 00:53:27,255 But then, between today and tomorrow, or two days 1327 00:53:27,255 --> 00:53:29,800 and three days from now and so on, it doesn't really matter. 1328 00:53:29,800 --> 00:53:32,770 So if I'm working today and I'm paid today, 1329 00:53:32,770 --> 00:53:34,450 it's the same as if I'm working today 1330 00:53:34,450 --> 00:53:36,800 and I'm paid in like three days. 1331 00:53:36,800 --> 00:53:39,310 However, if I'm a quasi-hyperbolic discounter, 1332 00:53:39,310 --> 00:53:41,800 well, what's going to be the case is, like, 1333 00:53:41,800 --> 00:53:44,200 if I'm working on paydays-- I'm working today 1334 00:53:44,200 --> 00:53:45,965 and I'm going to be paid today. 1335 00:53:45,965 --> 00:53:47,590 Well, then I'm going to work harder. 1336 00:53:47,590 --> 00:53:49,090 Maybe I get the money right away. 1337 00:53:49,090 --> 00:53:51,280 I can buy a nice meal, my family is 1338 00:53:51,280 --> 00:53:53,450 going to be happier, and so on and so forth. 1339 00:53:53,450 --> 00:53:56,120 Now notice, that requires some form of liquidity constraints, 1340 00:53:56,120 --> 00:53:56,620 right? 1341 00:53:56,620 --> 00:53:58,450 In the sense that if I have a bunch of cash anyway, 1342 00:53:58,450 --> 00:53:59,950 it doesn't really matter whether I'm 1343 00:53:59,950 --> 00:54:01,720 going to be paid today versus tomorrow. 1344 00:54:01,720 --> 00:54:04,840 But assuming that some workers are liquidity constrained, 1345 00:54:04,840 --> 00:54:06,580 essentially whether they're paid today 1346 00:54:06,580 --> 00:54:09,850 versus tomorrow matters a lot for when the reward comes. 1347 00:54:09,850 --> 00:54:12,140 If you think about today, it's like 10:00 AM. 1348 00:54:12,140 --> 00:54:13,630 I'm typing and typing. 1349 00:54:13,630 --> 00:54:15,820 Now, I think about, like, when do I get the reward.
1350 00:54:15,820 --> 00:54:17,290 Either it's on the payday, you're 1351 00:54:17,290 --> 00:54:19,030 going to be paid at, like, 5:00 PM, 1352 00:54:19,030 --> 00:54:20,947 so you're going to get essentially a nice meal 1353 00:54:20,947 --> 00:54:21,822 on the same day. 1354 00:54:21,822 --> 00:54:24,280 Versus if you're paid in five days, that's really far away. 1355 00:54:24,280 --> 00:54:27,910 In the beta delta world, your reward for working hard 1356 00:54:27,910 --> 00:54:31,382 is coming much later, OK? 1357 00:54:31,382 --> 00:54:33,340 So you would think quasi-hyperbolic discounters 1358 00:54:33,340 --> 00:54:35,950 would put in higher efforts on paydays. 1359 00:54:35,950 --> 00:54:39,153 Assuming there is some form of liquidity constraints, 1360 00:54:39,153 --> 00:54:41,570 there should be close to no difference between other days. 1361 00:54:41,570 --> 00:54:44,195 That depends a little bit, like, what's the horizon of the beta 1362 00:54:44,195 --> 00:54:46,690 that we talked about before. 1363 00:54:46,690 --> 00:54:49,090 OK, so now what do they find in this paper? 1364 00:54:49,090 --> 00:54:51,160 They find essentially three results. 1365 00:54:51,160 --> 00:54:53,410 The first result is there's demand for commitment. 1366 00:54:53,410 --> 00:54:56,310 There's sort of-- people choose dominated contracts. 1367 00:54:56,310 --> 00:54:59,170 That is to say, workers select dominated contracts 1368 00:54:59,170 --> 00:55:01,210 about 36% of the time. 1369 00:55:01,210 --> 00:55:04,660 This is a lower bound for the extent of time 1370 00:55:04,660 --> 00:55:06,710 inconsistency for three reasons. 1371 00:55:06,710 --> 00:55:08,632 Reason one is some workers might be naive. 1372 00:55:08,632 --> 00:55:10,840 They might sort of think they don't need any targets. 1373 00:55:10,840 --> 00:55:12,950 Well, in fact, they would benefit from targets. 1374 00:55:12,950 --> 00:55:16,510 Or they might underinvest in this technology. 1375 00:55:16,510 --> 00:55:19,060 Second, people might think this commitment device is actually 1376 00:55:19,060 --> 00:55:20,710 ineffective, in a sense saying, look, 1377 00:55:20,710 --> 00:55:22,960 this is sort of what we're talking about with the Frog 1378 00:55:22,960 --> 00:55:23,890 and the Toad. 1379 00:55:23,890 --> 00:55:26,630 There might be-- they might have time-inconsistency issues. 1380 00:55:26,630 --> 00:55:29,080 It might just not be a strong enough commitment device. 1381 00:55:29,080 --> 00:55:31,210 And so then, if I have a day where I just 1382 00:55:31,210 --> 00:55:33,700 don't want to work, then if that's not going to get you 1383 00:55:33,700 --> 00:55:36,550 over the target, well then I might not 1384 00:55:36,550 --> 00:55:39,580 choose this commitment device, not because I'm not-- 1385 00:55:39,580 --> 00:55:42,550 I don't have present bias, but because it's just 1386 00:55:42,550 --> 00:55:46,530 not effective enough of a punishment for me to do. 1387 00:55:46,530 --> 00:55:49,390 Number three is people might prefer flexibility 1388 00:55:49,390 --> 00:55:50,650 and they're risk-averse. 1389 00:55:50,650 --> 00:55:53,518 These are issues like they might have children that get sick, 1390 00:55:53,518 --> 00:55:55,060 they might have headaches, and so on. 1391 00:55:55,060 --> 00:55:57,770 They might just be risks that's unrelated to their present bias 1392 00:55:57,770 --> 00:55:58,510 and the like. 1393 00:55:58,510 --> 00:55:59,710 The computers might be bad. 
1394 00:55:59,710 --> 00:56:01,840 So they had different computers at different speeds. 1395 00:56:01,840 --> 00:56:03,730 You might end up with a really bad computer, 1396 00:56:03,730 --> 00:56:06,358 and then you might not reach the target. 1397 00:56:06,358 --> 00:56:08,650 That's nothing to do with self-control or present bias, 1398 00:56:08,650 --> 00:56:10,510 but rather with sort of external risk. 1399 00:56:10,510 --> 00:56:12,010 If you're really worried about risk, 1400 00:56:12,010 --> 00:56:16,390 then you might not choose a positive target either. 1401 00:56:16,390 --> 00:56:20,060 Second, they find offering the dominated contracts increases 1402 00:56:20,060 --> 00:56:20,560 output. 1403 00:56:20,560 --> 00:56:23,200 That is to say, if you compare groups that were, on some days, 1404 00:56:23,200 --> 00:56:25,720 offered those contracts, compared to other groups that 1405 00:56:25,720 --> 00:56:28,270 were randomly selected to be not offered 1406 00:56:28,270 --> 00:56:31,060 those contracts on those same days, 1407 00:56:31,060 --> 00:56:34,240 they find that being offered this commitment contract 1408 00:56:34,240 --> 00:56:36,580 increases production by 2.3%. 1409 00:56:36,580 --> 00:56:39,243 Now you might say 2% is actually pretty low. 1410 00:56:39,243 --> 00:56:40,660 And that's true in absolute terms. 1411 00:56:40,660 --> 00:56:42,050 That's not a lot of money. 1412 00:56:42,050 --> 00:56:44,950 But if you think about what other options does 1413 00:56:44,950 --> 00:56:46,840 the employer have, well, one thing 1414 00:56:46,840 --> 00:56:48,850 you might want to do to increase workers' output 1415 00:56:48,850 --> 00:56:52,450 is sort of double their wages or just increase wages overall. 1416 00:56:52,450 --> 00:56:57,160 Now it turns out that they also have some piece rate variations 1417 00:56:57,160 --> 00:57:00,010 where they can actually estimate how much of a piece rate 1418 00:57:00,010 --> 00:57:03,070 increase is needed to achieve these effects on productivity. 1419 00:57:03,070 --> 00:57:05,200 And it turns out, the impact is actually 1420 00:57:05,200 --> 00:57:07,810 corresponding to, like, an 18% increase in the piece rate 1421 00:57:07,810 --> 00:57:08,310 wage. 1422 00:57:08,310 --> 00:57:10,810 Now, if you're an employer, you don't 1423 00:57:10,810 --> 00:57:12,970 want to pay people, like, an 18% higher piece 1424 00:57:12,970 --> 00:57:16,843 rate for, like, a 2% or 3% increase in productivity. 1425 00:57:16,843 --> 00:57:18,010 That's just not worth doing. 1426 00:57:18,010 --> 00:57:19,093 It's just like a bad deal. 1427 00:57:19,093 --> 00:57:21,385 I mean, it depends a little bit on your costs and benefits 1428 00:57:21,385 --> 00:57:22,090 of production. 1429 00:57:22,090 --> 00:57:23,620 But that's a very ineffective way 1430 00:57:23,620 --> 00:57:27,070 of getting workers to be more productive, 1431 00:57:27,070 --> 00:57:28,810 and it's a costly thing to do. 1432 00:57:28,810 --> 00:57:31,420 You're going to incentivize a bunch of inframarginal people 1433 00:57:31,420 --> 00:57:32,260 to do that. 1434 00:57:32,260 --> 00:57:34,660 Instead, offering commitment devices in this setting 1435 00:57:34,660 --> 00:57:35,860 is actually free. 1436 00:57:35,860 --> 00:57:37,370 And in some sense, for the employer, 1437 00:57:37,370 --> 00:57:38,870 you might actually pay people less. 1438 00:57:38,870 --> 00:57:40,245 If some people don't reach a target, 1439 00:57:40,245 --> 00:57:42,053 you can actually pay w over 2.
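A minimal sketch of the two pay schedules, with a hypothetical piece rate and target, just to show in what sense the commitment contract is weakly dominated.

W = 1.0      # piece rate per accurate field (hypothetical)
T = 4000     # self-chosen target, in fields (hypothetical)

def pay_control(output):
    return W * output                  # the standard linear piece rate

def pay_commitment(output, target=T):
    # half the piece rate on everything if the target is missed,
    # the full piece rate on everything once the target is reached
    return (W / 2) * output if output < target else W * output

for y in [2000, 3999, 4000, 6000]:
    print(y, pay_control(y), pay_commitment(y))

For every output level, pay_commitment is at most pay_control, so choosing a positive target can only ever cost the worker money; that is why choosing one at all signals a demand for commitment.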
1440 00:57:42,053 --> 00:57:43,720 Now, you actually don't want to do that, 1441 00:57:43,720 --> 00:57:46,600 because workers will not be happy and, you know, 1442 00:57:46,600 --> 00:57:49,300 probably get annoyed with you if you do that too much. 1443 00:57:49,300 --> 00:57:51,147 But the technology is actually free. 1444 00:57:51,147 --> 00:57:53,230 Here's a thing that you can offer to your workers. 1445 00:57:53,230 --> 00:57:56,300 They're going to be 2% to 3% more productive. 1446 00:57:56,300 --> 00:57:59,260 That's actually a big deal for a lot of companies, 1447 00:57:59,260 --> 00:58:01,750 where the margins are often small, 1448 00:58:01,750 --> 00:58:03,790 and a much better instrument for getting workers 1449 00:58:03,790 --> 00:58:06,520 to be more productive compared to increasing people's wages. 1450 00:58:06,520 --> 00:58:09,720 Because wages, obviously, you need to pay. 1451 00:58:09,720 --> 00:58:12,430 Any questions on that? 1452 00:58:12,430 --> 00:58:14,160 And then a third-- and this is sort of 1453 00:58:14,160 --> 00:58:16,710 consistent with a model of self-control 1454 00:58:16,710 --> 00:58:18,390 being really the driver of this-- 1455 00:58:18,390 --> 00:58:21,670 is payday effects predict people's demand for commitment. 1456 00:58:21,670 --> 00:58:23,512 I'm going to show you this in a second. 1457 00:58:23,512 --> 00:58:25,470 But essentially, what they find is that there 1458 00:58:25,470 --> 00:58:26,820 are, A, payday effects. 1459 00:58:26,820 --> 00:58:28,570 What do I mean by that? 1460 00:58:28,570 --> 00:58:31,470 If you plot, essentially, production over the course 1461 00:58:31,470 --> 00:58:32,890 of the pay cycle-- 1462 00:58:32,890 --> 00:58:35,280 so here's-- on the very right hand side of this graph, 1463 00:58:35,280 --> 00:58:37,590 you see people's payday productivity compared 1464 00:58:37,590 --> 00:58:39,630 to the productivity on the day after the payday, 1465 00:58:39,630 --> 00:58:41,360 or seven days before the payday. 1466 00:58:41,360 --> 00:58:43,860 You'll see people essentially are more productive on payday. 1467 00:58:43,860 --> 00:58:45,600 They produce more on paydays compared 1468 00:58:45,600 --> 00:58:46,687 to, like, previous days. 1469 00:58:46,687 --> 00:58:48,270 And so what you see, essentially, 1470 00:58:48,270 --> 00:58:51,300 is a constant increase towards the day of the payday. 1471 00:58:51,300 --> 00:58:52,920 Now, somebody was asking me previously 1472 00:58:52,920 --> 00:58:54,510 about quasi-hyperbolic discounting, 1473 00:58:54,510 --> 00:58:58,340 or hyperbolic discounting, which model is right. 1474 00:58:58,340 --> 00:59:00,090 When you look at this graph, this actually 1475 00:59:00,090 --> 00:59:02,790 doesn't look very much like quasi-hyperbolic discounting. 1476 00:59:02,790 --> 00:59:05,243 It looks a lot more like hyperbolic discounting, 1477 00:59:05,243 --> 00:59:07,410 as in, like, there's sort of like a smooth increase. 1478 00:59:07,410 --> 00:59:09,510 There's not a jump up. 1479 00:59:09,510 --> 00:59:12,325 On the day of the payday, people are more productive, 1480 00:59:12,325 --> 00:59:13,950 but it's rather sort of more consistent 1481 00:59:13,950 --> 00:59:16,222 with true hyperbolic discounting. 1482 00:59:16,222 --> 00:59:17,430 That's more of a detail here. 1483 00:59:17,430 --> 00:59:18,847 That's not that important for you. 1484 00:59:18,847 --> 00:59:20,310 But I wanted to point that out. 1485 00:59:20,310 --> 00:59:20,810 Yes?
1486 00:59:20,810 --> 00:59:23,220 AUDIENCE: What's the scale and unit on the y-axis? 1487 00:59:23,220 --> 00:59:24,440 FRANK SCHILBACH: That is a great question. 1488 00:59:24,440 --> 00:59:25,480 I think its production. 1489 00:59:25,480 --> 00:59:27,540 So I think this is not a-- 1490 00:59:27,540 --> 00:59:28,447 this is not a great-- 1491 00:59:28,447 --> 00:59:30,030 I think these are units of production. 1492 00:59:30,030 --> 00:59:32,820 I don't know exactly what the fraction 1493 00:59:32,820 --> 00:59:34,480 is in terms of production. 1494 00:59:34,480 --> 00:59:36,780 But it must be, like, a few percent of production. 1495 00:59:36,780 --> 00:59:38,043 I can look this up. 1496 00:59:38,043 --> 00:59:39,460 But it's like units of production, 1497 00:59:39,460 --> 00:59:42,570 which is surely not what you want to use here. 1498 00:59:42,570 --> 00:59:44,160 Yeah, and so then-- 1499 00:59:44,160 --> 00:59:46,980 so what we find here is that, essentially, people 1500 00:59:46,980 --> 00:59:48,722 are more productive on paydays compared 1501 00:59:48,722 --> 00:59:51,180 to the day after the payday, or compared to, like, six days 1502 00:59:51,180 --> 00:59:53,430 before the payday. 1503 00:59:53,430 --> 00:59:55,470 And then we find another graph, which 1504 00:59:55,470 --> 00:59:59,490 is a quite nice one, which is to say high payday workers, 1505 00:59:59,490 --> 01:00:02,160 or high payday effect workers, are more likely to select 1506 01:00:02,160 --> 01:00:03,660 positive targets. 1507 01:00:03,660 --> 01:00:06,990 What I mean by that is they split the sample 1508 01:00:06,990 --> 01:00:11,250 into workers who have high payday impacts and low payday 1509 01:00:11,250 --> 01:00:11,790 impacts. 1510 01:00:11,790 --> 01:00:14,490 That is to say, when you look at the workers and look at, 1511 01:00:14,490 --> 01:00:17,430 like, which workers are more productive on paydays compared 1512 01:00:17,430 --> 01:00:20,820 to on other days, you can sort of split the sample into two. 1513 01:00:20,820 --> 01:00:22,410 Some workers are much more productive 1514 01:00:22,410 --> 01:00:24,060 on paydays compared to other days. 1515 01:00:24,060 --> 01:00:25,695 Some workers are not. 1516 01:00:25,695 --> 01:00:27,570 So what you see here in the graph is you see, 1517 01:00:27,570 --> 01:00:29,400 essentially, the blue dots. 1518 01:00:29,400 --> 01:00:31,945 These are workers who have high payday impacts. 1519 01:00:31,945 --> 01:00:34,320 These are workers who are much more productive on paydays 1520 01:00:34,320 --> 01:00:35,880 compared to other days. 1521 01:00:35,880 --> 01:00:38,100 And you see the red dots, which are workers 1522 01:00:38,100 --> 01:00:40,410 who are, essentially, pretty much 1523 01:00:40,410 --> 01:00:42,390 they have low payday impact, meaning 1524 01:00:42,390 --> 01:00:45,780 they kind of work the same on paydays versus on other days. 1525 01:00:45,780 --> 01:00:48,630 And then what they show on the-- so then, on the x-axis, 1526 01:00:48,630 --> 01:00:49,590 they show experience. 1527 01:00:49,590 --> 01:00:51,510 This is essentially-- think of this 1528 01:00:51,510 --> 01:00:52,750 as the course of the study. 1529 01:00:52,750 --> 01:00:56,040 These are something like 150 days. 1530 01:00:56,040 --> 01:00:59,760 They have different workdays in the sample, where 1531 01:00:59,760 --> 01:01:02,760 essentially, over time, people get to choose 1532 01:01:02,760 --> 01:01:04,530 over and over their targets. 
And what you see, essentially, is two things. Sorry, I should have said: on the y-axis, you see the fraction of people choosing positive targets.

First, the blue dots tend to be higher than the red dots, meaning that the workers with high payday impacts, the ones who are more productive on paydays, are also more likely to choose positive targets. That suggests the underlying reason for the payday effect is self-control: you're more productive on paydays because you're present biased in one way or another, and that same present bias also predicts whether you choose a positive target. People with larger self-control problems, as revealed by the payday effects, are the ones more likely to choose positive targets.

Second, the blue dots trend upward in the graph. On the right, the blue dots are higher than on the left, meaning that as people gain experience in the study, after something like 100 or 150 days, they're more likely to choose positive targets, consistent with some learning about self-control over time. They learn that choosing positive targets is good for them, potentially because it makes them more productive. You don't see anything like that for the red dots, which stay essentially flat over time.

Sorry, that was a lot of information. Are there any questions on that?

Yes?

AUDIENCE: Arguably, a lot of this has to do with the payday itself, right? On the payday, I have to come in to the office and do the work anyway, so I might as well type a little longer to earn a little more. And on other days there isn't really the same incentive, because they're not getting paid that day. There are also jobs where you get paid at the end of every day, and you don't really find payday effects in those kinds of jobs, right?

FRANK SCHILBACH: Right, exactly. So there is a question, when you think about these payday effects, of what exactly we learn from them. One way to think about them is the story I was telling you: on days when you come in, type, and work, you're going to be paid in the evening, and you work harder because the reward is closer to your effort. Another explanation, the one you're raising, is that workers may simply be liquidity constrained. Maybe their children are hungry, and so on, so you come in on that day because you want a paycheck; and once you've shown up anyway, you might as well do some work, and you end up more productive that way. That's surely, in part, going on.

I have two responses to that. One is that we find the workers with high payday impacts are more likely to choose positive targets, which suggests there is an underlying issue, self-control problems, driving both the payday effects and the positive targets that people choose as a demand for commitment.

Second, and more subtle in some ways: when people are liquidity constrained, when they don't have cash, there is often a reason they don't have cash, which is that they haven't saved in the first place. Again, that's a little complicated. But essentially, if somebody is very liquidity constrained and never has cash, the underlying reason might be present bias, or some form of self-control problem, to start with.
The reason is that if you really placed such a high value on having cash, you would just save some and keep it at home, or try to save money in a bank account, and so on. So usually, when people are severely liquidity constrained, we think the underlying reason is often present bias or some form of self-control problem. Having said that, there are also other reasons why people can't save; they may not have access to bank accounts, and so on, and so forth.

OK, so let me tell you one more application, and then continue with the rest of these tomorrow.

This is a classic study, done in Boston-area health clubs. This is DellaVigna and Malmendier. What they did is look at the contract options people chose at gyms, when the choice was between the following two options: a monthly fee of over $70 for unlimited use of the gym, or a pay-per-visit fee of $10. If you had those two options, which would you choose, and why would you choose one over the other?

Yes?

AUDIENCE: If you think you're going to use the gym more than seven times in a month, [INAUDIBLE].

FRANK SCHILBACH: Right, exactly. So why pick option number one if you only go three times? When I was teaching for the first time, I was doing exactly that, in fact. But exactly as you say: if you understand how often you actually go to the gym, why choose option one if you don't go at least seven times? There could be some transaction costs and the like, but surely, if you only go once or twice, you should not choose option one.

Now of course, some people might choose option one anyway. Why might you do that?

Yes?

AUDIENCE: Would it be for a commitment device? Like, I would pay for this month, [INAUDIBLE] I want to get my money's worth.
FRANK SCHILBACH: Right. So one option is that people are essentially changing their future prices. Notice that the marginal cost of going to the gym under option two is $10: every time you go, your marginal cost is $10. Under option one, the marginal cost is zero. So by choosing the monthly contract, I'm changing my marginal cost in the future to make it more favorable for me to go.

What else could be going on? So one possibility is that they use this as a commitment device. What's another explanation?

Yeah?

AUDIENCE: Overestimation of how many times you would go?

FRANK SCHILBACH: Exactly, and there are two versions of that. One version has to do with beta-delta, or present bias, which is to say I'm underestimating my future present bias. I think my beta is 1, but in fact it's 0.5. I think I'm going to go 17 times, but in fact I only go twice. That's one option.

Another option, which was also mentioned earlier, is underestimating the cost of future exercising. What often happens is that when people sign up for the gym, they are already at the gym and really excited to exercise. They might underestimate how costly it actually feels to exercise once they're sitting at home in the evening, having just come back from work. That has nothing to do with self-control specifically; exercising is just a tedious thing to do, and they may underestimate how tired they'll be, and so on. So, some form of underestimating the cost. That's, again, hard to separate from present bias in many cases, but it could also be going on.

So what they find is that people exercise, on average, 4.3 times a month in the first year. That comes out to about $17 per visit, and at that rate, of course, you should be choosing option two.
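To make that arithmetic concrete, here is a minimal sketch in Python of the break-even calculation and the naive misprediction just described. The $70 monthly fee, $10 per-visit fee, and 4.3 average visits per month are the figures from the study as described here; the beta values and the toy attendance rule are purely hypothetical illustrations, not the model from the paper.

```python
# Sketch of the gym-contract arithmetic discussed above.
# Figures from the study as described in lecture: $70/month unlimited,
# $10 per visit, and 4.3 visits per month on average.
# The beta values and the attendance rule below are hypothetical.

MONTHLY_FEE = 70.0     # option one: unlimited use, dollars per month
PER_VISIT_FEE = 10.0   # option two: dollars per visit

# Break-even attendance: the monthly contract only pays off beyond this.
break_even_visits = MONTHLY_FEE / PER_VISIT_FEE          # 7 visits per month

# Realized behavior: about 4.3 visits per month on the monthly contract.
actual_visits = 4.3
realized_cost_per_visit = MONTHLY_FEE / actual_visits    # roughly $17 per visit

print(f"break-even visits per month: {break_even_visits:.0f}")
print(f"realized cost per visit: ${realized_cost_per_visit:.2f}")

# Toy illustration of the naive misprediction: the agent forecasts
# attendance using a believed beta of 1 but behaves with beta = 0.5.
def visits_given_beta(beta: float, baseline: float = 20.0) -> float:
    """Stylized attendance rule: visits scale with beta (illustrative only)."""
    return baseline * beta

believed_beta, true_beta = 1.0, 0.5
print(f"visits the naif predicts: {visits_given_beta(believed_beta):.0f}")
print(f"visits the naif actually makes: {visits_given_beta(true_beta):.0f}")
```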
Before canceling, consumers go 2.3 months, on average, without using the gym at all, which suggests it's even tedious to cancel the membership once you've stopped going. And I did exactly that as well, in fact.

So how do we think about quasi-hyperbolic discounting here? Well, the gym-goers would like to exercise a lot in the future. Being naive, that's what they think they will do; they overestimate their future attendance. To save on gym costs, they buy the monthly membership. That's what the naive person does. Then, when it actually comes down to exercising, their short-run impatience kicks in and they don't end up using the membership much. That's very much a case of overestimating their beta, that is, underestimating their self-control problem.

Now, what does the sophisticated person do? We already discussed that as well. They would also prefer to exercise a lot in the future, but they realize that, when the time comes, they won't want to. So they choose the monthly contract as a form of commitment device: by changing the prices they face in the future, by setting the future marginal cost of a visit to zero, they make it cheaper to exercise than to sit at home. And they might even be willing to pay for that. They might say, I know I'm only going to go three or four times, but if I don't choose that contract, I'm only going to go zero or one times, and it's worth it to me.

Now, there's a different version of commitment devices for dealing with temptation, a very clever "bundling of temptations" paper by Milkman and others. Instead of offering commitment devices directly, what they did is bundle two things. I told you previously about investment goods and leisure goods. Investment goods are things like going to the gym, which you do too little of. Leisure goods are things you enjoy in the present and might do too much of.

So what you can do is bundle those things. What they did is offer people the option to listen to addictive audiobooks only at the gym. Since you can only listen while you're there, you might come back to the gym not necessarily because you want to exercise, but because you want to hear the next part of your audiobook.

This can also backfire. I have a friend, also from grad school, who was watching lots of TV shows, and he convinced himself that he could only watch them while at the gym. So I would be in the office, and he would sometimes come back from the gym totally exhausted because he had watched three episodes of some show on the treadmill. He was overexercising because of his bundling of temptations.

So you have to figure out how to calibrate this. But by bundling pleasant and unpleasant things in certain ways, it might help you overcome two issues at the same time: one, you go to the gym more often; and two, you might not watch too much TV, if you can convince yourself to actually follow through. They do find, in fact, that bundling temptations is effective, up until the Thanksgiving break, when everything goes downhill in their study.

OK, that's all I have for now. I'm going to continue tomorrow with the remaining parts of time preferences.

Thank you.