[SQUEAKING] [RUSTLING] [CLICKING]

FRANK SCHILBACH: So today what I'm going to try to do is go through some of the survey questions that we asked you at the end last time and try to illustrate some of the topics and issues of behavioral economics using your short survey answers, and to give you a better or more precise overview of what kinds of topics we are going to talk about.

As you know, this survey was anonymous. Somebody was asking, do we need ethical IRB approval for that? We take ethical issues, in fact, very seriously in all of the studies that we're doing. The answer is that since the survey was anonymous, and there was no personally identifiable information collected, it's OK to do that, in particular in situations where arguably nobody is harmed by answering those types of questions.

Now, you might say a survey is not a particularly rigorous experiment. That's exactly right. The survey that we asked you to fill out provides, in some sense, suggestive evidence of some of the behavioral issues that are going on. A lot of the evidence that I'm going to show you in class is much more rigorous, based on actual experimentation that's hard for me to do in a short survey. So think of the evidence I'm showing you here as suggestive. And then I'm going to show you more rigorous evidence that those kinds of phenomena, in fact, also hold once you do things more rigorously.

So now, when you think about how economists think about human behavior, there are broadly three aspects in which you can think about that. Broadly speaking, we have constrained optimization, where people have a utility function, which specifies in some mathematical form what makes people happy. There's stuff in the world, consumption and so on, that goes into the utility function. You can eat apples, bananas, and so on.
And essentially, the more of those you eat, usually the happier you are, which economists have construed as a concept called utility. Higher utility is good. And it specifies what makes people happy.

Now, there are different aspects to that. You can think of it this way: there's the instantaneous utility function. And I should have said, in recitation we're going to discuss a review of utility maximization. If you have taken 14.01, 14.03, or 14.04 and are very familiar with this kind of material, you might want to skip recitation this week. Future recitations will be more specific to this class.

So what makes people happy is essentially two things. One is the instantaneous utility function, which is what makes you happy at this moment in time, or on a specific day. Often it's defined on a daily or even yearly basis: at a specific period or moment in time, what makes you happy? That's the instantaneous utility function.

And then there's how you aggregate, which involves risk, time, and social preferences. Time preferences are about how you aggregate over time: today versus tomorrow, two days, three days, four days, five days from now. How do you aggregate these up? You might make a choice today that makes you very happy today, but you regret it five days from now. And the question is, how do you aggregate these instantaneous utility functions into one big function? Those are time preferences.

Risk preferences are about when there's risk involved, when things are uncertain. If I offer you a lottery, whether you want to play the lottery or the like, which classes you take, whether you study for an exam or not, these are often risky choices. You can do one thing or another, and often you don't know what the outcome might be in various choices that you make in life. And often some people are what we call risk averse, that is to say, they avoid risk. And that affects their choices.
And then finally, as for social preferences, you could say social preferences in some sense go into the instantaneous utility function, in the sense that they are just another argument of that function. Or you might say, I care about myself, and I care about others, and so I'm aggregating my own instantaneous utility and others' instantaneous utility [INAUDIBLE] larger function overall. But broadly speaking, one part of people's constrained optimization is the utility function.

Second is people's beliefs. This is what people believe about their environment. When you think about purchasing things, that would not usually be prices; it would be the returns to certain investments. If you take one class versus another, or one major versus another, what does that lead to in terms of your future earnings, your happiness, and so on and so forth? Part of that is essentially your priors: what do you think about the world? And then, how do you update once I provide you with new information? So there are some prior beliefs that you have, I provide you with some new information, and you update to your posterior beliefs overall.

And then, when you think about the utility maximization problem from a mathematical perspective, you might say, once I know your preferences, and I know your beliefs or your information, I have all the information I need. I can solve for the optimum, and that's how you should behave. Then, in some sense, the whole problem is solved for you. So we are done.
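To make this framework concrete, here is a minimal sketch of the standard setup being described; the notation is illustrative and is not taken from the lecture slides.

\[
\max_{c_0,\dots,c_T}\; \mathbb{E}\!\left[\sum_{t=0}^{T} \delta^{t}\, u(c_t)\right]
\quad\text{subject to a budget constraint},\qquad 0<\delta\le 1,
\]

where \(u(\cdot)\) is the instantaneous utility function, \(\delta\) encodes time preferences (how utility tomorrow is weighted against utility today), the expectation together with the curvature of \(u\) encodes risk preferences, and social preferences can enter by letting \(u\) also depend on others' consumption. Beliefs are the probabilities behind that expectation: with a prior \(P(\theta)\) over states of the world and new information \(x\), they are updated by Bayes' rule,

\[
P(\theta\mid x)=\frac{P(x\mid\theta)\,P(\theta)}{P(x)}.
\]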
But then, even conditional on that, people seem to make choices that don't quite fit the classical economics framework. The question is, how do people use utility functions and beliefs to make certain decisions? And do we find certain anomalies or deviations from perfect utility maximization, in the sense of using preferences and beliefs?

So some influences on behavior aren't just about utility and beliefs. In particular, there are issues, as I talked a little bit about last time, like frames, defaults and nudges, and heuristics. That is to say, the way I present a problem to you might affect your choices greatly for given preferences and for given information. You might think you like one thing or another, and you have certain information about the returns to these choices. But then, depending on how I present the information or how I set the default, for example for savings or other choices, people might decide dramatically differently.

So now what I'm going to do is take each of these items, first preferences, the utility function; second, beliefs; and then choices and decision making, and show you how psychological insights might be used to improve our understanding of people's decision making as a whole, or to understand certain choices that people make, and how psychological aspects might affect people's behavior.

So the first thing I'm going to talk about is social preferences: how do people think about themselves and others? Much of classical economics assumes that people are selfish. They essentially care about themselves and nobody else. Is that a good assumption? Or what do you think about that? Yes?

AUDIENCE: Yeah. I think you can think of it in many ways that eventually lead you to think [INAUDIBLE] you can say that you value [INAUDIBLE] other people [INAUDIBLE] caring about other people and comparing to caring about other people because it makes you happy somehow.

FRANK SCHILBACH: Right.
So one thing you're saying is that it's actually tricky to figure out what's selfish and what is not. In some sense, if I'm nice to all of you, it could be that I'm really nice, or it could be that I just really care about evaluations or whatever. So it's hard to interpret what people do in terms of understanding their motives. That's actually a very tricky question in behavioral economics: trying to figure out, are people truly altruistic, or are they doing things for others for ulterior motives?

I want to step back a little bit and say that overall, the assumption of selfishness is actually a pretty good one. In many situations, when you think about people's choices, usually the choices affect people themselves. When you think about what kind of classes you choose, what kind of profession you want to get into, whom you want to marry, and so on, often that's a choice that affects yourself. These are individual choices, and people are largely selfish there. Of course, people are not selfish in all respects. But overall it's actually mostly true; it's a pretty good assumption in many situations.

What I'm going to talk about now is that in some situations, it's not a good assumption, so we should amend it or think about this a little more broadly. And one way I think about this is to think about charity. About 2% of GDP is spent on charity; I guess it was $373.3 billion in 2015, which is a lot of money. So there's some evidence that people seem to care about others in some way.

Now, one way to think about this is to say you have a utility function that has, as an argument, other people's utility or consumption. So there are some people who say, I'm donating money to somebody in Kenya; I have a utility function that says, if that person in Kenya has higher consumption, that makes me happier. That's one way to think about this. But there are also broader ways.
You may call that pure altruism, in the sense that I just care about that person: if that person does better, that makes me happier. But that's probably not the whole story. So what other reasons might people have to give or to be nice to others? Yeah?

AUDIENCE: [INAUDIBLE] warm glow from others knowing you gave?

FRANK SCHILBACH: Right, so warm glow. Can you say more? What do you mean exactly by that?

AUDIENCE: If other people know that you gave, you can get a warm glow feeling from people knowing that you gave things to other people.

FRANK SCHILBACH: So you're saying two things, I think. One is warm glow, just feeling good about updating about yourself in some sense. People often call that self-image or the like, where essentially you just want to be a good person. Somebody asks you, would you like to give money to somebody else? It's not that you actually care about this other person, but you want to maintain your image of being a good person, and it just makes you feel good about yourself.

You also said something different, which is that you care a lot about what others think. So one is about self-image, what you think about yourself: I'm a good person, I give to others, and if somebody asks me, I should probably give. And you said something else, which is, I care a lot about what other people think about me. We usually call that social image, which is to say, I care a lot about others' opinions. And if other people think I'm a nice person, giving money helps with that.

What else? What other motives do we have? Yes?

AUDIENCE: [INAUDIBLE].

FRANK SCHILBACH: Right. So that's a little bit, in some sense, semantics, in the sense that if I give money to pay lower taxes, then if I looked at this more carefully, it may look like-- and I'm going to show you some evidence on the fraction of money that's donated to [INAUDIBLE]. I'm going to split this up.
And there are some aspects in that. For example, if you donate money to Harvard for a building, it's not obvious that's necessarily altruism, as opposed to your feeling really good about having your name on a building, if you have $100 million or something at your disposal. So you're saying that some behaviors may look altruistic, but in fact they're not, in part because it's just individual optimization. So in some sense, that's more like a misspecification, a misunderstanding of people's motives, which I think is surely behind some of what looks like altruism but in fact is not.

There was-- yeah?

AUDIENCE: [INAUDIBLE] more altruistic [INAUDIBLE] altruistic societies [INAUDIBLE] have access to [INAUDIBLE] then you could be acting altruistically, but in a sense of trying to ultimately optimize your own [INAUDIBLE].

FRANK SCHILBACH: So if I can rephrase, and we're going to get to this, in fact, in a few lectures on social preferences: if you compare societies, in certain societies, for example, people do things like whaling or the like. Essentially, if you do occupations where you need cooperation to be successful in finding food or the like, that might lead to altruistic behavior, where people help each other not necessarily because they like each other, but essentially because it's necessary to be able to survive. And then that might lead to either just cooperative behavior as a whole, or perhaps to people being, in fact, nicer to each other. So we will actually get to that. And that's essentially exactly as you say, sort of an evolutionary perspective, saying that depending on what the social incentives are for a society, and depending on what kinds of professions people do, people might be nicer or less nice to each other because it's necessary to survive or be rich and so on.

What else? Yes?

AUDIENCE: You might have expectations about whether other people will reciprocate [INAUDIBLE].
FRANK SCHILBACH: Exactly. So there is essentially reciprocity. If I do something really nice to you and then ask you a favor in return, you might not actually want to do that, but you feel so inclined because I've been [INAUDIBLE] you. It's a thing that we do in society, and reciprocity is really important.

Quite related to that is fairness. In some sense, if somebody does something, or if somebody gets paid a lot more than somebody else, you might want to share that with others, not necessarily because you want to, but because it's perceived as unfair, and people are uncomfortable with that.

Anything else? Yes? Sorry.

AUDIENCE: [INAUDIBLE] stress. So maybe when you observe suffering, you take it on as your own suffering. So when you act altruistically, it's not necessarily because you care about these people directly, but because you now want to alleviate your own suffering.

FRANK SCHILBACH: Yes, exactly. So that's what economists would refer to as inequity aversion, one way or the other: essentially, seeing inequality or inequity makes people uncomfortable. And that comes in various forms. One form of that would be, as you say, in absolute terms. As I said, I do a lot of poverty and development economics: here are people who are extremely poor. It's either our moral obligation, or I just feel uncomfortable seeing people suffering; we should help them get up to a certain standard, maybe by reducing or eradicating absolute or extreme poverty. There's a different version of that, which is just to say inequality is uncomfortable and unfair overall. For example, in the US people are a lot richer than, say, in India. Nevertheless, you might think you would like to live in a fair and just society, where people at least have a reasonably high living standard. And then it's either a moral or other obligation, and people might give because it might just feel uncomfortable to them.
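To summarize the motives mentioned so far in a rough, stylized way, one could write a giver's utility as follows; this notation is illustrative and not taken from the lecture.

\[
U_i \;=\; u(x_i) \;+\; \alpha\,u(x_j) \;+\; w\,g_i \;-\; \beta\,\max\{x_j-x_i,\,0\} \;-\; \gamma\,\max\{x_i-x_j,\,0\},
\]

where \(x_i\) is my own payoff and \(x_j\) is the other person's. The term \(\alpha\,u(x_j)\) is pure altruism (I value the other person's well-being directly), \(w\,g_i\) is warm glow or self-image from my own giving \(g_i\) regardless of its effect, and the last two terms capture inequity aversion in the spirit of Fehr and Schmidt (1999): disliking being behind (\(\beta\)) and being ahead (\(\gamma\)). Social image and reciprocity would add further terms that depend on what others observe and on how they have treated me.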
369 00:15:41,418 --> 00:15:43,210 And people might give because it might just 370 00:15:43,210 --> 00:15:45,885 feel uncomfortable to them. 371 00:15:45,885 --> 00:15:47,260 I think we said all of the things 372 00:15:47,260 --> 00:15:48,385 that I have here mentioned. 373 00:15:48,385 --> 00:15:53,740 And so there's various questions on various motives on altruism 374 00:15:53,740 --> 00:15:55,852 and why people give and are nice to others. 375 00:15:55,852 --> 00:15:57,310 Part of what we're going to discuss 376 00:15:57,310 --> 00:15:58,852 when we talk about social preferences 377 00:15:58,852 --> 00:16:00,894 is trying to disentangle those different motives. 378 00:16:00,894 --> 00:16:02,727 And there's a bunch of different experiments 379 00:16:02,727 --> 00:16:04,630 that people have run to trying to learn about 380 00:16:04,630 --> 00:16:05,890 why do people give? 381 00:16:05,890 --> 00:16:08,500 What determines altruism? 382 00:16:08,500 --> 00:16:12,490 What kinds of circumstances make people become nicer to others? 383 00:16:12,490 --> 00:16:16,090 Or if you wanted to increase altruism, how would we do that? 384 00:16:16,090 --> 00:16:17,200 Yes? 385 00:16:17,200 --> 00:16:19,370 AUDIENCE: Does caring about others mean 386 00:16:19,370 --> 00:16:20,602 the general population? 387 00:16:20,602 --> 00:16:22,060 How would they look at it as caring 388 00:16:22,060 --> 00:16:24,200 about your friends or family? 389 00:16:24,200 --> 00:16:26,260 [INAUDIBLE] 390 00:16:26,260 --> 00:16:28,630 FRANK SCHILBACH: I think those are very much related. 391 00:16:28,630 --> 00:16:30,130 There's a bit of a question overall. 392 00:16:30,130 --> 00:16:32,110 When you think about your utility function, 393 00:16:32,110 --> 00:16:34,390 usually it's defined for individuals. 394 00:16:34,390 --> 00:16:36,790 It's essentially you, yourself, how are you doing? 395 00:16:36,790 --> 00:16:38,355 There's other views, where you say, 396 00:16:38,355 --> 00:16:39,980 it's actually the household as a whole. 397 00:16:39,980 --> 00:16:43,870 So essentially everybody in your household is a unit. 398 00:16:43,870 --> 00:16:45,697 And then you maximize as a whole. 399 00:16:45,697 --> 00:16:48,280 There's a bunch of things about within the household, conflict 400 00:16:48,280 --> 00:16:49,600 and issues that arise. 401 00:16:49,600 --> 00:16:51,400 And decision making within households 402 00:16:51,400 --> 00:16:53,000 might not be optimal. 403 00:16:53,000 --> 00:16:56,118 But you can think of altruism as narrowly defined about friends 404 00:16:56,118 --> 00:16:57,160 and people that you love. 405 00:16:57,160 --> 00:16:59,230 And presumably you have like higher altruism 406 00:16:59,230 --> 00:17:00,830 towards those individuals. 407 00:17:00,830 --> 00:17:05,020 And then people who essentially are further away, there's 408 00:17:05,020 --> 00:17:07,390 less of that or perhaps other motives. 409 00:17:07,390 --> 00:17:08,800 These things are surely related. 410 00:17:08,800 --> 00:17:09,700 But they're distinct. 
I think when you think about your parents or your family and what you do for them, there are often different motives than, for example, when you think about poor people across the world. Reciprocity, et cetera, seems a lot more important in some ways, or perhaps pure altruism, as opposed to inequity aversion. But I think these things are very much related, and the question is which of those particular elements are more important in different settings.

So then the key questions we're going to ask are these. One is, what is the nature of such social preferences, i.e., the motivation to help or hurt others? We mentioned most of them already, and we're going to try to understand which of those are important in which settings. So one of them is, what determines giving, and why do people give to others? Broadly speaking, you can think of this as, why are people nice to others, or why do they appear nice, if you want.

And a second version is what's called social influences: how does the presence of others affect your behavior and your utility? That is to say, it might be that you care a lot about what other people think, or you are very jealous, and so on and so forth. That's not about being nice; that's just about your utility being deeply affected by what other people think about you. We're going to talk a little bit about things like Facebook and Instagram, et cetera, social media in particular. That might affect how you feel about yourself, and social influences might shape people's behavior.

So when you think about why people are nice, and this is what you were saying earlier about where the donations go: I was saying 2% of GDP is given to charity. When you look at some of those charities, for some of them it seems pretty clear that people give because they want to be nice to others. For other charities, perhaps not so much.
Some of the education ones are essentially donations that sponsor, say, a quarterback at your school. It's not clear that this is altruism. For other donations, if you give for health or poverty, I think it's pretty obvious.

Now, one way to measure social preferences, a very basic and coarse way of doing that, is what's called the dictator game. It's very simple. One of the questions we asked you was: if you got $10, how would you split it? It was a hypothetical question; in class, in lecture 12 or something, we're going to do this with real money at some point. But the question is, if you had just $10, what would you do with that $10? How much would you keep for yourself, and how much would you give to somebody else?

There were two versions of that. In one, the recipient is informed about the circumstances of the decision. In the other, the other person might never notice; the money is just given anonymously. What can we potentially disentangle from those two choices or options? If people choose differently in one and two, what does that tell us? Yeah?

AUDIENCE: [INAUDIBLE] one, they chose to split it, that might be more of a social thing, because they know that the other person will know. So they might worry that [INAUDIBLE] too much. But if they answer the same for the second one, then they're a little bit more selfless, because then they would split it [INAUDIBLE] whether or not the other person knows.

FRANK SCHILBACH: Exactly. So in some way, at least, you can think of that as some form of pure altruism, in some sense. I would be very happy if you had $10 more in your bank account, even if you never knew. There's no reciprocity.
There's no you knowing or learning about what I did or didn't do. It's just that I'd be happier if you had more money, and therefore I'm giving it to you. Of course, things like self-image and so on are hard to rule out. In some sense, maybe it's not that I'm actually happy about you; I'm just happy about me being happy about you. So there are tricky issues with that. But it's getting close to a form of pure altruism.

If instead you learn that it was me who gave you the money, or at least that there was somebody else doing it, then that's much more about a person who cares about where the money is coming from and so on. That tells you more about social motives, potentially about how the other person feels when they get money and how others were treating them, or perhaps about how I feel when the other person gets it, and so on.

Is there a question? No.

AUDIENCE: [INAUDIBLE] I don't agree that it would necessarily have to be about repetition [INAUDIBLE]. It could simply be about promoting the idea of getting [INAUDIBLE]. So if a person [INAUDIBLE] it doesn't serve to promote the idea [INAUDIBLE]. But if it did, if the person did know that somebody else was going to give them money, it would serve to [INAUDIBLE] even if-- they could be equally selfless even if [INAUDIBLE] your intentions are [INAUDIBLE].

FRANK SCHILBACH: So just to clarify, in number two, I think what I'm saying is that's pretty close to pure altruism. In number one, I'm saying there are other forces at play. One of them is that people care about what others do. You were saying, in some sense, a version of that: if it promotes altruism, in the sense that if I give you money, and you learn that I gave you money, and the hope is that you'll give money to others, then in some sense you would only do that if you care about others one way or the other, or if you learn something from those kinds of actions. That's all good.
I think I'm just saying that those kinds of experiments will help us disentangle potentially different motives for giving. This is a whole course, and you can't rule out all sorts of things. But it's clear that, in some sense, there is something different going on, which has to do with people caring about others one way or the other.

Yeah?

AUDIENCE: I actually did have a question. Do you think the results could change if we gave a larger sum of money, like $500 or $1,000 or something? Because with $10, the difference between getting $5 and getting $10 isn't going to be big. That's the difference between having a nice $5 and being able to get four bags of chips versus two bags of chips. So if you do it with $1,000, the difference between $500 and $1,000 could be the difference between something larger, like $500 would maybe pay for [INAUDIBLE] a while. But with $1,000, you could maybe buy a small car or something with it.

FRANK SCHILBACH: Yes, understood. So there are two questions here. One is that I'm asking you hypothetical questions, and you might say all sorts of things. But once I'm actually putting down money and saying, here's actually $10, how are you going to behave? So what we're going to do in class is have some hypothetical choices, then some real choices, and we're going to look at how those differ. In general, people actually choose pretty similarly in hypothetical and real choices. What we will not be able to answer in class is how you would choose if I gave you $500, because I'd be poor at the end of class. However, there have been some experiments with bankers and rich people and so on, where they actually gave people $1,000. And what you see overall is that people's giving behavior in those kinds of experiments is actually quite similar to smaller sums. I think there may be some differences.
And I think there's a huge difference potentially, in particular when you think about poverty, or giving, in developing countries. There's a huge difference potentially between giving people large sums of money, which could change their lives because people could invest and so on and so forth, as opposed to very small amounts or even repeated small amounts. If I give you $5 every few days for quite a while, that might actually not change things that much, because every small transfer feels really small. But instead, if I give you $1,000 right away, you might invest and try to have a better life and so on. So in that sense, I think that's potentially different. But when you look at these kinds of experiments, the behavior actually looks quite similar, at least in lab games.

So now, how much do people give? The strict version, just to be clear, in terms of the standard model, would say people should give exactly zero. If you care only about yourself, why would you give money to others? You would just keep everything. Usually subjects give about $2 to $3. The average giving in class was about $3 in the first case and $2 in the second case. So in some sense, it seems that when the person knows about the circumstances, that's something that's valued more, either because it signals altruism or just because the other person might be happier, and we appreciate that in some way.
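As an illustration of how such survey responses might be summarized, here is a small Python sketch; the numbers are made up for illustration and are not the actual class data.

```python
# Hypothetical dictator-game responses: dollars given out of $10.
# "informed": the recipient learns about the circumstances of the decision.
# "anonymous": the recipient never learns where the money came from.
responses = {
    "informed":  [0, 5, 5, 3, 2, 5, 0, 4, 3, 3],   # made-up numbers
    "anonymous": [0, 5, 2, 2, 0, 3, 0, 4, 3, 1],   # made-up numbers
}

for condition, amounts in responses.items():
    mean_given = sum(amounts) / len(amounts)
    share_zero = sum(a == 0 for a in amounts) / len(amounts)
    print(f"{condition:>9}: mean giving ${mean_given:.2f}, "
          f"fraction giving $0 = {share_zero:.0%}")
```

With the actual survey responses, the same comparison is what produces the roughly $3 versus $2 gap between the two conditions described above.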
And so here is what this looks like. You can see the distribution: a lot of people give either $0 or $5. There are a few of you who would give everything, which is very nice. There I do wonder, if it was actually 10 real dollars, whether you would really follow through. But surely there are some really nice people.

There are some issues with those kinds of games, in the sense that I'm eliciting your behavior in terms of what you would do if I gave you $10 to split with some other student or some other person who is fairly rich. You could see some people actually saying, I'm giving $0, and instead I'm donating the money to somebody in Kenya. It would look in the lab as if I was really selfish, but in fact I'm actually altruistic. So there are issues with these kinds of lab experiments. But in general, I think people's behavior in these lab games is fairly predictive of real-world things that we care about. Yes?

AUDIENCE: I'm curious about the order in which these questions were asked, because I remember on the survey we had the question on the left being asked first. And I think my thought process would have been different if I had been asked the questions in the reverse order.

FRANK SCHILBACH: Yes, absolutely. So in an actual experiment that I or others would run, the order would be randomized, and then you can test for that directly. Or what you would do is randomize across people, so some people get one question and other people get the other. We haven't done that, in part just because otherwise we would have had to send you different links, and it was already chaotic enough to send you one link, which seems to be the most we can handle. But usually what you would do is randomize the order and then be able to test for that.

My guess is that, broadly speaking, the results would be qualitatively similar. There might be some small differences in terms of the exact numbers. And surely the order of those kinds of questions often matters, which in some sense goes to the third aspect, choice: if you think about the classical model, the order should surely not matter. Essentially, I'm giving you some choices, you have a utility function, you have beliefs, and you'd just be done; the order is completely irrelevant. But it turns out that order effects are in fact often quite important in experiments, and I think in the real world as well.
Yeah?

AUDIENCE: [INAUDIBLE] think about how too much altruism may be [INAUDIBLE] if someone gave me $5 [INAUDIBLE] when you gave $10, they might think they're really weird or not [INAUDIBLE].

[LAUGHTER]

FRANK SCHILBACH: No, that's a great question. In fact, a lot of these experiments, and a lot of the experimental evidence that's been collected so far, and what I'm going to tell you a little bit about as well, has been collected in rich countries and Western countries, industrialized countries, with rich and educated people. The acronym for that is WEIRD: Western, educated, industrialized, rich, and democratic. Think of the fact that a lot of experiments are done mostly with college students at rich universities in the world. And their behavior might not be as-- so in some sense, the hope is that we find universal characteristics or parameters of people's behavior, but it's not unreasonable to think that in other societies people behave quite differently.

For example, one thing that we find in India, where we do these kinds of games with people, is that sometimes people would say something if I gave too much, or if somebody else did. Often we have these games where it's not just one choice, but another choice afterwards. And if somebody, for example, offers you too much money, the other person reacts and says, this is weird; I can't accept this money, because otherwise I would owe them some debt or the like. So there are these kinds of behaviors where, in some sense, it's more complicated because of reciprocity or other reasons. In China and other countries, there's a clear culture of reciprocity, where people are fairly careful about, for example, whom to invite and so on and so forth, and from whom to accept presents, because you know that in some sense you have to reciprocate at some point, and if you're not able to do that, that's very bad. So I think it's exactly as you say: it's a lot more complicated in many settings.
Yeah?

AUDIENCE: So I just have a question about the graphs. What is the y-axis?

FRANK SCHILBACH: It's the density. It aggregates; it adds up to 1. So essentially, think of this as the fraction overall. It's essentially--

AUDIENCE: [INAUDIBLE] that are greater than 1?

FRANK SCHILBACH: It's the PDF, essentially. It aggregates up to 1. Another version of that would be just the fraction of people.

AUDIENCE: But you can't have a fraction that's larger than 1. I'm confused.

FRANK SCHILBACH: No, this is not the fraction. This is the PDF, right?

AUDIENCE: That's the PDF.

FRANK SCHILBACH: Yeah, exactly. It's the PDF, which essentially aggregates to 1. But you could relabel this and essentially look at the fraction of people; the graph would look exactly the same. Maybe you should just do that.
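To see why the y-axis of a density plot can exceed 1 while a fraction of people cannot, here is a short matplotlib sketch; the allocation data and bin widths are made up for illustration and are not the class survey data.

```python
import numpy as np
import matplotlib.pyplot as plt

# Hypothetical dictator-game allocations in dollars (not the class data).
giving = np.array([0, 0, 0, 1, 2, 2, 3, 3, 3, 5, 5, 5, 5, 8, 10])
bins = np.arange(-0.125, 10.5, 0.25)  # narrow $0.25 bins

fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(8, 3))

# Density: bar *areas* sum to 1, so with narrow bins a bar's height can exceed 1.
ax1.hist(giving, bins=bins, density=True)
ax1.set_title("Density (areas sum to 1)")

# Fraction of respondents: bar *heights* sum to 1 and never exceed 1.
ax2.hist(giving, bins=bins, weights=np.ones_like(giving) / len(giving))
ax2.set_title("Fraction of respondents")

plt.tight_layout()
plt.show()
```

Relabeling the density as a fraction of people, as suggested above, changes only the y-axis scale; the shape of the graph stays exactly the same.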
784 00:31:57,660 --> 00:31:58,160 Yes? 785 00:31:58,160 --> 00:32:00,243 AUDIENCE: [INAUDIBLE] you have control technically 786 00:32:00,243 --> 00:32:01,590 over the outcome [INAUDIBLE]. 787 00:32:01,590 --> 00:32:02,840 FRANK SCHILBACH: Yes, exactly. 788 00:32:02,840 --> 00:32:04,760 So if you had perfect self-control, 789 00:32:04,760 --> 00:32:06,920 and if you had perfect foresight, in some sense, 790 00:32:06,920 --> 00:32:09,290 if you understood what your preferences are, 791 00:32:09,290 --> 00:32:11,720 and you have plans for the next month or two-- 792 00:32:11,720 --> 00:32:13,220 of course, there's unexpected things 793 00:32:13,220 --> 00:32:19,650 that could happen-- but if you understood your preferences 794 00:32:19,650 --> 00:32:21,890 well and had perfect self-control, 795 00:32:21,890 --> 00:32:24,620 you would know that either you can lose weight, 796 00:32:24,620 --> 00:32:26,360 and then you would win the bet. 797 00:32:26,360 --> 00:32:31,190 Or you would not lose weight, but then you would not even 798 00:32:31,190 --> 00:32:32,850 start the bet in the first place. 799 00:32:32,850 --> 00:32:36,620 But here, over 80% of betters, in fact, lose. 800 00:32:36,620 --> 00:32:39,080 So now how do we think about that? 801 00:32:42,400 --> 00:32:44,411 How do we explain this behavior? 802 00:32:44,411 --> 00:32:46,755 AUDIENCE: [INAUDIBLE]. 803 00:32:46,755 --> 00:32:47,630 FRANK SCHILBACH: Yes? 804 00:32:47,630 --> 00:32:48,680 AUDIENCE: [INAUDIBLE]. 805 00:32:48,680 --> 00:32:50,630 FRANK SCHILBACH: And so more specifically, 806 00:32:50,630 --> 00:32:52,160 what does that mean? 807 00:32:52,160 --> 00:32:55,650 AUDIENCE: So [INAUDIBLE] starting tomorrow. 808 00:32:55,650 --> 00:32:58,012 But when tomorrow comes, they do not [INAUDIBLE].. 809 00:32:58,012 --> 00:32:58,970 FRANK SCHILBACH: Right. 810 00:32:58,970 --> 00:33:01,070 But there's two things going on, potentially. 811 00:33:01,070 --> 00:33:02,510 So one is of control, in the sense 812 00:33:02,510 --> 00:33:05,450 of I have low self control. 813 00:33:05,450 --> 00:33:07,010 I like eating donuts. 814 00:33:07,010 --> 00:33:08,870 And I like eating donuts a lot. 815 00:33:08,870 --> 00:33:11,680 So I'm not going to lose any weight any time soon. 816 00:33:11,680 --> 00:33:13,670 Can that explain that behavior by itself? 817 00:33:13,670 --> 00:33:14,720 Or what else do we need? 818 00:33:18,670 --> 00:33:19,232 Yes? 819 00:33:19,232 --> 00:33:20,190 AUDIENCE: A part of it. 820 00:33:20,190 --> 00:33:21,460 FRANK SCHILBACH: Oh sorry, yeah? 821 00:33:21,460 --> 00:33:22,960 AUDIENCE: That's part of the answer. 822 00:33:22,960 --> 00:33:26,513 The other part is I'm stopping [INAUDIBLE] 823 00:33:26,513 --> 00:33:28,180 tomorrow [INAUDIBLE] that I want to lose 824 00:33:28,180 --> 00:33:29,208 some weight [INAUDIBLE]. 825 00:33:29,208 --> 00:33:30,250 FRANK SCHILBACH: Exactly. 826 00:33:30,250 --> 00:33:32,890 So what you need is some form of naivete or optimism 827 00:33:32,890 --> 00:33:35,500 in a sense of saying, maybe today 828 00:33:35,500 --> 00:33:37,060 I know I really like donuts. 829 00:33:37,060 --> 00:33:40,510 But tomorrow I will just eat lots of salad. 830 00:33:40,510 --> 00:33:42,903 And so in some sense, there's this overoptimism naivete, 831 00:33:42,903 --> 00:33:44,320 where people are essentially naive 832 00:33:44,320 --> 00:33:46,600 about their future preferences. 833 00:33:46,600 --> 00:33:49,900 Plus there needs to be some form of self control problems. 
834 00:33:49,900 --> 00:33:55,800 And that combination leads to these kinds of bets. 835 00:33:55,800 --> 00:33:57,590 So one part is self-control problems. 836 00:33:57,590 --> 00:34:01,640 The second part is naivete or overconfidence. 837 00:34:01,640 --> 00:34:04,130 Putting these together could get some people to essentially 838 00:34:04,130 --> 00:34:06,103 engage in these kinds of bets. 839 00:34:06,103 --> 00:34:07,520 Is there some other motivation why 840 00:34:07,520 --> 00:34:10,370 you would engage in this bet? 841 00:34:10,370 --> 00:34:10,989 Yes? 842 00:34:10,989 --> 00:34:15,578 AUDIENCE: [INAUDIBLE] actually be [INAUDIBLE]?? 843 00:34:15,578 --> 00:34:16,620 FRANK SCHILBACH: Exactly. 844 00:34:16,620 --> 00:34:19,080 So it could be that in some sense, I'm just-- 845 00:34:19,080 --> 00:34:20,489 so one explanation is-- 846 00:34:20,489 --> 00:34:21,989 I think this is a really great deal. 847 00:34:21,989 --> 00:34:23,350 Somebody offers me this great deal. 848 00:34:23,350 --> 00:34:24,850 I'm just going to go lose some weight. 849 00:34:24,850 --> 00:34:26,520 And I'm going to make a bunch of money. 850 00:34:26,520 --> 00:34:28,980 And then afterwards, I guess I'll eat a bunch of donuts. 851 00:34:28,980 --> 00:34:32,218 So I don't really necessarily need to want to lose weight. 852 00:34:32,218 --> 00:34:33,510 I just think it's a great deal. 853 00:34:33,510 --> 00:34:36,060 So that's overconfidence about my self-control. 854 00:34:36,060 --> 00:34:37,110 Then I engage in the bet. 855 00:34:37,110 --> 00:34:39,110 It turns out actually I don't have self-control. 856 00:34:39,110 --> 00:34:41,940 And I lose the money and eat lots of donuts. 857 00:34:41,940 --> 00:34:45,132 Now the second thing that you're saying is actually, 858 00:34:45,132 --> 00:34:47,340 I might actually be aware of my self-control problem. 859 00:34:47,340 --> 00:34:48,630 And so I might actually understand that I 860 00:34:48,630 --> 00:34:49,920 have a self-control problem. 861 00:34:49,920 --> 00:34:52,860 Now I'm engaging in this bet because as it turns out, 862 00:34:52,860 --> 00:34:55,199 on average, I guess there's a 20% chance of me actually 863 00:34:55,199 --> 00:34:56,070 succeeding. 864 00:34:56,070 --> 00:35:00,112 And perhaps the bet itself might help me in achieving that goal. 865 00:35:00,112 --> 00:35:02,070 So that's essentially some form of a commitment 866 00:35:02,070 --> 00:35:06,180 device, where engaging in that is helping me lose weight. 867 00:35:06,180 --> 00:35:07,265 I might actually know that. 868 00:35:07,265 --> 00:35:08,640 And there's a question of are you 869 00:35:08,640 --> 00:35:11,768 fully sophisticated or perhaps partially or fully naive? 870 00:35:11,768 --> 00:35:14,310 In some sense, I might know that there's a good chance that I 871 00:35:14,310 --> 00:35:15,660 might actually lose the money. 872 00:35:15,660 --> 00:35:18,035 But it might be worth it for me to actually engage in the bet 873 00:35:18,035 --> 00:35:21,240 because the expected value of losing weight 874 00:35:21,240 --> 00:35:26,080 might be higher than the loss of money overall. 875 00:35:26,080 --> 00:35:29,140 So I think we said all there is to say here. 876 00:35:29,140 --> 00:35:31,080 So essentially there's either-- 877 00:35:31,080 --> 00:35:33,480 again, either people are just overconfident or naive. 878 00:35:33,480 --> 00:35:36,180 Second one is people might want to set themselves 879 00:35:36,180 --> 00:35:38,070 incentives to lose weight.
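To make the commitment-device logic concrete, here is a minimal numerical sketch. All of the numbers below (the stake, the payout, the success probabilities, and the dollar value placed on losing weight) are made up purely for illustration; they are not from the lecture or from William Hill.

```python
# Rough sketch: when might a sophisticated but present-biased dieter take the bet?
# Every number here is hypothetical.

stake = 150.0                  # amount wagered (lost if the target is missed)
payout = 500.0                 # amount received if the weight target is met
value_of_weight_loss = 400.0   # dollar value the dieter places on actually losing the weight

p_success_no_bet = 0.05    # chance of losing the weight with nothing on the line
p_success_with_bet = 0.20  # chance once money is at stake (the commitment effect)

# Expected payoff without the bet: only the small chance of losing weight anyway.
ev_no_bet = p_success_no_bet * value_of_weight_loss

# Expected payoff with the bet: win the payout plus the health benefit with
# probability p_success_with_bet, otherwise forfeit the stake.
ev_bet = (p_success_with_bet * (payout + value_of_weight_loss)
          - (1 - p_success_with_bet) * stake)

print(f"EV without bet: {ev_no_bet:.0f}, EV with bet: {ev_bet:.0f}")
# With these made-up numbers the bet is worth taking even though the stake is
# lost 80% of the time, because it raises the chance of the valued outcome.
```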
880 00:35:38,070 --> 00:35:40,890 Again you need some naivete and a sense of-- 881 00:35:40,890 --> 00:35:42,870 or some stochasticity or something. 882 00:35:42,870 --> 00:35:44,150 Otherwise it couldn't be-- 883 00:35:44,150 --> 00:35:45,900 otherwise you would only engage in the bet 884 00:35:45,900 --> 00:35:47,838 if that's actually helpful. 885 00:35:47,838 --> 00:35:48,630 So it needs to be-- 886 00:35:48,630 --> 00:35:50,820 for some people either it's stochastic, 887 00:35:50,820 --> 00:35:51,972 or they're partially naive. 888 00:35:51,972 --> 00:35:54,180 In some sense they think there's a chance it'll work. 889 00:35:54,180 --> 00:35:56,055 But in fact, maybe they overestimate how high 890 00:35:56,055 --> 00:35:58,000 the chance of succeeding is. 891 00:35:58,000 --> 00:35:59,850 So overall what this reflects is there's 892 00:35:59,850 --> 00:36:03,958 a conflict between short-run and long-run plans. 893 00:36:03,958 --> 00:36:05,500 There's actually two things going on. 894 00:36:05,500 --> 00:36:07,500 One is what you want in the short run might 895 00:36:07,500 --> 00:36:09,540 be different from what you want in the long run. 896 00:36:09,540 --> 00:36:13,950 Second one is you might have imperfect foresight about that. 897 00:36:13,950 --> 00:36:14,700 Yes? 898 00:36:14,700 --> 00:36:16,500 AUDIENCE: Were people allowed to bet they would gain weight? 899 00:36:16,500 --> 00:36:17,583 FRANK SCHILBACH: Say that? 900 00:36:17,583 --> 00:36:20,250 AUDIENCE: Were people allowed to bet they would gain weight? 901 00:36:20,250 --> 00:36:21,460 FRANK SCHILBACH: Actually, I don't know. 902 00:36:21,460 --> 00:36:22,093 You could try. 903 00:36:22,093 --> 00:36:23,010 I actually don't know. 904 00:36:23,010 --> 00:36:26,220 But I think that's probably easier to do. 905 00:36:26,220 --> 00:36:28,080 So maybe they wouldn't accept the bets. 906 00:36:28,080 --> 00:36:31,570 Or your odds would be worse. 907 00:36:31,570 --> 00:36:32,236 Yeah? 908 00:36:32,236 --> 00:36:34,319 AUDIENCE: Could there also be a change in utility, 909 00:36:34,319 --> 00:36:37,476 where you [INAUDIBLE] I'll get-- if I lose 10 pounds, 910 00:36:37,476 --> 00:36:38,425 I'll get $500. 911 00:36:38,425 --> 00:36:40,854 But over time, you're like, oh, eating 912 00:36:40,854 --> 00:36:44,003 is actually-- changing my eating habits is hard. Oh, working out 913 00:36:44,003 --> 00:36:46,380 is difficult. So now I'd rather pay the $500 914 00:36:46,380 --> 00:36:48,454 but not have to lose the weight because I'm 915 00:36:48,454 --> 00:36:50,070 no longer as interested. 916 00:36:50,070 --> 00:36:51,028 FRANK SCHILBACH: Right. 917 00:36:51,028 --> 00:36:53,280 She's saying in some sense, there's 918 00:36:53,280 --> 00:36:54,942 some learning or uncertainty involved. 919 00:36:54,942 --> 00:36:57,150 Yeah, I think that could be the case for some people. 920 00:36:57,150 --> 00:36:58,733 You would think that over 80% of people 921 00:36:58,733 --> 00:37:01,780 know their preferences by now. 922 00:37:01,780 --> 00:37:06,090 So a lot of behaviors-- and I have the same issue-- I'm 923 00:37:06,090 --> 00:37:09,990 chronically overoptimistic about when I get up in the morning. 924 00:37:09,990 --> 00:37:13,180 Every night, I'm sort of, tomorrow at 7:00, I'll get up. 925 00:37:13,180 --> 00:37:16,360 And I'll exercise and write a poem and so on and so forth. 926 00:37:16,360 --> 00:37:18,100 And then that never happens.
927 00:37:18,100 --> 00:37:20,853 So I think you say, there is potentially 928 00:37:20,853 --> 00:37:22,020 uncertainty about the world. 929 00:37:22,020 --> 00:37:26,610 And you might just not know how hard losing weight is, 930 00:37:26,610 --> 00:37:27,720 in this example. 931 00:37:27,720 --> 00:37:30,210 There's plenty of examples, which we're going to show you, 932 00:37:30,210 --> 00:37:33,630 where people over and over make the same choices that 933 00:37:33,630 --> 00:37:35,185 seem to be not optimal. 934 00:37:35,185 --> 00:37:37,810 But you think people should have all the information they need. 935 00:37:37,810 --> 00:37:40,470 And people should have learned over time. 936 00:37:40,470 --> 00:37:42,970 But I think part of that is some explanation, potentially, 937 00:37:42,970 --> 00:37:43,470 is learning. 938 00:37:43,470 --> 00:37:45,720 That's exactly right. 939 00:37:45,720 --> 00:37:48,030 So then what we have done here-- 940 00:37:48,030 --> 00:37:50,140 in our survey, we asked you two questions. 941 00:37:50,140 --> 00:37:55,170 One was, how often do you think you should be exercising? 942 00:37:55,170 --> 00:37:58,080 The other one is, how often do you, in fact, exercise? 943 00:37:58,080 --> 00:37:59,500 These actually look quite similar. 944 00:37:59,500 --> 00:38:01,260 There seems to be one person who thinks they should, 945 00:38:01,260 --> 00:38:03,510 and they actually do exercise 50 times a month-- 946 00:38:03,510 --> 00:38:05,575 [LAUGHTER] 947 00:38:05,575 --> 00:38:06,700 --which is very impressive. 948 00:38:06,700 --> 00:38:08,430 I don't know whether you were truthful. 949 00:38:08,430 --> 00:38:10,470 You know who you are. 950 00:38:10,470 --> 00:38:15,467 So if you plot this against each other, 951 00:38:15,467 --> 00:38:17,550 what do you essentially see is quite a bit of mass 952 00:38:17,550 --> 00:38:19,530 below the 45 degree axis, which is 953 00:38:19,530 --> 00:38:21,030 to say there's a bunch of people who 954 00:38:21,030 --> 00:38:26,650 say they should be exercising more than they actually do. 955 00:38:26,650 --> 00:38:28,980 And that's a typical pattern in the world. 956 00:38:28,980 --> 00:38:30,750 Lots of people think in the future, 957 00:38:30,750 --> 00:38:33,120 they should be doing things or more virtuous things. 958 00:38:33,120 --> 00:38:34,950 In the future, they would like to be 959 00:38:34,950 --> 00:38:37,320 a better and more virtuous, more healthy person 960 00:38:37,320 --> 00:38:38,530 than they actually are. 961 00:38:38,530 --> 00:38:42,088 So if you had to choose for next month or next week or the like, 962 00:38:42,088 --> 00:38:44,130 you would choose lots of virtuous and good things 963 00:38:44,130 --> 00:38:44,940 for yourself. 964 00:38:44,940 --> 00:38:47,368 But then if you had to choose for right now, 965 00:38:47,368 --> 00:38:48,660 that might not actually happen. 966 00:38:48,660 --> 00:38:52,170 And that leads to this discrepancy between people's 967 00:38:52,170 --> 00:38:54,450 desires and the actual behaviors, 968 00:38:54,450 --> 00:38:58,980 which we're going to then study and try and understand. 969 00:38:58,980 --> 00:39:01,368 And then the broad question there, then is to say, 970 00:39:01,368 --> 00:39:02,910 given that that's the case, do people 971 00:39:02,910 --> 00:39:04,170 understand their behavior? 972 00:39:04,170 --> 00:39:05,400 Are they sophisticated? 
973 00:39:05,400 --> 00:39:07,630 And are they engaging in certain commitment devices 974 00:39:07,630 --> 00:39:09,338 or certain behaviors that might help them 975 00:39:09,338 --> 00:39:11,430 overcome those types of issues? 976 00:39:11,430 --> 00:39:12,805 So at the source of this conflict 977 00:39:12,805 --> 00:39:15,390 is essentially what we call present bias, which essentially 978 00:39:15,390 --> 00:39:19,140 is manifested in what we asked you about as well, which is we asked you 979 00:39:19,140 --> 00:39:23,280 two questions, one about a choice between $100 in cash 52 weeks 980 00:39:23,280 --> 00:39:28,710 from now versus $x in 54 weeks from now. 981 00:39:28,710 --> 00:39:31,320 If you think about 52 versus 54 weeks from now, 982 00:39:31,320 --> 00:39:32,670 that's really far in the future. 983 00:39:32,670 --> 00:39:35,580 And two weeks early or late really don't make a difference. 984 00:39:35,580 --> 00:39:39,210 And most people would probably say something close to 100. 985 00:39:39,210 --> 00:39:41,160 If I instead ask you right now, would you 986 00:39:41,160 --> 00:39:43,800 like to have $100 now versus $x in two weeks from now, 987 00:39:43,800 --> 00:39:46,680 a lot more people say they'd rather have $100 now. 988 00:39:46,680 --> 00:39:50,610 Or they take like a lower amount right now 989 00:39:50,610 --> 00:39:52,920 than $100 in the future, essentially 990 00:39:52,920 --> 00:39:55,020 because right now, they put a lot of weight 991 00:39:55,020 --> 00:39:56,970 on the immediate present. 992 00:39:56,970 --> 00:39:59,250 And that's what we refer to as present bias. 993 00:39:59,250 --> 00:40:01,950 Essentially people put disproportionately very high 994 00:40:01,950 --> 00:40:04,320 weight on the present, compared to anything 995 00:40:04,320 --> 00:40:05,710 that's in the future. 996 00:40:05,710 --> 00:40:09,030 And so when you look at that, it's a little bit noisy. 997 00:40:09,030 --> 00:40:10,500 And the question is not quite ideal 998 00:40:10,500 --> 00:40:12,708 because in some sense, what I really would like to do 999 00:40:12,708 --> 00:40:15,782 is ask you about 52 weeks from now and 54 weeks from now 1000 00:40:15,782 --> 00:40:17,490 and then wait for a year and then ask you 1001 00:40:17,490 --> 00:40:18,990 the exact same question again. 1002 00:40:18,990 --> 00:40:21,730 Now I can't really wait for you to do that. 1003 00:40:21,730 --> 00:40:23,250 So in some sense, the circumstances 1004 00:40:23,250 --> 00:40:25,110 in a year from now might be quite different from what 1005 00:40:25,110 --> 00:40:25,652 they are now. 1006 00:40:25,652 --> 00:40:28,668 You might be richer, or more educated, wiser, and so on. 1007 00:40:28,668 --> 00:40:30,210 But for what it's worth, it does seem 1008 00:40:30,210 --> 00:40:32,760 to be the case that when you ask people 1009 00:40:32,760 --> 00:40:37,050 about $100 in 52 weeks versus 54 weeks from now, 1010 00:40:37,050 --> 00:40:38,990 the distribution is very close to 100. 1011 00:40:38,990 --> 00:40:40,740 Essentially you don't care very much about 1012 00:40:40,740 --> 00:40:44,910 whether you get $100 in 52 weeks versus 54 weeks from now.
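One standard way to formalize this pattern is quasi-hyperbolic, or beta-delta, discounting, where everything that is not "right now" gets an extra discount factor beta < 1 on top of the usual exponential discounting. The beta and delta values below are purely illustrative, not estimates from this survey; the sketch just shows why the 52-versus-54-week trade-off looks patient while the now-versus-two-weeks trade-off does not.

```python
# Minimal sketch of quasi-hyperbolic ("beta-delta") discounting.
# beta and delta are illustrative numbers, not estimates.

beta = 0.7     # extra discount on everything that is not the present
delta = 0.999  # standard per-week exponential discount factor

def weight(t_weeks: int) -> float:
    """Decision weight on utility received t_weeks from now."""
    return 1.0 if t_weeks == 0 else beta * delta ** t_weeks

def indifference_amount(early_amount: float, t_early: int, t_late: int) -> float:
    """Amount at t_late with the same weighted value as early_amount at t_early."""
    return early_amount * weight(t_early) / weight(t_late)

# $100 in 52 weeks vs. $x in 54 weeks: beta cancels, only delta matters.
print(indifference_amount(100, 52, 54))  # ~100.2

# $100 today vs. $x in 2 weeks: beta bites, so a much larger amount is needed.
print(indifference_amount(100, 0, 2))    # ~143.1
```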
1013 00:40:44,910 --> 00:40:48,090 If you look at like $100 now versus 1014 00:40:48,090 --> 00:40:49,950 $100 in two weeks from now, I have 1015 00:40:49,950 --> 00:40:51,960 to pay you a higher amount in two weeks from now 1016 00:40:51,960 --> 00:40:53,960 to get you indifferent between now and two weeks 1017 00:40:53,960 --> 00:40:56,667 from now, which is saying like you care more about today 1018 00:40:56,667 --> 00:40:57,750 versus two weeks from now. 1019 00:40:57,750 --> 00:41:00,510 Your discount rate between today and two weeks from now 1020 00:41:00,510 --> 00:41:03,870 is higher than it is from 52 versus 54 weeks from now. 1021 00:41:03,870 --> 00:41:04,545 Yeah? 1022 00:41:04,545 --> 00:41:05,920 AUDIENCE: I was wondering how you 1023 00:41:05,920 --> 00:41:08,472 could distinguish this time preference 1024 00:41:08,472 --> 00:41:11,580 and expectation of this. 1025 00:41:11,580 --> 00:41:13,440 For example, if you offer money now 1026 00:41:13,440 --> 00:41:15,180 is more certain than two weeks from now, 1027 00:41:15,180 --> 00:41:17,530 [INAUDIBLE] I do not see you again. 1028 00:41:17,530 --> 00:41:19,855 But if I look at 52 versus 54, they are both uncertain. 1029 00:41:19,855 --> 00:41:20,730 FRANK SCHILBACH: Yes. 1030 00:41:20,730 --> 00:41:23,670 So just to repeat the question, the question 1031 00:41:23,670 --> 00:41:25,120 is what about risk? 1032 00:41:25,120 --> 00:41:27,210 So if I ask you $100 now, and I give you 1033 00:41:27,210 --> 00:41:29,790 $100 bill, and say-- or some money 1034 00:41:29,790 --> 00:41:32,297 in the future, who knows whether I'm going to show up 1035 00:41:32,297 --> 00:41:33,630 and actually give you the money. 1036 00:41:33,630 --> 00:41:37,353 And there's reason to be skeptical. 1037 00:41:37,353 --> 00:41:38,520 I will be here in two weeks. 1038 00:41:38,520 --> 00:41:40,200 Don't worry. 1039 00:41:40,200 --> 00:41:41,950 So there's a bunch of experiments 1040 00:41:41,950 --> 00:41:44,910 that, in fact, try to deal with that fairly carefully. 1041 00:41:44,910 --> 00:41:48,090 So they will do things like in some experiments with students, 1042 00:41:48,090 --> 00:41:50,250 they give the professor's office name. 1043 00:41:50,250 --> 00:41:53,700 They have the phone number, the business card-- 1044 00:41:53,700 --> 00:41:55,990 they even give checks that are sent automatically. 1045 00:41:55,990 --> 00:41:57,930 So people try to deal with that. 1046 00:41:57,930 --> 00:41:59,375 It's a great question. 1047 00:41:59,375 --> 00:42:00,750 In fact, there's lots of research 1048 00:42:00,750 --> 00:42:02,070 in that to try to do that. 1049 00:42:02,070 --> 00:42:03,528 Some people said, this present bias 1050 00:42:03,528 --> 00:42:04,945 is actually not real that you find 1051 00:42:04,945 --> 00:42:06,240 in these types of experiments. 1052 00:42:06,240 --> 00:42:08,670 But then people have been very careful in trying 1053 00:42:08,670 --> 00:42:12,180 to disentangle risk from time preferences. 1054 00:42:12,180 --> 00:42:15,420 And they find essentially even if you try to minimize risk 1055 00:42:15,420 --> 00:42:17,550 as much as possible in those experiments, 1056 00:42:17,550 --> 00:42:21,210 present bias seems to persist in those kinds of choices. 1057 00:42:21,210 --> 00:42:21,780 Yeah? 1058 00:42:21,780 --> 00:42:24,050 AUDIENCE: I'm curious about the reasoning 1059 00:42:24,050 --> 00:42:29,213 or being indifferent between $100 now and less than $100 1060 00:42:29,213 --> 00:42:30,120 in the future. 
1061 00:42:30,120 --> 00:42:34,170 FRANK SCHILBACH: You have to ask your classmates about that. 1062 00:42:34,170 --> 00:42:36,330 It could be that in some sense, you say-- 1063 00:42:36,330 --> 00:42:39,468 if you, for example, say you know that you tend to waste 1064 00:42:39,468 --> 00:42:41,760 money in certain situations-- maybe you're really tired 1065 00:42:41,760 --> 00:42:43,260 today, and you're just going to say, 1066 00:42:43,260 --> 00:42:46,328 I'm going to waste the money on hamburgers-- 1067 00:42:46,328 --> 00:42:48,870 you might say maybe in a week from now, things are different. 1068 00:42:48,870 --> 00:42:50,328 I think there's lots of explanation 1069 00:42:50,328 --> 00:42:51,420 that you can come up with. 1070 00:42:51,420 --> 00:42:54,740 Again, you have to ask your classmates 1071 00:42:54,740 --> 00:42:55,740 what they were thinking. 1072 00:42:55,740 --> 00:42:58,580 If anybody has an explanation, please let us know. 1073 00:43:02,370 --> 00:43:04,400 I think one broad bottom line from this 1074 00:43:04,400 --> 00:43:06,800 in these kinds of experiments, you see a lot of behavior 1075 00:43:06,800 --> 00:43:08,840 that looks-- 1076 00:43:08,840 --> 00:43:11,492 there's lots of potential anomalies. 1077 00:43:11,492 --> 00:43:12,950 There's lots of noise, essentially, 1078 00:43:12,950 --> 00:43:15,110 in some of those kinds of data. 1079 00:43:15,110 --> 00:43:18,230 In aggregate, things often look-- 1080 00:43:18,230 --> 00:43:19,310 the patterns emerge. 1081 00:43:19,310 --> 00:43:21,200 But lots of people make choices that are 1082 00:43:21,200 --> 00:43:22,550 in fact hard to rationalize. 1083 00:43:22,550 --> 00:43:25,220 And we try and look at aggregate patterns. 1084 00:43:25,220 --> 00:43:27,500 Most of these things are systematic. 1085 00:43:27,500 --> 00:43:30,568 But there's some people, they're on the very left. 1086 00:43:30,568 --> 00:43:32,360 Maybe these are just mistakes, for example, 1087 00:43:32,360 --> 00:43:34,130 or they just misunderstood the question. 1088 00:43:34,130 --> 00:43:38,060 Or there's typos and so on and so forth. 1089 00:43:38,060 --> 00:43:39,650 So let me summarize here. 1090 00:43:39,650 --> 00:43:43,315 So people tend to be fairly patient for far-off decisions. 1091 00:43:43,315 --> 00:43:44,690 So in the future, you're actually 1092 00:43:44,690 --> 00:43:46,357 quite patient between stuff that happens 1093 00:43:46,357 --> 00:43:49,197 between a year from now, two years from now, 1094 00:43:49,197 --> 00:43:50,030 five years from now. 1095 00:43:50,030 --> 00:43:52,873 People invest all sorts of things for the far away future. 1096 00:43:52,873 --> 00:43:55,040 But when it comes to immediately relevant decisions, 1097 00:43:55,040 --> 00:43:57,002 people tend to be quite impatient. 1098 00:43:57,002 --> 00:43:58,460 You care a lot about right now what 1099 00:43:58,460 --> 00:44:01,970 happens, the next maybe day or the next few hours. 1100 00:44:01,970 --> 00:44:05,450 You care a lot about that much more than a week 1101 00:44:05,450 --> 00:44:08,130 or two or even a year from now. 1102 00:44:08,130 --> 00:44:11,070 So there's then two broad questions to begin to look at. 1103 00:44:11,070 --> 00:44:13,640 One is evidence of those kinds of conflicts 1104 00:44:13,640 --> 00:44:16,010 between the current and future selves in terms 1105 00:44:16,010 --> 00:44:18,500 of short-run desires and long-run goals. 1106 00:44:18,500 --> 00:44:21,992 And we can try and find evidence of that. 
1107 00:44:21,992 --> 00:44:24,200 Second, we're going to look at whether and how people 1108 00:44:24,200 --> 00:44:26,630 predict their future utility and behavior. 1109 00:44:26,630 --> 00:44:29,060 And that's a general pattern in behavioral economics 1110 00:44:29,060 --> 00:44:31,580 and psychology and economics, which is to say actually 1111 00:44:31,580 --> 00:44:34,640 the fact that people have present bias itself is not 1112 00:44:34,640 --> 00:44:40,220 a huge problem in the sense of causing welfare losses 1113 00:44:40,220 --> 00:44:43,340 or making people unhappy, as long as people 1114 00:44:43,340 --> 00:44:44,640 are sophisticated. 1115 00:44:44,640 --> 00:44:46,850 So as long as I know I have self-control problems, 1116 00:44:46,850 --> 00:44:50,270 and I'm fully aware of them, I can set my decision environment 1117 00:44:50,270 --> 00:44:52,490 or my choices such that I can actually 1118 00:44:52,490 --> 00:44:55,040 mitigate a lot of these biases, in part because I can 1119 00:44:55,040 --> 00:44:56,630 buy certain commitment devices. 1120 00:44:56,630 --> 00:44:59,060 I can set alarm clocks and all sorts 1121 00:44:59,060 --> 00:45:02,400 of things that might help me improve my behavior. 1122 00:45:02,400 --> 00:45:04,940 The problem comes from naivete. 1123 00:45:04,940 --> 00:45:07,070 That is to say, if I have self-control problems, 1124 00:45:07,070 --> 00:45:09,290 and I'm naive on top of that, then 1125 00:45:09,290 --> 00:45:11,120 I might get really screwed because I think, 1126 00:45:11,120 --> 00:45:12,420 I'm going to go do stuff in the future. 1127 00:45:12,420 --> 00:45:13,350 I'll do it in the future. 1128 00:45:13,350 --> 00:45:14,933 And so on-- and then the future comes. 1129 00:45:14,933 --> 00:45:17,820 And surprise, I'm still present-biased 1130 00:45:17,820 --> 00:45:19,320 and have like self-control problems. 1131 00:45:19,320 --> 00:45:22,010 So if I think that I have more self-control than I actually 1132 00:45:22,010 --> 00:45:26,510 have, that might cause a lot of bad behaviors or problems 1133 00:45:26,510 --> 00:45:30,170 because I make mistakes that are then hard to fix in the future. 1134 00:45:30,170 --> 00:45:33,020 That's a general pattern in behavioral economics. 1135 00:45:33,020 --> 00:45:35,330 Similar issues-- for example, if you think about memory, 1136 00:45:35,330 --> 00:45:37,590 if I have imperfect memory, that's of course, 1137 00:45:37,590 --> 00:45:38,750 potentially a problem. 1138 00:45:38,750 --> 00:45:40,370 But I can fix it in some sense. 1139 00:45:40,370 --> 00:45:41,870 I can set myself reminders. 1140 00:45:41,870 --> 00:45:44,460 I can have other people remind me and so on. 1141 00:45:44,460 --> 00:45:45,950 I can set my decision environments 1142 00:45:45,950 --> 00:45:48,980 so that I can account for imperfect memory. 1143 00:45:48,980 --> 00:45:50,660 The problem comes if I'm overconfident. 1144 00:45:50,660 --> 00:45:52,288 If I think I can remember everything, 1145 00:45:52,288 --> 00:45:54,580 I talk to people and never take notes and [INAUDIBLE] 1146 00:45:54,580 --> 00:45:57,750 I'll remember, but then an hour later, 1147 00:45:57,750 --> 00:46:00,990 I've forgotten everything, that really leads to problems.
1148 00:46:00,990 --> 00:46:04,040 And again it's not the bias itself 1149 00:46:04,040 --> 00:46:08,390 in terms of having imperfect cognitive function or the like, 1150 00:46:08,390 --> 00:46:10,850 but rather the overconfidence about the same 1151 00:46:10,850 --> 00:46:14,660 that often leads to welfare losses or making people unhappy 1152 00:46:14,660 --> 00:46:16,550 or making inferior choices. 1153 00:46:20,150 --> 00:46:24,540 So let me talk about beliefs then for a bit. 1154 00:46:24,540 --> 00:46:27,288 So a broad set of beliefs or-- sorry, 1155 00:46:27,288 --> 00:46:29,330 were there any other questions about preferences? 1156 00:46:29,330 --> 00:46:30,352 Yes? 1157 00:46:30,352 --> 00:46:32,922 AUDIENCE: Yeah, for present bias, 1158 00:46:32,922 --> 00:46:38,600 is it in general symmetrical when you compare losses versus gains? 1159 00:46:38,600 --> 00:46:42,560 Do people prefer to lose in the future rather than to lose now? 1160 00:46:42,560 --> 00:46:47,950 And is it [INAUDIBLE] in general? 1161 00:46:47,950 --> 00:46:48,940 FRANK SCHILBACH: Yeah. 1162 00:46:48,940 --> 00:46:49,857 We'll talk about that. 1163 00:46:49,857 --> 00:46:54,520 In fact, in some sense, people tend to-- so essentially people 1164 00:46:54,520 --> 00:46:58,210 like to have gains in the present 1165 00:46:58,210 --> 00:47:00,227 and losses in the future. 1166 00:47:00,227 --> 00:47:01,810 In some sense-- think of it like this. 1167 00:47:01,810 --> 00:47:03,730 You put more weight on the present utility 1168 00:47:03,730 --> 00:47:05,860 than you put on the future utility. 1169 00:47:05,860 --> 00:47:07,180 How do you do that? 1170 00:47:07,180 --> 00:47:10,620 You want to have more gains and fewer losses 1171 00:47:10,620 --> 00:47:12,250 or negative things going on now. 1172 00:47:12,250 --> 00:47:15,850 That is not to say that loss aversion always happens-- 1173 00:47:15,850 --> 00:47:18,412 so loss aversion is a concept about gains and losses, 1174 00:47:18,412 --> 00:47:19,870 and you are loss averse in the present, 1175 00:47:19,870 --> 00:47:21,965 and you're also loss averse in the future. 1176 00:47:21,965 --> 00:47:23,590 But if you could pick, would you rather 1177 00:47:23,590 --> 00:47:25,810 have a gain in the present or a loss-- 1178 00:47:25,810 --> 00:47:28,305 or a gain or loss in the present versus in the future, 1179 00:47:28,305 --> 00:47:29,680 you would tend to say, I'd rather 1180 00:47:29,680 --> 00:47:32,470 have the gain now and the loss in the future, precisely 1181 00:47:32,470 --> 00:47:35,800 because you overweigh, essentially, the present period 1182 00:47:35,800 --> 00:47:38,023 or present utility. 1183 00:47:38,023 --> 00:47:38,940 I don't know if that-- 1184 00:47:38,940 --> 00:47:42,460 but we'll talk about that. 1185 00:47:42,460 --> 00:47:46,450 So let me talk now about beliefs and information updating. 1186 00:47:46,450 --> 00:47:50,200 So one very basic question that I think we asked you, 1187 00:47:50,200 --> 00:47:53,290 and that's a classic question that people like Kahneman 1188 00:47:53,290 --> 00:47:56,223 and Tversky and so on would often ask, 1189 00:47:56,223 --> 00:47:57,640 which is essentially the question, 1190 00:47:57,640 --> 00:47:59,770 suppose 1 in 100 people have HIV. 1191 00:47:59,770 --> 00:48:02,650 We have a test for HIV that's 99% accurate. 1192 00:48:02,650 --> 00:48:04,300 This means that if a person has HIV, 1193 00:48:04,300 --> 00:48:07,390 the test returns a positive result with 99% probability.
1194 00:48:07,390 --> 00:48:08,890 And if the person does not have HIV, 1195 00:48:08,890 --> 00:48:11,950 it returns a negative result with 99% probability. 1196 00:48:11,950 --> 00:48:14,770 If a person's HIV test came back positive, 1197 00:48:14,770 --> 00:48:17,650 what's the probability that she has HIV? 1198 00:48:17,650 --> 00:48:19,570 Now that's a very straightforward question, 1199 00:48:19,570 --> 00:48:21,697 that there's a clear correct answer. 1200 00:48:21,697 --> 00:48:23,530 If you know Bayes' rule, you can essentially 1201 00:48:23,530 --> 00:48:25,107 just calculate that. 1202 00:48:25,107 --> 00:48:26,940 I'm sure somewhere in statistics, et cetera, 1203 00:48:26,940 --> 00:48:28,930 people have taught you that. 1204 00:48:28,930 --> 00:48:33,640 Now 22% of this class would have answered 99%. 1205 00:48:33,640 --> 00:48:35,530 That's usually the classic what's 1206 00:48:35,530 --> 00:48:38,245 called base rate neglect. 1207 00:48:38,245 --> 00:48:40,120 For what it's worth, you guys are much better 1208 00:48:40,120 --> 00:48:41,200 than previous years. 1209 00:48:41,200 --> 00:48:42,310 I don't know why that is. 1210 00:48:42,310 --> 00:48:45,370 But maybe people get smarter over time. 1211 00:48:45,370 --> 00:48:47,440 But still this is MIT. 1212 00:48:47,440 --> 00:48:49,840 So if 20% of MIT students get this wrong, 1213 00:48:49,840 --> 00:48:52,090 surely the fraction in the general population who gets 1214 00:48:52,090 --> 00:48:53,590 that wrong is a lot higher. 1215 00:48:53,590 --> 00:48:56,620 And the typical explanation is that there's what's 1216 00:48:56,620 --> 00:48:58,990 called base rate neglect. 1217 00:48:58,990 --> 00:49:01,150 The logic is something like this. 1218 00:49:01,150 --> 00:49:05,420 A positive person will probably receive a positive result. 1219 00:49:05,420 --> 00:49:08,140 If she's tested positive, she's likely to be HIV positive. 1220 00:49:08,140 --> 00:49:10,930 And then that's 99%. 1221 00:49:10,930 --> 00:49:14,740 Now that's a very sort of natural and easy answer. 1222 00:49:14,740 --> 00:49:16,990 In particular if you only had like five or 10 seconds 1223 00:49:16,990 --> 00:49:19,450 to answer this question, that's a natural response 1224 00:49:19,450 --> 00:49:20,560 many people would make. 1225 00:49:20,560 --> 00:49:22,090 Usually the fraction of people would 1226 00:49:22,090 --> 00:49:25,135 say 99% would be a lot higher or maybe half or more. 1227 00:49:25,135 --> 00:49:27,010 In previous years, I think it was about half. 1228 00:49:27,010 --> 00:49:28,480 And very few people, in fact, would 1229 00:49:28,480 --> 00:49:31,600 give the correct answer of 50%. 1230 00:49:31,600 --> 00:49:33,220 And the mistake here is essentially 1231 00:49:33,220 --> 00:49:34,900 what's called sort of base rate neglect, 1232 00:49:34,900 --> 00:49:38,800 is to ignore the fact that the population, the base rate, 1233 00:49:38,800 --> 00:49:40,850 is in fact, very low. 1234 00:49:40,850 --> 00:49:41,890 It's only 1%. 1235 00:49:41,890 --> 00:49:43,720 And you should condition on that when 1236 00:49:43,720 --> 00:49:46,422 you try to solve this problem. 1237 00:49:46,422 --> 00:49:47,380 So here's what you see. 1238 00:49:47,380 --> 00:49:49,600 So 50 is the correct answer. 1239 00:49:49,600 --> 00:49:50,500 If I remember right. 1240 00:49:50,500 --> 00:49:52,450 But there's lots of spread around this. 
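As a quick check of where that 50% comes from, here is a minimal Bayes' rule sketch using the numbers as stated in the question (a 1-in-100 base rate and 99% accuracy in both directions):

```python
# Quick check of the 50% answer to the HIV-test question.

base_rate = 0.01     # P(HIV) -- the base rate the intuitive answer neglects
sensitivity = 0.99   # P(test positive | HIV)
specificity = 0.99   # P(test negative | no HIV)

p_positive = sensitivity * base_rate + (1 - specificity) * (1 - base_rate)
p_hiv_given_positive = sensitivity * base_rate / p_positive
print(round(p_hiv_given_positive, 3))  # 0.5

# Equivalent counting argument: out of 10,000 people, about 100 have HIV and
# ~99 of them test positive; of the 9,900 without HIV, ~99 also test positive.
# So a positive test is roughly a coin flip between the two groups.
```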
1241 00:49:52,450 --> 00:49:54,730 And the fraction of people who get it right 1242 00:49:54,730 --> 00:49:58,420 is maybe 50% or even lower than that. 1243 00:49:58,420 --> 00:50:00,580 So now in some sense, you can look at this 1244 00:50:00,580 --> 00:50:03,250 and see that the answer is 50%. 1245 00:50:03,250 --> 00:50:05,710 The key part is not whether this is a specific problem 1246 00:50:05,710 --> 00:50:07,220 that some people maybe misunderstood 1247 00:50:07,220 --> 00:50:08,317 and so on and so forth. 1248 00:50:08,317 --> 00:50:10,900 The point here is that this is actually a fairly basic problem 1249 00:50:10,900 --> 00:50:14,260 in terms of probability theory. 1250 00:50:14,260 --> 00:50:16,000 This is a very simple problem to solve 1251 00:50:16,000 --> 00:50:19,000 in the sense that if you think about real problems you need 1252 00:50:19,000 --> 00:50:21,400 to solve in the world, the world is way more complicated 1253 00:50:21,400 --> 00:50:22,580 than that. 1254 00:50:22,580 --> 00:50:25,300 However, even among MIT students, 1255 00:50:25,300 --> 00:50:29,050 something like half of the students, 1256 00:50:29,050 --> 00:50:31,752 when given several minutes at least to think about this-- 1257 00:50:31,752 --> 00:50:33,460 and I don't know how much time you took-- 1258 00:50:33,460 --> 00:50:35,710 but when given quite a bit of time to think about it-- 1259 00:50:35,710 --> 00:50:36,992 get the answer wrong. 1260 00:50:36,992 --> 00:50:38,950 That means essentially, in many other decisions 1261 00:50:38,950 --> 00:50:40,630 you have to act much faster, people 1262 00:50:40,630 --> 00:50:42,640 don't think about these things that much, people 1263 00:50:42,640 --> 00:50:44,110 have not taken statistics. 1264 00:50:44,110 --> 00:50:45,880 People have not taken math, and so on. 1265 00:50:48,640 --> 00:50:50,230 And so the fraction of people who 1266 00:50:50,230 --> 00:50:52,210 would get these very basic problems wrong 1267 00:50:52,210 --> 00:50:55,360 is probably much higher, let alone 1268 00:50:55,360 --> 00:50:58,270 much more complicated problems. 1269 00:50:58,270 --> 00:51:02,490 So broadly speaking, these problems are hard. 1270 00:51:02,490 --> 00:51:06,000 Often when we have cognitively demanding tasks, 1271 00:51:06,000 --> 00:51:08,790 we tend to use sort of quick intuitive shortcuts. 1272 00:51:08,790 --> 00:51:12,060 In some sense, think about it-- the problems are really hard. 1273 00:51:12,060 --> 00:51:14,820 We try to simplify and make it easier for us. 1274 00:51:14,820 --> 00:51:18,600 We give intuitive answers to those types of questions. 1275 00:51:18,600 --> 00:51:20,850 These intuitive answers are actually often not bad. 1276 00:51:20,850 --> 00:51:23,260 They're actually pretty close in many situations. 1277 00:51:23,260 --> 00:51:27,240 But often they systematically lead to incorrect answers. 1278 00:51:27,240 --> 00:51:29,537 And essentially then, there's a whole field 1279 00:51:29,537 --> 00:51:31,620 of essentially cognitive psychology and so on that 1280 00:51:31,620 --> 00:51:35,555 tries to think about what kind of biases arise.
1281 00:51:35,555 --> 00:51:38,670 And in particular, Kahneman and Tversky, who-- 1282 00:51:38,670 --> 00:51:41,070 Danny Kahneman, a psychologist who 1283 00:51:41,070 --> 00:51:42,690 got the Nobel Prize in economics, 1284 00:51:42,690 --> 00:51:44,640 with the experiments-- 1285 00:51:44,640 --> 00:51:47,010 and you may have read Thinking Fast and Slow-- 1286 00:51:47,010 --> 00:51:48,480 with their experiments, essentially 1287 00:51:48,480 --> 00:51:51,480 demonstrated a large number of those kinds of anomalies 1288 00:51:51,480 --> 00:51:54,870 and biases in people's cognition, 1289 00:51:54,870 --> 00:51:56,940 if you want, in terms of heuristics 1290 00:51:56,940 --> 00:52:00,970 that they use for these kinds of probability and other problems. 1291 00:52:00,970 --> 00:52:02,337 And these are systematic biases. 1292 00:52:02,337 --> 00:52:04,170 These are not so subtle random mistakes that 1293 00:52:04,170 --> 00:52:05,910 are added to people's choices. 1294 00:52:05,910 --> 00:52:08,400 People systematically in systematic ways 1295 00:52:08,400 --> 00:52:14,790 deviate from optimal choices or correct calculations. 1296 00:52:14,790 --> 00:52:17,190 And the question is, can we understand 1297 00:52:17,190 --> 00:52:19,230 these systematic biases better and then 1298 00:52:19,230 --> 00:52:21,930 try to explain behavior systematically? 1299 00:52:21,930 --> 00:52:24,107 And if we understood these mistakes better, 1300 00:52:24,107 --> 00:52:26,190 then we could sort of potentially fix the mistakes 1301 00:52:26,190 --> 00:52:28,500 and improve people's decision making. 1302 00:52:32,950 --> 00:52:37,270 There's another question that worked reasonably well-- 1303 00:52:37,270 --> 00:52:41,030 but MIT is a little bit of special population-- 1304 00:52:41,030 --> 00:52:42,970 which is the question of overconfidence. 1305 00:52:42,970 --> 00:52:45,400 And it's a little bit hard to elicit with a simple survey. 1306 00:52:45,400 --> 00:52:48,010 So what we are asking you about, what's the probability of you 1307 00:52:48,010 --> 00:52:51,790 yourself would earn a lot of money, 1308 00:52:51,790 --> 00:52:56,080 versus what's the probability of what students in your class 1309 00:52:56,080 --> 00:52:57,400 would make this type of money? 1310 00:52:57,400 --> 00:52:59,790 Now if you think about these two questions, at least 1311 00:52:59,790 --> 00:53:01,540 under some assumptions, they should add up 1312 00:53:01,540 --> 00:53:03,680 to the same number. 1313 00:53:03,680 --> 00:53:06,860 So essentially if you aggregate over everybody, 1314 00:53:06,860 --> 00:53:08,590 if the people's expectations are correct, 1315 00:53:08,590 --> 00:53:10,120 they should add up to the same number, at least 1316 00:53:10,120 --> 00:53:11,350 under some assumptions. 1317 00:53:11,350 --> 00:53:13,690 Usually they do not. 1318 00:53:13,690 --> 00:53:16,000 The difference is actually relatively small. 1319 00:53:16,000 --> 00:53:19,030 MIT is a little bit of a special population in a sense. 1320 00:53:19,030 --> 00:53:21,025 If you look at like surveys across the world, 1321 00:53:21,025 --> 00:53:24,490 for example, if you think about driving behavior in the world, 1322 00:53:24,490 --> 00:53:26,620 something like 90%, 95% of drivers 1323 00:53:26,620 --> 00:53:28,930 think that better than the median driver. 1324 00:53:28,930 --> 00:53:32,680 Lots of people think they're smarter than the median person 1325 00:53:32,680 --> 00:53:33,730 and so on. 
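To see why figures like that signal overconfidence, note that by definition at most half of any group can be above its median. A small simulation sketch follows; the simulated skill values and the 93% survey figure are made up purely for illustration, not actual survey data.

```python
# Illustration: the gap between "share who claim to be above the median"
# and the share who can actually be above it. All numbers are simulated.

import random
import statistics

random.seed(0)
skill = [random.gauss(0, 1) for _ in range(10_000)]   # hypothetical driving skill
median_skill = statistics.median(skill)

share_above_median = sum(s > median_skill for s in skill) / len(skill)
share_claiming_above = 0.93   # roughly what such driver surveys are said to report

print(f"Share actually above the median: {share_above_median:.2f}")  # ~0.50
print(f"Share claiming to be above it:   {share_claiming_above:.2f}")
# The 40+ percentage-point gap is the overconfidence being described.
```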
1326 00:53:33,730 --> 00:53:36,430 There's lots of evidence of overconfidence in the world. 1327 00:53:36,430 --> 00:53:39,370 So MIT to some degree actually has an opposite issue, 1328 00:53:39,370 --> 00:53:41,320 which is underconfidence. 1329 00:53:41,320 --> 00:53:44,757 When you look at student surveys at MIT, 1330 00:53:44,757 --> 00:53:46,840 there's quite a bit of evidence that essentially-- 1331 00:53:46,840 --> 00:53:48,730 and I'll show you this, in fact, in class-- 1332 00:53:48,730 --> 00:53:50,980 that the typical MIT student thinks 1333 00:53:50,980 --> 00:53:53,110 they're worse than the average MIT student. 1334 00:53:53,110 --> 00:53:53,650 So-- 1335 00:53:53,650 --> 00:53:55,080 [LAUGHTER] 1336 00:53:55,080 --> 00:53:57,340 --there seems to be some overconfidence here. 1337 00:53:57,340 --> 00:54:01,150 But it's, in fact, not particularly strong. 1338 00:54:01,150 --> 00:54:03,700 So there you can see there's some over- and underconfidence. 1339 00:54:03,700 --> 00:54:05,980 Perhaps there's a little bit of overconfidence here. 1340 00:54:05,980 --> 00:54:08,050 But in fact, it's fairly weak. 1341 00:54:08,050 --> 00:54:09,782 But overall as a pattern in the world, 1342 00:54:09,782 --> 00:54:11,740 there seems to be quite a bit of overconfidence 1343 00:54:11,740 --> 00:54:13,165 in many situations. 1344 00:54:13,165 --> 00:54:15,040 And then the question is, does overconfidence 1345 00:54:15,040 --> 00:54:16,840 lead to bad outcomes? 1346 00:54:16,840 --> 00:54:19,012 If you're overconfident about certain situations, 1347 00:54:19,012 --> 00:54:21,220 then are you going to fail a lot because you do stuff 1348 00:54:21,220 --> 00:54:24,520 that is too difficult for you? 1349 00:54:24,520 --> 00:54:26,710 If you're underconfident, you might not 1350 00:54:26,710 --> 00:54:28,570 engage in certain behaviors and not 1351 00:54:28,570 --> 00:54:30,740 try things that would be good for you. 1352 00:54:30,740 --> 00:54:33,850 In fact, Maddie, who is currently traveling-- 1353 00:54:33,850 --> 00:54:34,835 one of your TAs-- 1354 00:54:34,835 --> 00:54:36,460 has some very interesting work in India 1355 00:54:36,460 --> 00:54:39,430 about women being underconfident and thus 1356 00:54:39,430 --> 00:54:41,920 not trying to convince the household 1357 00:54:41,920 --> 00:54:44,807 members or their husbands and so on to try and work 1358 00:54:44,807 --> 00:54:45,640 in the labor market. 1359 00:54:45,640 --> 00:54:48,730 And that leads to lower labor supply, earnings, 1360 00:54:48,730 --> 00:54:50,000 and so on and so forth. 1361 00:54:50,000 --> 00:54:51,700 So if you are underconfident, you 1362 00:54:51,700 --> 00:54:54,967 might not try certain things that might be good for you. 1363 00:54:54,967 --> 00:54:56,800 And then the problem is if you never try it, 1364 00:54:56,800 --> 00:54:58,450 you might actually never learn about it. 1365 00:54:58,450 --> 00:54:59,867 If you never even try it yourself, 1366 00:54:59,867 --> 00:55:02,260 then you might never even fail or learn 1367 00:55:02,260 --> 00:55:04,160 that you might be actually able to do that. 1368 00:55:04,160 --> 00:55:07,300 So there are these potential traps of underconfidence-- 1369 00:55:07,300 --> 00:55:09,520 or if you're overconfident, you might just 1370 00:55:09,520 --> 00:55:15,050 fail because you try things that are actually not good for you.
1371 00:55:15,050 --> 00:55:18,850 And then there's other issues about motivated beliefs 1372 00:55:18,850 --> 00:55:21,220 that I discussed this already last time, which 1373 00:55:21,220 --> 00:55:24,340 is the idea that you might be overconfident, not because you 1374 00:55:24,340 --> 00:55:27,400 actually necessarily believe or deeply 1375 00:55:27,400 --> 00:55:31,490 believe what you like to believe, 1376 00:55:31,490 --> 00:55:33,760 but in fact, because it just makes you feel good 1377 00:55:33,760 --> 00:55:35,560 about the world or about yourself. 1378 00:55:35,560 --> 00:55:38,920 So I might not actually-- if you really ask me I'm a better 1379 00:55:38,920 --> 00:55:41,440 driver than the median driver, I might not actually 1380 00:55:41,440 --> 00:55:42,772 necessarily-- 1381 00:55:42,772 --> 00:55:44,730 if you make me think about it, I might actually 1382 00:55:44,730 --> 00:55:46,450 agree that that's not the case. 1383 00:55:46,450 --> 00:55:48,450 But I like to think about I do things well. 1384 00:55:48,450 --> 00:55:50,033 So in some sense, I might just say yes 1385 00:55:50,033 --> 00:55:51,850 because it makes me feel good about myself. 1386 00:55:51,850 --> 00:55:53,950 So essentially overconfidence could 1387 00:55:53,950 --> 00:55:56,650 be driven by motivated beliefs. 1388 00:55:56,650 --> 00:55:59,230 And that's another topic that we're going to try and study. 1389 00:56:04,210 --> 00:56:09,640 So now, as I said, there are preferences and beliefs 1390 00:56:09,640 --> 00:56:10,410 that we study. 1391 00:56:10,410 --> 00:56:12,610 So in some sense, preferences about what you want, 1392 00:56:12,610 --> 00:56:14,950 beliefs about what you think the world is like. 1393 00:56:14,950 --> 00:56:16,540 Now putting this together, as I said, 1394 00:56:16,540 --> 00:56:19,293 people's choices should be entirely determined. 1395 00:56:19,293 --> 00:56:20,710 But that tends to be not the case. 1396 00:56:20,710 --> 00:56:24,700 And one classic survey that's a survey question or a version 1397 00:56:24,700 --> 00:56:27,070 of a survey question that Richard Thaler-- 1398 00:56:27,070 --> 00:56:30,220 who won the Nobel Prize recently in economics-- 1399 00:56:30,220 --> 00:56:34,210 has asked a bunch of people, which is the following. 1400 00:56:34,210 --> 00:56:36,190 The question is, imagine that you're 1401 00:56:36,190 --> 00:56:38,860 about to purchase an iPad for $500. 1402 00:56:38,860 --> 00:56:42,130 The salesman tells you that you can get the exact same-- 1403 00:56:42,130 --> 00:56:46,030 good in a nearby location for $15 off. 1404 00:56:46,030 --> 00:56:48,520 You would need to walk for 30 minutes in total. 1405 00:56:48,520 --> 00:56:51,910 Would you go to the other store? 1406 00:56:51,910 --> 00:56:54,468 So you look at this question. 1407 00:56:54,468 --> 00:56:56,260 If an economist looks at this, the question 1408 00:56:56,260 --> 00:56:58,180 is, there's some cost of walking. 1409 00:56:58,180 --> 00:56:59,170 It's tedious to walk. 1410 00:56:59,170 --> 00:57:01,550 Maybe it's raining, maybe not. 1411 00:57:01,550 --> 00:57:03,310 And then you have some value of time. 1412 00:57:03,310 --> 00:57:08,140 Either your value of time for 30 minutes is below or above $15. 1413 00:57:08,140 --> 00:57:10,398 If the answer is your value of time is higher, 1414 00:57:10,398 --> 00:57:11,440 you're not going to walk. 1415 00:57:11,440 --> 00:57:13,630 If your answer is the value of time is lower, 1416 00:57:13,630 --> 00:57:14,950 you're going to walk. 
1417 00:57:14,950 --> 00:57:15,550 Case closed. 1418 00:57:15,550 --> 00:57:16,210 You're done. 1419 00:57:16,210 --> 00:57:17,590 And you know your preferences. 1420 00:57:17,590 --> 00:57:20,320 And you have all the information you need. 1421 00:57:20,320 --> 00:57:22,160 Now I'm asking you a related question, 1422 00:57:22,160 --> 00:57:24,470 which is in fact, the exact same questions, 1423 00:57:24,470 --> 00:57:26,140 if you think about it, except for what's 1424 00:57:26,140 --> 00:57:30,350 different is it's about an iPad versus an iPad case. 1425 00:57:30,350 --> 00:57:32,000 I'm asking you the exact same question, 1426 00:57:32,000 --> 00:57:36,040 which is the question about would you walk for half an hour 1427 00:57:36,040 --> 00:57:37,700 for $15? 1428 00:57:37,700 --> 00:57:41,140 So again either your value of time 1429 00:57:41,140 --> 00:57:43,750 is above or below $15 for 30 minutes. 1430 00:57:43,750 --> 00:57:46,450 If the answer is yes, you should say-- 1431 00:57:46,450 --> 00:57:48,010 if your value of time is higher, you 1432 00:57:48,010 --> 00:57:49,720 should say no to both questions. 1433 00:57:49,720 --> 00:57:51,520 If the value of time is lower, you 1434 00:57:51,520 --> 00:57:53,650 should say yes to both questions. 1435 00:57:53,650 --> 00:57:56,845 But notice here that questions are exactly the same. 1436 00:57:56,845 --> 00:57:58,720 The only thing that's changing is essentially 1437 00:57:58,720 --> 00:58:01,070 the value of the item. 1438 00:58:01,070 --> 00:58:03,370 Now why might people still say different things? 1439 00:58:03,370 --> 00:58:08,010 Or what's potentially affecting behavior here? 1440 00:58:08,010 --> 00:58:09,360 Yes? 1441 00:58:09,360 --> 00:58:11,980 AUDIENCE: [INAUDIBLE]. 1442 00:58:11,980 --> 00:58:13,230 FRANK SCHILBACH: Yes, exactly. 1443 00:58:13,230 --> 00:58:16,620 So we're thinking in relative terms, which 1444 00:58:16,620 --> 00:58:22,020 is to say the $15 seems pretty small when it compares to $500. 1445 00:58:22,020 --> 00:58:23,850 That's to say, $500 is a lot of money. 1446 00:58:23,850 --> 00:58:25,740 So if you spend $500, you might as well-- 1447 00:58:25,740 --> 00:58:29,040 whatever, $15 more or less, it doesn't really matter. 1448 00:58:29,040 --> 00:58:31,867 When you spend $30, $15 is like half of that. 1449 00:58:31,867 --> 00:58:32,700 It's a lot of money. 1450 00:58:32,700 --> 00:58:33,900 So you get like $15 off. 1451 00:58:33,900 --> 00:58:38,510 That would be a really great deal to get. 1452 00:58:38,510 --> 00:58:40,400 But when you think about now going back-- 1453 00:58:40,400 --> 00:58:42,620 and this is I guess when you go back to recitation, 1454 00:58:42,620 --> 00:58:44,390 look at utility maximization-- 1455 00:58:44,390 --> 00:58:47,300 there's no place for these relative considerations. 1456 00:58:47,300 --> 00:58:48,980 When you maximize utility, it would 1457 00:58:48,980 --> 00:58:53,960 be like the utility of this potentially consumption items 1458 00:58:53,960 --> 00:58:54,510 and so on. 1459 00:58:54,510 --> 00:58:56,000 These are all in absolute terms. 1460 00:58:56,000 --> 00:58:59,120 And either money makes you happy or not or walking 1461 00:58:59,120 --> 00:59:00,710 makes you unhappy. 1462 00:59:00,710 --> 00:59:03,818 There's no relative comparison here overall. 1463 00:59:03,818 --> 00:59:05,360 Yet what seems to be really important 1464 00:59:05,360 --> 00:59:09,800 is all these relative concerns. 1465 00:59:09,800 --> 00:59:11,258 And then you look at this question. 
1466 00:59:11,258 --> 00:59:13,008 This is I think one of the question that's 1467 00:59:13,008 --> 00:59:15,470 the most robust I've found in any of the surveys I did. 1468 00:59:15,470 --> 00:59:17,720 This always works, essentially. 1469 00:59:17,720 --> 00:59:20,300 People are much more likely to-- or say, at least, 1470 00:59:20,300 --> 00:59:22,850 they're much more likely to walk for an iPad case 1471 00:59:22,850 --> 00:59:24,440 than they are for an actual iPad. 1472 00:59:24,440 --> 00:59:27,470 There's various reasons why you might say, in this case, 1473 00:59:27,470 --> 00:59:31,700 maybe if you have an iPad, you buy an iPad for $300-- 1474 00:59:31,700 --> 00:59:33,920 whatever I said-- you might have a lot more money. 1475 00:59:33,920 --> 00:59:35,940 And so you're richer and so on and so forth. 1476 00:59:35,940 --> 00:59:37,670 And maybe that can explain the results. 1477 00:59:37,670 --> 00:59:39,800 There's 10 different versions of these question 1478 00:59:39,800 --> 00:59:41,820 that all have different flavors. 1479 00:59:41,820 --> 00:59:44,450 And essentially this result is extremely robust. 1480 00:59:44,450 --> 00:59:47,510 People tend to think in relative terms and not in absolute terms 1481 00:59:47,510 --> 00:59:49,130 when making these kinds of choices. 1482 00:59:49,130 --> 00:59:51,950 And their choices are extremely malleable to those kinds 1483 00:59:51,950 --> 00:59:54,860 of comparisons. 1484 00:59:54,860 --> 00:59:56,750 Any questions about that? 1485 01:00:00,250 --> 01:00:00,750 Yeah 1486 01:00:00,750 --> 01:00:02,458 AUDIENCE: So how about joint evaluations? 1487 01:00:02,458 --> 01:00:08,635 So if you were to ask people that you would buy this iPad, 1488 01:00:08,635 --> 01:00:11,800 you get the $15 off and also for the same iPad case, 1489 01:00:11,800 --> 01:00:13,110 you get $15 off. 1490 01:00:13,110 --> 01:00:17,200 And if you go there and buy both, you get $30 off. 1491 01:00:17,200 --> 01:00:21,410 So do you think the same behavior to walk would 1492 01:00:21,410 --> 01:00:22,438 [INAUDIBLE]? 1493 01:00:22,438 --> 01:00:24,480 FRANK SCHILBACH: I think what people in some ways 1494 01:00:24,480 --> 01:00:27,030 do in their heads, they do essentially what fraction 1495 01:00:27,030 --> 01:00:29,140 do you get off in some ways? 1496 01:00:29,140 --> 01:00:32,387 And if you then bought both, I guess that would be-- 1497 01:00:32,387 --> 01:00:34,470 I don't know what the prices where-- it would be I 1498 01:00:34,470 --> 01:00:35,520 guess $530. 1499 01:00:35,520 --> 01:00:37,290 And then you get-- 1500 01:00:37,290 --> 01:00:39,510 I think it would probably be somewhere in between. 1501 01:00:39,510 --> 01:00:43,260 The behavior would probably be quite similar to the iPad thing 1502 01:00:43,260 --> 01:00:44,730 overall because you would have-- 1503 01:00:44,730 --> 01:00:46,688 I think what people do in their mind is to say, 1504 01:00:46,688 --> 01:00:50,880 I'm spending already x dollars-- in case $500 or $530-- 1505 01:00:50,880 --> 01:00:54,390 now does the $15 look large or small compared to that? 1506 01:00:54,390 --> 01:00:56,620 $15 looks really large compared to $30. 1507 01:00:56,620 --> 01:00:57,600 So let's just walk. 1508 01:00:57,600 --> 01:01:00,420 And I might as well do it and get a really good deal. 1509 01:01:00,420 --> 01:01:02,100 I get 50% off. 1510 01:01:02,100 --> 01:01:06,088 When you have $500 or $530, the $15 seems really small. 1511 01:01:06,088 --> 01:01:07,380 That's not really a great deal. 
1512 01:01:07,380 --> 01:01:08,550 So why bother doing it? 1513 01:01:08,550 --> 01:01:10,620 Let's just spend it all anyway. 1514 01:01:10,620 --> 01:01:15,392 But of course, the issue is $15, and you're 1515 01:01:15,392 --> 01:01:17,100 going to use that $15 for something else, 1516 01:01:17,100 --> 01:01:18,810 presumably not just iPads. 1517 01:01:18,810 --> 01:01:21,967 And then you should give essentially the same answer. 1518 01:01:21,967 --> 01:01:23,550 There's some very interesting evidence 1519 01:01:23,550 --> 01:01:27,450 I'm going to show you at the very end of the class, which is 1520 01:01:27,450 --> 01:01:30,360 about poor versus rich people. 1521 01:01:30,360 --> 01:01:34,320 So there's some evidence that [INAUDIBLE] and others 1522 01:01:34,320 --> 01:01:38,520 have done that do exactly these kinds of relative thinking 1523 01:01:38,520 --> 01:01:41,850 comparisons with rich people and with poor people. 1524 01:01:41,850 --> 01:01:44,280 And what you find there overall is 1525 01:01:44,280 --> 01:01:46,860 that rich people do this a lot, poor people 1526 01:01:46,860 --> 01:01:48,600 actually not so much. 1527 01:01:48,600 --> 01:01:50,396 And why is that? 1528 01:01:50,396 --> 01:01:51,382 AUDIENCE: [INAUDIBLE]? 1529 01:01:58,877 --> 01:01:59,710 FRANK SCHILBACH: No. 1530 01:01:59,710 --> 01:02:01,598 It's for the same amount. 1531 01:02:01,598 --> 01:02:02,890 They keep the amounts the same. 1532 01:02:02,890 --> 01:02:05,645 AUDIENCE: They value it a lot more than rich people. 1533 01:02:05,645 --> 01:02:07,270 FRANK SCHILBACH: So they value it more. 1534 01:02:07,270 --> 01:02:09,190 Yes, that's right. 1535 01:02:09,190 --> 01:02:10,063 Yeah? 1536 01:02:10,063 --> 01:02:11,980 AUDIENCE: Because rich people have more money, 1537 01:02:11,980 --> 01:02:15,700 they can more afford to spend less time thinking 1538 01:02:15,700 --> 01:02:18,103 about the $15 and therefore, might 1539 01:02:18,103 --> 01:02:20,370 make more of a snap judgment [INAUDIBLE] 1540 01:02:20,370 --> 01:02:23,522 poor people might deliberate about it for a minute. 1541 01:02:23,522 --> 01:02:24,480 FRANK SCHILBACH: Right. 1542 01:02:24,480 --> 01:02:24,980 Exactly. 1543 01:02:24,980 --> 01:02:26,147 So in some sense-- 1544 01:02:26,147 --> 01:02:27,730 so there's two different explanations. 1545 01:02:27,730 --> 01:02:33,060 One is in a way, like a classical explanation, 1546 01:02:33,060 --> 01:02:35,100 is to say there's certain amounts of money 1547 01:02:35,100 --> 01:02:37,230 that for the rich, it just doesn't matter. 1548 01:02:37,230 --> 01:02:40,740 If you have a million dollars, why walk anyway and whatever? 1549 01:02:40,740 --> 01:02:42,120 I just don't pay attention. 1550 01:02:42,120 --> 01:02:44,130 And in some sense, I make them certain mistakes. 1551 01:02:44,130 --> 01:02:46,342 But the mistakes that really not very costly. 1552 01:02:46,342 --> 01:02:47,800 It's just that I don't really care. 1553 01:02:47,800 --> 01:02:48,870 I might make mistakes. 1554 01:02:48,870 --> 01:02:52,680 But it's really not an issue because I'm rich anyway. 1555 01:02:52,680 --> 01:02:56,310 There's another view, and that is to say, for the poor, 1556 01:02:56,310 --> 01:02:59,640 at a daily basis, they evaluate something like $15-- 1557 01:02:59,640 --> 01:03:01,440 for the poor, they know what $15 is worth. 1558 01:03:01,440 --> 01:03:04,770 For them, it's not a comparison to $500 or whatever. 1559 01:03:04,770 --> 01:03:07,045 $15 is a meal for your family. 
1560 01:03:07,045 --> 01:03:09,420 It's about the difference between your child being hungry 1561 01:03:09,420 --> 01:03:13,990 versus getting food and so on. 1562 01:03:13,990 --> 01:03:16,290 So then for them, $1 is $1. 1563 01:03:16,290 --> 01:03:18,390 And they're less affected by these framing 1564 01:03:18,390 --> 01:03:21,400 effects and other manipulations. 1565 01:03:21,400 --> 01:03:24,263 But that's some of the most interesting evidence in poverty 1566 01:03:24,263 --> 01:03:25,680 research, where people essentially 1567 01:03:25,680 --> 01:03:28,800 try to understand how people's monetary choices-- 1568 01:03:28,800 --> 01:03:32,370 or how they are affected by these framings. 1569 01:03:32,370 --> 01:03:35,620 And the view overall is that the poor might, in fact, 1570 01:03:35,620 --> 01:03:42,630 be more rational in some sense, in the sense of less swayed 1571 01:03:42,630 --> 01:03:44,790 by framing and other factors that 1572 01:03:44,790 --> 01:03:51,450 might lead to inconsistent choices, 1573 01:03:51,450 --> 01:03:54,390 while the rich do that a lot more. 1574 01:03:54,390 --> 01:03:56,160 And the downside of that seems to be 1575 01:03:56,160 --> 01:03:58,500 that, thinking a lot about money, the poor then spend 1576 01:03:58,500 --> 01:04:00,930 a lot of cognitive energy on thinking 1577 01:04:00,930 --> 01:04:02,520 about these monetary issues. 1578 01:04:02,520 --> 01:04:05,070 And that might lead you to make worse decisions 1579 01:04:05,070 --> 01:04:06,720 in other domains of life. 1580 01:04:06,720 --> 01:04:08,790 You might have just less cognitive function 1581 01:04:08,790 --> 01:04:11,325 or cognitive energy, if you want, available-- 1582 01:04:11,325 --> 01:04:15,270 or resources available for other potentially important choices 1583 01:04:15,270 --> 01:04:16,600 in life. 1584 01:04:16,600 --> 01:04:19,140 And that's what we talk about in the poverty and development 1585 01:04:19,140 --> 01:04:21,640 section. 1586 01:04:21,640 --> 01:04:25,460 Then there was another choice that we 1587 01:04:25,460 --> 01:04:27,140 did that usually tends to work. 1588 01:04:27,140 --> 01:04:29,150 And this is a very interesting line 1589 01:04:29,150 --> 01:04:31,250 of research by Dan Ariely and others, 1590 01:04:31,250 --> 01:04:32,960 which essentially is about anchoring. 1591 01:04:32,960 --> 01:04:34,640 In some sense, looking at this ex post, 1592 01:04:34,640 --> 01:04:37,010 I think the questions were not quite right. 1593 01:04:37,010 --> 01:04:39,270 So the idea of anchoring is to say, 1594 01:04:39,270 --> 01:04:41,300 if I set you a very arbitrary anchor 1595 01:04:41,300 --> 01:04:45,600 but let you think about certain numbers-- in this case, 1596 01:04:45,600 --> 01:04:50,122 it was the sum of people's phone numbers-- 1597 01:04:50,122 --> 01:04:52,580 then essentially when you have a high sum of phone numbers, 1598 01:04:52,580 --> 01:04:53,580 you're willing to pay more. 1599 01:04:53,580 --> 01:04:55,850 If you have a low sum of phone numbers, 1600 01:04:55,850 --> 01:04:57,197 you're willing to pay less.
1601 01:04:57,197 --> 01:04:59,030 There's a bunch of experiments actually done 1602 01:04:59,030 --> 01:05:02,420 at MIT-- with MBA students, mostly, I have to say-- 1603 01:05:02,420 --> 01:05:04,670 by Dan Ariely, who 1604 01:05:04,670 --> 01:05:06,200 used to be a faculty member at MIT. 1605 01:05:06,200 --> 01:05:08,030 And a bunch of these kinds of experiments 1606 01:05:08,030 --> 01:05:10,280 show essentially that you can anchor people fairly 1607 01:05:10,280 --> 01:05:11,545 arbitrarily. 1608 01:05:11,545 --> 01:05:13,670 And if you make them think about the last two 1609 01:05:13,670 --> 01:05:15,680 digits of their social security number, 1610 01:05:15,680 --> 01:05:17,510 if that social security number is high, 1611 01:05:17,510 --> 01:05:19,070 they're willing to pay more for wine 1612 01:05:19,070 --> 01:05:21,380 than if the social security number is low. 1613 01:05:21,380 --> 01:05:24,170 And this is also a real choice that tends to be quite robust. 1614 01:05:24,170 --> 01:05:26,840 One problem with the experiment that we're doing here 1615 01:05:26,840 --> 01:05:28,610 is that the effect didn't seem to show up-- 1616 01:05:28,610 --> 01:05:31,970 in 2017 there was more anchoring. 1617 01:05:31,970 --> 01:05:35,027 Here, maybe it's upward sloping, maybe not. 1618 01:05:35,027 --> 01:05:36,860 One problem here is that the anchor-- if you 1619 01:05:36,860 --> 01:05:39,170 look at the values of the anchor that we were setting, 1620 01:05:39,170 --> 01:05:41,450 they're actually pretty low, essentially 1621 01:05:41,450 --> 01:05:43,370 between something like 10 and 30, 1622 01:05:43,370 --> 01:05:46,170 while people's valuations tend to be often a lot higher. 1623 01:05:46,170 --> 01:05:48,770 So maybe that's why it potentially didn't work. 1624 01:05:48,770 --> 01:05:51,170 It could be that you guys are less prone to anchoring. 1625 01:05:51,170 --> 01:05:53,123 But overall I think the bottom line 1626 01:05:53,123 --> 01:05:55,040 is there's quite a bit of evidence essentially 1627 01:05:55,040 --> 01:05:59,023 that when you anchor people, that affects their choices. 1628 01:05:59,023 --> 01:06:01,190 And again of course, it shouldn't affect your choice 1629 01:06:01,190 --> 01:06:01,440 at all. 1630 01:06:01,440 --> 01:06:03,857 When I make you think about your social security number, 1631 01:06:03,857 --> 01:06:06,262 a large or a small number, that should by no means 1632 01:06:06,262 --> 01:06:07,970 affect how much you're willing to pay 1633 01:06:07,970 --> 01:06:11,120 for a glass of water, a bottle of wine, or anything else. 1634 01:06:11,120 --> 01:06:14,840 Yet there's a relatively robust line of research 1635 01:06:14,840 --> 01:06:17,660 that shows that, in fact, that's the case. 1636 01:06:17,660 --> 01:06:21,500 And for example it seems like, at least in 2017, 1637 01:06:21,500 --> 01:06:25,610 that was in fact going on. 1638 01:06:25,610 --> 01:06:28,860 Finally, I want to turn to something briefly, 1639 01:06:28,860 --> 01:06:30,860 which is a quite interesting literature 1640 01:06:30,860 --> 01:06:33,020 in social psychology. 1641 01:06:33,020 --> 01:06:35,120 And it's this paper by Darley and Batson, which 1642 01:06:35,120 --> 01:06:41,630 is a classic study in psychology that studies helping behavior.
1643 01:06:41,630 --> 01:06:44,120 And broadly speaking, you can think of social psychology 1644 01:06:44,120 --> 01:06:46,970 perhaps as a way of thinking about how 1645 01:06:46,970 --> 01:06:50,480 social circumstances or your environments 1646 01:06:50,480 --> 01:06:52,940 affect your behavior or your preferences. 1647 01:06:52,940 --> 01:06:55,430 And that's a deviation from what economists think about. 1648 01:06:55,430 --> 01:06:57,888 Usually economists would say, you have certain preferences. 1649 01:06:57,888 --> 01:06:59,540 You either want to be nice or not. 1650 01:06:59,540 --> 01:07:01,640 Or you either want to eat certain things or not. 1651 01:07:01,640 --> 01:07:05,180 It shouldn't depend that much on your social environment. 1652 01:07:05,180 --> 01:07:07,100 And social psychology studies in detail 1653 01:07:07,100 --> 01:07:09,560 how your choices or behaviors-- in this case, 1654 01:07:09,560 --> 01:07:13,830 I guess helping behavior-- are affected by your environment. 1655 01:07:13,830 --> 01:07:17,970 And so what they did is they studied helping behavior. 1656 01:07:17,970 --> 01:07:20,030 These are Princeton theology students 1657 01:07:20,030 --> 01:07:21,890 on their way to a seminar. 1658 01:07:21,890 --> 01:07:23,967 And they pass an ostensibly injured man 1659 01:07:23,967 --> 01:07:25,800 slumped in a doorway, coughing and groaning. 1660 01:07:25,800 --> 01:07:27,870 This is somebody in the experiment. 1661 01:07:27,870 --> 01:07:30,380 So social psychology studies are very 1662 01:07:30,380 --> 01:07:33,350 rich in creating interesting environments. 1663 01:07:33,350 --> 01:07:34,970 So they had this man who essentially 1664 01:07:34,970 --> 01:07:37,640 is looking very injured. 1665 01:07:37,640 --> 01:07:40,100 And they were looking at determinants 1666 01:07:40,100 --> 01:07:41,070 of helping behavior. 1667 01:07:41,070 --> 01:07:44,060 So the question is, do these theology students 1668 01:07:44,060 --> 01:07:45,860 stop and help this man when they're 1669 01:07:45,860 --> 01:07:47,780 on the way to their seminar? 1670 01:07:47,780 --> 01:07:51,170 And they had three different manipulations. 1671 01:07:51,170 --> 01:07:54,800 I should say very clearly, this comes with a small-sample alert: 1672 01:07:54,800 --> 01:07:57,140 the study is very small. 1673 01:07:57,140 --> 01:07:59,330 So there's some question-- 1674 01:07:59,330 --> 01:08:01,010 does this replicate?-- which is often 1675 01:08:01,010 --> 01:08:05,180 the case for classic social psychology studies, 1676 01:08:05,180 --> 01:08:06,930 where sample sizes are relatively small. 1677 01:08:06,930 --> 01:08:08,660 So there are some questions about whether it replicates. 1678 01:08:08,660 --> 01:08:10,285 But I think the point, broadly speaking, 1679 01:08:10,285 --> 01:08:13,440 is a good one, which is why I'm telling you about this. 1680 01:08:13,440 --> 01:08:15,050 So what are the three manipulations? 1681 01:08:15,050 --> 01:08:15,720 Three things. 1682 01:08:15,720 --> 01:08:18,410 One was they had a lecture on the parable of the Good 1683 01:08:18,410 --> 01:08:20,359 Samaritan versus some other content. 1684 01:08:20,359 --> 01:08:22,505 What is the Good Samaritan parable? 1685 01:08:27,630 --> 01:08:28,130 Yes? 1686 01:08:28,130 --> 01:08:29,345 AUDIENCE: It's about helping someone in need 1687 01:08:29,345 --> 01:08:30,560 on the side of the road? 1688 01:08:30,560 --> 01:08:31,602 FRANK SCHILBACH: Exactly.
1689 01:08:31,602 --> 01:08:35,263 So it's about essentially this person who-- 1690 01:08:35,263 --> 01:08:36,680 long story short, there's a person 1691 01:08:36,680 --> 01:08:38,840 in need on the side of the road. 1692 01:08:38,840 --> 01:08:41,420 There are people who are from the same area, the priests 1693 01:08:41,420 --> 01:08:43,939 and so on-- nobody is helping because they're really busy. 1694 01:08:43,939 --> 01:08:46,710 And then the Samaritan comes, who is an outcast in society 1695 01:08:46,710 --> 01:08:47,300 and so on. 1696 01:08:47,300 --> 01:08:48,140 And he's helping. 1697 01:08:48,140 --> 01:08:50,870 And the point is you should help anybody 1698 01:08:50,870 --> 01:08:53,029 and so on, regardless of background 1699 01:08:53,029 --> 01:08:54,270 and so on and so forth. 1700 01:08:54,270 --> 01:08:56,210 So the whole story is essentially saying, 1701 01:08:56,210 --> 01:08:58,460 you should be helping somebody. 1702 01:08:58,460 --> 01:09:01,372 So there are these students now who have been essentially-- 1703 01:09:01,372 --> 01:09:03,080 they've been thinking about this parable. 1704 01:09:03,080 --> 01:09:04,910 They've been thinking about helping behavior. 1705 01:09:04,910 --> 01:09:07,430 They're supposed to give essentially a lecture 1706 01:09:07,430 --> 01:09:10,165 about the parable of the Good Samaritan. 1707 01:09:10,165 --> 01:09:11,540 What's in their mind is the thought that 1708 01:09:11,540 --> 01:09:13,080 you should be helping others. 1709 01:09:13,080 --> 01:09:16,430 So that mindset is set up in [INAUDIBLE], and then, in fact, 1710 01:09:16,430 --> 01:09:19,729 they get to see the practical version of that in life. 1711 01:09:19,729 --> 01:09:22,250 The question is, does that affect behavior? 1712 01:09:22,250 --> 01:09:25,880 The second variation is variation in time pressure: 1713 01:09:25,880 --> 01:09:28,430 are people in a hurry or not? 1714 01:09:28,430 --> 01:09:30,890 And the third one is personality measures, 1715 01:09:30,890 --> 01:09:32,390 in particular religiosity. 1716 01:09:32,390 --> 01:09:34,880 So people are more or less religious. 1717 01:09:34,880 --> 01:09:37,880 And you might think that that also affects or determines 1718 01:09:37,880 --> 01:09:41,779 how much people are helping. 1719 01:09:41,779 --> 01:09:44,540 So now you can look at which of those three things 1720 01:09:44,540 --> 01:09:45,800 matters for behavior. 1721 01:09:45,800 --> 01:09:50,290 Who thinks one is most important? 1722 01:09:50,290 --> 01:09:52,540 What about two? 1723 01:09:52,540 --> 01:09:54,271 Three? 1724 01:09:54,271 --> 01:09:57,970 I set this up, in some sense, maybe not the right way. 1725 01:09:57,970 --> 01:10:03,378 So it turns out two is, in fact, as you said, 1726 01:10:03,378 --> 01:10:04,420 the most important thing. 1727 01:10:04,420 --> 01:10:06,128 The hurry condition, essentially-- again, 1728 01:10:06,128 --> 01:10:07,000 it's a small sample. 1729 01:10:07,000 --> 01:10:09,070 But in no hurry, about 60% of people 1730 01:10:09,070 --> 01:10:11,470 stopped; in medium hurry, 45%. 1731 01:10:11,470 --> 01:10:14,080 Of people in a high hurry, essentially nobody stops. 1732 01:10:14,080 --> 01:10:15,745 The reason is 1733 01:10:15,745 --> 01:10:17,620 they're worried about getting to that seminar 1734 01:10:17,620 --> 01:10:19,328 about the Good Samaritan, and so they're not 1735 01:10:19,328 --> 01:10:21,718 particularly good Samaritans themselves.
1736 01:10:21,718 --> 01:10:24,010 It turns out the personality characteristics don't seem 1737 01:10:24,010 --> 01:10:25,520 to really predict behaviors. 1738 01:10:25,520 --> 01:10:30,100 And so what we learn from that is that situations 1739 01:10:30,100 --> 01:10:31,595 can matter a great deal. 1740 01:10:31,595 --> 01:10:33,700 Now in some sense that's very much consistent with what 1741 01:10:33,700 --> 01:10:36,340 I was saying to you earlier, about how social influences 1742 01:10:36,340 --> 01:10:40,506 affect behavior, in the sense that how much you 1743 01:10:40,506 --> 01:10:43,370 care about others depends 1744 01:10:43,370 --> 01:10:46,850 a lot on your circumstances-- whether you are 1745 01:10:46,850 --> 01:10:51,210 observed or not-- or on other malleable situations. 1746 01:10:51,210 --> 01:10:53,600 But I think, in a sense, here's a choice that's 1747 01:10:53,600 --> 01:10:55,340 exactly the same choice. 1748 01:10:55,340 --> 01:10:58,110 You might either be nice or not. 1749 01:10:58,110 --> 01:11:01,460 But in some sense, the value of time 1750 01:11:01,460 --> 01:11:03,440 seems to be quite important here for people. 1751 01:11:03,440 --> 01:11:07,340 So people's choices in some sense might actually be-- 1752 01:11:07,340 --> 01:11:09,715 it's not like you are a nice person versus not. 1753 01:11:09,715 --> 01:11:11,090 But in some situations, you might 1754 01:11:11,090 --> 01:11:15,060 behave very nicely and friendly and in others, not so much. 1755 01:11:15,060 --> 01:11:17,750 And that's a much more fundamental problem, 1756 01:11:17,750 --> 01:11:21,625 perhaps, for economics or for writing down utility functions, 1757 01:11:21,625 --> 01:11:23,000 because essentially, the issue is 1758 01:11:23,000 --> 01:11:26,060 that it's not even clear what your social preference 1759 01:11:26,060 --> 01:11:26,720 parameter is. 1760 01:11:26,720 --> 01:11:29,053 It's not even clear whether you are a nice person or not so much 1761 01:11:29,053 --> 01:11:32,510 a nice person, because it depends so much on your environment. 1762 01:11:32,510 --> 01:11:34,965 And people are shaped by their environments overall. 1763 01:11:34,965 --> 01:11:37,340 So one thing we're going to think about towards the end 1764 01:11:37,340 --> 01:11:41,210 of the class is to say, it's not just that 1765 01:11:41,210 --> 01:11:44,420 people's preferences or their beliefs systematically deviate 1766 01:11:44,420 --> 01:11:48,000 in ways that we can quantify from the classical model. 1767 01:11:48,000 --> 01:11:50,690 But in fact, it's the case that they're actually 1768 01:11:50,690 --> 01:11:53,630 very volatile and maybe even hard to elicit in some ways 1769 01:11:53,630 --> 01:11:55,880 because they're changing so much. 1770 01:11:55,880 --> 01:11:59,900 And in fact, people might not even necessarily know 1771 01:11:59,900 --> 01:12:00,943 their preferences. 1772 01:12:00,943 --> 01:12:03,110 Or their preferences are just malleable in the sense 1773 01:12:03,110 --> 01:12:08,600 that they depend a lot on circumstances and situations. 1774 01:12:08,600 --> 01:12:13,840 So then let me summarize quickly what we discussed.
1775 01:12:13,840 --> 01:12:15,950 So I think one way-- and this is what 1776 01:12:15,950 --> 01:12:19,310 I was saying earlier-- one way to think about this class is 1777 01:12:19,310 --> 01:12:22,490 that one goal of this class is really to get you to think 1778 01:12:22,490 --> 01:12:25,580 more and observe the world more through the lens 1779 01:12:25,580 --> 01:12:28,010 of economics, but also through the lens of psychology 1780 01:12:28,010 --> 01:12:29,240 and economics. 1781 01:12:29,240 --> 01:12:32,000 That's to say you can think about a lot of situations, how 1782 01:12:32,000 --> 01:12:34,280 people behave, and try to understand why they behave 1783 01:12:34,280 --> 01:12:35,840 in the ways they do. 1784 01:12:35,840 --> 01:12:38,630 And you often see certain puzzles or certain behaviors 1785 01:12:38,630 --> 01:12:39,950 that don't look quite right. 1786 01:12:39,950 --> 01:12:43,400 It seems like people are sort of making mistakes in some ways. 1787 01:12:43,400 --> 01:12:45,590 They seem to be sort of erratic in various ways. 1788 01:12:45,590 --> 01:12:47,690 But often you can actually put a lot of structure 1789 01:12:47,690 --> 01:12:48,350 on their behavior. 1790 01:12:48,350 --> 01:12:50,000 We can actually understand systematically, 1791 01:12:50,000 --> 01:12:51,542 through a certain puzzle or a certain 1792 01:12:51,542 --> 01:12:53,210 [INAUDIBLE], through certain theories 1793 01:12:53,210 --> 01:12:56,270 or certain topics that we discuss in class, what 1794 01:12:56,270 --> 01:12:59,430 people do and why they do that. 1795 01:12:59,430 --> 01:13:01,863 There are also similar issues about firms. 1796 01:13:01,863 --> 01:13:04,280 When you think about when you get credit card offers, when 1797 01:13:04,280 --> 01:13:07,940 you think about a sales item and so on, when you think about 1798 01:13:07,940 --> 01:13:10,160 pricing decisions that firms make, 1799 01:13:10,160 --> 01:13:12,500 a lot of these pricing or other decisions that firms make 1800 01:13:12,500 --> 01:13:14,690 actually involve thinking very carefully 1801 01:13:14,690 --> 01:13:16,460 about the psychology of consumers. 1802 01:13:16,460 --> 01:13:17,960 So essentially what firms are doing is 1803 01:13:17,960 --> 01:13:19,877 they're trying to think about potential-- what 1804 01:13:19,877 --> 01:13:22,850 they call behavioral biases or issues that consumers have. 1805 01:13:22,850 --> 01:13:25,100 And when you think about what you see in the world, 1806 01:13:25,100 --> 01:13:26,767 often that actually makes a lot of sense 1807 01:13:26,767 --> 01:13:28,670 through the lens of psychology and economics, 1808 01:13:28,670 --> 01:13:30,720 or behavioral economics. 1809 01:13:30,720 --> 01:13:32,810 So you can think about yourself or others 1810 01:13:32,810 --> 01:13:34,440 when you see certain products. 1811 01:13:34,440 --> 01:13:38,570 Think about how this relates to the content of the course 1812 01:13:38,570 --> 01:13:39,570 that we have. 1813 01:13:39,570 --> 01:13:42,347 And I think that's a general lesson.
1814 01:13:42,347 --> 01:13:44,180 So what we try to do in the course, in part, 1815 01:13:44,180 --> 01:13:47,720 is think about individual choices and some theories 1816 01:13:47,720 --> 01:13:50,690 in economics and psychology, but in particular think about how 1817 01:13:50,690 --> 01:13:53,510 these theories relate to real-world outcomes, 1818 01:13:53,510 --> 01:13:56,900 how we can perhaps apply them to situations in the world 1819 01:13:56,900 --> 01:14:00,410 and potentially improve choices. 1820 01:14:00,410 --> 01:14:02,630 So here's broadly where we are. 1821 01:14:02,630 --> 01:14:06,080 As I said, we're going to talk about preferences 1822 01:14:06,080 --> 01:14:10,040 for the next few weeks, in particular time preferences 1823 01:14:10,040 --> 01:14:11,120 and self-control. 1824 01:14:11,120 --> 01:14:14,150 As I said, a lot of that has to do with procrastination, 1825 01:14:14,150 --> 01:14:15,335 including for problem sets. 1826 01:14:15,335 --> 01:14:17,210 As I said, the problem set will probably 1827 01:14:17,210 --> 01:14:18,650 come out around Monday. 1828 01:14:21,170 --> 01:14:22,610 Here's the reading for next week. 1829 01:14:22,610 --> 01:14:24,170 This is on the course website. 1830 01:14:24,170 --> 01:14:28,520 Please read Frederick and O'Donoghue (2002), section 1, as well as 1831 01:14:28,520 --> 01:14:30,830 sections 4.1 to 5.1. 1832 01:14:30,830 --> 01:14:32,830 Thank you very much.