[SQUEAKING] [RUSTLING] [CLICKING]

FRANK SCHILBACH: All right, I'm going to get started. This is lecture 22 of 14.13, on happiness and mental health. OK, so what are we going to talk about today? Happiness and subjective well-being, and a little bit of mental health. So we're going to talk about rationality and revealed preferences, what we can learn from choices, and what that tells us about happiness. We'll also talk about utility and how we can measure utility-- in particular, how do we measure happiness, and is that a good or bad idea?

And then at the very end, we'll have, hopefully, a llama or a goat visit, which will at least improve my happiness, hopefully yours too. I've had some trouble communicating by email with the llama, goat, and their keepers, so I'm not exactly sure when they're going to show up. The problem is, they only had certain time slots. The only time slot that was available was at 2:30, so I asked them to show up earlier, 2:20 or so. I hope that's actually going to happen, but I haven't heard back, so I'm not exactly sure. It might have to be at 2:30. I hope you don't have to run. There was also some confusion about which Zoom room code the llama should go into. Hopefully, they find our room, even though they don't have an MIT ID. So anyway, that's that.

And then on Monday, we're going to talk about policy with behavioral agents, which is very much the idea that, well, if people are potentially making mistakes or have certain biases, and we think that we know about it, then perhaps we can design policies to improve their behaviors. And in some cases, that's straightforward. If you think people have biased beliefs, or just have wrong information based on which they make decisions, then you might say, well, we should just improve their information, and then they'll make better choices.
In other situations, it's much trickier, because, A, people might have certain preferences that are not stable over time, and then it's very tricky to figure out what the objective function is that we should maximize. Second, it might be tricky because, who knows what we know versus what people know? Who says that the government or anybody else actually knows better what's good for other people? So in some sense, if you want to intervene, you better be reasonably confident that what you're doing is actually well founded, and that you know better what's good for people than they know about themselves. And that gets you into tricky situations, because potentially, if you get it wrong, or if you just misunderstand people's preferences, or their choices, or behaviors, then you might intervene and make things actually worse, because you just misunderstood [AUDIO OUT].

OK. So what is rationality in classical economics? How do we [AUDIO OUT]? Broadly, we think that beliefs, preferences, and actions are rational if they are mutually consistent. What do we mean by that? We mean that if we see certain behavior or certain actions, we can measure people's beliefs in certain ways, and then we can essentially find preferences that could rationalize these actions. And that's a very odd definition. With that definition, we can essentially say it's possible to be a rational cocaine addict. It's possible to rationally commit suicide. It's possible to rationally marry somebody who you met six hours ago. And it's also possible to be a rational violent offender. Whatever preferences people have, you could write down some preferences that rationalize all of those things. And so this is very much a main assumption in mainstream economics. The researcher's job is to identify preferences that are consistent with observed human behavior.
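[Aside: a minimal sketch, not from the lecture, of what "identifying preferences consistent with observed behavior" can look like as a concrete check. The menu/choice data and option names below are invented for illustration; the code tests the Weak Axiom of Revealed Preference, i.e. that no option is ever chosen over another in one observation and passed over for it in another.]

```python
from itertools import combinations

# Hypothetical choice data: each observation is (menu offered, option chosen).
# The names echo the lecture's example; the data themselves are made up.
observations = [
    ({"use", "quit"}, "use"),            # Jack keeps using
    ({"use", "quit", "rehab"}, "use"),   # even when rehab is on the menu
    ({"quit", "rehab"}, "rehab"),
]

def revealed_preferred(x, y, obs):
    """True if x was ever chosen from a menu that also contained y."""
    return any(choice == x and y in menu for menu, choice in obs)

def satisfies_warp(obs):
    """No pair may be revealed preferred in both directions."""
    items = {i for menu, _ in obs for i in menu}
    return not any(
        revealed_preferred(x, y, obs) and revealed_preferred(y, x, obs)
        for x, y in combinations(items, 2)
    )

print(satisfies_warp(observations))
# True: these choices are rationalized by the preference use > rehab > quit,
# which is the sense in which a "rational addict" is possible.
```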
So when you see people behave in a certain way, the question is, what are the preferences that rationalize that behavior? Now, what's the idea? We talked about this already before. Actors make choices, economists observe their choices, and then ask which preferences would generate these choices if an actor were perfectly rational. Economists then tend to give these imputed preferences normative meaning. That's essentially: if I know your preferences, that tells me what's good for you and what you should be doing-- as in, if you prefer B, I should be helping you to do that. That would be a good thing for you.

Now, just to give you an example of where this could potentially lead you astray, or where you might argue about what is really rational versus not: take Jack, who prefers taking cocaine over quitting. Jack gives lots of speeches about wanting to quit, but he doesn't actually act on them-- that's just cheap talk. Jack might, in fact, be better off if he were clean, but getting clean is too costly, because there are withdrawal costs. Jack probably didn't plan on getting addicted when he first tried cocaine, but this bad outcome was sufficiently unlikely that his early experiments with it made sense. So finally, cocaine should be legalized, unless it generates externalities.

That's essentially the logic: people make choices. If Jack started taking it, it must have been that he preferred cocaine over not taking cocaine, even though there was a chance that he'd get really addicted and be really miserable now. And probably even if he keeps taking cocaine and keeps saying he wants to quit-- well, if he doesn't quit, then it must be a rational choice for him not to; it must be that the withdrawal costs are really high.

Now, we talked a little bit about this theory of rational addiction, which is by Becker and Murphy. We already talked about the fact that there may be present bias or other factors involved, including biased beliefs.
So that might not be the case. But [INAUDIBLE] understand that traditional economics, until very recently, would say that what Jack is doing here is rational-- that there is, in fact, no good reason to make certain drugs, or any drugs, illegal, absent externalities. So externalities would be if Jack's use causes behavior that's bad for others. Then, well, we shouldn't let him do that, because other people are harmed, and he might not internalize [INAUDIBLE]. But unless that's the case, we shouldn't intervene in any way.

OK, but now you might say, well, what did we talk about in all [INAUDIBLE]? How is that consistent with what we talked about before? And the key to that question is the question of, do people act in their best interest? What we essentially assume, often, is that there is a rational relationship between people's choices-- what they actually do-- and the hedonic consequences of those choices, their true well-being. And so what we assume is essentially that you make choices that maximize your well-being. Essentially, what you learned in 14.01 or 14.03 is utility maximization, and that's at the heart of the rationality assumptions. So economists tend to believe that most of the time, people act, at least approximately, in their best interest.

Now, as we talked about already in our previous lecture, about Nisbett and Wilson "telling more than we can know," we should actually be quite skeptical about whether people actually know what they're doing and why they're doing it, because, as I showed you last time, people often seem to have essentially no clue of why they do what they do. So it seems like a bit of an odd assumption to say, well, people are maximizing what's best for them.

So now, how can we check whether this assumption is appropriate? What would you do? I'm telling you here some behaviors. How can you actually check whether this assumption is an appropriate one?

Lauren.
AUDIENCE: We could check their well-being after they make these decisions, and see if they're better or worse off.

FRANK SCHILBACH: Yeah, we have to somehow figure out some situations where we can track people's well-being. Ideally, we would have some sort of experiment. There's group A and B, and in group A [INAUDIBLE], you let them choose, and in the other groups-- [INAUDIBLE] group C-- we essentially tell people what to do: in one group, you should be doing one thing, and in another group, you should be doing another thing. And then there's the group where we have people essentially choose. And then you can track people's well-being and figure out, is the group that gets to choose actually doing better than the groups where we just tell them what to do?

And in some ways-- think of the work by Dan Ariely and others-- if I can essentially manipulate you into making drastically different choices by changing small things that really, arguably, should be irrelevant for your choice-- for example, if I can anchor you on your Social Security number and then say, would you like to purchase this item for $10, yes or no, and that affects your choice-- well, then presumably, that's a clue in some ways that something is not going right. You are not necessarily optimizing, precisely because the Social Security number really should be entirely irrelevant, arguably. The same goes for some phenomena we talked about, like preference reversals, or some forms of demand for commitment, or the like. If that's the case, it seems that there are self-control problems involved, or biased beliefs, in the various ways that we talked about, which then seems to say, well, you keep saying you want one thing.
You keep spending money on, say, healthy eating, but then you never-- the food all goes bad in your fridge, and you keep eating donuts. That seems to be something that's not quite right-- you don't seem to be maximizing your utility. But in general, it would be great if you could measure behavior, and then the hedonic consequences of those behaviors, people's well-being. Then you could say whether option A or option B is better for people in general, or, specifically, whether, when you ask them to choose, they can figure out what's best for them. Notice that you need to know the counterfactual here. If I see you doing one thing and then track your well-being, that's not enough, because I don't know the counterfactual-- whether you would be happier otherwise, if you were not to choose that. OK.

So, great idea. Now, how do we actually do that? It's useful to have some terminology here-- in particular, terminology that Kahneman and others have introduced. And Kahneman-- remember, this is the Kahneman of Tversky and Kahneman, the Nobel Prize winner, who worked a lot on the loss aversion and prospect theory that we talked about. Kahneman also has quite a bit of work on well-being and happiness and so on. And he introduced very useful terminology, which is what's called decision utility and experienced utility.

So what's decision utility? Economists tend to use the word utility, or utility function, to describe the preferences that rationalize observed choices. Essentially, if you took 14.01, 14.03, et cetera, people talk about the utility function. It's the function that you write down that maps your choices into well-being, and the function that you maximize. By maximizing it, you essentially say, well, given your behavior, it must be this utility function or that utility function that allows me to rationalize what you're doing-- that allows me to explain why you choose apples over bananas.
And so Kahneman calls these revealed preferences the decision utility. That's essentially the preferences that rationalize decisions. When people make choices, that's the utility that they perceive, or that they think they will perceive or receive, as a consequence of their choices. You might call that wanting or choosing. So, quickly: for an addict, the decision utility of drug consumption exceeds the decision utility of quitting. If that person makes the decision of taking drugs, it must be that their decision utility is higher. When making those choices, they prefer one option over the other-- which, again, doesn't necessarily mean that afterwards, the experience is actually better. But when making the choice of taking the drug, they think that's the better option for them.

Now, in contrast, Kahneman also talks about experienced utility, which concerns the hedonic consequences of choices. He calls these hedonic experiences experienced utility. That's essentially the preferences that coincide with doing. While you actually experience the consequences of your choice, what experience do you have-- what utility, what well-being-- during that? That goes back a long time. If you look at some of the early works in economics, Bentham and others, they talk a lot about pleasure and pain, which is very much about the experience people have in their life, as opposed to utility, which is, again, the construct that you have at the time of the decision.

Now, how do we measure these hedonic experiences of well-being? There's two questions, I guess. One is, how can we measure this? If I wanted to know what your hedonic experience of this lecture right now is, how would I do this? How much do you enjoy things? And then, second, well, I can measure your experience right now-- maybe this moment, maybe this hour-- but somehow I need to aggregate those over time. I need to know: you took class A versus class B.
Now I can measure you at different points in time, but somehow, I then need to aggregate this into the overall experience. And how people are doing that raises some questions. Let's start with the first question: how do you measure people's well-being? If you wanted to learn about your classmates' experience in any type of situation, what would you do? What would you measure?

One option is to ask them verbally: how did you like activity A, how do you like B, how do you feel right now? So this is asking what people want. You can also just look at their facial features and try to see how much they're smiling or the like. Notice that you need to map that into some happiness measure as well. In some sense, I could be smiling but be deeply unhappy, and then you would misinterpret that. So you'd have to have some function from facial features to some underlying well-being, grounded in some assumptions. It seems reasonable to think that when people are smiling, they're happy, and when they're crying, they're unhappy, but I'm just pointing out that that requires some underlying data to be able to map those things.

Another option is physiological measures, such as heart rate, [INAUDIBLE], et cetera. There, again, I think, it's commonsensical stuff: if you're really afraid and your heart rate goes up, presumably that's not a good thing; but if you're really excited and your heart rate goes up, presumably that's a good thing. So one has to add some assumptions or some structure to these measures. If, in some sense, you're just measuring them by themselves, in general, that might be misleading; but with some structure, you can use them well.

The other part-- there's lots of neuroscience and other work on this-- is measuring people's brains: seeing which parts of the brain light up at which times could be another option. Of course, again, there, you need to map that into something else.
So you need to then be able to say: suppose I do something nice for you or [INAUDIBLE], and I can then figure out which parts of the brain are affected. And then I can use that later. Suppose I could measure your brains during lecture. If I had figured out earlier which parts of the brain are associated with happy versus unhappy things, and now I could see what your brains are doing during lecture, I could then figure out whether you are happy or unhappy. But notice that you need to have some mapping, because otherwise, it's hard to interpret what's going on. Of course, people have already worked a lot on that kind of mapping.

And just to be clear, the revealed preference measures are-- hold on, let me just share what I have here. The revealed preference measures are useful for some things. But if you wanted to measure whether revealed preferences give you a good answer, then of course, the revealed preference measure itself is not helpful. Or, put differently, that's precisely what you want to check. Suppose I ask you, do you want lecture A or lecture B, and then everybody says they want lecture A-- or, whatever, good A or good B, and everybody says they want good A. The econ assumption is, by revealed preference, it must be that A makes you happier than B. But if I want to test whether that's actually true, then I need to have your revealed preference, but I also need to know your counterfactual in some sense. So say I give you good A, and I give you good B-- how would each make you feel? So I need the measures that we have here in the slides, a version of that, to say, OK, I'm going to look at your facial expression and see how happy you look when I give you good A versus good B. Or I look at your brain and see which parts of it-- do the pleasure parts of your brain light up? And if that's the case, I'm going to conclude that you're happier with good A versus good B, or if that's more so the case for good A.
And then I can look at your revealed preference: is it really the case that you choose A over B if the experienced utility says A makes you happier than B? Does that make sense? OK.

So what are these techniques? [AUDIO OUT] I think we covered most of [AUDIO OUT]. There are observer ratings, facial [AUDIO OUT], reports of mood, pain, pleasure, happiness, and so on. This is what [AUDIO OUT]. There are autonomic measures: respiratory and cardiovascular measures, your blood pressure, your pulse, and so on and so forth-- sweat. There are vocal measures as well-- pitch, loudness, tone-- and you can probably detect whether somebody is angry or excited and so on; again, you can apply some mapping here. You can look at people's brains. And there are also responses to emotion-sensitive tasks. So if I ask you, do you want to talk to a friend, or do you want to do something nice, and the answer is, no, absolutely not-- it could obviously be that you're busy, but when people are in a bad mood, they tend to not want to do things that likely should make them happy.

OK, so we can try to do that. Now, why might decision utility and experienced utility differ? That's, of course, what we discussed at length in the class already. There are various different explanations. One-- we're going to talk about this a little bit-- is inaccurate memories of past experiences. You might just misremember your past experiences, and based on that, you might have poor forecasts of your future preferences or your utility. This is essentially saying that people's beliefs might be wrong in various ways, perhaps because people's memories are tricking them in some ways. People might also have wrong beliefs because they fail to anticipate adaptation, which is very much like the projection bias that we talked about previously. I'm going to talk about this a little bit more.
There could also be things like emotions, which, again, is in some ways like projection bias: people might be really angry, or people might be hungry, [AUDIO OUT] essentially it's very hard for them to make certain choices, failing to anticipate their anger or their hunger or other impulses [INAUDIBLE] over time. There are plenty of other reasons, but we talked about this quite a lot. So you could think about a lot of what we discussed in the course as disconnects between decision utility and experienced utility.

Now, I mentioned this already a little bit. The third part of people's utility is what Kahneman would call remembered utility, which essentially is to say, well, when you think about a past experience, you might not actually evaluate it correctly. As I said, I might look at your experienced utility using these kinds of features, and then I might ask you afterwards, how did you like certain experiences? And then I can test, in a way, how appropriately you are aggregating your experienced utility when I just ask you afterwards how you liked certain experiences.

And one quite interesting feature of people's remembered utility is what's called-- excuse me-- duration neglect. People tend to remember the quality rather than the length of the experience. And there also seems to be what's called the peak-end rule, which essentially says that retrospective evaluations are very much predicted by an average of two things: the peak affective response recorded during an episode, and the end value recorded just before the termination of the episode. And how do I think about this? Well, mostly, I think of this as salience.
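[Aside: one common way to write this down formally; this formalization is not from the lecture. Let u(t) denote momentary experienced utility during an episode running from time 0 to T, with u(t) negative throughout an unpleasant episode.]

```latex
% Total experienced utility aggregates the whole episode:
U_{\text{experienced}} = \int_0^T u(t)\, dt .

% The peak-end rule: remembered utility is roughly the average of the most
% intense moment (for an aversive episode, the worst one) and the final
% moment. The duration T does not appear at all -- duration neglect.
U_{\text{remembered}} \approx \tfrac{1}{2}\left( \min_{t \in [0,T]} u(t) + u(T) \right).
```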
So when I ask you about a semester, or a certain lecture, or a certain experience that you had-- maybe a TV show or the like-- and you think about it and try to remember how much you enjoyed that experience, well, what's most salient, of course, is your latest experience. That's what comes to mind. The last thing you remember is the last thing you experienced. And if the last episode of a TV show was terrible, that's what comes to mind, and you might say, well, the whole series was really bad, even though maybe previously, it was quite good. And then the other thing that seems to be really important is the peak, both positive and negative. If something was really, really great, or really, really bad, people really remember that, as opposed to the stuff in between. And again, I think that's something that's just very salient in people's minds, and in thinking about these experiences, that's what comes to mind.

So what are these studies doing? Well, there are studies that tend to measure experienced utility in certain ways. One version is just tracking people's well-being over time and then trying to predict their remembered utility-- how did you like that experience? Another version-- let me show you. Hang on one second-- is where people do certain tasks specifically, and the researchers manipulate the experience deliberately. For example, there are tasks like the cold pressor task, where they ask people to put their hands into cold water and then vary the temperature, or vary the duration. And then, essentially, you can try to predict people's remembered utility and show things such as duration neglect or the like. Let me show you what that looks like. For example, this is a famous study by Schreiber and Kahneman.
They did a short trial, in which you put your hand into 14-degree water for 60 seconds, which is very cold, and a long trial, in which you put it into the water for 60 seconds and then the temperature rises to 15 degrees for another 30 seconds. I think this-- I hope this is Celsius, not Fahrenheit. But anyway, it's cold. The point here is that putting your hand for 60 seconds into 14-degree water, and then for another 30 seconds into 15-degree water-- these are both unpleasant experiences. If you did each of these things in separation, people would say, I don't like 60 seconds of cold water, and I also don't like 30 seconds of 15-degree water. These are both unpleasant experiences; neither of them is actually desirable. If you ask people, do you want to experience this again, people say, hell no, I really don't like this-- for both of these, in separation. But once you put them together, the 30 seconds of 15-degree water is more pleasant than the 14-degree water. So when you do that, people say they prefer the long trial over the short trial.

And the reason is that what they remember is the last experience, which was not that painful, because things got better over time. So people's remembered utility violates any sort of rule of what you would usually expect in terms of rationality in economics, precisely because they remember the end rather than the overall duration. Essentially, the utility does not aggregate in the way we would think it would, because the experience at each point in time is negative.

There are more unpleasant studies with colonoscopies-- don't ask me how they got ethical approval for those things. But anyway, there's a study [INAUDIBLE] where patients get an extra minute at the end during which nothing bad happens. And essentially, the treatment group has better memories of the overall experience.
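[Aside: a small numeric sketch, not from the lecture, of why the two aggregation rules disagree here. The per-second discomfort values are invented for illustration; only the structure of the two trials follows the study.]

```python
# Momentary discomfort (negative utility) per second, hypothetical numbers:
short_trial = [-10.0] * 60                  # 60 s at 14 degrees C
long_trial = [-10.0] * 60 + [-6.0] * 30     # same 60 s, plus 30 milder seconds

def total_experienced(u):
    """Integral (here: a discrete sum) of momentary utility over the episode."""
    return sum(u)

def peak_end(u):
    """Peak-end rule: average of the worst moment and the final moment."""
    return (min(u) + u[-1]) / 2

for name, u in [("short", short_trial), ("long", long_trial)]:
    print(name, total_experienced(u), peak_end(u))
# short -600.0 -10.0
# long -780.0 -8.0
# Integrated over time, the short trial is less bad; under the peak-end
# rule, the long trial is remembered as less bad -- matching the finding
# that people choose to repeat the long trial.
```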
Now, that sounds kind of like torturing people. But in fact, if you think about it-- in particular for medical procedures that are unpleasant overall-- if we can help people have better memories of those experiences, maybe they will be a lot more likely to come back and get future treatments or the like. So in a way, you might say, what are these studies? Why are researchers torturing people? I agree with that notion. But if you think people are under-using certain medical procedures or certain medical tests that they should be doing-- think of dentists-- then if there's some way to make people's memory of their overall experience better by having the end of that experience be slightly better than the previous parts, that could be helpful in generating behavior that's good for people in terms of their health-- even though, of course, the last minute of the actual experience would not be pleasant. Nobody wants to be asked, would you rather have another minute of colonoscopy? Absolutely not. But maybe the overall experience that you remember is, in fact, better than if not.

There are other types of experiments that show this duration neglect and these peak-end evaluations. Researchers have shown people pleasant and unpleasant movies or contents of movies. They have done these experiments with sounds or noise of various loudness and duration. There are also animal [INAUDIBLE], where essentially animals are given unpleasant experiences-- there are different setups where the animal can go in one direction or the other, or press one lever versus another, and bad stuff happens in both of them, with one worse than the other. And then you can look at, essentially, do animals show similar duration neglect or other peak-end evaluations of experiences?
And it seems to be very much a phenomenon that holds for animals too. Any questions on this?

OK. So then, of course, if that's the case-- if people remember things differently-- that will lead to violations of decision utility maximizing experienced utility. Think about the overall experienced utility of a certain choice that you make-- in this case, I guess, the cold pressor task-- as the integral over, say, 90 seconds. In the first case, it's 60 seconds of feeling bad plus 30 seconds of having your hand out of the water, which is a somewhat neutral experience; in the second case, it's 60 seconds of feeling bad plus 30 more seconds in the slightly warmer water. If you integrate that, presumably, the short trial would give you a higher, better, more positive experience overall. But then, if you ask people which one they want-- because people remember the long trial as being less bad, people might choose the long trial over the short trial. So that leads, essentially, to violations of decision utility maximizing experienced utility, because people remember things in a biased way.

OK. So far, we talked about different ways of measuring. One specific thing that people have done a lot is simply asking people about their happiness and life satisfaction. So you simply ask people, and there are different ways. There are these types of ladder questions: how satisfied are you with your life as a whole these days? There are also affect questions: did you experience certain emotions yesterday-- did you feel angry, did you feel happy, did you feel sad, and so on? The ladder question is a mixture-- some people think of it as life satisfaction, some people interpret it as happiness, and others would argue that it's rather about social comparisons.
So the ladder question has this notion of-- I'm going to show you the definition in a bit. The ladder question is very much just: on a ladder from 0 to 10, how great could your life be, and where are you on this ladder? So it has very much this flavor of social comparison, which is not quite ideal, because it makes very salient, well, how high or low on the ladder are you-- that some people are higher or lower than others. So some of the responses that you see in those kinds of ladder questions are about happiness, but they are potentially also about people comparing themselves to others in society and so on, which might not necessarily be the same as, how happy are you right now [INAUDIBLE].

Now, what's problematic about these happiness questions, or why are economists skeptical, or in what instances do we [AUDIO OUT]?

Just to be clear: some people would argue we should have a World Happiness Report instead of GDP. We should not maximize GDP or look at GDP growth as the overall performance measure of society. Instead, we should measure subjective well-being, and that's what we should essentially maximize-- the objective of policy should be to maximize that.

So what's the pushback on that? Part of it is stuff that we already talked about-- remembered utility, or that people just might not know what's good for them. But what's wrong with just asking people, how happy are you? Are people just going to tell you, or what's happening with that? Well, one problem, broadly speaking, is when you look at different countries, different societies. For example, if you look at Germans, Germans tend to be pretty grumpy and not super happy. So when you ask a German, how are you doing, the typical German will say, OK-- by which they mean, actually, pretty good. If you ask a typical American, how are you doing, they say, amazing.
They probably mean, pretty good. So when people give you certain statements, there are lots of cultural and other aspects that make comparisons, particularly across societies, kind of tricky. You might think people in Latin America tend to report higher levels of happiness because that's a cultural thing-- it might be that they're actually happier, or it might be that that's just the thing you're supposed to exhibit. Meanwhile, people in some other places-- Scandinavia, Germany, et cetera-- might, in fact, be quite happy, but what they report to you is just some moderate level of enthusiasm overall. So one issue is that cultural comparison is quite hard using those types of questions.

There are lots of Gallup surveys that people have done at large scale-- the Gallup world happiness surveys and the like. There are huge amounts of data being collected; they're nationally representative surveys, and so on and so forth. So those data are available. Of course, there are questions here too: in some places, it's much harder to collect these data than in others, and some people don't reply to surveys. For example, the elderly might be much less likely to answer some surveys, and some people without phones or internet you might not be able to reach. But in principle, you can solve all of those issues; I think it's just a matter of cost.

Reference dependence is a very interesting issue, because in some ways, you might say, well, your income goes up, but then your reference point goes up as well, so nothing happens to your reported happiness-- but presumably you're happier than before? So that [AUDIO OUT]. On the other hand, utility might, in fact, be very much reference-dependent. For example, you might compare yourself to others, and then inequality, for example, might be extremely important for people's happiness.
771 00:35:57,960 --> 00:36:01,090 And one of the things that people argue is that, 772 00:36:01,090 --> 00:36:04,440 for example, [INAUDIBLE] the GDP, also for the US-- 773 00:36:04,440 --> 00:36:06,270 GDP in many places has gone up quite a bit, 774 00:36:06,270 --> 00:36:09,490 but happiness has not increased. 775 00:36:09,490 --> 00:36:12,270 And people argue that's because, in part, inequality also 776 00:36:12,270 --> 00:36:13,200 went up a lot. 777 00:36:13,200 --> 00:36:16,490 And if inequality really makes people deeply unhappy, 778 00:36:16,490 --> 00:36:19,410 well, that is important, and we should 779 00:36:19,410 --> 00:36:22,830 incorporate that in some ways in our considerations overall. 780 00:36:22,830 --> 00:36:26,860 But you see, that causes tricky issues, because in one case, 781 00:36:26,860 --> 00:36:29,220 you double everybody's income, and presumably, people 782 00:36:29,220 --> 00:36:32,730 should be happier, because their living standard goes up. 783 00:36:32,730 --> 00:36:36,227 So there, even if measured happiness doesn't go up, 784 00:36:36,227 --> 00:36:38,310 living standards have gone up, 785 00:36:38,310 --> 00:36:40,560 and something is going wrong with the measurement 786 00:36:40,560 --> 00:36:42,600 if I don't see increased happiness-- 787 00:36:42,600 --> 00:36:46,170 on the other hand, there might be very real issues here: 788 00:36:46,170 --> 00:36:49,830 inequality is quite important, and reference dependence matters 789 00:36:49,830 --> 00:36:51,930 for people's happiness and well-being, 790 00:36:51,930 --> 00:36:54,480 in the sense that what really might matter a lot 791 00:36:54,480 --> 00:36:57,750 is relative income rather than absolute income, 792 00:36:57,750 --> 00:37:02,530 and we should capture that in some meaningful way. 793 00:37:02,530 --> 00:37:03,030 Oops. 794 00:37:03,030 --> 00:37:04,440 Sorry. 795 00:37:04,440 --> 00:37:06,150 In particular, there's lots of issues 796 00:37:06,150 --> 00:37:08,760 with, depending on what I focus you 797 00:37:08,760 --> 00:37:10,530 on when I ask certain questions, I 798 00:37:10,530 --> 00:37:12,150 can get very, very different answers. 799 00:37:12,150 --> 00:37:14,930 One example of this 800 00:37:14,930 --> 00:37:16,800 is a paper by Strack et al. that 801 00:37:16,800 --> 00:37:19,350 looked at the correlation between general happiness 802 00:37:19,350 --> 00:37:21,270 and happiness with dating. 803 00:37:21,270 --> 00:37:23,375 And so what they did is a very simple thing. 804 00:37:23,375 --> 00:37:24,750 They essentially just manipulated 805 00:37:24,750 --> 00:37:25,868 the order of questions. 806 00:37:25,868 --> 00:37:27,660 So first, they asked people, how happy 807 00:37:27,660 --> 00:37:29,028 are you in general? 808 00:37:29,028 --> 00:37:30,570 And then they asked people, how happy 809 00:37:30,570 --> 00:37:32,063 are you with your dating life? 810 00:37:32,063 --> 00:37:33,480 And in a different subsample, they 811 00:37:33,480 --> 00:37:35,590 asked people, how happy are you with your dating life, 812 00:37:35,590 --> 00:37:37,440 and then asked people about their general happiness. 813 00:37:37,440 --> 00:37:38,898 And what that does, essentially, is it 814 00:37:38,898 --> 00:37:42,270 focuses people very much on dating as an important part of their life 815 00:37:42,270 --> 00:37:44,310 and their happiness.
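As a rough illustration of this question-order check, here is a minimal Python sketch that computes the correlation between the two happiness questions separately by the randomized question order. The data file and column names are hypothetical stand-ins, not Strack et al.'s actual data.

    import pandas as pd

    # One row per respondent (hypothetical columns):
    #   order         -- "general_first" or "dating_first", randomly assigned
    #   happy_general -- self-reported general happiness
    #   happy_dating  -- self-reported happiness with dating life
    df = pd.read_csv("survey.csv")

    # Asking about dating first makes dating salient, so the correlation
    # with general happiness should be much larger in that subsample.
    for order, group in df.groupby("order"):
        r = group["happy_general"].corr(group["happy_dating"])
        print(f"{order}: corr = {r:.2f} (n = {len(group)})")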
816 00:37:44,310 --> 00:37:47,448 And similarly, I think this is exactly what Maya 817 00:37:47,448 --> 00:37:48,240 was saying as well 818 00:37:48,240 --> 00:37:50,650 about the ladder question 819 00:37:50,650 --> 00:37:53,130 versus other questions: depending 820 00:37:53,130 --> 00:37:56,942 on how I phrase the question, I can get very different answers, 821 00:37:56,942 --> 00:37:57,900 which is what you're saying. 822 00:37:57,900 --> 00:37:59,695 So that makes it very tricky to figure out 823 00:37:59,695 --> 00:38:01,320 which questions we should use and how much 824 00:38:01,320 --> 00:38:02,820 weight should be put on them. 825 00:38:02,820 --> 00:38:07,500 And even things like the order of questions 826 00:38:07,500 --> 00:38:09,845 matter quite a bit, which makes economists very nervous 827 00:38:09,845 --> 00:38:11,970 and so on, and then that's why, in part, economists 828 00:38:11,970 --> 00:38:15,070 would argue that we really should rely on revealed preference 829 00:38:15,070 --> 00:38:16,210 and choice. 830 00:38:16,210 --> 00:38:20,470 Now on the other hand, I very much think these questions-- 831 00:38:20,470 --> 00:38:23,240 when you just ask people and they say, I'm deeply unhappy, 832 00:38:23,240 --> 00:38:26,420 that has some content, and there's information there. 833 00:38:26,420 --> 00:38:28,738 And so then the question is, well, 834 00:38:28,738 --> 00:38:30,280 how do we weigh these considerations? 835 00:38:30,280 --> 00:38:32,110 On the one hand, the survey instrument 836 00:38:32,110 --> 00:38:34,910 is really just very fragile, and we don't measure very much. 837 00:38:34,910 --> 00:38:38,110 On the other hand, there's useful information here 838 00:38:38,110 --> 00:38:40,510 that we should be incorporating somehow. 839 00:38:40,510 --> 00:38:43,440 So I just wanted you to have 840 00:38:43,440 --> 00:38:44,610 that caveat in mind. 841 00:38:44,610 --> 00:38:46,410 And so let's now look at some data 842 00:38:46,410 --> 00:38:47,850 that people have collected. 843 00:38:47,850 --> 00:38:50,640 So a very nice source overall 844 00:38:50,640 --> 00:38:54,750 is what's called Our World in Data, which 845 00:38:54,750 --> 00:38:56,865 has lots of interesting graphs and figures 846 00:38:56,865 --> 00:38:58,740 where you can look at happiness and all sorts 847 00:38:58,740 --> 00:39:01,300 of data on the world. 848 00:39:01,300 --> 00:39:04,180 And when you look at life satisfaction around the globe, 849 00:39:04,180 --> 00:39:07,020 this is very much the ladder question here, which you see, 850 00:39:07,020 --> 00:39:09,210 exactly as I was saying. 851 00:39:09,210 --> 00:39:12,540 There's the ladder from zero to 10. 852 00:39:12,540 --> 00:39:15,180 And again, the ladder question is very much something 853 00:39:15,180 --> 00:39:18,570 hierarchical and makes inequality or relative 854 00:39:18,570 --> 00:39:20,100 comparisons quite salient. 855 00:39:20,100 --> 00:39:22,680 Even if it only asks you about your worst or your best 856 00:39:22,680 --> 00:39:26,370 possible life for yourself, it does have the feeling 857 00:39:26,370 --> 00:39:28,890 that really, it's about comparing yourself to others. 858 00:39:28,890 --> 00:39:31,707 But anyway, that's very much used in many cases. 859 00:39:31,707 --> 00:39:33,540 Often people will call it life satisfaction, 860 00:39:33,540 --> 00:39:36,170 but even here, they also call it happiness overall.
861 00:39:36,170 --> 00:39:38,520 This is from the World Happiness Report. 862 00:39:38,520 --> 00:39:41,130 And what you see is that, in general, 863 00:39:41,130 --> 00:39:42,930 rich countries tend to 864 00:39:42,930 --> 00:39:45,120 report higher happiness, Scandinavian countries 865 00:39:45,120 --> 00:39:45,780 in particular. 866 00:39:45,780 --> 00:39:47,460 The US is reasonably happy. 867 00:39:47,460 --> 00:39:49,050 Europe is quite happy. 868 00:39:49,050 --> 00:39:50,940 And then if you go to poorer countries, 869 00:39:50,940 --> 00:39:53,730 in particular in Africa, they tend 870 00:39:53,730 --> 00:39:57,570 to report a lot lower happiness levels. 871 00:39:57,570 --> 00:39:59,700 When you look at comparisons across countries, 872 00:39:59,700 --> 00:40:03,270 you see a very clear relationship. 873 00:40:03,270 --> 00:40:06,540 Rich countries tend to 874 00:40:06,540 --> 00:40:10,620 report much higher life satisfaction 875 00:40:10,620 --> 00:40:11,910 compared to poor countries. 876 00:40:11,910 --> 00:40:14,190 That's a very clear association. 877 00:40:14,190 --> 00:40:16,290 That's true both across countries 878 00:40:16,290 --> 00:40:18,340 and within countries. 879 00:40:18,340 --> 00:40:19,740 So what this graph here shows you 880 00:40:19,740 --> 00:40:23,200 is the average income in a country. 881 00:40:23,200 --> 00:40:26,140 And then these arrows show you the gradient. 882 00:40:26,140 --> 00:40:27,820 The slope of the arrows is 883 00:40:27,820 --> 00:40:29,230 the gradient within countries. 884 00:40:29,230 --> 00:40:31,015 And what you see is essentially-- 885 00:40:31,015 --> 00:40:32,890 so this is comparing rich versus poor people 886 00:40:32,890 --> 00:40:35,560 within countries. 887 00:40:35,560 --> 00:40:40,150 If you increase income within a country by $1,000 or something, 888 00:40:40,150 --> 00:40:44,180 how much does people's happiness go up? 889 00:40:44,180 --> 00:40:46,990 And what you see is all of those countries have-- 890 00:40:46,990 --> 00:40:49,600 almost all countries have increasing gradients, 891 00:40:49,600 --> 00:40:52,330 which tells you that, both within and across countries, 892 00:40:52,330 --> 00:40:56,410 richer people report higher life satisfaction 893 00:40:56,410 --> 00:40:58,790 compared to poorer people. 894 00:40:58,790 --> 00:41:01,340 Similarly, that's also true for mental health. 895 00:41:01,340 --> 00:41:03,910 So these are all questions about subjective well-being, 896 00:41:03,910 --> 00:41:07,240 how happy are you overall, asking the overall population. 897 00:41:07,240 --> 00:41:10,930 But if you try to look at mental health conditions, 898 00:41:10,930 --> 00:41:16,330 in particular depression and anxiety, 899 00:41:16,330 --> 00:41:22,840 using clinical definitions of psychiatric disorders, 900 00:41:22,840 --> 00:41:26,080 for depression and anxiety and also for other conditions, 901 00:41:26,080 --> 00:41:28,480 you find that the poor in any given location 902 00:41:28,480 --> 00:41:32,530 are more likely to suffer from depression and/or anxiety. 903 00:41:32,530 --> 00:41:36,620 Interestingly, the prevalence of depression, 904 00:41:36,620 --> 00:41:40,150 and in fact also anxiety, is higher in rich countries. 905 00:41:40,150 --> 00:41:42,400 There's a lot of questions about measurement problems, 906 00:41:42,400 --> 00:41:44,470 like is it really measuring the same thing, 907 00:41:44,470 --> 00:41:45,843 and so on and so forth.
908 00:41:45,843 --> 00:41:48,010 But it could also be there's other factors involved, 909 00:41:48,010 --> 00:41:50,303 in particular, issues such as inequality. 910 00:41:50,303 --> 00:41:51,970 And it could just be that really, what's 911 00:41:51,970 --> 00:41:54,580 really important is relative income as opposed 912 00:41:54,580 --> 00:41:56,200 to absolute income. 913 00:41:56,200 --> 00:42:01,480 Now we also know that anti-poverty programs improve 914 00:42:01,480 --> 00:42:03,910 mental health in various ways, including 915 00:42:03,910 --> 00:42:11,170 things like cash transfers and broader anti-poverty programs. 916 00:42:11,170 --> 00:42:13,270 So this is an overview of studies. 917 00:42:13,270 --> 00:42:16,030 This is essentially a recent meta-analysis 918 00:42:16,030 --> 00:42:19,480 that we did for some [INAUDIBLE] paper of mine. 919 00:42:19,480 --> 00:42:21,430 And what you see here is a number 920 00:42:21,430 --> 00:42:24,010 of different interventions, both cash transfers 921 00:42:24,010 --> 00:42:26,290 and other anti-poverty programs, which 922 00:42:26,290 --> 00:42:28,330 are broader programs. 923 00:42:28,330 --> 00:42:31,258 And what you see here is the treatment effect estimated 924 00:42:31,258 --> 00:42:32,300 in the different studies. 925 00:42:32,300 --> 00:42:36,110 These are the broader anti-poverty programs. 926 00:42:36,110 --> 00:42:37,960 These are cash transfer programs. 927 00:42:37,960 --> 00:42:41,170 But overall, essentially, the overwhelming answer 928 00:42:41,170 --> 00:42:44,260 is that once you give people cash or reduce their poverty, 929 00:42:44,260 --> 00:42:46,750 that improves their mental health, as measured 930 00:42:46,750 --> 00:42:50,860 by different-- these are PWB, meaning psychological well-being, 931 00:42:50,860 --> 00:42:51,730 indices. 932 00:42:51,730 --> 00:42:54,650 These are also depression screening scores and so on. 933 00:42:54,650 --> 00:42:57,340 So by various measures of people's mental health, 934 00:42:57,340 --> 00:43:00,460 giving people more money improves their mental health. 935 00:43:00,460 --> 00:43:04,900 So that's true at the extreme, in terms of real mental illness, 936 00:43:04,900 --> 00:43:07,790 depressive and anxiety disorders. 937 00:43:07,790 --> 00:43:09,670 It's also true for other measures 938 00:43:09,670 --> 00:43:11,110 of psychological well-being. 939 00:43:11,110 --> 00:43:14,620 If you just give people cash, they report higher happiness, 940 00:43:14,620 --> 00:43:17,410 in addition to what I told you before, 941 00:43:17,410 --> 00:43:20,710 this being true in the cross section. 942 00:43:20,710 --> 00:43:23,520 Now interestingly, we also tend to think 943 00:43:23,520 --> 00:43:25,840 that others are less happy than they say they are. 944 00:43:25,840 --> 00:43:28,140 So if we just ask people, how happy 945 00:43:28,140 --> 00:43:29,970 are other people in your country, 946 00:43:29,970 --> 00:43:31,565 in almost every country in the world, 947 00:43:31,565 --> 00:43:32,940 what you find is that the fraction 948 00:43:32,940 --> 00:43:35,460 of others that people believe are 949 00:43:35,460 --> 00:43:39,270 very happy or rather happy is lower 950 00:43:39,270 --> 00:43:46,680 than the fraction who actually say they are, which is an interesting fact. 951 00:43:46,680 --> 00:43:49,830 Now this-- as we talked about this already a little bit-- 952 00:43:49,830 --> 00:43:51,990 how is life satisfaction and happiness related 953 00:43:51,990 --> 00:43:53,293 to life events?
954 00:43:53,293 --> 00:43:55,460 When you look at different life events, 955 00:43:55,460 --> 00:43:57,750 it's quite interesting that a lot of life events 956 00:43:57,750 --> 00:44:03,720 tend to affect people's reported well-being or life 957 00:44:03,720 --> 00:44:05,850 satisfaction a lot. 958 00:44:05,850 --> 00:44:09,780 For many events, including stuff like widowhood or divorce 959 00:44:09,780 --> 00:44:14,070 or marriage and other types of events, or winning the lottery, 960 00:44:14,070 --> 00:44:17,550 these effects are very much transitory, which is to say, 961 00:44:17,550 --> 00:44:19,470 when something really bad happens to somebody, 962 00:44:19,470 --> 00:44:21,490 their happiness goes down a lot. 963 00:44:21,490 --> 00:44:23,580 But it tends to recover quite a bit. 964 00:44:23,580 --> 00:44:25,330 This is what we talked about before. 965 00:44:25,330 --> 00:44:28,350 People are quite adaptive to many changes, 966 00:44:28,350 --> 00:44:29,980 and they adjust quite well. 967 00:44:29,980 --> 00:44:32,970 This is what people call the psychological immune system, 968 00:44:32,970 --> 00:44:35,310 except for a few things, including unemployment. 969 00:44:35,310 --> 00:44:38,080 And one potential reason is that, for example, unemployment 970 00:44:38,080 --> 00:44:40,055 might also cause things like depression. 971 00:44:40,055 --> 00:44:41,430 And once you're really depressed, 972 00:44:41,430 --> 00:44:44,190 that doesn't really go back over [INAUDIBLE] that easily. 973 00:44:44,190 --> 00:44:45,180 That's true for women. 974 00:44:45,180 --> 00:44:49,200 It's also true for men overall. 975 00:44:49,200 --> 00:44:50,700 There are also some other issues which 976 00:44:50,700 --> 00:44:52,825 you want to be careful about when you interpret some 977 00:44:52,825 --> 00:44:57,450 of this data, which is what's called the evolution of latent 978 00:44:57,450 --> 00:45:02,040 situations, which is, when you look at things like divorce, 979 00:45:02,040 --> 00:45:03,480 it tends to be that-- 980 00:45:03,480 --> 00:45:05,303 when you look at people's divorces, 981 00:45:05,303 --> 00:45:06,720 you might say, well, divorce makes 982 00:45:06,720 --> 00:45:08,340 them happier in some ways. 983 00:45:08,340 --> 00:45:10,800 Or you would 984 00:45:10,800 --> 00:45:12,420 look at this and say, 985 00:45:12,420 --> 00:45:13,878 marriage makes people less happy. 986 00:45:13,878 --> 00:45:15,420 But really, what seems to be the case 987 00:45:15,420 --> 00:45:18,150 is that people build up to an event, 988 00:45:18,150 --> 00:45:22,020 and then that peaks, positively or negatively, and then essentially 989 00:45:22,020 --> 00:45:23,490 goes back to where it was before. 990 00:45:23,490 --> 00:45:26,370 For example, with unemployment, things tend to get worse 991 00:45:26,370 --> 00:45:32,070 even before people become unemployed, which seems to say 992 00:45:32,070 --> 00:45:34,110 that people tend to anticipate 993 00:45:34,110 --> 00:45:36,270 these underlying events already, 994 00:45:36,270 --> 00:45:39,330 psychologically, in some ways. 995 00:45:39,330 --> 00:45:41,490 Now, this is what I was saying before.
996 00:45:41,490 --> 00:45:44,930 When you look at comparisons over time, 997 00:45:44,930 --> 00:45:47,790 you sometimes get these situations where, particularly 998 00:45:47,790 --> 00:45:49,730 in places including China, but also 999 00:45:49,730 --> 00:45:53,780 in the US, real income has risen a lot in many places, 1000 00:45:53,780 --> 00:45:56,240 but yet people's life satisfaction 1001 00:45:56,240 --> 00:45:58,250 does not tend to increase. 1002 00:45:58,250 --> 00:45:59,000 Why is that? 1003 00:45:59,000 --> 00:46:02,170 Well, that's, in part, I think, because of inequality going up, 1004 00:46:02,170 --> 00:46:03,770 but [INAUDIBLE] inequality really 1005 00:46:03,770 --> 00:46:06,588 staying the same and relative income really being important. 1006 00:46:06,588 --> 00:46:07,880 Could be other things going on. 1007 00:46:07,880 --> 00:46:10,310 For example, pollution is really bad for people's health, 1008 00:46:10,310 --> 00:46:12,830 mental health, but also their happiness and so on. 1009 00:46:12,830 --> 00:46:15,315 It could also just be that people's reference points 1010 00:46:15,315 --> 00:46:17,690 adjust, which is just, you make everybody twice as rich, 1011 00:46:17,690 --> 00:46:19,850 and now everybody's-- the reference point goes up, 1012 00:46:19,850 --> 00:46:22,788 even though their overall objective well-being standards, 1013 00:46:22,788 --> 00:46:24,830 in terms of how much they can consume, et cetera, 1014 00:46:24,830 --> 00:46:28,160 have gone up a lot. 1015 00:46:28,160 --> 00:46:34,950 Now this is quite interesting data on ceiling effects. 1016 00:46:34,950 --> 00:46:37,860 So there's some controversy over this. 1017 00:46:37,860 --> 00:46:39,620 Some people are arguing that when 1018 00:46:39,620 --> 00:46:41,370 you give people more money, at some point, 1019 00:46:41,370 --> 00:46:42,700 there's a ceiling effect. 1020 00:46:42,700 --> 00:46:44,940 It doesn't really make them happier anymore. 1021 00:46:44,940 --> 00:46:47,180 Now when you look at this graph here-- let 1022 00:46:47,180 --> 00:46:48,180 me go back for a second. 1023 00:46:48,180 --> 00:46:54,278 When you look at these graphs here, this-- 1024 00:46:54,278 --> 00:46:56,420 whoops, sorry. 1025 00:46:56,420 --> 00:46:56,990 [INAUDIBLE] 1026 00:46:56,990 --> 00:46:59,795 One second. 1027 00:46:59,795 --> 00:47:01,170 When you look at this graph here, 1028 00:47:01,170 --> 00:47:03,170 it just doesn't look at all like a ceiling effect. 1029 00:47:03,170 --> 00:47:06,600 This looks like, essentially, self-reported life satisfaction, 1030 00:47:06,600 --> 00:47:09,930 as measured by this ladder question, 1031 00:47:09,930 --> 00:47:14,610 is increasing, and there seems to be no ceiling, 1032 00:47:14,610 --> 00:47:16,643 or it doesn't seem to be flattening in any way. 1033 00:47:16,643 --> 00:47:18,060 When you look at this one as well, 1034 00:47:18,060 --> 00:47:23,890 it seems that if you increase income by $1,000, at any level, 1035 00:47:23,890 --> 00:47:26,980 reported life satisfaction goes up. 1036 00:47:26,980 --> 00:47:29,760 So again, that doesn't look like a ceiling effect. 1037 00:47:29,760 --> 00:47:36,780 And that's true as well if you look at these questions here. 1038 00:47:36,780 --> 00:47:38,918 Gallup had done these surveys in the US.
1039 00:47:38,918 --> 00:47:40,710 When you look at the ladder question, which 1040 00:47:40,710 --> 00:47:43,350 is this one here, when you look at annual income in the US, 1041 00:47:43,350 --> 00:47:45,270 once you increase people's income, 1042 00:47:45,270 --> 00:47:48,750 their mean ladder score, which is here 1043 00:47:48,750 --> 00:47:51,660 on the axis on the right, tends to go up 1044 00:47:51,660 --> 00:47:52,752 pretty much monotonically. 1045 00:47:52,752 --> 00:47:54,210 Maybe it's flattening a little bit, 1046 00:47:54,210 --> 00:47:55,377 but essentially, it goes up. 1047 00:47:55,377 --> 00:47:58,020 So people report higher life satisfaction 1048 00:47:58,020 --> 00:47:59,650 when they have higher income. 1049 00:47:59,650 --> 00:48:02,460 But then look at people's affect, things 1050 00:48:02,460 --> 00:48:04,980 like positive affect, which is like happiness, smiling, 1051 00:48:04,980 --> 00:48:08,250 enjoyment, not being blue, being stress-free and so on. 1052 00:48:08,250 --> 00:48:11,220 What you see is-- in particular, this is for the US. 1053 00:48:11,220 --> 00:48:13,860 When you go from something like $10,000 to $20,000 1054 00:48:13,860 --> 00:48:18,000 to $40,000 per year of income, there's 1055 00:48:18,000 --> 00:48:19,180 a clear increase. 1056 00:48:19,180 --> 00:48:25,290 So being poor in the US 1057 00:48:25,290 --> 00:48:28,720 is really, really bad for your affect 1058 00:48:28,720 --> 00:48:33,330 and for your reported well-being overall. 1059 00:48:33,330 --> 00:48:37,560 But once you reach something like $50,000, $60,000, $70,000 1060 00:48:37,560 --> 00:48:40,260 in the US, it seems that there's 1061 00:48:40,260 --> 00:48:43,410 no further effect on positive affect, feeling blue, being stress-free, 1062 00:48:43,410 --> 00:48:44,320 and so on. 1063 00:48:44,320 --> 00:48:48,900 So one interpretation of that is that if we take these affect 1064 00:48:48,900 --> 00:48:52,200 questions as more like a measure of experienced utility 1065 00:48:52,200 --> 00:48:55,110 and happiness in daily life, essentially, 1066 00:48:55,110 --> 00:48:59,250 having something like $70,000, $80,000 per year in the US-- 1067 00:48:59,250 --> 00:49:01,470 granted, this is from 2009, so maybe this number 1068 00:49:01,470 --> 00:49:03,330 has gone a little bit higher-- 1069 00:49:03,330 --> 00:49:06,600 is sufficient to make people sufficiently happy, 1070 00:49:06,600 --> 00:49:08,540 and beyond that, doubling of income 1071 00:49:08,540 --> 00:49:10,830 will not do very much for how much you smile every day 1072 00:49:10,830 --> 00:49:13,350 and how happy you are by those measures. 1073 00:49:13,350 --> 00:49:15,910 In contrast, if I ask you the ladder question, which again, 1074 00:49:15,910 --> 00:49:18,960 is very much about comparison, about how you place yourself 1075 00:49:18,960 --> 00:49:22,260 compared to somebody else in society, 1076 00:49:22,260 --> 00:49:27,940 going from $100,000 to $200,000 will potentially still make 1077 00:49:27,940 --> 00:49:30,830 quite a big difference. 1078 00:49:30,830 --> 00:49:32,740 Any questions on that? 1079 00:49:36,508 --> 00:49:37,450 OK.
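As a rough sketch of how one could look for this pattern in survey microdata, the following Python snippet regresses both the ladder score and a positive-affect measure on log income, letting the slope change above a cutoff. The cutoff value and all file and column names are hypothetical stand-ins, not the actual Gallup variables.

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    # One row per respondent (hypothetical columns):
    #   income, ladder (0-10 Cantril score), positive_affect (an index)
    df = pd.read_csv("gallup_us.csv")
    df["log_income"] = np.log(df["income"])
    df["high_income"] = (df["income"] > 75_000).astype(int)

    # If the ladder rises in log income everywhere but affect flattens,
    # the interaction term should be near zero for the ladder and
    # strongly negative (offsetting the main slope) for affect.
    for outcome in ["ladder", "positive_affect"]:
        fit = smf.ols(f"{outcome} ~ log_income * high_income", data=df).fit()
        print(outcome)
        print(fit.params[["log_income", "log_income:high_income"]])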
1080 00:49:37,450 --> 00:49:39,100 So then let me-- 1081 00:49:39,100 --> 00:49:43,960 this is among the most interesting parts of these 1082 00:49:43,960 --> 00:49:46,730 kinds of studies, the question of what, in fact, 1083 00:49:46,730 --> 00:49:47,707 matters beyond income. 1084 00:49:47,707 --> 00:49:49,540 So if you know income, essentially-- 1085 00:49:49,540 --> 00:49:51,370 overall, if you're rich, you're going to be happier 1086 00:49:51,370 --> 00:49:52,840 compared to if you're poor. 1087 00:49:52,840 --> 00:49:56,170 But again, it seems to be really, 1088 00:49:56,170 --> 00:49:59,270 if you want to not be unhappy, having 1089 00:49:59,270 --> 00:50:02,050 income above $50,000 in the US is really important. 1090 00:50:02,050 --> 00:50:05,780 Beyond that, it doesn't seem to do very much anymore. 1091 00:50:05,780 --> 00:50:08,120 But now what other things matter quite a bit? 1092 00:50:08,120 --> 00:50:09,680 And this is a very simple exercise. 1093 00:50:09,680 --> 00:50:13,900 So what they did is, this is [INAUDIBLE]. 1094 00:50:13,900 --> 00:50:15,970 They looked at the Gallup survey questions 1095 00:50:15,970 --> 00:50:18,730 from 450,000 Americans. 1096 00:50:18,730 --> 00:50:20,800 And what they did is, for each of these four 1097 00:50:20,800 --> 00:50:23,740 measures of psychological well-being, 1098 00:50:23,740 --> 00:50:26,710 which is these here, positive affect, not blue, stress-free, 1099 00:50:26,710 --> 00:50:30,460 and the ladder questions, they were first 1100 00:50:30,460 --> 00:50:33,520 just trying to predict, what is the regression 1101 00:50:33,520 --> 00:50:34,787 coefficient of high income? 1102 00:50:34,787 --> 00:50:36,370 So if you just split the sample in two 1103 00:50:36,370 --> 00:50:39,130 and look at high income versus low income, 1104 00:50:39,130 --> 00:50:41,560 what is the regression coefficient of that? 1105 00:50:41,560 --> 00:50:44,450 They're trying to interpret that as a causal effect, which, 1106 00:50:44,450 --> 00:50:47,830 of course, is a little problematic. 1107 00:50:47,830 --> 00:50:49,580 But you can think of this as the question, 1108 00:50:49,580 --> 00:50:53,140 if I wanted to try to predict how happy you are, what do I 1109 00:50:53,140 --> 00:50:54,697 need to know about you? 1110 00:50:54,697 --> 00:50:57,280 What are the factors that I need to know about you if I wanted 1111 00:50:57,280 --> 00:51:00,550 to predict how happy you are in your life at any point in time? 1112 00:51:00,550 --> 00:51:03,010 If you wanted to know, 20 years from now, 1113 00:51:03,010 --> 00:51:04,460 are you going to be happy or not, 1114 00:51:04,460 --> 00:51:07,700 what are the things that you need to know about yourself? 1115 00:51:07,700 --> 00:51:11,110 And so it turns out that high income is predictive. 1116 00:51:11,110 --> 00:51:13,262 So there's some coefficient that says 1117 00:51:13,262 --> 00:51:15,220 that if you have higher income, you're somewhat 1118 00:51:15,220 --> 00:51:17,380 more likely to have positive affect, 1119 00:51:17,380 --> 00:51:20,410 less likely to feel blue or less likely to be stressed, 1120 00:51:20,410 --> 00:51:24,820 and a lot more likely to give a higher answer in the ladder 1121 00:51:24,820 --> 00:51:25,370 questions. 1122 00:51:25,370 --> 00:51:27,460 So in some sense, higher income is good.
1123 00:51:27,460 --> 00:51:29,500 And then what the table does is it 1124 00:51:29,500 --> 00:51:32,350 normalizes high income to 1 1125 00:51:32,350 --> 00:51:34,970 and then shows you other characteristics and says, 1126 00:51:34,970 --> 00:51:40,230 OK, if I knew another thing about you, are you religious versus not 1127 00:51:40,230 --> 00:51:43,930 and so on, what is the relative coefficient of that compared 1128 00:51:43,930 --> 00:51:45,580 to high versus low income? 1129 00:51:45,580 --> 00:51:46,210 OK? 1130 00:51:46,210 --> 00:51:49,670 So for example, if you're old, you're-- 1131 00:51:49,670 --> 00:51:52,150 so a positive coefficient means you're 1132 00:51:52,150 --> 00:51:55,060 happier or more likely to have positive affect here, 1133 00:51:55,060 --> 00:51:58,370 and a negative one means the opposite, I guess, for positive affect 1134 00:51:58,370 --> 00:51:59,120 here. 1135 00:51:59,120 --> 00:52:01,040 So if you're old, for example-- older people, 1136 00:52:01,040 --> 00:52:06,553 at least in the US, tend to report higher positive affect. 1137 00:52:06,553 --> 00:52:07,720 So that's good news for you. 1138 00:52:07,720 --> 00:52:11,520 In 20, 30 years, you'll be likely to be happier. 1139 00:52:11,520 --> 00:52:13,690 It turns out that religious people, for example, 1140 00:52:13,690 --> 00:52:16,660 tend to 1141 00:52:16,660 --> 00:52:18,310 report more positive affect compared 1142 00:52:18,310 --> 00:52:20,300 to non-religious people. 1143 00:52:20,300 --> 00:52:23,170 And the magnitude of these effects is pretty large. 1144 00:52:23,170 --> 00:52:24,100 That is to say, 1145 00:52:24,100 --> 00:52:27,880 1.16 is to say, 1146 00:52:27,880 --> 00:52:30,880 if I knew about you, are you religious versus not, 1147 00:52:30,880 --> 00:52:33,280 that is as predictive as knowing about you, 1148 00:52:33,280 --> 00:52:35,980 are you rich versus not, are you in the upper or lower half 1149 00:52:35,980 --> 00:52:37,233 of the income distribution? 1150 00:52:37,233 --> 00:52:38,650 So that's quite important in terms 1151 00:52:38,650 --> 00:52:40,840 of trying to predict whether people are happy. 1152 00:52:40,840 --> 00:52:45,010 Now what's particularly striking here when you look at this 1153 00:52:45,010 --> 00:52:50,402 table is alone, smoker, and headache, and particularly 1154 00:52:50,402 --> 00:52:52,030 pain. 1155 00:52:52,030 --> 00:52:54,250 I want to emphasize the alone here. 1156 00:52:54,250 --> 00:52:58,810 Social relationships, or people being in relationships 1157 00:52:58,810 --> 00:53:02,950 with friends or family or with partners, 1158 00:53:02,950 --> 00:53:05,980 are extremely predictive of people's well-being. 1159 00:53:05,980 --> 00:53:07,840 This is what I was talking about before when 1160 00:53:07,840 --> 00:53:09,970 it came to social preferences. 1161 00:53:09,970 --> 00:53:13,180 Investing in social relationships is-- 1162 00:53:13,180 --> 00:53:14,240 A, of course, 1163 00:53:14,240 --> 00:53:16,780 making other people happy and being nice is good. 1164 00:53:16,780 --> 00:53:19,570 But in a way, being nice to others and investing 1165 00:53:19,570 --> 00:53:22,588 in a social relationship 1166 00:53:22,588 --> 00:53:23,630 is very much 1167 00:53:23,630 --> 00:53:29,221 an investment in your future well-being. 1168 00:53:29,221 --> 00:53:33,920 And the magnitudes here are enormous.
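A minimal sketch of the normalization behind this table: regress a well-being measure on a set of binary characteristics and express each coefficient relative to the high-income coefficient. The file and column names below are hypothetical stand-ins for the Gallup variables, and, as in the lecture, these are predictive correlations, not causal effects.

    import pandas as pd
    import statsmodels.formula.api as smf

    df = pd.read_csv("gallup_us.csv")  # hypothetical per-respondent data
    predictors = ["high_income", "old", "religious", "alone", "smoker", "pain"]

    # One regression per outcome; here just positive affect.
    fit = smf.ols("positive_affect ~ " + " + ".join(predictors), data=df).fit()

    # Normalize: high_income becomes 1, so e.g. religious around 1.16 means
    # "about as predictive as being in the top half of the income split."
    rel = fit.params[predictors] / fit.params["high_income"]
    print(rel.round(2))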
1169 00:53:33,920 --> 00:53:37,580 Now another thing you could do is-- 1170 00:53:37,580 --> 00:53:40,620 so here, you could say, well, what 1171 00:53:40,620 --> 00:53:43,200 should I do to make me happier in the future? 1172 00:53:43,200 --> 00:53:45,790 So one thing you could do is just 1173 00:53:45,790 --> 00:53:47,540 look at data. 1174 00:53:47,540 --> 00:53:50,070 And well, what are these predictors [AUDIO OUT] 1175 00:53:50,070 --> 00:53:50,810 people happy? 1176 00:53:50,810 --> 00:53:52,290 These are all correlations. 1177 00:53:52,290 --> 00:53:54,411 You want to be careful. 1178 00:53:54,411 --> 00:53:56,730 [AUDIO OUT] tells you something about, if you 1179 00:53:56,730 --> 00:54:00,030 want to be happy in 20 years, things you want to invest in, 1180 00:54:00,030 --> 00:54:05,050 these numbers are probably a pretty good start. 1181 00:54:05,050 --> 00:54:08,910 [AUDIO OUT] your health seems to be quite a good idea. 1182 00:54:08,910 --> 00:54:12,780 Smoking, for example, not so much. 1183 00:54:12,780 --> 00:54:16,960 Now another thing you could do is just ask people 1184 00:54:16,960 --> 00:54:20,310 at the end of their lives and just [AUDIO OUT] 1185 00:54:20,310 --> 00:54:23,920 about what they should have or could have done in their life. 1186 00:54:23,920 --> 00:54:25,170 What do you think people said? 1187 00:54:25,170 --> 00:54:26,460 So this is an Australian nurse who 1188 00:54:26,460 --> 00:54:28,350 recorded experiences from palliative care, 1189 00:54:28,350 --> 00:54:32,130 as end-of-life experiences, and asked people about-- 1190 00:54:32,130 --> 00:54:34,440 or she did not simply ask, but she was just recording 1191 00:54:34,440 --> 00:54:35,430 what people were saying. 1192 00:54:35,430 --> 00:54:37,240 What are the top five regrets? 1193 00:54:37,240 --> 00:54:38,490 What would you think they are? 1194 00:54:43,327 --> 00:54:45,285 AUDIENCE: Not spending enough time with family? 1195 00:54:58,390 --> 00:54:59,390 FRANK SCHILBACH: Mm-hmm. 1196 00:54:59,390 --> 00:55:01,265 Wished I had worked less, somebody else says. 1197 00:55:14,480 --> 00:55:15,950 Good enough. 1198 00:55:15,950 --> 00:55:20,570 So essentially, what people tend to say is a combination of they 1199 00:55:20,570 --> 00:55:22,950 wished they had been more social, 1200 00:55:22,950 --> 00:55:26,030 they wished they had expressed their feelings more, 1201 00:55:26,030 --> 00:55:30,230 they had been more true to themselves in some ways and not 1202 00:55:30,230 --> 00:55:31,910 done things that others wanted them 1203 00:55:31,910 --> 00:55:37,020 to do, but rather spent more time with their friends and family 1204 00:55:37,020 --> 00:55:37,740 and so on. 1205 00:55:37,740 --> 00:55:39,230 Now you want to be a little bit careful with this. 1206 00:55:39,230 --> 00:55:41,147 Again, there are issues with remembered utility. 1207 00:55:41,147 --> 00:55:45,870 There are also issues with, if you are alone at the end of your life, 1208 00:55:45,870 --> 00:55:48,090 you might be quite unhappy because of that. 1209 00:55:48,090 --> 00:55:50,625 Be careful how to interpret that. 1210 00:55:50,625 --> 00:55:53,000 But I think we can actually learn quite a bit from this. 1211 00:55:53,000 --> 00:55:55,040 In some ways, if you think about certain choices 1212 00:55:55,040 --> 00:55:58,700 you make in life, understanding some of these issues 1213 00:55:58,700 --> 00:56:01,170 seems to be quite important overall.
1214 00:56:01,170 --> 00:56:03,320 And when you think about what you should maximize, 1215 00:56:03,320 --> 00:56:08,361 it seems that maximizing income, if that comes 1216 00:56:08,361 --> 00:56:10,850 at the cost of not having friends, 1217 00:56:10,850 --> 00:56:13,220 is not a good idea if you want 1218 00:56:13,220 --> 00:56:15,447 to be happy in the long run. 1219 00:56:15,447 --> 00:56:17,030 Another thing that people-- by the way, 1220 00:56:17,030 --> 00:56:20,250 this is not in here-- tend to emphasize 1221 00:56:20,250 --> 00:56:23,150 quite a lot is having a meaningful job 1222 00:56:23,150 --> 00:56:25,700 or having work that they believe in, in terms of something 1223 00:56:25,700 --> 00:56:28,730 meaningful as opposed to maximizing income, 1224 00:56:28,730 --> 00:56:33,130 which seems quite important. 1225 00:56:33,130 --> 00:56:33,880 OK. 1226 00:56:33,880 --> 00:56:36,550 So then what kinds of things could 1227 00:56:36,550 --> 00:56:38,350 you do to make yourself happier? 1228 00:56:38,350 --> 00:56:41,538 And I have a few things written down here. 1229 00:56:41,538 --> 00:56:43,330 So one of the things that I really recommend is to 1230 00:56:43,330 --> 00:56:46,390 invest in and maintain social relationships. 1231 00:56:46,390 --> 00:56:48,820 Small acts can make big differences, 1232 00:56:48,820 --> 00:56:51,670 as we talked about, like letters of gratitude 1233 00:56:51,670 --> 00:56:52,930 or random acts of kindness. 1234 00:56:52,930 --> 00:56:55,300 These are very small things that you could do. 1235 00:56:55,300 --> 00:56:58,552 Of course, you can also do really important things. 1236 00:56:58,552 --> 00:57:00,760 It matters quite a lot how much you help your friends 1237 00:57:00,760 --> 00:57:03,670 or how much you invest in others. 1238 00:57:03,670 --> 00:57:06,130 Again, that's partially reflective of just being 1239 00:57:06,130 --> 00:57:08,380 pro-social and being nice to others, 1240 00:57:08,380 --> 00:57:10,990 but you can think of this very much as-- 1241 00:57:10,990 --> 00:57:12,990 selfishly, in some ways, you can think of this 1242 00:57:12,990 --> 00:57:17,150 also as investing in your own happiness in the future. 1243 00:57:17,150 --> 00:57:19,750 Choosing meaningful work over money seems really important. 1244 00:57:22,525 --> 00:57:24,400 If you think about what to do with your life, 1245 00:57:24,400 --> 00:57:27,190 there's so much talent at MIT, and I sometimes 1246 00:57:27,190 --> 00:57:29,710 wish that talent would go more into meaningful things, 1247 00:57:29,710 --> 00:57:32,770 like making society better 1248 00:57:32,770 --> 00:57:36,670 overall, as opposed to some perhaps potentially 1249 00:57:36,670 --> 00:57:39,970 [INAUDIBLE] that pay you a lot of money. 1250 00:57:39,970 --> 00:57:42,820 That's, in some sense, potentially socially optimal, 1251 00:57:42,820 --> 00:57:44,470 in terms of just using MIT students' talents 1252 00:57:44,470 --> 00:57:47,320 for useful things in the world. 1253 00:57:47,320 --> 00:57:49,780 But if you're just selfish and want 1254 00:57:49,780 --> 00:57:52,360 to make yourself happy 20 years from now, 1255 00:57:52,360 --> 00:57:55,260 it seems to me that just maximizing income is not 1256 00:57:55,260 --> 00:57:56,510 what's going to get you there.
1257 00:57:56,510 --> 00:57:59,080 So it seems like being rich is certainly 1258 00:57:59,080 --> 00:58:01,190 glamorous and exciting, but in fact, 1259 00:58:01,190 --> 00:58:05,320 if that comes at the cost of your health, [INAUDIBLE], 1260 00:58:05,320 --> 00:58:07,810 and your friendships, most likely, 1261 00:58:07,810 --> 00:58:09,940 or at the cost of having a meaningful job 1262 00:58:09,940 --> 00:58:12,370 that you actually believe in and think is useful, 1263 00:58:12,370 --> 00:58:14,600 that's not going to make you happy in the long run. 1264 00:58:14,600 --> 00:58:16,990 So for those of you thinking about what kind of work 1265 00:58:16,990 --> 00:58:20,930 to pursue in the future, keep that in mind. 1266 00:58:20,930 --> 00:58:22,445 And then I have two more things here 1267 00:58:22,445 --> 00:58:23,945 in my list, which are seeking support 1268 00:58:23,945 --> 00:58:27,340 to improve your mental health and reducing social media usage. 1269 00:58:27,340 --> 00:58:28,490 So what do I mean by that? 1270 00:58:28,490 --> 00:58:30,580 So psychotherapies have very much 1271 00:58:30,580 --> 00:58:34,220 been shown to be effective in reducing depression, anxiety, 1272 00:58:34,220 --> 00:58:34,900 and so on. 1273 00:58:34,900 --> 00:58:37,380 Yet people tend to not make use of those services. 1274 00:58:37,380 --> 00:58:38,830 Now why is that? 1275 00:58:38,830 --> 00:58:41,490 There are obvious reasons, which are stigma, shame, and so on. 1276 00:58:41,490 --> 00:58:44,030 There are also potentially misperceptions, projection 1277 00:58:44,030 --> 00:58:44,530 bias. 1278 00:58:44,530 --> 00:58:46,910 Also depression itself, for example, 1279 00:58:46,910 --> 00:58:49,878 could precisely generate these kinds of beliefs. 1280 00:58:49,878 --> 00:58:51,670 There could also be other behavioral biases, 1281 00:58:51,670 --> 00:58:53,878 like with other health conditions, where people just don't 1282 00:58:53,878 --> 00:58:55,490 like to see doctors and so on. 1283 00:58:55,490 --> 00:58:57,170 Now another way to view this is to say, 1284 00:58:57,170 --> 00:58:59,372 well, you might think that psychotherapy 1285 00:58:59,372 --> 00:59:01,330 is really important for depression and anxiety, 1286 00:59:01,330 --> 00:59:04,300 but psychotherapy could also be viewed as a coach 1287 00:59:04,300 --> 00:59:05,050 to make you happy. 1288 00:59:05,050 --> 00:59:07,323 Just think about this as a happiness coach. 1289 00:59:07,323 --> 00:59:09,490 It might help you figure out your objective function 1290 00:59:09,490 --> 00:59:12,370 in life, what really makes you happy, and how to pursue that. 1291 00:59:12,370 --> 00:59:14,380 And that's regardless of depression, anxiety, 1292 00:59:14,380 --> 00:59:15,380 or any mental disorders. 1293 00:59:15,380 --> 00:59:17,470 That is to say, think about sports. 1294 00:59:17,470 --> 00:59:19,720 In sports, you get a coach to be a better tennis 1295 00:59:19,720 --> 00:59:22,603 player, or soccer player, or whatever sport you do. 1296 00:59:22,603 --> 00:59:24,020 That's a very natural thing to do. 1297 00:59:24,020 --> 00:59:25,420 People help you to do better. 1298 00:59:25,420 --> 00:59:27,922 Now why not have a coach for many other things in life, 1299 00:59:27,922 --> 00:59:29,380 including how to be happier and how 1300 00:59:29,380 --> 00:59:34,117 to lead a life that makes you satisfied?
1301 00:59:34,117 --> 00:59:35,575 In some sense, if you're maximizing 1302 00:59:35,575 --> 00:59:37,617 the wrong objective function, which again, could 1303 00:59:37,617 --> 00:59:40,780 be that you just maximize money over other things, 1304 00:59:40,780 --> 00:59:43,440 if a therapist helped you see that or helped you 1305 00:59:43,440 --> 00:59:48,130 correct that, that could be extremely valuable and helpful. 1306 00:59:48,130 --> 00:59:52,630 And again, that's in the absence of any serious mental illness. 1307 00:59:52,630 --> 00:59:54,220 That's just for the average person. 1308 00:59:54,220 --> 00:59:57,610 Having somebody to talk to to help you optimize your life 1309 00:59:57,610 --> 01:00:00,640 and mental well-being seems to be extremely 1310 01:00:00,640 --> 01:00:04,800 important and worth trying. 1311 01:00:04,800 --> 01:00:06,500 Second, about social media. 1312 01:00:06,500 --> 01:00:09,310 So there's a recent paper that's quite intriguing. 1313 01:00:09,310 --> 01:00:10,810 Or there's actually a couple of papers, 1314 01:00:10,810 --> 01:00:12,270 but let me just mention one, which 1315 01:00:12,270 --> 01:00:17,220 is by Allcott et al., that randomizes paying students 1316 01:00:17,220 --> 01:00:19,360 to stay off Facebook for a month. 1317 01:00:19,360 --> 01:00:22,830 This was before the 2018 midterm election, 1318 01:00:22,830 --> 01:00:27,120 which is, in some sense, perhaps relevant in some ways. 1319 01:00:27,120 --> 01:00:31,260 The timing might matter for some of these results. 1320 01:00:31,260 --> 01:00:33,617 So what they find is that-- you 1321 01:00:33,617 --> 01:00:35,450 have to pay students quite a lot to do this. 1322 01:00:35,450 --> 01:00:37,580 So you have to pay them like $50, 1323 01:00:37,580 --> 01:00:39,800 and otherwise, they're not willing to do this. 1324 01:00:39,800 --> 01:00:44,520 When you do that, people reduce their online activities. 1325 01:00:44,520 --> 01:00:47,460 They also reduce their factual news knowledge, 1326 01:00:47,460 --> 01:00:49,710 political polarization and so on, all kinds of things 1327 01:00:49,710 --> 01:00:51,330 that you might expect. 1328 01:00:51,330 --> 01:00:53,730 Then you see increased subjective well-being, 1329 01:00:53,730 --> 01:00:56,220 both happiness but also reduced depression. 1330 01:00:56,220 --> 01:01:00,300 So students are happier, quite a bit happier, 1331 01:01:00,300 --> 01:01:03,300 by doing that. Moreover, there seem 1332 01:01:03,300 --> 01:01:06,060 to be large, persistent reductions 1333 01:01:06,060 --> 01:01:08,320 in post-experiment Facebook usage. 1334 01:01:08,320 --> 01:01:10,740 That is to say, once the experiment is over, 1335 01:01:10,740 --> 01:01:13,800 lots of students who have been paid to stay off Facebook 1336 01:01:13,800 --> 01:01:16,080 tend to continue doing 1337 01:01:16,080 --> 01:01:19,403 that, which very much seems to say that it's a habit good. 1338 01:01:19,403 --> 01:01:20,820 Perhaps there's also some learning 1339 01:01:20,820 --> 01:01:26,940 involved, as students learn that they're now happier overall. 1340 01:01:26,940 --> 01:01:29,040 Now that always leads to the question, well, 1341 01:01:29,040 --> 01:01:31,950 why are people then on Facebook if it doesn't make them happy? 1342 01:01:31,950 --> 01:01:34,170 And does it have to do with peer pressure? 1343 01:01:34,170 --> 01:01:36,120 Does it have to do with habit formation? 1344 01:01:36,120 --> 01:01:39,360 Does it have to do with self-control problems?
1345 01:01:39,360 --> 01:01:41,950 Does it have to do with biased beliefs and so on? 1346 01:01:41,950 --> 01:01:44,110 But overall, in some ways, I think the reason 1347 01:01:44,110 --> 01:01:46,920 that I show this evidence is, when you think about your life 1348 01:01:46,920 --> 01:01:48,570 or things that you do in your life, 1349 01:01:48,570 --> 01:01:51,030 you want to think about, what are the things that you 1350 01:01:51,030 --> 01:01:51,730 do in your life? 1351 01:01:51,730 --> 01:01:54,660 What are the things that make you happy or not? 1352 01:01:54,660 --> 01:01:57,270 And perhaps experimenting with those kinds of things, 1353 01:01:57,270 --> 01:01:59,520 or just looking at data from studies, 1354 01:01:59,520 --> 01:02:01,638 seems like a very reasonable thing to do. 1355 01:02:01,638 --> 01:02:03,180 So it could well be that social media 1356 01:02:03,180 --> 01:02:05,070 makes you really happy in connecting with your friends. 1357 01:02:05,070 --> 01:02:07,680 In particular right now, you might think that's really helpful, 1358 01:02:07,680 --> 01:02:09,270 because you can stay in touch if you 1359 01:02:09,270 --> 01:02:11,230 need to talk online and so on and so forth, 1360 01:02:11,230 --> 01:02:13,230 and that's truly a thing that's worth doing. 1361 01:02:13,230 --> 01:02:17,850 So that could really be very beneficial for your happiness 1362 01:02:17,850 --> 01:02:20,970 overall, but it might also be that there are 1363 01:02:20,970 --> 01:02:24,780 large costs, coming perhaps from social comparison, 1364 01:02:24,780 --> 01:02:27,060 or just seeing everybody happy on Facebook, 1365 01:02:27,060 --> 01:02:28,680 or whatever people post selectively 1366 01:02:28,680 --> 01:02:33,780 making it look like people are way happier than they actually 1367 01:02:33,780 --> 01:02:34,572 are. 1368 01:02:34,572 --> 01:02:36,780 And then that makes people feel bad about themselves, 1369 01:02:36,780 --> 01:02:39,150 seeing others post overall. 1370 01:02:39,150 --> 01:02:40,770 But I think the reason why I show 1371 01:02:40,770 --> 01:02:42,870 you this is I want you to think about, or be more 1372 01:02:42,870 --> 01:02:45,810 conscious about, what are the things that do make you, 1373 01:02:45,810 --> 01:02:49,980 in fact, happier, and try to invest in those that seem 1374 01:02:49,980 --> 01:02:51,147 to make you happier overall. 1375 01:02:51,147 --> 01:02:52,938 So some of these things you could obviously 1376 01:02:52,938 --> 01:02:53,670 experiment with. 1377 01:02:53,670 --> 01:02:55,295 For example, you can obviously turn off 1378 01:02:55,295 --> 01:02:57,210 your Facebook or Instagram, or whatever, 1379 01:02:57,210 --> 01:02:58,932 very quickly and easily. 1380 01:02:58,932 --> 01:03:00,390 Other things, of course-- the stuff 1381 01:03:00,390 --> 01:03:04,560 that I showed you about the very long run, 1382 01:03:04,560 --> 01:03:13,332 like the Kahneman data here-- of course, 1383 01:03:13,332 --> 01:03:15,040 it's much harder to experiment with that. 1384 01:03:15,040 --> 01:03:18,280 It's harder to say, I'm going to be religious for a month, 1385 01:03:18,280 --> 01:03:21,520 and then see, am I happier? 1386 01:03:21,520 --> 01:03:23,230 But it is worth thinking about, what 1387 01:03:23,230 --> 01:03:25,022 are the things that you're really pursuing, 1388 01:03:25,022 --> 01:03:27,250 and are these things going to be making you happy?
1389 01:03:27,250 --> 01:03:29,028 Not just right now-- and is this just 1390 01:03:29,028 --> 01:03:30,820 doing some things because your friends want 1391 01:03:30,820 --> 01:03:32,980 it, or some other influence that you have-- 1392 01:03:32,980 --> 01:03:35,525 as opposed to, what are the things that you really want, 1393 01:03:35,525 --> 01:03:37,900 and what are the things that are going to make you happy, 1394 01:03:37,900 --> 01:03:41,390 not just this month or next month, but five, 10, 1395 01:03:41,390 --> 01:03:42,555 20 years from now? 1396 01:03:42,555 --> 01:03:43,930 Because a lot of the choices that you 1397 01:03:43,930 --> 01:03:48,940 make in the next couple of years-- 1398 01:03:48,940 --> 01:03:51,580 what jobs 1399 01:03:51,580 --> 01:03:54,880 you choose, what kinds of friendships you have, 1400 01:03:54,880 --> 01:03:56,740 what kind of partners you have, and so on-- 1401 01:03:56,740 --> 01:03:58,810 will be very persistent for a long time 1402 01:03:58,810 --> 01:04:02,520 in terms of determining your long-run happiness. 1403 01:04:02,520 --> 01:04:07,620 And so that's a very useful thing to think about. 1404 01:04:07,620 --> 01:04:09,210 Let me see. 1405 01:04:09,210 --> 01:04:12,690 So what do people think? 1406 01:04:12,690 --> 01:04:14,480 I guess, at some point, hopefully-- 1407 01:04:14,480 --> 01:04:16,710 so Cubek is monitoring the llama situation, I hope. 1408 01:04:18,943 --> 01:04:20,610 We'll see about that, because hopefully, 1409 01:04:20,610 --> 01:04:21,780 that makes people happier. 1410 01:04:21,780 --> 01:04:23,515 But what do people think-- 1411 01:04:23,515 --> 01:04:25,890 why are people on Facebook if it doesn't make them happy? 1412 01:04:25,890 --> 01:04:27,330 So it could be essentially issues 1413 01:04:27,330 --> 01:04:32,340 with present bias or the like, that at any point in time, 1414 01:04:32,340 --> 01:04:35,010 there are some temptations of perhaps posting or the like, 1415 01:04:35,010 --> 01:04:38,730 and then that makes you happy in the very short run. 1416 01:04:38,730 --> 01:04:41,010 But in the long run, that's 1417 01:04:41,010 --> 01:04:44,250 not really helping you overall, and there's essentially 1418 01:04:44,250 --> 01:04:47,865 some self-control problem in some ways. 1419 01:04:47,865 --> 01:04:49,740 In particular right now-- and so to be clear, 1420 01:04:49,740 --> 01:04:51,907 about the study that was done here-- I actually talked 1421 01:04:51,907 --> 01:04:54,660 to [INAUDIBLE]-- some of the authors of the study would 1422 01:04:54,660 --> 01:04:57,940 tell me, if you did the study right now, 1423 01:04:57,940 --> 01:05:00,760 we would find the opposite, because now, in fact, 1424 01:05:00,760 --> 01:05:02,280 Facebook or any other social media 1425 01:05:02,280 --> 01:05:05,363 helps people stay in touch, and there are some positive benefits. 1426 01:05:05,363 --> 01:05:07,530 So what you're saying is, well, presumably, Facebook 1427 01:05:07,530 --> 01:05:11,750 was developed early on, 1428 01:05:11,750 --> 01:05:13,125 or the reason why it's successful 1429 01:05:13,125 --> 01:05:14,875 is because it really helped people connect 1430 01:05:14,875 --> 01:05:16,230 to each other in some ways. 1431 01:05:16,230 --> 01:05:18,480 And people thought there were some benefits from that. 1432 01:05:18,480 --> 01:05:20,640 And that's particularly true if you're not 1433 01:05:20,640 --> 01:05:22,980 in the same place physically.
1434 01:05:22,980 --> 01:05:25,570 And particularly right now, there 1435 01:05:25,570 --> 01:05:28,948 are these benefits overall. 1436 01:05:28,948 --> 01:05:30,490 And then there's a question of, well, 1437 01:05:30,490 --> 01:05:32,430 what are the costs and benefits of that? 1438 01:05:32,430 --> 01:05:40,590 And maybe people misperceive some of these or the like. 1439 01:05:40,590 --> 01:05:42,450 What I'm trying to encourage you to see is, 1440 01:05:42,450 --> 01:05:46,100 well, there are ways to learn about this. 1441 01:05:46,100 --> 01:05:48,260 And Facebook is actually a very good example 1442 01:05:48,260 --> 01:05:50,510 where you could learn very easily, where you just say, 1443 01:05:50,510 --> 01:05:53,000 let's turn it off for a month and see what happens. 1444 01:05:53,000 --> 01:05:54,950 There are also some other activities where you say, 1445 01:05:54,950 --> 01:05:57,656 let's do more sports, or let's get up earlier in the morning, 1446 01:05:57,656 --> 01:05:58,880 or sleep more at night. 1447 01:05:58,880 --> 01:06:02,420 There are some things that are quite easy to experiment with. 1448 01:06:02,420 --> 01:06:04,850 Other stuff, as I said, becoming religious versus not, 1449 01:06:04,850 --> 01:06:07,940 is much trickier to experiment with, or being healthy 1450 01:06:07,940 --> 01:06:08,540 versus not. 1451 01:06:08,540 --> 01:06:13,730 In some sense, that's very much a long-run effect. 1452 01:06:13,730 --> 01:06:16,550 But exactly as you said, people have never actually experienced 1453 01:06:16,550 --> 01:06:18,560 any of this, so how would you know 1454 01:06:18,560 --> 01:06:20,610 whether you're going to be happier? 1455 01:06:20,610 --> 01:06:24,450 There are also some public goods issues and some issues 1456 01:06:24,450 --> 01:06:26,640 of externalities [INAUDIBLE]. 1457 01:06:26,640 --> 01:06:29,280 If all of [AUDIO OUT] are on social media and you are not, 1458 01:06:29,280 --> 01:06:34,410 you're a bit of an outsider, and that will likely make you 1459 01:06:34,410 --> 01:06:35,240 very-- 1460 01:06:35,240 --> 01:06:38,160 or likely not increase your happiness. 1461 01:06:38,160 --> 01:06:42,780 But it might well be that if everybody decides to go off 1462 01:06:42,780 --> 01:06:45,307 social media, then in some sense, 1463 01:06:45,307 --> 01:06:46,890 you don't have to rely on social media 1464 01:06:46,890 --> 01:06:49,140 to go to parties or whatever and so on. 1465 01:06:49,140 --> 01:06:51,217 And so there might also be some coordination 1466 01:06:51,217 --> 01:06:52,050 issues, potentially. 1467 01:06:52,050 --> 01:06:54,720 So conditional on everybody being on Facebook around you, 1468 01:06:54,720 --> 01:06:57,780 you might want to be on Facebook or whatever other social media. 1469 01:06:57,780 --> 01:07:00,520 But if that's not the case, then-- 1470 01:07:00,520 --> 01:07:02,260 so it could be, for every single person-- 1471 01:07:02,260 --> 01:07:05,310 every single person could be happier if everybody else were 1472 01:07:05,310 --> 01:07:07,890 not on social media, but conditional on everybody 1473 01:07:07,890 --> 01:07:11,100 being on social media, that's not the case. 1474 01:07:11,100 --> 01:07:14,555 And what I'm saying here is, we tend to do the same things over 1475 01:07:14,555 --> 01:07:15,180 and over again.
1476 01:07:15,180 --> 01:07:17,820 As Maya was saying, what if we just 1477 01:07:17,820 --> 01:07:22,380 experimented more with this and weighed the costs against the perhaps long-run 1478 01:07:22,380 --> 01:07:22,980 benefits? 1479 01:07:22,980 --> 01:07:25,560 There are default effects, of course, and so on, that 1480 01:07:25,560 --> 01:07:27,000 keep us from experimenting. 1481 01:07:27,000 --> 01:07:29,250 But if you think about it, if you could make yourself 1482 01:07:29,250 --> 01:07:31,960 just a little bit happier every day for many, many years, 1483 01:07:31,960 --> 01:07:34,350 it seems to be worth experimenting a lot. 1484 01:07:34,350 --> 01:07:37,530 I would very much encourage you to go and try out 1485 01:07:37,530 --> 01:07:41,400 new things in life that potentially make you happier. 1486 01:07:41,400 --> 01:07:43,235 [INAUDIBLE] has a nice-- 1487 01:07:43,235 --> 01:07:44,860 I hope the link is still working-- 1488 01:07:44,860 --> 01:07:46,402 a very nice New York Times 1489 01:07:46,402 --> 01:07:50,880 article that argues that [INAUDIBLE] experiment more 1490 01:07:50,880 --> 01:07:54,640 and try to figure out what makes us happier. 1491 01:07:54,640 --> 01:07:56,965 Any other thoughts on social media or Facebook? 1492 01:08:20,899 --> 01:08:21,399 OK. 1493 01:08:21,399 --> 01:08:28,130 So that's all I have for now. 1494 01:08:28,130 --> 01:08:30,020 Next time, we're going to talk about policy. 1495 01:08:30,020 --> 01:08:31,510 That's the last lecture. 1496 01:08:31,510 --> 01:08:35,394 Please do read Thaler and Sunstein. 1497 01:08:35,394 --> 01:08:38,200 Peluk tells me the llama is not here yet, 1498 01:08:38,200 --> 01:08:45,654 nor in my private room, where the llama might otherwise be. 1499 01:08:45,654 --> 01:08:46,779 I'll just sit here and wait. 1500 01:08:46,779 --> 01:08:49,340 I think the announced time is 2:30, 1501 01:08:49,340 --> 01:08:50,573 so that's in four minutes. 1502 01:08:50,573 --> 01:08:52,490 So if you have to run, of course, that's fine. 1503 01:08:52,490 --> 01:08:54,615 Otherwise, I'm just going to stick around and wait. 1504 01:08:54,615 --> 01:08:56,229 If you want to talk more, I'd 1505 01:08:56,229 --> 01:08:59,920 love to know more about why people are not experimenting more. 1506 01:08:59,920 --> 01:09:03,753 I also would love to know more about how people think. 1507 01:09:03,753 --> 01:09:05,170 So one thing I was mentioning here 1508 01:09:05,170 --> 01:09:07,295 is that, while social media might be bad 1509 01:09:07,295 --> 01:09:08,920 and there might be detrimental effects, 1510 01:09:08,920 --> 01:09:12,350 in particular coming from social comparisons, which is to say, 1511 01:09:12,350 --> 01:09:13,270 look, 1512 01:09:13,270 --> 01:09:14,830 you see everybody on Instagram looks 1513 01:09:14,830 --> 01:09:17,870 like they have a glamorous life and are happy all the time. 1514 01:09:17,870 --> 01:09:20,960 You see that, and that might make you feel depressed. 1515 01:09:20,960 --> 01:09:22,654 By the way, that might also lead you 1516 01:09:22,654 --> 01:09:24,920 to [AUDIO OUT] glamorous pictures yourself. 1517 01:09:24,920 --> 01:09:28,040 So you might reinforce the whole issue overall. 1518 01:09:28,040 --> 01:09:30,710 I think there surely are some downsides, actually. 1519 01:09:30,710 --> 01:09:32,240 But there are also potential upsides 1520 01:09:32,240 --> 01:09:35,490 of connecting people online, in particular 1521 01:09:35,490 --> 01:09:37,180 something that I'm trying to work on.
1522 01:09:37,180 --> 01:09:41,920 [AUDIO OUT] if anybody has some small things [INAUDIBLE] could 1523 01:09:41,920 --> 01:09:48,040 do right now online to help people be more connected, 1524 01:09:48,040 --> 01:09:51,660 and less lonely, and less depressed.