[SQUEAKING] [RUSTLING] [CLICKING]

PROFESSOR: Welcome to Lecture 15 of 14.13. Today we're going to talk about utility from beliefs. Overall, lectures 15 and 16 are about utility from beliefs and learning.

In the previous lecture, we talked about attention and the idea that attention might be a limited resource -- you might not have the capacity to attend to as much as you would like. And then we thought a little bit about what kinds of things people are paying attention to, and whether it can be that people systematically attend to the wrong things and miss important things in the world.

Today we talk about a different deviation from, perhaps, optimal beliefs or optimal information acquisition, which is that people might derive utility directly from beliefs and therefore have potential incentives to deceive themselves in certain ways. Next time, we're going to talk more about learning -- systematic deviations from Bayesian learning.

So to summarize today: we're going to talk about utility from beliefs -- people directly deriving utility from beliefs, from what they think about the world, about themselves, what's going to happen in the future, how smart they are, and so on.

On Monday, we're going to talk about non-Bayesian learning -- this idea that people essentially have trouble being Bayesian learners. Being Bayesian is quite hard. You may have taken probability theory and so on and have learned quite a bit. But even as a very smart and educated MIT student, there will be Bayesian learning problems that are way too hard once you have several variables and lots of information. It's just really hard to do these things in your head. And people might just not be very good at it and then use heuristics and biases instead.
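To make concrete the kind of calculation that is hard to do in your head, here is a minimal sketch of the Bayesian benchmark that the recitation covers. The disease prevalence and test accuracy numbers are made up purely for illustration; they are not from the lecture.

```python
# Minimal sketch of Bayesian updating (the benchmark for "optimal learning").
# All numbers below are hypothetical, chosen only for illustration.

def posterior(prior, p_signal_if_true, p_signal_if_false):
    """P(state is true | signal observed), by Bayes' rule."""
    numerator = prior * p_signal_if_true
    return numerator / (numerator + (1 - prior) * p_signal_if_false)

# A condition with 1% prevalence, a test with 95% sensitivity and a 5% false-positive rate:
print(posterior(prior=0.01, p_signal_if_true=0.95, p_signal_if_false=0.05))
# -> about 0.16: even after a positive test, the posterior stays far below 95%,
#    which is exactly the kind of result people find hard to compute in their heads.
```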
And then after that, we're going to talk about projection and attribution bias, which arise, essentially, when people have trouble predicting how they might feel in different states of the world. If you're hungry, it might be hard for you to not think about being hungry, or to imagine how it might feel when you're not hungry. If you're sad, it might be very difficult for you to understand how it feels when you're happy, and so on.

And I should say, on Thursday and Friday -- tomorrow -- you're going to talk in recitation about Bayesian learning, which is sort of like, how do economists or statisticians think you should optimally learn? What's the Bayesian benchmark here? And then on Monday we're going to talk about deviations from that.

OK. So the first big-picture question is: why might people miss information and fail to learn? What are potential reasons? Why might people do that? We talked about two potential issues already last time.

One is, very broadly, that attention is limited. That is to say, there's so much information in the world that we just cannot attend to everything. Right? There are lots of different prices in the world; it's just way too much information for you to attend to everything. And so one specific example that we talked about was inattention to taxes: taxes that are not included in the sales price might be easily missed by people. And we have this very nice paper by Chetty et al. that demonstrated that -- that people essentially systematically under-estimate or under-appreciate taxes that they are exposed to. People pay more attention to taxes that are directly included in the sales price compared to taxes that are only added at the counter. And of course, both of them you have to pay.
And so you really should not under-appreciate those taxes, because that's just expensive for you. We did not get much into why people are doing that, but rather showed the existence of the phenomenon: people seem to be missing those taxes.

Then we talked about reasons why people might systematically -- even with lots of data in front of them -- not update properly, even if they are, in fact, Bayesians. The reason might be that they have the wrong theories of the world. You might think that certain variables are just not important for your well-being or for any important outcomes in the world. And since your attention is limited, you will only collect some information; you only focus your scarce attention and memory on what you think are the important things in the world. If you do that, you might systematically miss or not collect data that would help you improve your theories, and therefore, even in the very long run, even when you get lots of data, you will not update.

Notice that that's quite different from rational inattention. Rational inattention theories would say, well, people's attention is limited, but they focus attention on whatever is most important for them. So it can't really be that they have huge losses from a lapse of attention, because if they did, they would direct their attention to whatever is potentially important. In contrast, the "Learning Through Noticing" paper, the theory paper by Hanna et al. that we discussed, provides a reason why people might not pay attention to stuff that's really important for them. Again, if your theory says that certain aspects or certain pieces of information are not helpful, you would not pay attention to those pieces of information. And if you don't pay attention to those pieces of information, you will never update your own theories, and your own theory will essentially persist for a long time, potentially forever.
We're going to talk now about two other reasons for potentially wrong beliefs. One is anticipatory utility. This is, essentially, that people like to look forward to good things in the world. Suppose you have a vacation coming up six months from now. You might think about this positive event going forward. Essentially, there are good or bad events that might happen in the future, and you might derive utility from those events already now.

Second, there's what people call ego utility, which is that people derive utility from thinking that they are smart, good-looking, and so on. So essentially, they derive utility from thinking positively about themselves.

Now, for both of those sources of utility, people potentially have some incentive to delude themselves and to think that things are perhaps better than they actually are, because that will make them happier. So, at the end of the day, there's a trade-off: having overly positive or optimistic beliefs will make people happier -- they feel good about themselves, they feel good about things happening in the future -- but that may come at the cost of not preparing, or of taking the wrong actions, because people are overly optimistic. Right? One example would be that people might not get tested for certain diseases because they want to think that they're healthy, want to be happy about the future, and want to look forward to a positive future. But of course, if they were getting tested, that would help them take optimal actions -- potentially medication or any other actions reacting to the potential disease that they may have. And for ego utility, for example, if you always think that you are smarter than you actually are, well, that might make you happy. But at the same time, that might come at some cost: you might not prepare properly for some exams or interviews or other things. You might just miss important things, and that might hurt you along the way.
And then finally, the last reason, which we'll talk about next week, is that people might simply be bad at Bayesian learning. There's no utility from beliefs or the like involved there. It's just that a lot of the computations we ask people to do are really hard, and people might be bad at them. And therefore, people use heuristics and biases. Most of the time these heuristics and biases are quite helpful and help people make reasonable decisions. But in some cases they are systematically wrong, and we can then think about some systematic errors and perhaps offer some ways in which you can improve people's decision-making by avoiding these errors.

Now let's talk about utility from beliefs. It's important to understand that economists typically define utility functions over outcomes, such as money, consumption, and health. Think of these as things that people consume one way or the other, or things that we can usually measure at some point. You have money; if you have money, you will spend it on consumption goods. It could be apples and bananas, of course. It could also be haircuts or other services, or health, for example. And I can measure whether you have good or bad health. Usually we think about levels of outcomes -- essentially, three apples or five apples or seven apples are the arguments of the utility function. It could also be -- as we discussed when we talked about reference-dependent utility -- in part deviations from some reference point, or changes over time. But importantly, here, goods and services, essentially, are the arguments of the utility function.

Instead, another source of utility could be beliefs about such outcomes. So now it's about what might happen in the future, or what's happening right now.
And you might directly derive utility from such beliefs. Such utility from beliefs can be a very powerful source of utility. One example would be a high-profile public speech. Suppose you have to give a public speech in front of the entire universe. Of course, you might derive utility directly from that experience itself. You might like it a lot, or you might find it very stressful. So the 10, 20, 30 minutes -- however long the speech is -- might give you positive or negative utility.

But the utility of that experience itself might be dwarfed by the stress or excitement experienced beforehand from anticipating it. That is to say, you might think about it, you might prepare for it, you might worry about it, you might not sleep at night. So there might be lots of stress, or positive excitement and anticipation. If you give the speech sometime in September, every day until September you're going to think about it positively or negatively, and you might derive positive or negative anticipatory utility from it. There's another part of utility that we're not going to talk about: utility from memories. You might afterwards look back and say, oh, that was really nice, and you have a video, and you're going to enjoy it, and so on. We're not going to talk about that, but such backward-looking utility could be quite important as well.

Now, when you think about utility from beliefs, one way to think about it is that this is all in your head. But it turns out that utility from beliefs can actually even affect physical outcomes. In particular, there's a large body of research on placebo effects, where a placebo is defined as a treatment that can help patients merely because they believe it will.
That is to say, if you give people a placebo or a sugar pill -- something that essentially has no active ingredient -- and tell them that this is, in fact, an effective drug for something, that pill itself, relative to control, might have treatment effects. That is often the case for things like pain medication: placebo pills are about a third as effective as actual pain medication, depending, of course, on the dosage.

So that's quite powerful, but then you might say, well, that's only in people's heads, and maybe people just say they're happier or less in pain when, in fact, they're not. But there are also some studies that have found effects on physical symptoms. You give people a placebo pill and, in some measurable ways -- hard measures, not just self-reports -- people are actually doing better physically. So the placebo effect can be, in fact, quite powerful.

So when you now think overall about sources of utility, it's hard to imagine, in fact, any source of utility that's not influenced to some extent by beliefs. Often it's about how you look forward to some things. It's not just about going to a restaurant or on a date or having fun with friends; a lot of it is also just the anticipation of all of that.

OK. So now, more specifically, what do we mean by anticipatory utility? Many emotions in particular are intimately linked to what a person thinks about the future. Think about hope, fear, anxiety, savoring, et cetera -- a lot of that has to do with things that might happen in the future: some events that will happen for sure, and some events that merely could happen but that we worry about, as when people are afraid or anxious about certain things. Those are things that might happen only with a pretty low probability, yet people right now might be quite anxious or worried or afraid.

Now, how do we define and think about anticipatory utility?
Well, it's utility derived now from anticipating events in the future. There's stuff that, in the future, enters your utility function directly. Those future events could be consumption, could be services, could be bad shocks -- anything that will happen in the future. And right now, in the current period, you're already feeling good or bad about something that will happen one or several periods in the future. Anticipatory utility is a prime example of utility from beliefs.

Now, how might anticipatory utility affect behavior? You can think about, broadly speaking, two classes of implications that we are going to consider; there are other potential effects as well.

The first one is that if people have anticipatory utility, it might affect their choice of the timing of certain outcomes. That is to say, if you have a choice of when to experience a certain good, you might want to delay it or not, depending on your anticipatory utility. For example, if I asked you, would you like to go on a vacation next week, six weeks from now, or six months from now? You might say that, in fact, six months from now is preferable, because then you'll have six months to look forward to that experience and be really excited about it. Instead, if it's only next week, well, then it might be fun to go next week -- and if you're discounting, you'd rather have it earlier than later -- but there's not very much time to look forward to it and to savor and be excited about the experience. So anticipatory utility might affect the choice of timing of outcomes. We're going to talk about this in a bit.

Second, and perhaps more consequentially, anticipatory utility may also affect people's beliefs and their information acquisition.
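One standard way to write down this kind of anticipatory utility, in the spirit of Loewenstein-style models, is to let today's instantaneous utility include a term proportional to the discounted utility of future consumption. The functional form and the parameter values below are illustrative assumptions, not notation from the lecture.

```python
# Sketch: felt utility in period t = consumption utility now
#         + alpha * discounted sum of future consumption utilities (anticipation).
# alpha and delta are hypothetical parameters.

def felt_utility(consumption, t, alpha=0.5, delta=0.9):
    current = consumption[t]
    anticipation = sum(delta ** (s - t) * consumption[s]
                       for s in range(t + 1, len(consumption)))
    return current + alpha * anticipation

# A vacation worth 10 utils, three periods from now, is already enjoyed today:
consumption = [0, 0, 0, 10]
print(felt_utility(consumption, t=0))  # 0 + 0.5 * 0.9**3 * 10 = 3.645
```

With alpha = 0, this collapses back to the standard case in which only current consumption matters.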
Here is what I mean by that second channel. If there are some things that I like to look forward to in life -- for example, if I think I'm a healthy person who is going to age happily and live happily until age 90 -- then that anticipatory utility might depend a lot on my beliefs about my health status. And if there were tests or information that I could acquire about HIV, cancer, Huntington's disease, and the like -- very serious potential diseases -- people might under-use or under-acquire that type of information because they want to maintain overly positive beliefs about what's going to happen in the future. That could be quite important, because if people are not getting, for example, cancer screening, then some treatments that perhaps could be started very early might get delayed, and that could be potentially quite costly. We're going to talk about this second.

OK. Let me show you some motivating evidence of anticipatory utility. This is not so much to establish the existence of such utility with a very rigorous test, showing exactly that this is anticipatory utility and how much of it there is, but rather to get you to start thinking about the patterns of behavior generated by it.

So George Loewenstein ingeniously asked undergrads about their hypothetical willingness to pay now to obtain or avoid certain experiences. He asked about the willingness to pay as a function of the amount of time until the experience occurs. All values are normalized relative to students' willingness to pay to obtain the experience right away. So you ask people how much they would pay for the experience right away, 24 hours from now, three days from now, a year from now, five years, 10 years from now. The willingness to pay right now is 1 by definition, and everything is relative to that.

What are the experiences that he asked people about?
There are both pleasant and unpleasant experiences. One of them is receiving a monetary gain or avoiding a monetary loss. There's obtaining a kiss from a movie star of the student's choice. And there is avoiding a non-lethal but very painful electric shock. You can think about why Loewenstein asked about hypothetical willingness to pay: actually implementing some of these would be pretty tough to do.

Now, for these types of experiences, you can ask yourself: what would your hypothetical willingness to pay be, and what would it look like as a function of the timing? Would it go up or go down over time? There are going to be two forces at play: discounting and anticipatory utility. We already talked a lot about discounting, which is that people would rather have positive stuff in the present than in the future, and like to push negative stuff away into the future. So if bad stuff is going to happen, like an electric shock, you'd rather have that 10 years from now than right away.

So the question now is, how does anticipatory utility affect those kinds of choices? I already gave you a situation to start with: if you had a vacation that you could potentially experience, would you rather have it right now, a week from now, six months from now, or 10 years from now? Well, that depends on your anticipatory utility. You might prefer to have it six months from now because that allows you to look forward to it for quite a long time. You might not want to have it 10 years from now. That might be good from the perspective of anticipatory utility, but it's really, really far away, and you'd rather have things in the present than in the future.
So maybe if you take together anticipatory utility and discounting, you'd rather have it an intermediate amount of time from now, which gives you a sort of inverse U-shape of valuations as a function of when the event is going to happen in the future. Let me show you what I mean.

When you look at Loewenstein's examples, the kiss from the movie star in particular is an example of an inverse U; this is a pleasant experience. And what seems to be the case is that people seem to prefer that experience to happen in three hours, 24 hours, three days, or one year from now, rather than immediately. You see the values here are always higher than 1, which is what things are normalized to.

Let me back up for a second. What does this graph show? It shows the time delay on the x-axis: immediately, three hours, 24 hours, three days, one year, 10 years from now. That's when the actual experience is happening. On the y-axis is the proportion of the current value, and it's normalized so that the valuation immediately is 1. Notice that these are willingness to pay to receive the experience or willingness to pay to avoid it, so everything is positive.

Now, for the kiss, we see that the delay of three days from now seems to have the highest valuation. So people like to delay this for a bit. Notice that discounting would not generate this pattern. Discounting would say you want it right away because you really enjoy this experience. Anticipatory utility would say, well, three days from now or even a year from now is better, because now you can savor it, really look forward to it, and be excited about it before it happens. However, 10 years from now, people seem to like it less than right now, presumably because it's really, really far away. So, A, you discount that a lot -- the experience itself gets discounted a lot.
And B, even the anticipatory utility might only kick in nine years from now, because the event is so far away that it's hard to think about it now; and the anticipation that you do experience nine years from now, you might discount as well, because it's really far away in the future. So it seems like people have some preference for the present -- there is some discounting going on -- and there are two forces at play: discounting versus anticipatory utility.

In contrast, when you look at the negative shock, people really seem to dislike having that shock in one year or 10 years from now, compared to having it right away. What's going on here? Well, if you will get the shock in one year or 10 years from now, this is a really unpleasant thing that hangs over your head. So what people want instead is to get it over with right away and not have to think about it too much.

Now, that's, again, sort of the opposite of discounting. If you just had discounting, you would be willing to pay more to avoid the shock if it's right away -- it's really unpleasant right away -- and it's not so bad if it's 10 years from now. So discounting would say the function should be going down, as you see it is for losing $4. And why is that? Well, losing $4 is something people would rather do in the future than in the present, and it's not really a thing that you dwell on; you don't have a lot of anticipatory utility about losing $4 10 years from now. That's not such a hugely important thing. However, getting an electric shock 10 years from now might be really unpleasant. So you really don't want to have that in a year or 10 years from now, having it hang over your head for a long time.
So you'd rather have it right away so that you don't suffer from the negative anticipation, even though from a discounting perspective that's actually worse.

OK. So I already said all of this: subjects prefer a kiss from a movie star in three days rather than now. They also prefer to have a shock now rather than in one year or 10 years. Remember, this is willingness to pay to avoid the situation, so the willingness to pay to avoid the shock right now is lower than the willingness to pay to avoid it in one or 10 years -- the curve is going up.

Now, this contradicts discounting of any kind. With any discounting, positive or negative -- as in, you might have a preference for the present or a preference for the future -- you cannot generate these kinds of patterns. In particular, a pattern that's really hard to generate is these hump-shaped, inverse-U cases, where people most value an intermediate delay rather than the present or the far future. As I said before, with discounting alone, valuations are essentially either increasing or decreasing in the time horizon: positive things you want right away, negative things you'd rather have in the future. And this is not at all what Loewenstein finds.

So what is the natural explanation? People look forward to the kiss, so they delay it to enjoy the anticipation. And it's very unpleasant to anticipate the shock, so they get it over with quickly.

Now, why did Loewenstein choose the kiss example? You can think about this for a second. Well, it is an experience with a high degree of savorability -- something you can think about and look forward to in the future. Loewenstein tries to rule out alternative explanations based on preparation.
You might say, well, kissing a movie star right away might not be great because you want to get ready, shower, sleep well, tell all your friends about it, get some advice on which movie star to pick, and so on. So preparation might be a reason to push the kiss to three days from now. Similarly, if he had asked about a dinner at a restaurant instead, you might say a few days' delay is perfectly reasonable even without anticipatory utility: right away might not be a good time, and you want to pick a date, et cetera, for the dinner. So choosing stuff later because of preparation seems totally reasonable. But that's not what Loewenstein is talking about here; he really is thinking about anticipatory utility. He has some other examples as well, and you could argue that part of what's going on could be preparation, but the argument is that what's really going on is anticipatory utility.

Is preparation a problematic confound for the electric shock in particular? Well, the answer is no, because preparation would induce subjects to prefer the shock later, which is not what Loewenstein finds. Remember, Loewenstein finds that people would like to have the shock right away, as opposed to a year or 10 years from now. Preparation would say, I want it in three days or a year from now instead, but that's not what Loewenstein finds. So even if you're not happy with ruling out the preparation explanation for the kiss from the movie star, preparation will not be able to explain the electric shock example.

OK. There's another example. Loewenstein has this funny thing where he says, well, the electric shock is a little bit of a weird example, so let me find something more realistic. And then, being an academic, what he comes up with is cleaning hamster cages.
He asked subjects how much they'd have to be paid to clean 100 hamster cages. I've never cleaned any hamster cages in my life, but I imagine it'd be quite unpleasant. So think about what kind of pattern you expect to see. What you expect is something similar to the electric shock: people might really not look forward to cleaning hamster cages a while from now and would rather do it right away to avoid the negative anticipation. That would be the pattern.

That is, you'd have to pay them less to do it right away, compared to having to do it a while from now, because the anticipation makes things worse, as it did for the electric shock. And this is what Loewenstein finds. If the payment is made now for cleaning to be performed next week, it's about $30. But if the payment is made now for cleaning to be performed a year from now, the payment has to be higher -- it's $37. And again, discounting, preparation, or the like would predict the opposite: people would rather have it in the future than in the present, which is not what Loewenstein finds. Notice, again, these are amounts you'd have to be paid to do the task: how much would I have to pay you to do it?

Now, again, there are two forces at play here: discounting versus anticipation. How do we think about discounting? We talked about discounting before, and there are three steps to thinking about it. One was the question, what determines a person's instantaneous utility at each point in time? We didn't talk about this very much when we covered time preferences; we were just saying, here's an instantaneous utility function that determines how happy you are at each point in time. What we talked a lot about was, how do people integrate or aggregate those utilities across time?
So you have some utility today, tomorrow, two days, three days from now. We then talked about how you aggregate -- how much weight you put on the different periods. And then we talked about what the person predicts about future utility and behavior, which is the question about sophistication and naivete that we discussed as well.

So as I said, the discounting issues we covered are about point number 2. The sophistication versus naivete issues are more about point number 3. And there, a person's prediction about future utility and behavior was only relevant to the extent that it was helpful or harmful in making good decisions. Right? People didn't derive utility from being sophisticated or naive directly. People derived utility only from their instantaneous utilities, depending on the weight they put on the different periods. Their predictions affected their choices, and their choices, in turn, affected their utility. But it was never the case that people's predictions -- their beliefs -- directly entered the utility function.

Anticipatory utility, instead, is about point number 1. It's about the instantaneous utility function -- what goes into it -- and the idea is that people's beliefs about the future might enter it. That's what we're talking about now. Notice that things like social preferences or reference-dependent preferences are also about number 1. Again, the time preferences we talked about so far were about items number 2 and 3.

Now, what are the interactions between anticipation and discounting? We talked about this already; you can distinguish between pleasant and unpleasant experiences. Anticipatory utility is stronger for events that are closer in time, right? Something that's 10 years from now, you're not that worried about right now.
For something that's tomorrow or two hours from now, you might be really worried right now. Now, for pleasant and savorable experiences, some delay is optimal so that you have a few periods of anticipation. Think about each hour or each day as a period of anticipation. You'd rather have three days than two days than one day of anticipation, because each day counts as a day of anticipation, and you may value each day. But you don't want it to be too far away, because then there's also discounting: you'd rather have things in the present or close to the present than in the future. You don't want stuff to be a year or two years from now, because that's really far away, and then you discount that experience very much.

Notice that you might also discount the anticipatory utility itself. That is to say, suppose you get only three days of anticipatory utility from a nice restaurant or some other nice experience -- only two or three days before that experience are you going to be excited about it. Well, if that experience is 10 years from now, you're going to discount all of that, because it's really far away. And if you're really present-biased or just discount a lot, then you're also going to discount the anticipatory utility. So for savorable, pleasant experiences there's a trade-off between discounting and anticipatory utility, which might give you this inverse-U, hump-shaped pattern.

In contrast, for unpleasant, fearful experiences, you want either to do it immediately to eliminate periods of anticipation -- you might just say, get it out of the way, I don't want any dread, and getting it out of the way essentially minimizes the negative anticipation because you get it over with -- or to put it off as much as possible, for discounting reasons and to weaken the anticipation. That is to say, let's just do it 10 years from now; I might not worry about it for quite a while, and I might also just discount any future anticipation.
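To see how these two forces can combine into the hump shape, here is a small simulation sketch. The functional form and the parameter values (alpha, delta, the size of the event) are illustrative assumptions, not estimates from Loewenstein's data.

```python
# Present value of an event of size u delivered after d periods, for someone who
# both discounts at rate delta and savors/dreads in anticipation with weight alpha.
# Parameters are hypothetical.

def value_of_delay(d, u=10.0, alpha=0.3, delta=0.8):
    # Anticipatory utility felt in each period t before the event, discounted back to today...
    anticipation = sum(delta ** t * alpha * delta ** (d - t) * u for t in range(d))
    # ...plus the discounted utility of the event itself.
    return anticipation + delta ** d * u

for d in range(7):
    print(d, round(value_of_delay(d), 2))
# The values rise and then fall in d: a short delay adds periods of savoring,
# but a long delay discounts both the event and the anticipation away -- the inverse U.
# For an unpleasant event (u < 0), the same two pulls appear: anticipation pushes
# toward d = 0 (get it over with), while discounting pushes toward postponing it;
# which one wins depends on the parameters and the horizon.
```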
758 00:35:06,510 --> 00:35:09,380 And for discounting, if there's stuff in the future, 759 00:35:09,380 --> 00:35:11,480 I'd rather have bad stuff happen in the future 760 00:35:11,480 --> 00:35:13,230 than in the present because it essentially 761 00:35:13,230 --> 00:35:14,760 discounts the future. 762 00:35:14,760 --> 00:35:17,460 And it's good for me to discount bad stuff that's 763 00:35:17,460 --> 00:35:20,690 happening in the future. 764 00:35:20,690 --> 00:35:22,750 So far we talked about the implications 765 00:35:22,750 --> 00:35:25,620 of anticipatory utility for the timing of consumption. 766 00:35:25,620 --> 00:35:26,120 Right? 767 00:35:26,120 --> 00:35:29,980 That is to say, when would you like a certain experience 768 00:35:29,980 --> 00:35:31,720 to happen? 769 00:35:31,720 --> 00:35:36,530 In the immediate present, in a few days, or further away 770 00:35:36,530 --> 00:35:37,480 in the future? 771 00:35:37,480 --> 00:35:40,810 But we took it as a given that something might happen 772 00:35:40,810 --> 00:35:43,300 and then the question was just when. 773 00:35:43,300 --> 00:35:46,990 You could think of some examples where that might be important. 774 00:35:46,990 --> 00:35:51,790 Perhaps, more important overall for people's welfare decisions, 775 00:35:51,790 --> 00:35:55,270 and perhaps also for policy is people's information-gathering 776 00:35:55,270 --> 00:35:56,770 and beliefs. 777 00:35:56,770 --> 00:36:00,025 That is to say, you might get a situation where anticipatory 778 00:36:00,025 --> 00:36:01,540 or ego utility-- 779 00:36:01,540 --> 00:36:02,710 I'm going to show you-- 780 00:36:02,710 --> 00:36:06,580 might affect how much information people gather. 781 00:36:06,580 --> 00:36:07,990 And they might not be as informed 782 00:36:07,990 --> 00:36:10,000 as they could be otherwise because 783 00:36:10,000 --> 00:36:12,940 of those anticipatory utility reasons. 784 00:36:12,940 --> 00:36:15,400 So one question you might ask yourself is, well, 785 00:36:15,400 --> 00:36:17,890 would an individual in a non-strategic setting 786 00:36:17,890 --> 00:36:21,790 with no anticipatory utility ever strictly prefer 787 00:36:21,790 --> 00:36:24,130 to refuse free information? 788 00:36:24,130 --> 00:36:27,310 And it's actually hard to come up with some situations 789 00:36:27,310 --> 00:36:29,680 where people just would refuse information if they're 790 00:36:29,680 --> 00:36:31,360 perfectly neoclassical. 791 00:36:31,360 --> 00:36:33,425 You might say, well people might be overwhelmed. 792 00:36:33,425 --> 00:36:35,050 There's too much information available. 793 00:36:35,050 --> 00:36:38,320 And you might not be able to attend to everything. 794 00:36:38,320 --> 00:36:41,050 But in general, I think the assumption in economics 795 00:36:41,050 --> 00:36:44,450 is, well the information could be useful for something. 796 00:36:44,450 --> 00:36:46,090 You might not even know that right now, 797 00:36:46,090 --> 00:36:47,757 but it might turn out that actually it's 798 00:36:47,757 --> 00:36:49,660 quite interesting information. 799 00:36:49,660 --> 00:36:51,640 And in particular, usually, the assumption 800 00:36:51,640 --> 00:36:53,470 is there's free disposal. 801 00:36:53,470 --> 00:36:54,070 Right? 802 00:36:54,070 --> 00:36:56,260 If the information is not useful, just forget about it. 803 00:36:56,260 --> 00:36:56,830 Don't care about it. 804 00:36:56,830 --> 00:36:57,330 Whatever. 805 00:36:57,330 --> 00:36:59,650 Just write it down somewhere and whatever. 
806 00:36:59,650 --> 00:37:01,500 It might be important in the future. 807 00:37:01,500 --> 00:37:04,270 But really, why worry about whether or not to have it? 808 00:37:04,270 --> 00:37:05,230 It's no problem. 809 00:37:05,230 --> 00:37:09,560 So you would never refuse it directly. 810 00:37:09,560 --> 00:37:12,230 You might not pay for it if you think it's useless, 811 00:37:12,230 --> 00:37:17,970 but you would never refuse to actually get it for free. 812 00:37:17,970 --> 00:37:21,750 In contrast, somebody with anticipatory feelings cannot 813 00:37:21,750 --> 00:37:25,050 ignore information because that information affects his or her 814 00:37:25,050 --> 00:37:25,913 emotions. 815 00:37:25,913 --> 00:37:28,080 If some piece of information 816 00:37:28,080 --> 00:37:31,320 makes you really unhappy, well it's very hard, actually, 817 00:37:31,320 --> 00:37:34,800 to forget about it and ignore it because once it's out there, 818 00:37:34,800 --> 00:37:37,650 once you've heard it, you really can't unhear it 819 00:37:37,650 --> 00:37:39,210 and it's very hard to forget. 820 00:37:39,210 --> 00:37:41,190 In particular, it's hard to forget stuff 821 00:37:41,190 --> 00:37:43,320 that you really care about, that you're 822 00:37:43,320 --> 00:37:44,910 upset about or anxious about. 823 00:37:44,910 --> 00:37:48,340 And so it's really hard to do. 824 00:37:48,340 --> 00:37:52,530 And so one quite interesting example 825 00:37:52,530 --> 00:37:55,140 is Dr. House and Thirteen. 826 00:37:55,140 --> 00:37:58,490 Thirteen is one of his employees. 827 00:37:58,490 --> 00:38:00,450 She's called Thirteen because I think 828 00:38:00,450 --> 00:38:06,440 he had 25 or something interns. 829 00:38:06,440 --> 00:38:09,440 And I think he fired almost all of them 830 00:38:09,440 --> 00:38:12,710 except for Thirteen, who was number 13 of these interns. 831 00:38:12,710 --> 00:38:14,450 He never-- at the beginning, at least-- 832 00:38:14,450 --> 00:38:16,062 bothered to learn people's names. 833 00:38:16,062 --> 00:38:17,270 So he just gave them numbers. 834 00:38:17,270 --> 00:38:19,530 So Thirteen was number 13. 835 00:38:19,530 --> 00:38:22,000 She was an excellent doctor. 836 00:38:22,000 --> 00:38:24,610 It turns out that her family had a history 837 00:38:24,610 --> 00:38:27,310 of Huntington's Disease. 838 00:38:27,310 --> 00:38:34,210 And Thirteen was the prime example, in the show, 839 00:38:34,210 --> 00:38:36,010 of information avoidance. 840 00:38:36,010 --> 00:38:39,840 She did not want to learn about whether she had the disease 841 00:38:39,840 --> 00:38:40,340 or not. 842 00:38:40,340 --> 00:38:42,640 And Huntington's Disease is such 843 00:38:42,640 --> 00:38:45,820 that, if your parents have the disease, your probability 844 00:38:45,820 --> 00:38:48,700 of having the disease yourself is way higher than 845 00:38:48,700 --> 00:38:49,810 in the general population. 846 00:38:49,810 --> 00:38:56,060 It's about 50% overall if one of your parents has the disease.
847 00:38:56,060 --> 00:38:58,930 And so then you might think it's quite valuable to learn about 848 00:38:58,930 --> 00:39:01,960 whether you have the disease or not, for a variety of decisions 849 00:39:01,960 --> 00:39:04,360 that you might make in the future or that are relevant 850 00:39:04,360 --> 00:39:07,780 for the future, ranging from when to retire, 851 00:39:07,780 --> 00:39:10,210 how much to exercise, what education 852 00:39:10,210 --> 00:39:12,700 to get, how much to save, whether to have children 853 00:39:12,700 --> 00:39:15,220 or not. 854 00:39:15,220 --> 00:39:18,940 Now Dr. House was usually pretty irrational 855 00:39:18,940 --> 00:39:21,610 in the choices or decisions 856 00:39:21,610 --> 00:39:26,830 that he made, but he was in fact being very neoclassical here 857 00:39:26,830 --> 00:39:28,690 and saying, you should get this information 858 00:39:28,690 --> 00:39:30,820 because it would be very helpful for you. 859 00:39:30,820 --> 00:39:32,470 He, in fact, went behind her back, 860 00:39:32,470 --> 00:39:35,680 tested her, and sent her the results. 861 00:39:35,680 --> 00:39:39,370 But then Thirteen, at least for quite a while, 862 00:39:39,370 --> 00:39:41,290 did not go to look at these results, 863 00:39:41,290 --> 00:39:44,800 presumably because she wanted to make herself think that she 864 00:39:44,800 --> 00:39:48,520 would be healthy and sort of delude herself in some ways 865 00:39:48,520 --> 00:39:51,790 about being a healthy person and not having this disease, 866 00:39:51,790 --> 00:39:53,950 even though she was not sure. 867 00:39:53,950 --> 00:39:58,120 And so avoiding that information helps 868 00:39:58,120 --> 00:39:59,950 you do that because once you're tested, 869 00:39:59,950 --> 00:40:01,970 you're either positive or negative. 870 00:40:01,970 --> 00:40:04,512 And it's very hard to ignore that information 871 00:40:04,512 --> 00:40:05,470 once you have taken the test. 872 00:40:05,470 --> 00:40:06,970 But before you have been tested, you 873 00:40:06,970 --> 00:40:08,560 could always sort of delude yourself 874 00:40:08,560 --> 00:40:13,500 and make yourself think that you don't have the disease. 875 00:40:13,500 --> 00:40:15,560 Now what is Huntington's Disease? 876 00:40:15,560 --> 00:40:17,840 Let me be a little bit more specific. 877 00:40:17,840 --> 00:40:23,030 It's a degenerative neurological disorder. 878 00:40:23,030 --> 00:40:27,350 It's a very severe and heartbreaking disease. 879 00:40:27,350 --> 00:40:31,930 Essentially, it affects the brain. 880 00:40:31,930 --> 00:40:35,660 It sets in around age 40 and really makes 881 00:40:35,660 --> 00:40:38,060 people very dysfunctional. 882 00:40:38,060 --> 00:40:40,460 And life expectancy is around 60 for somebody 883 00:40:40,460 --> 00:40:42,980 with Huntington's Disease compared to 80 884 00:40:42,980 --> 00:40:44,630 or even higher for people who don't 885 00:40:44,630 --> 00:40:47,270 have Huntington's Disease. 886 00:40:47,270 --> 00:40:50,450 People with a parent with Huntington's Disease 887 00:40:50,450 --> 00:40:53,270 have about a 50% chance of developing Huntington's Disease 888 00:40:53,270 --> 00:40:55,740 themselves. 889 00:40:55,740 --> 00:40:59,390 Since the 1990s, there's been a genetic blood test available. 890 00:40:59,390 --> 00:41:02,510 And so you can provide at-risk individuals with certainty 891 00:41:02,510 --> 00:41:04,830 about whether they will develop it, so you 892 00:41:04,830 --> 00:41:08,420 know for sure either you will develop it or you will not.
893 00:41:08,420 --> 00:41:12,890 And those lab tests cost about $200 to $300, 894 00:41:12,890 --> 00:41:16,493 plus consulting and other costs. 895 00:41:16,493 --> 00:41:18,410 The tests are often covered by insurance, 896 00:41:18,410 --> 00:41:21,350 but most tests are paid out of pocket. 897 00:41:21,350 --> 00:41:25,200 Emily Oster and co-authors of the paper that 898 00:41:25,200 --> 00:41:27,480 is on the reading list were arguing 899 00:41:27,480 --> 00:41:29,880 that people were not using their insurance partially to keep 900 00:41:29,880 --> 00:41:33,780 the test results private. 901 00:41:33,780 --> 00:41:36,750 Importantly, there's no cure and nothing that we can actually 902 00:41:36,750 --> 00:41:39,410 do to mitigate the disease. 903 00:41:39,410 --> 00:41:41,910 So when you think about other diseases, such as HIV or 904 00:41:41,910 --> 00:41:46,020 the like, usually there, there's a key motivation 905 00:41:46,020 --> 00:41:50,550 to learn about whether somebody is positive or negative, 906 00:41:50,550 --> 00:41:54,000 because then they can get antiretroviral treatment 907 00:41:54,000 --> 00:41:54,840 or the like. 908 00:41:54,840 --> 00:41:56,560 Here that's not the case. 909 00:41:56,560 --> 00:41:59,350 So there's no actual cure or behaviors 910 00:41:59,350 --> 00:42:03,270 that you could engage in that could mitigate 911 00:42:03,270 --> 00:42:05,260 the course of the disease. 912 00:42:05,260 --> 00:42:07,320 However, there's other potential behaviors 913 00:42:07,320 --> 00:42:08,520 that you could engage in-- 914 00:42:08,520 --> 00:42:13,230 behaviors that prepare you better and help you deal better with the disease 915 00:42:13,230 --> 00:42:15,490 once it sets in. 916 00:42:15,490 --> 00:42:18,390 So what are these reasons? 917 00:42:18,390 --> 00:42:20,350 Why could such a test be valuable? 918 00:42:20,350 --> 00:42:23,130 Think about this for a second. 919 00:42:23,130 --> 00:42:25,890 There's a number of potential reasons 920 00:42:25,890 --> 00:42:27,630 that you might think about. 921 00:42:27,630 --> 00:42:28,830 I've listed a few. 922 00:42:28,830 --> 00:42:30,570 There's a question about whether or not 923 00:42:30,570 --> 00:42:32,160 you want to have a child. 924 00:42:32,160 --> 00:42:35,160 There's issues about marriage. 925 00:42:35,160 --> 00:42:38,500 Would you like to get married or find a partner? 926 00:42:38,500 --> 00:42:40,000 Or do you want to tell your partner? 927 00:42:40,000 --> 00:42:40,950 And so on. 928 00:42:40,950 --> 00:42:43,410 There's questions about retirement. 929 00:42:43,410 --> 00:42:45,700 When do you want to retire? 930 00:42:45,700 --> 00:42:48,690 For example, if you wanted to travel a lot after retirement. 931 00:42:48,690 --> 00:42:52,140 Well if the disease sets in at age 40, 932 00:42:52,140 --> 00:42:54,510 you might want to do that a lot earlier. 933 00:42:54,510 --> 00:42:57,690 Education-- usually we think that the returns to education 934 00:42:57,690 --> 00:42:59,530 are high. 935 00:42:59,530 --> 00:43:01,710 So it's good news for anybody who is studying. 936 00:43:01,710 --> 00:43:03,780 But usually these returns accrue 937 00:43:03,780 --> 00:43:05,760 over a long period of time. 938 00:43:05,760 --> 00:43:09,480 You're done with your education at age 22 or 25. 939 00:43:09,480 --> 00:43:15,660 You can then start working from age 25 to age 65 940 00:43:15,660 --> 00:43:17,340 or the like or even longer. 941 00:43:17,340 --> 00:43:19,403 And so that's a long time.
942 00:43:19,403 --> 00:43:21,320 But if instead, you have Huntington's Disease, 943 00:43:21,320 --> 00:43:24,470 you might only have until age 40 or 45 or the like. 944 00:43:24,470 --> 00:43:28,080 So the returns to education are way lower. 945 00:43:28,080 --> 00:43:30,440 You might also do things like participation 946 00:43:30,440 --> 00:43:31,710 in clinical research. 947 00:43:31,710 --> 00:43:34,010 Perhaps there's some chance that at some point 948 00:43:34,010 --> 00:43:35,030 there might be a cure. 949 00:43:35,030 --> 00:43:37,050 You maybe want to contribute to that. 950 00:43:37,050 --> 00:43:40,660 So there's lots of potential benefits of knowing, 951 00:43:40,660 --> 00:43:44,090 and that test could really be extremely valuable for you. 952 00:43:44,090 --> 00:43:49,820 And surely the value could be higher than $200 or $300. 953 00:43:49,820 --> 00:43:52,970 Now what of the paper by Oster et al? 954 00:43:52,970 --> 00:43:55,850 So Oster et al. observe a sample of previously 955 00:43:55,850 --> 00:44:01,350 untested at-risk individuals over the course of 10 years. 956 00:44:01,350 --> 00:44:06,500 Why is it useful to have a sample of at-risk individuals? 957 00:44:06,500 --> 00:44:08,720 Well because the base rate, the rate 958 00:44:08,720 --> 00:44:10,550 of people who are not at risk, who 959 00:44:10,550 --> 00:44:13,580 don't have any parents with Huntington's Disease, 960 00:44:13,580 --> 00:44:16,740 that rate is quite low. 961 00:44:16,740 --> 00:44:18,870 So not being tested is not so much a puzzle. 962 00:44:18,870 --> 00:44:21,370 If your chance is like 0.00-something percent, 963 00:44:21,370 --> 00:44:25,260 then it's really not that-- it's quite costly to do. 964 00:44:25,260 --> 00:44:28,690 It's $200, $300, plus hassle cost and so on. 965 00:44:28,690 --> 00:44:30,480 And there's many diseases one could have. 966 00:44:30,480 --> 00:44:31,740 Why test for Huntington's? 967 00:44:31,740 --> 00:44:34,080 It's perfectly reasonable to not test for it. 968 00:44:34,080 --> 00:44:37,560 But here, there are people, these at-risk people, 969 00:44:37,560 --> 00:44:43,818 their chance is 50% to start with. 970 00:44:43,818 --> 00:44:45,110 So there's a lot to learn from. 971 00:44:45,110 --> 00:44:49,470 You go from 50% to either 0% of having it or 100% of having it. 972 00:44:49,470 --> 00:44:53,180 So there's huge changes in your beliefs, potentially. 973 00:44:53,180 --> 00:44:57,350 While if you are a person who is not at risk, 974 00:44:57,350 --> 00:45:01,130 your chance of getting it to start with is like 975 00:45:01,130 --> 00:45:04,910 0.00-something percent, with very high chance you're going 976 00:45:04,910 --> 00:45:05,990 to go to zero from that. 977 00:45:05,990 --> 00:45:08,360 So that's almost no change at all anyway. 978 00:45:08,360 --> 00:45:10,080 And there's only like a tiny, tiny chance 979 00:45:10,080 --> 00:45:12,665 that you're actually positive, in which, of course, 980 00:45:12,665 --> 00:45:14,790 you might want to change your behavior quite a bit. 981 00:45:14,790 --> 00:45:16,540 But the weight on that should be quite low 982 00:45:16,540 --> 00:45:20,570 because the probability of a positive test is really low. 983 00:45:20,570 --> 00:45:27,410 So Oster et al find very low rates of this genetic testing. 984 00:45:27,410 --> 00:45:30,410 Fewer than 10% of individuals pursue predictive testing 985 00:45:30,410 --> 00:45:34,220 during the studies, over this long period of time. 
986 00:45:34,220 --> 00:45:37,310 Many individuals get tested to confirm the disease rather 987 00:45:37,310 --> 00:45:40,590 than actually learning about whether they have it or not. 988 00:45:40,590 --> 00:45:44,810 You might think, what's really important is for the reasons 989 00:45:44,810 --> 00:45:47,070 that I mentioned here-- 990 00:45:47,070 --> 00:45:48,990 it's generating knowledge. 991 00:45:48,990 --> 00:45:53,460 Right, you want to know, should you retire early or not? 992 00:45:53,460 --> 00:45:55,080 For that, you want knowledge. 993 00:45:55,080 --> 00:45:57,580 You want to reveal uncertainty. 994 00:45:57,580 --> 00:45:59,850 So if your chance to have the disease to start with 995 00:45:59,850 --> 00:46:04,170 is 50%, going from 50% to 0% or 50% to 100%, 996 00:46:04,170 --> 00:46:09,210 that's huge potential knowledge that's being created. 997 00:46:09,210 --> 00:46:13,230 Once you think the chances like 99% or 99.9%, 998 00:46:13,230 --> 00:46:16,360 going from 99% to 100%, you actually don't learn that much. 999 00:46:16,360 --> 00:46:18,450 And you shouldn't change your behavior very much 1000 00:46:18,450 --> 00:46:21,345 from confirming that you have the disease. 1001 00:46:21,345 --> 00:46:23,220 Similarly, actually, there's similar evidence 1002 00:46:23,220 --> 00:46:25,020 in studies on HIV, breast cancer, 1003 00:46:25,020 --> 00:46:26,460 and other types of testing. 1004 00:46:26,460 --> 00:46:29,760 People tend to systematically under-test 1005 00:46:29,760 --> 00:46:33,480 even if the chance of having certain diseases is quite high. 1006 00:46:33,480 --> 00:46:36,150 And what's somewhat different for HIV and cancer-- 1007 00:46:36,150 --> 00:46:39,720 any cancer, particularly breast cancer testing, 1008 00:46:39,720 --> 00:46:44,340 there's at least some hope that if a disease gets 1009 00:46:44,340 --> 00:46:46,740 detected early, one can potentially 1010 00:46:46,740 --> 00:46:48,120 do something about it. 1011 00:46:48,120 --> 00:46:53,610 There's some recent controversies or discussions 1012 00:46:53,610 --> 00:46:56,323 on like, is cancer testing actually helpful? 1013 00:46:56,323 --> 00:46:58,740 And are we over-testing in the sense, is the test actually 1014 00:46:58,740 --> 00:47:02,190 helpful in detecting potential diseases? 1015 00:47:02,190 --> 00:47:04,560 But surely for HIV, it would be quite valuable 1016 00:47:04,560 --> 00:47:06,930 if you knew early on whether you have a disease 1017 00:47:06,930 --> 00:47:08,730 or not because then you could essentially 1018 00:47:08,730 --> 00:47:13,910 take ARV, antiretroviral treatment, to help you. 1019 00:47:13,910 --> 00:47:18,450 Now let me show you first who's getting tested. 1020 00:47:18,450 --> 00:47:22,970 So what you see on this graph is on the x-axis investigator 1021 00:47:22,970 --> 00:47:26,240 evaluation of symptoms at the last visit. 1022 00:47:26,240 --> 00:47:28,985 So on the left, these are people who are normal, 1023 00:47:28,985 --> 00:47:30,380 who don't show any symptoms. 1024 00:47:30,380 --> 00:47:32,270 On the right, people have like certain signs 1025 00:47:32,270 --> 00:47:33,270 of Huntington's Disease. 1026 00:47:33,270 --> 00:47:35,240 And everything else in between, this 1027 00:47:35,240 --> 00:47:37,680 is sort of in the middle of that. 1028 00:47:37,680 --> 00:47:41,870 So further to the right means people have more symptoms. 
1029 00:47:41,870 --> 00:47:43,798 In particular, to the very right, 1030 00:47:43,798 --> 00:47:45,590 these are people who have certain signs 1031 00:47:45,590 --> 00:47:47,060 of Huntington's Disease. 1032 00:47:47,060 --> 00:47:49,100 And what we see is, on the y-axis, 1033 00:47:49,100 --> 00:47:51,230 whether people have been tested since their last visit-- 1034 00:47:51,230 --> 00:47:54,380 so there are several visits over time. 1035 00:47:54,380 --> 00:48:00,350 And the raw means are what I want you to focus on. 1036 00:48:00,350 --> 00:48:03,920 This is the upper of those two lines. 1037 00:48:03,920 --> 00:48:06,340 And what we see is essentially that testing rates 1038 00:48:06,340 --> 00:48:08,590 are very, very low. 1039 00:48:08,590 --> 00:48:13,240 The largest are about 5% in this particular sample. 1040 00:48:13,240 --> 00:48:17,350 And that's essentially increasing in the symptoms 1041 00:48:17,350 --> 00:48:18,310 that people have. 1042 00:48:18,310 --> 00:48:20,800 So overall that fraction is low. 1043 00:48:20,800 --> 00:48:22,330 And in particular, for people where 1044 00:48:22,330 --> 00:48:25,810 you would think the value of testing is quite high-- these are 1045 00:48:25,810 --> 00:48:28,810 people who don't necessarily show any symptoms 1046 00:48:28,810 --> 00:48:32,260 and people who have some symptoms but not very 1047 00:48:32,260 --> 00:48:34,210 strong symptoms-- you would think for them 1048 00:48:34,210 --> 00:48:37,150 it would be particularly helpful to get tested 1049 00:48:37,150 --> 00:48:40,160 because they could learn a lot. 1050 00:48:40,160 --> 00:48:42,160 But instead, those are, in fact, the people 1051 00:48:42,160 --> 00:48:43,630 who almost don't test at all. 1052 00:48:43,630 --> 00:48:46,920 The test rates are something like 1% to 3%. 1053 00:48:46,920 --> 00:48:52,150 Now as I said before, for people with certain signs 1054 00:48:52,150 --> 00:48:54,150 of Huntington's Disease, the objective knowledge 1055 00:48:54,150 --> 00:48:56,150 gained from the test is very low, since there's no cure. 1056 00:48:59,250 --> 00:49:01,720 You're almost sure that you have it anyway, so in a way, 1057 00:49:01,720 --> 00:49:03,550 you just really don't learn very much. 1058 00:49:03,550 --> 00:49:06,050 If you have these certain signs of Huntington's Disease plus 1059 00:49:06,050 --> 00:49:09,000 you're genetically predisposed to have the disease, 1060 00:49:09,000 --> 00:49:12,600 the chance of actually having the disease is almost one anyway. 1061 00:49:12,600 --> 00:49:15,550 So the test really does not give you a lot of information. 1062 00:49:15,550 --> 00:49:18,990 However, perhaps, it changes your utility 1063 00:49:18,990 --> 00:49:22,870 from beliefs, which I'm going to talk about in a bit. 1064 00:49:22,870 --> 00:49:24,750 Notice that the perceived probabilities 1065 00:49:24,750 --> 00:49:27,090 of having the disease might be lower even for people 1066 00:49:27,090 --> 00:49:28,540 with certain symptoms. 1067 00:49:28,540 --> 00:49:32,250 So what I'm showing you here is the investigator evaluation. 1068 00:49:32,250 --> 00:49:35,910 This is essentially a doctor or neurologist actually looking 1069 00:49:35,910 --> 00:49:38,640 at the symptoms and assessing, objectively, 1070 00:49:38,640 --> 00:49:40,530 what's your chance of having the disease. 1071 00:49:40,530 --> 00:49:42,840 That's different from what people actually 1072 00:49:42,840 --> 00:49:44,213 think themselves.
1073 00:49:44,213 --> 00:49:45,630 So I guess, A-- what we learned 1074 00:49:45,630 --> 00:49:50,630 here-- is that the testing rates are quite low. 1075 00:49:50,630 --> 00:49:54,900 Second, people systematically under-predict 1076 00:49:54,900 --> 00:49:58,843 their chances of having the disease, or are overoptimistic about not having it. 1077 00:49:58,843 --> 00:50:00,260 What I'm showing you here 1078 00:50:00,260 --> 00:50:02,120 is a motor score, which essentially 1079 00:50:02,120 --> 00:50:05,488 is an assessment of how good your motor skills are. 1080 00:50:05,488 --> 00:50:07,655 And one unfortunate thing about Huntington's Disease 1081 00:50:07,655 --> 00:50:10,490 is that your motor skills deteriorate a lot. 1082 00:50:10,490 --> 00:50:15,610 And so a high score here essentially means 1083 00:50:15,610 --> 00:50:18,980 your motor skills are bad, you're not doing well, 1084 00:50:18,980 --> 00:50:22,020 and you're more likely to have Huntington's Disease. 1085 00:50:22,020 --> 00:50:24,890 And what you see on the y-axis 1086 00:50:24,890 --> 00:50:27,080 is essentially the probability-- the objective 1087 00:50:27,080 --> 00:50:30,690 and subjective probabilities-- of having the disease. 1088 00:50:30,690 --> 00:50:33,420 The upper line here that you see at the top, 1089 00:50:33,420 --> 00:50:36,980 that line is the actual probability, the estimated 1090 00:50:36,980 --> 00:50:38,300 actual probability. 1091 00:50:38,300 --> 00:50:42,500 And you see that's essentially increasing in the motor scores. 1092 00:50:42,500 --> 00:50:46,200 So people who have almost no symptoms overall, 1093 00:50:46,200 --> 00:50:49,160 they have essentially a 50% chance of having the disease. 1094 00:50:49,160 --> 00:50:50,793 That's the risk to start with. 1095 00:50:50,793 --> 00:50:52,460 But then once people have more symptoms, 1096 00:50:52,460 --> 00:50:54,080 that chance increases. 1097 00:50:54,080 --> 00:50:57,140 And once you have a motor score in the 20s or 30s, 1098 00:50:57,140 --> 00:50:59,840 your objective probability of having the disease 1099 00:50:59,840 --> 00:51:02,010 is close to 1. 1100 00:51:02,010 --> 00:51:04,440 Now in contrast, the second line that you see here 1101 00:51:04,440 --> 00:51:07,770 in the middle, that is people's perceived 1102 00:51:07,770 --> 00:51:08,820 probabilities. 1103 00:51:08,820 --> 00:51:13,050 Those perceived probabilities are vastly lower and far less 1104 00:51:13,050 --> 00:51:15,400 steep than the actual probabilities. 1105 00:51:15,400 --> 00:51:17,790 You see that people's perceived probabilities when 1106 00:51:17,790 --> 00:51:21,000 they have no symptoms are still actually reasonably high, 1107 00:51:21,000 --> 00:51:22,980 and they're pretty close to the truth. 1108 00:51:22,980 --> 00:51:27,720 They're about 40%, compared to something like 50% or 55%. 1109 00:51:27,720 --> 00:51:29,370 So that's actually not that far off, 1110 00:51:29,370 --> 00:51:31,290 perhaps because it's very clear. 1111 00:51:31,290 --> 00:51:33,313 If you have a parent who has the disease, 1112 00:51:33,313 --> 00:51:35,730 everybody knows essentially, and particularly in this kind 1113 00:51:35,730 --> 00:51:39,000 of study, everybody knows that the probability of having 1114 00:51:39,000 --> 00:51:41,932 the disease is high, so perhaps people can't really 1115 00:51:41,932 --> 00:51:43,890 lie to themselves and they know that the chances are 1116 00:51:43,890 --> 00:51:46,230 pretty high if they are untested.
1117 00:51:46,230 --> 00:51:48,660 But then people seem to be largely in denial when 1118 00:51:48,660 --> 00:51:50,100 it comes to these symptoms. 1119 00:51:50,100 --> 00:51:52,770 People have these symptoms, and even when 1120 00:51:52,770 --> 00:51:57,540 they have scores of 20 to 30, their perceived probability 1121 00:51:57,540 --> 00:52:00,070 of having the disease is really only slightly higher. 1122 00:52:00,070 --> 00:52:01,760 It's about 50% or the like. 1123 00:52:01,760 --> 00:52:03,510 And so the people who are about here, well 1124 00:52:03,510 --> 00:52:05,520 their actual probability is much higher. 1125 00:52:05,520 --> 00:52:08,280 And people don't seem to update on information that's 1126 00:52:08,280 --> 00:52:11,070 really unfavorable to them, in terms 1127 00:52:11,070 --> 00:52:14,700 of anticipatory utility, because it would make them 1128 00:52:14,700 --> 00:52:17,260 feel bad. 1129 00:52:17,260 --> 00:52:20,460 Notice that there's actually a significant share of people 1130 00:52:20,460 --> 00:52:24,423 who persistently report 1131 00:52:24,423 --> 00:52:26,590 that there's no chance of having the disease. 1132 00:52:26,590 --> 00:52:31,380 So among people who have a motor score of 20 to 30, 1133 00:52:31,380 --> 00:52:34,845 10% to 20% of people here seem to say, 1134 00:52:34,845 --> 00:52:36,930 I have a zero chance of Huntington's Disease, which 1135 00:52:36,930 --> 00:52:41,190 unfortunately really seems to be delusional in some ways. 1136 00:52:41,190 --> 00:52:43,650 But really, these might be people who really just do not 1137 00:52:43,650 --> 00:52:45,360 want to think about it and are just 1138 00:52:45,360 --> 00:52:46,500 really worried and anxious. 1139 00:52:46,500 --> 00:52:49,800 And they'd rather rule out that possibility 1140 00:52:49,800 --> 00:52:53,710 of being sick for themselves. 1141 00:52:53,710 --> 00:52:56,230 Now one important question then is, 1142 00:52:56,230 --> 00:52:58,620 well, do individuals adjust their behaviors? 1143 00:52:58,620 --> 00:52:59,180 Right? 1144 00:52:59,180 --> 00:53:02,770 So far, we talked about, are people getting tested? 1145 00:53:02,770 --> 00:53:04,330 And the answer is largely no. 1146 00:53:04,330 --> 00:53:08,860 Second, what are people's beliefs about the disease? 1147 00:53:08,860 --> 00:53:11,435 And people seem to essentially be overly optimistic. 1148 00:53:11,435 --> 00:53:13,060 Now there's a question, well, does that 1149 00:53:13,060 --> 00:53:16,180 translate into their behaviors? 1150 00:53:16,180 --> 00:53:20,090 And now this is another slightly complicated graph. 1151 00:53:20,090 --> 00:53:24,220 But essentially what the figure shows 1152 00:53:24,220 --> 00:53:27,220 is coefficients relative to those who tested negative 1153 00:53:27,220 --> 00:53:28,600 for Huntington's Disease. 1154 00:53:28,600 --> 00:53:31,550 Let's focus on the black lines to start with. 1155 00:53:31,550 --> 00:53:34,930 These are people who are certain to have the disease, compared 1156 00:53:34,930 --> 00:53:37,850 to people who tested negative. 1157 00:53:37,850 --> 00:53:39,550 So what we're comparing here is people 1158 00:53:39,550 --> 00:53:43,285 who tested positive compared to people who tested negative.
1159 00:53:43,285 --> 00:53:45,160 And so the coefficients, if they're positive, 1160 00:53:45,160 --> 00:53:46,535 essentially means that people are 1161 00:53:46,535 --> 00:53:48,490 more likely to get divorced, more 1162 00:53:48,490 --> 00:53:51,700 likely to retire, more likely to change their finances recently 1163 00:53:51,700 --> 00:53:55,450 in their preparation. 1164 00:53:55,450 --> 00:53:58,608 They're also more likely to be pregnant. 1165 00:53:58,608 --> 00:54:00,400 So they change, essentially, their behavior 1166 00:54:00,400 --> 00:54:03,298 relative to being negative. 1167 00:54:03,298 --> 00:54:04,840 So this is essentially just comparing 1168 00:54:04,840 --> 00:54:06,760 positive and negative people. 1169 00:54:06,760 --> 00:54:08,620 I should have said already, this is a type 1170 00:54:08,620 --> 00:54:13,450 of setting where experiments are pretty difficult and ethically 1171 00:54:13,450 --> 00:54:17,290 difficult to do, because if people really derive utility 1172 00:54:17,290 --> 00:54:20,560 from information, providing people with information 1173 00:54:20,560 --> 00:54:23,180 could really make them worse off. 1174 00:54:23,180 --> 00:54:26,940 It's quite different from other sort of experiments involving 1175 00:54:26,940 --> 00:54:30,210 information [AUDIO OUT] wrong beliefs 1176 00:54:30,210 --> 00:54:33,780 or sort of biased information about certain outcomes. 1177 00:54:33,780 --> 00:54:35,902 Usually there, you would say, well let's 1178 00:54:35,902 --> 00:54:37,360 provide them with good information. 1179 00:54:37,360 --> 00:54:38,977 Maybe they improve their behavior. 1180 00:54:38,977 --> 00:54:41,310 So there if you provide people with correct information, 1181 00:54:41,310 --> 00:54:43,590 you could only make people better off. 1182 00:54:43,590 --> 00:54:47,100 Here, however, since people derive utility 1183 00:54:47,100 --> 00:54:50,460 from this information, telling them the truth about stuff 1184 00:54:50,460 --> 00:54:52,050 that they perhaps didn't know about 1185 00:54:52,050 --> 00:54:54,270 could really make them worse off and deeply unhappy. 1186 00:54:54,270 --> 00:54:56,980 And so one wants to be very careful with that. 1187 00:54:56,980 --> 00:55:00,120 And therefore, there are not a lot of experiments 1188 00:55:00,120 --> 00:55:01,455 in this space, at least for now. 1189 00:55:05,340 --> 00:55:07,140 So the study here that I'm showing you 1190 00:55:07,140 --> 00:55:09,240 is essentially observational data. 1191 00:55:09,240 --> 00:55:11,100 And in this case, comparing people 1192 00:55:11,100 --> 00:55:12,915 who have tested positive, compared 1193 00:55:12,915 --> 00:55:15,600 to people who tested negative. 1194 00:55:15,600 --> 00:55:17,940 If you look at people who are just tested recently, 1195 00:55:17,940 --> 00:55:19,710 arguably, these two types of people 1196 00:55:19,710 --> 00:55:23,070 are not that different because when 1197 00:55:23,070 --> 00:55:25,800 they made the choice of being tested, they didn't, of course, 1198 00:55:25,800 --> 00:55:28,990 know whether they were positive or negative. 1199 00:55:28,990 --> 00:55:30,990 So that comparison really shows that people 1200 00:55:30,990 --> 00:55:32,470 do adjust their behavior. 1201 00:55:32,470 --> 00:55:34,800 So people who have been tested positive 1202 00:55:34,800 --> 00:55:38,040 are more likely to change their lives in pretty dramatic ways. 
1203 00:55:38,040 --> 00:55:40,350 For example, people are more likely to get divorced, 1204 00:55:40,350 --> 00:55:42,270 more likely to retire, and so on. 1205 00:55:42,270 --> 00:55:45,610 People are also more likely to have a pregnancy. 1206 00:55:45,610 --> 00:55:48,270 I think this might be for women only 1207 00:55:48,270 --> 00:55:52,020 or for couples with their spouse. 1208 00:55:52,020 --> 00:55:55,810 And that seems like an interesting result, in a sense. 1209 00:55:55,810 --> 00:55:58,180 That's not what I would have expected. 1210 00:55:58,180 --> 00:56:03,510 Now that could be in part-- so the authors in the paper itself 1211 00:56:03,510 --> 00:56:09,210 say that the sample that's used to look at pregnancies 1212 00:56:09,210 --> 00:56:11,240 is relatively small. 1213 00:56:11,240 --> 00:56:13,650 So maybe in some sense one needs to be a little cautious 1214 00:56:13,650 --> 00:56:15,820 in interpreting those results. 1215 00:56:15,820 --> 00:56:19,980 It could be that people try to get pregnant or have 1216 00:56:19,980 --> 00:56:22,290 a child when they know about the disease 1217 00:56:22,290 --> 00:56:25,080 because they want to have a child early on. 1218 00:56:25,080 --> 00:56:27,090 There seem to be some technologies with which 1219 00:56:27,090 --> 00:56:30,490 people can, in fact, ensure that their child does not have the disease. 1220 00:56:30,490 --> 00:56:33,210 So you could make sure that doesn't happen. 1221 00:56:33,210 --> 00:56:35,950 And there might be some couples who might say, well, 1222 00:56:35,950 --> 00:56:40,350 even if it's the case that one of the parents 1223 00:56:40,350 --> 00:56:43,702 will die early on, if the person is like 20 years old or the 1224 00:56:43,702 --> 00:56:48,930 like, they might want to have the child early so they can 1225 00:56:48,930 --> 00:56:51,390 spend enough time with the child 1226 00:56:51,390 --> 00:56:54,120 before the onset of the disease. 1227 00:56:54,120 --> 00:56:56,070 But anyway, it seems to be that people-- 1228 00:56:56,070 --> 00:57:00,660 the black lines are relatively clearly above zero. 1229 00:57:00,660 --> 00:57:03,540 These are coefficients that essentially compare the black 1230 00:57:03,540 --> 00:57:07,570 group versus the group that tested negative, 1231 00:57:07,570 --> 00:57:11,580 which is the positive versus negative tested people. 1232 00:57:11,580 --> 00:57:14,550 Now when you look at the uncertain people, 1233 00:57:14,550 --> 00:57:16,410 these are people who, essentially, 1234 00:57:16,410 --> 00:57:18,090 with some probability, are positive 1235 00:57:18,090 --> 00:57:20,730 and with some probability are negative. 1236 00:57:20,730 --> 00:57:22,800 What we would sort of expect is that those people 1237 00:57:22,800 --> 00:57:25,740 are somewhere in between the positive 1238 00:57:25,740 --> 00:57:26,860 and the negative people. 1239 00:57:26,860 --> 00:57:29,910 So you would expect that the gray lines are also positive, 1240 00:57:29,910 --> 00:57:33,950 perhaps not as big as the black lines. 1241 00:57:33,950 --> 00:57:36,360 Instead, you see essentially-- except for the pregnancy 1242 00:57:36,360 --> 00:57:38,790 result, and that doesn't seem to be significant perhaps 1243 00:57:38,790 --> 00:57:42,300 because the sample is small-- 1244 00:57:42,300 --> 00:57:45,500 the gray lines are essentially pretty close to zero.
1245 00:57:45,500 --> 00:57:49,140 So essentially, people who are uncertain about the disease 1246 00:57:49,140 --> 00:57:54,547 behave very similarly to people who are tested negative. 1247 00:57:54,547 --> 00:57:56,505 So essentially, people who have not been tested 1248 00:57:56,505 --> 00:57:58,380 or who are uncertain about the disease, 1249 00:57:58,380 --> 00:58:02,640 they behave as if they were tested negatively 1250 00:58:02,640 --> 00:58:05,970 and likely because they essentially delude themselves 1251 00:58:05,970 --> 00:58:13,260 in some ways, thinking they are negative when there's no action 1252 00:58:13,260 --> 00:58:14,550 that they need to take. 1253 00:58:14,550 --> 00:58:17,340 So that really shows, or the authors 1254 00:58:17,340 --> 00:58:21,450 argue that, essentially the lack of testing 1255 00:58:21,450 --> 00:58:23,970 really changes people's behavior. 1256 00:58:23,970 --> 00:58:26,790 Notice that it's very hard for us to think about welfare-- 1257 00:58:26,790 --> 00:58:28,890 are people better or worse off because of that? 1258 00:58:28,890 --> 00:58:31,140 Because of course, it's a choice to do so. 1259 00:58:31,140 --> 00:58:32,550 And you might say, well, I really 1260 00:58:32,550 --> 00:58:34,620 value thinking about I'm pretty healthy 1261 00:58:34,620 --> 00:58:36,390 and life is going to be good. 1262 00:58:36,390 --> 00:58:38,400 And that's really valuable to people. 1263 00:58:38,400 --> 00:58:42,780 And they take into account that changes in their finance 1264 00:58:42,780 --> 00:58:45,852 and so on would be valuable, but they're not doing this 1265 00:58:45,852 --> 00:58:47,310 because if they were doing it, they 1266 00:58:47,310 --> 00:58:51,090 would have to acknowledge to themselves that they are, 1267 00:58:51,090 --> 00:58:52,620 potentially at least, sick. 1268 00:58:58,810 --> 00:58:59,310 OK. 1269 00:58:59,310 --> 00:59:01,685 So how should we think about these beliefs and behaviors? 1270 00:59:01,685 --> 00:59:03,390 Let me summarize what we have seen. 1271 00:59:03,390 --> 00:59:06,120 So many people don't get tested, despite arguably good reasons 1272 00:59:06,120 --> 00:59:07,710 to do so. 1273 00:59:07,710 --> 00:59:10,440 People are often overoptimistic about the probability 1274 00:59:10,440 --> 00:59:12,990 of not having the disease, despite often fairly 1275 00:59:12,990 --> 00:59:14,970 clear signs that they have it. 1276 00:59:14,970 --> 00:59:17,490 And that really seems to be very much consistent 1277 00:59:17,490 --> 00:59:19,322 with anticipatory utility. 1278 00:59:19,322 --> 00:59:21,030 People want to feel good about themselves 1279 00:59:21,030 --> 00:59:22,290 and about the future. 1280 00:59:22,290 --> 00:59:24,150 And therefore, they're not getting tested 1281 00:59:24,150 --> 00:59:27,270 and therefore they're overly optimistic. 1282 00:59:27,270 --> 00:59:30,420 Such over-optimism then translates into behavior. 1283 00:59:30,420 --> 00:59:32,190 So people react less to the signs 1284 00:59:32,190 --> 00:59:35,400 of likely having the disease than they arguably should. 1285 00:59:35,400 --> 00:59:38,100 In particular, people who are not tested, 1286 00:59:38,100 --> 00:59:40,320 they look a lot in their behavior 1287 00:59:40,320 --> 00:59:43,260 similar to people who have been tested negative. 
1288 00:59:43,260 --> 00:59:47,670 And that's to say, perhaps if they were tested positively, 1289 00:59:47,670 --> 00:59:49,920 they would change their behavior in ways 1290 00:59:49,920 --> 00:59:53,130 that could be useful for them, at least down the road. 1291 00:59:53,130 --> 00:59:55,950 Now what models can explain such behaviors? 1292 00:59:55,950 --> 00:59:58,797 It's important to understand that neoclassical models would not 1293 00:59:58,797 --> 01:00:00,880 generate these facts, because you would say, well, 1294 01:00:00,880 --> 01:00:02,963 you should be tested because that would be useful. 1295 01:00:02,963 --> 01:00:04,560 And in those models there's 1296 01:00:04,560 --> 01:00:06,550 essentially no utility from beliefs. 1297 01:00:06,550 --> 01:00:08,275 So why would you not get tested? 1298 01:00:08,275 --> 01:00:09,900 You might say, well, it's kind of expensive. 1299 01:00:09,900 --> 01:00:13,560 But arguably, the value of doing so should be quite high. 1300 01:00:13,560 --> 01:00:16,980 And the insurance would also pay for it 1301 01:00:16,980 --> 01:00:20,000 if you were not able to afford it. 1302 01:00:20,000 --> 01:00:23,180 Now let me write down or show you a simple model. 1303 01:00:23,180 --> 01:00:26,330 There will be a problem set question on this model 1304 01:00:26,330 --> 01:00:28,470 for you to understand it a little better. 1305 01:00:28,470 --> 01:00:29,900 So the model is extremely simple. 1306 01:00:29,900 --> 01:00:30,950 It has two periods. 1307 01:00:30,950 --> 01:00:34,258 And the relevant outcomes occur in Period 2. 1308 01:00:34,258 --> 01:00:36,050 In Period 2, the decision-maker will 1309 01:00:36,050 --> 01:00:38,822 be negative with probability p. 1310 01:00:38,822 --> 01:00:40,280 So the person essentially 1311 01:00:40,280 --> 01:00:43,220 would be healthy, with probability p of not having the disease. 1312 01:00:43,220 --> 01:00:47,990 And that person will be positive with probability 1-p. 1313 01:00:47,990 --> 01:00:52,580 The instantaneous utilities in Period 2 capture how they will feel. 1314 01:00:52,580 --> 01:00:55,990 Think about Period 2 as being everything in the future. 1315 01:00:55,990 --> 01:00:57,560 So Period 1 is the period when you 1316 01:00:57,560 --> 01:01:00,230 think about potentially getting tested, which is right now 1317 01:01:00,230 --> 01:01:01,970 or this month or the like. 1318 01:01:01,970 --> 01:01:04,700 And Period 2 then is, say, next year, 1319 01:01:04,700 --> 01:01:07,250 everything starting from then-- all of the future. 1320 01:01:07,250 --> 01:01:10,220 Now the instantaneous utilities of this long future period 1321 01:01:10,220 --> 01:01:12,320 are u-minus and u-plus. 1322 01:01:12,320 --> 01:01:16,370 u-minus is if you're negative, so u-minus is the good case. 1323 01:01:16,370 --> 01:01:19,250 And u-plus is if you're positive. 1324 01:01:19,250 --> 01:01:22,130 You should think of u-minus as larger than u-plus. 1325 01:01:22,130 --> 01:01:26,780 It's better to be negative than positive for this disease, 1326 01:01:26,780 --> 01:01:29,370 or in general. 1327 01:01:29,370 --> 01:01:32,890 In Period 1, we're going to look, now, at people's choices. 1328 01:01:32,890 --> 01:01:34,880 We can assume no discounting whatsoever. 1329 01:01:34,880 --> 01:01:36,470 That's just for simplicity. 1330 01:01:36,470 --> 01:01:38,030 We also assume, for now at least, 1331 01:01:38,030 --> 01:01:41,120 that nothing can be done about the person's condition. 1332 01:01:41,120 --> 01:01:43,040 So A-- there's no cure.
1333 01:01:43,040 --> 01:01:45,890 B-- there's also no financial or other adjustment 1334 01:01:45,890 --> 01:01:49,478 that the person might be able to make to mitigate 1335 01:01:49,478 --> 01:01:50,520 the onset of the disease. 1336 01:01:50,520 --> 01:01:52,270 For now we're going to just rule that out. 1337 01:01:52,270 --> 01:01:55,220 This is just about, do you want to find out or not? 1338 01:01:55,220 --> 01:01:57,550 And why or why not? 1339 01:01:57,550 --> 01:02:00,770 Now what would expected utility theory say? 1340 01:02:00,770 --> 01:02:03,080 What is expected utility here? 1341 01:02:03,080 --> 01:02:04,010 Well it's very simple. 1342 01:02:04,010 --> 01:02:06,800 There's essentially no anticipatory utility. 1343 01:02:06,800 --> 01:02:09,710 Expected utility is just what's going to happen in the future. 1344 01:02:09,710 --> 01:02:12,150 With probability p, you get u-minus, which 1345 01:02:12,150 --> 01:02:15,050 is when the person is negative. 1346 01:02:15,050 --> 01:02:18,530 With probability 1 minus p, the person 1347 01:02:18,530 --> 01:02:22,220 is positive and then receives utility u-plus. 1348 01:02:22,220 --> 01:02:24,230 Notice again, there's no discounting. 1349 01:02:24,230 --> 01:02:25,250 This is in Period 2. 1350 01:02:25,250 --> 01:02:27,730 But since there's no discounting, delta or beta 1351 01:02:27,730 --> 01:02:30,320 are essentially just 1. 1352 01:02:30,320 --> 01:02:33,750 Now let's add anticipatory utility to this model. 1353 01:02:33,750 --> 01:02:38,900 Notice that this type of utility can only 1354 01:02:38,900 --> 01:02:40,910 depend on the person's beliefs about 1355 01:02:40,910 --> 01:02:44,180 whether she will have the disease or not, not on what 1356 01:02:44,180 --> 01:02:45,420 actually happens. 1357 01:02:45,420 --> 01:02:45,920 Right? 1358 01:02:45,920 --> 01:02:48,110 So the anticipatory utility depends 1359 01:02:48,110 --> 01:02:51,830 on the perceived probability of having the disease, 1360 01:02:51,830 --> 01:02:54,810 and not on the actual consequences of the disease. 1361 01:02:54,810 --> 01:02:58,520 And since the person in Period 1, who has this anticipatory utility, 1362 01:02:58,520 --> 01:03:02,240 doesn't know for sure whether she is going to have it or not, 1363 01:03:02,240 --> 01:03:05,210 the anticipatory utility will only depend on her beliefs, 1364 01:03:05,210 --> 01:03:08,750 on the probability p, or perceived probability 1365 01:03:08,750 --> 01:03:11,390 p, of being negative. 1366 01:03:11,390 --> 01:03:15,000 Now we're going to make two extreme assumptions 1367 01:03:15,000 --> 01:03:17,810 about the formation of beliefs. 1368 01:03:17,810 --> 01:03:19,830 Number one is beliefs are correct, 1369 01:03:19,830 --> 01:03:22,080 in the sense that the decision-maker will always 1370 01:03:22,080 --> 01:03:25,140 optimally update their beliefs. 1371 01:03:25,140 --> 01:03:28,680 And once you've given the decision-maker a test, 1372 01:03:28,680 --> 01:03:32,550 the person cannot just make up their beliefs in certain ways. 1373 01:03:32,550 --> 01:03:35,070 If the test is positive, the person knows they're positive. 1374 01:03:35,070 --> 01:03:38,200 If it's negative, the person knows that they're negative. 1375 01:03:38,200 --> 01:03:41,070 So p is either 1 or 0 once they have the test. 1376 01:03:41,070 --> 01:03:46,200 Or, without a test, p is correct in the sense that the person knows it's p 1377 01:03:46,200 --> 01:03:51,040 and there are no wrong beliefs about it.
1378 01:03:51,040 --> 01:03:53,500 Second, the decision-maker can choose 1379 01:03:53,500 --> 01:03:55,450 to manipulate her beliefs. 1380 01:03:55,450 --> 01:03:56,950 We're going to start with number one, which 1381 01:03:56,950 --> 01:03:59,830 is essentially the assumption that beliefs are correct. 1382 01:03:59,830 --> 01:04:04,060 Let the anticipatory utility in Period 1 be f of p. 1383 01:04:04,060 --> 01:04:07,120 And recall that p is the probability of being negative. 1384 01:04:07,120 --> 01:04:09,760 Now let's assume, which is obvious in some ways 1385 01:04:09,760 --> 01:04:13,030 and uncontroversial, that u-minus is larger than u-plus. 1386 01:04:13,030 --> 01:04:16,570 So essentially, again, remember u-minus is the utility 1387 01:04:16,570 --> 01:04:17,400 of being negative. 1388 01:04:17,400 --> 01:04:19,810 So utility is higher in the second period 1389 01:04:19,810 --> 01:04:22,060 if you're negative than if you're positive. 1390 01:04:22,060 --> 01:04:25,210 And then f of p-- f is increasing in p. 1391 01:04:25,210 --> 01:04:27,940 That's actually not so obvious, in the sense that it's not obvious 1392 01:04:27,940 --> 01:04:31,630 that going from a certain probability 1393 01:04:31,630 --> 01:04:35,590 to a higher one, the anticipatory utility 1394 01:04:35,590 --> 01:04:37,180 always increases. 1395 01:04:37,180 --> 01:04:38,890 It's a reasonable assumption, but one 1396 01:04:38,890 --> 01:04:41,055 might argue about it. 1397 01:04:41,055 --> 01:04:43,180 But for now, we're going to just assume that f of p 1398 01:04:43,180 --> 01:04:45,040 is increasing in p. 1399 01:04:45,040 --> 01:04:48,673 Now does the person want to know her status? 1400 01:04:48,673 --> 01:04:49,840 Well let's think about this. 1401 01:04:49,840 --> 01:04:54,410 So if she doesn't find out her status, 1402 01:04:54,410 --> 01:04:59,920 her utility is f of p, since her belief, 1403 01:04:59,920 --> 01:05:02,785 without finding out anything else, is just p. 1404 01:05:02,785 --> 01:05:05,350 So f of p is just her anticipatory utility, 1405 01:05:05,350 --> 01:05:06,860 plus the term that we had before, 1406 01:05:06,860 --> 01:05:09,865 which is just the expected utility in the future in Period 1407 01:05:09,865 --> 01:05:16,360 2, which is p times u-minus, plus 1 minus p, times u-plus. 1408 01:05:16,360 --> 01:05:18,220 Now if she finds out her status, she 1409 01:05:18,220 --> 01:05:22,450 might find out that she's either negative or positive. 1410 01:05:22,450 --> 01:05:25,990 If she's negative-- that's with probability p-- 1411 01:05:25,990 --> 01:05:27,910 her utility is the anticipatory part, 1412 01:05:27,910 --> 01:05:29,920 f of 1, because 1413 01:05:29,920 --> 01:05:31,880 she knows for sure that she's negative, 1414 01:05:31,880 --> 01:05:36,100 plus u-minus in the second period. 1415 01:05:36,100 --> 01:05:38,410 All that, again, is with probability p. 1416 01:05:38,410 --> 01:05:42,700 If she's positive, her utility is f of 0, 1417 01:05:42,700 --> 01:05:46,100 since her perceived probability of being negative is 0, 1418 01:05:46,100 --> 01:05:49,600 and then in the second period, you have u-plus. 1419 01:05:49,600 --> 01:05:52,420 Again, that is with probability 1 minus p. 1420 01:05:52,420 --> 01:05:56,500 So now her expected utility, putting these things together, 1421 01:05:56,500 --> 01:06:01,150 is the part here in the back, which is p times u-minus, 1422 01:06:01,150 --> 01:06:04,360 plus 1 minus p, times u-plus.
1423 01:06:04,360 --> 01:06:06,370 That's what we already had above. 1424 01:06:06,370 --> 01:06:09,850 That's just the expected utility of Period 2. 1425 01:06:09,850 --> 01:06:16,480 Plus p times the anticipatory utility after a negative test, which is f of 1, 1426 01:06:16,480 --> 01:06:21,850 plus 1 minus p times the anticipatory utility after a positive test, which is f of 0. 1427 01:06:21,850 --> 01:06:22,480 OK. 1428 01:06:22,480 --> 01:06:25,153 And so now, if you want to figure out, 1429 01:06:25,153 --> 01:06:26,320 does she want to get tested? 1430 01:06:26,320 --> 01:06:30,400 We just have to compare 1 versus 2-- 1431 01:06:30,400 --> 01:06:34,060 or 2 minus 1 if you want; that's the value of getting 1432 01:06:34,060 --> 01:06:35,050 tested. 1433 01:06:35,050 --> 01:06:37,270 Notice that the expected utility part, 1434 01:06:37,270 --> 01:06:38,920 what's going to happen in the future, 1435 01:06:38,920 --> 01:06:40,510 will always be the same. 1436 01:06:40,510 --> 01:06:44,800 We have p u-minus, plus 1 minus p, u-plus. 1437 01:06:44,800 --> 01:06:46,370 We have that here as well. 1438 01:06:46,370 --> 01:06:50,230 The reason for that is because we assumed previously that we 1439 01:06:50,230 --> 01:06:52,043 cannot change the outcome. 1440 01:06:52,043 --> 01:06:52,850 Right? 1441 01:06:52,850 --> 01:06:57,280 So that's going to cancel out when we subtract 1 from 2. 1442 01:06:57,280 --> 01:07:01,210 So what we're going to be left with is essentially 1443 01:07:01,210 --> 01:07:03,910 a comparison of f of p against 1444 01:07:03,910 --> 01:07:10,740 p times f of 1, plus 1 minus p times f of 0. 1445 01:07:10,740 --> 01:07:12,990 When does she reject or seek information? 1446 01:07:12,990 --> 01:07:18,060 Well she's information-averse if p times f of 1, 1447 01:07:18,060 --> 01:07:22,740 plus 1 minus p, times f of 0 is smaller than f of p. 1448 01:07:22,740 --> 01:07:24,750 And if you look at that condition, 1449 01:07:24,750 --> 01:07:26,640 it's exactly the concavity condition. 1450 01:07:26,640 --> 01:07:30,850 If f is concave, that's in fact the definition of concavity. 1451 01:07:30,850 --> 01:07:33,810 So if f is steeper for lower values, 1452 01:07:33,810 --> 01:07:38,310 then the person will be information-averse. 1453 01:07:38,310 --> 01:07:42,150 So if the person really dislikes any suspicion of bad news, 1454 01:07:42,150 --> 01:07:44,610 for example, due to anxiety, and there's not 1455 01:07:44,610 --> 01:07:49,290 much added value of certainty, then the person 1456 01:07:49,290 --> 01:07:53,570 will not get tested and will not seek information. 1457 01:07:53,570 --> 01:07:58,490 In contrast, if that function is convex, 1458 01:07:58,490 --> 01:08:01,760 that is to say, if it's steeper for higher values, 1459 01:08:01,760 --> 01:08:04,400 then the person really values certainty. 1460 01:08:04,400 --> 01:08:07,910 So some suspicion of bad news is not so painful. 1461 01:08:07,910 --> 01:08:09,410 That is to say, essentially, when 1462 01:08:09,410 --> 01:08:13,400 you think about p starting at something like 50%, what you 1463 01:08:13,400 --> 01:08:17,960 want to compare is going from 50% to 100%, versus going from 50% 1464 01:08:17,960 --> 01:08:19,010 to 0%. 1465 01:08:19,010 --> 01:08:22,939 And going from 50% to 100% is more valuable. 1466 01:08:22,939 --> 01:08:27,920 It improves your anticipatory utility by more than going 1467 01:08:27,920 --> 01:08:35,260 from 50% to 0% damages it. 1468 01:08:35,260 --> 01:08:38,240 That's the convex, information-loving case.
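To see this comparison in numbers, here is a minimal sketch of exactly this condition, with two hypothetical shapes for f. These specific functions are assumptions for illustration, not estimated from any data: a concave f makes the person refuse the free test, and a convex f makes her seek it.

```python
import math

# Information-aversion condition from the two-period model:
# p is the probability of being negative (healthy), f(p) is Period-1 anticipatory utility.
# The Period-2 term p*u_minus + (1-p)*u_plus cancels out, since nothing can be done
# about the condition, so only the anticipatory part matters.

def value_of_testing(f, p):
    """Expected anticipatory utility with the free test minus without it."""
    return p * f(1.0) + (1 - p) * f(0.0) - f(p)

p = 0.5
f_concave = math.sqrt           # steep near 0: any suspicion of bad news hurts a lot
f_convex = lambda q: q ** 2     # steep near 1: certainty of good news is very valuable

print(value_of_testing(f_concave, p))  # 0.5 - sqrt(0.5) ~ -0.21 < 0: refuses the test
print(value_of_testing(f_convex, p))   # 0.5 - 0.25      = +0.25 > 0: seeks the test
```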
1469 01:08:38,240 --> 01:08:41,060 So in the convex case, the person will seek information. 1470 01:08:41,060 --> 01:08:44,510 And it's the opposite if the person really dislikes 1471 01:08:44,510 --> 01:08:49,250 any suspicion of bad news 1472 01:08:49,250 --> 01:08:52,910 and there's not much added value of certainty. 1473 01:08:52,910 --> 01:08:56,930 Now that is essentially just a question of, 1474 01:08:56,930 --> 01:08:58,580 what does this f function look like? 1475 01:08:58,580 --> 01:09:00,372 And we're going to try to figure out or try 1476 01:09:00,372 --> 01:09:02,020 to estimate that function. 1477 01:09:02,020 --> 01:09:07,130 Notice that this is a very simple example, where essentially 1478 01:09:07,130 --> 01:09:08,960 the person has no room for-- 1479 01:09:08,960 --> 01:09:11,660 there's no self-delusion or self-deception. 1480 01:09:11,660 --> 01:09:17,029 The person always has to adhere to the truth. 1481 01:09:17,029 --> 01:09:19,910 Now consider instead the possibility 1482 01:09:19,910 --> 01:09:21,890 that the person can manipulate her beliefs 1483 01:09:21,890 --> 01:09:23,810 to make herself feel better. 1484 01:09:23,810 --> 01:09:26,000 Now one question you might ask yourself is, well, 1485 01:09:26,000 --> 01:09:27,560 in the above framework, would you 1486 01:09:27,560 --> 01:09:29,090 want to hold correct beliefs? 1487 01:09:29,090 --> 01:09:31,640 Is there any good reason to hold correct beliefs? 1488 01:09:31,640 --> 01:09:33,439 And the answer is clearly no. 1489 01:09:33,439 --> 01:09:37,899 Well, there's no instrumental value of beliefs here. 1490 01:09:37,899 --> 01:09:38,970 There's no cure. 1491 01:09:38,970 --> 01:09:40,700 You can't really make anything better. 1492 01:09:40,700 --> 01:09:43,880 You can't really adjust your behavior, by assumption, 1493 01:09:43,880 --> 01:09:44,720 currently. 1494 01:09:44,720 --> 01:09:47,660 So why would you ever hold correct beliefs 1495 01:09:47,660 --> 01:09:50,990 if you can make yourself feel better 1496 01:09:50,990 --> 01:09:53,340 by increasing your p? 1497 01:09:53,340 --> 01:09:57,830 So if she could believe for sure that she's HD-negative, she 1498 01:09:57,830 --> 01:10:01,730 would get higher anticipatory utility regardless of whatever happens later, 1499 01:10:01,730 --> 01:10:06,620 because f of 1 is larger than f of 0, and f of 1 1500 01:10:06,620 --> 01:10:08,640 is larger than any f of p. 1501 01:10:08,640 --> 01:10:09,140 Right? 1502 01:10:09,140 --> 01:10:11,375 Essentially f of p is maximized-- 1503 01:10:11,375 --> 01:10:13,190 we assumed that it's increasing-- 1504 01:10:13,190 --> 01:10:15,560 at p equals 1. 1505 01:10:15,560 --> 01:10:18,805 So really in the model that I showed you so far, 1506 01:10:18,805 --> 01:10:20,180 if you could choose your beliefs, 1507 01:10:20,180 --> 01:10:21,980 there's really no reason whatsoever 1508 01:10:21,980 --> 01:10:25,340 not to choose p equal to 1. 1509 01:10:25,340 --> 01:10:27,530 Then there's the question, well, why might you 1510 01:10:27,530 --> 01:10:29,690 not want to choose a belief of 1 anyway? 1511 01:10:29,690 --> 01:10:31,190 And of course, the answer is, well, 1512 01:10:31,190 --> 01:10:33,680 that's because there might be instrumental reasons. 1513 01:10:33,680 --> 01:10:35,240 There might actually be some action 1514 01:10:35,240 --> 01:10:38,670 that you might be able to take that depends on your beliefs. 1515 01:10:38,670 --> 01:10:40,220 So in particular, incorrect beliefs 1516 01:10:40,220 --> 01:10:41,718 can lead to mistaken decisions.
1517 01:10:41,718 --> 01:10:43,385 Again, with incorrect beliefs,
1518 01:10:43,385 --> 01:10:45,450 you can end up making worse decisions.
1519 01:10:45,450 --> 01:10:48,240 So then there's a trade-off.
1520 01:10:48,240 --> 01:10:48,740 Right?
1521 01:10:48,740 --> 01:10:51,330 There's a trade-off between, on the one hand,
1522 01:10:51,330 --> 01:10:53,990 if you have incorrect beliefs, if you're overly optimistic,
1523 01:10:53,990 --> 01:10:55,310 you feel good about it.
1524 01:10:55,310 --> 01:10:57,830 And you're happier than you would be otherwise
1525 01:10:57,830 --> 01:11:00,610 because you get to enjoy anticipatory utility.
1526 01:11:00,610 --> 01:11:02,360 But of course, that might come at the cost
1527 01:11:02,360 --> 01:11:05,040 of making wrong decisions.
1528 01:11:05,040 --> 01:11:09,860 So in that trade-off between decision-making
1529 01:11:09,860 --> 01:11:13,940 and anticipatory utility, in general some overoptimism leads
1530 01:11:13,940 --> 01:11:16,730 to higher utility than full realism.
1531 01:11:16,730 --> 01:11:19,640 That is to say, if you're fully optimizing to start with,
1532 01:11:19,640 --> 01:11:23,570 some overoptimism would make you better off, because essentially
1533 01:11:23,570 --> 01:11:26,228 you get the gain from overoptimism,
1534 01:11:26,228 --> 01:11:28,520 while you're not going to distort your behavior very
1535 01:11:28,520 --> 01:11:30,740 much, precisely because you were optimizing to start with.
1536 01:11:30,740 --> 01:11:38,300 So generally you can show that slight optimism is, in many cases,
1537 01:11:38,300 --> 01:11:42,475 in fact, an optimal thing to do, even if it comes at some cost.
1538 01:11:42,475 --> 01:11:43,850 And of course, the intuition here
1539 01:11:43,850 --> 01:11:46,460 is that the person wants to believe that she's healthy,
1540 01:11:46,460 --> 01:11:47,760 and it makes her feel better.
1541 01:11:47,760 --> 01:11:51,890 So she convinces herself that that's, in fact, the case.
1542 01:11:51,890 --> 01:11:55,400 Now optimal expectations essentially, as I said,
1543 01:11:55,400 --> 01:11:58,670 trade off anticipatory utility against the value of knowledge
1544 01:11:58,670 --> 01:12:00,515 for making correct choices.
1545 01:12:03,190 --> 01:12:06,110 And then, as I said, overly positive beliefs
1546 01:12:06,110 --> 01:12:08,150 are an economically important implication
1547 01:12:08,150 --> 01:12:09,770 of utility from beliefs.
1548 01:12:09,770 --> 01:12:12,740 So once you allow for the idea that there
1549 01:12:12,740 --> 01:12:17,330 might be utility from beliefs, one corollary is
1550 01:12:17,330 --> 01:12:20,570 that people will tend to have overly positive beliefs,
1551 01:12:20,570 --> 01:12:24,230 because you can explain that with utility from beliefs--
1552 01:12:24,230 --> 01:12:26,300 people really have incentives
1553 01:12:26,300 --> 01:12:29,690 to be overly optimistic even if that
1554 01:12:29,690 --> 01:12:35,000 comes at the cost of potentially making worse choices.
1555 01:12:35,000 --> 01:12:37,910 Next I'm going to show you some other evidence
1556 01:12:37,910 --> 01:12:39,402 of overly optimistic beliefs.
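But first, to make that optimal-expectations trade-off concrete, here is a small Python sketch. This is my own illustration; the quadratic decision loss and the particular numbers are purely illustrative assumptions, not part of the model on the slides.

    # Illustrative numbers only: true probability of the good state and a
    # quadratic stand-in for the cost of acting on a distorted belief.
    p_true = 0.6
    f = lambda q: q ** 0.5                      # anticipatory utility of the chosen belief

    def total_utility(p_hat):
        anticipatory = f(p_hat)                 # feels better the more optimistic you are
        decision_loss = (p_hat - p_true) ** 2   # cost of decisions based on a wrong belief
        return anticipatory - decision_loss

    candidates = [i / 1000 for i in range(1001)]
    best = max(candidates, key=total_utility)
    print(best)   # about 0.87: mild overoptimism, above the truth of 0.6 but below 1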
1557 01:12:39,402 --> 01:12:40,860 And then in particular, we're going
1558 01:12:40,860 --> 01:12:46,370 to talk about heuristics and biases, which is again the idea
1559 01:12:46,370 --> 01:12:49,430 that people might not be good Bayesians,
1560 01:12:49,430 --> 01:12:53,510 in the sense that they might not be good at understanding
1561 01:12:53,510 --> 01:12:57,360 probabilities and how to update probabilities accordingly.
1562 01:12:57,360 --> 01:13:01,730 Remember, recitation on Thursday and on Friday
1563 01:13:01,730 --> 01:13:04,900 is going to talk about Bayesian learning.
1564 01:13:04,900 --> 01:13:07,190 So if you want a refresher on that,
1565 01:13:07,190 --> 01:13:09,650 it'll be quite helpful, because on Monday we're
1566 01:13:09,650 --> 01:13:12,790 going to talk about deviations from Bayesian learning.
1567 01:13:12,790 --> 01:13:14,820 Thank you very much.