[SQUEAKING]
[RUSTLING]
[CLICKING]

FRANK SCHILBACH: Welcome, everyone. This is 14.13 Psychology and Economics, also known as Behavioral Economics. My name is Frank Schilbach. I'm a faculty member in the Economics Department, teaching and doing research in behavioral economics, psychology and economics, and development economics. There are syllabi over here in case anybody needs one.

So let me start by introducing ourselves, including myself. As I said, I'm Frank Schilbach. I recently received my economics PhD from Harvard. I'm from Germany, as you may have noticed. I do research at the intersection of development and behavioral economics. In particular, I'm interested in integrating psychological insights to help us better understand the lives of the poor. So I study all sorts of issues related to poverty: how poverty itself affects people's behavior, and how conditions associated with poverty might feed back into people's decision-making, their productivity, or their labor market behaviors, and then perhaps lead to the persistence of poverty through those kinds of effects. I have some work on financial constraints and how they affect people's behavior in terms of thinking about money itself. I have some work on sleep deprivation among the urban poor. I'm thinking about pain and substance abuse and how those might affect people's choices. Most recently, I'm interested in things related to mental health, in particular depression and loneliness, and how those might affect people's well-being and then their behaviors.

I have office hours; you can sign up on my website. Currently, they're on Tuesday afternoons, but if you cannot find a time slot that works for you, please email me and we'll find a time that works out. My assistant is Krista Moody. In case you want to get a hold of me and I'm not available for some reason, you can reach out to her.
So now let me give you an overview of the class. I'm going to start with four things. First, I'm going to tell you what this class is: what is behavioral economics, or what is psychology and economics? What do we mean by that? Is it just putting two fields together, or is it something more specific? I'll argue it's the latter. Then I'm going to give you an example of how you might use behavioral economics, a specific example involving a policy that might affect all of you. And in fact, it will, because we have a specific laptop policy in this class. I'll tell you a little bit about how to think about that, and there will also be a problem set that'll help you think about it a bit more. Then we'll talk about the somewhat boring course logistics. And at the end, we'll do a questionnaire, a quiz, for everybody, in part because we want to know some information about you, and in part because we want to learn a little bit about some decisions that you might make.

OK. Great. So what is psychology and economics? You may have heard of this. The class used to be called economics and psychology, or behavioral economics. For our purposes, those are all the same. Some economists argue that all economics is about behavior, so "behavioral economics" is a bit of a weird term. That's partially why I'm using the name psychology and economics. In some sense, it's broader than that. One definition is, "It's a field of academic research that studies the joint influences of psychological and economic factors on behavior." You could be broader still and say we're trying to integrate insights from not just psychology, but also anthropology, sociology, medicine, psychiatry, and so on, into economics.
The attempt is to make economic models more realistic, and therefore more predictive, to help us understand people's behavior better, and then perhaps to help us make better policies for influencing people's behavior. Now, as I said, that includes medicine, sociology, et cetera. Broadly, we're trying to use insights from those fields to understand what we might be missing by making the fairly stark assumptions that economic models usually make.

That leads me to: what is the standard economic model? In some sense, what we're going to study, to some degree, is deviations from the classical, or standard, economic model. So then, of course, we need to understand what the standard economic model is to start with. That leads me, in part, to the prerequisites of the class: you should have taken at least some economics, because I'm going to talk a lot about deviations from those models. If you haven't taken any economics class before and don't know those models very well, then understanding the deviations will be a little bit tricky.

So for those of you who have taken economics classes, what do you usually assume about people's behavior? What are some of the assumptions that you make about economic behavior?

Yes.

AUDIENCE: Stable preferences.

FRANK SCHILBACH: Stable preferences, yes. So first, well-defined preferences, in the sense that people know what their preferences are. They can state them. And they're also stable, in the sense that if I ask you today whether you would rather have apples or bananas, the answer will not change unless there's new information or your circumstances change. Of course, if I ask you tomorrow and you already have a bunch of apples, and then you say you want bananas, that doesn't mean your preferences have changed. But suppose I ask you today what you would like for lunch tomorrow, and you tell me you want apples rather than bananas.
Then tomorrow you show up, nothing else has changed, and you say, now I want bananas. Well, then your preferences are not stable. That's one assumption, yes. What else do you have? Yeah.

AUDIENCE: More broadly, they're rational.

FRANK SCHILBACH: Yes. And what does that mean?

AUDIENCE: They behave deterministically to optimize some utility function.

FRANK SCHILBACH: Right, exactly. So people essentially optimize some utility function. We say people have a utility function and know what it is. And as you're saying, they maximize it--they optimize in a certain way and don't make mistakes in that maximization process. That's to say, if you tell me you like apples over bananas, and then you choose bananas, something is going wrong in a way we haven't fully understood. It might be that you're just making mistakes, which could be construed as irrational; it's hard to rationalize with a model. What else?

AUDIENCE: Self-interest.

FRANK SCHILBACH: Self-interest, yes. In a lot of models--the easiest model is very narrowly defined self-interest--people care only about themselves. They care about what they consume, and not about what others consume or what others think of them. So, narrowly defined self-interest. Yes?

AUDIENCE: People have the self-control to [? smooth their ?] consumption over time.

FRANK SCHILBACH: Yes. So essentially, perfect self-control. One version of putting that, in some sense going back to what you were saying earlier, is stable preferences. If I tell you I'd like to exercise tomorrow, and that's my preference, tomorrow I'm not going to say, oh, actually, I changed my mind, and now I'm just watching movies, or the like, right?
So that's one version of preferences being stable. But the underlying issue there is self-control problems, in the sense that if I like certain things, or make certain plans for the future, I have the self-control to follow through on those plans. That's usually an assumption, and it then shows up as preferences being stable. What else? Yeah.

AUDIENCE: They prefer consumption today to consumption tomorrow. [INAUDIBLE]

FRANK SCHILBACH: Yeah. So usually, there is a discount factor in terms of how much you like consumption today versus consumption tomorrow. That's in essentially any economic model. Usually, we think of the discount factor as constant: how you trade off today versus tomorrow is the same as how you trade off tomorrow versus two days from now, or a year from now versus a year and a day from now. So usually, there's a constant discount factor for the future--constant discounting, or exponential discounting. What else do we have? Yeah.

AUDIENCE: Risk aversion.

FRANK SCHILBACH: Yes, risk aversion, in the sense that people are risk averse, and people define their preferences over outcomes. They have a utility function that tends to be concave--we'll get to this in a few lectures. The utility function is concave, and it's defined over outcomes, as opposed to over changes in outcomes. And that shows up essentially as a strictly concave utility function, which is risk aversion.

What about information--how do people think about information? How do they use information?

AUDIENCE: Bayesian.

FRANK SCHILBACH: Bayesian. And what does that mean?

AUDIENCE: They update [? their prior. ?]

FRANK SCHILBACH: Exactly. So people essentially are perfect information processors.
You can call them Bayesians. Essentially--for those of you who know it--they use Bayes' rule. Which is to say, if you gave a statistician a problem of how to update beliefs given new information, people are able to do that in their heads. That's a pretty stark assumption. Standard economics does not necessarily assume full information, in the sense that everybody knows everything. But if you give people some information, they update their beliefs accordingly, right? They optimally use new information and form their posterior beliefs based on that new information and on what they believed previously, their priors.

There's another assumption, usually--and I can actually put these up for you--which is that people have no taste for beliefs or information. What do we mean by that? We mean that people use information only to make decisions. If you tell me something about what's going to happen tomorrow, or about my health status, or the like, I use that to make better decisions. If you tell me I'm sick, I'm going to use that information to go to the doctor. But the standard model does not assume that people hold certain beliefs--about whether they're sick or healthy, whether they're smart or not, whether they're good looking or not--because of how those beliefs feel. Usually, the assumption is that information is only used to make better decisions, not that people get utility from beliefs. In contrast, you might say, well, I'm really good looking and smart--maybe smarter than you actually are--and the reason people might believe that is because it makes them feel good about themselves. Ruling that out is usually an assumption that standard economic models make.

I think we mentioned almost everything.
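As a rough illustration of the assumptions just listed, here is a minimal sketch of three standard-model ingredients: a constant (exponential) discount factor, a strictly concave utility function over outcomes, and Bayesian updating. The functional forms and numbers below (log utility, a discount factor of 0.95, the test probabilities) are generic textbook choices assumed only for illustration, not anything specified in the lecture.

```python
# A minimal sketch (not the lecture's notation) of three standard-model ingredients.
import math

DELTA = 0.95  # constant per-period discount factor (assumed value)

def utility(c):
    """Strictly concave utility over outcome levels -- this is what risk aversion looks like."""
    return math.log(c)

def discounted_utility(consumption_stream):
    """Exponentially discounted lifetime utility of a stream c_0, c_1, c_2, ..."""
    return sum(DELTA ** t * utility(c) for t, c in enumerate(consumption_stream))

def bayes_update(prior, p_signal_if_true, p_signal_if_false):
    """Posterior probability of a hypothesis after observing the signal (Bayes' rule)."""
    numerator = p_signal_if_true * prior
    return numerator / (numerator + p_signal_if_false * (1 - prior))

# Lifetime utility of a flat consumption stream under exponential discounting.
print(discounted_utility([100, 100, 100]))

# Constant discounting: the today-vs-tomorrow tradeoff equals the
# year-from-now-vs-year-and-a-day tradeoff (both ratios are just DELTA).
print(DELTA ** 1 / DELTA ** 0, DELTA ** 366 / DELTA ** 365)

# Bayesian updating: a 1% prior of being sick, a test that flags 90% of the sick
# and 5% of the healthy; after a positive result the posterior is only about 15%.
print(bayes_update(prior=0.01, p_signal_if_true=0.90, p_signal_if_false=0.05))
```

The last line also hints at why doing this in your head is hard: with a rare condition, a positive result from a fairly accurate test still leaves only about a 15% chance of being sick, which is not the answer most people's intuition gives.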
To recap: people are Bayesian information processors--essentially processing information optimally--with well-defined and stable preferences; they maximize expected utility, which is, in some sense, rationality if you want; they apply exponential discounting in weighting current and future well-being; they are narrowly self-interested; and they have preferences over final outcomes, not changes. So what you care about is how the weather is today, not how the weather changed between yesterday and today, or today and tomorrow. Some of these terms may seem unfamiliar to you; we'll talk about all of these assumptions, and about how to deviate from them, specifically.

OK. So now that we have those assumptions, what kinds of deviations do you see when you look at the world? One way to think about behavioral economics is, in some sense, to look at the world, try to observe what's going on, see which assumptions might be violated in an important way, and try to improve our models. So when you think about the world and look at these assumptions, can you come up with real-world examples that might violate them?

Yes.

AUDIENCE: The last assumption that you listed, the one about taste for beliefs or information, is violated on a constant basis. People tend to believe what they want to believe.

FRANK SCHILBACH: And do you have a specific example?

AUDIENCE: This is very common as it regards political matters. I'm not going to name a particular topic, for obvious reasons. But people often choose to discount information that goes against their beliefs.

FRANK SCHILBACH: Right. There's quite a bit of recent work on this.
We'll talk a little bit in this class about how political beliefs, or other reasons, might motivate people to believe certain things. One clear example would be climate change, and not just for political reasons. There's some interesting work on what people who live in flood areas, for example, think about climate change. You might say, well, if you live in an area that's potentially affected by floods, you would really want to know about climate change, know what's going on, and try to inform yourself. But what people tend to do is ignore the issue and try to be happy as long as nothing happens. That's one example, yes.

Yes.

AUDIENCE: Some preferences might not be well-defined. So say I have a decision about what I want to eat. I want to eat steak over chicken if I'm presented with the opportunity. But I prefer chicken over pork, and I might also prefer pork over steak.

FRANK SCHILBACH: Right.

AUDIENCE: So it's not always well-defined.

FRANK SCHILBACH: Right. So one part of what you're saying is that preferences might not be well-defined. There are other issues, too. For example, there's an assumption in economics that if you choose between A and B, the availability of C should not matter for your choice between A and B. But often, that's not the case. If I offer you apples and bananas and you can also get cherries, suddenly your choice between apples and bananas might change, even if you don't choose the cherries eventually. Exactly: people's preferences might not be well-defined. They might also not be stable. Any other examples? Yes.

AUDIENCE: Also, for the well-defined preferences, we have information processing constraints.
If I'm given a menu of 100 choices, it's going to be difficult for me to know which one I want.

FRANK SCHILBACH: Right, exactly. People have lots of information around them, and think about processing all of it: if you go to a supermarket, it's impossible to know all the prices and all the goods and make all of those choices. So in some sense, there's an abundance of information everywhere, and we have to figure out how to deal with that. Any other examples? Yes.

AUDIENCE: Some people might have preferences [INAUDIBLE]--the outcome of getting $1,500 would be different if you were initially promised $1,000 versus if you [INAUDIBLE].

FRANK SCHILBACH: Right, exactly. So one part is that people have preferences over final outcomes, not changes, which is to say you might feel about the weather in certain ways depending on how the weather was yesterday. Another version of that is that you evaluate the outcomes you get based on your expectations. If you thought today was going to be really nice, or your day was going to be really good, but then it happened to be not as good for whatever reason--the weather is bad, or something bad happened to you today--the typical assumption would be that you just evaluate the final outcome. It shouldn't matter what you previously thought might happen; you look at what happens at the end, and that's how you evaluate your well-being.

Yes.

AUDIENCE: You might need a commitment device, so that you-- you might avoid buying potato chips because you think you'll eat them. That didn't really make sense. [INAUDIBLE]

FRANK SCHILBACH: Right. So people might, on purpose, restrict their choices in certain ways, right? And this is what gets me, eventually, to the laptop policy. People might essentially know that, in the future, they will make certain choices.
For example, in the laptop case, you might say, I'm going to use a laptop in class with all the best intentions of taking notes and paying attention. But you know that, at the end of the day, you're going to watch the football review from yesterday, or whatever--other stuff comes up, you're going to chat with your friends, and so on and so forth. So now, if you're sophisticated--as in, you know what you're going to do if you make certain choices--you might say, I choose not to even allow myself to use a laptop at all, because I know that essentially I'm going to misbehave in the future. We'll talk about this soon, I think in lecture three or four: it's what we call demand for commitment, which is to say people have demand for restricting their own choices. In any neoclassical model, this doesn't make any sense, because you're going to choose optimally anyway. You're going to make the best choice for yourself in any case--that's an assumption. So why would you ever shut down certain choices? After all, in an emergency, or if something really important comes up in class, you might want to use your laptop regardless of what's happening. Why would you shut that down? Exactly. So demand for commitment is another example.

Yes.

AUDIENCE: People are also not anywhere near perfect Bayesian information processors either. As a general rule, people are pretty bad at updating their beliefs based on new information. And we frequently traffic in 100% and 0% certainties, far more often than what you would expect from a Bayesian information processor.

FRANK SCHILBACH: Right.

AUDIENCE: Practically not [INAUDIBLE].

FRANK SCHILBACH: Exactly. So one summary of that is that Bayes' rule--a lot of this updating behavior--is really hard.
It's actually tough to do. For those of you who do statistics, math, et cetera, you might be able to do it very well, but you're a fairly small share of the population that does. And even among those, there's a bunch of stuff that's really hard to compute in your head. You might be able to write it down and figure it out eventually, but the assumption is usually that people do all of this in their heads, for really complicated problems. And that's clearly not the case in many situations. And that's just among MIT students. There are other populations that are less educated. They might not even know what Bayes' rule is. They might be illiterate, and so on and so forth. So there, the assumption that people perfectly process information is perhaps even more tenuous. Yeah.

AUDIENCE: A lot of models are usually hedonic, but people usually care about what other people are doing as well. So, for example, I may not be willing to take a class. But if my friend takes the class, or gets into the class, then my utility would go up.

FRANK SCHILBACH: Right. So there are two parts to that. This is essentially the assumption of narrowly defined self-interest, of people caring only about themselves or their own consumption. So there are two parts. One is that you might care about your friends. You might give money to your friends. You might help them out, help them with homework, help them in other situations. You might send money to charity, and so on and so forth. So that's caring about other people: the well-being of other people is in your utility function, one way or the other. The second part is what you already discussed as well, which is social influences, call it--essentially peer effects. You care a lot about what other people think about you.
You care a lot about what other people do. People get jealous. People get angry. People get envious. And what they want at the end of the day is very volatile and malleable, essentially, based on influences from others, right? That might show up in all sorts of ways, including peer effects: people do all sorts of things because other people think they're cool, even if it's not necessarily what's best for them if they had to choose on their own.

OK, great. So let me stop here. There are a lot of other examples we could come up with; I have a few here. One is limited self-control. That's one of my favorite pictures about gym attendance, where you say, well, there seem to be people who have preferences for going to the gym. But then there's an escalator, and people are not working out so much. So in some sense, something is perhaps amiss here.

Another--and this is a fun thing to do, though maybe not one of my favorite things--is to look at Google searches for terms like "calories" or "Weight Watchers": what is people's interest in googling those kinds of terms over the course of a year? As you can imagine, what you see are spikes right around January 1st. People start googling calories, they start googling Weight Watchers, and so on and so forth. Now, of course, you can think of models that rationalize this in some sense, saying, well, January is just a good time to start a diet, and so on. But really, the right interpretation is that people, on January 1st, have all the best intentions to lose weight, eat healthier, exercise more, and so on and so forth. And then the year starts, the semester happens, and the good intentions fade away--which is essentially limited self-control.
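Here is a minimal numeric sketch of the "plan to exercise, then don't" pattern just described. The beta-delta (present bias) formulation used below is one common way to formalize limited self-control--it is a later topic in the course--and the specific numbers (effort cost, health benefit, BETA, DELTA) are made up purely for illustration.

```python
# A minimal sketch of a preference reversal under present bias. All numbers are
# illustrative assumptions, not estimates from the lecture.

BETA = 0.5    # extra discounting of everything that is not "right now" (present bias)
DELTA = 1.0   # standard per-period discount factor (kept at 1 to isolate present bias)
EFFORT_COST = 6.0     # disutility of working out, paid when you exercise
HEALTH_BENEFIT = 8.0  # utility of the benefit, arriving one period later

def discounted_value(payoffs_by_delay):
    """Value of a payoff stream from today's perspective.
    payoffs_by_delay[k] is the payoff arriving k periods from now."""
    total = 0.0
    for delay, payoff in payoffs_by_delay.items():
        weight = DELTA ** delay if delay == 0 else BETA * DELTA ** delay
        total += weight * payoff
    return total

# Today, deciding about exercising *tomorrow*: both cost and benefit are in the future.
plan_today = discounted_value({1: -EFFORT_COST, 2: HEALTH_BENEFIT})

# Tomorrow, deciding about exercising *now*: the cost is immediate, the benefit delayed.
choice_tomorrow = discounted_value({0: -EFFORT_COST, 1: HEALTH_BENEFIT})

print(f"Planned value of exercising tomorrow (seen from today): {plan_today:+.1f}")      # +1.0 -> plan to go
print(f"Value of exercising once tomorrow arrives:              {choice_tomorrow:+.1f}")  # -2.0 -> skip it
```

The point of the sketch is only the contrast: an exponential discounter (BETA = 1) evaluates the same tradeoff the same way on both dates, up to a common scale factor, so the ranking can never flip between the planning date and the action date.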
There are some interesting things about demand for information, especially when it comes to health. Every time I ask this--and I've taught this class a few years so far--I realize how old I am. Who knows Dr. House? Oh, not bad. So can anybody explain to me Thirteen and Dr. House and what their deal is? And what does this have to do with demand for information?

So who's Thirteen? Or what are her health issues?

AUDIENCE: She's got a genetic disease, I think.

FRANK SCHILBACH: Yes, she has--

AUDIENCE: She has, like-- I'm not--

FRANK SCHILBACH: Huntington's, yes.

AUDIENCE: Yeah, Huntington's.

FRANK SCHILBACH: Right.

AUDIENCE: He finds out later on.

FRANK SCHILBACH: Right. So she has a disease called Huntington's disease, which we'll also talk more about in the class. It's essentially a brain disease where, over time, your brain more or less degenerates. It's a really bad condition that manifests around age 40 or 50, and it's very serious. It's a genetic disease in the sense that, if your parents have the disease, the chance of you having it is way, way higher than in the general population. There's a test for it, but there's no cure.

Now, the dilemma in the show comes when she is thinking about getting tested. And Dr. House, who is in some ways very rational and in others not so much, essentially encourages her to get that test. Any rational model would say, well, you should really get it. Any neoclassical model would say you should get that test. Why should you get the test? Why is that helpful?

AUDIENCE: More information.

FRANK SCHILBACH: And what is the information helpful for?

AUDIENCE: [? It ?] [? would update ?] [? price ?] [? of action. ?]

FRANK SCHILBACH: Right, exactly. You would say information is good.
Why is information good? Because there are lots of important choices you might make that depend on that information. You might think about how much money you want to save. You might think about taking a vacation, about career choices, about partner choices, about having children, and about all sorts of health behaviors--you might take better care of yourself. There are all sorts of really important decisions that might hinge on it; your life will be dramatically different depending on whether you have Huntington's disease or not.

Now, Thirteen does not want to-- I think she actually does the test, and then doesn't want to see the result. And why is that? Yes.

AUDIENCE: Even if you have the condition, there's no cure for the disease. So it might help your quality of life if you [INAUDIBLE].

FRANK SCHILBACH: So that's one view. But I think, in some sense, as I said, you could make other decisions, like economic choices. You could save more. You could see the doctor more. You could go on more vacations, or whatever--do fun things in life. You could say, well, maybe that's not that important. But the neoclassical, or classical, economics model would still argue that you should get that information. It wouldn't hurt you. You wouldn't want to refuse it. So why is she refusing it? Yes.

AUDIENCE: I think maybe she doesn't want to live the rest of her life knowing she has the disease.

FRANK SCHILBACH: Exactly. So she essentially derives utility from information, if you want--from her beliefs. She likes to tell herself that she's healthy, or that the chance of her being healthy is very high. So, under some assumptions, you might not want to get tested, or not want to see the result of that test. Because once you see the result, if it's positive, it's very hard to keep pretending to yourself that you're healthy, because there's a hard test saying otherwise.
Before seeing the result, she can still try to forget about it and try to live a happy life until the symptoms set in. But that's a clear violation of the assumptions of the classical model that I was showing you above.

The next part is default effects. What I'm showing over here is the fraction of organ donors by country and by type of default. There are often two types of defaults for organ donation. This is from a few years ago, but I think, overall, things have not changed that much. There's opt-in versus opt-out. In the countries on the left, you essentially have an opt-in policy: if you don't do anything, you're not an organ donor. You have to actively declare, in some form, that you want to be an organ donor. On the right, there are opt-out policies, which means essentially that if you don't do anything, you'll be automatically registered or viewed as an organ donor. You can opt out, but only if you do so will you actually be opted out. Otherwise, if there's an accident or the like, you'll be viewed as an organ donor.

Now, what's weird about this graph? Why is this potentially a violation of the neoclassical, or typical, economics assumptions?

Yes.

AUDIENCE: Because, assuming that people have the same preferences about whether they want to opt in or opt out, there should be the same proportion across countries. But the effort required to make that change, or people's indifference, causes the imbalance between the opt-out and the opt-in.

FRANK SCHILBACH: Right, exactly. So some people--I don't want to get in trouble, but some people would argue Germany and Austria are actually not that different. I'm German, so some Austrians might disagree. But you would say, overall, Germans and Austrians are not that different.
And you might think some of their particular preferences differ, but their preferences for organ donation are probably pretty similar across those two countries, which is a reasonable assumption. Second, lots of people care a lot about what happens to them after they die, right? For religious or other reasons, they might care a lot about what happens to their body at the end of the day, and that's obviously their choice to make. Now, if you assume those two things, it's very hard to reconcile this picture. Because opting in or opting out is a very small change to make. It's very simple and easy to do. You just have to fill out some form. Maybe it takes you an hour, maybe two. You might have to go to some office or the like. But if you really cared, you could easily do it. Yet you see huge differences in outcomes based on this very simple, small change--which is very hard to rationalize with a model you could write down. Somehow the default, the way in which the decision is presented, seems to really affect people's behavior in a fundamental way.

OK, the next one is GlowCaps. This is help for people to take their medication, in particular for the elderly--essentially a reminder. You can program it in certain ways, like every day or every few hours, and the GlowCap makes sounds or blinks and so on, and gets you to take your medication. It's a very simple proof of concept that reminders are important and can save people's lives--people have done studies on this. But essentially, it's a very simple proof of concept that memory is limited, right?
If your memory were perfect, if you could remember everything, if you were a perfect information processor, you would not forget your medication--especially if it's important, especially if it's medication that potentially saves your life. Yet we see that people's medication usage is strongly affected by these kinds of policies or products that help people remember. So that's essentially a rejection of perfect memory.

The next one is charity. People, in all sorts of ways, seem to care about others, in the sense of just giving money. Here's one charity called GiveDirectly, which is a very nice charity in the sense that what it does, essentially, is a very simple way of helping the poor: it directly transfers money to poor households in poor countries. On the right, you see an actual recipient from GiveDirectly. That person has a phone. These are mobile-phone-based transfers, where essentially you can send money to people via their mobile phones. So if you decided now to give $100 to that person--it could be that person or a similar person--$90 or $95 of those dollars would, in fact, arrive on that person's phone.

Now, why are people doing this? Presumably because they care about others, one way or the other. It could be that, in some sense, that's part of the utility function: donating money to others makes you happier, and that's essentially a way in which you improve your own well-being--you have others' well-being in your utility function. Or it could be things like what we said previously: you think it's the appropriate thing to do, maybe other people do it, maybe you can tell your friends about it, and so on and so forth. But one way or the other, people give lots of money to charity, which in some sense says that you must care about others overall. So others are in your utility function.
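As a small, hypothetical illustration of "others are in your utility function": a giver whose utility puts some weight on the recipient's consumption can be better off after giving money away, while a purely self-interested giver never would be. The log form, the weight ALPHA, and the dollar amounts below are assumptions made for this sketch, not anything from the lecture or from GiveDirectly.

```python
# A minimal sketch of other-regarding preferences; all numbers are illustrative.
import math

ALPHA = 0.3  # weight placed on the recipient's well-being (assumed)

def social_utility(own_consumption, recipient_consumption):
    """Own utility plus ALPHA times the recipient's utility."""
    return math.log(own_consumption) + ALPHA * math.log(recipient_consumption)

# Giving $100 of your $1,000 to someone with $200 raises total utility here
# (about 8.51 vs. 8.50), even though purely self-interested utility (ALPHA = 0) falls.
print(social_utility(1000, 200))  # keep everything
print(social_utility(900, 300))   # give $100 away
```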
788 00:30:50,070 --> 00:30:53,480 So now, attention, if you have seen this before, 789 00:30:53,480 --> 00:30:55,950 don't tell your friends about it. 790 00:30:55,950 --> 00:31:00,340 Here's a simple attention test that I want you to do. 791 00:31:00,340 --> 00:31:03,507 What's interesting about this is, in some ways, 792 00:31:03,507 --> 00:31:04,840 you might have seen this before. 793 00:31:04,840 --> 00:31:07,540 Some of you may have seen the gorilla before. 794 00:31:07,540 --> 00:31:10,030 Who didn't see the gorilla? 795 00:31:10,030 --> 00:31:10,732 That's OK. 796 00:31:10,732 --> 00:31:11,690 I didn't see it either. 797 00:31:11,690 --> 00:31:13,350 But some people have seen it before. 798 00:31:13,350 --> 00:31:15,100 There's a bunch of experiments that people 799 00:31:15,100 --> 00:31:18,760 have done that essentially show that, often, a large fraction 800 00:31:18,760 --> 00:31:20,783 of people don't see this gorilla. 801 00:31:20,783 --> 00:31:22,450 And there are various ways in which people 802 00:31:22,450 --> 00:31:24,100 do other types of experiments. 803 00:31:24,100 --> 00:31:27,310 There are types of experiments where, essentially, people 804 00:31:27,310 --> 00:31:28,495 go to some bank teller. 805 00:31:28,495 --> 00:31:31,120 The bank teller says, I'm going to go away, and then comes back. 806 00:31:31,120 --> 00:31:33,245 But it's actually a different person who comes back, 807 00:31:33,245 --> 00:31:34,750 and people don't notice. 808 00:31:34,750 --> 00:31:36,680 There are various versions of that. 809 00:31:36,680 --> 00:31:39,430 And so what's interesting about this in some sense is-- 810 00:31:39,430 --> 00:31:42,700 so A, it proves, in some ways, that attention is limited. 811 00:31:42,700 --> 00:31:46,760 You just focus your attention on some things and not others. 812 00:31:46,760 --> 00:31:48,790 And you might miss important things. 813 00:31:48,790 --> 00:31:50,230 Now, that's OK in some ways. 814 00:31:50,230 --> 00:31:52,330 There's sort of a version of that that says, well, 815 00:31:52,330 --> 00:31:54,790 it might be sort of rational inattention. 816 00:31:54,790 --> 00:31:57,703 In some sense, you focus on the important things in life. 817 00:31:57,703 --> 00:31:59,620 And you might miss some things, and that's OK. 818 00:31:59,620 --> 00:32:01,840 In some sense, I asked you, or the video asked you, 819 00:32:01,840 --> 00:32:03,900 to count the number of passes. 820 00:32:03,900 --> 00:32:05,990 So in some sense, the gorilla is irrelevant. 821 00:32:05,990 --> 00:32:07,568 And that's true in this case. 822 00:32:07,568 --> 00:32:09,610 But in many other cases, people might sort of not 823 00:32:09,610 --> 00:32:10,690 rationally pay attention. 824 00:32:10,690 --> 00:32:12,857 They might sort of miss some really important things 825 00:32:12,857 --> 00:32:15,580 in their lives in part because they're 826 00:32:15,580 --> 00:32:17,500 distracted, in part because they don't want 827 00:32:17,500 --> 00:32:19,930 to pay attention and so on. 828 00:32:19,930 --> 00:32:22,150 And one strand of behavioral economics, 829 00:32:22,150 --> 00:32:25,930 which we'll talk about, is about when 830 00:32:25,930 --> 00:32:28,060 people do not pay attention and then make 831 00:32:28,060 --> 00:32:30,100 mistakes because of that. 832 00:32:32,920 --> 00:32:33,460 OK. 833 00:32:33,460 --> 00:32:36,590 So I could go on and on with lots of different examples.
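To make the organ donation example above a bit more concrete, here is a minimal back-of-the-envelope sketch under stated assumptions; the notation v and c is illustrative and not from the lecture. Suppose a person attaches net value v to being a donor, where v can be negative, and faces a small cost c > 0 of filling out the form to switch away from whatever the default is. Under an opt-in default, the person registers as a donor only if v > c; under an opt-out default, the person deregisters only if v < -c. The two regimes can therefore produce different choices only for people with |v| \le c. If preferences are similar across the two countries and c is small, say an hour or two of paperwork, this standard calculation predicts nearly identical donor rates under either default, which is hard to square with the huge differences in the picture; the default itself seems to be doing the work.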
834 00:32:36,590 --> 00:32:39,680 But the bottom line here is that most researchers in psychology 835 00:32:39,680 --> 00:32:43,300 and economics believe that the classical model of behavior, 836 00:32:43,300 --> 00:32:47,470 sort of the homo economicus, is too extreme in various ways. 837 00:32:47,470 --> 00:32:51,310 That person is too selfish, too rational, and too willful. 838 00:32:51,310 --> 00:32:54,490 And in some ways, we want to kind of understand 839 00:32:54,490 --> 00:32:56,830 how relaxing some of these assumptions 840 00:32:56,830 --> 00:32:59,260 might make economic models more realistic. 841 00:32:59,260 --> 00:33:02,078 Now, in some sense, no economist would, in fact, 842 00:33:02,078 --> 00:33:04,120 argue that the assumptions of the standard models 843 00:33:04,120 --> 00:33:05,680 are exactly correct. 844 00:33:05,680 --> 00:33:08,188 The questions are, are these deviations important? 845 00:33:08,188 --> 00:33:09,730 Do they actually matter for something 846 00:33:09,730 --> 00:33:11,740 important in explaining people's behavior? 847 00:33:11,740 --> 00:33:15,340 And which of those assumptions or deviations actually matter? 848 00:33:15,340 --> 00:33:18,550 And that's kind of the name of the game here. 849 00:33:18,550 --> 00:33:22,990 In fact, when you talk to cognitive scientists, 850 00:33:22,990 --> 00:33:26,140 psychologists, et cetera, they will tell you the world 851 00:33:26,140 --> 00:33:28,180 is full of cognitive biases. 852 00:33:28,180 --> 00:33:30,702 And you might not be able to read this. 853 00:33:30,702 --> 00:33:31,910 And that's kind of the point. 854 00:33:31,910 --> 00:33:34,120 There are so many different ways in which we are biased. 855 00:33:34,120 --> 00:33:35,500 In some sense, for every choice that we make 856 00:33:35,500 --> 00:33:36,875 or anything that we do, there are 857 00:33:36,875 --> 00:33:39,157 lots of biases that interfere with people's choices. 858 00:33:39,157 --> 00:33:40,740 And for almost any choice you can think of 859 00:33:40,740 --> 00:33:42,640 or any behavior you can think about, 860 00:33:42,640 --> 00:33:44,590 there will be psychology or other experiments 861 00:33:44,590 --> 00:33:48,740 showing that people do not behave perfectly. 862 00:33:48,740 --> 00:33:54,520 Now, the key question then is, which of those assumptions 863 00:33:54,520 --> 00:33:55,750 are important? 864 00:33:55,750 --> 00:33:59,413 And which of those violations of assumptions should we focus on? 865 00:33:59,413 --> 00:34:01,330 And for that, I want to step back a little bit 866 00:34:01,330 --> 00:34:02,913 and say, OK, what is actually a model? 867 00:34:02,913 --> 00:34:04,992 What are economic models trying to do? 868 00:34:04,992 --> 00:34:05,950 And so what is a model? 869 00:34:05,950 --> 00:34:09,610 A model is a simplified representation of the world. 870 00:34:09,610 --> 00:34:12,610 And we know, in some sense, that the assumptions of the models 871 00:34:12,610 --> 00:34:13,179 are not true. 872 00:34:13,179 --> 00:34:16,330 They're sort of supposed to be approximately true and exactly 873 00:34:16,330 --> 00:34:18,020 false. 874 00:34:18,020 --> 00:34:20,139 So when you think about models of the Earth that people use, 875 00:34:20,139 --> 00:34:22,630 in some sense, there are flat, spherical, ellipsoid 876 00:34:22,630 --> 00:34:24,940 models, and so on and so forth. 877 00:34:24,940 --> 00:34:28,000 Now, good models do not account for bumps and grooves 878 00:34:28,000 --> 00:34:28,870 and so on.
879 00:34:28,870 --> 00:34:32,260 A perfect replica of the Earth is not a useful model to use. 880 00:34:32,260 --> 00:34:34,120 You kind of want to simplify and capture 881 00:34:34,120 --> 00:34:37,100 the essence of what's important. 882 00:34:37,100 --> 00:34:38,590 So now, then what's a good model? 883 00:34:38,590 --> 00:34:41,257 Well, a good model-- and you can read a bit about this in Gabaix 884 00:34:41,257 --> 00:34:42,280 and Laibson's paper. 885 00:34:42,280 --> 00:34:44,360 A good model is supposed to be simple. 886 00:34:44,360 --> 00:34:46,570 It's supposed to be easy to work with and tractable. 887 00:34:46,570 --> 00:34:48,670 So in some sense, you have only a few variables, 888 00:34:48,670 --> 00:34:51,580 a few things that matter. 889 00:34:51,580 --> 00:34:54,050 It's conceptually insightful in the sense 890 00:34:54,050 --> 00:34:56,130 that it focuses on important things. 891 00:34:56,130 --> 00:34:57,880 It tells you about behavior that we really 892 00:34:57,880 --> 00:34:59,830 care about and important ideas. 893 00:34:59,830 --> 00:35:02,440 It's generalizable in the sense that, ideally, we 894 00:35:02,440 --> 00:35:06,040 are trying to look for behaviors that are general 895 00:35:06,040 --> 00:35:06,800 in some sense. 896 00:35:06,800 --> 00:35:10,420 So if I can explain to you just one simple choice in one domain 897 00:35:10,420 --> 00:35:12,310 and write a model about it, but that model 898 00:35:12,310 --> 00:35:14,320 doesn't apply to anything else, that's 899 00:35:14,320 --> 00:35:16,900 not a good model to work with. 900 00:35:16,900 --> 00:35:19,058 Models are supposed to be falsifiable in the sense 901 00:35:19,058 --> 00:35:20,350 that we can actually test them. 902 00:35:20,350 --> 00:35:22,480 And that's kind of what we do in experimental and behavioral 903 00:35:22,480 --> 00:35:23,080 economics. 904 00:35:23,080 --> 00:35:26,860 We try to test theories with empirical work 905 00:35:26,860 --> 00:35:30,580 and then sort of falsify or reject models that are wrong 906 00:35:30,580 --> 00:35:33,010 and sort of accept, or do not reject, 907 00:35:33,010 --> 00:35:36,522 models we think are better. 908 00:35:36,522 --> 00:35:38,980 There's supposed to be empirical consistency in the sense that, 909 00:35:38,980 --> 00:35:40,840 if I explain one behavior, I should also 910 00:35:40,840 --> 00:35:42,913 explain this over time or in different domains. 911 00:35:42,913 --> 00:35:44,830 And we should be able to make good predictions 912 00:35:44,830 --> 00:35:47,590 for people's behavior. 913 00:35:47,590 --> 00:35:50,050 Now, are the assumptions of the standard models 914 00:35:50,050 --> 00:35:51,680 true for most people? 915 00:35:51,680 --> 00:35:53,350 The answer is no. 916 00:35:53,350 --> 00:35:57,490 But one of the key insights here, one of the key properties of good models, 917 00:35:57,490 --> 00:35:58,840 is simplicity. 918 00:35:58,840 --> 00:36:01,863 So in some sense, assuming perfect rationality, 919 00:36:01,863 --> 00:36:04,030 selfishness, and willpower is actually a simple thing 920 00:36:04,030 --> 00:36:05,020 to do in some sense. 921 00:36:05,020 --> 00:36:08,320 The reason why economists have made those assumptions 922 00:36:08,320 --> 00:36:10,870 to start with is not necessarily because they thought 923 00:36:10,870 --> 00:36:13,960 people are perfectly rational or that there 924 00:36:13,960 --> 00:36:16,773 are not these psychological issues going on. 925 00:36:16,773 --> 00:36:18,190 The reason is actually simplicity.
926 00:36:18,190 --> 00:36:19,370 It's an easy thing to do. 927 00:36:19,370 --> 00:36:21,640 You model how you think people should behave, 928 00:36:21,640 --> 00:36:24,370 as if they behaved perfectly. 929 00:36:24,370 --> 00:36:27,250 And making some of these models sort of richer 930 00:36:27,250 --> 00:36:29,680 in their psychology is actually complicated 931 00:36:29,680 --> 00:36:32,860 and makes the models more complex and harder to analyze. 932 00:36:32,860 --> 00:36:35,200 So that is just to say, well, then the question is, 933 00:36:35,200 --> 00:36:37,240 can we find some assumptions of economics? 934 00:36:37,240 --> 00:36:39,670 Can we make them more realistic in a tractable way? 935 00:36:39,670 --> 00:36:41,230 In a sense, can we find key things 936 00:36:41,230 --> 00:36:45,100 that we change that keep the models tractable 937 00:36:45,100 --> 00:36:47,380 while then also explaining important things better 938 00:36:47,380 --> 00:36:48,665 than we could before? 939 00:36:48,665 --> 00:36:50,290 So it's not about just taking all sorts 940 00:36:50,290 --> 00:36:52,270 of psychological issues that might be going on 941 00:36:52,270 --> 00:36:54,850 and saying, well, economic assumptions are wrong 942 00:36:54,850 --> 00:36:58,630 or models of economics are wrong or the assumptions of those 943 00:36:58,630 --> 00:36:59,270 are wrong. 944 00:36:59,270 --> 00:37:01,190 We know that these assumptions are wrong. 945 00:37:01,190 --> 00:37:04,150 The question is, can we make somewhat simple assumptions 946 00:37:04,150 --> 00:37:06,040 or improvements of those assumptions 947 00:37:06,040 --> 00:37:07,810 based on insights from psychology 948 00:37:07,810 --> 00:37:10,600 and other fields that help us improve those models 949 00:37:10,600 --> 00:37:13,240 and then make better predictions and all 950 00:37:13,240 --> 00:37:15,650 of that in a tractable way? 951 00:37:15,650 --> 00:37:17,200 So now, this is very important here. 952 00:37:17,200 --> 00:37:20,710 A good behavioral economist or a good student in this class 953 00:37:20,710 --> 00:37:22,660 is also a good economist. 954 00:37:22,660 --> 00:37:24,490 Behavioral economics is not sort of trying 955 00:37:24,490 --> 00:37:26,460 to replace standard economics. 956 00:37:26,460 --> 00:37:28,210 So I don't want you to go to my colleagues 957 00:37:28,210 --> 00:37:31,150 and say, you know, 14.01, 14.02, and so on, this is all garbage. 958 00:37:31,150 --> 00:37:34,155 Frank is telling you we do things differently. 959 00:37:34,155 --> 00:37:36,530 In fact, you very much need to know those kinds of things 960 00:37:36,530 --> 00:37:38,930 from standard economics to understand 961 00:37:38,930 --> 00:37:42,500 what the assumptions are and how best to deviate from them. 962 00:37:42,500 --> 00:37:47,630 And as such, key principles of mainstream economics 963 00:37:47,630 --> 00:37:49,310 continue to apply. 964 00:37:49,310 --> 00:37:51,470 Decision makers are still highly sophisticated. 965 00:37:51,470 --> 00:37:53,580 Markets and incentives matter. 966 00:37:53,580 --> 00:37:57,230 They, in fact, play a key role in shaping behavior. 967 00:37:57,230 --> 00:38:00,868 And markets allocate resources well most of the time. 968 00:38:00,868 --> 00:38:03,410 The question is, can we sort of focus on important deviations 969 00:38:03,410 --> 00:38:06,120 and try to fix those? 970 00:38:06,120 --> 00:38:08,210 And then, again, sort of methodological principles 971 00:38:08,210 --> 00:38:09,320 still apply.
972 00:38:09,320 --> 00:38:11,660 Use observational and experimental data. 973 00:38:11,660 --> 00:38:14,785 Mathematical models are good and so on and so forth. 974 00:38:14,785 --> 00:38:16,160 And ideally, models would sort of 975 00:38:16,160 --> 00:38:18,320 nest the special case of perfect 976 00:38:18,320 --> 00:38:21,755 rationality, the perfectly standard economic model. 977 00:38:21,755 --> 00:38:23,630 And they try to build in some parameters that 978 00:38:23,630 --> 00:38:25,255 deviate from that and try to understand 979 00:38:25,255 --> 00:38:27,740 whether we can make better predictions. 980 00:38:27,740 --> 00:38:30,050 And then, often, prices are actually 981 00:38:30,050 --> 00:38:31,940 the most important aspect of choice. 982 00:38:31,940 --> 00:38:33,620 Here's one experiment by Ito et al. 983 00:38:33,620 --> 00:38:36,570 that tries, essentially, to change people's energy usage 984 00:38:36,570 --> 00:38:37,310 and so on. 985 00:38:37,310 --> 00:38:38,630 And they have two treatments. 986 00:38:38,630 --> 00:38:40,760 One is called moral suasion, which 987 00:38:40,760 --> 00:38:42,500 essentially is telling people-- 988 00:38:42,500 --> 00:38:44,673 sort of appealing to their morals in some ways. 989 00:38:44,673 --> 00:38:47,090 And the other one is essentially just a financial incentive, 990 00:38:47,090 --> 00:38:49,932 straight up changing the price of people's choices. 991 00:38:49,932 --> 00:38:51,890 And what you see in this experiment, essentially, 992 00:38:51,890 --> 00:38:55,130 is the impact. 993 00:38:55,130 --> 00:38:57,800 These are all treatment effects over time, comparing a treatment 994 00:38:57,800 --> 00:38:59,210 group and a control group. 995 00:38:59,210 --> 00:39:01,370 And you see, in red, the treatment effects 996 00:39:01,370 --> 00:39:02,300 of the incentives. 997 00:39:02,300 --> 00:39:05,000 And you see these are relatively large and persist over time. 998 00:39:05,000 --> 00:39:07,250 And you see the treatment effects of the moral suasion 999 00:39:07,250 --> 00:39:11,300 treatment, which happens to be reasonably large to start with, 1000 00:39:11,300 --> 00:39:13,293 but essentially just goes away. 1001 00:39:13,293 --> 00:39:14,460 What did we learn from that? 1002 00:39:14,460 --> 00:39:16,400 We've learned from that that prices matter. 1003 00:39:16,400 --> 00:39:18,080 And maybe in this case, moral suasion 1004 00:39:18,080 --> 00:39:19,400 is just not that important. 1005 00:39:19,400 --> 00:39:20,580 And that's perfectly fine. 1006 00:39:20,580 --> 00:39:23,450 We're trying to identify situations or cases 1007 00:39:23,450 --> 00:39:26,810 where some of the underlying psychological issues 1008 00:39:26,810 --> 00:39:27,590 are important. 1009 00:39:27,590 --> 00:39:30,360 And we've learned from an experiment that, in some cases, 1010 00:39:30,360 --> 00:39:31,310 it doesn't apply. 1011 00:39:31,310 --> 00:39:32,060 Well, so be it. 1012 00:39:32,060 --> 00:39:33,770 Then we should focus on other cases 1013 00:39:33,770 --> 00:39:38,330 where the psychological issues are more important. 1014 00:39:38,330 --> 00:39:38,840 OK. 1015 00:39:38,840 --> 00:39:44,380 So then what's sort of our broad approach to each topic? 1016 00:39:44,380 --> 00:39:46,900 We're trying to sort of start with intuitive, empirical, 1017 00:39:46,900 --> 00:39:49,060 and experimental examples of how people 1018 00:39:49,060 --> 00:39:51,293 behave in some situations.
1019 00:39:51,293 --> 00:39:52,960 We try to think about their motivations, 1020 00:39:52,960 --> 00:39:55,127 try to see how they make choices 1021 00:39:55,127 --> 00:39:57,117 and how they behave in certain situations. 1022 00:39:57,117 --> 00:39:59,200 And then we're going to try and sort of model this 1023 00:39:59,200 --> 00:40:01,510 in a more precise way and try to consider 1024 00:40:01,510 --> 00:40:03,700 how the modeling of that perhaps deviates 1025 00:40:03,700 --> 00:40:07,678 from the neoclassical or the classical model of economics. 1026 00:40:07,678 --> 00:40:09,220 Other times, we also just start 1027 00:40:09,220 --> 00:40:10,870 from the classical model of economics, 1028 00:40:10,870 --> 00:40:13,425 look at the predictions, and say, well, can we reject that? 1029 00:40:13,425 --> 00:40:14,800 Or does it make certain predictions 1030 00:40:14,800 --> 00:40:15,640 that are just not true? 1031 00:40:15,640 --> 00:40:17,140 And then say, can we sort of improve 1032 00:40:17,140 --> 00:40:18,645 those types of assumptions? 1033 00:40:18,645 --> 00:40:20,770 And then, overall, we're trying to think about 1034 00:40:20,770 --> 00:40:23,110 how these hypotheses or deviations might 1035 00:40:23,110 --> 00:40:25,210 be able to explain how people behave in markets 1036 00:40:25,210 --> 00:40:27,748 and what choices they make and how, perhaps, we 1037 00:40:27,748 --> 00:40:30,040 can think about policies that might affect people's 1038 00:40:30,040 --> 00:40:35,310 well-being, their welfare, or any other consequences. 1039 00:40:35,310 --> 00:40:36,820 Any questions so far? 1040 00:40:42,310 --> 00:40:43,300 OK. 1041 00:40:43,300 --> 00:40:47,400 So now, I want to give you one very simple, in some sense, 1042 00:40:47,400 --> 00:40:50,740 stylized example which sort of demonstrates a little bit what 1043 00:40:50,740 --> 00:40:54,070 the standard neoclassical 1044 00:40:54,070 --> 00:40:57,070 economic assumptions are and how enriching 1045 00:40:57,070 --> 00:41:00,250 a model with psychological considerations 1046 00:41:00,250 --> 00:41:02,600 might be more powerful. 1047 00:41:02,600 --> 00:41:05,800 So when you think about laptops in class, 1048 00:41:05,800 --> 00:41:09,070 what are sort of standard economic considerations? 1049 00:41:09,070 --> 00:41:10,120 Should you allow laptops? 1050 00:41:10,120 --> 00:41:11,260 Should there be laptops in class? 1051 00:41:11,260 --> 00:41:12,460 Is that a good thing or a bad thing? 1052 00:41:12,460 --> 00:41:13,793 Or what do you think about that? 1053 00:41:19,490 --> 00:41:22,280 And now, I'm asking for standard, non-psychological 1054 00:41:22,280 --> 00:41:25,280 issues, just saying, if you went to a classical economist 1055 00:41:25,280 --> 00:41:27,510 and asked, should we allow laptops in class, 1056 00:41:27,510 --> 00:41:30,680 what would be considerations for that? 1057 00:41:30,680 --> 00:41:31,490 Yeah. 1058 00:41:31,490 --> 00:41:33,448 AUDIENCE: Externalities-- if you're distracting 1059 00:41:33,448 --> 00:41:34,575 people who are around you? 1060 00:41:34,575 --> 00:41:35,450 FRANK SCHILBACH: Yes. 1061 00:41:35,450 --> 00:41:40,070 So you could call that psychological or not. 1062 00:41:40,070 --> 00:41:42,610 But essentially, it could distract other people.
1063 00:41:42,610 --> 00:41:46,130 And externalities tend to be very 1064 00:41:46,130 --> 00:41:49,100 much sort of in classical economics-- essentially, if you smoke, 1065 00:41:49,100 --> 00:41:49,850 it affects others. 1066 00:41:49,850 --> 00:41:52,010 Similarly, if you had a laptop, it might affect others. 1067 00:41:52,010 --> 00:41:53,635 It's a little bit sort of in the middle 1068 00:41:53,635 --> 00:41:56,120 between psychological and economic considerations. 1069 00:41:56,120 --> 00:41:59,840 Because in some sense, if somebody next to you 1070 00:41:59,840 --> 00:42:01,890 were using a laptop, then I would just say, well, 1071 00:42:01,890 --> 00:42:03,598 why don't you just focus on class anyway? 1072 00:42:03,598 --> 00:42:05,478 You should be able to just ignore that. 1073 00:42:05,478 --> 00:42:07,520 But setting that aside, externalities 1074 00:42:07,520 --> 00:42:10,150 are surely one consideration that we have. 1075 00:42:10,150 --> 00:42:11,320 What's the pro-side? 1076 00:42:11,320 --> 00:42:13,090 Or why would we allow laptops? 1077 00:42:13,090 --> 00:42:15,740 What's good about laptops in class? 1078 00:42:15,740 --> 00:42:16,240 Yeah. 1079 00:42:16,240 --> 00:42:18,790 AUDIENCE: It could just be efficient for taking notes. 1080 00:42:18,790 --> 00:42:20,380 Some people might need to use them. 1081 00:42:20,380 --> 00:42:24,320 Some people may be dyslexic, stuff like that. 1082 00:42:24,320 --> 00:42:25,240 So it's just good. 1083 00:42:25,240 --> 00:42:26,282 FRANK SCHILBACH: Exactly. 1084 00:42:26,282 --> 00:42:28,642 It's a useful technology to take notes. 1085 00:42:28,642 --> 00:42:30,142 And you might just be better at that 1086 00:42:30,142 --> 00:42:31,190 or might be more comfortable. 1087 00:42:31,190 --> 00:42:32,320 It's easier for you to do. 1088 00:42:32,320 --> 00:42:35,380 Maybe it also saves some paper or whatever. 1089 00:42:35,380 --> 00:42:37,330 What else are laptops good for? 1090 00:42:37,330 --> 00:42:38,087 Yeah. 1091 00:42:38,087 --> 00:42:39,920 AUDIENCE: More choice is just always better. 1092 00:42:39,920 --> 00:42:41,350 So if each person has their laptop, 1093 00:42:41,350 --> 00:42:43,030 they can choose what's better for them to pay attention-- 1094 00:42:43,030 --> 00:42:44,280 FRANK SCHILBACH: Yes, exactly. 1095 00:42:44,280 --> 00:42:46,390 If you are so inclined and want to watch football 1096 00:42:46,390 --> 00:42:49,700 from yesterday, the replay and so on, if you prefer that 1097 00:42:49,700 --> 00:42:51,310 and that's good for you, you know, 1098 00:42:51,310 --> 00:42:52,900 who should be stopping you, right? 1099 00:42:52,900 --> 00:42:59,110 And so in some sense, laptops are useful for note taking. 1100 00:42:59,110 --> 00:43:02,370 They're also useful for non-class activities. 1101 00:43:02,370 --> 00:43:03,940 And as you say, each student should 1102 00:43:03,940 --> 00:43:06,700 be able to choose for themselves what's good. 1103 00:43:06,700 --> 00:43:12,797 And that may involve paying attention or not. 1104 00:43:12,797 --> 00:43:15,130 Now, what are sort of some psychological considerations? 1105 00:43:15,130 --> 00:43:16,672 We had one of them, which is essentially 1106 00:43:16,672 --> 00:43:19,640 distracting others, which is essentially the externality. 1107 00:43:19,640 --> 00:43:20,870 What else? 1108 00:43:20,870 --> 00:43:21,370 Yeah. 1109 00:43:21,370 --> 00:43:24,987 AUDIENCE: Temptation and not valuing your future learning? 1110 00:43:24,987 --> 00:43:26,320 FRANK SCHILBACH: Right, exactly.
1111 00:43:26,320 --> 00:43:28,450 You might sort of have all the best 1112 00:43:28,450 --> 00:43:31,092 intentions of taking notes, as I said previously. 1113 00:43:31,092 --> 00:43:32,800 But then, you know, it gets a little dull 1114 00:43:32,800 --> 00:43:34,922 and boring at minute 40. 1115 00:43:34,922 --> 00:43:36,880 And you might be inclined to sort of think 1116 00:43:36,880 --> 00:43:38,860 about other stuff and start 1117 00:43:38,860 --> 00:43:42,010 surfing the internet or the like, or start chatting, or whatever. 1118 00:43:42,010 --> 00:43:44,170 That's sort of essentially limited self-control one 1119 00:43:44,170 --> 00:43:45,380 way or the other. 1120 00:43:45,380 --> 00:43:45,880 Yeah. 1121 00:43:45,880 --> 00:43:47,755 AUDIENCE: Yeah. 1122 00:43:47,755 --> 00:43:50,005 So I think even if you don't fall into the temptation, 1123 00:43:50,005 --> 00:43:52,780 you kind of have to spend energy resisting the temptation 1124 00:43:52,780 --> 00:43:54,280 and that might kind of distract you. 1125 00:43:54,280 --> 00:43:55,238 FRANK SCHILBACH: Right. 1126 00:43:55,238 --> 00:43:56,710 So there are some cognitive resources 1127 00:43:56,710 --> 00:43:59,258 potentially used up by that. 1128 00:43:59,258 --> 00:44:00,550 That's a very nice observation. 1129 00:44:00,550 --> 00:44:01,120 Exactly. 1130 00:44:01,120 --> 00:44:02,752 That also might be true. 1131 00:44:02,752 --> 00:44:04,210 There's another part to that, which 1132 00:44:04,210 --> 00:44:07,810 is people tend to overestimate how much they can multitask. 1133 00:44:07,810 --> 00:44:10,060 And there's a large number of experiments 1134 00:44:10,060 --> 00:44:12,220 where people say, well, I'm actually paying attention. 1135 00:44:12,220 --> 00:44:14,020 I'm just sort of reading some other stuff. 1136 00:44:14,020 --> 00:44:16,300 I'm chatting with my friend. 1137 00:44:16,300 --> 00:44:19,540 And people actually think they pay attention, and they learn. 1138 00:44:19,540 --> 00:44:20,780 Trust me, they do not. 1139 00:44:20,780 --> 00:44:24,100 So essentially, there's a large, large literature on people 1140 00:44:24,100 --> 00:44:26,710 thinking that they can multitask when, in fact, they cannot. 1141 00:44:26,710 --> 00:44:28,330 And this is a very human thing to do. 1142 00:44:28,330 --> 00:44:31,420 You might think you can study for an exam by watching TV. 1143 00:44:31,420 --> 00:44:34,610 Chances are you're not studying very well. 1144 00:44:34,610 --> 00:44:39,730 So that's essentially some form of overconfidence. 1145 00:44:39,730 --> 00:44:40,390 Right. 1146 00:44:40,390 --> 00:44:41,890 And then there's another part, which 1147 00:44:41,890 --> 00:44:45,590 is a sort of somewhat different psychological consideration, 1148 00:44:45,590 --> 00:44:48,963 which is people tend to not like hard paternalism, as in, 1149 00:44:48,963 --> 00:44:49,630 if you sort of-- 1150 00:44:49,630 --> 00:44:52,360 we'll talk about this at the very end when we talk about policy. 1151 00:44:52,360 --> 00:44:54,280 People don't like certain hard rules 1152 00:44:54,280 --> 00:44:55,960 saying you're allowed to do this, or you're 1153 00:44:55,960 --> 00:44:57,160 not allowed to do x or y. 1154 00:44:57,160 --> 00:44:58,870 You have to come to class or whatever. 1155 00:44:58,870 --> 00:45:00,632 People tend to not like that. 1156 00:45:00,632 --> 00:45:02,340 That's more another sort of consideration 1157 00:45:02,340 --> 00:45:07,110 if you think about what is the right policy to do.
1158 00:45:07,110 --> 00:45:09,210 Now, what policy solutions could we use? 1159 00:45:09,210 --> 00:45:10,678 What laptop policies have you seen? 1160 00:45:10,678 --> 00:45:11,970 Or how do you think about them? 1161 00:45:17,500 --> 00:45:19,790 Maybe somebody else? 1162 00:45:19,790 --> 00:45:20,320 Yes. 1163 00:45:20,320 --> 00:45:22,820 AUDIENCE: I haven't seen it, but let the people with laptops 1164 00:45:22,820 --> 00:45:23,745 sit in the back. 1165 00:45:23,745 --> 00:45:24,620 FRANK SCHILBACH: Yes. 1166 00:45:24,620 --> 00:45:26,530 You could do that. 1167 00:45:26,530 --> 00:45:27,970 Yeah. 1168 00:45:27,970 --> 00:45:30,760 And sort of that's kind of minimizing the externality 1169 00:45:30,760 --> 00:45:32,860 potentially. 1170 00:45:32,860 --> 00:45:35,770 It's not really helping with self-control, I guess, right? 1171 00:45:35,770 --> 00:45:37,840 Because particularly, if you sit in the back, 1172 00:45:37,840 --> 00:45:40,000 nobody sees what you're doing. 1173 00:45:40,000 --> 00:45:43,480 And you might be so inclined to do all sorts of things. 1174 00:45:43,480 --> 00:45:44,420 Yes. 1175 00:45:44,420 --> 00:45:47,070 AUDIENCE: I've had professors say a hard no on laptops, 1176 00:45:47,070 --> 00:45:49,390 but if you have special needs or if you 1177 00:45:49,390 --> 00:45:50,890 think you're a special case, you can 1178 00:45:50,890 --> 00:45:52,240 tell the professor [INAUDIBLE]. 1179 00:45:52,240 --> 00:45:53,198 FRANK SCHILBACH: Right. 1180 00:45:53,198 --> 00:45:55,210 So that's sort of like hard paternalism 1181 00:45:55,210 --> 00:45:59,360 with some exceptions potentially. 1182 00:45:59,360 --> 00:46:00,800 Yeah. 1183 00:46:00,800 --> 00:46:03,320 AUDIENCE: I've seen only devices that 1184 00:46:03,320 --> 00:46:06,680 are flat on the table allowed for [INAUDIBLE]. 1185 00:46:06,680 --> 00:46:09,730 FRANK SCHILBACH: For note-taking, yeah. 1186 00:46:09,730 --> 00:46:12,460 AUDIENCE: I've seen TAs sitting in the back 1187 00:46:12,460 --> 00:46:15,320 and keeping track of class participation. 1188 00:46:15,320 --> 00:46:18,490 So if they see you doing something not academic, 1189 00:46:18,490 --> 00:46:20,573 they could technically ding you for it. 1190 00:46:20,573 --> 00:46:21,490 FRANK SCHILBACH: Yeah. 1191 00:46:21,490 --> 00:46:22,490 That's what they're for. 1192 00:46:22,490 --> 00:46:23,890 No. 1193 00:46:23,890 --> 00:46:26,620 No. 1194 00:46:26,620 --> 00:46:28,161 Anything else? 1195 00:46:28,161 --> 00:46:30,036 AUDIENCE: I've seen classes where you're just 1196 00:46:30,036 --> 00:46:31,869 allowed to use your laptop because, I guess, 1197 00:46:31,869 --> 00:46:33,994 you're an adult. And you're responsible 1198 00:46:33,994 --> 00:46:34,510 for your own learning. 1199 00:46:34,510 --> 00:46:35,200 FRANK SCHILBACH: Right, exactly. 1200 00:46:35,200 --> 00:46:36,340 That's sort of laissez-faire and saying 1201 00:46:36,340 --> 00:46:37,725 you know what's best for you. 1202 00:46:37,725 --> 00:46:39,100 In some sense, since this is sort 1203 00:46:39,100 --> 00:46:42,855 of a psychology and behavioral class, 1204 00:46:42,855 --> 00:46:47,090 I tend to disagree with some of those assumptions. 1205 00:46:47,090 --> 00:46:47,840 But, yes, exactly. 1206 00:46:47,840 --> 00:46:51,330 That's laissez-faire. 1207 00:46:51,330 --> 00:46:52,324 Any other thoughts? 1208 00:46:57,070 --> 00:46:57,610 OK. 1209 00:46:57,610 --> 00:46:58,960 So then here they are. 1210 00:46:58,960 --> 00:47:00,295 So there's laissez-faire.
1211 00:47:00,295 --> 00:47:02,920 There are educational interventions, which essentially means providing 1212 00:47:02,920 --> 00:47:06,580 people information about laptops or what 1213 00:47:06,580 --> 00:47:08,160 laptops do for learning. 1214 00:47:08,160 --> 00:47:10,390 There's some experiments I'll show you in a second. 1215 00:47:10,390 --> 00:47:12,140 Educational interventions tend to actually 1216 00:47:12,140 --> 00:47:13,340 not work particularly well. 1217 00:47:13,340 --> 00:47:15,257 So essentially, just giving people information 1218 00:47:15,257 --> 00:47:17,200 tends to not change behavior often 1219 00:47:17,200 --> 00:47:19,030 in the way we'd like to do that. 1220 00:47:19,030 --> 00:47:20,540 You could tax laptop use. 1221 00:47:20,540 --> 00:47:22,060 You say it's costly to use them, but that 1222 00:47:22,060 --> 00:47:24,310 would be the typical sort of public economics 1223 00:47:24,310 --> 00:47:24,933 solution. 1224 00:47:24,933 --> 00:47:27,100 I think I'm not allowed to take money from you guys. 1225 00:47:27,100 --> 00:47:29,830 So I might not do that. 1226 00:47:29,830 --> 00:47:32,075 You could ban laptops. 1227 00:47:32,075 --> 00:47:34,450 That was a previous suggestion, say, except for students 1228 00:47:34,450 --> 00:47:36,250 with medical needs. 1229 00:47:36,250 --> 00:47:38,530 You could make a non-laptop section the default 1230 00:47:38,530 --> 00:47:42,490 and let students opt out or opt in depending on that. 1231 00:47:42,490 --> 00:47:44,890 It's essentially saying the default choice 1232 00:47:44,890 --> 00:47:46,810 is essentially no laptops. 1233 00:47:46,810 --> 00:47:50,110 But you can opt into the laptop section by emailing a TA 1234 00:47:50,110 --> 00:47:51,790 and then use your laptop in that section. 1235 00:47:51,790 --> 00:47:54,040 We'll talk about, in the problem set, a little bit why 1236 00:47:54,040 --> 00:47:55,805 that might be a good idea. 1237 00:47:55,805 --> 00:47:58,180 You could also set up an active choice between the laptop 1238 00:47:58,180 --> 00:48:00,910 and the no laptop sections. 1239 00:48:00,910 --> 00:48:03,020 So here's the evidence on educational interventions. 1240 00:48:03,020 --> 00:48:05,260 And there's very clear evidence 1241 00:48:05,260 --> 00:48:07,510 from various settings that shows that laptops in class 1242 00:48:07,510 --> 00:48:10,570 are not good for learning on average. 1243 00:48:10,570 --> 00:48:13,250 There's a very nice article by Susan Dynarski in The New York 1244 00:48:13,250 --> 00:48:13,750 Times. 1245 00:48:13,750 --> 00:48:15,898 We'll put this online as well so that you can read it. 1246 00:48:15,898 --> 00:48:17,440 But essentially, one of these studies 1247 00:48:17,440 --> 00:48:19,180 is a randomized controlled trial that 1248 00:48:19,180 --> 00:48:21,970 a former MIT grad student had 1249 00:48:21,970 --> 00:48:24,190 done in an intro econ class. 1250 00:48:24,190 --> 00:48:25,485 This is Carter et al. 1251 00:48:25,485 --> 00:48:26,860 And essentially what they find is 1252 00:48:26,860 --> 00:48:29,470 that allowing computers in class reduced test scores 1253 00:48:29,470 --> 00:48:31,360 by 0.18 standard deviations. 1254 00:48:31,360 --> 00:48:33,430 That's quite a large number. 1255 00:48:33,430 --> 00:48:36,610 That's essentially saying they did randomize across classes, 1256 00:48:36,610 --> 00:48:39,580 like at the classroom level, whether they allowed laptops 1257 00:48:39,580 --> 00:48:40,870 in class or not.
1258 00:48:40,870 --> 00:48:42,850 Interestingly, they found negative effects 1259 00:48:42,850 --> 00:48:46,120 both of unconstrained laptop use, like any laptop use, 1260 00:48:46,120 --> 00:48:48,620 but also of the flat tablet solution. 1261 00:48:48,620 --> 00:48:51,070 So even the flat tablet solution was worse 1262 00:48:51,070 --> 00:48:54,252 than sort of the note-taking with pen and paper. 1263 00:48:54,252 --> 00:48:55,960 You know, and there's a bunch of evidence 1264 00:48:55,960 --> 00:48:57,290 that sort of shows that. 1265 00:48:57,290 --> 00:48:58,707 So in this class, we're going 1266 00:48:58,707 --> 00:49:01,120 to have a sort of a version of an opt-in policy, which is 1267 00:49:01,120 --> 00:49:03,040 that there's going to be a laptop section starting 1268 00:49:03,040 --> 00:49:04,780 next class, which is going to be in the front on one 1269 00:49:04,780 --> 00:49:05,860 side of the class. 1270 00:49:05,860 --> 00:49:09,430 The reason being we're trying to sort of minimize externalities. 1271 00:49:09,430 --> 00:49:11,620 It's in the front, not in the back. 1272 00:49:11,620 --> 00:49:14,380 The reason being that, if you're in the very back, 1273 00:49:14,380 --> 00:49:16,000 then there's no supervision. 1274 00:49:16,000 --> 00:49:19,660 I'll, I guess, ask some TAs to sit sort of in the back. 1275 00:49:19,660 --> 00:49:22,150 If there are too many other activities going on, 1276 00:49:22,150 --> 00:49:24,907 maybe there won't be a laptop section anymore. 1277 00:49:24,907 --> 00:49:26,740 Anyway, there will be sort of considerations 1278 00:49:26,740 --> 00:49:28,085 about that in the problem set. 1279 00:49:28,085 --> 00:49:29,710 But the point of all of that is to say, 1280 00:49:29,710 --> 00:49:33,100 if you just had the standard economic considerations, 1281 00:49:33,100 --> 00:49:35,710 you would make certain policies based on very 1282 00:49:35,710 --> 00:49:38,380 simple considerations, missing important facts. 1283 00:49:38,380 --> 00:49:40,845 And that might be sort of the psychological externality. 1284 00:49:40,845 --> 00:49:44,050 It might be the self-control problems, the overestimation 1285 00:49:44,050 --> 00:49:46,910 of people's ability to multitask and so on and so forth. 1286 00:49:46,910 --> 00:49:48,460 And all of those things might sort of 1287 00:49:48,460 --> 00:49:50,590 interfere with people's optimal choices. 1288 00:49:50,590 --> 00:49:54,040 And that makes certain more paternalistic policies 1289 00:49:54,040 --> 00:49:56,710 potentially more promising. 1290 00:49:56,710 --> 00:49:58,990 But again, in the problem set, you'll think about some 1291 00:49:58,990 --> 00:50:01,390 of these solutions. 1292 00:50:01,390 --> 00:50:02,820 Any questions on the laptops? 1293 00:50:02,820 --> 00:50:03,680 Yes. 1294 00:50:03,680 --> 00:50:05,805 AUDIENCE: I might be thinking of a different study, 1295 00:50:05,805 --> 00:50:08,080 but if this is the same one, I got 1296 00:50:08,080 --> 00:50:11,110 that, in their other classes, they previously were not 1297 00:50:11,110 --> 00:50:12,920 allowed to use laptops. 1298 00:50:12,920 --> 00:50:17,230 So it might be that they had not optimized themselves 1299 00:50:17,230 --> 00:50:20,180 for laptop usage during class. 1300 00:50:20,180 --> 00:50:21,950 FRANK SCHILBACH: That's interesting. 1301 00:50:21,950 --> 00:50:24,890 So I think there are several studies that show that. 1302 00:50:24,890 --> 00:50:27,330 I don't know the details on that.
1303 00:50:27,330 --> 00:50:31,070 But if you can send that to me, I'm happy to look at that 1304 00:50:31,070 --> 00:50:32,100 and reconsider. 1305 00:50:32,100 --> 00:50:34,790 I think my view overall is that sort of the existing evidence 1306 00:50:34,790 --> 00:50:37,070 essentially shows laptops are bad. 1307 00:50:37,070 --> 00:50:40,893 Since I'm interested in your learning, 1308 00:50:40,893 --> 00:50:41,810 that's what I go with. 1309 00:50:41,810 --> 00:50:44,930 But I'm happy to be educated. 1310 00:50:44,930 --> 00:50:48,255 I'll always try to put the slides online the night 1311 00:50:48,255 --> 00:50:50,630 before in case you want to print them out or look at them 1312 00:50:50,630 --> 00:50:53,330 or whatever during class. 1313 00:50:53,330 --> 00:50:56,150 You're welcome to do that. 1314 00:50:56,150 --> 00:50:56,690 OK. 1315 00:50:56,690 --> 00:50:58,550 So then let me tell you very briefly 1316 00:50:58,550 --> 00:51:01,640 about the different topics of the class 1317 00:51:01,640 --> 00:51:03,750 and what we're going to cover. 1318 00:51:03,750 --> 00:51:05,840 So first, we're going to have an introduction 1319 00:51:05,840 --> 00:51:07,850 and an overview; today is the introduction lecture. 1320 00:51:07,850 --> 00:51:10,080 The overview will be on Wednesday, 1321 00:51:10,080 --> 00:51:12,883 which will provide you an overview of what 1322 00:51:12,883 --> 00:51:15,050 the different topics are, how to think about them, 1323 00:51:15,050 --> 00:51:17,000 and what evidence we actually have. 1324 00:51:17,000 --> 00:51:18,600 So I was kind of a little bit handwavy 1325 00:51:18,600 --> 00:51:21,980 in a sense of showing you lots of flashy pictures of people 1326 00:51:21,980 --> 00:51:22,850 misbehaving. 1327 00:51:22,850 --> 00:51:25,010 But in fact, I'm going to show you some more 1328 00:51:25,010 --> 00:51:27,720 sort of rigorous evidence that we have that, in fact, 1329 00:51:27,720 --> 00:51:30,920 some of the assumptions of the classical model 1330 00:51:30,920 --> 00:51:33,750 are violated and sort of how we think about them 1331 00:51:33,750 --> 00:51:35,420 and how we might sort of incorporate them 1332 00:51:35,420 --> 00:51:36,350 into economics. 1333 00:51:36,350 --> 00:51:38,838 So that's going to be an overview on Wednesday. 1334 00:51:38,838 --> 00:51:40,880 Then we're going to talk a lot about preferences. 1335 00:51:40,880 --> 00:51:44,750 You can think about it as, essentially, when people make choices, 1336 00:51:44,750 --> 00:51:46,040 there's a utility function. 1337 00:51:46,040 --> 00:51:49,070 And sort of one set of behavioral economics issues 1338 00:51:49,070 --> 00:51:51,820 are changes to the utility function. 1339 00:51:51,820 --> 00:51:55,020 This might be time preferences and people's self-control, 1340 00:51:55,020 --> 00:51:56,510 which we already discussed. 1341 00:51:56,510 --> 00:51:58,970 It might be risk preferences, how people think about risk, 1342 00:51:58,970 --> 00:52:00,560 how they think about gains and losses 1343 00:52:00,560 --> 00:52:03,260 and how their preferences are reference-dependent. 1344 00:52:03,260 --> 00:52:07,130 This is what a student was saying earlier about how 1345 00:52:07,130 --> 00:52:09,590 you evaluate something might depend 1346 00:52:09,590 --> 00:52:12,260 a lot on your expectation of that outcome as opposed 1347 00:52:12,260 --> 00:52:15,230 to just the outcome itself. 1348 00:52:15,230 --> 00:52:17,820 Then we're going to talk quite a bit about social preferences.
1349 00:52:17,820 --> 00:52:21,560 This is sort of like how much we weigh 1350 00:52:21,560 --> 00:52:25,400 others' utility in our utility function 1351 00:52:25,400 --> 00:52:28,040 or their consumption in our utility function. 1352 00:52:28,040 --> 00:52:30,020 We'll do some experiments in class 1353 00:52:30,020 --> 00:52:32,757 on that, which should be a lot of fun, 1354 00:52:32,757 --> 00:52:34,340 and then discuss various issues about, 1355 00:52:34,340 --> 00:52:35,972 A, how much people care about others 1356 00:52:35,972 --> 00:52:37,430 and how they're affected by others' 1357 00:52:37,430 --> 00:52:39,680 behavior and social influences. 1358 00:52:39,680 --> 00:52:42,320 So that's kind of the first half of the class broadly, 1359 00:52:42,320 --> 00:52:45,710 about changes in people's preferences. 1360 00:52:45,710 --> 00:52:48,410 Then we're going to talk more about beliefs broadly 1361 00:52:48,410 --> 00:52:53,530 speaking and sort of how people view information in the world. 1362 00:52:53,530 --> 00:52:55,160 We talk a little about emotions, what's 1363 00:52:55,160 --> 00:52:57,020 called projection and attribution bias. 1364 00:52:57,020 --> 00:52:59,220 These are issues about, for example, 1365 00:52:59,220 --> 00:53:00,950 when people are hungry or tired, they 1366 00:53:00,950 --> 00:53:03,780 might make quite different choices. 1367 00:53:03,780 --> 00:53:06,960 And so that might affect their preferences. 1368 00:53:06,960 --> 00:53:09,540 But it might also affect their beliefs in some sense. 1369 00:53:09,540 --> 00:53:10,890 If you think about how you're going to behave, 1370 00:53:10,890 --> 00:53:12,973 if you're hungry right now, it's very hard for you 1371 00:53:12,973 --> 00:53:15,620 to believe or think about the fact that you might not 1372 00:53:15,620 --> 00:53:17,000 be hungry in the future. 1373 00:53:17,000 --> 00:53:19,010 The classic example is 1374 00:53:19,010 --> 00:53:21,402 shopping on an empty stomach. 1375 00:53:21,402 --> 00:53:23,360 There are lots of different applications of that. 1376 00:53:23,360 --> 00:53:25,730 People buy convertibles when it's sunny 1377 00:53:25,730 --> 00:53:29,080 and then return them when it rains. 1378 00:53:29,080 --> 00:53:30,830 But there are also much more serious issues, 1379 00:53:30,830 --> 00:53:32,330 in particular things like depression. 1380 00:53:32,330 --> 00:53:33,955 For example, when people are depressed, 1381 00:53:33,955 --> 00:53:36,170 it's very hard for them to think about how 1382 00:53:36,170 --> 00:53:38,420 it might feel when they're not depressed anymore 1383 00:53:38,420 --> 00:53:39,050 in the future. 1384 00:53:39,050 --> 00:53:41,300 So we'll talk about projection and attribution 1385 00:53:41,300 --> 00:53:42,758 bias, which are kinds of biases 1386 00:53:42,758 --> 00:53:45,483 in how people think about states of the world. 1387 00:53:45,483 --> 00:53:47,150 Then there's limited attention: people 1388 00:53:47,150 --> 00:53:49,040 sort of don't pay attention to certain things 1389 00:53:49,040 --> 00:53:51,332 in the world. How might that affect people's 1390 00:53:51,332 --> 00:53:53,630 behavior? How might we potentially exploit that 1391 00:53:53,630 --> 00:53:55,490 if we're taxing them? Or how might we direct 1392 00:53:55,490 --> 00:53:56,865 their attention to certain things 1393 00:53:56,865 --> 00:53:58,880 and improve their behavior?
1394 00:53:58,880 --> 00:54:01,280 Similarly, then we talk about beliefs and learning. 1395 00:54:01,280 --> 00:54:03,170 This is kind of like what information 1396 00:54:03,170 --> 00:54:04,655 people have available-- 1397 00:54:04,655 --> 00:54:08,120 A, how they update their beliefs when they get information. 1398 00:54:08,120 --> 00:54:10,400 And are they able to process information well? 1399 00:54:10,400 --> 00:54:12,440 B, is there demand for information? 1400 00:54:12,440 --> 00:54:14,387 Do people get utility from beliefs? 1401 00:54:14,387 --> 00:54:16,220 This is what I was talking about with things like health 1402 00:54:16,220 --> 00:54:18,650 behaviors or health information, where people might 1403 00:54:18,650 --> 00:54:20,180 have motivated beliefs in the sense 1404 00:54:20,180 --> 00:54:24,890 that they like to believe certain things when, 1405 00:54:24,890 --> 00:54:26,360 in fact, they're not true, in part 1406 00:54:26,360 --> 00:54:28,490 because it makes them happy or in part because they 1407 00:54:28,490 --> 00:54:32,292 want to be right about something, or their party, or whatever. 1408 00:54:32,292 --> 00:54:34,250 We'll talk a little about mental accounting, 1409 00:54:34,250 --> 00:54:37,893 which is that people tend to sort of narrowly bracket their choices. 1410 00:54:37,893 --> 00:54:40,310 They might sort of have certain accounts in their behavior 1411 00:54:40,310 --> 00:54:43,190 and decide sort of separately as opposed to aggregating 1412 00:54:43,190 --> 00:54:45,260 their behavior as a whole. 1413 00:54:45,260 --> 00:54:47,690 And that's sort of an issue that's 1414 00:54:47,690 --> 00:54:50,510 less researched, but quite interesting overall. 1415 00:54:50,510 --> 00:54:52,220 Then we're going to move towards sort 1416 00:54:52,220 --> 00:54:54,230 of more radical deviations, if you want, 1417 00:54:54,230 --> 00:54:57,440 from the standard model, which is things like malleability 1418 00:54:57,440 --> 00:54:58,997 and accessibility of preferences, 1419 00:54:58,997 --> 00:55:01,580 which is to say people might not actually know what they want. 1420 00:55:01,580 --> 00:55:03,955 They might not even understand what their preferences are, 1421 00:55:03,955 --> 00:55:06,080 or their preferences are easily manipulable. 1422 00:55:06,080 --> 00:55:10,610 I can sort of make you choose A versus B or B versus A, 1423 00:55:10,610 --> 00:55:13,340 depending on what kind of situation I put you in. 1424 00:55:13,340 --> 00:55:17,150 And you might not even notice that I'm doing that. 1425 00:55:17,150 --> 00:55:18,785 And that makes things a lot trickier. 1426 00:55:18,785 --> 00:55:20,160 Because then, in some sense, it's 1427 00:55:20,160 --> 00:55:22,000 much harder to sort of say should 1428 00:55:22,000 --> 00:55:24,000 the government or any sort of other policy maker 1429 00:55:24,000 --> 00:55:25,980 choose A or B if we don't even know 1430 00:55:25,980 --> 00:55:28,397 what people's preferences are, when people don't even know 1431 00:55:28,397 --> 00:55:29,730 what their own preferences are? 1432 00:55:29,730 --> 00:55:31,890 We're going to talk about happiness and mental health. 1433 00:55:31,890 --> 00:55:33,900 Just kind of broadly speaking, what makes people happy? 1434 00:55:33,900 --> 00:55:35,900 And can we think about that, in particular, 1435 00:55:35,900 --> 00:55:37,698 for financial choices and other choices?
1436 00:55:37,698 --> 00:55:39,240 I'll tell you a little bit about some 1437 00:55:39,240 --> 00:55:40,698 of the work on mental health that I 1438 00:55:40,698 --> 00:55:43,920 have been doing, thinking about kind of how mental health might 1439 00:55:43,920 --> 00:55:46,355 affect economic behaviors and choices 1440 00:55:46,355 --> 00:55:48,480 and, in part, how people's demand for mental health 1441 00:55:48,480 --> 00:55:52,900 interventions might be shaped by influences of others. 1442 00:55:52,900 --> 00:55:56,190 We're going to talk about gender and racial discrimination, 1443 00:55:56,190 --> 00:55:59,220 where sometimes there are sort of classical and neoclassical 1444 00:55:59,220 --> 00:56:00,535 models of discrimination. 1445 00:56:00,535 --> 00:56:02,160 But we, in particular, think about 1446 00:56:02,160 --> 00:56:04,590 issues of discrimination that are not 1447 00:56:04,590 --> 00:56:08,760 rational, in the sense of discrimination that is, in some sense, 1448 00:56:08,760 --> 00:56:10,105 unfair. 1449 00:56:10,105 --> 00:56:11,730 Finally, we're going to talk about sort 1450 00:56:11,730 --> 00:56:14,130 of policy and paternalism. 1451 00:56:14,130 --> 00:56:16,170 One broad issue is about-- and this goes back 1452 00:56:16,170 --> 00:56:19,740 to malleability and inaccessibility of preferences, 1453 00:56:19,740 --> 00:56:22,680 which is essentially ways in which choices 1454 00:56:22,680 --> 00:56:26,040 can be affected through frames, defaults, and nudges. 1455 00:56:26,040 --> 00:56:28,920 And sometimes I say, I can manipulate your choice 1456 00:56:28,920 --> 00:56:30,630 architecture in certain ways that 1457 00:56:30,630 --> 00:56:32,570 make you choose certain things. 1458 00:56:32,570 --> 00:56:35,070 You know, I might sort of send you letters to do your taxes. 1459 00:56:35,070 --> 00:56:39,000 I can sort of frame or set certain defaults 1460 00:56:39,000 --> 00:56:43,690 for organ donations or for savings choices and the like. 1461 00:56:43,690 --> 00:56:47,175 And that might have profound effects on people's behavior. 1462 00:56:47,175 --> 00:56:49,050 Once we go through that, in the sense of showing 1463 00:56:49,050 --> 00:56:52,110 you some evidence that it's possible to do that, we're 1464 00:56:52,110 --> 00:56:54,450 going to talk a little bit about policy and paternalism 1465 00:56:54,450 --> 00:56:57,810 in the sense of saying, OK, now, if I know I can change 1466 00:56:57,810 --> 00:57:00,540 your behavior in certain ways, what kinds of policies 1467 00:57:00,540 --> 00:57:01,200 should we do? 1468 00:57:01,200 --> 00:57:02,670 And should we, in fact, do that? 1469 00:57:02,670 --> 00:57:04,770 Or should we tax people? 1470 00:57:04,770 --> 00:57:07,350 Should we set certain frames and defaults? 1471 00:57:07,350 --> 00:57:09,360 And you know, are we potentially making 1472 00:57:09,360 --> 00:57:11,730 some people worse off while making some people better 1473 00:57:11,730 --> 00:57:14,400 off by doing these kinds of policies? 1474 00:57:14,400 --> 00:57:17,715 There are often also some ethical issues associated with that. 1475 00:57:17,715 --> 00:57:19,840 And then, finally, I'm going to talk about poverty. 1476 00:57:19,840 --> 00:57:23,010 This is sort of mostly about the research that I do, 1477 00:57:23,010 --> 00:57:25,560 thinking about poverty issues 1478 00:57:25,560 --> 00:57:27,810 through the lens of psychology. 1479 00:57:27,810 --> 00:57:30,697 Again, I'll tell you a little bit about the work I do myself.
1480 00:57:30,697 --> 00:57:32,280 Partially, we get to talk a little bit 1481 00:57:32,280 --> 00:57:35,280 about how financial constraints, or just thinking 1482 00:57:35,280 --> 00:57:38,460 about money, affect people's behavior and then, second, how 1483 00:57:38,460 --> 00:57:40,530 other issues related to poverty 1484 00:57:40,530 --> 00:57:45,990 might shape people's choices, decision making, 1485 00:57:45,990 --> 00:57:48,835 and their labor market outcomes, earnings, and so on, 1486 00:57:48,835 --> 00:57:50,460 and where there's potentially something 1487 00:57:50,460 --> 00:57:53,430 that you might want to call a psychological or behavioral 1488 00:57:53,430 --> 00:57:55,160 poverty trap. 1489 00:57:55,160 --> 00:57:57,110 Any questions on these topics? 1490 00:58:03,350 --> 00:58:03,850 OK. 1491 00:58:03,850 --> 00:58:06,430 So then, readings for next time, which is Wednesday. 1492 00:58:06,430 --> 00:58:10,360 Please read-- there's a paper on the course website by Matthew 1493 00:58:10,360 --> 00:58:14,560 Rabin, who is one of sort of the all-stars 1494 00:58:14,560 --> 00:58:16,360 in behavioral economics. 1495 00:58:16,360 --> 00:58:19,047 20 years ago, he wrote a very useful perspective 1496 00:58:19,047 --> 00:58:21,130 on psychology and economics that sort of discusses 1497 00:58:21,130 --> 00:58:23,740 a lot of the issues of behavioral economics 1498 00:58:23,740 --> 00:58:25,840 and how to think about them. 1499 00:58:25,840 --> 00:58:28,000 Again, read sections one and two. 1500 00:58:28,000 --> 00:58:29,800 We're not going to test you specifically 1501 00:58:29,800 --> 00:58:31,870 on very specific things, but you should 1502 00:58:31,870 --> 00:58:35,350 be able to remember, at least roughly, what's 1503 00:58:35,350 --> 00:58:37,170 in that paper.