[SQUEAKING] [RUSTLING] [CLICKING]

PROFESSOR: OK, so what we're going to do in talking about social preferences is the following. We're going to start by discussing what social preferences are: what do we think social preferences are, how do we think about them, and how do we measure them? This is partially what we did in class last week, which gives you some sense of some ways of measuring them. There are also other ways of doing that, and we're going to discuss those. We're also going to discuss whether the measurements we did in class, which seem pretty contrived in various ways, are actually predictive of real-world behavior that we care about.

Then we're going to ask the following question. From measuring social preferences, as we did in class, but also in some experiments that we'll see, it looks like people are actually quite nice to each other. But are people just genuinely nice to each other? Do they actually care about others? Or is it perhaps social image or related motives: they want to look nice, or they want to feel good by being nice to others, or look like they're nice to others, as opposed to genuinely caring about the well-being of others? And we'll talk through some experiments and some evidence that let us disentangle those hypotheses--are people genuinely nice, or are they just trying to look nice in various ways?

And then we're going to talk a bit about evidence towards the end. This is the paper by Gautam Rao and others, which asks whether social preferences are malleable and whether policies can change pro-sociality.
That is to say, can we do certain things, can we mix people in certain ways? The specific case we're going to talk about is from India, where roommates are chosen so that poor students are mixed with rich students. Does that affect how nice people are--the rich students, in this case, towards poor students, but also towards other people in general? So are there some policies we could implement that help people, and society as a whole, be more pro-social, with people being nicer to each other or doing things that are good for the greater good, if you want, overall?

As before, we're going to look at two types of evidence: lab experiments and field evidence. Lab experiments are the types of experiments you've seen so far. Field evidence is the type of evidence where we look at, say, rich and poor students being mixed as roommates: what happens then to people's social preferences?

OK, so now let's back up a little bit into a quick review of what we did last week. I showed you, in action, three games to measure social preferences. These are not randomly selected games; these are the most prevalent, most commonly used games in behavioral and experimental economics: the dictator game, the ultimatum game, and the trust game. You can read descriptions and discussions of these games in the background reading, Camerer and Fehr 2004.

Now, what are these games? I'm going to briefly review them right now, and then I'm going to go to evidence that actually uses these games. So what are the games, just to recap? There's the dictator game, which is very simple: the dictator gets some money allocated to him or her, and there's another person who is the recipient. The dictator just gets to choose how much money he or she wants to give to this other person. That's it.
You can think of this as measuring some raw concern for others. Essentially, when you don't know anything about this other person--often these games are anonymous--here's an anonymous person to whom you could give some money. Often it's another student; it could be a poor person in Kenya or elsewhere. Here's a person with whom you don't have any interactions whatsoever: how much money do you want for yourself, and how much money do you want for this other person? The original version is just one to one: any dollar that you give goes directly to the other person. There are different versions where the exchange rate is varied, so giving is cheaper or more expensive: if you give a dollar, the other person gets $2.00 or $0.50 or the like.

OK, so that's the dictator game. What's the ultimatum game? It's a bargaining game, if you want--very, very simple game theory--used again and again by experimental economists. There are two players again. There's a proposer, or sender, who is given some divisible pie. Think of this as $10 or the like; often it's money. The proposer then offers a portion x of that pie to the responder. The responder then has the chance to either accept or reject. If the responder accepts, the division is simply implemented. If the responder rejects, then both players get nothing.

And why is it a game now? Well, it's because the proposer needs to take into account what the other person is going to do, have some beliefs about that, and, anticipating that, make the choice of how much money to offer.
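To make the payoff structure of these two games concrete, here is a minimal sketch in Python. It is not part of the lecture; the function names, the $10 pie, and the exchange-rate default are illustrative assumptions.

```python
# Minimal payoff rules for the dictator and ultimatum games.
# Illustrative sketch only; amounts and the exchange rate are assumptions.

def dictator_payoffs(endowment, give, exchange_rate=1.0):
    """Dictator keeps (endowment - give); recipient gets give * exchange_rate.
    An exchange_rate above 1 makes giving 'cheap', below 1 makes it 'expensive'."""
    return endowment - give, give * exchange_rate

def ultimatum_payoffs(pie, offer, accept):
    """Proposer offers `offer` out of `pie`; responder accepts or rejects.
    If rejected, both players get nothing."""
    if accept:
        return pie - offer, offer
    return 0.0, 0.0

# Examples with a $10 pie.
print(dictator_payoffs(10, 3, exchange_rate=2.0))  # (7, 6.0): giving $3 becomes $6
print(ultimatum_payoffs(10, 4, accept=True))       # (6, 4)
print(ultimatum_payoffs(10, 4, accept=False))      # (0.0, 0.0)
```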
Finally, there's the trust game, which is in fact very similar to the ultimatum game, except that the amount sent by the sender is tripled before the recipient decides how much to return, if anything. So now, essentially, it's a high-return opportunity to give money in a situation of high trust. In particular, suppose there's communication. If I'm playing with somebody, I would tell that person: yes, give me all of the money, and if you're nice to me, that money is tripled, and I give you half of it back, or the like. So both of us are better off. If there's lots of trust between us, that's actually implementable, in the sense that I'm telling you, give me that money, you give me that money, and then I actually return that money. But of course, if there's no trust--and that's where the game's name comes from, the trust game--I'm going to tell you, give me a bunch of money, you might actually do that, and then I'll just keep it all for myself. Or you might not even believe me in the first place. You might just say, yeah, I'd love to, but I don't trust you, Frank. I give you nothing. And then essentially, nothing is tripled. And that's meant to measure some form of trust in society, or the trust that people have with each other. We're going to come back to what that actually measures in the real world.

So, having said that, those are the three games you saw. And then there are different versions of these games. You can vary the privacy: are these games private versus public? Is it anonymous? Do you know who the other person is or not? Is there communication? Can you talk to this other person or not? And what are the stakes? Are we talking about hypothetical choices? Are we talking about money, even high stakes with money? Are we talking about other goods like apples or chocolate?
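For completeness, here is the same kind of minimal, illustrative sketch for the trust game described above. The tripling is the multiplier mentioned in the lecture; the names and the amounts in the examples are assumptions.

```python
def trust_game_payoffs(endowment, sent, returned_fraction, multiplier=3):
    """Sender keeps (endowment - sent); the amount sent is multiplied
    (tripled in the standard version) before reaching the receiver,
    who then returns some fraction of what they received."""
    received = sent * multiplier
    returned = received * returned_fraction
    sender_payoff = endowment - sent + returned
    receiver_payoff = received - returned
    return sender_payoff, receiver_payoff

# High trust: send everything, half comes back -> both better off than (10, 0).
print(trust_game_payoffs(10, 10, 0.5))  # (15.0, 15.0)
# Trust not honored: send everything, nothing comes back.
print(trust_game_payoffs(10, 10, 0.0))  # (0.0, 30.0)
```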
Any questions so far? We'll get back to the actual evidence, but I want to make sure these games are clear to everyone. That's what happened last time.

OK. Now, backing up a little bit, the question is: what are social preferences, in fact? Most economic analysis in some way assumes away social preferences and assumes self-interest, very narrowly defined. That means people essentially only care about their own interests and their own outcomes, and the market takes care of the rest. This is at the heart of what Adam Smith originally said, I think, in The Wealth of Nations: "It's not from the benevolence of the butcher, the brewer, or the baker that we expect our dinner, but from their regard for their own interest. We address ourselves not to their humanity, but to their self-love, and never talk to them of our necessity, but of their advantage."

That is to say, when everybody is selfish and cares about themselves, comparative advantage and the ability to trade with each other will make sure that other people do useful things for us. You can do the work you do, make some money, and then go to the bakery. And the baker will give you some bread in exchange for some money, not because the baker likes you or wants to be friends with you, but because you pay them. It's in their interest to give you good bread because you're going to pay them, and perhaps come back and buy more bread in the future. So that's a pretty reasonable assumption: people do useful things for each other as part of a market transaction. And that's what a lot of economics has assumed and done for a long time.

Now, that's not a bad assumption at all; it's actually pretty realistic in various ways in thinking about humans.
The question is not necessarily whether there are some instances where people care about others. The question we're going to ask is: what do we miss by ignoring social preferences? It's pretty clear that preferences depart from pure self-interest in non-trivial ways. And the question we're going to ask is: are there some concrete settings where we really care about this, where we should model things differently? And if we do that, can we understand certain phenomena better? So the goal here is to understand how common and important these departures are and what their nature is--how can we best model them?

And here's a nice example of how one might want to think about social preferences, and how perhaps people do think about social preferences. This is a picture--I don't know how well you can see this--taken in Hawaii, of a box where banana bread is being sold. It's freely available to everybody, so you can just take it. I think there's a price tag somewhere. There's also a lockbox next to it, on the left of the picture, where you can deposit payments. So the banana bread costs some money, and you can deposit your payment. There's nobody who enforces it, but the lockbox is locked in the sense that you can't take away the box.

So when you see that picture, what perception of social preferences, or of humanity, does it reflect? How do we think about this picture? Yes?

AUDIENCE: That we're going to be honest and give the money based on how much [INAUDIBLE].

PROFESSOR: Right. So one assumption, to some degree, is, well, people seem pretty nice in general. We allow them to just take the banana bread and run away and not put in anything. We ask them nicely: it costs $1 or $5 or whatever, please put in the money.
It seems like the conception, to some degree, is that people are going to be nice and just do that. Is that all, or is there something else? Yes.

AUDIENCE: Maybe there's an expectation, not that everyone would do the right thing, but that the loss from some people who would just take the bread and run will be maybe less than, for example, the cost of labor of staffing the booth.

PROFESSOR: Right, exactly. So the expectation, to be more precise, is not that everybody is going to be nice, but that, on average, most people are going to be pretty nice, or will adhere to the norm that you should pay some money. And some people might run, but as you say, it might not be worth standing there in the heat all day to make up for that. Exactly. And then what else is there? Yes.

AUDIENCE: It gives a perception of, like, you don't expect them to steal banana bread, but you expect them to steal the money, so you lock it.

PROFESSOR: Right, there's a lock. So in some sense, on the one hand, the assumption is that most people are going to be pretty nice. But if we allowed them to run away with all the money, at least some people, if allowed, would not be nice at all. In fact, they would take away the money and not only not pay for the banana bread, but also take away the lockbox.

So that's exactly what I've written down here: people in general are nice enough. Most will probably pay for the banana bread, so it's not worth standing around to ensure that. But people can also be selfish; at least some people will be quite selfish. So if a lot of cash were easy to take, someone, at least, would take it. Notice that's not an assumption about everybody, or even the majority. It's just to say there's a fraction of people who, if given the opportunity, might take our money and run with it.
So that's actually a pretty good perspective for thinking about social preferences, which is to say that self-interest is probably a major driver of behavior in many economic contexts. In some situations, self-interest is not the main motive. Again, we don't necessarily need to enforce this for everybody; people will be quite nice. But some seemingly small departures from pure self-interest can dramatically influence economic outcomes. That is to say, if we couldn't lock the box, if we weren't doing that, somebody would take away the box. And then we couldn't offer banana bread to anybody, and the whole transaction would break down. Put differently, if we had things like theft or robbery, if there were some people who are really mean and essentially take away all of our money, then we can't do any transactions. I can't have a shop trying to sell things to people.

These are examples [INAUDIBLE] positive and negative. Small things can actually make huge differences for others: helping a stranger with directions, or helping a stranger who wants to go to the hospital, or calling an ambulance for somebody. That can make huge differences in people's lives, even though the action on your side is relatively small. Helping a fellow student with computer trouble or other issues they might have, or helping somebody who is sad on a given day feel better about themselves. In fact, as you might know, it's Random Acts of Kindness Week this week, so doing random acts of kindness might actually make a huge difference in people's lives. In fact, the problem set will ask you to do some of that. Washing your hands is particularly pertinent right now. Again, you might think you're doing that to protect yourself, and surely you are.
But in fact, there's also potentially a huge externality: by protecting yourself, you protect others from potentially getting sick. And that could make a huge difference in people's lives. Any questions on this?

OK. So now we're getting back to the ultimatum game. I'm going to ask how people actually behave in real situations. How do people behave in general? What does game theory say about how people should behave? How do people behave in reality, and how does that compare to how you guys behaved here in class?

So again, the ultimatum game you should be familiar with. There's a proposer and a responder. The proposer proposes some split of the pie. The responder can accept or reject. Now, what does game theory predict people should do, in the absence of social preferences? Yes?

AUDIENCE: [INAUDIBLE].

PROFESSOR: Right. If you're the recipient, you care only about your own outcomes. Even if I offer you $0.01 or $0.05 or whatever, as long as it's worth the transaction cost, you're going to accept everything. So if I'm playing with you, I'm going to anticipate that you will accept essentially any offer that's positive. Even zero you might accept, because why would you care? But you strictly prefer a positive offer over no offer, so you're going to accept that. And then if I'm playing with you, anticipating that, I will just give you essentially the lowest possible amount, and that will be the outcome. Right?

So here it is. The prediction is very clear. This is, again, game theory, but game theory in the absence of social preferences--I should be more precise. The responder cares only about money, so she will accept any offer x that's greater than 0, any amount that's given, because that's better than not having any money. The proposer understands that, only cares about money as well, and so offers as little as possible. The proposer doesn't care how the recipient is doing, so the proposer will just offer as little as possible to maximize his own payout. And then, in equilibrium, essentially the zero or smallest possible offer will be made, and that offer will be accepted. So there will be no rejections, because there's no reason to reject, and there will also be no large offers, because there's no reason to give more than minimal amounts.
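That prediction can be spelled out as a small backward-induction sketch, illustrative only; the $10 pie and the one-cent grid are assumptions, not from the lecture.

```python
# Backward induction in the ultimatum game with purely self-interested players.
# The pie is discretized into cents; this is an illustrative sketch.

def responder_accepts(offer):
    # A purely money-motivated responder accepts any positive offer
    # (assume rejection at exactly zero, where she is indifferent).
    return offer > 0

def proposer_best_offer(pie_cents):
    # The proposer keeps as much as possible subject to acceptance,
    # so the best accepted offer is the smallest positive amount.
    best = None
    for offer in range(pie_cents + 1):
        if responder_accepts(offer):
            payoff = pie_cents - offer
            if best is None or payoff > best[1]:
                best = (offer, payoff)
    return best  # (offer in cents, proposer payoff in cents)

print(proposer_best_offer(1000))  # -> (1, 999): offer one cent, keep $9.99
```

The point of the sketch is simply that with purely money-motivated players the smallest positive offer is made and accepted, which is exactly what the behavior described next contradicts.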
OK. So then, what are the typical results in the ultimatum game? What do people actually find? Well, most people offer between 40% and 50% of the pie. Such offers are mostly accepted. The acceptance rate is increasing in the offer: if you offer 10%, you have a pretty good chance of being rejected; if you offer 30%, 40%, 50%, you're much more likely to be accepted; and if you offer 70% or 80%, you'll almost surely be accepted. And offers below 20% are mostly rejected. Why is that? What's the reason for that, presumably? Yes?

AUDIENCE: [INAUDIBLE].

PROFESSOR: Right. And why are you upset if somebody offers you 10%?

AUDIENCE: Because it shows they're disrespecting you.

PROFESSOR: Yeah, something about fairness or disrespect, or it's just not a nice thing to do. And now you'd rather not have any money, or not take the $1 or whatever they offer you. You'd rather reject that, so you're willing to give up $1 to essentially get back at the other person. Yeah?

AUDIENCE: It could also be that for some people, the thought of settling for such small amounts of money is embarrassing, or it feels like it's demeaning or [INAUDIBLE].

PROFESSOR: Yeah, so it's one of those.
Some of it is--yeah, exactly--demeaning, I think. One way to put this is that you have inequality aversion in some ways, in the sense that you just don't want to be the person who has $1 when the other person has $9, or $0.10 when the other has $9.90. You really don't like inequality in some ways. There are some versions of this game where the computer allocates these choices, and you can look at whether people reject those offers as well. They tend not to, which suggests that it's really coming from a sense that people don't care so much about the outcome per se; they care about the fairness or the intention of the other person. But you're right, there are other potential considerations at play.

Now, what do things look like in our class? In fact, this looks quite similar to what people do in general. I know you're very special, but when it comes to the ultimatum game, you're actually pretty ordinary. What you see is that most people offer 40% or 50%; some people offer somewhat more. In red--you can't see this that well--are the rejections. Rejections tend to be on the left side, in particular when people offer between 1% and 10%. There seem to be some rejections around 50%, or maybe just below 50%. But usually, if you look at about 40%, 50%, 60%, people tend not to reject any offers. Any questions on that? Yeah?

AUDIENCE: Could it also be that people are anticipating future games? Or maybe not [INAUDIBLE], but sending a signal, like there's any similar [INAUDIBLE] be better off [INAUDIBLE]?

PROFESSOR: Yes, that's a great question.
So, all of what I was doing--and I should emphasize this a bit more--the way I implemented some of these games in class, some experimental economists would get a heart attack over how sloppy that was. Usually, when I show you these results, they come from very carefully run games: usually it's anonymous, usually it's a one-shot game, and usually it's very clear that it's a one-shot game. You will just never see that person; you never even interact with that person. It will be clear whether it's public versus private, and so on. So usually in those games, things are extremely clear. There are no strategic interactions, and so on; all of these things are essentially shut down. And then you can randomly vary some of these characteristics and look at deviations from that.

So you're right. In our game, you might think, let's just be nice now, in part because maybe I'll play with that other person again. And I wasn't really very precise about that either. I think they were actually all randomly allocated, so in fact that was not a consideration, but it wasn't entirely clear. Or, for example, some people were selected, and it happened to be a 50/50 selection, and that may have affected others in the class, and so on. So a lot of these things were fairly sloppy in terms of the actual implementation. In the real games, that would all be shut down and done much more carefully.

There were some other questions, I think? No. OK. Yeah?

AUDIENCE: Was this just for the chocolate?

PROFESSOR: I looked at all of these; the ultimatum games all looked very similar. This one, I think, is the money one. But even for the others, it looked fairly similar.
I'll show you some variation in the dictator game in a second: do the stakes matter? Does it matter whether it's private versus public? Does it matter whether it's chocolate or money? And perhaps, why might it matter? It turns out it does matter. I'll say a little bit about that in a second. Yeah?

AUDIENCE: So the fraction of rejections in the green [INAUDIBLE] others, and then [INAUDIBLE] is much higher. The number of rejections in, like, $1 to $10 is much higher than [INAUDIBLE]?

PROFESSOR: Yeah.

AUDIENCE: [INAUDIBLE] explain the notion of fairness or the equitable distribution of the pie you just mentioned, because [INAUDIBLE] accept when they are being given 90% of the pie, whereas they are just not ready to accept [INAUDIBLE].

PROFESSOR: Right, that's a good point. You were saying earlier that maybe some people have issues with inequity or inequality across people. If that were the only explanation, then you would see rejections not only when I give you $1 and keep $9, but also when I give you $9 and keep $1. So that means there's something else. I think it's fairness: people don't object to inequality as such. If I say, OK, I'll give you a lot, and you have more than I do, people don't object to that. People object when I keep a lot and give you very little, presumably because I'm selfish and I'm not nice to you. So that's the sense of fairness. It's a fairness that goes beyond the outcomes. We're going to talk about this for a bit. You could just look at the outcomes and say, how much inequality of outcomes is there, and I care about that. But that doesn't seem to be the case. What seems rather to be the case is that people care about intentions one way or the other: what was I thinking, or was I being a mean person when I was doing this?
And if you get the sense that I was not very nice, you might reject almost regardless of what the outcome will eventually be. Of course, you will reject less if it's more costly for you to do so. But perhaps it's also the case that if I do, say, 70-30, you might say, well, that's maybe not nice, but it's not that unfair either.

OK. So now, one thing you can ask is: what do these games look like across countries? As I said, this game has been played extensively; lots and lots of people have played it in all sorts of situations. And the patterns are remarkably stable across countries. This is also the case in class: every time I play this in class, the patterns look exactly the same. Whether people are playing in Pittsburgh, in Ljubljana, in Yogyakarta, or in Tokyo, they all play essentially the same way. And even when you increase the stakes to several months' pay, by running the experiment with poorer populations where you can pay days', weeks', or even a month's pay, people seem to play very similarly across places. Why is that? It seems that rejecting becomes a lot more costly when I offer you, say, a day's wage. But why do you think it still looks fairly similar? Why do people still reject? Yes?

AUDIENCE: If your goal is to get back at the other person, if the pool is just bigger, then you're hurting them more by denying them.

PROFESSOR: Exactly as you said. There are two things going on. On the one hand, rejection becomes really expensive for me. If the pie is a month's wage, and you give me 10% of that and keep 90%, rejecting 10% of a month's wage--that's three or four days' pay or whatever. That's really costly for me to do. However, it's also really mean of you to do that, and I can really screw you over now by rejecting that offer.
So those two forces essentially seem to be more or less [INAUDIBLE]. They go in opposite directions, and empirically they seem to cancel each other out. That's exactly right.

So now, there's a very interesting paper that runs this ultimatum game in hunter-gatherer societies. People seem to behave fairly similarly in these games in industrialized societies--whether they're relatively rich or not, people behave very similarly. But perhaps if you look at hunter-gatherer societies, which are quite different in various ways, you can find interesting differences across places. On the one hand, we see lots of uniformity of play: in lots of places, people behave very similarly. But there are some important differences. And what this line of research--often called cultural economics--is trying to do is to see what the role of culture is. Nathan Nunn and other people at Harvard are doing this in particular, inspired by Joe Henrich, who is the leader of that work, trying to look at the role of culture in shaping people's preferences.

One thing they did--at the time, they looked at 15 different hunter-gatherer societies. You can see them here on the map. Some of them are in South America, some are in Africa, and some are in Asia and Oceania. By looking at the way these societies live and produce, you can try to form predictions about how they might behave in terms of social preferences. And there are two examples here. One is the society you can see here, the Machiguenga, which are, I guess, in--which country is this?

AUDIENCE: Peru.

PROFESSOR: In Peru?

AUDIENCE: Peru.

PROFESSOR: Who are in Peru, in Latin America. You can see them. They are essentially independent families.
They do a lot of cash cropping. Their way of living appears to be slash-and-burn agriculture, gathering food, fishing, and hunting. So overall--and you can correct me if that's wrong--it seems to be fairly individualistic in the way people produce. Essentially, they live individualistically; they are not really relying on a lot of cooperation in the way they produce their food or make their living.

In contrast, the Lamalera in Indonesia are whale hunters. Essentially, when you try to hunt a whale--I don't quite know, but I imagine, at least--it requires lots of cooperation, in the sense that people have to work together by the very nature of finding food or hunting these whales. So on a daily basis, they are really relying on cooperation.

And so this line of research by Henrich and others is trying to see: is people's way of living and producing in daily life predictive of their behavior in ultimatum games? And what you see is the following. I'll just show you the results. The society in Peru, the slash-and-burn horticulturalists, seem to have little concern for others outside of the family or for social status. What you see, essentially, is relatively low offers. Most of these offers are accepted; rejection rates are very low. Rejections of amounts below 20% happen only about 1 time out of 10. So this society that looks a lot more individualistic is, in fact, much closer to the neoclassical assumption of no social preferences mattering: at least in the way they play the ultimatum game, they seem happy to offer low amounts, and they also seem not to be upset or annoyed, or to get back at others who offer them low amounts.
In contrast, if you look at the Lamalera, the whale-hunting culture based on strong and meticulously detailed cooperation, what you see is that the mean offer is 57%, and the mode is 50%. There are quite a few rejections, in particular rejections of small amounts. That is to say, there's really an expectation that people cooperate, and if you don't cooperate, if you're not playing by those kinds of rules, you're punished accordingly.

And there are some others. There's another culture, the Au/Gnau, where gift giving is an avenue to status and gives the right to ask for reciprocity at any time. That's something you often see in India as well: people tend to reject offers that are very large, in part because they think that's inappropriate, just a weird thing to do. If I offer you 50% or 60%, they think, if I accept this offer, now I owe you something and have to give you stuff back. You are nodding.

AUDIENCE: Oh, yes.

PROFESSOR: Yes. So that's the case in India as well, to some degree. Any comments on this? Yeah.

AUDIENCE: Was that last experiment anonymous? Because if it's anonymous, then there's no sense of this specific person [INAUDIBLE].

PROFESSOR: That's a great question. I think it may not have been; I need to check. There's some question of how closely--these are 15 societies, so strictly speaking these are not statistical tests of whether culture causes this behavior, because you only have essentially 15 observations of the ways people behave. So the mapping between the description I was giving you, which surely was inaccurate in terms of how people really live in those societies, and their behavior is more of a descriptive mapping.
779 00:32:59,220 --> 00:33:02,480 So in some ways, it could be that even if it were anonymous, 780 00:33:02,480 --> 00:33:04,790 people behave in certain ways that reflects 781 00:33:04,790 --> 00:33:09,155 a culture of cooperation is to say, it's just inappropriate-- 782 00:33:09,155 --> 00:33:10,910 or reflect, essentially, when they say, 783 00:33:10,910 --> 00:33:13,760 if somebody gives you a large amount or a large gift, 784 00:33:13,760 --> 00:33:15,380 you're not supposed to accept it. 785 00:33:15,380 --> 00:33:18,260 You might sort of not accept it even in an anonymous game 786 00:33:18,260 --> 00:33:21,200 because you feel bad about it afterwards 787 00:33:21,200 --> 00:33:26,150 because you can't reciprocate, if that makes sense. 788 00:33:26,150 --> 00:33:29,000 But I have to look it up, what the specific details were. 789 00:33:29,000 --> 00:33:31,700 The paper is also-- and it's a very short paper. 790 00:33:31,700 --> 00:33:34,280 So it's also on the course website if you want to look. 791 00:33:36,800 --> 00:33:39,440 Any other comments? 792 00:33:39,440 --> 00:33:41,710 Are you from Peru? 793 00:33:41,710 --> 00:33:42,960 Do you know about the society? 794 00:33:42,960 --> 00:33:45,200 Is that remotely accurate what I was saying? 795 00:33:45,200 --> 00:33:46,980 AUDIENCE: I'm not really sure. 796 00:33:46,980 --> 00:33:49,290 PROFESSOR: All right, well, you can look into this 797 00:33:49,290 --> 00:33:51,570 and report back. 798 00:33:51,570 --> 00:33:52,140 OK. 799 00:33:52,140 --> 00:33:55,260 But anyway, it's a very interesting application 800 00:33:55,260 --> 00:33:58,410 of the ultimatum game because it essentially tries to say, 801 00:33:58,410 --> 00:34:00,390 we're trying to look at how do people 802 00:34:00,390 --> 00:34:01,590 relate in the real world. 803 00:34:01,590 --> 00:34:03,840 And looking at these different societies 804 00:34:03,840 --> 00:34:05,580 gives you some sense of, like, it 805 00:34:05,580 --> 00:34:08,820 does seem to capture something in the real world of how 806 00:34:08,820 --> 00:34:12,670 people are, in fact, behaving. 807 00:34:12,670 --> 00:34:13,650 OK. 808 00:34:13,650 --> 00:34:17,040 So now we can sort think about what's 809 00:34:17,040 --> 00:34:18,306 going on in ultimatum games. 810 00:34:18,306 --> 00:34:19,889 And we talked about this a little bit, 811 00:34:19,889 --> 00:34:21,681 but I want to be a little bit more precise. 812 00:34:21,681 --> 00:34:24,659 It's like, why do responders reject low offers? 813 00:34:24,659 --> 00:34:26,413 What's going on here? 814 00:34:26,413 --> 00:34:27,330 What are people doing? 815 00:34:32,710 --> 00:34:33,433 Yes? 816 00:34:33,433 --> 00:34:35,620 AUDIENCE: [INAUDIBLE] spite. 817 00:34:35,620 --> 00:34:37,570 PROFESSOR: Yes, and so the spite is-- 818 00:34:37,570 --> 00:34:39,489 so what exactly are you objecting to? 819 00:34:39,489 --> 00:34:43,690 I asked this before, but let me ask again. 820 00:34:43,690 --> 00:34:46,664 AUDIENCE: So I think a lot of things have already been said. 821 00:34:46,664 --> 00:34:51,750 If you will reject, you think it's unfair, so much so 822 00:34:51,750 --> 00:34:53,719 that you would rather that they get nothing, 823 00:34:53,719 --> 00:34:55,605 you both get nothing, than [INAUDIBLE] 824 00:34:55,605 --> 00:34:57,032 take this pity offer. 825 00:34:57,032 --> 00:34:57,740 PROFESSOR: Right. 826 00:34:57,740 --> 00:34:58,910 So there's two things. 827 00:34:58,910 --> 00:35:01,920 Let me put them out here. 
828 00:35:01,920 --> 00:35:02,920 There's two things here. 829 00:35:02,920 --> 00:35:04,390 You're talking about the second thing, I think, 830 00:35:04,390 --> 00:35:05,680 which is the fairness part. 831 00:35:05,680 --> 00:35:09,460 So one is about a procedural thing, which is to say, 832 00:35:09,460 --> 00:35:11,440 I object to procedure. 833 00:35:11,440 --> 00:35:15,430 I object to you not treating me nicely or fairly. 834 00:35:15,430 --> 00:35:16,960 I'm just mad at you for doing that. 835 00:35:16,960 --> 00:35:18,980 Therefore, I reject. 836 00:35:18,980 --> 00:35:20,500 Hypothesis number one-- this is what 837 00:35:20,500 --> 00:35:22,690 we were talking about earlier, which is to say, 838 00:35:22,690 --> 00:35:24,280 I just dislike being behind. 839 00:35:24,280 --> 00:35:26,150 You offer me-- you keep 6. 840 00:35:26,150 --> 00:35:26,830 I have 4. 841 00:35:26,830 --> 00:35:30,730 I just don't like having less than somebody else, 842 00:35:30,730 --> 00:35:33,670 and I might reject that as well. 843 00:35:33,670 --> 00:35:36,580 Can we disentangle those explanations? 844 00:35:41,100 --> 00:35:42,870 From dictator game data? 845 00:35:52,020 --> 00:35:52,808 Yes? 846 00:35:52,808 --> 00:35:57,390 AUDIENCE: You can have some [INAUDIBLE].. 847 00:35:57,390 --> 00:36:03,030 I guess, reject [INAUDIBLE]. 848 00:36:03,030 --> 00:36:03,810 PROFESSOR: Right. 849 00:36:03,810 --> 00:36:06,120 So from the data that I just showed, 850 00:36:06,120 --> 00:36:08,760 or from the pure, simple dictator game, 851 00:36:08,760 --> 00:36:10,380 it's very hard to disentangle these 852 00:36:10,380 --> 00:36:12,990 because it looks the same. 853 00:36:12,990 --> 00:36:15,720 But what you're saying now is we can have different versions 854 00:36:15,720 --> 00:36:19,185 of that where you could have either the computer implement 855 00:36:19,185 --> 00:36:23,430 it, or you could choose for other people, and so on. 856 00:36:23,430 --> 00:36:24,930 I think that's what you were saying. 857 00:36:24,930 --> 00:36:27,250 And then from that, you could say, well, 858 00:36:27,250 --> 00:36:30,720 if you're also rejecting that, then perhaps it's 859 00:36:30,720 --> 00:36:32,550 not about the procedure per se. 860 00:36:32,550 --> 00:36:36,270 If the computer chose 6 and 4 and you reject it anyway, 861 00:36:36,270 --> 00:36:38,370 then it's not to do with fairness, 862 00:36:38,370 --> 00:36:40,800 because the computer was perhaps just randomizing. 863 00:36:40,800 --> 00:36:42,750 But it's rather to do with, like, I 864 00:36:42,750 --> 00:36:46,660 don't like to have less than somebody else in the same game. 865 00:36:46,660 --> 00:36:48,397 Right? 866 00:36:48,397 --> 00:36:48,980 AUDIENCE: Yes. 867 00:36:48,980 --> 00:36:50,480 So if you used a computer, you would 868 00:36:50,480 --> 00:36:54,530 be able to [INAUDIBLE] just disliking [INAUDIBLE] less. 869 00:36:54,530 --> 00:37:01,416 If you have an individual respond to [INAUDIBLE] accept 870 00:37:01,416 --> 00:37:04,083 or reject an offer that somebody else made, 871 00:37:04,083 --> 00:37:05,215 it's quite [INAUDIBLE]-- 872 00:37:05,215 --> 00:37:05,360 PROFESSOR: Right. 873 00:37:05,360 --> 00:37:07,813 AUDIENCE: Then you would see whether they were just 874 00:37:07,813 --> 00:37:08,722 trying to punish. 875 00:37:08,722 --> 00:37:09,430 PROFESSOR: I see. 876 00:37:09,430 --> 00:37:10,130 Yes, I see. 877 00:37:10,130 --> 00:37:10,630 I see. 878 00:37:10,630 --> 00:37:12,170 So yes, you can do that. 
879 00:37:12,170 --> 00:37:14,780 There's a bit of an issue, a little bit like-- 880 00:37:14,780 --> 00:37:19,250 another version of that would be just I don't like inequality, 881 00:37:19,250 --> 00:37:21,590 but I just might want to-- 882 00:37:21,590 --> 00:37:25,082 I just really like 5-5, or 50-50, outcomes. 883 00:37:25,082 --> 00:37:27,540 There, I think we can look at what you were saying earlier, 884 00:37:27,540 --> 00:37:31,010 which is to say I could see if there's a third party that 885 00:37:31,010 --> 00:37:34,160 looks at somebody choosing 60 for themselves 886 00:37:34,160 --> 00:37:37,700 and 40 for somebody else, versus somebody choosing 40 887 00:37:37,700 --> 00:37:39,860 for themselves and 60 for somebody else. 888 00:37:39,860 --> 00:37:41,510 If you really dislike inequality, 889 00:37:41,510 --> 00:37:43,970 you would just reject both of those things. 890 00:37:43,970 --> 00:37:48,230 And if you instead dislike inequality 891 00:37:48,230 --> 00:37:50,700 only when the person 892 00:37:50,700 --> 00:37:55,940 creates it for themselves in some selfish way, 893 00:37:55,940 --> 00:38:00,660 you would only reject the 60-40, but not the 40-60. 894 00:38:00,660 --> 00:38:01,160 Great. 895 00:38:04,990 --> 00:38:05,490 Right. 896 00:38:05,490 --> 00:38:09,553 So the answer is no, we cannot distinguish these motives based 897 00:38:09,553 --> 00:38:10,470 on the ultimatum game. 898 00:38:10,470 --> 00:38:13,290 So we need some additional evidence to do that. 899 00:38:13,290 --> 00:38:15,750 We're going to do that mostly next time. 900 00:38:15,750 --> 00:38:19,180 We're going to try to make some progress towards that. 901 00:38:19,180 --> 00:38:22,380 Now, if you now think about the proposer's motive-- 902 00:38:22,380 --> 00:38:25,230 if a proposer gives a large or a small amount, what can 903 00:38:25,230 --> 00:38:26,100 we learn from that? 904 00:38:26,100 --> 00:38:28,767 Or why is it difficult to figure out what the proposer is doing? 905 00:38:32,460 --> 00:38:34,800 If you see somebody gives a large amount of money, 906 00:38:34,800 --> 00:38:36,200 what have you learned from that? 907 00:38:36,200 --> 00:38:36,934 Yeah? 908 00:38:36,934 --> 00:38:39,825 AUDIENCE: It could be that they're just being altruistic, 909 00:38:39,825 --> 00:38:44,040 or it could be that they have beliefs about what 910 00:38:44,040 --> 00:38:48,347 the responder would reject versus what the responder would 911 00:38:48,347 --> 00:38:49,048 not reject. 912 00:38:49,048 --> 00:38:51,138 And they could reject [INAUDIBLE]. 913 00:38:51,138 --> 00:38:51,930 PROFESSOR: Exactly. 914 00:38:51,930 --> 00:38:54,087 So it could be that I'm just trying to be nice. 915 00:38:54,087 --> 00:38:55,170 I'm trying to be friendly. 916 00:38:55,170 --> 00:38:55,980 I care about you. 917 00:38:55,980 --> 00:39:00,420 I offer you a 50-50 because I want you to have half. 918 00:39:00,420 --> 00:39:02,010 Or it could be that I just actually 919 00:39:02,010 --> 00:39:03,420 don't care about you at all. 920 00:39:03,420 --> 00:39:05,753 I just think if I offer you only 30 or 20 921 00:39:05,753 --> 00:39:07,170 or whatever, there's a good chance 922 00:39:07,170 --> 00:39:08,610 that you might reject it. 923 00:39:08,610 --> 00:39:12,510 Therefore, I just optimize over my-- 924 00:39:12,510 --> 00:39:14,490 so it depends on my beliefs about what 925 00:39:14,490 --> 00:39:16,110 the rejection rates are.
926 00:39:16,110 --> 00:39:20,790 And if the beliefs are that below 30% or 40% people's 927 00:39:20,790 --> 00:39:23,520 rejection rates are positive and sort of higher, 928 00:39:23,520 --> 00:39:25,770 I might essentially just maximize depending on my risk 929 00:39:25,770 --> 00:39:26,858 preferences or whatever. 930 00:39:26,858 --> 00:39:27,900 Suppose I'm risk-neutral. 931 00:39:27,900 --> 00:39:30,930 I might just maximize expected payout. 932 00:39:30,930 --> 00:39:33,240 It might be that essentially 40% or 50% 933 00:39:33,240 --> 00:39:37,320 is just the optimal solution to that maximization problem, 934 00:39:37,320 --> 00:39:40,240 and I just don't care about you whatsoever. 935 00:39:40,240 --> 00:39:42,210 So if I'm looking at this behavior essentially, 936 00:39:42,210 --> 00:39:44,670 if I'm looking at an ultimatum game 937 00:39:44,670 --> 00:39:48,120 and looking at a proposer offering large amounts 938 00:39:48,120 --> 00:39:50,730 or positive amounts, you can learn nothing about their-- 939 00:39:50,730 --> 00:39:52,200 or it's very hard to learn anything 940 00:39:52,200 --> 00:39:55,110 about their social preferences, because essentially there's 941 00:39:55,110 --> 00:39:58,860 strategic motives potentially involved as well. 942 00:39:58,860 --> 00:39:59,790 Is that clear? 943 00:39:59,790 --> 00:40:01,154 Or any questions? 944 00:40:05,050 --> 00:40:06,300 So what do we need to do then? 945 00:40:06,300 --> 00:40:11,260 Well, now we need to sort of have some other evidence. 946 00:40:11,260 --> 00:40:13,110 So we talked about this already. 947 00:40:13,110 --> 00:40:15,030 When you think about social preferences, 948 00:40:15,030 --> 00:40:18,120 you can think about three broad categories of preferences. 949 00:40:18,120 --> 00:40:20,410 And we're going to talk about this a lot more. 950 00:40:20,410 --> 00:40:22,530 So one is distributional preferences. 951 00:40:22,530 --> 00:40:26,100 This is to say that you care about outcomes. 952 00:40:26,100 --> 00:40:28,245 You care about how much each person gets. 953 00:40:28,245 --> 00:40:30,390 So that's to say we can sort of represent this to, 954 00:40:30,390 --> 00:40:33,450 like, I put some weight on my utility, on my outcomes. 955 00:40:33,450 --> 00:40:35,310 I put some weight on your outcomes. 956 00:40:35,310 --> 00:40:37,230 And then depending on what that weight is, 957 00:40:37,230 --> 00:40:40,360 I'm going to decide about how to distribute outcomes. 958 00:40:40,360 --> 00:40:42,443 These are sort of distributional preferences. 959 00:40:42,443 --> 00:40:43,860 Then there are things like what we 960 00:40:43,860 --> 00:40:46,470 want to call face-saving concerns, which 961 00:40:46,470 --> 00:40:51,120 is people don't want to look bad in front of others. 962 00:40:51,120 --> 00:40:53,370 You can also think about this like social image, which 963 00:40:53,370 --> 00:40:56,250 is to say, I want to look like I'm a nice guy. 964 00:40:56,250 --> 00:40:59,070 So in particular, if I'm in public making certain choices, 965 00:40:59,070 --> 00:41:01,650 if I'm being observed, I'm going to behave 966 00:41:01,650 --> 00:41:03,630 in a very friendly way. 967 00:41:03,630 --> 00:41:07,840 If I could secretly be mean, I would do that. 968 00:41:07,840 --> 00:41:10,170 But as long as my social image is at stake, 969 00:41:10,170 --> 00:41:11,810 I might look like a nice person. 970 00:41:11,810 --> 00:41:14,310 We're going to talk about some experimental evidence looking 971 00:41:14,310 --> 00:41:15,580 at that. 
972 00:41:15,580 --> 00:41:17,820 And then there's some which we want to call 973 00:41:17,820 --> 00:41:20,260 intentions-based preferences. 974 00:41:20,260 --> 00:41:21,930 These are things like reciprocity, 975 00:41:21,930 --> 00:41:27,055 procedural justice, and other facets of social preferences. 976 00:41:27,055 --> 00:41:28,680 Broadly speaking, you can think of this 977 00:41:28,680 --> 00:41:32,580 as fairness, which is to say people don't care 978 00:41:32,580 --> 00:41:34,440 about the outcomes, per se. 979 00:41:34,440 --> 00:41:38,310 They care about essentially the process in which 980 00:41:38,310 --> 00:41:40,500 the outcomes were generated. 981 00:41:40,500 --> 00:41:43,080 So they might be perfectly happy with one person getting 982 00:41:43,080 --> 00:41:45,660 a lot and another person getting nothing as 983 00:41:45,660 --> 00:41:47,790 long as the computer randomized and [INAUDIBLE], 984 00:41:47,790 --> 00:41:51,040 the probability of getting money was the same for everybody. 985 00:41:51,040 --> 00:41:54,270 So that would be somebody who said-- 986 00:41:54,270 --> 00:41:57,570 these would be intention-based preferences, where people say, 987 00:41:57,570 --> 00:41:58,800 this is procedural justice. 988 00:41:58,800 --> 00:42:00,060 This is a fair thing to do. 989 00:42:03,240 --> 00:42:07,620 And people are motivated by that, 990 00:42:07,620 --> 00:42:11,100 or essentially, they might reciprocate 991 00:42:11,100 --> 00:42:13,680 in cases when they think things are unfair; 992 00:42:13,680 --> 00:42:15,210 they might not do so in cases where they 993 00:42:15,210 --> 00:42:17,850 think things were fairly generated, even 994 00:42:17,850 --> 00:42:21,420 if things are unequal ex-post. 995 00:42:21,420 --> 00:42:22,350 OK. 996 00:42:22,350 --> 00:42:25,300 So now we're going to talk about all three of those. 997 00:42:25,300 --> 00:42:27,420 So let's start with distributional preferences. 998 00:42:27,420 --> 00:42:30,180 This is, again, the simplest kind of social preferences. 999 00:42:30,180 --> 00:42:32,700 This is very much a natural extension 1000 00:42:32,700 --> 00:42:35,350 of how economists think about the world. 1001 00:42:35,350 --> 00:42:36,840 This is going back to Gary Becker, 1002 00:42:36,840 --> 00:42:39,750 I think, in the '60s, where essentially, instead of 1003 00:42:39,750 --> 00:42:43,170 having a utility function that just has my own outcomes, 1004 00:42:43,170 --> 00:42:45,330 my own consumption, as an argument, 1005 00:42:45,330 --> 00:42:48,120 you instead have a utility function that has my own utility, 1006 00:42:48,120 --> 00:42:50,892 how much I get, as a function of that, 1007 00:42:50,892 --> 00:42:52,350 plus how much the other person 1008 00:42:52,350 --> 00:42:54,940 gets, and a weight on that other person. 1009 00:42:54,940 --> 00:42:58,830 And so again, we sort of look at, 1010 00:42:58,830 --> 00:43:01,320 like, the person cares not only about how much they get, 1011 00:43:01,320 --> 00:43:02,192 but also the other. 1012 00:43:02,192 --> 00:43:03,900 There's two versions of that, and this is 1013 00:43:03,900 --> 00:43:05,108 what you were saying earlier. 1014 00:43:05,108 --> 00:43:07,170 There are interested distributional preferences 1015 00:43:07,170 --> 00:43:09,690 and disinterested distributional preferences.
1016 00:43:09,690 --> 00:43:12,000 The interested one is essentially-- 1017 00:43:12,000 --> 00:43:14,040 the version is, like, where I'm involved, 1018 00:43:14,040 --> 00:43:16,440 and I get a certain outcome, and there's 1019 00:43:16,440 --> 00:43:18,540 another person involved, and I put some weight 1020 00:43:18,540 --> 00:43:19,800 on that other person. 1021 00:43:19,800 --> 00:43:21,600 So it's about me versus another person, 1022 00:43:21,600 --> 00:43:25,440 and how these resources are being divided. 1023 00:43:25,440 --> 00:43:28,080 Could be also multiple people, of course. 1024 00:43:28,080 --> 00:43:29,700 And then there's a disinterested part, 1025 00:43:29,700 --> 00:43:32,700 which is to say, how do I want resources 1026 00:43:32,700 --> 00:43:35,160 to be divided across others? 1027 00:43:35,160 --> 00:43:38,220 This is questions like, how do we want society to look? 1028 00:43:38,220 --> 00:43:42,510 How do people feel about inequality and the like? 1029 00:43:42,510 --> 00:43:45,180 Or, like what public goods-- how much should the government 1030 00:43:45,180 --> 00:43:47,460 engage in redistribution and so on? 1031 00:43:47,460 --> 00:43:49,550 These would be often-- 1032 00:43:49,550 --> 00:43:51,300 to some degree, of course, it's interested 1033 00:43:51,300 --> 00:43:56,340 because people have to pay taxes or receive benefits 1034 00:43:56,340 --> 00:43:57,390 from the government. 1035 00:43:57,390 --> 00:43:59,100 But often, it's you can ask questions 1036 00:43:59,100 --> 00:44:00,600 people have opinions about. 1037 00:44:00,600 --> 00:44:01,920 Suppose you live in the US. 1038 00:44:01,920 --> 00:44:04,470 You could have opinions about how should Europe's tax system 1039 00:44:04,470 --> 00:44:05,220 look like. 1040 00:44:05,220 --> 00:44:06,780 What is fair and what is unfair? 1041 00:44:06,780 --> 00:44:09,600 Those would be, essentially, disinterested distributional 1042 00:44:09,600 --> 00:44:10,705 preferences? 1043 00:44:13,820 --> 00:44:16,315 Any questions on that? 1044 00:44:16,315 --> 00:44:17,690 So we're going to start with this 1045 00:44:17,690 --> 00:44:21,020 and see how far can you get distributional preferences. 1046 00:44:21,020 --> 00:44:22,730 What kinds of things can be explained? 1047 00:44:22,730 --> 00:44:24,950 And then we're going to move towards saving 1048 00:44:24,950 --> 00:44:29,310 face or social image type of concerns. 1049 00:44:29,310 --> 00:44:31,970 So the dictator game, I already explained to you 1050 00:44:31,970 --> 00:44:33,810 what it looks like. 1051 00:44:33,810 --> 00:44:39,920 So here's what the dictator game looked like in class 1052 00:44:39,920 --> 00:44:41,190 with no stakes. 1053 00:44:41,190 --> 00:44:42,860 This is very typical to how people 1054 00:44:42,860 --> 00:44:46,380 behave in the real world. 1055 00:44:46,380 --> 00:44:47,440 What do we see here? 1056 00:44:47,440 --> 00:44:52,160 What are patterns of behaviors? 1057 00:44:52,160 --> 00:44:54,800 This is no stakes, and it's just hypothetical questions here. 1058 00:45:06,670 --> 00:45:07,240 Yeah? 1059 00:45:07,240 --> 00:45:08,782 AUDIENCE: There are two large groups. 1060 00:45:08,782 --> 00:45:11,080 One group that doesn't give anything, 1061 00:45:11,080 --> 00:45:14,468 and a group that gives half. 1062 00:45:14,468 --> 00:45:15,260 PROFESSOR: Exactly. 1063 00:45:15,260 --> 00:45:19,690 And this is very typical, again, in many situations. 
1064 00:45:19,690 --> 00:45:21,220 There's a group of people, often, 1065 00:45:21,220 --> 00:45:24,670 like, 30%, 40% of people who just essentially keep 1066 00:45:24,670 --> 00:45:26,350 the money. 1067 00:45:26,350 --> 00:45:28,300 There's a group of people that thinks 1068 00:45:28,300 --> 00:45:30,620 50-50 is the right thing to do. 1069 00:45:30,620 --> 00:45:33,250 That's often also something, like, 30% of people. 1070 00:45:33,250 --> 00:45:35,320 And then there's people often in between, 1071 00:45:35,320 --> 00:45:36,970 often between 0 and 50. 1072 00:45:36,970 --> 00:45:41,350 And there are some people who tend to give a lot. 1073 00:45:41,350 --> 00:45:44,110 That's often a minority of people. 1074 00:45:44,110 --> 00:45:47,290 Now, what can we infer from those choices? 1075 00:45:47,290 --> 00:45:49,300 What can we infer from people who choose 0? 1076 00:45:52,250 --> 00:45:53,110 Yes? 1077 00:45:53,110 --> 00:45:56,400 AUDIENCE: They give themselves [INAUDIBLE]. 1078 00:45:56,400 --> 00:46:00,590 PROFESSOR: And do we have any objections to that? 1079 00:46:00,590 --> 00:46:02,570 Or what other interpretation could we have? 1080 00:46:08,820 --> 00:46:09,640 Yes? 1081 00:46:09,640 --> 00:46:12,370 AUDIENCE: In the case of money, we can't necessarily 1082 00:46:12,370 --> 00:46:14,530 assume they're self-interested, I guess, 1083 00:46:14,530 --> 00:46:17,998 because they could be giving it to charity. 1084 00:46:17,998 --> 00:46:19,040 PROFESSOR: Yeah, exactly. 1085 00:46:19,040 --> 00:46:22,010 So in the case of money, you could keep the money. 1086 00:46:22,010 --> 00:46:26,420 You could say, I'm choosing the $10. 1087 00:46:26,420 --> 00:46:28,250 And instead, I'm giving it to somebody 1088 00:46:28,250 --> 00:46:30,230 who is in higher need, who has higher 1089 00:46:30,230 --> 00:46:33,260 marginal value of this money. 1090 00:46:33,260 --> 00:46:35,960 That could be you could give it to charity 1091 00:46:35,960 --> 00:46:38,930 or give it to somebody on the street or any other person 1092 00:46:38,930 --> 00:46:40,400 you think is in need. 1093 00:46:40,400 --> 00:46:43,340 So that person who looks like they're fairly selfish 1094 00:46:43,340 --> 00:46:45,680 might, in fact, be quite generous in the sense 1095 00:46:45,680 --> 00:46:48,530 of giving the money to others. 1096 00:46:48,530 --> 00:46:50,960 That's a common issue with a lot of these games, 1097 00:46:50,960 --> 00:46:53,960 which is essentially that there's often an outside option, in a sense of, 1098 00:46:53,960 --> 00:46:57,087 like, we can look at these games as if this is the only game. 1099 00:46:57,087 --> 00:46:58,670 That's the only thing that's happening 1100 00:46:58,670 --> 00:47:01,220 in the world, and nothing else, and look at-- 1101 00:47:01,220 --> 00:47:03,200 try to explain behavior through that. 1102 00:47:03,200 --> 00:47:05,660 But often the issue is there's other things that 1103 00:47:05,660 --> 00:47:07,410 happen in the world. 1104 00:47:07,410 --> 00:47:09,500 So when I think about, am I behind, 1105 00:47:09,500 --> 00:47:13,700 or do I get more money versus somebody else, when 1106 00:47:13,700 --> 00:47:16,340 you play with somebody who is a lot richer or a lot poorer 1107 00:47:16,340 --> 00:47:21,665 than you are, in fact, giving them $10 or $5 1108 00:47:21,665 --> 00:47:23,840 or nothing will not change the fact that you're 1109 00:47:23,840 --> 00:47:25,190 a lot richer than they are.
1110 00:47:25,190 --> 00:47:27,530 So perhaps you should give a lot more, a lot less money, 1111 00:47:27,530 --> 00:47:28,560 and so on. 1112 00:47:28,560 --> 00:47:30,320 So in anything that we're assuming, 1113 00:47:30,320 --> 00:47:32,255 we look at this through the lens of, 1114 00:47:32,255 --> 00:47:35,415 like, there's only this game that happens, and nothing else. 1115 00:47:35,415 --> 00:47:37,790 It turns out that's actually a pretty good approximation. 1116 00:47:37,790 --> 00:47:39,830 People sort of, what we call, are narrowly framed. 1117 00:47:39,830 --> 00:47:41,247 They look at essentially this game 1118 00:47:41,247 --> 00:47:43,370 as if this is completely separate from anything 1119 00:47:43,370 --> 00:47:45,540 else in the world. 1120 00:47:45,540 --> 00:47:47,660 So people tend to not integrate these games 1121 00:47:47,660 --> 00:47:49,910 with the rest of life or the rest of utility. 1122 00:47:49,910 --> 00:47:51,762 So expected utilities would say, you 1123 00:47:51,762 --> 00:47:53,720 should integrate this game with everything else 1124 00:47:53,720 --> 00:47:55,230 that happens in life. 1125 00:47:55,230 --> 00:47:57,530 So you should not look at, like, 6 versus 4. 1126 00:47:57,530 --> 00:47:59,300 I'm earning more than somebody else. 1127 00:47:59,300 --> 00:48:02,090 You should look at, like, what is my income overall, 1128 00:48:02,090 --> 00:48:02,930 a lifetime income? 1129 00:48:02,930 --> 00:48:05,600 What is your lifetime income or the expected lifetime 1130 00:48:05,600 --> 00:48:07,460 income of somebody else in the experiment? 1131 00:48:07,460 --> 00:48:10,010 And depending on that, you should give money or not. 1132 00:48:10,010 --> 00:48:12,540 Or differently, you might say, well, 1133 00:48:12,540 --> 00:48:14,690 should I give money to somebody else in class? 1134 00:48:14,690 --> 00:48:16,940 Well, perhaps not because most people in class 1135 00:48:16,940 --> 00:48:21,470 will eventually make a lot of money once they graduate. 1136 00:48:21,470 --> 00:48:24,140 And instead, let's just use that money 1137 00:48:24,140 --> 00:48:29,390 and give it to somebody else who has higher need of that. 1138 00:48:29,390 --> 00:48:29,930 OK. 1139 00:48:29,930 --> 00:48:31,450 So here's no stakes. 1140 00:48:31,450 --> 00:48:34,410 It turns out when you look at stakes, not much happens. 1141 00:48:34,410 --> 00:48:37,100 So this is answers-- if you look at-- this is no stakes. 1142 00:48:37,100 --> 00:48:37,910 Here's stakes. 1143 00:48:37,910 --> 00:48:41,220 Answers look, in fact, very similar. 1144 00:48:41,220 --> 00:48:43,280 There's not much of a difference. 1145 00:48:43,280 --> 00:48:46,068 Even when you have higher stakes-- 1146 00:48:46,068 --> 00:48:48,110 I think this is, like, $20, and the like-- people 1147 00:48:48,110 --> 00:48:49,600 seem to behave very similarly. 1148 00:48:49,600 --> 00:48:52,592 So it seems to be, in fact, asking hypothetical questions-- 1149 00:48:52,592 --> 00:48:54,050 and this is how some of these games 1150 00:48:54,050 --> 00:48:56,870 started, just asking hypothetically, 1151 00:48:56,870 --> 00:48:59,412 what would you do if somebody gave you $10 and so on? 1152 00:48:59,412 --> 00:49:01,370 Turns out, you actually get pretty good answers 1153 00:49:01,370 --> 00:49:04,078 for how people behave in reality. 1154 00:49:04,078 --> 00:49:05,870 That's true for this class, but that's also 1155 00:49:05,870 --> 00:49:07,040 true in actual experiments. 
1156 00:49:07,040 --> 00:49:08,957 In actual experiments, if you just ask people, 1157 00:49:08,957 --> 00:49:10,020 like, what would you do? 1158 00:49:10,020 --> 00:49:11,510 And then when you actually implement it 1159 00:49:11,510 --> 00:49:13,260 with the same people or with other people, 1160 00:49:13,260 --> 00:49:17,640 you get actually pretty similar patterns across people. 1161 00:49:17,640 --> 00:49:19,740 Now, we do find that-- 1162 00:49:19,740 --> 00:49:21,740 so we had a version of that that was essentially 1163 00:49:21,740 --> 00:49:25,250 in private, which is to say you did not 1164 00:49:25,250 --> 00:49:27,750 have to reveal if selected to the entire class, 1165 00:49:27,750 --> 00:49:30,555 essentially, what you chose, but only to the other person. 1166 00:49:30,555 --> 00:49:32,930 Notice that that's not really private in the sense of you 1167 00:49:32,930 --> 00:49:35,450 still have to tell the other person, ha, I chose 10, 1168 00:49:35,450 --> 00:49:37,950 and you get 0, so that's not very nice. 1169 00:49:37,950 --> 00:49:39,950 There's other versions that would be in private, 1170 00:49:39,950 --> 00:49:43,190 would just be the other person might never find out, 1171 00:49:43,190 --> 00:49:45,650 or the other person might at least never find out 1172 00:49:45,650 --> 00:49:49,280 that it was you who chose 10, and they got 0. 1173 00:49:49,280 --> 00:49:52,490 And so but what we do see, if you look at-- 1174 00:49:52,490 --> 00:49:54,350 I think this is with monetary stakes. 1175 00:49:54,350 --> 00:49:55,790 This is with $10. 1176 00:49:55,790 --> 00:49:57,830 This is with $10 in private. 1177 00:49:57,830 --> 00:50:00,920 And you do see, for example, that when 1178 00:50:00,920 --> 00:50:03,980 you look at the fraction who gives essentially 100%, 1179 00:50:03,980 --> 00:50:07,010 that fraction tends to go down. 1180 00:50:07,010 --> 00:50:11,420 So it looks like people are somewhat less nice in private. 1181 00:50:11,420 --> 00:50:13,097 We're going to get back to that again 1182 00:50:13,097 --> 00:50:14,930 in the sense of trying to see-- when we look 1183 00:50:14,930 --> 00:50:16,820 at the question of, are people really nice 1184 00:50:16,820 --> 00:50:19,460 when they give money, and so on. 1185 00:50:19,460 --> 00:50:22,010 We're going to look at cleaner versions of that, where people 1186 00:50:22,010 --> 00:50:26,150 have the ability to essentially hide in some ways 1187 00:50:26,150 --> 00:50:27,470 that they're not nice. 1188 00:50:27,470 --> 00:50:28,490 How might one do that? 1189 00:50:28,490 --> 00:50:30,530 How could you-- what kind of game would you 1190 00:50:30,530 --> 00:50:33,680 design to hide the fact that the person is not nice? 1191 00:50:36,400 --> 00:50:38,080 Yes? 1192 00:50:38,080 --> 00:50:40,890 AUDIENCE: For example, if the recipient 1193 00:50:40,890 --> 00:50:44,620 is only given the money and not told 1194 00:50:44,620 --> 00:50:47,452 what percentage of the original money [INAUDIBLE].. 1195 00:50:47,452 --> 00:50:48,160 PROFESSOR: Right. 1196 00:50:48,160 --> 00:50:50,260 So one version of that would be the recipient just 1197 00:50:50,260 --> 00:50:52,090 doesn't even know what the game is. 1198 00:50:52,090 --> 00:50:53,260 They just get some money. 1199 00:50:53,260 --> 00:50:56,050 Here's some money for whatever reason. 1200 00:50:56,050 --> 00:50:58,610 You were just given it because you're in the experiment. 
1201 00:50:58,610 --> 00:51:00,370 And so you could just essentially hide 1202 00:51:00,370 --> 00:51:02,865 the entire game from the recipient. 1203 00:51:02,865 --> 00:51:04,240 What's another thing you could do 1204 00:51:04,240 --> 00:51:06,560 where you don't hide the entire game but sort of say, 1205 00:51:06,560 --> 00:51:07,477 still, there's a game? 1206 00:51:07,477 --> 00:51:09,030 But what other ways could you-- 1207 00:51:09,030 --> 00:51:10,780 in what other ways could you hide the fact 1208 00:51:10,780 --> 00:51:13,870 that people are not nice? 1209 00:51:13,870 --> 00:51:18,380 What mechanisms could you implement? 1210 00:51:18,380 --> 00:51:19,692 Yeah? 1211 00:51:19,692 --> 00:51:22,948 AUDIENCE: If you mix robots and [INAUDIBLE] people. 1212 00:51:22,948 --> 00:51:23,740 PROFESSOR: Exactly. 1213 00:51:23,740 --> 00:51:24,430 And that's exactly-- 1214 00:51:24,430 --> 00:51:26,430 I'm going to show you that, I think, on Wednesday. 1215 00:51:26,430 --> 00:51:28,763 That's exactly what people have done, where you say, 1216 00:51:28,763 --> 00:51:32,258 with 50% chance there's a robot deciding, who might not be very 1217 00:51:32,258 --> 00:51:34,050 nice or might be very nice-- with some chance the robot 1218 00:51:34,050 --> 00:51:37,110 is really mean, with some chance the robot is really nice. 1219 00:51:37,110 --> 00:51:38,580 And now we can look at-- 1220 00:51:38,580 --> 00:51:40,770 so with 50%, the robot decides. 1221 00:51:40,770 --> 00:51:43,107 With 50% chance, you decide. 1222 00:51:43,107 --> 00:51:44,940 And the recipient will never know whether it 1223 00:51:44,940 --> 00:51:48,210 was a robot versus you. 1224 00:51:48,210 --> 00:51:50,490 And so now you can essentially hide behind the robot. 1225 00:51:50,490 --> 00:51:53,340 You can say, well, I really wanted to give you a lot. 1226 00:51:53,340 --> 00:51:55,800 But sorry, the robot was really mean. 1227 00:51:55,800 --> 00:51:57,270 Too bad. 1228 00:51:57,270 --> 00:51:59,970 And the other person will never find out. 1229 00:51:59,970 --> 00:52:02,610 And it turns out when you do that kind of game, people 1230 00:52:02,610 --> 00:52:08,251 tend to be a lot less nice than when there's no robot. 1231 00:52:08,251 --> 00:52:09,280 OK. 1232 00:52:09,280 --> 00:52:13,820 So then we have chocolate. 1233 00:52:13,820 --> 00:52:15,570 Looks like people are giving somewhat more 1234 00:52:15,570 --> 00:52:17,460 when it comes to chocolate. 1235 00:52:17,460 --> 00:52:21,570 They tend to give a lot more when it comes to apples. 1236 00:52:24,630 --> 00:52:26,955 So now, why is that? 1237 00:52:26,955 --> 00:52:27,830 Why does that matter? 1238 00:52:27,830 --> 00:52:28,860 Yes? 1239 00:52:28,860 --> 00:52:33,570 AUDIENCE: If you model it as your utility 1240 00:52:33,570 --> 00:52:37,220 plus the other person's utility [INAUDIBLE], 1241 00:52:37,220 --> 00:52:42,400 it could be that people's marginal utility 1242 00:52:42,400 --> 00:52:45,875 of chocolate and apples declines more than [INAUDIBLE]. 1243 00:52:45,875 --> 00:52:46,500 PROFESSOR: Yes. 1244 00:52:46,500 --> 00:52:49,530 It turns out getting the fifth or sixth or seventh apple 1245 00:52:49,530 --> 00:52:52,620 of the day is maybe not as enjoyable 1246 00:52:52,620 --> 00:52:55,720 as the first or second. 1247 00:52:55,720 --> 00:52:56,400 So exactly. 1248 00:52:56,400 --> 00:52:58,500 So well, it's really reasonable to think 1249 00:52:58,500 --> 00:53:00,150 that the marginal utility of money 1250 00:53:00,150 --> 00:53:03,330 is linear for small amounts.
1251 00:53:03,330 --> 00:53:06,370 If I give you $1, 2, or 3, or 4, or 5, in some sense, 1252 00:53:06,370 --> 00:53:08,370 that shouldn't be very concave. 1253 00:53:08,370 --> 00:53:09,900 We discussed that at length. 1254 00:53:09,900 --> 00:53:12,330 For apples, they're perishable. 1255 00:53:12,330 --> 00:53:14,730 There's only so many apples you can eat on a given day. 1256 00:53:14,730 --> 00:53:16,772 So really what we should look at is, like, what's 1257 00:53:16,772 --> 00:53:17,950 the utility of apples? 1258 00:53:17,950 --> 00:53:21,180 And if the marginal utility of apples is decreasing, 1259 00:53:21,180 --> 00:53:24,910 that might matter quite a bit. 1260 00:53:24,910 --> 00:53:26,190 OK. 1261 00:53:26,190 --> 00:53:30,210 So maybe let me summarize a little bit. 1262 00:53:30,210 --> 00:53:32,220 What did we learn in those kinds of games? 1263 00:53:32,220 --> 00:53:38,070 So first is people look fairly generous in dictator games, 1264 00:53:38,070 --> 00:53:41,190 even when the game is largely private. 1265 00:53:41,190 --> 00:53:43,800 We're going to challenge that assumption a little bit more, 1266 00:53:43,800 --> 00:53:46,770 exactly as you suggested, with other ways in which 1267 00:53:46,770 --> 00:53:49,920 either people can be more private, they can just 1268 00:53:49,920 --> 00:53:52,650 hide the game entirely, or when they perhaps 1269 00:53:52,650 --> 00:53:56,340 can hide behind a machine or a computer or in other ways, 1270 00:53:56,340 --> 00:54:01,110 or when they perhaps can opt out of the game in certain ways. 1271 00:54:01,110 --> 00:54:03,000 People generally seem to give something 1272 00:54:03,000 --> 00:54:04,890 like 30% of the total. 1273 00:54:04,890 --> 00:54:05,880 That's quite common. 1274 00:54:05,880 --> 00:54:08,795 20% to 30% is quite common overall. 1275 00:54:08,795 --> 00:54:10,920 It doesn't seem to matter that much whether there's 1276 00:54:10,920 --> 00:54:13,650 hypothetical versus actual choices. 1277 00:54:13,650 --> 00:54:17,820 The size of the stakes also seem not to matter that much. 1278 00:54:17,820 --> 00:54:20,100 We seem to see somewhat different behaviors 1279 00:54:20,100 --> 00:54:21,208 of chocolates and apples. 1280 00:54:21,208 --> 00:54:23,250 There's also some people who told me that they're 1281 00:54:23,250 --> 00:54:24,790 lactose intolerant and so on. 1282 00:54:24,790 --> 00:54:27,330 So essentially saying, like, their marginal utility 1283 00:54:27,330 --> 00:54:28,780 of chocolate is quite low. 1284 00:54:28,780 --> 00:54:30,960 So it's much easier-- it's much less costly to give 1285 00:54:30,960 --> 00:54:31,938 to somebody else. 1286 00:54:31,938 --> 00:54:34,230 Or put differently, it's actually efficient potentially 1287 00:54:34,230 --> 00:54:35,938 because the marginal utility of chocolate 1288 00:54:35,938 --> 00:54:38,743 is much higher for that other person than it is for you. 1289 00:54:38,743 --> 00:54:40,410 Of course, there's a problem now, again, 1290 00:54:40,410 --> 00:54:42,480 that you could just choose the chocolates for yourself 1291 00:54:42,480 --> 00:54:44,220 and then just give it to your friends. 1292 00:54:44,220 --> 00:54:48,450 So you look quite selfish, but in fact, you might not be. 1293 00:54:48,450 --> 00:54:51,100 So anyway, but sort of what really matters here-- 1294 00:54:51,100 --> 00:54:52,410 and this is perhaps-- 1295 00:54:52,410 --> 00:54:55,000 in some sense, when we talk about apples and chocolates, 1296 00:54:55,000 --> 00:54:56,053 it's a little bit silly. 
1297 00:54:56,053 --> 00:54:57,720 But when you think about a dictator game 1298 00:54:57,720 --> 00:55:01,810 where you play with somebody in Kenya whose income or lifetime 1299 00:55:01,810 --> 00:55:04,117 income is, like, orders of magnitude lower than yours, 1300 00:55:04,117 --> 00:55:05,700 then it matters, actually, quite a bit 1301 00:55:05,700 --> 00:55:08,452 what's the marginal utility of the $10 that you 1302 00:55:08,452 --> 00:55:10,410 could keep for yourself versus for that person. 1303 00:55:10,410 --> 00:55:12,060 For that person, that's a lot of money. 1304 00:55:12,060 --> 00:55:14,850 For you, a lot less. 1305 00:55:14,850 --> 00:55:17,220 And so you might be more generous 1306 00:55:17,220 --> 00:55:20,130 when playing with a person whose marginal utility of money 1307 00:55:20,130 --> 00:55:23,700 or consumption is a lot higher. 1308 00:55:23,700 --> 00:55:25,620 This is what I was mentioning as well. 1309 00:55:25,620 --> 00:55:28,350 You want to think about, what is the outside option? 1310 00:55:28,350 --> 00:55:29,910 So some people might give low amount 1311 00:55:29,910 --> 00:55:32,310 and decide to give the money or the apples 1312 00:55:32,310 --> 00:55:34,740 or whatever to somebody else. 1313 00:55:34,740 --> 00:55:37,140 Those people will look selfish in dictator games, 1314 00:55:37,140 --> 00:55:39,360 while, in fact, they're quite generous. 1315 00:55:39,360 --> 00:55:40,740 That's a problem with these games 1316 00:55:40,740 --> 00:55:43,900 that are hard to deal with. 1317 00:55:43,900 --> 00:55:47,760 We're not going to be able to deal with that overall. 1318 00:55:47,760 --> 00:55:49,960 You can think about it a little bit and sort of see. 1319 00:55:49,960 --> 00:55:53,580 Can we give people options to opt out and so on? 1320 00:55:53,580 --> 00:55:55,110 But it will always be an issue. 1321 00:55:55,110 --> 00:55:57,360 It's less of an issue than you perhaps think about it. 1322 00:55:57,360 --> 00:55:59,350 Conceptually, that's a huge problem. 1323 00:55:59,350 --> 00:56:01,470 But when we look at people's actual behavior, 1324 00:56:01,470 --> 00:56:03,540 it looks like people really behave as if this was 1325 00:56:03,540 --> 00:56:04,950 the only thing in the world. 1326 00:56:04,950 --> 00:56:06,930 And then, in some ways, it's quite predictive 1327 00:56:06,930 --> 00:56:09,780 of actual behavior in the world, even perhaps, in some ways, 1328 00:56:09,780 --> 00:56:11,670 it shouldn't be. 1329 00:56:11,670 --> 00:56:14,365 Any other comments or things we learned perhaps from the game 1330 00:56:14,365 --> 00:56:15,240 or that you observed? 1331 00:56:18,180 --> 00:56:18,956 Yeah? 1332 00:56:18,956 --> 00:56:22,050 AUDIENCE: Is there a possibility that the people that give 0, 1333 00:56:22,050 --> 00:56:27,140 it's just that they're interested distribution 1334 00:56:27,140 --> 00:56:29,370 [INAUDIBLE] is the same as your disinterested one, 1335 00:56:29,370 --> 00:56:31,912 and they actually think, like, oh, the best thing for society 1336 00:56:31,912 --> 00:56:37,090 is for me to have those $10 and not the other person? 1337 00:56:37,090 --> 00:56:41,123 PROFESSOR: So because you put a lot of weight on yourself or? 1338 00:56:41,123 --> 00:56:42,540 AUDIENCE: Sort of like, regardless 1339 00:56:42,540 --> 00:56:45,390 if it's correct or not, but you think 1340 00:56:45,390 --> 00:56:47,576 that the way you'll use these $10 1341 00:56:47,576 --> 00:56:51,000 is better for society than the other person? 
1342 00:56:51,000 --> 00:56:53,610 PROFESSOR: Yeah, I think you have to justify that somehow 1343 00:56:53,610 --> 00:56:55,320 in the sense of, like, if you are very 1344 00:56:55,320 --> 00:56:57,630 similar to the other students, then you might sort of 1345 00:56:57,630 --> 00:57:00,850 think that dividing it-- 1346 00:57:00,850 --> 00:57:04,005 so depends what do you think is the marginal utility of money. 1347 00:57:04,005 --> 00:57:05,880 So if you think the marginal utility of money 1348 00:57:05,880 --> 00:57:08,820 is essentially the same for everybody, in some sense then, 1349 00:57:08,820 --> 00:57:10,800 it doesn't matter very much whether you keep it 1350 00:57:10,800 --> 00:57:12,330 to yourself versus somebody else, 1351 00:57:12,330 --> 00:57:14,740 in some sense because, in the grand scheme of things, 1352 00:57:14,740 --> 00:57:16,107 it doesn't look very different. 1353 00:57:16,107 --> 00:57:18,690 It could be that, for example, you think, for whatever reason, 1354 00:57:18,690 --> 00:57:21,060 you just really had a bad day, and you can use the $10 1355 00:57:21,060 --> 00:57:22,647 to buy something nice. 1356 00:57:22,647 --> 00:57:24,480 It could be that you think you're relatively 1357 00:57:24,480 --> 00:57:26,268 poor to others in class. 1358 00:57:26,268 --> 00:57:28,560 For whatever reason, you just don't have a lot of money 1359 00:57:28,560 --> 00:57:29,430 right now. 1360 00:57:29,430 --> 00:57:32,220 The marginal utility of money for you might be really high. 1361 00:57:32,220 --> 00:57:35,580 So again, and in some sense, that's socially efficient. 1362 00:57:35,580 --> 00:57:38,070 You would actually-- even if it weren't you, 1363 00:57:38,070 --> 00:57:40,450 you would give the money to yourself. 1364 00:57:40,450 --> 00:57:42,510 Now, I think it is the case that when 1365 00:57:42,510 --> 00:57:45,570 you look at these distributions, it 1366 00:57:45,570 --> 00:57:49,000 tends to be that 30% of people keep all of that money. 1367 00:57:49,000 --> 00:57:55,160 So it tends to be that it's hard to believe that that's really-- 1368 00:57:55,160 --> 00:57:57,660 I mean, you'd have to look at, like, how does that correlate 1369 00:57:57,660 --> 00:57:59,160 with other characteristics? 1370 00:57:59,160 --> 00:58:01,170 It probably does, and we'll get back 1371 00:58:01,170 --> 00:58:03,360 to that in the sense of when we see now, 1372 00:58:03,360 --> 00:58:05,790 when people have the chance to hide money and so on, when 1373 00:58:05,790 --> 00:58:08,670 you see now the fraction of people who choose 1374 00:58:08,670 --> 00:58:10,530 $10 for themselves and 0 for others 1375 00:58:10,530 --> 00:58:13,740 goes up, that, in some ways, compared to when you can't 1376 00:58:13,740 --> 00:58:17,010 hide, that perhaps is an identifying selfishness and so 1377 00:58:17,010 --> 00:58:17,640 on. 1378 00:58:17,640 --> 00:58:19,710 Here, when you see people keeping 10, 1379 00:58:19,710 --> 00:58:22,442 you can come up with lots of explanations, including 1380 00:58:22,442 --> 00:58:24,900 the one that you mentioned, but also including the ones you 1381 00:58:24,900 --> 00:58:26,700 keep it for yourself and then give it to your friends 1382 00:58:26,700 --> 00:58:28,230 and share it with them and so on. 1383 00:58:28,230 --> 00:58:30,600 And we can't really infer very much. 
1384 00:58:30,600 --> 00:58:32,400 But in these cleaner experiments, 1385 00:58:32,400 --> 00:58:35,665 where you can really say, OK, when 1386 00:58:35,665 --> 00:58:37,290 you don't have the outside option, when 1387 00:58:37,290 --> 00:58:40,890 you don't have the chance to hide, you look quite nice. 1388 00:58:40,890 --> 00:58:43,260 That goes away when you have the chance to hide. 1389 00:58:43,260 --> 00:58:46,860 Presumably, that is because you just are selfish and not 1390 00:58:46,860 --> 00:58:48,770 for other reasons. 1391 00:58:48,770 --> 00:58:50,210 OK. 1392 00:58:50,210 --> 00:58:51,212 Yes? 1393 00:58:51,212 --> 00:58:52,920 AUDIENCE: Have people looked at if things 1394 00:58:52,920 --> 00:58:54,762 change if you actually get the money 1395 00:58:54,762 --> 00:58:55,970 or get the apples beforehand? 1396 00:58:55,970 --> 00:58:57,416 Like, [INAUDIBLE]? 1397 00:59:03,942 --> 00:59:07,930 PROFESSOR: Surely they-- surely they have. 1398 00:59:07,930 --> 00:59:09,820 What I don't know-- so surely [INAUDIBLE]. 1399 00:59:09,820 --> 00:59:12,620 So [INAUDIBLE] is very robust in various situations. 1400 00:59:12,620 --> 00:59:15,790 So my guess is they were less likely to give. 1401 00:59:15,790 --> 00:59:16,750 I don't have-- 1402 00:59:16,750 --> 00:59:20,350 I can look it up, but I don't have like a great experiment 1403 00:59:20,350 --> 00:59:22,480 that I can tell you about. 1404 00:59:22,480 --> 00:59:24,430 But my sense is that-- 1405 00:59:24,430 --> 00:59:27,970 so the question is, if you give people the $10 into their hand, 1406 00:59:27,970 --> 00:59:29,830 are they going to choose differently? 1407 00:59:29,830 --> 00:59:34,210 My guess is they will look a lot more selfish. 1408 00:59:34,210 --> 00:59:36,310 In the experiment that you played, in some sense 1409 00:59:36,310 --> 00:59:37,810 you are sort of endowed with a game. 1410 00:59:37,810 --> 00:59:38,962 So in some ways-- 1411 00:59:38,962 --> 00:59:40,420 but you could do a variation of that. 1412 00:59:40,420 --> 00:59:42,030 You could look at, like, here's $10. 1413 00:59:42,030 --> 00:59:43,780 It's neither yours nor the other person's. 1414 00:59:43,780 --> 00:59:45,250 How would you like to divide it? 1415 00:59:45,250 --> 00:59:47,333 My guess is that person will be more generous when 1416 00:59:47,333 --> 00:59:49,990 they do that compared to saying, like, here's $10. 1417 00:59:49,990 --> 00:59:53,110 They give it to you in $1 bills, and now you 1418 00:59:53,110 --> 00:59:54,670 have to give me back some. 1419 00:59:54,670 --> 00:59:56,947 People will be more likely to keep the money when 1420 00:59:56,947 --> 00:59:59,530 they actually have the money in hand, when they feel like it's 1421 00:59:59,530 --> 01:00:02,950 theirs, as opposed to having 1422 01:00:02,950 --> 01:00:07,840 to divide money that doesn't really belong to any person. 1423 01:00:07,840 --> 01:00:09,940 So my guess is that's true, but I don't-- 1424 01:00:09,940 --> 01:00:12,820 there's literally thousands of experiments of people doing 1425 01:00:12,820 --> 01:00:14,510 these kinds of variations. 1426 01:00:14,510 --> 01:00:16,510 So I'm sure there is somebody who has done this. 1427 01:00:16,510 --> 01:00:21,190 I can look it up and see whether I find that. 1428 01:00:21,190 --> 01:00:21,700 OK. 1429 01:00:21,700 --> 01:00:23,440 Any other thoughts of what we learned? 1430 01:00:27,980 --> 01:00:28,820 OK.
1431 01:00:28,820 --> 01:00:31,070 So when you now think about how you should 1432 01:00:31,070 --> 01:00:33,110 model these distributional preferences-- 1433 01:00:33,110 --> 01:00:36,560 and there's a classic paper by Charness and Rabin. 1434 01:00:36,560 --> 01:00:38,870 Rabin is the guy that you know from the calibration 1435 01:00:38,870 --> 01:00:41,870 theorem about risk preferences. 1436 01:00:41,870 --> 01:00:44,930 And the way their model works is very simple. 1437 01:00:44,930 --> 01:00:47,450 They essentially have the preferences 1438 01:00:47,450 --> 01:00:49,550 over outcomes x1 and x2. 1439 01:00:49,550 --> 01:00:50,710 Think of this as money. 1440 01:00:50,710 --> 01:00:52,760 It could be also apples or something else. 1441 01:00:52,760 --> 01:00:54,560 Money is perhaps a better approximation 1442 01:00:54,560 --> 01:00:56,520 because the marginal utility of money-- 1443 01:00:56,520 --> 01:00:59,250 think of this as rather constant. 1444 01:00:59,250 --> 01:01:03,080 And now player 2 is the dictator. 1445 01:01:03,080 --> 01:01:05,240 So x2 is how much player 2 gets. 1446 01:01:05,240 --> 01:01:08,360 So player 2 has a utility function u2. 1447 01:01:08,360 --> 01:01:11,592 The utility of player 2 is a function of x1. 1448 01:01:11,592 --> 01:01:13,550 This is how much the other person gets, and x2, 1449 01:01:13,550 --> 01:01:16,490 how much the player 2 gets. 1450 01:01:16,490 --> 01:01:19,160 And the utility function looks like rho times 1451 01:01:19,160 --> 01:01:21,650 x1 plus 1 minus rho times x2. 1452 01:01:21,650 --> 01:01:23,990 If x2 is larger than or equal to x1-- 1453 01:01:23,990 --> 01:01:26,430 that is to say that the person is ahead, 1454 01:01:26,430 --> 01:01:28,370 if x2 gets more than x1-- 1455 01:01:28,370 --> 01:01:32,990 and it's sigma times x1 plus 1 minus sigma x2 1456 01:01:32,990 --> 01:01:37,580 if x2 is smaller than or equal to x1. 1457 01:01:37,580 --> 01:01:39,980 So what do these parameters measure? 1458 01:01:39,980 --> 01:01:41,540 What do rho and sigma measure here? 1459 01:01:52,243 --> 01:01:53,410 Why might they be different? 1460 01:01:53,410 --> 01:01:54,430 Yeah? 1461 01:01:54,430 --> 01:01:55,630 AUDIENCE: How nice they are? 1462 01:01:55,630 --> 01:01:57,513 PROFESSOR: Yes, exactly. 1463 01:01:57,513 --> 01:01:58,180 What does it do? 1464 01:01:58,180 --> 01:02:01,390 How much weight do you put on yourself or the other person? 1465 01:02:01,390 --> 01:02:03,615 AUDIENCE: So if it's a lower value, 1466 01:02:03,615 --> 01:02:06,972 then you don't really care about the other person, 1467 01:02:06,972 --> 01:02:08,430 so you put more weight on yourself. 1468 01:02:08,430 --> 01:02:08,482 PROFESSOR: Right. 1469 01:02:08,482 --> 01:02:09,482 So in the extreme case-- 1470 01:02:09,482 --> 01:02:12,250 I guess, in one case, I guess, if you choose-- 1471 01:02:12,250 --> 01:02:14,830 if this is 0, then essentially, you're 1472 01:02:14,830 --> 01:02:19,690 back in the neoclassical case of not caring about others at all. 1473 01:02:19,690 --> 01:02:22,630 So if you choose, like, rho sigma equals 0, then it's just, 1474 01:02:22,630 --> 01:02:23,830 like, how much do you get? 1475 01:02:23,830 --> 01:02:27,440 You should give 0, and you should accept any offer, right? 1476 01:02:27,440 --> 01:02:33,550 And now as rho and sigma go up, 0.5 or whatever, 1477 01:02:33,550 --> 01:02:35,860 you give more weight on the other person.
1478 01:02:35,860 --> 01:02:38,170 So 0.5, for example, is the case where 1479 01:02:38,170 --> 01:02:43,120 you care equally about the other person and yourself, right? 1480 01:02:43,120 --> 01:02:45,190 Because the weights are then the same. 1481 01:02:45,190 --> 01:02:47,460 Now, why might rho and sigma be different? 1482 01:02:53,570 --> 01:02:54,874 Yes? 1483 01:02:54,874 --> 01:02:59,850 AUDIENCE: So probably rho is larger than sigma. 1484 01:02:59,850 --> 01:03:02,020 PROFESSOR: And why is that? 1485 01:03:02,020 --> 01:03:05,340 AUDIENCE: So if you're person 2 and you already 1486 01:03:05,340 --> 01:03:07,680 know you're getting more than the other person, 1487 01:03:07,680 --> 01:03:11,160 then you don't feel such a need to do so much-- 1488 01:03:11,160 --> 01:03:14,008 you don't feel a need to have so much more than them. 1489 01:03:14,008 --> 01:03:15,050 PROFESSOR: Yeah, exactly. 1490 01:03:15,050 --> 01:03:15,960 It's just easier. 1491 01:03:15,960 --> 01:03:19,430 That seems to be a very intuitive feature of the world. 1492 01:03:19,430 --> 01:03:21,230 It seems to be much easier to be generous 1493 01:03:21,230 --> 01:03:22,857 when you're ahead anyway. 1494 01:03:22,857 --> 01:03:24,440 If you're very rich, you might as well 1495 01:03:24,440 --> 01:03:26,780 give some money to the poorer person, and so on. 1496 01:03:26,780 --> 01:03:30,740 If you are poorer than the other person, 1497 01:03:30,740 --> 01:03:33,290 it's much harder to be generous in some ways, 1498 01:03:33,290 --> 01:03:35,150 perhaps because you feel like you deserve 1499 01:03:35,150 --> 01:03:37,520 to get as much as the other person, 1500 01:03:37,520 --> 01:03:40,970 so why would you give the other person more? 1501 01:03:40,970 --> 01:03:43,790 You put essentially less weight on that other person 1502 01:03:43,790 --> 01:03:45,950 if you're behind. 1503 01:03:45,950 --> 01:03:48,800 What's the case of sigma smaller than 0? 1504 01:03:57,610 --> 01:03:59,140 Yeah? 1505 01:03:59,140 --> 01:04:02,790 AUDIENCE: Would it be then the person 1506 01:04:02,790 --> 01:04:06,404 who want to hurt the other person, 1507 01:04:06,404 --> 01:04:10,708 so then they put more weight on decreasing their value? 1508 01:04:10,708 --> 01:04:11,750 PROFESSOR: Yeah, exactly. 1509 01:04:11,750 --> 01:04:14,250 So if you're behind-- so it's the case where you are behind, 1510 01:04:14,250 --> 01:04:16,110 you're the x2, the player 2. 1511 01:04:16,110 --> 01:04:19,480 If player 2 is behind, has less than player 1. 1512 01:04:19,480 --> 01:04:23,950 Now, you might get positive utility 1513 01:04:23,950 --> 01:04:28,010 from reducing player 1's outcome. 1514 01:04:28,010 --> 01:04:29,590 So if you're behind, you'd rather 1515 01:04:29,590 --> 01:04:31,840 have the other person get less, and you'll feel better 1516 01:04:31,840 --> 01:04:36,520 about that, presumably because you're then less behind, even 1517 01:04:36,520 --> 01:04:40,340 holding constant your own outcomes. 1518 01:04:40,340 --> 01:04:42,520 So if you keep your own outcome, your own payment, 1519 01:04:42,520 --> 01:04:43,870 the same-- suppose you get 5. 1520 01:04:43,870 --> 01:04:45,490 The other person gets 20-- 1521 01:04:45,490 --> 01:04:47,020 you feel better if the other person 1522 01:04:47,020 --> 01:04:49,510 gets 15 or 10 because you're less behind now 1523 01:04:49,510 --> 01:04:51,510 than the other person. 1524 01:04:51,510 --> 01:04:53,470 OK, so you're willing to hurt the other person. 
1525 01:04:53,470 --> 01:04:56,620 You might be even willing to pay to hurt the other person, which 1526 01:04:56,620 --> 01:04:59,290 is kind of what happens in the ultimatum game 1527 01:04:59,290 --> 01:05:03,380 when the responder rejects the offer. 1528 01:05:03,380 --> 01:05:04,390 Any questions on that? 1529 01:05:08,470 --> 01:05:12,370 So I have an example of sigma smaller than 0. 1530 01:05:12,370 --> 01:05:16,000 Suppose you have sigma being minus 1/3. 1531 01:05:16,000 --> 01:05:18,032 You have to choose between 0, 0 and 9, 1. 1532 01:05:18,032 --> 01:05:19,240 What are you going to choose? 1533 01:05:24,710 --> 01:05:26,810 So this is what this looks like, sigma times 1534 01:05:26,810 --> 01:05:28,460 x1 plus 1 minus sigma x2. 1535 01:05:39,600 --> 01:05:40,100 Yes? 1536 01:05:40,100 --> 01:05:41,435 AUDIENCE: You choose 0, 0. 1537 01:05:41,435 --> 01:05:42,560 PROFESSOR: And why is that? 1538 01:05:42,560 --> 01:05:44,733 AUDIENCE: Because it multiplies the other numbers. 1539 01:05:44,733 --> 01:05:45,400 PROFESSOR: Yeah. 1540 01:05:45,400 --> 01:05:46,635 AUDIENCE: [INAUDIBLE]. 1541 01:05:46,635 --> 01:05:48,510 PROFESSOR: Right, so essentially-- the reason 1542 01:05:48,510 --> 01:05:51,360 is that essentially, if you look at this situation, 1543 01:05:51,360 --> 01:05:54,000 you put minus 1/3 weight, essentially, 1544 01:05:54,000 --> 01:05:55,320 on the other person. 1545 01:05:55,320 --> 01:05:57,960 So you really don't like the other person getting 9. 1546 01:05:57,960 --> 01:06:03,300 So you're really unhappy about that. 1547 01:06:03,300 --> 01:06:07,423 Despite the fact that you value getting some money yourself-- 1548 01:06:07,423 --> 01:06:09,090 you have positive value on that as well. 1549 01:06:09,090 --> 01:06:12,480 It's, I guess, 4/3 times 1. 1550 01:06:12,480 --> 01:06:14,430 But putting it together, that's negative. 1551 01:06:14,430 --> 01:06:16,080 So essentially, you're willing to pay-- 1552 01:06:16,080 --> 01:06:19,320 and this is, I guess, one explanation potentially 1553 01:06:19,320 --> 01:06:20,550 why people might reject. 1554 01:06:20,550 --> 01:06:22,550 And this is, I guess, what I was saying earlier. 1555 01:06:22,550 --> 01:06:25,710 You might reject the ultimatum game simply 1556 01:06:25,710 --> 01:06:26,940 for distributional reasons. 1557 01:06:26,940 --> 01:06:29,490 You might just say, I don't like being behind in this game. 1558 01:06:29,490 --> 01:06:32,760 If I'm behind, I put negative weight on this other person, 1559 01:06:32,760 --> 01:06:35,250 and then I'm now rejecting it because I 1560 01:06:35,250 --> 01:06:40,470 feel happier if the person who is in front of me gets less, 1561 01:06:40,470 --> 01:06:45,360 even if that comes at the cost of me having to pay some money. 1562 01:06:45,360 --> 01:06:48,080 Now, notice, you are not going to reject all unequal offers. 1563 01:06:48,080 --> 01:06:51,390 So I think 6-4, you would probably accept. 1564 01:06:51,390 --> 01:06:55,920 But you will reject really uneven offers. 1565 01:06:55,920 --> 01:06:58,650 So in some sense, if I ask you questions using the strategy 1566 01:06:58,650 --> 01:06:59,910 method, as we did-- 1567 01:06:59,910 --> 01:07:02,250 if I asked you questions about at what threshold 1568 01:07:02,250 --> 01:07:05,460 would you reject an offer, I can essentially-- 1569 01:07:05,460 --> 01:07:07,860 if I assume that your preferences look like this, 1570 01:07:07,860 --> 01:07:12,750 I can back out, essentially, your sigma.
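To make the arithmetic concrete, here is a worked sketch of the example just discussed, using only the piecewise-linear form described above, with the person who accepts or rejects treated as player 2 and \sigma = -1/3:

u_2(x_1, x_2) = \rho\, x_1 + (1-\rho)\, x_2 \quad \text{if } x_2 \ge x_1, \qquad u_2(x_1, x_2) = \sigma\, x_1 + (1-\sigma)\, x_2 \quad \text{if } x_2 \le x_1.

For the offer (x_1, x_2) = (9, 1), player 2 is behind, so u_2(9, 1) = -\tfrac{1}{3}\cdot 9 + \tfrac{4}{3}\cdot 1 = -\tfrac{5}{3} < 0 = u_2(0, 0), and rejecting is preferred. More generally, under this form an offer that leaves you behind is accepted only if x_2 \ge \tfrac{-\sigma}{1-\sigma}\, x_1; with \sigma = -1/3 that threshold is x_1/4, which is why a 6-4 split would still be accepted while 9-1 is rejected.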
1571 01:07:12,750 --> 01:07:14,430 Questions on that? 1572 01:07:14,430 --> 01:07:15,213 Yeah? 1573 01:07:15,213 --> 01:07:18,540 AUDIENCE: What's the utility [INAUDIBLE] 1574 01:07:18,540 --> 01:07:21,915 for the dictator, not the person who gets to reject the offer? 1575 01:07:21,915 --> 01:07:23,540 PROFESSOR: Sorry, this is just-- sorry, 1576 01:07:23,540 --> 01:07:24,665 I should have been clearer. 1577 01:07:24,665 --> 01:07:28,700 So this could be for-- 1578 01:07:28,700 --> 01:07:32,520 these are preferences that are generic for any type of game 1579 01:07:32,520 --> 01:07:34,180 that is being played. 1580 01:07:34,180 --> 01:07:37,270 So you could apply this to any game. 1581 01:07:37,270 --> 01:07:40,740 So Charness and Rabin, essentially what they do 1582 01:07:40,740 --> 01:07:43,080 in their paper-- and it's a long paper 1583 01:07:43,080 --> 01:07:44,580 that covers a lot of ground-- 1584 01:07:44,580 --> 01:07:46,247 essentially, what they're saying is, 1585 01:07:46,247 --> 01:07:48,940 like, here's what distributional preferences might look like. 1586 01:07:48,940 --> 01:07:51,060 And you can then choose different parameters. 1587 01:07:51,060 --> 01:07:53,490 And then the question is, like, what kinds of behavior now 1588 01:07:53,490 --> 01:07:54,523 can we explain? 1589 01:07:54,523 --> 01:07:56,190 In the dictator game, potentially you 1590 01:07:56,190 --> 01:07:58,360 can try to explain people's behavior. 1591 01:07:58,360 --> 01:08:01,377 But now I'm trying to apply it to the ultimatum game. 1592 01:08:01,377 --> 01:08:03,210 Or you could also apply it to the trust game 1593 01:08:03,210 --> 01:08:05,430 and say, for what kinds of preferences 1594 01:08:05,430 --> 01:08:10,260 might people engage in certain behaviors? 1595 01:08:10,260 --> 01:08:16,090 So what kinds of behaviors can be explained with that? 1596 01:08:16,090 --> 01:08:19,660 So now what's the experimental evidence on rho and sigma? 1597 01:08:19,660 --> 01:08:22,310 Just to remind you, rho is the parameter, 1598 01:08:22,310 --> 01:08:24,810 the weight that you put on the other person if you're ahead. 1599 01:08:24,810 --> 01:08:28,380 Sigma is the weight on the other person when you're behind 1600 01:08:28,380 --> 01:08:30,250 in a situation. 1601 01:08:30,250 --> 01:08:32,640 So we think that when people are ahead, 1602 01:08:32,640 --> 01:08:35,229 most people seem to have a positive rho. 1603 01:08:35,229 --> 01:08:37,319 So people usually tend to-- 1604 01:08:37,319 --> 01:08:39,810 when ahead, they're willing to sacrifice some money 1605 01:08:39,810 --> 01:08:42,038 to increase the other person's payout. 1606 01:08:42,038 --> 01:08:44,580 So this is essentially-- when you think about dictator games, 1607 01:08:44,580 --> 01:08:46,229 when they have 10, essentially you 1608 01:08:46,229 --> 01:08:48,779 think of this as being, like, 10-0, you're ahead. 1609 01:08:48,779 --> 01:08:51,720 So you're willing to sacrifice at least some money, like 7-3 1610 01:08:51,720 --> 01:08:53,640 or 6-4, even 5-5. 1611 01:08:53,640 --> 01:08:55,080 You like that better. 1612 01:08:55,080 --> 01:08:58,290 You're happy to give the other person some amount. 1613 01:08:58,290 --> 01:09:00,330 A minority are even willing to sacrifice money 1614 01:09:00,330 --> 01:09:02,372 to give the other person an equal amount of money 1615 01:09:02,372 --> 01:09:04,140 or even more 1616 01:09:04,140 --> 01:09:07,439 than they keep for themselves.
1617 01:09:07,439 --> 01:09:10,319 When deciding to split $10, subjects 1618 01:09:10,319 --> 01:09:13,170 tend to give about 20% to 25% on average. 1619 01:09:13,170 --> 01:09:14,910 In class, that was actually pretty close. 1620 01:09:14,910 --> 01:09:18,600 For the $10 amount, I think 1621 01:09:18,600 --> 01:09:20,640 you guys gave something like 28%. 1622 01:09:20,640 --> 01:09:22,682 That's actually pretty close to what people would 1623 01:09:22,682 --> 01:09:24,120 do in normal dictator games. 1624 01:09:24,120 --> 01:09:26,939 The estimated rho tends to be about 0.4. 1625 01:09:26,939 --> 01:09:28,050 Yeah? 1626 01:09:28,050 --> 01:09:30,960 AUDIENCE: How can rho be estimated from the dictator 1627 01:09:30,960 --> 01:09:31,460 game? 1628 01:09:31,460 --> 01:09:33,961 The linearity of the preferences here, 1629 01:09:33,961 --> 01:09:37,730 shouldn't it predict that player 2 will always 1630 01:09:37,730 --> 01:09:43,134 either give 0 or give 50%? 1631 01:09:43,134 --> 01:09:44,547 PROFESSOR: No, it's just to say-- 1632 01:09:44,547 --> 01:09:47,130 I think this is just to say you have the same marginal utility 1633 01:09:47,130 --> 01:09:47,970 as I have. 1634 01:09:47,970 --> 01:09:51,240 So depending on how much you 1635 01:09:51,240 --> 01:09:58,320 give when I ask you, that pins down your rho. 1636 01:09:58,320 --> 01:10:03,710 AUDIENCE: But if x1 is 10 minus x2, 1637 01:10:03,710 --> 01:10:09,860 and we want to maximize rho times x1 plus (1 minus rho) times 1638 01:10:09,860 --> 01:10:14,982 x2, and you're maximizing that over x2-- 1639 01:10:14,982 --> 01:10:17,710 [INAUDIBLE] equation, shouldn't it be [INAUDIBLE]? 1640 01:10:17,710 --> 01:10:20,592 PROFESSOR: No, no, but the unknown here is the rho. 1641 01:10:20,592 --> 01:10:22,300 It's essentially to say, for example, if you 1642 01:10:22,300 --> 01:10:25,570 choose 6-4, that means you prefer 6-4 over 7-3, 1643 01:10:25,570 --> 01:10:28,120 and you prefer 6-4 over 5-5. 1644 01:10:28,120 --> 01:10:29,740 And those two inequalities essentially 1645 01:10:29,740 --> 01:10:32,950 give you a bound on rho. 1646 01:10:32,950 --> 01:10:36,190 We can talk afterwards, but I'm pretty sure that's true. 1647 01:10:36,190 --> 01:10:37,010 Yeah. 1648 01:10:37,010 --> 01:10:37,510 Yeah. 1649 01:10:41,392 --> 01:10:43,850 But yeah, that's essentially just saying 1650 01:10:43,850 --> 01:10:45,440 what you do is-- 1651 01:10:45,440 --> 01:10:47,510 since you chose a certain option, 1652 01:10:47,510 --> 01:10:50,750 now essentially you'd prefer not to choose a higher option, 1653 01:10:50,750 --> 01:10:52,520 and you'd prefer not to choose a lower option. 1654 01:10:52,520 --> 01:10:54,437 When you write that down, that should give you 1655 01:10:54,437 --> 01:10:58,760 inequalities as a function of rho in this case. 1656 01:10:58,760 --> 01:11:01,970 But we'll talk about it in two minutes.
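A minimal sketch of that step, under the same linear specification: choosing one allocation over another, while you are ahead in both, yields a one-sided inequality on rho. The function name and the final example's payoffs are illustrative assumptions, not taken from the lecture.

def rho_bound_from_choice(chosen, rejected):
    # Each allocation is an (other, self) pair with self >= other, so the rho
    # branch of the linear utility applies: rho * other + (1 - rho) * self.
    # Choosing `chosen` over `rejected` means its utility is at least as large,
    # which rearranges to a one-sided bound on rho.
    d_other = chosen[0] - rejected[0]
    d_self = chosen[1] - rejected[1]
    denom = d_other - d_self
    if denom == 0:
        return None  # this comparison carries no information about rho
    bound = -d_self / denom
    return (">=", bound) if denom > 0 else ("<=", bound)

# The example from the discussion: choosing 6-4 (keep 6, give 4) over 7-3 and over 5-5.
print(rho_bound_from_choice((4, 6), (3, 7)))   # ('>=', 0.5)
print(rho_bound_from_choice((4, 6), (5, 5)))   # ('<=', 0.5)

# A hypothetical binary choice where giving is cheaper: preferring to give the
# other person 3 at a cost of only 1 to yourself implies a looser lower bound.
print(rho_bound_from_choice((3, 9), (0, 10)))  # ('>=', 0.25)

With a straight $10 split and strictly linear utility, the two bounds from the 6-4 example meet at 0.5, which relates to the concern raised in the question above; estimates such as the rho of roughly 0.4 typically come from menus of binary choices like the last one, where the cost of giving a dollar to the other person varies, so the implied bounds form an informative interval.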
1657 01:11:01,970 --> 01:11:05,180 What's the experimental evidence on sigma? 1658 01:11:05,180 --> 01:11:08,300 Only about 10% to 20% of players have a sigma strong enough 1659 01:11:08,300 --> 01:11:10,580 to pay a non-trivial amount to hurt the other player. 1660 01:11:10,580 --> 01:11:16,850 These are essentially the people who reject offers. 1661 01:11:16,850 --> 01:11:19,010 About 30% of people will sacrifice 1662 01:11:19,010 --> 01:11:20,490 to help a player ahead of them. 1663 01:11:20,490 --> 01:11:22,782 So even if they're behind and have the option of, like, 1664 01:11:22,782 --> 01:11:26,240 helping the person in front of them, 30% of people 1665 01:11:26,240 --> 01:11:27,483 will do that. 1666 01:11:27,483 --> 01:11:29,150 If a person is behind you, often you want 1667 01:11:29,150 --> 01:11:31,010 neither to help nor to hurt the other person by much. 1668 01:11:31,010 --> 01:11:32,780 Essentially, sort of like, whatever, I don't 1669 01:11:32,780 --> 01:11:33,988 care about this other person. 1670 01:11:33,988 --> 01:11:35,600 I don't feel inclined to do very much. 1671 01:11:35,600 --> 01:11:40,310 But also, I'm not necessarily hurting them. 1672 01:11:40,310 --> 01:11:42,830 And now I want to just preview what 1673 01:11:42,830 --> 01:11:45,230 we're going to talk about next time, which 1674 01:11:45,230 --> 01:11:47,930 is to say it looks like people are pretty 1675 01:11:47,930 --> 01:11:49,055 nice in various situations. 1676 01:11:49,055 --> 01:11:51,222 It looks like in the dictator game, they give money. 1677 01:11:51,222 --> 01:11:52,760 It looks like in the ultimatum game, 1678 01:11:52,760 --> 01:11:54,260 they give money, and so on. 1679 01:11:54,260 --> 01:11:57,110 It looks like people give money to charity. 1680 01:11:57,110 --> 01:12:00,920 This is what we discussed earlier about 2% of GDP. 1681 01:12:00,920 --> 01:12:04,820 People also do a bunch of volunteering or other 1682 01:12:04,820 --> 01:12:06,090 contributions. 1683 01:12:06,090 --> 01:12:10,310 So they give some time, about 15 hours per month on average 1684 01:12:10,310 --> 01:12:12,050 when you do surveys. 1685 01:12:12,050 --> 01:12:16,550 So it looks like people seem fairly generous overall. 1686 01:12:16,550 --> 01:12:21,080 Now, when you think about social recognition or social image, 1687 01:12:21,080 --> 01:12:24,350 perhaps that picture gets less rosy overall. 1688 01:12:24,350 --> 01:12:27,050 It seems to be that essentially, people 1689 01:12:27,050 --> 01:12:30,620 care a lot about what others think of them, and not just 1690 01:12:30,620 --> 01:12:32,480 about the outcomes, per se. 1691 01:12:32,480 --> 01:12:34,760 So one example would be gifts to organizations. 1692 01:12:34,760 --> 01:12:36,380 If you look at thresholds of-- like, 1693 01:12:36,380 --> 01:12:38,338 when you look at the Boston Symphony Orchestra, 1694 01:12:38,338 --> 01:12:40,520 and you look at the distribution of people giving, 1695 01:12:40,520 --> 01:12:42,590 it tends to be that there are these thresholds where 1696 01:12:42,590 --> 01:12:44,780 you can give a certain amount, and then they recognize you. 1697 01:12:44,780 --> 01:12:47,197 Like, when you go to the Boston Symphony, in the booklet, 1698 01:12:47,197 --> 01:12:50,210 you can see who is a donor in which category. 1699 01:12:50,210 --> 01:12:52,340 People tend to bunch just above the thresholds 1700 01:12:52,340 --> 01:12:53,240 for these categories. 1701 01:12:53,240 --> 01:12:57,050 And presumably, that's because they want to look good. 1702 01:12:57,050 --> 01:12:58,590 That's just one example. 1703 01:12:58,590 --> 01:13:01,400 So I think the next question we're going to ask on Wednesday 1704 01:13:01,400 --> 01:13:06,080 is to say, is social recognition a major motivation for giving? 1705 01:13:06,080 --> 01:13:08,720 So people seem to not only care about what others get, 1706 01:13:08,720 --> 01:13:12,900 but also about what others think of them when they give or not.
1707 01:13:12,900 --> 01:13:15,770 So in some ways, that's a philosophical discussion 1708 01:13:15,770 --> 01:13:18,815 in saying, like, are people really nice or not? 1709 01:13:18,815 --> 01:13:20,690 And you might say, it's sort of disappointing 1710 01:13:20,690 --> 01:13:24,530 if they're not nice in certain situations compared to others. 1711 01:13:24,530 --> 01:13:26,570 The flip side of that is, well, if you 1712 01:13:26,570 --> 01:13:29,390 know what kinds of situations generate 1713 01:13:29,390 --> 01:13:31,730 altruistic or prosocial behavior, 1714 01:13:31,730 --> 01:13:33,980 then you can essentially also create 1715 01:13:33,980 --> 01:13:37,370 situations to foster prosociality. 1716 01:13:37,370 --> 01:13:39,860 So once we kind of know under which structures 1717 01:13:39,860 --> 01:13:42,890 or in which kinds of circumstances 1718 01:13:42,890 --> 01:13:45,140 people are nice, well, that gives us 1719 01:13:45,140 --> 01:13:48,050 some lever or some ways for policy, saying, 1720 01:13:48,050 --> 01:13:49,820 well, you have to design your institutions 1721 01:13:49,820 --> 01:13:52,580 or your organizations or your company in certain ways 1722 01:13:52,580 --> 01:13:55,280 to get people to behave nicely to each other. 1723 01:13:55,280 --> 01:13:56,930 In contrast, if you thought people 1724 01:13:56,930 --> 01:13:59,930 are either generally nice or not, well, then in some sense, 1725 01:13:59,930 --> 01:14:01,080 that's good to know. 1726 01:14:01,080 --> 01:14:03,920 But then we can't do very much about that. 1727 01:14:03,920 --> 01:14:07,370 And that's what we're going to talk about next time. 1728 01:14:07,370 --> 01:14:09,130 Thank you.