[SQUEAKING] [RUSTLING] [CLICKING]

PROFESSOR: Welcome to lecture number 12. I'm very sorry that this has to be remote. I hope all of you are doing OK. I know this is lots of trouble and stress for many of you, having to move, not knowing necessarily where to move to, financial issues and so on and so forth, lots of worries about health, your own health and others'. I hope you're doing OK. Try to take good care of yourselves and others, try to look out for others, be there for each other. Try also to make use of mental health resources as you can. I'll send an email about that as well.

This is lecture number 12. We're talking about social preferences. We already gave a broad introduction to social preferences, so let me just move towards where we were at the end. I showed you, essentially, different ways in which you can experimentally elicit people's social preferences using dictator games, ultimatum games, and trust games. We looked at how generously people behave in those kinds of games. Does it look like people care about others, that they're nice towards others? We found, to some degree, quite a bit of generosity in dictator games and other games. People seemed quite nice. But in some pieces of evidence, it wasn't quite clear whether people are generally nice to each other, or whether they just don't want to look like a mean person, either towards others or perhaps towards themselves. They want to think of themselves as being a nice person, and they don't want to be mean, and that might be a motivation for giving towards others. In the games that I have shown you so far, it wasn't quite clear how to distinguish that.
So what we're going to do in this lecture is look at some pieces of evidence, versions of dictator and other games, that let us look more at the underlying motivation for why people give, and perhaps detect, in some cases, that when people give in situations where they look quite nice, that might not be because they genuinely care about the other person. It might rather be either because of social image concerns-- that is, what other people think about them if they give versus not-- or because of self-image concerns-- that is, what they think of themselves. You might give to somebody else not necessarily because you care about the other person, and not even because you care about what the other person thinks of you; it's just that you yourself want to feel good about yourself, and therefore you might be generous because you want to keep or maintain the image of being a nice person to yourself as well.

Some of the evidence that I'm going to show you is-- you wouldn't say depressing, but it's a little bit negative, in the sense that essentially what I'm going to show you is that people appear, or in fact are, if you believe that evidence, less nice than it might appear. At first sight, that looks a little depressing or a little negative, in the sense that we might be disappointed that we thought everybody was really nice and friendly and cared so much about others, and maybe that's not quite true. On the other hand, however, in some ways you might say it's not obvious that we really care that much about whether people are generally nice versus not. What we really care about, perhaps, is how people, in fact, behave. Are they friendly and prosocial, and do they support each other?
And perhaps if we understand the conditions under which people are nice and friendly to each other, that then allows us to think about policies-- in firms or in organizations or in society as a whole-- that might make people more prosocial overall.

So in this lecture, we're going to talk about trying to detect different motivations for people's prosociality. And then in the next lecture, we're going to talk about field evidence. What I'm going to show you for now is essentially mostly lab evidence, but then we're going to look at some field evidence from real-world situations, in companies and so on, or in field experiments, on how people behave. And in particular, we're going to think about some policies that might get people to be more prosocial versus not.

So we already talked about this as a whole. The first thing we want to think about is essentially social recognition. This is essentially social image: how do others think about you when you give versus not? I'm going to show you now several papers, or evidence from several papers, with different modifications, mostly of the dictator game, that allow us to disentangle these different motivations.

The first one is a very nice paper by Lazear et al that conducts an experiment to study motives for giving, as many dictator-game studies do. They have essentially a very simple design with two treatments. One of them is the standard dictator game. This is what you have seen before. The only difference is that this one is in euros-- it was, I think, done in Europe. Think of euros as just being the same as dollars. So people can decide how to split 10 euros between themselves and another subject. In the other treatment, subjects decide whether to even participate in the dictator game. So essentially, what's being allowed here is an exit option.
You could essentially say either you're going to participate in the dictator game with another person, or you could just decide you don't even play this game, and you just get some fixed payout.

Now, if the dictator chooses to participate, the recipient is informed of the game and the choice of the dictator. So the recipient will be informed and know that there was a dictator game that was played, here's the amount that the dictator chose, here's the amount that you got. Then the standard dictator game commences, the recipient just gets the money, and the recipient is essentially informed of the action. So if you choose, for example, 9 euros for yourself and 1 euro for the other person, the other person will know that the 1 euro they got came from a dictator game, and that there was somebody who chose 9 euros for themselves and gave 1 euro to them.

Now, one version of this game was essentially a costless exit option. That is the option where you receive 10 euros without the option of distributing the money. Notice that that's exactly the same as choosing to opt into the game and choosing 10-0 in terms of payouts. If I just say I want 10 euros for myself and give nothing to the other person, that's exactly the same as saying you opt out of the dictator game. The difference, of course, is that if you opt out, the recipient will never know that you opted out. So the other person will never find out that somebody was kind of mean to them. And so you might choose the opt-out option perhaps because you think the other person will feel bad, and if the other person feels bad, you try to avoid that, so you might opt out. This is the exit option.

So the first option here-- the subject line of the slide says "costly exit"-- but in this version of the game, the exit is actually not costly. It's costless to do so.
You can just essentially take the 10 euros and run, and you don't have to actually play the dictator game.

Now, when you think about distributional preferences, what do distributional preferences predict? Well, dictators who want to give some money strictly prefer to play the dictator game. If you prefer 9-1 over 10-0, then you're going to play the game and choose 9-1, because otherwise there's no way to actually give this person money. Notice that distributional preferences, just to remind you, are preferences where only the outcomes matter. It doesn't matter how the outcomes come about; only the actual outcome matters.

Now, there are some people who want to just keep everything to themselves. Those people should just be indifferent. For them it doesn't really matter whether they opt out-- maybe it's easier to opt out, but it doesn't really matter. They would give 10-0 anyway, either by opting out or by staying in the game and choosing 10-0.

But in either case, the option to exit the game should have no effect on how much the dictators share. It shouldn't matter. If you want to give some positive amount, the option doesn't really matter because you're not going to opt out; you're going to stay in the game and just give the money to that person. And if you want to give 0 anyway, it doesn't matter whether you opt out or not. You just give 0 anyway, and if you opt out, that's effectively giving 0 to the other person. So either way, the option to exit should have no effect on how much dictators share. In particular, it should not reduce how much dictators share. Does that make sense?

OK. The limited audience seems to understand. I hope this is clear to everybody else as well. By the way, I should have said that the lectures now will be recorded, currently with a very limited audience, but in general with no audience.
What I very much miss in this format is the interactive form of the lectures. So what we're going to do is try to figure out, in particular after spring break, how to record the lectures and then find some other ways of allowing students to interact, perhaps by having a version where you can watch the lectures online, and then we have essentially either online office hours or discussions of the slides. Or if there are particular issues, I may answer some questions over Zoom or other formats, and we can then discuss some issues that perhaps aren't clear. Or you can ask these questions, and then I try to respond to that.

Anyway, getting back to this game, essentially we now have the comparison between the standard dictator game and the option of, in this case, a free exit. And in this case, the exit option lowers giving. In the standard dictator game, dictators shared about 1.87 euros on average. That's, again, in line with the 20%, 25% that people tend to give in dictator games in the very simplest form. But when you allow people to exit, the share given goes down to 0.58 euros. OK?

And so that's essentially to say there are many people who, when they get into the dictator game, feel compelled to give at least some money, because either they feel bad themselves somehow, or they feel that the other person will feel bad. So you kind of want to choose 10-0, but you feel bad about the other person feeling bad if you choose 10-0, so you might be inclined to choose 8-2, 7-3, or the like. Now, if you allow the costless exit option, the person who in the standard dictator game would choose 7-3, 6-4, or 8-2 or the like-- that person in fact wants to choose 10-0 but feels bad in the standard dictator game, so chooses 7-3.
Once you allow the costless exit, that person just takes the exit and, by opting out, is effectively choosing 10-0.

Now, a different version of this: what I showed you so far was costless exit. It was free to exit. You could just take the 10-0 and run; there was no cost of doing so. The pie stayed the same; nothing was subtracted. In a different version of the game, the exit is actually costly. That is to say, instead of choosing a dictator game where you can share 10 euros between the dictator and the recipient, you can choose the exit option but have to pay, essentially, 1 euro for it: you get 9 euros if you exit, and the other person still gets 0.

And it turns out people do choose the costly exit. That is to say, they say, I'd rather take the 9 euros and not play the dictator game with this other person, rather than play the dictator game where I could potentially choose 10 euros for myself and 0 for the other person. There seem to be quite a few people who choose, essentially, the costly exit-- the 9 euros-- without playing the game. As it says here, on average subjects are willing to take 82% of the pie rather than split the full pie in the dictator game. That is to say, people are willing to forgo quite a bit of money rather than face the situation where they have to deal with this other person. Notice that you could achieve exactly the same payoffs inside the game-- you could choose 10-0, which is even better for you than 9-0-- but people don't want to do that. Instead, they choose the costly exit option so the other person never finds out. So, in some ways, you don't have to deal with yourself or the other person feeling bad about your not being nice.
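Just to make the benchmark concrete, here is a minimal sketch of the purely distributional prediction; the notation is only illustrative, not from the paper. Suppose the dictator cares only about final payoffs, say

U(x_D, x_R) = u(x_D) + rho * u(x_R),

where x_D and x_R are the dictator's and recipient's payoffs and rho >= 0 is the weight on the recipient. The costless exit pays (10, 0), which is payoff-identical to staying in and choosing 10-0, so U(exit) = U(10, 0): anyone who would share a positive amount strictly prefers to stay in, and anyone who would keep everything is indifferent, so average sharing should not fall. The costly exit pays (9, 0), and with u increasing, U(9, 0) < U(10, 0), so a purely outcome-based dictator should never pay to exit. The fact that sharing drops with free exit, and that a substantial share of subjects take the costly exit, therefore points to motives beyond final payoffs-- for example, a cost of being seen, or of seeing yourself, choose a mean split.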
And so that reveals, in some sense, that if you see people being nice in dictator games, at least to some degree that's not coming from people genuinely wanting to be nice to the other person and wanting to improve the other person's payout for the sake of making them richer. Rather, they want to avoid the other person feeling bad about them, or getting mad, or the like.

Let me just repeat that. Why do subjects want to exit? Well, simply put, they want to take the money for themselves, but they don't want to indicate to the potential recipient that she's been treated unfairly. So exiting allows them to satisfy their greed and not to worry about the recipient's reaction.

Do you have any questions?

OK. So that's the first piece of evidence. That's Lazear et al on costly exit in a dictator game.

Now, we already discussed last time another version of this, which is providing excuses or covers to be selfish. This is a very nice paper by Andreoni and Bernheim. The short summary is that they allow for a computer option-- an option where, in some cases, the computer decides-- which gives people cover to be not nice. If there's a chance that you decide yourself and a chance that the computer decides, and the recipient doesn't know whether I chose or the computer decided-- say it's 50% or the like-- I might choose a very mean choice, because I can always say, well, you know, I didn't do this. It was the computer. I wanted to be really nice to you; the computer just happened to be mean. Sorry about that. I really tried to be nice to you. And so people essentially use the computer as cover. Let me show you what this evidence looks like.

So this is a different version-- a non-anonymous dictator game.
This game is set up in a way that you actually know the other person, so you have to face the other person, at least in some way. And the way this is set up is that the dictator's choice is forced with some probability. This is a dictator game with $20.

In the simplest version, the computer chooses 20-0 or 0-20 with equal probability. So if it's the computer's turn, the computer chooses either 20-0 or 0-20 with a 50% chance each. The dictator observes the allocation chosen by the computer; the recipient does not. If I'm the dictator, I can see what the computer did, and then I'm asked to make my choice. The recipient does not know what the computer actually chose. The dictator then makes a dictator game allocation with a pie of $20, just as before-- it just happens to be $20 rather than $10. And the computer's forced allocation is implemented with probability p, which is known both to the dictator and to the recipient. So the recipient knows the chance p with which the computer chooses rather than me: the computer chooses with probability p, the dictator chooses with probability 1 minus p, and that is known to both the recipient and the dictator.

And now the game will vary the probability p. If the probability p is 0-- if the computer never chooses and only the dictator chooses-- then we're back to the typical dictator game. Then I cannot hide behind the computer. It's obvious that I'm choosing. If I'm choosing 20-0, you know that, so I have no way of blaming the computer. If, in contrast, the probability is larger than 0-- suppose the probability is 50%, 60%, 70%-- it's quite likely that the computer chose. So now I can also choose 20-0 and just say, well, sorry, it happens to be that the computer implemented this choice. Sorry that you got 0. So I can essentially hide behind the computer.
And the larger the probability p is, the more credible this hiding behind the computer is, because if the chance is only 5% or 10%, you might not believe me that the computer actually chose. You're like, yeah, yeah, Frank, you're telling me it was the computer. Really it was you. But if the chance is 90% or the like, then it's actually very likely, very plausible, that the computer chose the mean outcome. And then I might also just choose the mean outcome, because you're never going to find out. You're not going to suspect that this was me as opposed to the computer.

And then at the end, the recipient only learns the allocation, not the dictator's choice. The recipient only learns what they get. They do not learn what I chose, or whether it was the computer or me who chose. They only know the probability p with which the computer or I made that choice. Yeah?

AUDIENCE: Does the dictator know p before he makes his allocation?

PROFESSOR: Yes. Yes, exactly. The dictator knows p. The dictator even knows, I think, the actual choice that the computer made. But that's key here. Both the dictator and the recipient know p. That's really important, because only if I know what p is can I actually react to it and hide. So the prediction here will be that if p is very high-- if the computer chooses with high probability-- people will be more likely to be mean and choose 20-0, because I can essentially emulate the computer and hide behind it. But for that I need to know exactly what p is.

So now, what do distributional preferences predict here? Again, distributional preferences are preferences such that you only care about the outcome; you don't care about how it came about. So the dictator should only think about the case in which her choice counts. That is to say, the computer is entirely irrelevant, because I can't influence what the computer does anyway.
It doesn't matter whether the computer chooses with a 5% or 10% or 20% chance, or 50%, or even a 70% chance. The only thing I should care about is: for the chance of 1 minus p, when I choose myself, what am I going to do? What are my distributional preferences? How much do I want to give to the other person? That's essentially the only case in which she can, in fact, affect the distribution. And so p should have essentially no effect on people's choices. That's a very straightforward prediction. Essentially, you only care about the case when you can make a difference, and in that case you just choose whatever you want to choose between 20-0 and 0-20, whatever you want to give the other person. The other cases are just irrelevant because they are out of the person's control.

Now, why might p matter anyway? I already said that: p might matter because you can hide behind the computer if p is particularly high. Yes?

AUDIENCE: So the recipient knows that the computer can only choose 20 or 0?

PROFESSOR: Yeah, there are different versions of that, exactly. In some cases, the computer chooses 20-0 or 0-20 with equal probability, and the recipient knows that. That's known.

AUDIENCE: If it's like an 80-20, then the recipient knows automatically.

PROFESSOR: Exactly. That's exactly right. I'm going to show you exactly that evidence. In fact, what the game will do is-- sorry, the question was, does the recipient know whether it's 20-0 versus 18-2 or something else? And that's exactly right. The only plausible way in which I can hide behind the computer, in this case, if I want to be selfish, is 20-0. If I choose 19-1, you know it was me. It can't have been the computer, because the computer can only do 20-0 or 0-20. And that's key here. And there's going to be some variation.
And that variation will be in what the computer does. And essentially that will reveal exactly that people are hiding behind the computer, because in one variation the computer does 19-1, and suddenly people choose a lot of 19-1. And the only reason why you might do that is exactly because you might want to hide behind the computer. So let me show you that, exactly that, in fact.

So here's now what we find. This is a bit of a complicated graph, so let me try to walk you through it in detail. The graph shows the following. It shows on the y-axis, on average for the different scenarios, the fraction of dictators making each choice-- that is, how much of the $20 the dictator decided to give to the other person. This is not the computer; these are only the dictators' choices. On the x-axis, we see the probability of the forced choice-- the chance that the forced allocation gives the other person 0. The computer chooses 20-0 with some probability-- 20 for the dictator and 0 for the other person-- and that happens with different probabilities: either 0, 25%, 50%, or 75%. That's the p that I mentioned earlier.

Let's look at p equals 0% to start with. That's the scenario in which the computer is irrelevant; the computer never makes any choices. What you see, essentially, is the typical behavior in dictator games. Sorry, I should have also mentioned that the different lines show how much is given to the other person. The blue line that you see there is 10-- that's the 10-10 allocation, where the dictator gives half to the other person. The red line is 0-- that's the 20-0 allocation. And the other lines down there are 1, 2 to 9, and also larger than 10.
What we're going to focus on is essentially how often the person gives 0, and how often the person gives 50%. So when you look at the p equals 0 line, essentially, that's the typical dictator choice. Again, p is 0: the computer is entirely irrelevant, the computer can't do anything, and I cannot hide behind the computer. You get the typical dictator game outcomes. About 55% of people-- actually more than usual-- give half; they choose the 10-10 allocation. About 30% of people choose 0; that's very common. And then there are some people below that who give either 2 to 9, or 1, or even more than 10. So that's typically what we see in a dictator game.

Now, moving to the right is positive p's: p equals 25%, 50%, 75%. And what we're seeing, essentially, is that as you move to the right-- in particular going from 0 to 25 and from 25 to 50-- the fraction of people who choose 0 for the other person goes up quite a bit. For example, look at p equals 50%. Now there's a 50% chance that the other person gets 0 anyway because the computer chose so. And now, if I'm the dictator, I can also choose 20-0, and we know that once you, as the recipient, see the outcome, you cannot tell whether it was me or the computer-- it's a 50% chance either way. And in that case, you see that 70% of people actually choose 20-0. So that fraction goes up from 30% for p equals 0, to about 50% for p equals 25%, and to 70% for p equals 50%. So essentially, the larger p is, at least over the range of 0 to 50%, the larger the fraction of people who choose the 20-0 allocation, presumably because it's now more plausible that I'm hiding behind the computer.

Now, for p equals 75%, there seems to be no further increase. It's a little unclear how to interpret that.
To some degree, this could be because 50% is already large enough-- it's already very plausible that the computer chose anyway, so going from 50 to 75 there's no additional benefit to hiding behind the computer. It could also be that people in some way feel bad for the other person because there's a high chance that they get 0 anyway from the computer, so you might as well give something to that other person, perhaps. I'm not quite sure, but that's, in some sense, not the point here. The point here is that the fraction who are mean goes up by a lot when you can hide behind the computer, and the fraction who look, or are, quite nice-- who choose the 10-10 allocation-- goes down by quite a bit. Does that make sense? The following slides have a fairly detailed description of this, but let's pause for a second to see whether that makes sense.

So I think I have already said all of this. This is to say, the higher p star, or p, is, the easier it is for the dictator to hide behind the computer. I said all of that already.

So when you take this evidence together, it seems to say that people are not as nice as results from simple dictator games might suggest, because, in some sense, in the simple dictator game they would like to do something else, but they just can't really do so.

And this is like your question here. This is a very nice variation that they have. This is a different variation where the computer now chooses 19-1 with probability p. So the computer chooses 19 for the dictator and 1 for the other person with probability p equals 0, 25%, 50%, 75%. And exactly as you predicted and thought about, you see the same thing happening for the people choosing 10-- that fraction goes down. But people are not now choosing more of 20-0. In fact, look at the fraction of people who choose 20-0.
The red line goes down a bit. Instead, what goes up is the 19-1 allocation. Lots of people now choose 19 for themselves and 1 for the other person. Why is that? It's because now you can hide behind the 19-1 choice of the computer, and that's the selfish choice. The 20-0 choice, in fact, goes down a little bit, perhaps because it now becomes quite obvious that those are people who are quite selfish. Instead they choose the 19-1 option. They switch to 19-1 because you're kind of mean either way, but now you can essentially hide behind the computer. OK? Any questions on this?

So what's going on here? One potential explanation is face-saving concerns-- essentially, social image. This is the motivation to avoid unfavorable judgment by others. If you care about what others think about you, you might engage in certain behaviors. And the experiment provides a way for people to avoid such judgments. Remember, this is not an anonymous game. You have to face this other person and deal with them afterwards, at least in some way. So the computer essentially provides people an opportunity to save face and keep up their social image, because what you're trying to avoid here is that if you choose the 20-0 option, the other person will think you're not very nice. Either you're trying to avoid having to see that they're unhappy-- you might just feel bad for them if they're unhappy-- or, perhaps more plausibly, it just feels bad for you if other people think you're kind of mean. So you want to be mean, but you don't want others to think you're mean, and therefore you hide behind the computer.

I think I said all of this already.
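To connect this back to the distributional benchmark, here is a minimal way to write down the prediction; again, the notation is only illustrative. With probability p the computer's forced allocation is implemented, and with probability 1 - p the dictator's own allocation a is implemented, so a purely outcome-based dictator maximizes

E[U] = p * U(forced allocation) + (1 - p) * U(a).

The first term does not depend on a, so the optimal a is the same for every p-- that's why p should not matter. But if the dictator also cares about what the recipient will infer about her after seeing the outcome, then choosing the computer's forced allocation-- 20-0 here, or 19-1 in the variation-- becomes cheaper in image terms as p rises, because the recipient attributes that outcome to the computer with higher probability. That is exactly the pattern in the figures.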
So both of these types of evidence, I think-- the exit option in the dictator game, but also the evidence I showed you just now, hiding behind the computer-- seem to say that when there's an option to avoid people feeling bad, or feeling that you're mean to them, people tend to take that option, and that then reduces their giving behavior.

Now, in some sense, when you think about many other settings-- and there's some empirical evidence on this by Gautam Rao and Stefano DellaVigna and others-- in situations like voting or other gift-giving choices, people are willing to pay, essentially, to avoid situations where they're asked to give. This is like somebody comes to your door and says, would you like to donate to this cause, or would you like to vote for this candidate, and so on and so forth. People might, once they're asked, say, yeah, sure, I'll do it, because otherwise they feel bad-- [INAUDIBLE] shows you pictures of poor children in some terrible situations, and if you now say, no, I don't want to give any money, you look like a really mean person. So what people then tend to do is avoid the situation altogether, by either pretending that they're not home or never opening the door. Or if you see somebody on the street who wants to collect donations, you might just avoid that person altogether and go out of your way. If you see a beggar on the street, you might go around that person, because you're trying to avoid, essentially, judgment-- either feeling bad for them about you being mean to them, or feeling bad about them judging you in certain ways for not being a nice person.

How do we think about this in terms of the utility function? We're not going to talk about this very much; we're going to get back to it a little bit when we talk about beliefs and belief-based utility.
But just to give you a sense of how to think about modeling this: what you need, essentially, is another term in the utility function. In addition to your distributional preferences-- where you put some weight on how much you get and how much the other person gets-- you need some other term, which I call v, which depends on player 1's beliefs about rho. That is, what does the other person think your rho is? If you really don't want to give anything to that other person, you still want to avoid the other person thinking that you put very little weight on them. Rho, again, to remind you, is the weight that you put on the other person; this is from the perspective of player 2.

And now, essentially, player 2 might put some weight on player 1's outcome-- they want to give them some money, and if player 1 has more money, player 2 feels happier. But player 2 also puts some weight on what player 1 thinks their rho is. So if I have $10 in a dictator game and I can choose how much of it to give to the other person, I might want to do 10-0, and that's coming just from my rho. But I also don't want the other person to think that my rho is 0. So I might derive positive utility from the other player thinking that my rho is, whatever, 0.3 or 0.5. I like it when the other person thinks that I put a positive weight on them, and therefore I might give them more.

More generally, you can have a utility function that depends on each person's outcome-- how much they get-- and on the other's beliefs about what my utility function looks like. So essentially, people care about whether others think that they are nice or have concern for others in their utility function. Again, we're not going to talk about this in more detail, at least for now.
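As a rough sketch of what such a utility function might look like-- the exact functional form here is just for illustration, not from the slides-- you could write player 2's utility as

U_2 = u(x_2) + rho * u(x_1) + v( E_1[rho] ),

where x_2 and x_1 are the two players' payoffs, rho is the weight player 2 actually puts on player 1, E_1[rho] is player 1's belief about that weight after observing the outcome, and v is increasing: player 2 likes being believed to have a high rho. With v dropped, we are back to purely distributional preferences; with v included, giving can be driven partly by managing the other player's belief rather than by caring about their payoff, which is what the exit and hiding-behind-the-computer evidence suggests.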
We're going to get back to this when we think about beliefs and belief-based utility, which is essentially the idea that beliefs are potentially not just instrumental -- beliefs usually help you make good decisions -- but that you might also derive utility from beliefs directly. You might derive utility from thinking that you're smart, good-looking, and so on. Any questions on this?

OK. So now one question you might have, and I already alluded to this, is: are people giving because they enjoy giving, or because it's uncomfortable to refuse to give? Is it, I really want to give and I feel really good about it -- I get some warm glow? Or is it that I'm giving not because I actually enjoy it and am happy about it, but rather because otherwise I'd just feel bad and uncomfortable?

The evidence we talked about suggests that the latter motivation is more important for many people. Many people essentially just feel bad about not giving. That's in particular the evidence I showed you that people are willing to pay to avoid dictator games. There's a very nice paper on what's called moral wiggle room, which gets at the idea that if people opt out of the dictator game and keep everything for themselves, they have to somehow justify that to themselves. If you have self-image concerns -- if you also care about how you think about yourself, whether you're a good person or not -- how do you justify to yourself that you chose, say, a 20-0 option? The Dana et al. paper illustrates that very nicely.

So this is, again, a very simple game. Subjects are asked to choose between allocations of the form (self, other): how much you get yourself and how much the other person gets. There are two options. Option A is $6 for yourself and $x for the other person. Option B is $5 for yourself and $y for the other person.
The values of x and y are varied across subjects and randomized. One version is x equals 1 and y equals 5. What does that mean for Options A and B? It means Option A, if you had to choose, is 6 for yourself and 1 for the other person, while Option B is 5 for each player. So now, choosing between Option A and Option B: if you choose Option A, you get $1 more than you would under Option B, but the other person gets $4 less than they would under Option B. Choosing Option A is not a particularly nice thing to do -- essentially you're deciding that one additional dollar for you is worth more than the $4 the other person loses.

And in fact, most people tend to choose Option B when faced with that direct choice: 26% of people choose Option A, the selfish option if you want, and 74% choose Option B, which is, in some ways, the nice thing to do. This is the very standard version -- there's nothing else going on, no exit option and so on.

So when people choose Option B, there are essentially two explanations. One is that the person is quite nice. The other is that, in some ways, they're not that nice, but they're trying to avoid feeling bad: they kind of want to choose Option A, but if they chose it they'd feel really bad about it, and that's why they don't.

Now the game also has a twist. Some subjects just had this one choice, and as I said, 74% of them chose Option B. For some other subjects, x and y were initially unknown: either x equals 1 and y equals 5, or x equals 5 and y equals 1, each implemented with 50% chance. So initially you don't know which it is -- let me just write this down -- it could be one of two scenarios, each with 50% probability.
So one of them, scenario 1, is the choice we just had: Option A versus Option B is 6-1 versus 5-5, so choosing A is a pretty selfish move. That's the option I just showed you. The second scenario is 6-5 versus 5-1. Now it's unambiguously clear that you should choose Option A, because you're better off, the other person is better off, everybody is happy -- Option A dominates Option B in the second scenario. OK?

Now here's the twist. Subjects could costlessly find out which scenario they're in. And the recipient will not learn what the dictator did -- the recipient just receives the money and doesn't know what choice was made. So I'm telling you there are two possible scenarios, either 1 or 2, and I'm going to ask you to choose A versus B. Before that choice, you have two options: either you choose without knowing which scenario you're in, or you can costlessly find out whether you're in scenario 1 or scenario 2. OK. Is that setup clear? Happy to repeat. Yeah?

AUDIENCE: Who is choosing between 1 and 2?

PROFESSOR: So if you're in the game, there's a 50% chance that you're in scenario 1 and a 50% chance you're in scenario 2. That is fixed. You then have two choices to make in your whole decision. The first one is: do you want to find out whether you're in scenario 1 or scenario 2? Either you don't find out -- it's just a 50% chance of each -- or you find out costlessly, and I tell you which scenario you're in. It's already decided which scenario you're in; you just don't know it. So your first choice is whether you want to know which scenario you're in. The second choice, then, is between A and B.
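Just to have the structure in one place, here is a minimal sketch of the design. It is illustrative only, not code from the Dana et al. paper: the payoffs are the ones on the slide, but the benchmark agent with purely distributional preferences, and the weight rho = 0.3, are my own assumptions.

```python
# Minimal sketch of the Dana et al. "moral wiggle room" design (illustrative, not the paper's code).
# Payoffs are (dictator, recipient) in dollars.

SCENARIOS = {
    1: {"A": (6, 1), "B": (5, 5)},   # choosing A is selfish: +$1 for me, -$4 for the other person
    2: {"A": (6, 5), "B": (5, 1)},   # A dominates B: better for both players
}

def distributional_choice(scenario, rho=0.3):
    """Benchmark agent who only cares about the payoff distribution.

    rho is the (assumed) weight on the recipient's payoff; the agent simply
    picks the option with the higher weighted sum of payoffs."""
    def value(option):
        own, other = SCENARIOS[scenario][option]
        return (1 - rho) * own + rho * other
    return max(["A", "B"], key=value)

# Such an agent either wants to know the scenario (if the best option differs across
# scenarios) or is indifferent (if it doesn't). Either way, offering the costless
# "reveal" option should not reduce giving in the x = 1, y = 5 scenario.
for s in (1, 2):
    print(f"scenario {s}: benchmark agent chooses {distributional_choice(s)}")
```

That last comment is exactly the benchmark argument the lecture walks through next.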
So you either choose A or B without knowing which scenario you're in -- you're in 1 or 2, you pick A or B, and then your choice just gets implemented for whichever scenario is actually the case -- or you already know whether you're in scenario 1 or scenario 2, because you costlessly found out, and then you make the choice between A and B. Does that make sense? OK. Perfect.

So now, first, we can think about distributional preferences. What happens if you have purely distributional preferences -- could this additional twist in any way lower people's giving behavior? Notice what we're comparing here: on one side, the people who just faced the first scenario directly, without this additional twist; on the other side, the people who were randomized into the version with the twist, where they first have the option to costlessly find out which scenario they're in. And the question is: how does adding this additional option affect people's giving behavior? In particular, is it possible that people give less because of this additional choice?

There are two possibilities overall. Either your optimal choice depends on x and y -- you tell me, well, whether I choose A or B depends on what x and y are. Remember what x and y are: they're the values at the top -- either x equals 1 and y equals 5, or x equals 5 and y equals 1. So either it's the case that it matters to you what x and y are, and you really want to find out what x and y is --
well, in that case, surely, if I ask you whether you want to costlessly find out, you do want to find out, because you want to make the right choice. You want to know: am I in scenario 1 or am I in scenario 2? So if I offer you the costless option to find out which scenario you're in, you'll take it. And then, if you find out that you're in the x equals 1, y equals 5 situation, you should make the same choice that you made previously. Whether you previously chose Option A or Option B, once you know which scenario you're in, surely you'll choose the same thing, because you only care about distributional preferences.

If, on the other hand, your choice does not depend on x and y, you should be indifferent between finding out or not, and it shouldn't really matter whether they give you that option -- you'd just choose the same thing as before. So in either case, when x equals 1 and y equals 5, leaving x and y initially unknown -- adding this additional option -- should not decrease the amount of giving. If anything, giving should simply not be affected; in particular, it should not go down.

So let me have you look at this for a second, just to be clear about it. What I'm saying is: with this additional scenario 2 added, either the scenarios matter -- if you'd make different choices in scenario 1 versus scenario 2, then surely you should find out which one you're in; you'd say, OK, I really want to know, and once you know -- suppose you're actually in scenario 1 -- you should make the same choice as you did before. Or you say, well, I actually don't care which scenario I'm in;
I'm going to make the same choice anyway -- then adding this additional scenario shouldn't really matter either, because you choose the same thing regardless. But either way, adding scenario 2 should not decrease how much you give.

Now, what they find instead is that 44% chose not to find out x and y, and of these subjects, 95% chose Option A. So these are people who essentially say, I'd rather not know what's going on here, and then they choose Option A -- which is either the fairly selfish option, or the option that's better for both people. Let me go back to this so you can see it. When given these two choices, they essentially say, I'd rather not know, even though finding out is costless; I don't want to know whether I'm in scenario 1 or 2; I'm going to choose Option A. Why do they do that -- what's going on in their minds? Yes?

AUDIENCE: They don't want to know if they're being mean to the other person.

PROFESSOR: Exactly. If you find out whether it's scenario 1 or scenario 2, two things can happen. One is you're in scenario 2: you're going to choose Option A because you know that's better for everybody. Or you're in scenario 1, and then choosing A is a selfish and mean move -- and once you do that, you have to deal with the question, am I a mean person or not? Instead, what people do, or seem to do, is say: I actually don't want to find out. It could be either way -- who knows what's going on? Could be scenario 1, could be scenario 2. Who knows? But you know, there's a good chance, a 50% chance, that it's scenario 2, in which case choosing Option A is actually good for everybody. So let me just not find out, choose Option A, and essentially make myself think that it was probably scenario 2.
So essentially, what people are doing here is deluding themselves, in some way, into thinking that they're doing the right thing -- or at least not a mean thing -- by avoiding information that would be free. If you really wanted to find out whether you're being mean or not, you could just get the information; it's free and readily available. Instead, people say, oh, I'd rather not know, and then they choose the thing that's at least potentially mean.

And it's not only that 44% of people choose not to find out -- the fraction of selfish choices also goes up by a lot. This is a little bit tricky, but essentially what I showed you is that there's a 50% chance of being in scenario 1 versus scenario 2. The 63% I'm going to show you here is the fraction who chose the selfish option, looking only at the half of the cases in which x equals 1 and y equals 5 was actually implemented. These are cases where the person either found out or did not -- so it's the sum of two things that could have happened: either the person didn't find out and chose A or B, or the person did find out. But taken together, 63% of people in those cases chose the selfish option, Option A, compared to the 26% in the baseline case that I showed you earlier. Putting these things together essentially means that there are quite a few people who conveniently don't want to find out whether they're being mean or not.
They then choose the mean option, and much more so than when they don't have this -- what's called moral wiggle room -- which is essentially the option to delude themselves that they're not actually that mean, because probably it was better for everybody to choose Option A. Does that make sense?

So now you might ask two things. This concern could be about others' beliefs -- social image -- or it could be about self-image. The experiment is set up precisely so that it's not about social image. The other person never finds out whether you learned her payoff, so not learning doesn't do anything to improve her opinion of you; the other person only ever sees the result. Whether or not I learn which scenario we're in doesn't help the other person's opinion of me at all. The only person who knows whether you found out which scenario you're in is you yourself, when you make that choice. That really suggests this is about saving face with yourself -- about self-image. People essentially want to delude themselves, to make themselves think that they're nicer than they actually are, by avoiding information that could help them make the right choice, or be nicer if they really wanted to be. If you really wanted to be nice, you would find out which scenario you're in; you would choose Option A in scenario 2, when it's better for everybody, and Option B in scenario 1, when choosing A is really costly for the other person.

So let me summarize. The most likely explanation is that this is really based on how dictators feel about themselves. The uncertainty about whether the selfish action helps or hurts the other person gives an excuse to be selfish. Now I can say, oh, there's a 50% chance that this is the right choice for everybody. So that's an excuse.
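As a rough back-of-the-envelope reading of those two numbers -- and only that, since the two treatment groups are different subjects and this ignores sampling error -- the gap between the baseline and the wiggle-room condition suggests a sizable share of "reluctant sharers," people who are selfish whenever they have an excuse:

```python
# Back-of-the-envelope only: treats the two treatment groups as directly comparable.
selfish_baseline = 0.26      # chose Option A when x = 1, y = 5 was known for sure
selfish_wiggle_room = 0.63   # chose Option A when x, y started out hidden
reluctant_sharers = selfish_wiggle_room - selfish_baseline
print(f"~{reluctant_sharers:.0%} of subjects switch to the selfish choice "
      "once they can avoid knowing whether it hurts the other person")
```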
So you can keep telling yourself that you didn't mean any harm: I really didn't want to be mean; I thought there was a good chance the other person would be better off as well. This is what's called moral wiggle room. It essentially lets you stay in an ambiguous situation, because as long as you haven't found out, it genuinely is unclear whether the other person is better off or worse off when you choose Option A versus B. The experiment is set up so that if you do not find out which scenario you're in, it's not clear what's better for the other person.

But of course, that reasoning is remarkable in a sense, because you could always find out costlessly. You can delude yourself and say, ah, it's not clear, what should I do, I don't know, could be good for the person, could be bad for the person, whatever -- but I'm choosing A. That's an explanation that doesn't really hold water, because if you really wanted to know whether you were being nice to the other person, you could always find out; it's very easy to see what's going on. So, essentially, people want to save face in front of themselves, as opposed to in front of others. Any questions on this?

OK. So now a different version of this is giving and communication, which is, again, in some ways about image concerns -- about having to deal with others' reactions to you. This could be about self-image or about social image, but either way, it's about some form of communication and about how others, or you yourself, feel. Ellingsen and Johannesson modified the dictator game by allowing a very limited form of communication. One group of subjects played the usual dictator game. In the other group, the recipients were allowed to send anonymous messages to the dictator, but only after receiving their share. Is that the version you played, or was it different for you --
could you already talk to them earlier, when there was communication? It might have been exactly the version you played, or it might have been that in your version you could talk to the other person beforehand.

AUDIENCE: You could only message before you decided.

PROFESSOR: I see. So that's a slightly different version from what you played. This one is a version where the recipient could only respond: the dictator gets the $10, or whatever is being played, the dictator makes a choice, and then the recipient can send a message back -- can, in some way, reciprocate. It could be a nice message if you gave quite a bit of money, or a mean message if you didn't give a lot. And anticipating that might affect the dictator's choices.

This is what they found in the paper: the anticipation of feedback increased sharing from 25% to 34% in this specific game. That's quite a bit. Most recipients do, in fact, choose to send a message, and the messages vary quite a bit. There are very mean messages like: so you chose to take all the money yourself, you greedy bastard. I was just wondering if there was anyone who would do that. And the answer is apparently yes. Apparently people like you exist. Have a nice evening. So that's not a very nice message -- and I also censored a few messages that were in there. Another one: thank you, greedy bastard. You will like it as an investment banker. I hope you will buy something nice. And then there are positive messages, presumably when people were actually given quite a bit of money, like: there's hope for a better world. With the help of your generosity and your big heart, I can tonight break the black pudding curse. No more pea soup. You have a standing invitation to dinner at my place. The door is always open, with love. So you can see that varies by quite a bit.
And anticipating those kinds of messages, you can see how dictators -- particularly if they play this repeatedly and have some experience with it -- might opt into giving more. So this indicates that, beyond caring about the recipient's reaction itself, dictators care about how shielded they are from it. What I showed you previously, with the costless exit, was more about: I'm worried about you feeling bad, and I don't want you to feel bad, even if I never have to face you. This one is about: I might worry about you feeling bad, but I might also worry about having to receive that message from you. There are some people who might actually be fine with the other person feeling bad, as long as they don't have to face it. But if I actually have to face the message from the other person, I might behave more nicely, because otherwise I'd have to deal with, or face, the fact that perhaps I'm not a nice person.

In our version -- I looked this up -- there doesn't seem to be much evidence of communication affecting behavior. That could be because the version with communication was different from the one in the paper: it seems that once the person gave, the game was simply over, as opposed to a version where, after the person gave, you could send a message back and reciprocate in some way. So here you can see that. In the dictator game without communication -- and notice that the axes change a little bit between panels; this is the monetary stakes with no communication, and this is with communication; I can't really fix that in the software -- the average offer with no communication is 38%, and with communication it's 37%. So if anything, it's a little bit lower.
The comparison is a little bit unclean, because over time -- later in the game -- you may play differently than earlier, and it's not cleanly randomized. But it doesn't seem, in your case, like the communication mattered very much. Again, perhaps that's because the nature of the game was somewhat different.

Sorry -- this next one is actually not a dictator game; this is the ultimatum game. That's a typo at the top. For the ultimatum game, it looks like communication matters a little bit. If you look at the very left, at people offering 0: for whatever reason -- I don't know how you communicated or what you said -- it seems like your communication made people more likely to offer 0, and then those 0 offers were rejected. What you see on the very left is, in the ultimatum game, the red bars are offers that were rejected. Without communication, nobody offered 0; with communication, somehow people did offer 0, and those offers were then rejected. So perhaps some of you sent very negative messages, or threats, or whatever, and that really annoyed the proposer, the other person, and then it essentially backfired. I don't know who is teaching you how to communicate, but maybe you want to think about this in future games. Somehow communication, if anything, made things worse compared to the no-communication case. But even there, if you look at the average offer, it's 47% at the top without communication and 46% with communication, so it doesn't make a huge difference either way. OK? Any questions on that?

So next we're going to talk a little bit about intentions-based social preferences. Let me tell you a brief tale that illustrates the point quite nicely. A boy finds two ripe apples as he walks home from school. He keeps the larger one and gives the smaller one to his friend. The friend says, it wasn't nice to keep the larger one.
The boy says, well, what would you have done? And the friend says, I'd have given you the larger one and kept the smaller one. And the boy says, well, then we each got what you wanted -- so what are you complaining about?

What does that illustrate? It illustrates that it's not always about outcomes, or even about the distribution of outcomes. What seems to matter is some notion of justice or fairness in how the allocation came about. In most of economics, we assume that satisfaction depends on outcomes: when you look at people's utility functions, they're typically outcome-based -- what matters is how much you consume, and so on -- so satisfaction with an outcome doesn't depend on how the outcome came about. But here, when you look at how the apples are distributed, the friend doesn't actually seem to care that much about whether he gets the larger or the smaller apple. What the friend cares about is whether the other boy is a nice person, and whether the allocation came about fairly -- in the sense of, didn't he just keep the larger apple because he could, and wasn't that mean to his friend?

So what really seems to matter quite a bit -- and there's quite a bit of work on this in organizational behavior, economics, and social psychology -- is the idea of procedural justice: fairness in the process used to allocate resources matters a lot. Maybe another way to put this: for most people, it's actually kind of OK if some people are richer than others. What's much less OK is if they obtained the money through some corrupt process, or if it's simply unfair -- one person was born really rich, the other was born really poor, and there's no mobility or opportunity in society. That relates to the idea of the American dream: everybody can become rich, and that's essentially the key.
And so, if some people are richer than others, people often think that's OK. What's not OK, however, is if some people have vastly different opportunities or chances than others. Or if there's lots of corruption, where some people get certain jobs not because they're qualified, but because their parents, their family, or their friends are very powerful, or because they've corrupted others in other ways.

Now, there's a question of fairness, and then there's a question of reciprocity. Fairness means people have preferences over fairness itself: they like fair outcomes more than unfair outcomes. Another form of this is reciprocity: people like to treat others as others have treated, or are treating, them. More generally, the way we treat others depends on the way they treat us. If somebody is mean to you, you're going to be mean to them; if somebody is nice to you, you're going to be nice to them. And that's a really important motivation behind a lot of behavior.

Now, this gets us back to the ultimatum game -- I'm going to talk about this only very briefly -- and the question of why responders reject low offers in the ultimatum game. We talked about this a little bit already. You can think about distributional preferences -- we talked about [INAUDIBLE] and Rabin and about preferences over being behind versus not. So one explanation is based on distributional preferences: you just really don't want to be behind, and therefore you reject unfair offers. But another one is that you feel you've been treated really unfairly, and you're willing to punish the proposer for that unfairness. In a simple ultimatum game, you can't really tell whether it's hypothesis one or hypothesis two.
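Here is a minimal sketch of how the two hypotheses differ in what they predict. The rejection rules and the threshold are my own illustrative assumptions, not a model from the lecture slides.

```python
# Illustrative only: two stylized rejection rules for a responder in a $10 ultimatum game.
# "intentional" marks whether a human proposer chose the offer (vs. a computer / third party).

def reject_distributional(offer, threshold=2.5):
    """Hypothesis 1: purely distributional. Reject any offer that leaves me too far behind,
    no matter who or what generated it."""
    return offer < threshold

def reject_intentions_based(offer, intentional, threshold=2.5):
    """Hypothesis 2: intentions-based reciprocity. Only punish low offers that a human
    proposer deliberately chose; a computer-generated low offer isn't anyone's 'fault'."""
    return offer < threshold and intentional

# The two rules coincide when the proposer chooses the offer, and differ when the offer is
# generated by a computer or a third party -- which is exactly the comparison described next.
for offer in (1, 5):
    for intentional in (True, False):
        print(offer, intentional,
              reject_distributional(offer),
              reject_intentions_based(offer, intentional))
```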
But -- and we talked about this already -- you can compare responders' choices in the ultimatum game to their choices when the offer isn't generated by the other person, the proposer. For example, you can look at offers that were generated by a computer, or by a third person, and so on. If the offer is generated by a third person, you might say, I shouldn't punish the other player in the game, because it's not that person who chose it -- it's not their fault -- so why should I punish them? Similarly, if the computer chose: if I end up with less than the other person, that might be OK. I might not be happy about it, but it's not the other person's fault; it's the computer that chose.

In the next lecture, we're going to talk about a field application of this. It's a very nice paper by Emily Breza and co-authors on the morale effects of pay inequality -- about how workers behave in a work environment when people are paid unequally. The short summary of this paper -- it's a field experiment in India -- is that when workers are paid unequally, they can be really unhappy and less productive at their work, particularly if they have to collaborate with other workers who are paid more than they are. But what they seem to care about in particular is whether it's fair that the other person earns more. If you earn more than I do, that might actually be OK if I can see that you are much smarter, faster, or better at what you do. If, in some observable way, I can tell that you're better than I am, then I might be perfectly fine with you earning 10%, 20%, 50%, or whatever percent more, because I know you're a better worker. So, OK, fine, you're paid more -- that seems fine and OK.
However, in some other situation where it's not that obvious -- where it looks like you're doing the same thing that I'm doing, not really faster or better, and yet you're paid more -- I might be really annoyed and unhappy about it. And you might also be uncomfortable, because you feel bad about being paid more than I am for doing exactly the same thing. The experiment by Emily Breza and coauthors does exactly that: it looks at situations where workers are paid unequally versus the same, and situations where the difference is justified versus not. And it really does seem that fairness, or perceived fairness, matters quite a bit for these workers. When workers can somehow justify being paid less, or more, than others, that's OK. When it's unjustified, they really get annoyed and produce a lot less in that particular setting.

OK. So let me now start with some field evidence. We're going to talk a lot more about field evidence next time; let me briefly discuss this paper here, and next time I'll tell you a lot more. This is a very nice paper by Bandiera et al. that looks at the impact of relative pay versus piece rates on productivity. A piece rate just means that whatever you produce, you're paid per unit of output -- you get some fixed payment for each unit you produce. Relative pay means, essentially, that workers are paid relative to others: if I'm producing 50% more than you, I get paid a lot more than you do.

Now, why does this matter for worker behavior? Well, there's now a negative externality from me working really hard. What's an externality? Essentially, an effect on you that you're not compensated for.
That is to say, if I work really hard and I'm really good at my work, not only do I get paid more, but you get paid less, because relative to me, you now look worse if you work on the same team. So relative pay matters for social preferences, because essentially any increase in my pay comes at the cost of a reduction in your pay. To be clear, the relative pay scheme is such that there's an average pay, which depends on the average productivity of everybody, and then the relative part sits on top of that: workers who produce more relative to others are paid more than others.

So how do you think about this? You can think about a model of pure altruism -- we talked about this already. The self has a payoff x_s, the other person has a payoff x_o, and the self assigns a weight alpha to the other person's utility. This is the typical model that's been used by Becker and others. We don't really think it's the right model for many situations, but it's a really useful benchmark. And now, if I put positive weight on the other person, then under the relative pay scheme I might not work particularly hard, because I know that if I work really hard, I make you worse off. Does this make sense? OK.

So I'm going to show you, briefly, the experiment -- the impact of social preferences in the workplace. They have really nice data from the UK -- sorry, the microphone doesn't really work -- on fruit pickers under different compensation schemes. It's a quasi-field experiment: there are eight weeks in the 2002 picking season in which fruit pickers are compensated on a relative performance scheme.
1519 01:09:24,660 --> 01:09:29,910 That is to say, the per fruit piece rate 1520 01:09:29,910 --> 01:09:32,430 is decreasing in the average productivity. 1521 01:09:32,430 --> 01:09:34,439 So the incentive is to keep the productivity low 1522 01:09:34,439 --> 01:09:35,500 if you care about others. 1523 01:09:35,500 --> 01:09:38,250 Essentially, the more you produce, 1524 01:09:38,250 --> 01:09:40,950 the more you get yourself, but the other person, 1525 01:09:40,950 --> 01:09:42,210 everybody else, gets less. 1526 01:09:42,210 --> 01:09:44,069 So essentially, now you have an incentive 1527 01:09:44,069 --> 01:09:46,410 to not work too hard because if you work too hard, 1528 01:09:46,410 --> 01:09:49,170 other people around you are going to be paid less. 1529 01:09:49,170 --> 01:09:52,109 In the next eight weeks, however, the compensation 1530 01:09:52,109 --> 01:09:55,080 switched to a flat piece rate per fruit. 1531 01:09:55,080 --> 01:09:56,900 And so now the externality is shut down. 1532 01:09:56,900 --> 01:09:58,650 Now it doesn't really matter for you 1533 01:09:58,650 --> 01:10:00,450 how much I produce. 1534 01:10:00,450 --> 01:10:06,340 You essentially get however much you produce, 1535 01:10:06,340 --> 01:10:09,240 or the payment for that. 1536 01:10:09,240 --> 01:10:13,360 The switch was announced on the day when the change took place. 1537 01:10:13,360 --> 01:10:15,340 So it's kind of a surprise to workers. 1538 01:10:15,340 --> 01:10:17,580 So workers were working for eight weeks, 1539 01:10:17,580 --> 01:10:19,788 and then for another eight weeks, there was a switch, 1540 01:10:19,788 --> 01:10:22,205 and they didn't know that this change was going to happen. 1541 01:10:22,205 --> 01:10:24,120 This was a surprise change, so we shouldn't 1542 01:10:24,120 --> 01:10:26,760 see any anticipatory effects of that. 1543 01:10:26,760 --> 01:10:29,520 Now the relative effect of this policy 1544 01:10:29,520 --> 01:10:32,040 depends on this alpha term. 1545 01:10:32,040 --> 01:10:34,500 If I don't care about others at all, it shouldn't matter. 1546 01:10:34,500 --> 01:10:36,750 I should just work really hard, as hard as I can, 1547 01:10:36,750 --> 01:10:39,150 because I want to increase my relative pay. 1548 01:10:39,150 --> 01:10:42,270 However, if my alpha is high, I might 1549 01:10:42,270 --> 01:10:44,820 not work so hard because I'm worried about, essentially, 1550 01:10:44,820 --> 01:10:49,047 my friends or co-workers being paid less. 1551 01:10:49,047 --> 01:10:50,130 And here's what they find. 1552 01:10:50,130 --> 01:10:53,258 They essentially find a dramatic increase in productivity. 1553 01:10:53,258 --> 01:10:55,800 So on the left-hand side, you see essentially the first eight 1554 01:10:55,800 --> 01:10:58,750 weeks, which shows their productivity. 1555 01:10:58,750 --> 01:11:03,690 This is kilograms per hour of fruit that are being collected. 1556 01:11:03,690 --> 01:11:06,510 That goes up dramatically in the second half. 1557 01:11:06,510 --> 01:11:09,410 On the left side, you essentially see no pre-trend. 1558 01:11:09,410 --> 01:11:11,910 It doesn't look like people are getting just more productive 1559 01:11:11,910 --> 01:11:14,280 over time, maybe because they become more experienced 1560 01:11:14,280 --> 01:11:17,790 or they understand better how to work fast. 1561 01:11:17,790 --> 01:11:20,010 It seems really flat on the left-hand side.
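As a stylized illustration of how one might look at this in the data, here is a sketch with invented numbers and made-up column names (not the Bandiera et al. dataset), checking the before/after jump at the switch date and the absence of a pre-trend.

```python
import pandas as pd

# Hypothetical daily productivity data; the numbers and column names are
# invented for illustration only.
df = pd.DataFrame({
    "day":         range(1, 11),
    "kg_per_hour": [5.1, 5.0, 5.2, 4.9, 5.1, 7.4, 7.6, 7.3, 7.8, 7.5],
    "piece_rate":  [0] * 5 + [1] * 5,   # 0 = relative pay weeks, 1 = flat piece rate weeks
})

# Before/after means: the jump at the (unanticipated) switch date
print(df.groupby("piece_rate")["kg_per_hour"].mean())

# Crude pre-trend check: within the relative-pay period, is productivity already drifting up?
pre = df[df["piece_rate"] == 0]
print(pre["kg_per_hour"].diff().mean())   # close to zero if there is no pre-trend
```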
1562 01:11:20,010 --> 01:11:22,770 On the right-hand side, it goes up by quite a bit 1563 01:11:22,770 --> 01:11:27,420 very quickly after this policy is announced. 1564 01:11:27,420 --> 01:11:29,640 It doesn't seem there's any other significant change. 1565 01:11:29,640 --> 01:11:33,120 It really seems like worker productivity per hour goes up. 1566 01:11:33,120 --> 01:11:35,940 For example, the number of workers in the same field 1567 01:11:35,940 --> 01:11:37,050 stays the same and so on. 1568 01:11:37,050 --> 01:11:39,248 There don't seem to be lots of other effects. 1569 01:11:39,248 --> 01:11:41,040 Really, the main thing that seems to happen 1570 01:11:41,040 --> 01:11:43,620 is that workers' productivity goes up. 1571 01:11:43,620 --> 01:11:47,280 Is this a response to changes in the piece rate? 1572 01:11:47,280 --> 01:11:49,590 No, in fact, the piece rate went down. 1573 01:11:49,590 --> 01:11:53,730 How much you were rewarded for the marginal kilogram 1574 01:11:53,730 --> 01:11:56,930 that you collected actually went down. 1575 01:11:56,930 --> 01:11:59,460 In fact, there were incentives to work less if you only 1576 01:11:59,460 --> 01:12:01,410 cared about your own output. 1577 01:12:01,410 --> 01:12:03,510 Also, this is robust to adding controls. 1578 01:12:03,510 --> 01:12:10,380 So now, what's going on here? It seems to me 1579 01:12:10,380 --> 01:12:12,930 that workers care about others. 1580 01:12:12,930 --> 01:12:15,540 But not only do they care about others, 1581 01:12:15,540 --> 01:12:18,000 it also seems to be that the effects are 1582 01:12:18,000 --> 01:12:22,060 larger the more friends somebody had on the field. 1583 01:12:22,060 --> 01:12:23,730 So some workers have lots of friends 1584 01:12:23,730 --> 01:12:25,230 working on the same field who 1585 01:12:25,230 --> 01:12:27,897 are sort of their comparison group, and others did not. 1586 01:12:27,897 --> 01:12:30,480 Now why does it matter how many friends you have in your field? 1587 01:12:30,480 --> 01:12:33,040 Presumably because you care more about your friends 1588 01:12:33,040 --> 01:12:34,300 than about other people. 1589 01:12:34,300 --> 01:12:38,280 So now, you might have social preferences. 1590 01:12:38,280 --> 01:12:40,727 You work less to help others because if you don't work very 1591 01:12:40,727 --> 01:12:42,185 much, they're going to be paid more, 1592 01:12:42,185 --> 01:12:44,260 and you might care about them. 1593 01:12:44,260 --> 01:12:46,710 So you work even less when your friends 1594 01:12:46,710 --> 01:12:50,530 benefit because you care more about your friends than others. 1595 01:12:50,530 --> 01:12:52,350 So then you should see stronger effects 1596 01:12:52,350 --> 01:12:55,920 of the switch to piece rates if you have 1597 01:12:55,920 --> 01:12:58,210 more friends on the field. 1598 01:12:58,210 --> 01:13:00,390 The second part is that it's also like a repeated game, 1599 01:13:00,390 --> 01:13:02,920 and that's kind of trickier to deal with, 1600 01:13:02,920 --> 01:13:05,880 which is that your friends might sort of punish you in various ways. 1601 01:13:05,880 --> 01:13:09,420 If I work really hard, they can see I'm working really hard. 1602 01:13:09,420 --> 01:13:12,510 They might sort of punish you socially at night, 1603 01:13:12,510 --> 01:13:13,867 or they might sort of be mean. 1604 01:13:13,867 --> 01:13:16,450 They might just not want to be your friends anymore and so on. 1605 01:13:16,450 --> 01:13:18,630 It's really a mean thing to do to your friends.
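One simple way to look for that heterogeneity is an interaction between the switch and the number of friends on the field. Here is a sketch on simulated data; the variable names and magnitudes are invented for illustration and are not the paper's specification.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated worker-day panel; every variable name and coefficient here is hypothetical.
rng = np.random.default_rng(0)
n = 400
df = pd.DataFrame({
    "piece_rate": rng.integers(0, 2, n),   # 1 after the switch to the flat piece rate
    "n_friends":  rng.integers(0, 6, n),   # number of friends working on the same field
})
# Build in the pattern described in the lecture: the productivity gain from
# the switch is larger for workers with more friends on the field.
df["kg_per_hour"] = (5
                     + 2.0 * df["piece_rate"]
                     + 0.5 * df["piece_rate"] * df["n_friends"]
                     + rng.normal(0, 1, n))

# The piece_rate:n_friends interaction captures the heterogeneity by friendships.
model = smf.ols("kg_per_hour ~ piece_rate * n_friends", data=df).fit()
print(model.params)
```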
1606 01:13:18,630 --> 01:13:23,700 So if you do that, you might actually not 1607 01:13:23,700 --> 01:13:25,568 want to work hard, not because you 1608 01:13:25,568 --> 01:13:27,360 care that much about the others, but because you're 1609 01:13:27,360 --> 01:13:29,027 worried about, essentially, retribution. 1610 01:13:29,027 --> 01:13:31,050 You're worried about them punishing you 1611 01:13:31,050 --> 01:13:34,482 if you are working too fast. 1612 01:13:34,482 --> 01:13:35,940 Now what kind of variation could we 1613 01:13:35,940 --> 01:13:39,237 look at to disentangle those two types of effects? 1614 01:13:39,237 --> 01:13:41,820 So they have different types of fruit, I can tell you already. 1615 01:13:41,820 --> 01:13:43,380 But what kind of variation would we 1616 01:13:43,380 --> 01:13:45,664 need to disentangle those two explanations? 1617 01:13:57,300 --> 01:13:58,450 Or let me ask it differently. 1618 01:13:58,450 --> 01:14:03,060 What's crucial for that second explanation? 1619 01:14:03,060 --> 01:14:04,310 So I'll give you another hint. 1620 01:14:04,310 --> 01:14:06,800 There's raspberries and strawberries. 1621 01:14:06,800 --> 01:14:08,840 For strawberries, you can see very well 1622 01:14:08,840 --> 01:14:09,980 what the other person does. 1623 01:14:09,980 --> 01:14:12,088 For raspberries, you cannot. 1624 01:14:12,088 --> 01:14:13,130 And why does that matter? 1625 01:14:18,152 --> 01:14:20,110 AUDIENCE: Because then your friends won't know. 1626 01:14:20,110 --> 01:14:22,830 [INAUDIBLE] 1627 01:14:22,830 --> 01:14:23,790 PROFESSOR: Exactly. 1628 01:14:23,790 --> 01:14:27,900 If your friends don't know-- 1629 01:14:27,900 --> 01:14:29,700 well, if it's explanation number one, 1630 01:14:29,700 --> 01:14:31,180 you really care about your friends, 1631 01:14:31,180 --> 01:14:32,972 and it doesn't matter whether or not your friends 1632 01:14:32,972 --> 01:14:35,490 know how much you produce. 1633 01:14:35,490 --> 01:14:36,990 If you care about them a lot, you're 1634 01:14:36,990 --> 01:14:39,780 not going to work particularly hard because working really 1635 01:14:39,780 --> 01:14:42,330 hard is bad for your friends. 1636 01:14:42,330 --> 01:14:44,902 If instead, you're worried about retribution, 1637 01:14:44,902 --> 01:14:46,860 well, then it matters a lot whether your friends 1638 01:14:46,860 --> 01:14:48,090 can see what you do. 1639 01:14:48,090 --> 01:14:50,670 If you do the strawberries, which [INAUDIBLE] on the field, 1640 01:14:50,670 --> 01:14:54,120 others can see really easily what you do. 1641 01:14:54,120 --> 01:14:57,210 If that's the case, you're going to depress your effort 1642 01:14:57,210 --> 01:14:58,440 in the relative pay scheme. 1643 01:15:02,460 --> 01:15:05,760 And then there are going to be large effects once you 1644 01:15:05,760 --> 01:15:08,040 have your individual pay. 1645 01:15:08,040 --> 01:15:12,930 However, if you can hide what you're doing, then 1646 01:15:12,930 --> 01:15:15,330 in some sense you might work secretly really hard 1647 01:15:15,330 --> 01:15:16,805 in some of the bushes. 1648 01:15:16,805 --> 01:15:18,930 Your friends will never be able to tell whether you 1649 01:15:18,930 --> 01:15:20,402 were working hard. 1650 01:15:20,402 --> 01:15:22,110 And then you should see less of an effect 1651 01:15:22,110 --> 01:15:24,480 when the switch happens. 1652 01:15:24,480 --> 01:15:25,920 And this is exactly what they find. 1653 01:15:25,920 --> 01:15:28,140 Productivity is observed for the strawberries. 1654 01:15:28,140 --> 01:15:29,910 It's unobserved for raspberries.
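Here is a sketch, again with made-up numbers rather than the paper's estimates, of the comparison this sets up: the effect of the switch computed separately for the observable and the unobservable fruit.

```python
import pandas as pd

# Hypothetical field-day averages; numbers are invented to illustrate the comparison.
df = pd.DataFrame({
    "fruit":       ["strawberry"] * 4 + ["raspberry"] * 4,
    "piece_rate":  [0, 0, 1, 1] * 2,              # 0 = relative pay, 1 = flat piece rate
    "kg_per_hour": [5.0, 5.2, 7.6, 7.4,           # observable fruit: big jump at the switch
                    6.8, 7.0, 7.1, 6.9],          # unobservable fruit: little or no jump
})

# Mean productivity by fruit type and pay regime, and the switch effect by observability
means = df.groupby(["fruit", "piece_rate"])["kg_per_hour"].mean().unstack()
print(means)
print(means[1] - means[0])
```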
1655 01:15:29,910 --> 01:15:32,310 And so what we see, essentially, is no impact 1656 01:15:32,310 --> 01:15:35,850 of the piece rate for fruit type number two, which essentially 1657 01:15:35,850 --> 01:15:39,030 is to say if you can hide from your friends, 1658 01:15:39,030 --> 01:15:42,580 you will, in fact, work really hard, 1659 01:15:42,580 --> 01:15:45,420 which is essentially suggestive that you don't care 1660 01:15:45,420 --> 01:15:46,800 that much about your friends. 1661 01:15:46,800 --> 01:15:49,050 Instead, what you care about is social sanctions 1662 01:15:49,050 --> 01:15:51,840 or some form of punishment, which only happens 1663 01:15:51,840 --> 01:15:55,080 for the strawberries because that's when people can actually 1664 01:15:55,080 --> 01:15:58,410 see what you produce. 1665 01:15:58,410 --> 01:16:01,620 I can maybe briefly recap this next time. 1666 01:16:01,620 --> 01:16:04,620 Let me just stop here. 1667 01:16:04,620 --> 01:16:09,900 Next lecture, I'm going to show you a lot more field 1668 01:16:09,900 --> 01:16:12,640 evidence from various settings. 1669 01:16:12,640 --> 01:16:15,480 Essentially, with the paper by Breza et al. 1670 01:16:15,480 --> 01:16:18,150 on morale effects and so on, looking at 1671 01:16:18,150 --> 01:16:21,180 field situations, I'm going to show you 1672 01:16:21,180 --> 01:16:23,742 that social preferences seem to matter quite a bit. 1673 01:16:23,742 --> 01:16:25,200 And then in particular, we're going 1674 01:16:25,200 --> 01:16:28,470 to look at the malleability of prosociality, which 1675 01:16:28,470 --> 01:16:31,900 is the question of, well, should we take social preferences 1676 01:16:31,900 --> 01:16:32,670 as given? 1677 01:16:32,670 --> 01:16:37,770 Or are there some interventions or some policies that perhaps 1678 01:16:37,770 --> 01:16:42,750 could make people become nicer or more prosocial 1679 01:16:42,750 --> 01:16:43,710 in some situations? 1680 01:16:43,710 --> 01:16:46,770 Is there something that the government or companies 1681 01:16:46,770 --> 01:16:52,888 could do to get workers or people to be more prosocial? 1682 01:16:52,888 --> 01:16:53,680 That's all for now. 1683 01:16:53,680 --> 01:16:55,400 Thanks so much.