[SQUEAKING] [RUSTLING] [CLICKING]

FRANK SCHILBACH: Welcome to Lecture 18 of 14.13. This lecture is about gender discrimination and identity.

OK. So what's the agenda today? Broadly, we're going to talk about gender, identity, and, in particular, discrimination. First, I'm going to give you a broad and quick overview of why this is an important area to study. Then I'm going to give you an equally brief overview of the gender gap in wages -- the fact that women earn less money than men for equal work -- how that has evolved over time, and how much of this gender gap is still left.

Then we're going to talk for a little bit about potential technological solutions. Can we use technology to reduce discrimination? The answer will be yes, to some extent, but technology will not solve everything.

Then we're going to talk about two particularly important issues. One is beliefs, and how people update their beliefs when it comes to men's and women's skills. And then we're going to talk about gender identity norms in particular, which is the idea that people's deeply held beliefs about what should and should not be done in society might affect their behavior.

And finally, we're going to talk a little bit about demand and supply for different tasks. In particular, I'm going to show you that women are more likely to say yes to what are called nonpromotable tasks -- tasks that are good for society but perhaps not good for their career, or tasks that are good for a company or a university as a whole, but are not promotable in the sense that they do not help the person doing them get promoted. Women are more likely to say yes to those things. In addition, precisely because they are more likely to say yes, they're also more likely to be asked in the first place, which amplifies that difference. I'm going to tell you about this in more detail.

The first question you might ask is, why study gender differences?
I have four answers for you. The first one is simply equality, fairness, and justice. People should be rewarded equally for the same output. So if a man and a woman work equally many hours and are equally productive, they should be paid the same. If that's not the case, that's unequal, unfair, and unjust. There might be additional equality and fairness issues beyond pay -- for instance, women might not even be allowed to work as much as men. That's an additional fairness concern.

Second, there's an efficiency argument: overall productivity and welfare fall if women and other groups are held back by discrimination and other distortions. And indeed, a substantial share of recent US growth can be explained by improved allocation of talent in the economy. This is just the simple idea that if we exclude certain groups, including women, from certain jobs, then some of the most talented people in those groups will not be allowed to do those jobs. And that's bad for society overall. Think about, for example, doctors. If women cannot be doctors, we essentially exclude a large fraction of the people most talented at being doctors -- the smartest, brightest, and hardest-working women. That will be worse not just for those women; it will also be bad for medical care, innovation, and so on in the medical profession, because you exclude some of the best people from that type of work. If you want to learn more about this, the paper by Hsieh et al. is an excellent discussion of this issue.

Third, we might learn about the formation of preferences and personality by studying gender.
Here, the idea is that there are, in many cases, differences in preferences -- risk preferences, social preferences, taste for competition or competitiveness -- but also in attitudes towards negotiation, or other things such as identity, aspirations, or over- or underconfidence more generally. If there are such differences at later ages, we can learn about the formation of those preferences and of personality by looking, for instance, at people at different ages and trying to understand at which ages those differences emerge. For instance, you might look at 5-year-olds or 10-year-olds or 15-year-olds, look at their risk and social preferences, and then try to understand when those gender gaps emerge over time. That will then allow us to understand questions of nature versus nurture: whether there are inherent differences in preferences and personality, such that men and women are just inherently different for some reason, or whether it's really about socialization and education and the like -- girls and boys are treated very differently by their parents, by their social environments, their teachers, and so on, and that might create such differences in preferences and other behaviors. So that's another reason to study those.

We're going to mostly focus on number one: simply the objective that people should be rewarded equally for the same work.

Now, what is the gender gap? What do I mean when I talk about the gender gap? Claudia Goldin, who is an eminent researcher at Harvard and has done a lot of seminal work on gender and the economics of gender, wrote a very nice overview paper in 2014, giving an overview of what we know and how far we have come in trying to reduce and close the gender gap. The graph that I'm showing here summarizes this discussion fairly well. It's a bit of a messy graph, so let me tell you what it shows.
On the x-axis, you see people's age, from 20 to 70. On the y-axis, you see the log of average female to male earnings, which is essentially a measure of the gender gap: the difference in logs is approximately the percentage difference in female versus male earnings. So minus 0.3 we can think of as roughly a 30% difference in female versus male earnings.

What do we see here? Well, first, we see a pretty large gender gap everywhere in this graph. I should also have said that the different lines are different cohorts. These are people born starting in 1923, then '28, '33, '38, '43, and so on, at five-year intervals, from the oldest to the more recent cohorts.

Now, what do we see? We see a significant gender gap everywhere in this graph, in all lines of the graph. We see that the gender gap, even for the most recent cohorts, seems to be about 20% to 30%. We see that the gender gap has fallen: for the oldest cohorts, the gender gap was even larger than for the more recent cohorts. You also see that there's a U-shape, where the gender gap starts emerging already at age 27, 28, 30 -- there's already a gap of something like 10% to 15%, even for the most recent cohorts. But then that gender gap gets magnified towards something like 20%, 30%, 40%, until about age 40 to 45, when it plateaus and maybe falls a bit.

So what do we learn here? There have been substantial female labor market gains in the last half century, but large gender gaps remain. In 2010, for instance, the ratio of mean annual earnings of female to male workers -- for people who worked full-time, full year, ages 25 to 69 -- was 0.72, and that of the median was 0.77. Notice that the figure here controls for work time and education.
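To make the log-difference interpretation concrete, here is the arithmetic behind reading a log gap of minus 0.3 as "about a 30% difference" (a generic illustration, not a number beyond the one just mentioned):

$$\log w_F - \log w_M = \log\!\left(\frac{w_F}{w_M}\right) = -0.3 \;\;\Rightarrow\;\; \frac{w_F}{w_M} = e^{-0.3} \approx 0.74,$$

so women earn roughly 26 percent less than men at that point. For small gaps, $\log(w_F/w_M) \approx w_F/w_M - 1$, which is why the log difference is loosely read as a percentage difference.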
So while it is indeed the case that women work fewer hours in a day than men do, even conditional on work time and education there remains a significant gender gap of something like 20% to 30%: women are paid less for equal work.

Now, first, what are some of the reasons why there have been substantial female labor market gains over the last half century? Well, there are quite a few technological and other explanatory factors. One big factor was a reduction in the gender gap in education: women are more able, and can better afford, to receive education. That's true for primary and secondary education, but also, in particular, for university or higher education more generally. Second, there have been technological innovations, such as the pill, the dishwasher, and so on, which allowed women to do more work outside of the household. There have also been labor demand shifts. In particular, the US economy has been moving from manufacturing and agriculture towards services, and women tend to have comparative advantages in services compared to, in particular, hard manual labor. So these labor demand shifts have disproportionately benefited women. Finally, there has also been lower discrimination, in particular through stronger regulatory controls and increased market competitiveness. For the latter, the idea is that if there's more competition, firms cannot afford to discriminate as much, because discrimination is a costly thing to do -- hiring less competent men over competent women.

Now we're going to discuss a paper by Goldin and Rouse, which is an example of a simple technological solution -- in particular, the question of whether blind auditions in orchestras increased the share of women in those orchestras. So did the introduction of blind auditions contribute to the increase in the share of women playing in orchestras? What do I mean by the increase in the share of women in orchestras?
Even into the 1960s, '70s, and '80s, there were shockingly low shares of women in top orchestras, and in orchestras more generally. These shares were below 10% in many orchestras. Only relatively recently, in the '80s and '90s in particular, were there stark increases in the number of women who play in those orchestras. This increase happened in the top five orchestras, but there have also been similar improvements in the gender ratios of other orchestras, which you can read about in the report that I linked in the slides.

Now, one really important reason for those low shares seems to have been blatant sexism, including by renowned conductors -- which I'm not going to repeat here, but you can read about it in section one of the paper by Goldin and Rouse. And so one question is, has this sexism and discrimination contributed to the low share of women in orchestras?

Now, while there have been all these improvements in the gender ratio among musicians in general, there are still vast differences when it comes to conductors and music directors: they are still predominantly male. Moreover, while there have been a lot of improvements in terms of gender, most musicians are still white. In particular, African-Americans and Latinos are very much underrepresented in orchestras.

Now let me give you a short aside about women in economics. It's not just orchestras in which the fraction of women is low; the same is true in economics. And here's an overview of that. There's a paper by Lundberg and Stearns in 2019, which essentially says that some progress has been made, but lots more needs to be done -- and more recently, in particular, some of the progress seems to have stalled. What do we mean by that? Well, this graph shows the fraction of women at different stages of the profession, ranging from senior majors at the top, to first-year PhD students, to new assistant professors, associate professors, and full professors.
And what you see here is that there seems to be some upward trend, in particular for full professors and associate professors, which is, of course, good. However, there's less of a positive trend to be seen when it comes to assistant professors in particular. Here you essentially see that since about 2006 or 2007 there has been no increase -- if anything, a decrease -- in the fraction of female assistant professors. And even the fraction of women among PhD students is clearly below 50%.

The profession is starting to understand and address these issues, in particular trying to reduce sexism and other issues holding women back. Some particularly nice examples are mentoring programs, which try to support female and minority assistant professors through mentoring and by fostering their careers, and which have been shown to significantly improve outcomes in the profession. For what it's worth, in 14.13 the majority of students tends to be female, and moreover, the best students tend to be female as well. So I very much hope that you, and in particular the female students, are still interested in economics and might pursue a career in economics, perhaps an academic one.

Now back to orchestras. Many orchestras introduced blind auditions in the 1970s and 1980s. And if you look back at the graph that I showed you -- this graph here -- you see that the 1970s, '80s, and '90s are also the time when there was a sharp increase in the share of women in those orchestras. If you want to get a job in an orchestra, you have to do auditions. These auditions are very, very competitive and often have several stages: there are preliminaries, semifinals, and finals in symphony orchestras. What you see here is 11 anonymized orchestras, including the top orchestras in the US, and the introduction of blind auditions in those orchestras.
What you see is that most orchestras did introduce blind auditions, at least in some of the stages of their auditions. However, many of them only introduced them in some stages, but not in others. Now, you can already see that that's not sufficient. If you have blind auditions only in the preliminaries, but not in the semifinals and finals, you might reduce discrimination in the preliminaries; but if women are then discriminated against in the semifinals or finals, that's not going to help them get hired, which is, at the end of the day, what we, of course, care about.

Now, Goldin and Rouse look at this technological solution and ask whether such technological solutions can, in fact, help. The question is, looking at data from actual auditions, does the introduction of blind auditions help candidates reach the next round or, in particular, get hired? The data is fairly rich and often allows for the inclusion of individual fixed effects. That is, looking at the musicians, Goldin and Rouse are going to ask: is a particular musician more likely to make it to the next round when auditions are blinded? And is that more likely to be the case for female than for male musicians?

Now, what do they find overall? They find evidence that blind audition procedures fostered impartiality in hiring and increased the proportion of women in symphony orchestras. There are some caveats: some of these estimates are fairly imprecise, in particular in the later stages, because the number of observations goes down quite a bit, and some of the results are a little bit mixed. But overall, the evidence suggests that these blind auditions did indeed foster impartiality and help women succeed in getting jobs, in particular in these top orchestras in the West.
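To fix ideas, a stylized version of the kind of individual fixed-effects regression this design suggests might look as follows -- this is illustrative notation, not necessarily the exact specification in Goldin and Rouse:

$$\text{Advance}_{ijt} = \alpha_i + \beta\,\text{Blind}_{jt} + \delta\,\big(\text{Blind}_{jt}\times\text{Female}_i\big) + X_{ijt}'\gamma + \varepsilon_{ijt},$$

where $i$ indexes musicians, $j$ orchestras, and $t$ audition rounds; $\text{Advance}_{ijt}$ indicates advancing to the next round (or being hired), $\text{Blind}_{jt}$ indicates that a screen was used, and $\alpha_i$ is the musician fixed effect. The coefficient of interest is $\delta$: whether blinding helps women relatively more than men, holding the individual musician's own ability fixed.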
Now, why are these results perhaps somewhat less clear-cut? I already told you one reason: in the later rounds, the number of observations gets small, because not that many people make it to the later rounds. But there's another reason, which is that the auditions have several rounds, not all of which were blinded. If there's discrimination in one round but not in the next, there's the issue that in the next round there will be selection: in particular, you expect the discriminated-against people who make it to the next round to do better. If there's discrimination against women in the first round, but not in the second, you would think that the women in the second round will do better, because only the best women make it to the second round -- otherwise they would have been screened out in the first round due to discrimination. But if that's the case, then making comparisons in later rounds is potentially biased or problematic, because you're essentially comparing different types of people in different rounds. And that leads to the results being less clear-cut overall.

Let me give you another example. Suppose male and female workers get graded on a task -- a coding task, online -- where you essentially see their output, but you also see their gender. Now suppose women get graded systematically worse due to sexism and no other reason: women and men are exactly the same in the work they do, but some of the men doing the grading are sexist and rate women worse. Then, for any given score, you should expect the woman you hire to do better than a man you hire with the same score, because she has been systematically disadvantaged by the sexism in the rating to start with.
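As a small aside, here is a minimal simulation sketch of that selection logic, under made-up assumptions (identical true ability distributions, a fixed penalty applied to women's observed scores, and a common hiring cutoff). It only illustrates the mechanism, not the magnitudes in any study.

import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# True ability is identical in distribution for men and women.
ability_m = rng.normal(0.0, 1.0, n)
ability_f = rng.normal(0.0, 1.0, n)

# Observed scores: everyone gets the same noise, but women's ratings
# are penalized by a fixed amount (the hypothetical sexism).
penalty = 0.5
score_m = ability_m + rng.normal(0.0, 0.5, n)
score_f = ability_f + rng.normal(0.0, 0.5, n) - penalty

# Hire everyone above a common cutoff applied to the (biased) scores.
cutoff = 1.0
hired_m = ability_m[score_m > cutoff]
hired_f = ability_f[score_f > cutoff]

print(f"share hired, men:   {np.mean(score_m > cutoff):.3f}")
print(f"share hired, women: {np.mean(score_f > cutoff):.3f}")
print(f"mean true ability of hired men:   {hired_m.mean():.3f}")
print(f"mean true ability of hired women: {hired_f.mean():.3f}")
# Fewer women clear the cutoff, but those who do have higher true ability
# on average -- the selection effect described above.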
Now, you should think about this, because there's also some evidence of teaching evaluations being biased against women. If you see a male and a female professor with the exact same teaching rating, you would probably want to take the woman's class, because the female professor has been discriminated against to start with: having the same high score as the male professor is more impressive and more likely to indicate that the class will be better. There's an excellent paper by Bohren et al. that considers the dynamics of discrimination in much more detail, if you're interested in learning more about that.

Now, despite technological solutions and advances like the ones I mentioned, substantial gender gaps remain. While, as I told you, the gender gap has been reduced -- both in terms of labor force participation (how many women work overall, and how many days and hours those women work) and in terms of wages and earnings (how much they are paid conditional on working) -- there are still substantial gender gaps. In particular, women's labor force participation has plateaued since the early 1990s. And even now, among entering cohorts, women still earn significantly less than men, and that's true even conditional on work time and education. That's the graph that I was showing you earlier.

Given that these gender gaps persist even though a lot of the technological and other barriers have been reduced -- women now are as educated as men, and there are fewer technological barriers that preclude women from working -- researchers have started to consider some less traditional factors within economics, including things such as risk attitudes, negotiation skills, and taste for competition, but also -- and this is what we're going to focus on -- beliefs, social norms, and identity. I'm going to first talk about beliefs, and then we're going to talk about social norms and identity.
Before we get to beliefs about women's and men's skills, I'm going to tell you very briefly about a seminal paper by Bertrand and Mullainathan from 2004 that randomized names on job applications. What they did, essentially, was send out fictitious job applications for male and female applicants that were otherwise identical, send them to employers, and then look at callback rates by those employers. The only thing that was varied across those applications was the name on the application. And what did they do? Well, they used white-sounding names such as Emily and Greg, as well as African-American-sounding names such as Lakisha and Jamal. And then they looked at whether the applications with white-sounding names, which were otherwise identical, were more successful than the applications with African-American-sounding names.

Bertrand and Mullainathan find a striking result: callback rates for white-sounding names were 50% higher than for African-American-sounding names. That's a huge difference. And it's important not just because it's a hassle to have to send out more applications. If you get called back more, you're more likely to be interviewed or asked for an interview, you're more likely to actually get the job, you're less likely to be unemployed, and you might also be less likely to stop searching altogether out of disappointment or discouragement. So that's a huge difference, and, essentially, discrimination against African-Americans was blatantly obvious from this research. Numerous other studies show gender and racial biases in various ways.
These range from very similar audit studies, where resumes are sent to employers, to studies where people on eBay try to sell things -- say, an iPad or an iPhone -- and hold that iPhone with hands that are either white or non-white. And they essentially find that white hands are more likely to sell their iPhones than non-white hands.

Now, this is a very important and depressing result. One thing to flag here is that it's hard to distinguish statistical from taste-based discrimination. We're going to talk more about this in recitation. What do I mean by that? Well, it might be that people have a preference for white applicants. That is to say, controlling for performance, it might just be that people are racist and would rather have white applicants than African-American applicants. This is what economists would call taste-based discrimination: conditional on performance, people just like certain types of people better than others and, therefore, are more likely to call them back.

A different explanation is a belief-based explanation. It might be that employers think that African-American applicants will perform worse, even controlling for all other aspects of the resume. So essentially, people see the name and think that even if the resume is good, that person will not perform as well as somebody with a white-sounding name. Notice that these beliefs could be correct or not -- we'll talk about this in recitation as well. Economists would call this statistical discrimination. This could be correct statistical discrimination, or statistical discrimination that's based on biased beliefs. But either way, these forces jointly led to clear discrimination against African-American applications, which, in turn, leads to striking differences in unemployment or job finding rates across races.
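One rough way to write down the distinction, in illustrative notation rather than anything taken from the paper: under taste-based (Becker-style) discrimination, the employer acts as if hiring a worker from group $g$ yields

$$U = \mathbb{E}[y \mid \text{resume}] - w - d\cdot\mathbf{1}\{g = \text{minority}\},$$

so even at equal expected productivity and wage, a "distaste" parameter $d>0$ lowers callbacks for the minority group. Under statistical discrimination, the employer instead calls back based on

$$\mathbb{E}[y \mid \text{resume},\, g],$$

that is, group membership is used as a signal about productivity -- which can reflect correct group-level statistics or, just as easily, biased beliefs about the group.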
We're going to now focus on beliefs -- in particular, beliefs about skills, and how people update those beliefs depending on the gender of the person they're updating about.

The paper we're going to consider now is by Sarsons, 2019, which considers how people interpret signals in the labor market. The question that Sarsons asks is whether somebody's gender influences the way we interpret information about this person and about his or her peers. What do I mean by that? Well, in many situations, when people do certain tasks and perform at work, good things and bad things happen. From these events, you can infer something about their competence and their performance. Now, many of these situations are ambiguous, in the sense that it could be that the person was just really lucky, or it could be that the person is really, really good at what they're doing. What Sarsons is asking is: when people see those events, do they interpret them differently for men versus women?

So, for example, if something bad happens at work, a boss might react to that person differently, or update their beliefs about the person's skills differently, depending on whether it's a man or a woman. In addition, the boss might also update differently about the person's peers. So if a woman makes a mistake, the boss might not only update his beliefs about this woman, but also about all other women who work for him. Similarly, of course, the boss might do the same for a man. What Heather Sarsons asks is whether this updating, or interpretation of information, is done differently for male versus female workers.

And why do we care? Well, we care a lot about this because hiring, promotion, and wage decisions hinge on information about workers' ability.
In particular, if there's a systematic difference in how information about men and women is interpreted, that could lead to differences in hiring, promotion, and wage decisions and, in turn, might contribute to the gender gap.

What does Sarsons do to answer this question? She looks at how physicians change their referrals to surgeons, and to those surgeons' peers, after certain patient outcomes. What do I mean by that? Physicians often refer patients needing surgery to a local surgeon. If somebody breaks an arm or a leg, many physicians can't treat this on their own, so they will refer the patient to a local surgeon. Now, the referral choice reflects the physician's beliefs about the surgeon's ability. You want to do the best for your patient, and you want to send that patient to the best possible surgeon. So if you suspect that a certain surgeon is not competent, you'd better not send your patient to that surgeon -- in part because you care a lot about the patient, and in part because making bad referrals might also reflect on you eventually.

Now, what does Sarsons do specifically to document whether the reaction depends on the surgeon's gender? She matches on surgeon and patient characteristics and on procedures. That is to say, she takes male and female surgeons, patient characteristics, and procedures -- broken arms versus more complicated procedures, for example -- and matches on all of those characteristics, such that in the observation set the surgeons differ only by gender. And then she runs what we call an event study, which compares how physicians react to male and female surgeons when good and bad things happen.
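A stylized event-study regression behind figures like the ones that follow might look as below -- again, illustrative notation, not necessarily Sarsons' exact specification:

$$\text{Referrals}_{pst} = \alpha_{ps} + \sum_{k \neq -1} \omega_k\,\mathbf{1}\{t - E_{ps} = k\} + \sum_{k \neq -1} \delta_k\,\mathbf{1}\{t - E_{ps} = k\}\times\text{Female}_s + \varepsilon_{pst},$$

where $p$ indexes referring physicians, $s$ surgeons, $t$ quarters, $E_{ps}$ is the quarter of the good or bad event, and $\alpha_{ps}$ is a physician-surgeon pair fixed effect. Everything is measured relative to the quarter just before the event ($k=-1$, normalized to zero), and the $\delta_k$ capture whether referrals respond differently when the surgeon is a woman.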
Let me now show you more precisely what I mean. On the x-axis, we have quarters. On the y-axis, we have referrals from the referring physician to the performing surgeon -- so these are essentially dyads, or pairs, of physicians and performing surgeons. You see here that quarter minus 1 is normalized to be 0. Then you see the evolution of the referrals from a specific physician to a performing surgeon. These are men and women, and these are cases in which there is no adverse event -- where an adverse event would be, for example, a patient's death. These are men and women that are matched, so they're supposed to look exactly the same.

What we see here is that referrals increase over time: they increase fairly steeply at first and then plateau. The exact shape and pattern of this relationship doesn't matter so much. What matters is that men and women look exactly the same. The gender that varies here is the gender of the performing surgeon.

Now, Sarsons looks at cases in which there is an adverse event. Again, the adverse event happens around here. What you see is that when there is an adverse event, there's a clear punishment, in the sense that the male surgeon now gets fewer referrals from the physician right after the adverse event happens. This is shown by the blue bars being lower than the gray bars. Notice that the number of referrals doesn't go down -- everything here is still above the red line. It's just that the physician is no longer increasing referrals over time, or that increase gets dampened by the adverse event. That's for the male surgeons. Let's now look at what happens to the female surgeons. What we see here is a clear difference that opens up after the adverse event. Notice that there are no pre-trends: before the event, female and male surgeons -- the red and the blue bars -- look exactly the same. The difference only opens up after the adverse event, and it's a pretty large gap, as you can see, that shows up in the referrals.
And this gap -- comparing the red versus the gray lines -- is way larger than the divergence between the blue and the gray lines. So what we see here is that both men and women are punished, in the sense of receiving fewer referrals after an adverse event, but that punishment is much more severe for female than for male surgeons.

Now let's do the same for good events -- cases where a good event actually happens. What's a good event? Well, it's essentially a surprisingly good outcome: there's a complicated surgery and the surgeon does really well. What we see here is, again, that men and women look pretty much exactly the same beforehand -- maybe men are doing slightly better overall, but these gray lines look very similar over time. Now, if a good event happens, you see that men get very much rewarded: if the male surgeon does well, he gets rewarded in the sense of receiving a lot more referrals from the PCP, the referring doctor. If you do the same for female surgeons, female surgeons also benefit, but the increase in referrals is a lot lower than for men.

So, again, for both men and women there is a reaction to that information, to a good or a bad event. But the reaction tends to be much stronger, much more favorable, for men: men get rewarded more for good events, and women get punished more for bad events.

Let me summarize the main results. After a bad outcome -- for example, a patient death -- there is a 34% decrease in referrals to female surgeons, and a stagnation in referrals to male surgeons. So, essentially, women get punished quite a lot; men a little bit, in the sense that referrals don't go up anymore, but the punishment is much smaller. Now, strikingly, physicians are also less likely to refer to other female surgeons.
That is to say, if a patient dies under a female surgeon's care, the physician is also less likely to refer future patients to other female surgeons -- other female surgeons for whom no patient has died. So, essentially, what seems to be happening is that physicians also update about women, about female surgeons in general, when a negative event happens, which is very striking.

After a good outcome, such as an unanticipated survival, there's a doubling of referrals to male surgeons, but only a 70% increase in referrals to female surgeons. And here there are no spillovers to other female surgeons.

So, again, what do we find? We find that women are punished more for unexpected bad events, and they're rewarded less for unexpected good events. That is to say, it's not just that there's more updating going on overall -- in the sense that maybe you learn more from good and bad events for women because you have less information about them to start with. That's not the case here. What is the case is that there's more updating, or more punishment, for women's mistakes, and less reward for the good events that happen to women.

Now, why do these asymmetries matter? Well, one important issue here is that women have more chances to make mistakes. And, in particular, if women don't get promoted, or if they even drop out, that leads to lower skill accumulation: women will just be less qualified overall in the end, because they have fewer chances to get promoted and take on more responsibility, and so on, and then they may even drop out altogether and we just have fewer women in the profession. Moreover, wage gaps are usually measured conditional on skills, industry, and position.
747 00:37:22,330 --> 00:37:25,320 But if there are biased evaluations 748 00:37:25,320 --> 00:37:27,880 that then lead to differences in skills and positions 749 00:37:27,880 --> 00:37:29,950 in the workplace, not only is it the case 750 00:37:29,950 --> 00:37:35,210 that women face wage gaps conditional on their skills, 751 00:37:35,210 --> 00:37:37,570 but they also have different skills and positions 752 00:37:37,570 --> 00:37:39,910 in the workplace because of biased evaluations. 753 00:37:39,910 --> 00:37:42,700 So that's an additional scope for discrimination 754 00:37:42,700 --> 00:37:45,930 and differences in earnings and so on. 755 00:37:45,930 --> 00:37:47,640 And then, importantly, this is also 756 00:37:47,640 --> 00:37:50,340 not a mistake that can get corrected over time. 757 00:37:50,340 --> 00:37:53,640 Because if women drop out in particular, 758 00:37:53,640 --> 00:37:56,100 or if there's no referrals that happen anymore, 759 00:37:56,100 --> 00:37:58,410 then the doctor or any other person 760 00:37:58,410 --> 00:38:01,770 might not receive any further signals from her work. 761 00:38:01,770 --> 00:38:04,350 If you sort of update on a particular person, 762 00:38:04,350 --> 00:38:07,970 that that patient of mine died because I sent them 763 00:38:07,970 --> 00:38:09,600 to that person, if you then think, 764 00:38:09,600 --> 00:38:11,460 you know, this person is terrible 765 00:38:11,460 --> 00:38:14,430 and don't send any further patients anymore, 766 00:38:14,430 --> 00:38:16,800 you will never learn that this woman was, in fact, 767 00:38:16,800 --> 00:38:19,020 highly qualified and that it was just 768 00:38:19,020 --> 00:38:21,840 an accident that could have happened to anyone. 769 00:38:21,840 --> 00:38:25,010 So that's a really important issue-- 770 00:38:25,010 --> 00:38:26,660 when women are underrepresented, 771 00:38:26,660 --> 00:38:29,730 employers see fewer outcomes overall. 772 00:38:29,730 --> 00:38:33,116 And if there is more updating on the bad outcomes, 773 00:38:33,116 --> 00:38:37,130 or, in particular, if there's also updating about all women, 774 00:38:37,130 --> 00:38:42,560 then there will be fewer chances to learn, and if women are let 775 00:38:42,560 --> 00:38:45,890 go and so on or are less likely to be hired, 776 00:38:45,890 --> 00:38:48,620 then there will be no chance of actually correcting 777 00:38:48,620 --> 00:38:50,360 those beliefs. 778 00:38:50,360 --> 00:38:54,140 Strikingly, as I told you, a really important 779 00:38:54,140 --> 00:38:57,170 result is that there seems to be updating 780 00:38:57,170 --> 00:39:00,030 not just about one specific woman 781 00:39:00,030 --> 00:39:02,120 after the negative event, 782 00:39:02,120 --> 00:39:08,330 but also about all other women, who get punished 783 00:39:08,330 --> 00:39:11,210 for the bad event that happens. 784 00:39:11,210 --> 00:39:13,130 And if that's the case, if there are 785 00:39:13,130 --> 00:39:15,710 several women in the profession, then all women 786 00:39:15,710 --> 00:39:17,990 always have to pay for any bad events that 787 00:39:17,990 --> 00:39:18,950 happen to any woman. 788 00:39:18,950 --> 00:39:21,140 And sometimes, you know, mistakes, in fact, happen. 789 00:39:21,140 --> 00:39:25,160 That's, of course, terrible because that sort of amplifies 790 00:39:25,160 --> 00:39:26,210 a lot of the updating.
791 00:39:26,210 --> 00:39:30,470 Because even if somebody makes no mistakes whatsoever, 792 00:39:30,470 --> 00:39:33,240 negative updates happen on her perceived performance 793 00:39:33,240 --> 00:39:37,130 when other women perform negatively in some way. 794 00:39:42,190 --> 00:39:46,470 Now one question you might ask is, well, 795 00:39:46,470 --> 00:39:49,820 couldn't we just use algorithms, machine learning, et cetera, 796 00:39:49,820 --> 00:39:52,250 to overcome such biases? 797 00:39:52,250 --> 00:39:54,170 In particular, in hiring and so on, 798 00:39:54,170 --> 00:39:57,080 couldn't you use algorithms and machine 799 00:39:57,080 --> 00:40:01,280 learning, train those algorithms, and 800 00:40:01,280 --> 00:40:04,310 use computers instead of humans, and then, you know, 801 00:40:04,310 --> 00:40:06,320 we might get rid of those biases? 802 00:40:06,320 --> 00:40:08,240 Well, one key problem with that is 803 00:40:08,240 --> 00:40:10,940 that algorithms themselves can be biased. 804 00:40:10,940 --> 00:40:11,810 Now why is that? 805 00:40:11,810 --> 00:40:16,470 Well, we train algorithms based on human decisions. 806 00:40:16,470 --> 00:40:19,900 As an example, if you train an algorithm-- 807 00:40:19,900 --> 00:40:22,610 this is, in fact, what some companies did-- 808 00:40:22,610 --> 00:40:24,505 based on actual decisions, if you train 809 00:40:24,505 --> 00:40:26,630 it on actual hiring decisions or interview 810 00:40:26,630 --> 00:40:29,210 decisions, you ask, 811 00:40:29,210 --> 00:40:33,590 well, would this person be invited, 812 00:40:33,590 --> 00:40:37,610 or is this person, is this resume, likely to be invited 813 00:40:37,610 --> 00:40:39,050 for an interview? 814 00:40:39,050 --> 00:40:41,840 Well, if those interviewing decisions to start with 815 00:40:41,840 --> 00:40:44,550 were biased based on sexism, racism, 816 00:40:44,550 --> 00:40:47,690 and so on, if you train your algorithm on it, then 817 00:40:47,690 --> 00:40:51,770 you very much make that algorithm biased as well. 818 00:40:51,770 --> 00:40:56,680 Now, crucially, that's, of course, problematic 819 00:40:56,680 --> 00:40:58,810 and I very much would like to avoid that. 820 00:40:58,810 --> 00:41:00,430 And we want to be very careful. 821 00:41:00,430 --> 00:41:02,680 But, crucially, some of these biases 822 00:41:02,680 --> 00:41:05,470 can perhaps be fixed, and perhaps 823 00:41:05,470 --> 00:41:09,660 fixed more easily, than biases in human decision-making, right? 824 00:41:09,660 --> 00:41:11,230 If somebody is sexist or racist, it's 825 00:41:11,230 --> 00:41:15,260 very hard to fix this, at least in the short run. 826 00:41:15,260 --> 00:41:18,220 However, an algorithm, essentially, 827 00:41:18,220 --> 00:41:21,010 is just doing whatever you train it to do. 828 00:41:21,010 --> 00:41:24,910 So if you build equity concerns into the objective function 829 00:41:24,910 --> 00:41:27,170 of an algorithm, then we can essentially 830 00:41:27,170 --> 00:41:32,680 reduce or even eliminate these biases explicitly 831 00:41:32,680 --> 00:41:35,968 because the algorithm is explicitly asked to do so.
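To make the idea of building equity concerns into an algorithm's objective function concrete, here is a minimal sketch of my own (it is not from the lecture or from any company's actual system; the synthetic data, the demographic-parity penalty, and the weight lam are all illustrative assumptions):

import numpy as np
from scipy.optimize import minimize

# Synthetic "historical" screening data in which group 1 was less likely
# to receive a yes, holding qualifications fixed (purely an assumption
# for illustration).
rng = np.random.default_rng(0)
n, k = 500, 3
X = rng.normal(size=(n, k))                      # applicant characteristics
group = rng.integers(0, 2, size=n)               # e.g. a protected attribute
y = (X[:, 0] - 0.5 * group + rng.normal(size=n) > 0).astype(float)

def loss(w, lam):
    # Logistic loss plus a penalty on the squared gap in average predicted
    # scores between the two groups (one simple notion of equity).
    p = 1.0 / (1.0 + np.exp(-X @ w))
    ce = -np.mean(y * np.log(p + 1e-9) + (1 - y) * np.log(1 - p + 1e-9))
    gap = p[group == 0].mean() - p[group == 1].mean()
    return ce + lam * gap ** 2

w_plain = minimize(loss, np.zeros(k), args=(0.0,)).x  # mimics the biased decisions
w_fair = minimize(loss, np.zeros(k), args=(5.0,)).x   # trades some fit for a smaller group gap

Raising lam shrinks the gap in predicted scores between the groups at some cost in fit to the biased historical labels; the point is only that the trade-off is explicit and adjustable, unlike with a human decision-maker.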
832 00:41:35,968 --> 00:41:37,510 And so there's a fascinating-- if you 833 00:41:37,510 --> 00:41:38,890 want to learn more about this overall, 834 00:41:38,890 --> 00:41:40,723 there's a fascinating talk on discrimination 835 00:41:40,723 --> 00:41:43,090 by algorithms and people by Sendhil Mullainathan 836 00:41:43,090 --> 00:41:45,760 that I've linked here in the slides 837 00:41:45,760 --> 00:41:48,860 and that you can watch to learn more. 838 00:41:48,860 --> 00:41:51,940 But overall, just to summarize, 839 00:41:51,940 --> 00:41:54,635 algorithms can be biased, but they can also be fixed. 840 00:41:54,635 --> 00:41:56,260 And that's an important research agenda 841 00:41:56,260 --> 00:41:59,160 that people have been working on more recently, 842 00:41:59,160 --> 00:42:02,380 the design of fair algorithms. 843 00:42:02,380 --> 00:42:04,390 Next, we're going to talk about gender identity 844 00:42:04,390 --> 00:42:10,070 norms and a very nice paper by Bertrand et al. from 2015. 845 00:42:10,070 --> 00:42:13,610 Identity considerations were imported from social psychology 846 00:42:13,610 --> 00:42:17,843 to economics in multiple papers by Akerlof and Kranton 847 00:42:17,843 --> 00:42:19,880 in particular in 2000. 848 00:42:19,880 --> 00:42:22,220 There's a very nice book about identity economics 849 00:42:22,220 --> 00:42:25,310 if you're interested in learning more about that. 850 00:42:25,310 --> 00:42:28,990 Gender identity norms are an important example 851 00:42:28,990 --> 00:42:33,380 or application of such norms or identity considerations 852 00:42:33,380 --> 00:42:34,880 more generally. 853 00:42:34,880 --> 00:42:38,870 In particular, Akerlof and Kranton, or we here as well, 854 00:42:38,870 --> 00:42:43,040 focus on two social categories, men and women. 855 00:42:43,040 --> 00:42:46,220 For simplicity, we can also just talk about husbands and wives, 856 00:42:46,220 --> 00:42:49,050 understanding things are more complicated in reality. 857 00:42:49,050 --> 00:42:51,850 But for simplicity, we're going to focus on that for now. 858 00:42:54,420 --> 00:42:55,890 What do we mean by gender identity? 859 00:42:55,890 --> 00:42:57,307 Well, gender identity is something 860 00:42:57,307 --> 00:43:00,090 that changes payoffs from different actions as dictated 861 00:43:00,090 --> 00:43:02,190 by some prescriptive norms. 862 00:43:02,190 --> 00:43:03,360 What might those norms be? 863 00:43:03,360 --> 00:43:05,460 Well, one norm would be men should not 864 00:43:05,460 --> 00:43:07,980 do women's work in the house, in the home, 865 00:43:07,980 --> 00:43:11,820 like cooking, cleaning, and so on. 866 00:43:11,820 --> 00:43:14,520 Some men might consider that as women's work. 867 00:43:14,520 --> 00:43:17,490 And they might feel particularly high disutility 868 00:43:17,490 --> 00:43:21,390 from doing so if they violate that norm. 869 00:43:21,390 --> 00:43:26,370 Another norm would be men should earn more than their wives. 870 00:43:26,370 --> 00:43:30,690 Husbands then lose identity, experience lower utility, 871 00:43:30,690 --> 00:43:33,330 in circumstances when these prescriptions are violated, 872 00:43:33,330 --> 00:43:37,890 when the husband does housework, or when his wife earns more than half 873 00:43:37,890 --> 00:43:40,030 of the household income. 874 00:43:40,030 --> 00:43:44,030 Notice that wives might also lose identity.
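To read "losing identity" as a statement about payoffs, here is a minimal sketch of my own in the spirit of the Akerlof and Kranton framing just described (the log utility, the fixed identity cost, and the 0.5 cutoff are illustrative assumptions, not the authors' specification):

import math

def husband_payoff(household_income, wife_share, identity_cost=0.5):
    # Consumption utility minus an identity loss when the prescription
    # "the husband should earn more than his wife" is violated.
    consumption_utility = math.log(household_income)  # concave utility (assumption)
    identity_loss = identity_cost if wife_share > 0.5 else 0.0
    return consumption_utility - identity_loss

# Same total income, but the husband is worse off once the wife earns
# just over half, which is what can generate a missing mass of couples
# just above 0.5, as discussed later in the lecture.
print(husband_payoff(100_000, 0.49))
print(husband_payoff(100_000, 0.51))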
875 00:43:44,030 --> 00:43:45,780 For simplicity, we're going to talk 876 00:43:45,780 --> 00:43:49,150 only about husbands for now. 877 00:43:49,150 --> 00:43:51,360 Now Bertrand et al. 878 00:43:51,360 --> 00:43:54,070 particularly focus on the prescription 879 00:43:54,070 --> 00:43:56,940 that men should earn more than their wives. 880 00:43:56,940 --> 00:44:00,120 And it's important to notice that this particular gender 881 00:44:00,120 --> 00:44:04,650 identity norm 882 00:44:04,650 --> 00:44:06,060 would not matter in a world where 883 00:44:06,060 --> 00:44:09,360 women could never earn more than their actual or potential 884 00:44:09,360 --> 00:44:10,260 husbands. 885 00:44:10,260 --> 00:44:12,840 But if women are held back by technological factors, 886 00:44:12,840 --> 00:44:17,100 by education, et cetera, and never even earn 887 00:44:17,100 --> 00:44:20,430 close to as much as their husbands or potential husbands, 888 00:44:20,430 --> 00:44:24,440 well, then the gender identity norm is kind of irrelevant. 889 00:44:24,440 --> 00:44:27,920 However, as women make gains in the labor market, 890 00:44:27,920 --> 00:44:30,650 these slow moving gender identity norms 891 00:44:30,650 --> 00:44:34,550 can become increasingly relevant and increasingly binding 892 00:44:34,550 --> 00:44:35,700 constraints. 893 00:44:35,700 --> 00:44:38,690 So now as women have more and more earnings potential 894 00:44:38,690 --> 00:44:40,760 and are 895 00:44:40,760 --> 00:44:43,370 quite likely to earn more than their husbands, 896 00:44:43,370 --> 00:44:47,480 these norms really bite and might hold women back. 897 00:44:47,480 --> 00:44:50,990 Now how would you study this question empirically? 898 00:44:50,990 --> 00:44:53,120 The idea here in Bertrand et 899 00:44:53,120 --> 00:44:58,920 al. is that, well, if husbands lose 900 00:44:58,920 --> 00:45:02,880 identity when their wives earn more than half the household 901 00:45:02,880 --> 00:45:06,960 income, well, then they will try to avoid 902 00:45:06,960 --> 00:45:09,180 such situations 903 00:45:09,180 --> 00:45:11,310 in various ways, and particularly 904 00:45:11,310 --> 00:45:14,430 by avoiding such marriages in the first place. 905 00:45:14,430 --> 00:45:16,260 But also if those marriages happen, 906 00:45:16,260 --> 00:45:18,330 that might lead to divorce and so on 907 00:45:18,330 --> 00:45:20,190 and those marriages might break up. 908 00:45:20,190 --> 00:45:22,590 So the idea of the paper then is to look 909 00:45:22,590 --> 00:45:27,060 for a missing mass of couples, particularly in 910 00:45:27,060 --> 00:45:29,820 places where the wife earns just a little bit more 911 00:45:29,820 --> 00:45:31,030 than her husband. 912 00:45:31,030 --> 00:45:32,200 Now how do they do that? 913 00:45:32,200 --> 00:45:35,190 Well, they look at the distribution of relative income 914 00:45:35,190 --> 00:45:37,733 using US survey data. 915 00:45:37,733 --> 00:45:39,150 What do I mean by relative income? 916 00:45:39,150 --> 00:45:41,400 Relative income is the wife's income relative to the combined income of the husband 917 00:45:41,400 --> 00:45:42,750 and the wife. 918 00:45:42,750 --> 00:45:46,080 They use data from a very nice survey, 919 00:45:46,080 --> 00:45:49,170 the Survey of Income and Program Participation.
920 00:45:49,170 --> 00:45:53,670 These are a series of representative national panels 921 00:45:53,670 --> 00:45:59,130 with about 70,000 couple observations from 1992 to 2004. 922 00:45:59,130 --> 00:46:02,460 This includes couples only in which both the husbands 923 00:46:02,460 --> 00:46:05,810 and wives are earning positive income. 924 00:46:05,810 --> 00:46:08,730 So notice that this is not about husband and wife 925 00:46:08,730 --> 00:46:09,630 working at all. 926 00:46:09,630 --> 00:46:11,880 It's not about women being at home versus working. 927 00:46:11,880 --> 00:46:17,370 But it's about cases where both spouses 928 00:46:17,370 --> 00:46:18,540 have positive income. 929 00:46:21,180 --> 00:46:24,330 Their income measure is annual total labor income 930 00:46:24,330 --> 00:46:27,060 plus self-employment income as well. 931 00:46:27,060 --> 00:46:29,070 Now using that data, Bertrand et al. 932 00:46:29,070 --> 00:46:31,140 can now compute the shares of couples 933 00:46:31,140 --> 00:46:34,665 in which the wife earns different fractions of total income. 934 00:46:34,665 --> 00:46:36,540 So what I'm going to show you is the income 935 00:46:36,540 --> 00:46:40,380 of the wife as a share of total income in the household. 936 00:46:40,380 --> 00:46:43,170 And that's graphed using 20 bins. 937 00:46:43,170 --> 00:46:44,530 What does this graph look like? 938 00:46:44,530 --> 00:46:48,990 You can see here the share earned by the wife 939 00:46:48,990 --> 00:46:50,490 and the fraction of couples at each share. 940 00:46:50,490 --> 00:46:51,920 This is overall. 941 00:46:51,920 --> 00:46:53,290 So this is all the couples. 942 00:46:53,290 --> 00:46:54,180 And the bars 943 00:46:54,180 --> 00:46:57,450 all add up to 100% overall. 944 00:46:57,450 --> 00:47:01,370 And what you see is, perhaps not surprisingly, that the share-- 945 00:47:01,370 --> 00:47:05,700 the fraction of couples for which the share earned by the wife 946 00:47:05,700 --> 00:47:09,950 is higher than 50% is not super high. 947 00:47:09,950 --> 00:47:12,530 That's not surprising given that men on average 948 00:47:12,530 --> 00:47:14,540 are earning more than women. 949 00:47:14,540 --> 00:47:15,970 So that's not the question here. 950 00:47:15,970 --> 00:47:21,110 The question here is, is there a missing mass just above 50%? 951 00:47:21,110 --> 00:47:26,840 The idea is, suppose there are some couples where the wife earns 952 00:47:26,840 --> 00:47:31,610 45% or 46% or even 49%, there the norm does not 953 00:47:31,610 --> 00:47:34,370 cause an issue or there's no identity problems 954 00:47:34,370 --> 00:47:36,200 because the husband earns somewhat more 955 00:47:36,200 --> 00:47:37,770 and feels happy about it. 956 00:47:37,770 --> 00:47:41,553 But as soon as you cross the threshold to like 51%, 957 00:47:41,553 --> 00:47:42,470 now there's a problem. 958 00:47:42,470 --> 00:47:43,640 Now the woman earns more. 959 00:47:43,640 --> 00:47:47,010 The husband feels uncomfortable, unhappy and so on. 960 00:47:47,010 --> 00:47:52,260 And that might lead to those marriages disappearing 961 00:47:52,260 --> 00:47:53,760 in various ways. 962 00:47:53,760 --> 00:47:56,457 First, they might not form in the first place. 963 00:47:56,457 --> 00:47:58,040 Potential husband and wife might never 964 00:47:58,040 --> 00:48:00,890 even marry because the husband gets uncomfortable or does not 965 00:48:00,890 --> 00:48:03,410 want to marry somebody who earns more. 966 00:48:03,410 --> 00:48:04,825 Second, there might be divorces.
967 00:48:04,825 --> 00:48:06,200 These marriages might essentially 968 00:48:06,200 --> 00:48:09,680 dissolve because there's lots of conflict in the household. 969 00:48:09,680 --> 00:48:11,333 Now, how does that manifest here? 970 00:48:11,333 --> 00:48:12,750 If you look at the graph 971 00:48:12,750 --> 00:48:16,760 and look at this line here coming 972 00:48:16,760 --> 00:48:19,700 from the left toward 50%, you can kind of, 973 00:48:19,700 --> 00:48:23,360 and it's kind of the idea of the empirical 974 00:48:23,360 --> 00:48:24,170 strategy, 975 00:48:24,170 --> 00:48:27,080 try to predict what fraction you would expect 976 00:48:27,080 --> 00:48:29,020 there to be just above 50%. 977 00:48:29,020 --> 00:48:30,770 And when you do that, you think, you know, 978 00:48:30,770 --> 00:48:33,710 the fraction should be about something like 7%, 979 00:48:33,710 --> 00:48:35,060 should be about here. 980 00:48:35,060 --> 00:48:38,630 But, instead, there's this jump down to about 6%. 981 00:48:38,630 --> 00:48:41,630 So about 1% of couples are missing here. 982 00:48:41,630 --> 00:48:44,250 There's a missing mass going on here, 983 00:48:44,250 --> 00:48:47,600 which suggests that there are some marriages that 984 00:48:47,600 --> 00:48:52,520 do not happen or dissolve because of those identity 985 00:48:52,520 --> 00:48:53,035 concerns. 986 00:48:53,035 --> 00:48:54,410 This is what I already said here. 987 00:48:54,410 --> 00:48:56,540 There's a cliff to the right of 0.5 988 00:48:56,540 --> 00:48:59,990 in the distribution of relative earnings across couples. 989 00:48:59,990 --> 00:49:02,000 And that's an implication of the prescription 990 00:49:02,000 --> 00:49:04,700 that men should earn more than their wives. 991 00:49:04,700 --> 00:49:08,420 Now what are the mechanisms behind this missing mass? 992 00:49:08,420 --> 00:49:11,210 Bertrand et al. have a wealth of additional evidence 993 00:49:11,210 --> 00:49:12,800 showing these mechanisms. 994 00:49:12,800 --> 00:49:15,840 In particular, they show three mechanisms, 995 00:49:15,840 --> 00:49:18,830 which is, first, the missing couples did not 996 00:49:18,830 --> 00:49:20,280 form in the first place. 997 00:49:20,280 --> 00:49:21,320 When you look at this-- 998 00:49:21,320 --> 00:49:25,330 when you just randomly match men and women, in some of which 999 00:49:25,330 --> 00:49:28,170 matches the man has higher earnings potential than the woman 1000 00:49:28,170 --> 00:49:30,080 and in others the other way around, the potential couples 1001 00:49:30,080 --> 00:49:33,020 where the woman would earn just somewhat more than the man 1002 00:49:33,020 --> 00:49:34,970 are just much less 1003 00:49:34,970 --> 00:49:38,150 likely to marry compared to couples 1004 00:49:38,150 --> 00:49:41,440 where the woman would earn just somewhat less than the man. 1005 00:49:41,440 --> 00:49:44,440 Second, such couples are less happy and less stable and more 1006 00:49:44,440 --> 00:49:45,502 likely to end in divorce. 1007 00:49:45,502 --> 00:49:46,960 So conditional on marrying, they're 1008 00:49:46,960 --> 00:49:50,977 more likely to get divorced as those marriages just end unhappily. 1009 00:49:50,977 --> 00:49:52,810 And that, of course, then also contributes 1010 00:49:52,810 --> 00:49:54,720 to this missing mass.
1011 00:49:54,720 --> 00:49:58,860 And then third, in cases where wives have higher earnings 1012 00:49:58,860 --> 00:50:01,200 potential than their husbands, they 1013 00:50:01,200 --> 00:50:05,700 decide to work less outside of the household or, you know, 1014 00:50:05,700 --> 00:50:08,700 put in less [INAUDIBLE] work so they get promoted less, which 1015 00:50:08,700 --> 00:50:12,300 then sort of pushes down their earnings 1016 00:50:12,300 --> 00:50:15,720 or increases the husband's earnings in relative terms, 1017 00:50:15,720 --> 00:50:19,590 such that in those cases the woman earns less than 50%. 1018 00:50:19,590 --> 00:50:21,720 And that also leads to that missing mass. 1019 00:50:21,720 --> 00:50:23,730 In fact, what you see here is this increase 1020 00:50:23,730 --> 00:50:26,680 in mass going from like 20% to 40%, 1021 00:50:26,680 --> 00:50:28,467 which is perhaps also 1022 00:50:28,467 --> 00:50:30,300 because some of the mass that's missing here 1023 00:50:30,300 --> 00:50:33,530 is pushed here to the left. 1024 00:50:33,530 --> 00:50:39,160 Now, in addition to working less outside of the household 1025 00:50:39,160 --> 00:50:43,870 and just working fewer hours and less overtime and so on, 1026 00:50:43,870 --> 00:50:47,140 women are also held back by nonmarket 1027 00:50:47,140 --> 00:50:47,980 and childcare work. 1028 00:50:47,980 --> 00:50:49,840 So women, in particular, women who 1029 00:50:49,840 --> 00:50:54,880 are earning just a little bit more than their husbands 1030 00:50:54,880 --> 00:50:58,390 are doing more nonmarket and childcare work, which is often 1031 00:50:58,390 --> 00:51:00,020 called the second shift. 1032 00:51:00,020 --> 00:51:01,690 So the idea here is that the husband, 1033 00:51:01,690 --> 00:51:05,470 when his identity gets threatened by the wife earning 1034 00:51:05,470 --> 00:51:07,660 more than he does, 1035 00:51:07,660 --> 00:51:10,510 particularly does not want to do the childcare or nonmarket work. 1036 00:51:10,510 --> 00:51:13,302 So the woman ends up not only working more outside 1037 00:51:13,302 --> 00:51:15,760 of the household and earning more and being more productive 1038 00:51:15,760 --> 00:51:19,210 in that work, but also doing more work at home 1039 00:51:19,210 --> 00:51:21,910 in addition to that. 1040 00:51:21,910 --> 00:51:26,820 That's essentially what's called the second or double shift. 1041 00:51:26,820 --> 00:51:34,550 Now an additional issue that is a really important contributor 1042 00:51:34,550 --> 00:51:39,080 to the gender gap is the arrival of children. 1043 00:51:39,080 --> 00:51:41,260 And this is a very nice paper by Kleven et al. that 1044 00:51:41,260 --> 00:51:44,960 looks at people's earnings, in particular, 1045 00:51:44,960 --> 00:51:48,440 how earnings evolve after the birth 1046 00:51:48,440 --> 00:51:50,910 of a couple's first child. 1047 00:51:50,910 --> 00:51:54,020 This is an event study that looks at the years 1048 00:51:54,020 --> 00:51:57,260 around the birth for men and women. 1049 00:51:57,260 --> 00:52:00,290 And what you see essentially for men-- these are earnings, 1050 00:52:00,290 --> 00:52:02,480 and on the right here you have hours worked. 1051 00:52:02,480 --> 00:52:04,850 For men, essentially, there's barely 1052 00:52:04,850 --> 00:52:09,690 any difference, if any, in earnings 1053 00:52:09,690 --> 00:52:12,260 once the first child is born. 1054 00:52:12,260 --> 00:52:16,580 Essentially, men's earnings tend to be pretty flat.
1055 00:52:16,580 --> 00:52:18,280 Of course, some of that flatness could reflect 1056 00:52:18,280 --> 00:52:19,530 growth that's forgone. 1057 00:52:19,530 --> 00:52:21,200 But, you know, at least there's no reduction 1058 00:52:21,200 --> 00:52:22,190 in men's earnings. 1059 00:52:22,190 --> 00:52:25,100 This is, by the way, Danish data. 1060 00:52:25,100 --> 00:52:29,870 Now for women, however, there's a clear reduction 1061 00:52:29,870 --> 00:52:32,600 in earnings. 1062 00:52:32,600 --> 00:52:40,410 And the long-run child penalty is almost 20%. 1063 00:52:40,410 --> 00:52:41,810 So, essentially, 1064 00:52:41,810 --> 00:52:43,280 if you look at men and women that 1065 00:52:43,280 --> 00:52:47,990 earn the same to start with, women earn about 20% 1066 00:52:47,990 --> 00:52:51,200 less in the long run. 1067 00:52:51,200 --> 00:52:54,170 That is due to the first child-- 1068 00:52:54,170 --> 00:52:57,530 this gap arises after the first child is born. 1069 00:52:57,530 --> 00:53:00,320 This is true for earnings. 1070 00:53:00,320 --> 00:53:02,060 It's also true for hours worked. 1071 00:53:02,060 --> 00:53:04,340 And there the penalty is about 10%. 1072 00:53:04,340 --> 00:53:07,160 It's also true for wages, as in how much 1073 00:53:07,160 --> 00:53:10,760 people are paid conditional on working. 1074 00:53:10,760 --> 00:53:13,160 So part of that is women working less. 1075 00:53:13,160 --> 00:53:16,970 But part of it is also women being paid less, as in being 1076 00:53:16,970 --> 00:53:20,480 promoted less, being paid less, getting fewer raises and so on 1077 00:53:20,480 --> 00:53:23,120 once the child is born. 1078 00:53:23,120 --> 00:53:25,610 One very interesting fact in this paper 1079 00:53:25,610 --> 00:53:29,540 is that child penalties are transmitted through generations 1080 00:53:29,540 --> 00:53:32,540 from parents to daughters, suggesting 1081 00:53:32,540 --> 00:53:36,270 an influence of child environment on gender identity. 1082 00:53:36,270 --> 00:53:38,600 There seems to be something about the gender identity notion 1083 00:53:38,600 --> 00:53:43,250 that the woman is supposed to take care of the child, 1084 00:53:43,250 --> 00:53:45,080 and that runs in the family. 1085 00:53:45,080 --> 00:53:46,970 And that leads to particularly large, 1086 00:53:46,970 --> 00:53:54,740 what they call, child penalties for those women 1087 00:53:54,740 --> 00:53:56,890 compared to men. 1088 00:53:56,890 --> 00:53:59,060 Now one thing you might say is, well, 1089 00:53:59,060 --> 00:54:01,810 couldn't we have some gender neutral family policies 1090 00:54:01,810 --> 00:54:03,400 that alleviate that? 1091 00:54:03,400 --> 00:54:08,590 And, in particular, in academia, for instance, one idea 1092 00:54:08,590 --> 00:54:13,660 is one could use gender neutral tenure clock stopping policies 1093 00:54:13,660 --> 00:54:16,630 to alleviate such issues. 1094 00:54:16,630 --> 00:54:18,400 And the idea is that, in particular, 1095 00:54:18,400 --> 00:54:20,030 sort of early years of the career 1096 00:54:20,030 --> 00:54:21,765 are particularly valuable. 1097 00:54:21,765 --> 00:54:23,140 So the very least we can do is we 1098 00:54:23,140 --> 00:54:27,430 can stop the tenure clocks to help women or couples 1099 00:54:27,430 --> 00:54:29,830 to deal with childcare.
1100 00:54:29,830 --> 00:54:33,550 And many, many universities, in particular research-intensive universities 1101 00:54:33,550 --> 00:54:35,440 in the US, have adopted what are 1102 00:54:35,440 --> 00:54:39,910 called gender neutral tenure clock policies. 1103 00:54:39,910 --> 00:54:43,010 Now what do I mean by gender neutral? 1104 00:54:43,010 --> 00:54:50,620 Essentially, if a couple has a child, not only does the woman, 1105 00:54:50,620 --> 00:54:53,110 if she has given birth to the child, 1106 00:54:53,110 --> 00:54:56,750 get an extension of the tenure clock, 1107 00:54:56,750 --> 00:55:00,720 but the spouse will get such an extension as well. 1108 00:55:00,720 --> 00:55:03,370 So it's gender neutral in that sense. 1109 00:55:03,370 --> 00:55:05,770 Now, what would such policies then do? 1110 00:55:05,770 --> 00:55:08,800 Well, they're intended to involve men more in child care 1111 00:55:08,800 --> 00:55:12,490 and foster that, the idea being that if you only 1112 00:55:12,490 --> 00:55:16,120 give that extension to women, it sort of pushes 1113 00:55:16,120 --> 00:55:21,700 women to take care of their children 1114 00:55:21,700 --> 00:55:23,860 and sort of not to work. 1115 00:55:23,860 --> 00:55:27,880 And perhaps if we involve both spouses or parents 1116 00:55:27,880 --> 00:55:32,050 in the childcare, providing those gender neutral 1117 00:55:32,050 --> 00:55:34,757 tenure clock stopping policies might contribute to that. 1118 00:55:34,757 --> 00:55:36,340 But, of course, there's no enforcement 1119 00:55:36,340 --> 00:55:38,990 and thus, potentially, such policies 1120 00:55:38,990 --> 00:55:40,900 might even 1121 00:55:40,900 --> 00:55:43,702 be enhancing gender inequality. 1122 00:55:43,702 --> 00:55:45,160 In particular, if it's the case 1123 00:55:45,160 --> 00:55:49,330 that men take advantage of this tenure clock 1124 00:55:49,330 --> 00:55:52,000 stopping policy by essentially just having another year to do 1125 00:55:52,000 --> 00:55:56,050 research, but still don't do very much to support 1126 00:55:56,050 --> 00:55:59,120 with the child, then men might actually benefit from it, 1127 00:55:59,120 --> 00:56:02,343 while women might not benefit at all, in part 1128 00:56:02,343 --> 00:56:03,760 because, relatively, they're going 1129 00:56:03,760 --> 00:56:05,620 to do worse compared to men. 1130 00:56:05,620 --> 00:56:11,920 And so Antecol et al. look, in fact, at the impact of the adoption 1131 00:56:11,920 --> 00:56:18,300 of these supposedly gender neutral tenure clock 1132 00:56:18,300 --> 00:56:19,630 stopping policies. 1133 00:56:19,630 --> 00:56:23,100 And they find that the introduction 1134 00:56:23,100 --> 00:56:26,250 of this policy at [INAUDIBLE] economics department 1135 00:56:26,250 --> 00:56:30,060 substantially reduced female tenure rates while increasing 1136 00:56:30,060 --> 00:56:31,200 male tenure rates. 1137 00:56:31,200 --> 00:56:34,762 So essentially what looks like a neutral policy, in fact, 1138 00:56:34,762 --> 00:56:35,970 is not gender neutral at all. 1139 00:56:35,970 --> 00:56:39,870 It makes things worse for women and better for men. 1140 00:56:39,870 --> 00:56:43,800 Let me tell you about one final paper on gender 1141 00:56:43,800 --> 00:56:46,260 identity and the labor market. 1142 00:56:46,260 --> 00:56:48,480 This paper is called "'Acting Wife': Marriage Market 1143 00:56:48,480 --> 00:56:50,730 Incentives and Labor Market Investments."
1144 00:56:50,730 --> 00:56:52,960 It's by Bursztyn et al. and it asks the question 1145 00:56:52,960 --> 00:56:56,190 whether women avoid career-enhancing actions 1146 00:56:56,190 --> 00:56:59,160 because these actions signal undesirable or seemingly 1147 00:56:59,160 --> 00:57:02,220 undesirable traits such as ambition to the marriage 1148 00:57:02,220 --> 00:57:03,770 market. 1149 00:57:03,770 --> 00:57:06,660 There's two parts in the paper. 1150 00:57:06,660 --> 00:57:09,390 One part is just observational data 1151 00:57:09,390 --> 00:57:12,720 that finds that while married and unmarried female MBA 1152 00:57:12,720 --> 00:57:15,420 students perform similarly when their performance is 1153 00:57:15,420 --> 00:57:18,990 unobserved by classmates, such as on exams or problem sets, 1154 00:57:18,990 --> 00:57:22,830 unmarried women have lower participation grades which are 1155 00:57:22,830 --> 00:57:24,880 observed by their classmates. 1156 00:57:24,880 --> 00:57:30,060 And so the idea here is that only unmarried female MBA 1157 00:57:30,060 --> 00:57:34,960 students have incentives on the marriage market. 1158 00:57:34,960 --> 00:57:38,775 And, therefore, when their participation is observed, 1159 00:57:38,775 --> 00:57:41,130 when their performance is observed, 1160 00:57:41,130 --> 00:57:44,220 they might want to scale back their ambition 1161 00:57:44,220 --> 00:57:46,890 because otherwise they would look too ambitious on the marriage market. 1162 00:57:46,890 --> 00:57:52,140 And that might be adversely interpreted by potential dates. 1163 00:57:52,140 --> 00:57:55,150 And, of course, married women, since they are already married, 1164 00:57:55,150 --> 00:57:57,090 don't have that incentive. 1165 00:57:57,090 --> 00:57:59,520 So they don't engage in such behavior. 1166 00:57:59,520 --> 00:58:01,900 As you might know, there's lots of dating going on. 1167 00:58:01,900 --> 00:58:06,180 So lots of MBA students are actively looking for a spouse 1168 00:58:06,180 --> 00:58:09,590 during their MBA, both men and women. 1169 00:58:09,590 --> 00:58:12,620 Now in addition, they have field experiments with MBAs 1170 00:58:12,620 --> 00:58:17,180 in which they vary the expectation of whether responses 1171 00:58:17,180 --> 00:58:20,390 in a real-stakes placement questionnaire 1172 00:58:20,390 --> 00:58:23,540 are going to be made public, in particular, whether they're going 1173 00:58:23,540 --> 00:58:26,690 to be observed by their peers. 1174 00:58:26,690 --> 00:58:30,620 And now single female students 1175 00:58:30,620 --> 00:58:33,890 reported lower desired salaries and lower willingness 1176 00:58:33,890 --> 00:58:38,420 to travel and work long hours on those questionnaires when 1177 00:58:38,420 --> 00:58:41,750 they expected their classmates to see their preferences. 1178 00:58:41,750 --> 00:58:45,230 That is to say, when these questions are public, 1179 00:58:45,230 --> 00:58:52,370 single female students reported lower ambition as manifested 1180 00:58:52,370 --> 00:58:54,830 by desired salaries and willingness to travel and work 1181 00:58:54,830 --> 00:58:56,960 long hours, presumably because they want 1182 00:58:56,960 --> 00:59:00,300 to look more favorable on the marriage market. 1183 00:59:00,300 --> 00:59:03,780 Notably, other groups' responses were not affected 1184 00:59:03,780 --> 00:59:06,658 by such peer observability. 1185 00:59:06,658 --> 00:59:08,700 So in other groups, it didn't matter whether it's 1186 00:59:08,700 --> 00:59:10,840 in public or in private.
1187 00:59:10,840 --> 00:59:12,580 In addition, there's a second experiment 1188 00:59:12,580 --> 00:59:16,780 that indicates the effects were driven by observability 1189 00:59:16,780 --> 00:59:18,610 by single male peers. 1190 00:59:18,610 --> 00:59:22,780 That is to say, if another woman saw the information, that 1191 00:59:22,780 --> 00:59:24,080 didn't matter so much. 1192 00:59:24,080 --> 00:59:26,770 But if a single male peer 1193 00:59:26,770 --> 00:59:30,160 was likely or possibly going to see that information, 1194 00:59:30,160 --> 00:59:33,470 then these effects would show up. 1195 00:59:33,470 --> 00:59:37,840 So that's overall showing that, essentially, 1196 00:59:37,840 --> 00:59:41,900 what's perceived to be undesirable traits, 1197 00:59:41,900 --> 00:59:44,440 such as ambition, if that is perceived 1198 00:59:44,440 --> 00:59:49,000 to be damaging on the marriage market, that can then, in turn, 1199 00:59:49,000 --> 00:59:51,010 reduce displayed ambition and then contribute 1200 00:59:51,010 --> 00:59:54,760 to the gender gap overall. 1201 00:59:59,990 --> 01:00:02,330 OK, let me tell you now about a final paper 1202 01:00:02,330 --> 01:00:07,470 on saying no and on demand and supply for different tasks. 1203 01:00:07,470 --> 01:00:09,470 This is a very nice paper by Vesterlund et al. 1204 01:00:09,470 --> 01:00:13,050 And it asks the question whether women say no often enough. 1205 01:00:13,050 --> 01:00:15,440 And so that's motivated by the observation 1206 01:00:15,440 --> 01:00:20,030 that female faculty members spend fewer hours on research 1207 01:00:20,030 --> 01:00:23,450 and more hours on university service committees 1208 01:00:23,450 --> 01:00:25,020 than male faculty. 1209 01:00:25,020 --> 01:00:27,770 They are also more likely to have positions 1210 01:00:27,770 --> 01:00:29,660 on university-wide committees. 1211 01:00:29,660 --> 01:00:31,460 They advise more undergrad students 1212 01:00:31,460 --> 01:00:33,770 and participate more in department- and college-level 1213 01:00:33,770 --> 01:00:37,100 committees than male faculty. 1214 01:00:37,100 --> 01:00:39,500 More generally, in mid-level jobs, 1215 01:00:39,500 --> 01:00:41,930 men more than women evaluate their individual task 1216 01:00:41,930 --> 01:00:43,790 assignments as challenging. 1217 01:00:43,790 --> 01:00:46,940 And this is partially attributed to differential task 1218 01:00:46,940 --> 01:00:49,190 assignments by supervisors. 1219 01:00:49,190 --> 01:00:55,720 So if you sort of see these differences, 1220 01:00:55,720 --> 01:01:01,180 you might ask the question, why do women 1221 01:01:01,180 --> 01:01:04,810 decide to spend their work time differently? 1222 01:01:04,810 --> 01:01:07,100 And there are two types of dimensions here. 1223 01:01:07,100 --> 01:01:10,100 One is demand and one is supply. 1224 01:01:10,100 --> 01:01:11,950 So demand is essentially the question of 1225 01:01:11,950 --> 01:01:14,630 whether there are sex differences in the types of tasks 1226 01:01:14,630 --> 01:01:18,940 that women and men are asked to do at work-- 1227 01:01:18,940 --> 01:01:21,573 essentially what they're being asked. 1228 01:01:21,573 --> 01:01:22,990 And are women more likely than men 1229 01:01:22,990 --> 01:01:25,720 to be asked to do what's called non-promotable tasks, tasks 1230 01:01:25,720 --> 01:01:27,970 that are not really helpful for their career, 1231 01:01:27,970 --> 01:01:29,320 for getting promotions.
1232 01:01:29,320 --> 01:01:31,790 And then you might ask why that's the case. 1233 01:01:31,790 --> 01:01:33,610 Supply is: are there sex differences 1234 01:01:33,610 --> 01:01:36,440 in the willingness to agree to perform non-promotable tasks 1235 01:01:36,440 --> 01:01:36,940 when asked? 1236 01:01:36,940 --> 01:01:38,565 That is, conditional on being asked, 1237 01:01:38,565 --> 01:01:41,350 are women more likely to say yes? 1238 01:01:41,350 --> 01:01:46,240 And, again, are women more likely than men to say yes 1239 01:01:46,240 --> 01:01:48,430 to non-promotable tasks? 1240 01:01:48,430 --> 01:01:50,350 Now why do we care? 1241 01:01:50,350 --> 01:01:53,180 Well, there's an individual decision-making perspective, 1242 01:01:53,180 --> 01:01:55,810 which is that people might make suboptimal decisions about how 1243 01:01:55,810 --> 01:01:57,430 to allocate their 1244 01:01:57,430 --> 01:01:58,360 time at work. 1245 01:01:58,360 --> 01:02:00,210 And understanding these underlying reasons 1246 01:02:00,210 --> 01:02:04,060 could potentially lead 1247 01:02:04,060 --> 01:02:06,310 to some interventions 1248 01:02:06,310 --> 01:02:08,685 to improve decision-making for certain individuals, which 1249 01:02:08,685 --> 01:02:10,780 is a lot of what behavioral economics is about. 1250 01:02:10,780 --> 01:02:13,900 But there's also managerial and social planner perspectives 1251 01:02:13,900 --> 01:02:17,230 that organizations may not be using their resources most 1252 01:02:17,230 --> 01:02:18,070 effectively. 1253 01:02:18,070 --> 01:02:22,390 And if you sort of reallocated some of the tasks, 1254 01:02:22,390 --> 01:02:24,988 that would increase output overall. 1255 01:02:24,988 --> 01:02:27,280 And then finally, there is a public policy perspective. 1256 01:02:27,280 --> 01:02:30,760 Well, if these sex differences in the allocation of time 1257 01:02:30,760 --> 01:02:34,570 explain vertical sex segregation, essentially women 1258 01:02:34,570 --> 01:02:36,250 not being promoted enough, then it 1259 01:02:36,250 --> 01:02:38,920 would help us sort of improve gender equity overall 1260 01:02:38,920 --> 01:02:43,220 and perhaps also try to reduce the gender gap. 1261 01:02:43,220 --> 01:02:45,920 Now what are these promotable and non-promotable tasks 1262 01:02:45,920 --> 01:02:47,540 in the case of academics? 1263 01:02:47,540 --> 01:02:49,900 A promotable task is, for example, doing research. 1264 01:02:49,900 --> 01:02:53,240 That's essentially what people are evaluated on eventually, 1265 01:02:53,240 --> 01:02:55,400 but also any other task, essentially, that's 1266 01:02:55,400 --> 01:02:58,190 seen as something other people will reward you 1267 01:02:58,190 --> 01:02:59,630 for when you do it. 1268 01:02:59,630 --> 01:03:01,800 And non-promotable tasks are essentially tasks 1269 01:03:01,800 --> 01:03:05,150 that, often, many people could do 1270 01:03:05,150 --> 01:03:06,930 and everybody wants the task to be done, 1271 01:03:06,930 --> 01:03:09,320 yet everybody prefers somebody else to do it. 1272 01:03:09,320 --> 01:03:12,440 An example would be an ethics committee of the university. 1273 01:03:12,440 --> 01:03:14,000 Surely, we should have one. 1274 01:03:14,000 --> 01:03:15,500 But surely, sitting on the committee 1275 01:03:15,500 --> 01:03:17,150 will not help people get promoted 1276 01:03:17,150 --> 01:03:20,020 or get tenure when it comes to research.
1277 01:03:20,020 --> 01:03:23,950 Now the authors did a field study of faculty 1278 01:03:23,950 --> 01:03:25,330 at a large public university. 1279 01:03:25,330 --> 01:03:26,747 And they sent, essentially, emails 1280 01:03:26,747 --> 01:03:31,300 from the chair of the faculty senate asking faculty to volunteer 1281 01:03:31,300 --> 01:03:34,000 to join one of several university-wide faculty senate 1282 01:03:34,000 --> 01:03:34,990 committees. 1283 01:03:34,990 --> 01:03:37,360 And the clear answer in this study 1284 01:03:37,360 --> 01:03:42,500 is that women are much more likely to volunteer when asked. 1285 01:03:42,500 --> 01:03:44,490 Now, that is an interesting fact. 1286 01:03:44,490 --> 01:03:47,090 But it doesn't necessarily let us disentangle the demand 1287 01:03:47,090 --> 01:03:50,610 and supply explanations that I showed you previously. 1288 01:03:50,610 --> 01:03:52,550 So in addition then, the authors do 1289 01:03:52,550 --> 01:03:54,110 a lot of experiments in which they 1290 01:03:54,110 --> 01:03:58,190 do what's called a threshold public good game. 1291 01:03:58,190 --> 01:04:02,390 And this is sort of very much trying to mirror or resemble 1292 01:04:02,390 --> 01:04:03,890 the reality. 1293 01:04:03,890 --> 01:04:05,810 In this game, a small group needs 1294 01:04:05,810 --> 01:04:08,180 to find a volunteer for a task. 1295 01:04:08,180 --> 01:04:12,320 And participants are anonymously matched into groups of three. 1296 01:04:12,320 --> 01:04:16,580 They're randomly rematched for each of 10 rounds. 1297 01:04:16,580 --> 01:04:19,010 The game is set up such that everyone 1298 01:04:19,010 --> 01:04:21,050 prefers that the task be undertaken by someone 1299 01:04:21,050 --> 01:04:22,970 other than themselves. 1300 01:04:22,970 --> 01:04:25,500 People get two minutes to decide whether to invest. 1301 01:04:25,500 --> 01:04:27,920 And so the task is essentially you have to click a button. 1302 01:04:27,920 --> 01:04:29,780 And then you invest the money. 1303 01:04:29,780 --> 01:04:31,270 Only one person can invest. 1304 01:04:31,270 --> 01:04:33,290 And the round ends when somebody invests. 1305 01:04:33,290 --> 01:04:36,560 If no one invests, group members all earn $1. 1306 01:04:36,560 --> 01:04:39,800 If a person invests, that person earns 1307 01:04:39,800 --> 01:04:43,970 $1.25 and the remaining group members earn $2. 1308 01:04:43,970 --> 01:04:46,040 The clock ticks down until one person invests 1309 01:04:46,040 --> 01:04:48,235 or no investment is made in two minutes. 1310 01:04:48,235 --> 01:04:49,610 So the game is essentially set up 1311 01:04:49,610 --> 01:04:53,135 in a way such that everybody wants somebody to invest-- 1312 01:04:53,135 --> 01:04:54,950 it's clear that somebody should invest. 1313 01:04:54,950 --> 01:04:57,158 But nobody wants to invest because if you invest, you 1314 01:04:57,158 --> 01:04:58,017 only get $1.25. 1315 01:04:58,017 --> 01:05:00,350 But if somebody else invests and you 1316 01:05:00,350 --> 01:05:03,290 don't invest, you get $2. 1317 01:05:03,290 --> 01:05:05,780 Now in this game, women are significantly more 1318 01:05:05,780 --> 01:05:06,717 likely to invest. 1319 01:05:06,717 --> 01:05:09,050 We can essentially see the probability of investing here 1320 01:05:09,050 --> 01:05:10,730 on the left side by round.
1321 01:05:10,730 --> 01:05:13,310 And the red line, which is the upper line, the female 1322 01:05:13,310 --> 01:05:15,507 line, is way, way higher 1323 01:05:15,507 --> 01:05:16,590 compared to the male line. 1324 01:05:16,590 --> 01:05:18,140 So women have a much higher probability 1325 01:05:18,140 --> 01:05:19,930 of investing in this game. 1326 01:05:19,930 --> 01:05:22,610 Now why are women more likely to invest? 1327 01:05:22,610 --> 01:05:24,050 Is it the case that women believe 1328 01:05:24,050 --> 01:05:26,660 that their cooperation is necessary for an optimal group 1329 01:05:26,660 --> 01:05:29,300 decision, but men believe that their cooperation isn't 1330 01:05:29,300 --> 01:05:29,820 required? 1331 01:05:29,820 --> 01:05:33,020 So it could be that women think that, well, 1332 01:05:33,020 --> 01:05:34,937 if I don't do it, nobody else does it. 1333 01:05:34,937 --> 01:05:37,520 And then men essentially think, well, somebody else can do it, 1334 01:05:37,520 --> 01:05:40,560 in particular, women could do it if they're in the group. 1335 01:05:40,560 --> 01:05:44,970 So then the authors do the same experiment again using 1336 01:05:44,970 --> 01:05:51,300 single-sex sessions where essentially all three 1337 01:05:51,300 --> 01:05:54,300 people in the session are female or all people in the session 1338 01:05:54,300 --> 01:05:55,495 are male. 1339 01:05:55,495 --> 01:05:57,120 And then what they find is essentially, 1340 01:05:57,120 --> 01:06:01,050 interestingly, these gender differences now go away entirely. 1341 01:06:01,050 --> 01:06:05,340 There are no gender differences in same-sex sessions anymore. 1342 01:06:05,340 --> 01:06:09,300 So just to summarize: in experiment one, 1343 01:06:09,300 --> 01:06:12,147 when groups are mixed gender, women 1344 01:06:12,147 --> 01:06:13,230 are more likely to invest. 1345 01:06:13,230 --> 01:06:15,150 And in experiment two, with single-sex groups, women and men 1346 01:06:15,150 --> 01:06:17,280 are equally likely to invest. 1347 01:06:17,280 --> 01:06:19,980 Now, one question you might ask is, well, 1348 01:06:19,980 --> 01:06:22,410 isn't this really 1349 01:06:22,410 --> 01:06:23,170 about beliefs? 1350 01:06:23,170 --> 01:06:26,290 So it could be that there are, in fact, 1351 01:06:26,290 --> 01:06:28,290 no differences in the preferences for investing. 1352 01:06:28,290 --> 01:06:31,050 So it's not like women are necessarily nicer than men. 1353 01:06:31,050 --> 01:06:33,330 But, rather, there are differences in beliefs 1354 01:06:33,330 --> 01:06:35,242 that the woman will invest. 1355 01:06:35,242 --> 01:06:37,200 In particular, if there's a woman in the group, 1356 01:06:37,200 --> 01:06:40,080 everybody else will think, oh, the woman is going to invest. 1357 01:06:40,080 --> 01:06:43,530 And the woman might believe that others will not invest, 1358 01:06:43,530 --> 01:06:45,648 so she 1359 01:06:45,648 --> 01:06:47,190 might think that she needs 1360 01:06:47,190 --> 01:06:50,460 to do it herself, otherwise 1361 01:06:50,460 --> 01:06:53,890 everybody will end up with $1, with less money. 1362 01:06:53,890 --> 01:06:56,970 So now there's a third experiment here 1363 01:06:56,970 --> 01:07:01,500 where people are asked to essentially pick 1364 01:07:01,500 --> 01:07:04,080 whom they would like to ask. 1365 01:07:04,080 --> 01:07:06,550 And so there's four people per group. 1366 01:07:06,550 --> 01:07:07,740 And three people can invest.
1367 01:07:07,740 --> 01:07:09,240 And the incentives are [INAUDIBLE]. 1368 01:07:09,240 --> 01:07:11,323 These are like-- they're called the green players. 1369 01:07:11,323 --> 01:07:14,100 And one person is unable to invest, 1370 01:07:14,100 --> 01:07:17,400 but asks one of the three to invest. 1371 01:07:17,400 --> 01:07:20,010 Importantly, the requests are not even binding. 1372 01:07:20,010 --> 01:07:20,970 That's the red player. 1373 01:07:20,970 --> 01:07:23,400 The red player, essentially, is incentivized 1374 01:07:23,400 --> 01:07:25,890 for the investment to happen. 1375 01:07:25,890 --> 01:07:27,760 So this is kind of what this looks like. 1376 01:07:27,760 --> 01:07:29,760 You would get like three different players here. 1377 01:07:29,760 --> 01:07:32,977 And you can essentially decide whom you would like to ask. 1378 01:07:32,977 --> 01:07:35,310 And it's set up, of course, that you can see the gender. 1379 01:07:35,310 --> 01:07:38,470 Some are male and some are female. 1380 01:07:38,470 --> 01:07:41,520 Now women are way more likely to be asked. 1381 01:07:41,520 --> 01:07:44,220 This is the total times asked to invest 1382 01:07:44,220 --> 01:07:45,688 for male and female players. 1383 01:07:45,688 --> 01:07:46,980 This is the relative frequency. 1384 01:07:46,980 --> 01:07:49,200 And essentially, you can see the distribution here 1385 01:07:49,200 --> 01:07:54,090 is shifted way to the right for women compared to men. 1386 01:07:54,090 --> 01:07:57,030 Now is it better to ask a woman? 1387 01:07:57,030 --> 01:07:59,790 Well, absent a request, the investment rate actually 1388 01:07:59,790 --> 01:08:01,680 does not differ by gender. 1389 01:08:01,680 --> 01:08:05,170 But when asked to invest, women are more likely to comply. 1390 01:08:05,170 --> 01:08:08,040 So when a woman is asked to invest, 76% of women 1391 01:08:08,040 --> 01:08:10,440 invest compared to 14% when not asked. 1392 01:08:10,440 --> 01:08:13,350 In contrast, men are also more likely to invest when asked, 1393 01:08:13,350 --> 01:08:16,930 but it's only 51% compared to 14% when not asked. 1394 01:08:16,930 --> 01:08:20,700 So the marginal increase from being asked is higher for women 1395 01:08:20,700 --> 01:08:21,731 than for men. 1396 01:08:21,731 --> 01:08:23,939 So if you're thinking about, like, whom should we ask? 1397 01:08:23,939 --> 01:08:25,890 Well, you kind of want to ask a woman 1398 01:08:25,890 --> 01:08:30,300 because a woman is more likely to comply with your request. 1399 01:08:30,300 --> 01:08:32,430 And so now, since women 1400 01:08:32,430 --> 01:08:37,723 are expected to be more likely to say yes, they also get asked more. 1401 01:08:37,723 --> 01:08:40,140 So in that sort of sense, you know, the gender differences 1402 01:08:40,140 --> 01:08:43,979 get amplified by increased demand for women to contribute. 1403 01:08:43,979 --> 01:08:46,359 Well, particularly, if there's a man and a woman where 1404 01:08:46,359 --> 01:08:48,750 either one of them could do it, 1405 01:08:48,750 --> 01:08:51,090 the man will think, well, the woman will do it anyway. 1406 01:08:51,090 --> 01:08:53,507 The woman will think, well, the man is not going to do it. 1407 01:08:53,507 --> 01:08:55,720 And he's going to think that I will do it anyway. 1408 01:08:55,720 --> 01:08:58,740 And so then the woman ends up doing it more.
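To see the amplification in numbers, here is a quick back-of-the-envelope calculation using the compliance rates just mentioned (my own arithmetic for illustration, not a table from the paper):

# Investment rates reported above
p_unasked = 0.14          # baseline investment rate when not asked
p_asked_woman = 0.76      # women invest 76% of the time when asked
p_asked_man = 0.51        # men invest 51% of the time when asked

gain_woman = p_asked_woman - p_unasked    # +0.62
gain_man = p_asked_man - p_unasked        # +0.37

print(f"Asking a woman raises the chance of investment by {gain_woman:.0%}")
print(f"Asking a man raises the chance of investment by {gain_man:.0%}")
# A requester who just wants the task done therefore targets the woman,
# which raises how often women are asked and amplifies the gender gap in
# who ends up doing non-promotable tasks.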
1409 01:08:58,740 --> 01:09:02,609 In addition, suppose there's an opportunity to ask somebody, 1410 01:09:02,609 --> 01:09:04,500 but asking is kind of costly and you just 1411 01:09:04,500 --> 01:09:06,390 want to find somebody who will do it. 1412 01:09:06,390 --> 01:09:07,450 Who will you ask? 1413 01:09:07,450 --> 01:09:08,310 You ask the woman. 1414 01:09:08,310 --> 01:09:13,850 Because she's more likely to comply, more likely to say yes. 1415 01:09:13,850 --> 01:09:17,340 Let me sort of summarize what we discussed. 1416 01:09:17,340 --> 01:09:21,020 So first, there are large gender wage and earnings gaps. 1417 01:09:21,020 --> 01:09:24,564 They have been reduced due to technological advances 1418 01:09:24,564 --> 01:09:25,939 and other improvements over time. 1419 01:09:25,939 --> 01:09:30,950 But there are still persistent gender differences in the US 1420 01:09:30,950 --> 01:09:32,689 and many other countries. 1421 01:09:32,689 --> 01:09:35,120 Biased beliefs and identity concerns 1422 01:09:35,120 --> 01:09:40,729 play a major role in explaining these differences. 1423 01:09:40,729 --> 01:09:42,740 In addition, there's some feedback mechanisms 1424 01:09:42,740 --> 01:09:45,740 between the demand and supply of non-promotable tasks 1425 01:09:45,740 --> 01:09:48,413 that could potentially be quite important. 1426 01:09:48,413 --> 01:09:50,330 Of course, I showed you only a lab experiment, 1427 01:09:50,330 --> 01:09:51,788 but these could be really important 1428 01:09:51,788 --> 01:09:53,550 in real-world situations. 1429 01:09:53,550 --> 01:09:55,460 So what matters 1430 01:09:55,460 --> 01:09:58,040 is not only what women decide to do, but also what they're 1431 01:09:58,040 --> 01:10:00,650 asked to do by others, in part as a response 1432 01:10:00,650 --> 01:10:05,060 to their propensity to say yes or no to certain tasks. 1433 01:10:05,060 --> 01:10:06,620 Better understanding these issues 1434 01:10:06,620 --> 01:10:08,622 can help us mitigate the gender gap. 1435 01:10:08,622 --> 01:10:10,330 And by understanding 1436 01:10:10,330 --> 01:10:12,122 these issues, and in particular beliefs, better, 1437 01:10:12,122 --> 01:10:16,970 we might be able to improve or correct biased beliefs 1438 01:10:16,970 --> 01:10:21,600 and, therefore, potentially close the gender gap. 1439 01:10:21,600 --> 01:10:24,240 What's next? In the next few lectures: in lecture 19, 1440 01:10:24,240 --> 01:10:26,220 we'll talk about frames, defaults, nudges, 1441 01:10:26,220 --> 01:10:27,360 and mental accounting. 1442 01:10:27,360 --> 01:10:29,870 Please read Madrian and Shea 2001. 1443 01:10:29,870 --> 01:10:31,680 And in lecture 20, we'll talk 1444 01:10:31,680 --> 01:10:36,060 about malleability and inaccessibility of preferences. 1445 01:10:36,060 --> 01:10:39,930 Please read Ariely 2003 for that. 1446 01:10:39,930 --> 01:10:41,470 That was all I have to say. 1447 01:10:41,470 --> 01:10:43,520 Thank you so much.