1 00:00:00,000 --> 00:00:02,520 The following content is provided under a Creative 2 00:00:02,520 --> 00:00:03,970 Commons license. 3 00:00:03,970 --> 00:00:06,360 Your support will help MIT OpenCourseWare 4 00:00:06,360 --> 00:00:10,660 continue to offer high quality educational resources for free. 5 00:00:10,660 --> 00:00:13,320 To make a donation or view additional materials 6 00:00:13,320 --> 00:00:16,670 from hundreds of MIT courses, visit MIT OpenCourseWare 7 00:00:16,670 --> 00:00:18,080 at ocw.mit.edu. 8 00:00:26,460 --> 00:00:28,200 PROFESSOR: We have the privilege today 9 00:00:28,200 --> 00:00:30,300 of hearing from Professor Susan Silbey. 10 00:00:30,300 --> 00:00:33,120 Professor Silbey taught this class 11 00:00:33,120 --> 00:00:37,230 with Professor [? Dysart ?] and me in the last two years. 12 00:00:37,230 --> 00:00:42,690 And she is a sociologist, political scientist, 13 00:00:42,690 --> 00:00:46,920 anthropologist, head of the anthropology section. 14 00:00:46,920 --> 00:00:50,220 And obviously brings a different perspective 15 00:00:50,220 --> 00:00:56,010 to a number of these issues than I, a poor country economist, do. 16 00:00:56,010 --> 00:00:58,710 And as I said last time, I showed 17 00:00:58,710 --> 00:01:01,980 you a little bit of the economics of energy demand, 18 00:01:01,980 --> 00:01:05,220 and then suggested at the end of the lecture, 19 00:01:05,220 --> 00:01:08,950 by talking about dollars on the sidewalk 20 00:01:08,950 --> 00:01:12,240 not picked up in energy efficiency on the one hand, 21 00:01:12,240 --> 00:01:15,330 and people aggressively investing in energy 22 00:01:15,330 --> 00:01:17,800 efficiency on the other hand. 23 00:01:17,800 --> 00:01:21,720 Perhaps the notion of tangible energy services 24 00:01:21,720 --> 00:01:26,490 wasn't quite rich enough to explain energy demand. 25 00:01:26,490 --> 00:01:29,880 And now, it will all be explained. 
26 00:01:29,880 --> 00:01:35,590 SUSAN SILBEY: Well, hello, it's rather a tall order. 27 00:01:35,590 --> 00:01:37,230 But we'll try. 28 00:01:37,230 --> 00:01:41,460 On Monday, Dick talked to you about the economics 29 00:01:41,460 --> 00:01:43,260 of energy demand. 30 00:01:43,260 --> 00:01:47,490 He talked about how demand for energy 31 00:01:47,490 --> 00:01:52,170 is derived from demand for products and services. 32 00:01:52,170 --> 00:01:54,960 And so we talk about it as mediated 33 00:01:54,960 --> 00:01:58,290 or derived demand, which requires some adjustments 34 00:01:58,290 --> 00:02:01,410 in the basic economic models. 35 00:02:01,410 --> 00:02:04,320 He also mentioned, I believe, that variations 36 00:02:04,320 --> 00:02:10,020 in derived demand depend upon fixed capital investments, as 37 00:02:10,020 --> 00:02:14,640 well as short and long run expectations. 38 00:02:14,640 --> 00:02:20,160 And that, because of those short and long run expectations, 39 00:02:20,160 --> 00:02:23,250 income and price elasticities are generally 40 00:02:23,250 --> 00:02:27,240 limited, unlike in other products and services, 41 00:02:27,240 --> 00:02:29,970 thus constraining the ability 42 00:02:29,970 --> 00:02:34,060 to maximize with efficient responses. 43 00:02:34,060 --> 00:02:35,940 And as Dick just mentioned again, 44 00:02:35,940 --> 00:02:41,130 people do not always maximize or optimize, let me say, 45 00:02:41,130 --> 00:02:42,180 their decisions. 46 00:02:42,180 --> 00:02:44,160 There's lots of possible efficiencies 47 00:02:44,160 --> 00:02:45,240 that are not pursued. 48 00:02:45,240 --> 00:02:48,990 And he mentioned, I believe, on Monday, leaving 49 00:02:48,990 --> 00:02:51,750 those dollars on the sidewalk. 50 00:02:51,750 --> 00:02:54,990 Because of, perhaps, imperfect information, 51 00:02:54,990 --> 00:02:58,830 riskiness and uncertainty of future savings-- 52 00:02:58,830 --> 00:03:01,470 could be a lot of other reasons too. 
53 00:03:01,470 --> 00:03:03,780 And as he mentioned again, just now, 54 00:03:03,780 --> 00:03:06,630 people might be over investing. 55 00:03:06,630 --> 00:03:09,540 Now, let's think about what that means. 56 00:03:09,540 --> 00:03:13,380 To over invest in efficiency means, by economic models, 57 00:03:13,380 --> 00:03:19,050 you're not going to get a specific dollar return, 58 00:03:19,050 --> 00:03:20,640 the Prius being an example. 59 00:03:20,640 --> 00:03:25,710 But there might be other reasons why people might do it. 60 00:03:25,710 --> 00:03:32,310 That is, go for more efficiency than is defined economically. 61 00:03:32,310 --> 00:03:37,200 So why do we see economically non-optimal decisions? 62 00:03:37,200 --> 00:03:40,200 That's the $64,000 question. 63 00:03:40,200 --> 00:03:42,090 How do people act? 64 00:03:42,090 --> 00:03:45,040 Why do people do what they do? 65 00:03:45,040 --> 00:03:47,730 Why are they not, in the economic model, 66 00:03:47,730 --> 00:03:50,910 perfectly rational actors? 67 00:03:50,910 --> 00:03:54,000 Now, it is the mission of all of social science 68 00:03:54,000 --> 00:03:58,050 to try to describe human behavior. 69 00:03:58,050 --> 00:04:02,550 But the explanations that come from economics 70 00:04:02,550 --> 00:04:05,280 are not necessarily the same as those that come 71 00:04:05,280 --> 00:04:07,380 from the other social sciences. 72 00:04:07,380 --> 00:04:11,970 How many of you have had a course in economics 73 00:04:11,970 --> 00:04:14,230 before this one? 74 00:04:14,230 --> 00:04:17,408 Oh, lots of you, because it's required for this course. 75 00:04:17,408 --> 00:04:17,950 That's right. 76 00:04:17,950 --> 00:04:19,390 But I didn't see every hand. 77 00:04:19,390 --> 00:04:23,890 I was wondering how accurate this survey was going to be. 78 00:04:23,890 --> 00:04:28,720 How many of you have had a political science class? 79 00:04:28,720 --> 00:04:29,840 Let me see those hands. 
80 00:04:29,840 --> 00:04:34,720 I can't see, one, two, three, four, five, six. 81 00:04:34,720 --> 00:04:37,780 How many-- oh, did I miss some over here? 82 00:04:37,780 --> 00:04:38,860 Seven, eight. 83 00:04:38,860 --> 00:04:41,935 OK, how many have had a history class? 84 00:04:44,700 --> 00:04:46,710 Wow, that's pretty good. 85 00:04:46,710 --> 00:04:52,470 How many have had an anthropology class? 86 00:04:52,470 --> 00:04:53,930 Zero, right? 87 00:04:53,930 --> 00:04:56,790 Oh, one, two. 88 00:04:56,790 --> 00:05:01,170 And we don't offer sociology here for undergraduates, only 89 00:05:01,170 --> 00:05:04,570 really for graduate students. 90 00:05:04,570 --> 00:05:09,830 So this-- why do people do what they do? 91 00:05:09,830 --> 00:05:13,680 It turns out that each of the social sciences 92 00:05:13,680 --> 00:05:17,370 has a slightly different answer. 93 00:05:17,370 --> 00:05:21,000 And so by way of, well, for those of you who 94 00:05:21,000 --> 00:05:25,380 have had a political science class, how is it 95 00:05:25,380 --> 00:05:27,360 different from economics? 96 00:05:31,330 --> 00:05:34,052 Nobody wants-- Yeah, Andrew? 97 00:05:34,052 --> 00:05:36,890 AUDIENCE: No equations and stuff like that. 98 00:05:36,890 --> 00:05:39,540 SUSAN SILBEY: Well, that turns out not to be true. 99 00:05:39,540 --> 00:05:42,680 It just isn't in the-- 100 00:05:42,680 --> 00:05:45,080 it's just not in the undergraduate class. 101 00:05:45,080 --> 00:05:46,800 There are plenty of equations. 102 00:05:46,800 --> 00:05:49,340 There are plenty of equations in sociology, too, 103 00:05:49,340 --> 00:05:51,660 not in anthropology. 104 00:05:51,660 --> 00:05:54,510 OK, so maybe I need to do this. 105 00:05:54,510 --> 00:05:57,020 So if you think of the social sciences 106 00:05:57,020 --> 00:06:02,130 as the study of human behavior, there is a division of labor. 
107 00:06:02,130 --> 00:06:05,510 Now, this division of labor can be simply explained, 108 00:06:05,510 --> 00:06:10,190 but gets a little complicated when you go into detail. 109 00:06:10,190 --> 00:06:14,570 So we might think of the most basic and simplest 110 00:06:14,570 --> 00:06:18,320 social science as psychology, which we almost 111 00:06:18,320 --> 00:06:20,540 don't have anymore, which is about 112 00:06:20,540 --> 00:06:24,950 what goes on inside people's heads. 113 00:06:24,950 --> 00:06:29,810 Psychology being the study of a person, 114 00:06:29,810 --> 00:06:36,380 and explaining what this person does by looking inside. 115 00:06:36,380 --> 00:06:42,020 And we have various terms for talking about what's inside 116 00:06:42,020 --> 00:06:45,410 people's heads, their desires, we talk about the id, which 117 00:06:45,410 --> 00:06:49,550 is the desires you have, and the ego, which is the control, 118 00:06:49,550 --> 00:06:53,910 and the notion of a constraint on what you do. 119 00:06:53,910 --> 00:06:55,190 But then we have-- 120 00:06:55,190 --> 00:07:00,980 and I would put psychology by itself as different 121 00:07:00,980 --> 00:07:03,590 from the rest of the social sciences. 122 00:07:03,590 --> 00:07:08,360 And then we have sociology, which 123 00:07:08,360 --> 00:07:11,690 says that it is interested in what 124 00:07:11,690 --> 00:07:17,930 goes on between two or more persons, never a person alone. 125 00:07:17,930 --> 00:07:22,610 And the unit of analysis, or the subject, is the interaction. 126 00:07:22,610 --> 00:07:26,840 Not what's in here, but what is exchanged 127 00:07:26,840 --> 00:07:28,800 between the two of them. 128 00:07:28,800 --> 00:07:31,160 Now, some people would say that sociology 129 00:07:31,160 --> 00:07:33,680 is the queen, because everything follows 130 00:07:33,680 --> 00:07:36,260 from these interactions. 131 00:07:36,260 --> 00:07:39,590 Because the interactions can take various patterns. 
132 00:07:39,590 --> 00:07:44,750 So they can be interactions about production 133 00:07:44,750 --> 00:07:47,390 and distribution. 134 00:07:47,390 --> 00:07:49,412 And then we get economics. 135 00:07:52,550 --> 00:07:55,400 If you can think about it as the interactions 136 00:07:55,400 --> 00:07:58,130 where some people are making things 137 00:07:58,130 --> 00:08:00,500 that other people are receiving. 138 00:08:00,500 --> 00:08:04,400 And it happens through this thing called the market. 139 00:08:04,400 --> 00:08:08,330 Which is sort of an invisible black box. 140 00:08:08,330 --> 00:08:11,540 Then we might have interactions that have 141 00:08:11,540 --> 00:08:14,870 to do with force, violence. 142 00:08:14,870 --> 00:08:17,090 And when that's concentrated, we call 143 00:08:17,090 --> 00:08:22,160 that the state, which has a monopoly of force. 144 00:08:22,160 --> 00:08:24,050 And that's what political science studies. 145 00:08:27,850 --> 00:08:32,650 So here, we have four social sciences 146 00:08:32,650 --> 00:08:41,020 leaving the study of the circulation of the signs. 147 00:08:41,020 --> 00:08:42,669 This one's more complicated. 148 00:08:42,669 --> 00:08:44,230 That's probably why only two of you 149 00:08:44,230 --> 00:08:47,470 took the classes, because you don't understand what it is. 150 00:08:47,470 --> 00:08:53,140 But it's the circulation of the signs and symbols among people. 151 00:08:53,140 --> 00:08:56,350 And we call that culture. 152 00:08:56,350 --> 00:08:59,320 And that's what anthropology studies, 153 00:08:59,320 --> 00:09:01,600 the practices, and the production, 154 00:09:01,600 --> 00:09:04,510 and circulation of signs. 155 00:09:04,510 --> 00:09:07,330 So there you have the social sciences. 156 00:09:07,330 --> 00:09:10,720 Now, that's a very simple, easy distinction. 157 00:09:10,720 --> 00:09:16,660 But like most things in life, that's too simple. 
158 00:09:16,660 --> 00:09:21,580 And in any place-- 159 00:09:21,580 --> 00:09:23,470 oh, I forgot, you're not supposed to. 160 00:09:23,470 --> 00:09:27,850 Oh, I forgot, I forgot, I forgot. 161 00:09:27,850 --> 00:09:29,470 No human power here. 162 00:09:32,920 --> 00:09:41,500 OK, so we have political science, which 163 00:09:41,500 --> 00:09:46,270 studies force and the state. 164 00:09:46,270 --> 00:09:50,480 But it overlaps with some sociology, 165 00:09:50,480 --> 00:09:52,630 which looks at the patterns of interactions. 166 00:09:52,630 --> 00:09:55,300 And there's something called political sociology. 167 00:09:55,300 --> 00:10:01,150 And then some of sociology studies production 168 00:10:01,150 --> 00:10:02,350 and distribution. 169 00:10:02,350 --> 00:10:05,050 And we call that economic sociology. 170 00:10:05,050 --> 00:10:10,690 And some of economics is now interested in how 171 00:10:10,690 --> 00:10:15,490 people's desires and behaviors affect their decision making. 172 00:10:15,490 --> 00:10:18,820 We call that behavioral economics. 173 00:10:18,820 --> 00:10:23,480 And that overlaps with psychology and anthropology. 174 00:10:23,480 --> 00:10:25,000 So you might just think about what 175 00:10:25,000 --> 00:10:29,530 is actually studied as having normal distributions that 176 00:10:29,530 --> 00:10:30,640 overlap. 177 00:10:30,640 --> 00:10:32,920 And depending-- lots of people are 178 00:10:32,920 --> 00:10:37,990 studying in these places, which we call interdisciplinary. 179 00:10:37,990 --> 00:10:40,480 And that's how you get a class like this. 180 00:10:40,480 --> 00:10:47,650 OK, so why do people, or how do people-- 181 00:10:47,650 --> 00:10:53,950 how do people deal with these questions about energy? 182 00:10:53,950 --> 00:10:56,270 How do they make the decisions? 
183 00:10:56,270 --> 00:11:01,090 So what I'm going to try to do today, and this is my outline, 184 00:11:01,090 --> 00:11:05,200 is to go through some basic models, 185 00:11:05,200 --> 00:11:09,430 some very general models, about how people make decisions, 186 00:11:09,430 --> 00:11:13,960 the variations and the motives, we might say, 187 00:11:13,960 --> 00:11:16,480 or the vocabularies of motive. 188 00:11:16,480 --> 00:11:19,300 I'm going to look at some results 189 00:11:19,300 --> 00:11:24,010 from cognitive science and sociology of decision making, 190 00:11:24,010 --> 00:11:26,380 going to look at a few experiments. 191 00:11:26,380 --> 00:11:29,320 And then, maybe, we can end, if time permits, 192 00:11:29,320 --> 00:11:32,890 with some discussion of what happens 193 00:11:32,890 --> 00:11:35,590 when we focus on the individual and the ironies 194 00:11:35,590 --> 00:11:36,970 of popular psychology. 195 00:11:36,970 --> 00:11:38,470 I'm not sure we'll get there. 196 00:11:38,470 --> 00:11:40,510 In future classes, I know you are 197 00:11:40,510 --> 00:11:43,480 going to discuss decision making in firms 198 00:11:43,480 --> 00:11:47,530 and organizations, which are aggregates, patterned 199 00:11:47,530 --> 00:11:48,640 aggregates. 200 00:11:48,640 --> 00:11:51,730 Not one, but the pattern by which people are 201 00:11:51,730 --> 00:11:54,350 arranged in their activities. 202 00:11:54,350 --> 00:11:56,590 Those are organizations and firms. 203 00:11:56,590 --> 00:11:59,450 And we will look at states, or you will, 204 00:11:59,450 --> 00:12:03,250 in which we make public policies. 205 00:12:03,250 --> 00:12:10,580 So how do people make decisions? 206 00:12:10,580 --> 00:12:15,400 How do we make decisions about right and wrong? 207 00:12:15,400 --> 00:12:17,620 What kind of car to buy? 208 00:12:17,620 --> 00:12:22,810 Why are people buying that Prius even though they'll never 209 00:12:22,810 --> 00:12:23,875 recoup the savings? 
210 00:12:29,520 --> 00:12:35,790 Why do we think that skinny women are beautiful now? 211 00:12:35,790 --> 00:12:38,550 And 100 years ago, we thought that heavy women 212 00:12:38,550 --> 00:12:40,380 were beautiful? 213 00:12:40,380 --> 00:12:43,616 Maybe none of you know that that was the case? 214 00:12:43,616 --> 00:12:45,270 AUDIENCE: Advertising. 215 00:12:45,270 --> 00:12:47,210 SUSAN SILBEY: OK, advertising. 216 00:12:56,250 --> 00:12:58,530 Why might that-- but what about the advertising? 217 00:12:58,530 --> 00:13:01,350 What did the advertising say? 218 00:13:01,350 --> 00:13:03,660 And was it advertising 100 years ago? 219 00:13:06,560 --> 00:13:08,140 Yeah. 220 00:13:08,140 --> 00:13:10,930 AUDIENCE: There's some predominately-- 221 00:13:10,930 --> 00:13:12,940 fashion advertising of the '50s and '60s 222 00:13:12,940 --> 00:13:17,185 switched from a more voluptuous sexual icon 223 00:13:17,185 --> 00:13:19,050 to something really skinny. 224 00:13:19,050 --> 00:13:20,700 There's one person in particular named 225 00:13:20,700 --> 00:13:25,100 Twiggy who kind of became a very famous, very skinny fashion 226 00:13:25,100 --> 00:13:25,600 model. 227 00:13:25,600 --> 00:13:27,370 And everything from that kind of followed. 228 00:13:27,370 --> 00:13:29,530 SUSAN SILBEY: Yeah, but it was in the 19th century, 229 00:13:29,530 --> 00:13:31,360 before there was advertising. 230 00:13:31,360 --> 00:13:34,014 Yes, David. 231 00:13:34,014 --> 00:13:37,140 AUDIENCE: I'd say there's also a change in the signaling 232 00:13:37,140 --> 00:13:41,440 about what you're buying. 233 00:13:41,440 --> 00:13:44,300 If you go further back, there are 234 00:13:44,300 --> 00:13:47,680 depictions of larger, heavier people as being wealthy, 235 00:13:47,680 --> 00:13:50,000 which is something [INAUDIBLE]. 236 00:13:50,000 --> 00:13:52,120 Which was, you didn't have to work in fields. 237 00:13:52,120 --> 00:13:53,620 You had enough to eat. 
238 00:13:53,620 --> 00:13:55,975 [INAUDIBLE] 239 00:13:55,975 --> 00:13:57,100 SUSAN SILBEY: That's right. 240 00:13:57,100 --> 00:14:01,840 And the signaling was, in the 19th century, 241 00:14:01,840 --> 00:14:04,990 to have heft on you meant you were wealthy enough 242 00:14:04,990 --> 00:14:06,010 to eat well. 243 00:14:06,010 --> 00:14:08,090 Because most people did not. 244 00:14:08,090 --> 00:14:10,520 So this is just a little exercise. 245 00:14:10,520 --> 00:14:12,430 And this is about culture. 246 00:14:12,430 --> 00:14:16,660 This is about the circulating signs, the signaling. 247 00:14:16,660 --> 00:14:18,820 So you've all got it. 248 00:14:18,820 --> 00:14:19,540 You got it. 249 00:14:19,540 --> 00:14:23,740 The culture encourages us to want certain things, 250 00:14:23,740 --> 00:14:26,170 to think they're beautiful. 251 00:14:26,170 --> 00:14:27,550 Is there anything else? 252 00:14:27,550 --> 00:14:29,590 None of you believe-- 253 00:14:29,590 --> 00:14:31,140 AUDIENCE: Education as well. 254 00:14:31,140 --> 00:14:35,300 SUSAN SILBEY: You're all so socialized, it's fantastic. 255 00:14:35,300 --> 00:14:38,380 AUDIENCE: If we put that slide we had up on Monday 256 00:14:38,380 --> 00:14:41,020 that showed you that you're not going to make your money back 257 00:14:41,020 --> 00:14:43,103 from buying a Prius, I think that would deter people 258 00:14:43,103 --> 00:14:44,080 from buying them. 259 00:14:44,080 --> 00:14:48,730 And similarly with things like obesity in the US, 260 00:14:48,730 --> 00:14:53,470 while being as skinny as Twiggy might not be healthy, 261 00:14:53,470 --> 00:14:55,540 generally, more fit would be more healthy 262 00:14:55,540 --> 00:14:57,620 due to the knowledge we have now that we might not 263 00:14:57,620 --> 00:14:59,470 have had 100 years ago. 264 00:14:59,470 --> 00:15:01,570 SUSAN SILBEY: That's right, that's right. 265 00:15:01,570 --> 00:15:02,480 You've got it. 
266 00:15:02,480 --> 00:15:05,860 But there's another set of reasons people often offer. 267 00:15:05,860 --> 00:15:07,720 I'll go back. 268 00:15:07,720 --> 00:15:09,580 Well, you know the other reasons? 269 00:15:09,580 --> 00:15:14,200 None of you have any other reasons why we do what we do? 270 00:15:14,200 --> 00:15:17,260 Yeah, is that Matthew? 271 00:15:17,260 --> 00:15:21,050 AUDIENCE: They saw other people doing the same thing. 272 00:15:21,050 --> 00:15:26,050 [INAUDIBLE] 273 00:15:26,050 --> 00:15:29,740 SUSAN SILBEY: These are all sociological and 274 00:15:29,740 --> 00:15:31,090 anthropological reasons. 275 00:15:31,090 --> 00:15:34,630 And I just love it, as being head of that department. 276 00:15:34,630 --> 00:15:38,620 But I usually get, not necessarily in this class, 277 00:15:38,620 --> 00:15:40,970 I think you've all learned quite well, 278 00:15:40,970 --> 00:15:46,600 many students tell me that they are programmed 279 00:15:46,600 --> 00:15:50,320 to want and do certain things. 280 00:15:50,320 --> 00:15:52,830 And I'm so surprised MIT students 281 00:15:52,830 --> 00:15:53,830 not coming up with that. 282 00:15:53,830 --> 00:15:57,280 Because when I teach sociology or anthropology class, 283 00:15:57,280 --> 00:16:00,790 this is the one that comes up first. 284 00:16:00,790 --> 00:16:02,857 None of you believe that, anymore? 285 00:16:05,780 --> 00:16:08,902 It's not in your DNA. 286 00:16:08,902 --> 00:16:10,390 Fantastic, I love it. 287 00:16:10,390 --> 00:16:10,890 I love it. 288 00:16:13,700 --> 00:16:14,665 I think it's great. 289 00:16:18,130 --> 00:16:20,223 Yeah? 290 00:16:20,223 --> 00:16:22,390 AUDIENCE: I mean, there's evolutionary psychologists 291 00:16:22,390 --> 00:16:26,640 who would argue that we're attracted to certain things 292 00:16:26,640 --> 00:16:29,035 based on reproductive purposes. 293 00:16:34,270 --> 00:16:35,640 AUDIENCE: But not a Prius. 294 00:16:35,640 --> 00:16:39,210 AUDIENCE: But not a Prius. 
295 00:16:39,210 --> 00:16:43,820 [INAUDIBLE] the hourglass shape. 296 00:16:43,820 --> 00:16:44,772 Wider hips. 297 00:16:49,290 --> 00:16:52,422 [INAUDIBLE] 298 00:16:52,422 --> 00:16:54,630 SUSAN SILBEY: I should have worn something different. 299 00:16:54,630 --> 00:17:02,598 OK, all right, so what I want to offer before we go further is-- 300 00:17:02,598 --> 00:17:04,140 Oh, did you have your hand up, David? 301 00:17:04,140 --> 00:17:07,440 AUDIENCE: I was going to say, you posed the question why 302 00:17:07,440 --> 00:17:11,990 did the appeal change from women that were rounder to women that 303 00:17:11,990 --> 00:17:14,928 were skinnier? 304 00:17:14,928 --> 00:17:16,470 SUSAN SILBEY: Except for their boobs. 305 00:17:20,359 --> 00:17:22,609 AUDIENCE: So DNA kind of doesn't account 306 00:17:22,609 --> 00:17:24,979 for that, because your DNA didn't change over 307 00:17:24,979 --> 00:17:26,910 the course of 50 years or 100 years. 308 00:17:26,910 --> 00:17:33,690 But maybe what is hardcoded in DNA is that [INAUDIBLE] 309 00:17:33,690 --> 00:17:37,920 or able to support themselves, raise children, or whatever. 310 00:17:37,920 --> 00:17:42,870 So maybe the ideal, what it is in the social, 311 00:17:42,870 --> 00:17:44,580 this woman is fit. 312 00:17:44,580 --> 00:17:47,160 So that means that she's more capable? 313 00:17:47,160 --> 00:17:50,910 SUSAN SILBEY: So maybe it's fit versus 314 00:17:50,910 --> 00:17:55,110 the specific definition of fit. 315 00:17:55,110 --> 00:17:58,650 AUDIENCE: Exactly, yeah, so part of it comes from the DNA 316 00:17:58,650 --> 00:17:59,800 that you're seeking. 317 00:17:59,800 --> 00:18:02,340 But the social kind of defines what exactly-- 318 00:18:02,340 --> 00:18:03,700 SUSAN SILBEY: That means. 319 00:18:03,700 --> 00:18:05,760 Well, that's a very interesting observation. 320 00:18:05,760 --> 00:18:08,670 Because there's lots of good social theory 321 00:18:08,670 --> 00:18:11,850 which suggests that about lots of concepts. 
322 00:18:11,850 --> 00:18:16,110 That many-- all societies have some form 323 00:18:16,110 --> 00:18:21,430 of marriage system and family system for raising children. 324 00:18:21,430 --> 00:18:28,320 So you might say that there is a human system of reproduction 325 00:18:28,320 --> 00:18:32,130 in the social relations, in addition to the biological. 326 00:18:32,130 --> 00:18:34,830 But, so we say there is something called 327 00:18:34,830 --> 00:18:37,260 the institution of the family. 328 00:18:37,260 --> 00:18:40,560 But that varies from society to society. 329 00:18:40,560 --> 00:18:43,650 And that's the same kind-- the specificity varies. 330 00:18:43,650 --> 00:18:45,660 Though there could be something more general. 331 00:18:45,660 --> 00:18:47,340 And that's a good introduction to what 332 00:18:47,340 --> 00:18:49,320 I want to do for a few minutes. 333 00:18:49,320 --> 00:18:55,050 Is I want to suggest to you that there are a few general models 334 00:18:55,050 --> 00:19:00,900 of human action, which I sort of suggested in the differences 335 00:19:00,900 --> 00:19:02,880 among the social sciences. 336 00:19:02,880 --> 00:19:06,300 And we use different concepts and terminology 337 00:19:06,300 --> 00:19:09,900 to describe what is observable in human behavior. 338 00:19:09,900 --> 00:19:12,060 And these social science theories 339 00:19:12,060 --> 00:19:16,560 are not always, as Andrew suggested at the outset, 340 00:19:16,560 --> 00:19:19,860 quantified into equations. 341 00:19:19,860 --> 00:19:23,790 That economics begins and ends with those equations. 342 00:19:23,790 --> 00:19:26,190 The other social sciences haven't all 343 00:19:26,190 --> 00:19:29,590 developed into equations for most of what they do. 344 00:19:29,590 --> 00:19:32,100 But you will find it in some. 345 00:19:32,100 --> 00:19:34,290 In all of them, there'll be some. 
346 00:19:34,290 --> 00:19:38,350 So I want to show you, for example, 347 00:19:38,350 --> 00:19:40,830 the equation you had on Monday. 348 00:19:40,830 --> 00:19:44,010 That's an economic one about derived demand. 349 00:19:44,010 --> 00:19:48,090 Well, here's one you all may or may not know. 350 00:19:48,090 --> 00:19:50,220 This is one of my favorite ones. 351 00:19:50,220 --> 00:19:54,420 Because it's the field that I'm a specialist in. 352 00:19:54,420 --> 00:19:59,760 And it says that formal social control is inversely related 353 00:19:59,760 --> 00:20:04,490 to informal social control. 354 00:20:04,490 --> 00:20:12,500 Social control is what we do to manage other people's behavior, 355 00:20:12,500 --> 00:20:13,940 or to manage behavior. 356 00:20:13,940 --> 00:20:17,390 And formal social control simply means government. 357 00:20:17,390 --> 00:20:19,610 And informal means everything that's 358 00:20:19,610 --> 00:20:23,750 not the government, that's not the legitimate use of force. 359 00:20:23,750 --> 00:20:29,150 And the more you have of informal norms, 360 00:20:29,150 --> 00:20:33,560 informal culture, the less you have of government. 361 00:20:33,560 --> 00:20:35,390 That's what this theory says. 362 00:20:35,390 --> 00:20:37,610 So one varies inversely. 363 00:20:37,610 --> 00:20:39,980 And sociologists have been trying 364 00:20:39,980 --> 00:20:44,760 to validate this theory for a long time. 365 00:20:44,760 --> 00:20:50,510 OK, theories without equations are an organization 366 00:20:50,510 --> 00:20:53,930 of words that are clearly defined, 367 00:20:53,930 --> 00:20:58,760 that try to make explicit what is often 368 00:20:58,760 --> 00:21:01,850 tacit, that is, unspoken. 
369 00:21:01,850 --> 00:21:04,880 And in social science, our task is 370 00:21:04,880 --> 00:21:11,210 to make explicit and describable through systematic 371 00:21:11,210 --> 00:21:15,830 observations, through transparent methods, what 372 00:21:15,830 --> 00:21:17,850 goes on in human behavior. 373 00:21:17,850 --> 00:21:22,100 So Andrew's first answer to me was more methodological 374 00:21:22,100 --> 00:21:23,870 than it was conceptual. 375 00:21:23,870 --> 00:21:26,960 And what I offered in my distinctions 376 00:21:26,960 --> 00:21:29,690 in the social sciences was more conceptual. 377 00:21:29,690 --> 00:21:33,320 The methods we use turn out to be almost the same 378 00:21:33,320 --> 00:21:35,180 across all the social sciences. 379 00:21:35,180 --> 00:21:39,290 We observe, we ask, we count. 380 00:21:39,290 --> 00:21:41,690 There are a variety of ways of doing that. 381 00:21:41,690 --> 00:21:47,630 OK, so how do people make decisions? 382 00:21:47,630 --> 00:21:53,330 Well, I tried to get from you two different notions, more 383 00:21:53,330 --> 00:21:58,190 cultural and institutional versus biological. 384 00:21:58,190 --> 00:22:04,850 But even if you look in the non-biological explanations, 385 00:22:04,850 --> 00:22:10,280 there are sort of patterns that come up in the explanations 386 00:22:10,280 --> 00:22:12,020 that people give. 387 00:22:12,020 --> 00:22:17,510 So one pattern is the psychological one, 388 00:22:17,510 --> 00:22:24,710 that people act on the basis of individual desires, wills. 389 00:22:24,710 --> 00:22:28,700 That sometimes, this individual desire and will 390 00:22:28,700 --> 00:22:32,240 is a product of their DNA, their nature, 391 00:22:32,240 --> 00:22:36,020 their biography, their parenthood. 392 00:22:36,020 --> 00:22:38,270 This is what made them who they are. 393 00:22:38,270 --> 00:22:40,740 There are different kinds of persons. 
394 00:22:40,740 --> 00:22:44,900 So this is the biological, individualistic, personality, 395 00:22:44,900 --> 00:22:48,860 psychology, evolutionary biology. 396 00:22:48,860 --> 00:22:51,590 Man makes history. 397 00:22:51,590 --> 00:22:55,460 We have an Obama as president because he's 398 00:22:55,460 --> 00:22:59,100 a unique individual. 399 00:22:59,100 --> 00:23:04,260 We had George Washington because he was a special person. 400 00:23:04,260 --> 00:23:08,010 Oh, Gandhi was charismatic. 401 00:23:08,010 --> 00:23:09,940 These are the explanations that say 402 00:23:09,940 --> 00:23:13,560 there is something about the person that 403 00:23:13,560 --> 00:23:18,190 produces the behavior and the achievements. 404 00:23:18,190 --> 00:23:21,510 The alternative-- and so, OK, so they're 405 00:23:21,510 --> 00:23:24,450 unique and independent individuals. 406 00:23:24,450 --> 00:23:32,110 And society is merely the aggregate of these individuals. 407 00:23:32,110 --> 00:23:38,260 The philosophers and writers who exemplify this point of view 408 00:23:38,260 --> 00:23:42,130 include John Locke in the 17th century, 409 00:23:42,130 --> 00:23:46,720 John Stuart Mill in the 19th century, Friedrich Hayek 410 00:23:46,720 --> 00:23:50,140 and Milton Friedman in the 20th century. 411 00:23:50,140 --> 00:23:53,170 OK, when you come across these names, 412 00:23:53,170 --> 00:23:55,480 you can now put them in the box that 413 00:23:55,480 --> 00:24:01,150 says we can explain what happens in the world, including 414 00:24:01,150 --> 00:24:03,550 the production and distribution of energy 415 00:24:03,550 --> 00:24:06,470 by looking at individual behavior. 416 00:24:06,470 --> 00:24:09,340 And that's what we're going to do today. 
417 00:24:09,340 --> 00:24:14,020 But there's a second answer, or a second kind of theory, 418 00:24:14,020 --> 00:24:16,640 which many of you gave me to start with, 419 00:24:16,640 --> 00:24:21,670 which is that the motives and the intentions of individuals 420 00:24:21,670 --> 00:24:26,440 are located in a larger circulation, 421 00:24:26,440 --> 00:24:30,070 whether it's advertising or education, 422 00:24:30,070 --> 00:24:33,700 that people anticipate certain things to happen. 423 00:24:33,700 --> 00:24:36,950 That they locate themselves in a sequence. 424 00:24:36,950 --> 00:24:41,170 And so that it's not the individual, 425 00:24:41,170 --> 00:24:44,500 but it's what the individual has learned 426 00:24:44,500 --> 00:24:48,040 over time, and the experiences that they have had, 427 00:24:48,040 --> 00:24:53,110 and what the philosopher John Dewey called situated action. 428 00:24:53,110 --> 00:24:55,900 That an action doesn't mean the same 429 00:24:55,900 --> 00:24:58,100 in one place as in another. 430 00:24:58,100 --> 00:25:03,400 And so therefore, the action you make or do 431 00:25:03,400 --> 00:25:05,960 isn't the same in one place or another. 432 00:25:05,960 --> 00:25:08,590 And if you take account of that variation, 433 00:25:08,590 --> 00:25:11,900 then it isn't the individual causation. 434 00:25:11,900 --> 00:25:14,120 But it's also the situation. 435 00:25:14,120 --> 00:25:17,650 And the authors who say these kinds of things 436 00:25:17,650 --> 00:25:19,900 include Pierre Bourdieu. 437 00:25:19,900 --> 00:25:24,250 And he calls this situation, or these culturally 438 00:25:24,250 --> 00:25:29,290 learned behaviors, habits, or the habitus. 
439 00:25:29,290 --> 00:25:33,430 And Adam Smith, in his Theory of Moral Sentiments, which 440 00:25:33,430 --> 00:25:38,730 he wrote before he wrote The Wealth of Nations in 1776, 441 00:25:38,730 --> 00:25:42,990 do you all know The Wealth of Nations, 1776, OK, 442 00:25:42,990 --> 00:25:45,870 and you all know about the invisible hand. 443 00:25:45,870 --> 00:25:48,210 OK, yes, no? 444 00:25:48,210 --> 00:25:51,960 Well, look it up on Wikipedia. 445 00:25:51,960 --> 00:25:56,490 But what is important about that theory, which most people take 446 00:25:56,490 --> 00:25:59,880 as obvious because it is the foundation of the market, 447 00:25:59,880 --> 00:26:02,790 is that before he wrote The Wealth of Nations, he 448 00:26:02,790 --> 00:26:05,640 wrote The Theory of Moral Sentiments. 449 00:26:05,640 --> 00:26:09,450 And in that book, what Adam Smith said 450 00:26:09,450 --> 00:26:14,340 is that all people will try to be good. 451 00:26:14,340 --> 00:26:18,030 They will be limited in their desires, which 452 00:26:18,030 --> 00:26:21,630 is what gets translated into demand in the market, 453 00:26:21,630 --> 00:26:29,620 by the expectation of what their neighbors will think of them. 454 00:26:29,620 --> 00:26:32,320 That doesn't hold very true in the modern world, 455 00:26:32,320 --> 00:26:34,690 except if you go to what some of you 456 00:26:34,690 --> 00:26:36,670 started with at the beginning: advertising, 457 00:26:36,670 --> 00:26:38,900 and signaling, and education. 458 00:26:38,900 --> 00:26:42,130 So what we desire, Adam Smith said, 459 00:26:42,130 --> 00:26:45,010 will be constrained by what we think 460 00:26:45,010 --> 00:26:47,050 our neighbors will think of us. 461 00:26:47,050 --> 00:26:52,300 And that's the framework in which our desires or demands 462 00:26:52,300 --> 00:26:54,880 can shape a market. 463 00:26:54,880 --> 00:27:00,010 So in this second bifurcated notion, 464 00:27:00,010 --> 00:27:02,620 it's not that history makes the man. 
465 00:27:02,620 --> 00:27:06,480 It's that man makes history. 466 00:27:06,480 --> 00:27:09,240 No, no, I said it backwards. 467 00:27:09,240 --> 00:27:10,860 Erase that. 468 00:27:10,860 --> 00:27:13,830 The second theory, history makes the man. 469 00:27:13,830 --> 00:27:21,500 So Barack Obama can get elected in 2008. 470 00:27:21,500 --> 00:27:29,930 He could not have gotten elected president in 1980, or 1990, 471 00:27:29,930 --> 00:27:31,640 or 2000. 472 00:27:31,640 --> 00:27:36,140 Conditions had to be such, is how the story would go. 473 00:27:36,140 --> 00:27:41,780 Or that George Washington would have been a farmer, 474 00:27:41,780 --> 00:27:44,630 and he would have spent his whole life as a farmer, 475 00:27:44,630 --> 00:27:49,220 were it not for the American Revolution, 476 00:27:49,220 --> 00:27:53,240 which he did not participate in until the war started. 477 00:27:53,240 --> 00:27:56,060 He was not one of the signers of the Declaration 478 00:27:56,060 --> 00:27:57,650 of Independence. 479 00:27:57,650 --> 00:28:00,530 They needed someone to be a general. 480 00:28:00,530 --> 00:28:03,480 Without that need, he wouldn't have been the president. 481 00:28:03,480 --> 00:28:10,040 And so this second model is that personhood, identity, will, 482 00:28:10,040 --> 00:28:17,000 the desires are produced and are contingent on the opportunities 483 00:28:17,000 --> 00:28:19,340 that exist, and the constraints. 484 00:28:19,340 --> 00:28:21,870 And the authors who write from this perspective 485 00:28:21,870 --> 00:28:24,620 are Max Weber, Emile Durkheim, Karl 486 00:28:24,620 --> 00:28:28,310 Polanyi, Karl Marx, Jean-Jacques Rousseau. 487 00:28:28,310 --> 00:28:30,650 So when you encounter those names, 488 00:28:30,650 --> 00:28:35,810 you can put them as the antithesis to Locke, Mill, Hayek, 489 00:28:35,810 --> 00:28:37,050 and Friedman. 
490 00:28:37,050 --> 00:28:40,430 OK, now, I'll put in a little amendment 491 00:28:40,430 --> 00:28:41,900 that's not in the notes. 492 00:28:41,900 --> 00:28:44,750 That for the most part, Americans, 493 00:28:44,750 --> 00:28:47,810 when you survey them and ask them 494 00:28:47,810 --> 00:28:52,430 why things happen as they do, they give you the first answer. 495 00:28:52,430 --> 00:28:54,990 Because people are special. 496 00:28:54,990 --> 00:28:56,840 That's why they have what they have. 497 00:28:56,840 --> 00:28:59,540 Or that's why things turn out the way they are. 498 00:28:59,540 --> 00:29:04,340 We are, in the United States, a nation of individualists. 499 00:29:04,340 --> 00:29:09,320 We rarely see the patterns of aggregation. 500 00:29:09,320 --> 00:29:21,780 So now, one more: how do people make decisions? 501 00:29:25,410 --> 00:29:27,840 There's one more model that I'd like 502 00:29:27,840 --> 00:29:31,440 to present to you, that's not a binary model. 503 00:29:31,440 --> 00:29:33,600 These have been binaries, right? 504 00:29:33,600 --> 00:29:37,620 Two different views: individual and biological, collective 505 00:29:37,620 --> 00:29:40,110 and cultural. 506 00:29:40,110 --> 00:29:43,380 The sociologist Max Weber offered us 507 00:29:43,380 --> 00:29:45,780 a more complex notion. 508 00:29:45,780 --> 00:29:50,100 He suggested that action is social 509 00:29:50,100 --> 00:29:52,650 whenever we take account of somebody else, 510 00:29:52,650 --> 00:29:55,830 just like Adam Smith's notion of the moral person. 511 00:29:55,830 --> 00:29:58,740 We care what our neighbors think. 512 00:29:58,740 --> 00:30:02,250 Now, Weber didn't say we care what our neighbors think. 513 00:30:02,250 --> 00:30:15,310 He merely said that he would define action as social when 514 00:30:15,310 --> 00:30:19,640 we take account of another, when there's more than one. 
515 00:30:19,640 --> 00:30:24,100 And we anticipate, or we care, or we 516 00:30:24,100 --> 00:30:29,380 want to say something about or imagine an other 517 00:30:29,380 --> 00:30:31,120 in relationship to ourselves. 518 00:30:31,120 --> 00:30:32,590 That's what makes it social. 519 00:30:32,590 --> 00:30:34,420 So whenever you hear the word "social," 520 00:30:34,420 --> 00:30:37,120 which is bandied about something terrible, 521 00:30:37,120 --> 00:30:41,240 it always means an interaction of two or more. 522 00:30:41,240 --> 00:30:46,540 And in his grand theory, which I'll try to do in five minutes, 523 00:30:46,540 --> 00:30:51,970 this is 1,000 pages, OK, you're getting it in five minutes, 524 00:30:51,970 --> 00:30:55,270 there are four ways in which we can 525 00:30:55,270 --> 00:30:58,030 take account of other people. 526 00:30:58,030 --> 00:31:02,080 The first one is what he called rational action. 527 00:31:02,080 --> 00:31:06,820 And here, we have exactly the economic model. 528 00:31:06,820 --> 00:31:11,830 That people act and make decisions 529 00:31:11,830 --> 00:31:17,710 on the basis of reasoning, of means and ends reasoning. 530 00:31:17,710 --> 00:31:22,900 That they take account of the objects in the external world, 531 00:31:22,900 --> 00:31:24,370 the situations. 532 00:31:24,370 --> 00:31:27,100 They mobilize logic. 533 00:31:27,100 --> 00:31:31,390 And they try to figure out how to get from here to there. 534 00:31:31,390 --> 00:31:36,520 They maximize their ends by figuring out appropriate means. 535 00:31:36,520 --> 00:31:39,950 This is our economic rational actor model. 536 00:31:39,950 --> 00:31:44,170 And he called that formal rational action. 
537 00:31:44,170 --> 00:31:46,990 But he said there was another kind of rationality, 538 00:31:46,990 --> 00:31:52,480 in which we were oriented not to maximizing 539 00:31:52,480 --> 00:31:57,220 means ends efficiencies, but toward something 540 00:31:57,220 --> 00:32:01,150 we thought was good, an absolute value. 541 00:32:01,150 --> 00:32:05,710 This could be a conscious belief in a God, who tells us 542 00:32:05,710 --> 00:32:06,760 what to do. 543 00:32:06,760 --> 00:32:12,820 It could be some ethical value, such as do unto others as you 544 00:32:12,820 --> 00:32:14,800 would have them do unto you. 545 00:32:14,800 --> 00:32:17,650 It could be aesthetic, to maximize beauty. 546 00:32:17,650 --> 00:32:21,160 It could be to maximize equality, 547 00:32:21,160 --> 00:32:28,280 as communism was originally imagined by Karl Marx. 548 00:32:28,280 --> 00:32:34,480 So that behavior is organized to maximize or to fulfill 549 00:32:34,480 --> 00:32:36,020 that absolute value. 550 00:32:36,020 --> 00:32:41,290 It's not an efficiency or logical requirement 551 00:32:41,290 --> 00:32:43,270 of means and ends. 552 00:32:43,270 --> 00:32:46,700 And the third kind of behavior-- 553 00:32:46,700 --> 00:32:49,180 so this is virtue. 554 00:32:49,180 --> 00:32:53,170 Could be you want to be environmentally sustainable, 555 00:32:53,170 --> 00:32:55,550 whether it's efficient or not. 556 00:32:55,550 --> 00:32:59,080 OK, that's an alternative to efficiency. 557 00:32:59,080 --> 00:33:02,740 The third kind of behavior he described 558 00:33:02,740 --> 00:33:06,400 was one which comes from our feelings, from our emotions. 559 00:33:06,400 --> 00:33:11,270 We do things because it feels good. 560 00:33:11,270 --> 00:33:13,000 Now, you all know what that's like. 561 00:33:13,000 --> 00:33:15,650 You do a lot of that. We all do. 562 00:33:15,650 --> 00:33:18,400 That's why people take drugs, drink alcohol, 563 00:33:18,400 --> 00:33:22,060 have sex, because it feels good. 
564 00:33:22,060 --> 00:33:25,660 It's also the basis on which we sometimes 565 00:33:25,660 --> 00:33:27,790 convey honor to people. 566 00:33:27,790 --> 00:33:31,930 We say some people are better than others. 567 00:33:31,930 --> 00:33:37,360 Being seen and celebrity, this whole TV and internet 568 00:33:37,360 --> 00:33:41,710 world of being there, people are getting some sort 569 00:33:41,710 --> 00:33:44,560 of emotional satisfaction. 570 00:33:44,560 --> 00:33:48,820 And the fourth kind of social action 571 00:33:48,820 --> 00:33:52,420 is simply tradition and habit. 572 00:33:52,420 --> 00:33:56,230 We act because we have always done it this way. 573 00:33:56,230 --> 00:34:01,660 And that, sometimes, is a little bit of mimicry, but mimicry 574 00:34:01,660 --> 00:34:05,620 of the past, rather than necessarily something 575 00:34:05,620 --> 00:34:06,910 co-present. 576 00:34:06,910 --> 00:34:11,590 And there are lots of societies, lots of groups. 577 00:34:11,590 --> 00:34:14,800 And this institution, MIT, is frequently 578 00:34:14,800 --> 00:34:20,120 accused of being stuck in its ways. 579 00:34:20,120 --> 00:34:26,290 So when we were accredited a few years ago, or re-accredited, 580 00:34:26,290 --> 00:34:28,510 the re-accreditation committee said 581 00:34:28,510 --> 00:34:33,850 that MIT had just about come into the 20th century in terms 582 00:34:33,850 --> 00:34:37,540 of the information systems with which it was managed. 583 00:34:37,540 --> 00:34:40,810 We had hardly made it to the end of the 20th century, 584 00:34:40,810 --> 00:34:43,659 because we don't change things too easily. 585 00:34:43,659 --> 00:34:46,969 OK, habit, it's comfortable. 586 00:34:46,969 --> 00:34:57,620 So now, like those overlapping disciplines, 587 00:34:57,620 --> 00:35:01,730 all these concepts have fuzzy borders. 
588 00:35:01,730 --> 00:35:07,430 So it's rare that you'll ever find an empirically observable 589 00:35:07,430 --> 00:35:12,260 social action that's just one kind, that's just 590 00:35:12,260 --> 00:35:16,970 rational action, or that's just affect without habit, 591 00:35:16,970 --> 00:35:17,990 or something else. 592 00:35:17,990 --> 00:35:22,130 And that brings us to exactly where Dick left off on Monday. 593 00:35:22,130 --> 00:35:25,430 What else, besides rational action, 594 00:35:25,430 --> 00:35:32,570 besides economic reasoning, might be affecting demand for energy? 595 00:35:32,570 --> 00:35:38,150 OK, so these are the basic models. 596 00:35:38,150 --> 00:35:45,380 And habit, or the past, is what those 597 00:35:45,380 --> 00:35:50,840 historic investments are that we can't change very much. 598 00:35:50,840 --> 00:35:55,040 That we have suburbs and automobiles. 599 00:35:55,040 --> 00:35:58,520 That we have habituated preferences 600 00:35:58,520 --> 00:36:01,340 in our political system and the individualism 601 00:36:01,340 --> 00:36:06,110 that makes it very hard for us to change our energy production 602 00:36:06,110 --> 00:36:09,380 and consumption systems. 603 00:36:09,380 --> 00:36:15,010 So what I want-- this is the big picture, 604 00:36:15,010 --> 00:36:18,360 and now, let's look at some specific research that wants 605 00:36:18,360 --> 00:36:21,000 to do something with this. 606 00:36:21,000 --> 00:36:23,760 But I would like you to take away from this the notion 607 00:36:23,760 --> 00:36:28,290 that any human action usually involves 608 00:36:28,290 --> 00:36:31,830 a combination of these forms of action. 609 00:36:31,830 --> 00:36:33,360 These are the categories which we 610 00:36:33,360 --> 00:36:37,590 mix and match to describe it. 611 00:36:37,590 --> 00:36:41,460 So cognitive science and the sociology 612 00:36:41,460 --> 00:36:48,270 of decisions: these days there's been a recent marriage. 
613 00:36:48,270 --> 00:36:54,920 And it may, in the end, end up being a marriage of our brain 614 00:36:54,920 --> 00:36:58,230 structure and our behavior. 615 00:36:58,230 --> 00:37:03,320 So what used to be psychology, asking people 616 00:37:03,320 --> 00:37:06,710 why they do what they do, watching them in experiments, 617 00:37:06,710 --> 00:37:09,680 treating them just like rats in experiments, 618 00:37:09,680 --> 00:37:16,760 we now look inside their brains with PET scans, fMRIs. 619 00:37:16,760 --> 00:37:22,000 The neuroscientists look at the brains, the architecture 620 00:37:22,000 --> 00:37:25,300 of our brains, the arrangement of the pieces, 621 00:37:25,300 --> 00:37:29,200 that's about all they can do, so far. 622 00:37:29,200 --> 00:37:34,300 And look at the physical matter inside our skulls, 623 00:37:34,300 --> 00:37:39,670 where the psychologist used to study, we might say, the mind. 624 00:37:39,670 --> 00:37:41,890 Brain scientists or neuroscientists 625 00:37:41,890 --> 00:37:43,240 study the brain. 626 00:37:43,240 --> 00:37:47,440 And the cognitive scientists try to put the two together. 627 00:37:47,440 --> 00:37:48,430 Is that fair? 628 00:37:48,430 --> 00:37:52,580 Do we have any BCS majors here? 629 00:37:52,580 --> 00:37:54,602 Not a one? 630 00:37:54,602 --> 00:37:58,530 Not one brain and cognitive science major. 631 00:37:58,530 --> 00:38:00,000 Interesting, I would have thought 632 00:38:00,000 --> 00:38:02,010 you'd all want to become brain scientists. 633 00:38:02,010 --> 00:38:03,350 That's the future, folks. 634 00:38:06,570 --> 00:38:09,900 Well, do the brain and cognitive science of energy decisions. 635 00:38:09,900 --> 00:38:13,650 There you go, there's a dissertation for you. 636 00:38:13,650 --> 00:38:19,560 OK, so some of this marriage of mind and brain 637 00:38:19,560 --> 00:38:22,900 has produced some interesting results. 
638 00:38:22,900 --> 00:38:28,320 So that what people used to think was hard wired, 639 00:38:28,320 --> 00:38:32,610 the neuroscientists have shown us is quite malleable. 640 00:38:32,610 --> 00:38:38,250 That in fact, oh, I think up until about the 1980s, doctors 641 00:38:38,250 --> 00:38:41,280 and scientists would say oh, well, you 642 00:38:41,280 --> 00:38:44,310 might be able to regenerate muscles if you've 643 00:38:44,310 --> 00:38:48,300 had an accident or damaged them. 644 00:38:48,300 --> 00:38:51,420 But the brain was what it was and it never changed. 645 00:38:51,420 --> 00:38:54,430 Well, we know that's not true. 646 00:38:54,430 --> 00:38:57,310 We used to think it was fixed and formed. 647 00:38:57,310 --> 00:39:00,280 But now, we know that the neurons are always 648 00:39:00,280 --> 00:39:01,150 in the making. 649 00:39:01,150 --> 00:39:03,970 And those circuits are changing. 650 00:39:03,970 --> 00:39:08,680 The neurons and networks, we have learned, 651 00:39:08,680 --> 00:39:16,330 develop in response to experience, to the inputs. 652 00:39:16,330 --> 00:39:20,190 So that they shape the capacity to act. 653 00:39:20,190 --> 00:39:24,500 So that it's not just how you're born. 654 00:39:24,500 --> 00:39:27,430 But the experiences that you have had 655 00:39:27,430 --> 00:39:30,390 transform your mental capacities. 656 00:39:30,390 --> 00:39:31,820 Well, you all know that's true. 657 00:39:31,820 --> 00:39:33,980 You wouldn't have gotten to MIT otherwise. 658 00:39:33,980 --> 00:39:37,220 Unless you all think all these equations 659 00:39:37,220 --> 00:39:41,810 were in there to start with and they just came out recently. 660 00:39:41,810 --> 00:39:43,280 That you've learned. 661 00:39:43,280 --> 00:39:45,620 You've learned how to defer gratification. 662 00:39:45,620 --> 00:39:48,650 That's what most successful college students, to get here, 663 00:39:48,650 --> 00:39:49,790 learned. 
664 00:39:49,790 --> 00:39:54,830 And some people don't have the experiences which allow that. 665 00:39:54,830 --> 00:39:56,930 What if you're a child who's never 666 00:39:56,930 --> 00:40:01,790 had gratification of any bodily or emotional function? 667 00:40:01,790 --> 00:40:03,980 Well, then delaying it is not something 668 00:40:03,980 --> 00:40:07,400 you could learn will be rewarded. 669 00:40:07,400 --> 00:40:09,450 You understand what I'm saying? 670 00:40:09,450 --> 00:40:13,550 So if you've never had a full belly, then when food 671 00:40:13,550 --> 00:40:15,470 is put in front of you, you're not necessarily 672 00:40:15,470 --> 00:40:18,750 going to wait to eat it until you're told to eat it. 673 00:40:18,750 --> 00:40:19,830 It's as simple as that. 674 00:40:19,830 --> 00:40:23,150 And the same thing with cognitive activities. 675 00:40:23,150 --> 00:40:25,070 So there's lots of work now going 676 00:40:25,070 --> 00:40:30,340 on that is closing the gap between what 677 00:40:30,340 --> 00:40:35,710 is observable of the inner states and the external aspects 678 00:40:35,710 --> 00:40:37,700 of culture. 679 00:40:37,700 --> 00:40:43,800 So in this work, in this scholarship, 680 00:40:43,800 --> 00:40:46,140 scholars, researchers, have started 681 00:40:46,140 --> 00:40:50,670 to distinguish a fundamental difference between what 682 00:40:50,670 --> 00:40:56,700 they call automatic cognition and deliberate cognition. 683 00:40:56,700 --> 00:41:01,650 Automatic cognition is rapid, effortless, unintentional 684 00:41:01,650 --> 00:41:05,100 thought, things which are those habits, those things we 685 00:41:05,100 --> 00:41:06,690 do without thinking. 686 00:41:06,690 --> 00:41:09,750 And it allows quick processing of information 687 00:41:09,750 --> 00:41:15,150 without extended thought or figuring out what to do. 688 00:41:15,150 --> 00:41:20,700 Deliberate cognition is slow, considered, measured. 
689 00:41:20,700 --> 00:41:24,960 Where we can reject or accept alternatives. 690 00:41:24,960 --> 00:41:31,500 We can consider various actions to take. 691 00:41:31,500 --> 00:41:35,970 Where we might actively search for solutions, 692 00:41:35,970 --> 00:41:38,580 for characteristics, for connections, 693 00:41:38,580 --> 00:41:41,700 for seeing if we can find relationships. 694 00:41:41,700 --> 00:41:47,670 And where we don't assume, but we solve, you might say. 695 00:41:47,670 --> 00:41:53,400 So examples of automatic cognition 696 00:41:53,400 --> 00:41:56,640 are, for example, male and female. 697 00:41:56,640 --> 00:41:59,490 We look at somebody and we categorize them like that. 698 00:41:59,490 --> 00:42:03,150 When we can't categorize them like that, we start to think. 699 00:42:03,150 --> 00:42:06,690 And we frequently, then, refer to that person 700 00:42:06,690 --> 00:42:09,750 as not usual or not normal. 701 00:42:09,750 --> 00:42:12,870 Age; race, in the United States, is 702 00:42:12,870 --> 00:42:16,170 a fundamental automatic cognition. 703 00:42:16,170 --> 00:42:19,640 It turns out not to be that in all cultures. 704 00:42:19,640 --> 00:42:22,700 In many cultures, it's much more varied. 705 00:42:22,700 --> 00:42:26,360 It's not a binary, like it is in the United States. 706 00:42:26,360 --> 00:42:33,510 So our habits are examples of automatic cognition. 707 00:42:33,510 --> 00:42:36,390 It's outside of our thinking process. 708 00:42:36,390 --> 00:42:38,970 It's outside of consciousness. 709 00:42:38,970 --> 00:42:43,470 And it happens more often when we're under stress. 710 00:42:43,470 --> 00:42:46,710 The experiments put people in different conditions. 711 00:42:46,710 --> 00:42:50,970 And how quickly they respond when they're under stress 712 00:42:50,970 --> 00:42:52,290 is measurable. 
713 00:42:52,290 --> 00:42:55,710 In deliberate cognition, what has 714 00:42:55,710 --> 00:43:00,300 been shown is that people deliberate and they think 715 00:43:00,300 --> 00:43:03,180 about alternatives when there has been 716 00:43:03,180 --> 00:43:07,410 a disruption of those habits, a disruption 717 00:43:07,410 --> 00:43:14,430 of non-thinking-- of acting without thought. 718 00:43:14,430 --> 00:43:17,910 OK, so I want to give you some empirical examples 719 00:43:17,910 --> 00:43:19,290 of how this works. 720 00:43:22,410 --> 00:43:27,420 The first example is by Carol Heimer 721 00:43:27,420 --> 00:43:29,820 at Northwestern University, where 722 00:43:29,820 --> 00:43:35,710 she did a study of neonatal care in several hospitals. 723 00:43:35,710 --> 00:43:41,220 She noticed in these wards for preemie babies 724 00:43:41,220 --> 00:43:44,850 that routines the hospital had established 725 00:43:44,850 --> 00:43:47,430 were used to manage what was a very 726 00:43:47,430 --> 00:43:50,460 uncertain medical situation, very 727 00:43:50,460 --> 00:43:53,550 uncertain for the families, very stressful. 728 00:43:53,550 --> 00:43:58,710 And that there was a very marked hierarchy among the staff. 729 00:43:58,710 --> 00:44:02,850 The aides at the bottom, the doctors at the top, 730 00:44:02,850 --> 00:44:07,080 the surgical nurses higher up, others lower down. 731 00:44:07,080 --> 00:44:11,430 And along with this hierarchy of persons, 732 00:44:11,430 --> 00:44:16,080 Heimer also identified a continuum of routines. 733 00:44:16,080 --> 00:44:18,390 At one end of her spectrum, there 734 00:44:18,390 --> 00:44:22,200 were tasks which were highly routinized. 735 00:44:22,200 --> 00:44:26,410 That there was almost never any variation for how it was done. 736 00:44:26,410 --> 00:44:28,530 And then there were other tasks which 737 00:44:28,530 --> 00:44:33,120 varied a lot from patient to patient and family to family. 
738 00:44:33,120 --> 00:44:38,350 She noticed that when tasks were either very, very 739 00:44:38,350 --> 00:44:43,510 routinized or very, very unroutinized, 740 00:44:43,510 --> 00:44:46,360 there was deliberate cognition. 741 00:44:46,360 --> 00:44:52,810 There was a great deal of talk, examination, consideration 742 00:44:52,810 --> 00:44:54,640 of alternatives. 743 00:44:54,640 --> 00:44:58,840 But where there was moderate routinization, 744 00:44:58,840 --> 00:45:01,240 things were automatic. 745 00:45:01,240 --> 00:45:05,440 According to Heimer, overroutinization 746 00:45:05,440 --> 00:45:09,010 triggers deliberate cognition, because people 747 00:45:09,010 --> 00:45:14,590 are so overwhelmed and overloaded with routines 748 00:45:14,590 --> 00:45:20,890 that the things they have to do routinely become a burden. 749 00:45:20,890 --> 00:45:25,030 They become noise, rather than a signal. 750 00:45:25,030 --> 00:45:28,750 And they cease to focus attention on the child 751 00:45:28,750 --> 00:45:30,430 or on the situation. 752 00:45:30,430 --> 00:45:34,900 Then, because there's so many things that need to be done, 753 00:45:34,900 --> 00:45:39,538 and need to be done in a specified way, 754 00:45:39,538 --> 00:45:43,090 they have to think about how to do it. 755 00:45:43,090 --> 00:45:47,980 In underroutinized contexts, they have 756 00:45:47,980 --> 00:45:50,770 to figure out what to do when there are no routines. 757 00:45:50,770 --> 00:45:52,510 And they have to be inductive. 758 00:45:52,510 --> 00:45:55,220 Is this like one we saw before? 759 00:45:55,220 --> 00:45:58,660 Is this like the four other cases last week? 760 00:45:58,660 --> 00:46:01,750 So with lots and lots of routine, 761 00:46:01,750 --> 00:46:04,120 they have to think about how to manage. 762 00:46:04,120 --> 00:46:06,310 No routine, they have to think. 
763 00:46:06,310 --> 00:46:12,190 But with simple routines, not a high volume of them, they 764 00:46:12,190 --> 00:46:13,630 act without thought. 765 00:46:13,630 --> 00:46:16,270 So that might be taking blood. 766 00:46:16,270 --> 00:46:20,210 It might be putting in the injection. 767 00:46:20,210 --> 00:46:23,170 But if they have to do 10 of those things at once, 768 00:46:23,170 --> 00:46:24,800 then they will think about it. 769 00:46:24,800 --> 00:46:28,390 So that's one example. 770 00:46:28,390 --> 00:46:35,140 In a second study, about residents in an Argentinean 771 00:46:35,140 --> 00:46:40,540 shanty town, two scholars, Auyero and Swistun, 772 00:46:40,540 --> 00:46:44,050 studied the ways in which people handled uncertainty 773 00:46:44,050 --> 00:46:50,410 and risk, and especially 774 00:46:50,410 --> 00:46:53,410 the conditions which sustained uncertainty 775 00:46:53,410 --> 00:46:58,550 and sometimes led to assessments of environmental pollutants. 776 00:46:58,550 --> 00:47:03,730 So they saw that sometimes, people paid attention 777 00:47:03,730 --> 00:47:05,290 to the pollutants around them. 778 00:47:05,290 --> 00:47:06,730 And sometimes, they didn't. 779 00:47:06,730 --> 00:47:10,630 And they wanted to identify those different situations. 780 00:47:10,630 --> 00:47:13,450 And the evidence they collected suggested 781 00:47:13,450 --> 00:47:19,120 that if the polluters polluted without disrupting 782 00:47:19,120 --> 00:47:24,040 the habits of the community, the community paid no attention 783 00:47:24,040 --> 00:47:25,330 to the pollution. 784 00:47:25,330 --> 00:47:28,730 But if the pollution disrupted habits, 785 00:47:28,730 --> 00:47:31,180 such as getting children to school, 786 00:47:31,180 --> 00:47:35,290 or preparing meals, or being able to ride on the bus, 787 00:47:35,290 --> 00:47:38,080 then the population mobilized. 
788 00:47:38,080 --> 00:47:42,970 The routines encouraged people to adopt 789 00:47:42,970 --> 00:47:47,470 what we are calling automatic pilot, or automatic cognition, 790 00:47:47,470 --> 00:47:48,730 toward their surroundings. 791 00:47:48,730 --> 00:47:51,370 But when routines were disrupted, people 792 00:47:51,370 --> 00:47:54,220 began to explore and try to figure out 793 00:47:54,220 --> 00:47:58,990 what it was in the environment that was disrupting it. 794 00:47:58,990 --> 00:48:02,620 So in these cases, familiar routines 795 00:48:02,620 --> 00:48:05,500 combined with automatic cognition 796 00:48:05,500 --> 00:48:10,450 to restrict the attention to the pollution. 797 00:48:10,450 --> 00:48:12,790 So that's a second example. 798 00:48:12,790 --> 00:48:19,150 Third example: in a study of political commitments, 799 00:48:19,150 --> 00:48:23,320 Martin and Desmond found that when citizens identified 800 00:48:23,320 --> 00:48:27,310 themselves as having strong political commitments 801 00:48:27,310 --> 00:48:31,480 along a continuum from liberal to conservative, 802 00:48:31,480 --> 00:48:34,480 so if they were strong at one end or the other 803 00:48:34,480 --> 00:48:39,190 of the political spectrum, they spent less time evaluating 804 00:48:39,190 --> 00:48:41,740 the evidence about politics. 805 00:48:41,740 --> 00:48:46,600 The ideological commitments organized the world for them. 806 00:48:46,600 --> 00:48:49,300 And it made it effortless and efficient 807 00:48:49,300 --> 00:48:53,860 to know how to interpret and how to make decisions. 808 00:48:53,860 --> 00:48:59,590 In a study of social movements, the scholars 809 00:48:59,590 --> 00:49:05,620 found that when movement leaders use everyday metaphors, that is, 810 00:49:05,620 --> 00:49:08,740 ordinary language, in particular ways, 811 00:49:08,740 --> 00:49:12,610 they generate more or less support. 
812 00:49:12,610 --> 00:49:19,160 So for example, the right to life movement, 813 00:49:19,160 --> 00:49:21,340 that is, an anti-abortion movement, 814 00:49:21,340 --> 00:49:25,180 came up with a simple phrase: right to life. 815 00:49:25,180 --> 00:49:30,160 And that had more salience than did choice 816 00:49:30,160 --> 00:49:32,560 for those who were pro-abortion. 817 00:49:32,560 --> 00:49:36,520 And so the metaphors aligned with a whole set 818 00:49:36,520 --> 00:49:37,950 of other metaphors. 819 00:49:37,950 --> 00:49:42,000 So that people thought more or less 820 00:49:42,000 --> 00:49:45,480 about the process of abortion. 821 00:49:45,480 --> 00:49:50,880 OK, the same thing with faith-based labor movements. 822 00:49:50,880 --> 00:49:53,970 In the 19th century, Mark Steinberg 823 00:49:53,970 --> 00:49:57,300 has written about the ways in which the cotton 824 00:49:57,300 --> 00:50:00,840 workers in England were able to mobilize and form 825 00:50:00,840 --> 00:50:04,740 a union because, this is in the 1830s 826 00:50:04,740 --> 00:50:09,810 to '50s, they used the already existing language 827 00:50:09,810 --> 00:50:12,240 of abolition against slavery. 828 00:50:12,240 --> 00:50:18,210 And they identified their work in the cotton mills as slavery. 829 00:50:18,210 --> 00:50:21,390 And by associating themselves with slavery, 830 00:50:21,390 --> 00:50:25,020 people didn't pay as much attention to the differences 831 00:50:25,020 --> 00:50:29,040 and were more likely to support and join the unions. 
832 00:50:29,040 --> 00:50:33,240 More recent research along this issue 833 00:50:33,240 --> 00:50:39,870 of the metaphors, the signaling, this is about signaling, 834 00:50:39,870 --> 00:50:45,450 the metaphors that affect deliberate or automatic 835 00:50:45,450 --> 00:50:52,410 cognition: a study of the Serbo-Croatian conflict 836 00:50:52,410 --> 00:51:00,270 in the 1990s showed that the associations 837 00:51:00,270 --> 00:51:05,520 with the past that people used affected the level of conflict. 838 00:51:05,520 --> 00:51:09,720 Particular words and metaphors generated associations 839 00:51:09,720 --> 00:51:10,920 with the past. 840 00:51:10,920 --> 00:51:14,130 So that people viewed the current conflict 841 00:51:14,130 --> 00:51:18,810 as a continuation of hundreds of years of conflict, 842 00:51:18,810 --> 00:51:21,540 rather than assess the current conflict 843 00:51:21,540 --> 00:51:23,970 within current conditions. 844 00:51:23,970 --> 00:51:30,870 So simple metaphors and associations encourage 845 00:51:30,870 --> 00:51:36,340 people to think deliberately or to act automatically. 846 00:51:36,340 --> 00:51:39,600 So finally, the last example of these differences 847 00:51:39,600 --> 00:51:42,630 between automatic and deliberate cognition 848 00:51:42,630 --> 00:51:44,790 comes from work on narrative. 849 00:51:44,790 --> 00:51:46,590 So we've gone, I just want you to see, 850 00:51:46,590 --> 00:51:51,510 from routines, to disrupted habits, 851 00:51:51,510 --> 00:51:57,240 to language organizing social movements and wars. 852 00:51:57,240 --> 00:52:02,010 And now, from metaphors, we go to narratives. 
853 00:52:02,010 --> 00:52:04,530 So researchers have not only looked 854 00:52:04,530 --> 00:52:07,620 at these routines, metaphors, analogies, 855 00:52:07,620 --> 00:52:11,820 for triggering automatic rather than deliberate cognition, 856 00:52:11,820 --> 00:52:15,600 but also at the structure, how the metaphors are 857 00:52:15,600 --> 00:52:19,630 organized into a story: at narrative and storytelling. 858 00:52:19,630 --> 00:52:21,690 Some of the very best research on this 859 00:52:21,690 --> 00:52:23,790 has been done in courtrooms. 860 00:52:23,790 --> 00:52:28,540 So that if witnesses tell a story 861 00:52:28,540 --> 00:52:35,520 and they leave out little links in the sequence of action, 862 00:52:35,520 --> 00:52:38,760 the audience, the jurors, and the judges 863 00:52:38,760 --> 00:52:43,020 are more likely to fill in those ellipses 864 00:52:43,020 --> 00:52:46,020 with conventional action. 865 00:52:46,020 --> 00:52:50,820 That is, what usually is done in those circumstances. 866 00:52:50,820 --> 00:52:55,200 And it is most often to the disadvantage of a defendant, 867 00:52:55,200 --> 00:52:59,820 because the defendant is arguing that this was not routine. 868 00:52:59,820 --> 00:53:02,130 But this was unusual. 869 00:53:02,130 --> 00:53:05,520 So in witnessing, in criminal courts, 870 00:53:05,520 --> 00:53:10,710 it makes a difference how complete your narrative is. 871 00:53:10,710 --> 00:53:13,620 If it is not, then people automatically 872 00:53:13,620 --> 00:53:16,980 fill in with routinized information. 873 00:53:16,980 --> 00:53:24,780 In my own work, I have shown that when a story describes 874 00:53:24,780 --> 00:53:26,850 or fails to describe-- let me put it 875 00:53:26,850 --> 00:53:30,870 this way, when a story fails to describe how power is 876 00:53:30,870 --> 00:53:35,610 organized, people are less likely to resist the power 877 00:53:35,610 --> 00:53:38,010 of that organization or person. 
878 00:53:38,010 --> 00:53:42,990 But when a story reveals who holds power and why, 879 00:53:42,990 --> 00:53:46,680 then people are more likely to resist authority. 880 00:53:46,680 --> 00:53:50,400 So a little bit of, perhaps, what 881 00:53:50,400 --> 00:53:53,280 was happening last spring in the Middle East. 882 00:53:53,280 --> 00:53:58,260 As stories circulated on the internet about what 883 00:53:58,260 --> 00:54:00,630 was happening in different places, 884 00:54:00,630 --> 00:54:04,200 in villages and towns in Libya and in Egypt, 885 00:54:04,200 --> 00:54:05,910 those stories spread. 886 00:54:05,910 --> 00:54:08,970 And they told stories, they were stories, 887 00:54:08,970 --> 00:54:13,140 about what the government was doing to whom. 888 00:54:13,140 --> 00:54:17,100 And it helped to mobilize the people. 889 00:54:17,100 --> 00:54:21,390 So that's what cognitive science is telling us 890 00:54:21,390 --> 00:54:27,730 about how people make deliberate or automatic decisions. 891 00:54:27,730 --> 00:54:30,880 We have some experimental data that 892 00:54:30,880 --> 00:54:34,330 helps us to understand how decisions 893 00:54:34,330 --> 00:54:39,350 can be made specifically about energy consumption. 894 00:54:39,350 --> 00:54:43,990 So this was the articles you read, I think, the experiments? 895 00:54:43,990 --> 00:54:49,510 Right, Goldstein, Cialdini, and Grisk-- 896 00:54:49,510 --> 00:54:51,080 I can't say that. 897 00:54:51,080 --> 00:54:54,050 So what did they want to show? 898 00:54:54,050 --> 00:54:56,940 What did they want to show? 899 00:54:56,940 --> 00:54:59,260 Yeah, Veronica? 900 00:54:59,260 --> 00:55:02,580 AUDIENCE: They wanted to change the wording 901 00:55:02,580 --> 00:55:08,160 that they had to say your fellow guests are doing it. 902 00:55:08,160 --> 00:55:10,490 Help your fellow guests. 903 00:55:10,490 --> 00:55:11,280 SUSAN SILBEY: Why? 904 00:55:11,280 --> 00:55:12,570 Why did they want to do that? 
905 00:55:12,570 --> 00:55:15,780 AUDIENCE: Because subliminally, that social pressure 906 00:55:15,780 --> 00:55:20,460 would motivate them more to do it. 907 00:55:20,460 --> 00:55:23,280 And there's examples given in the other article 908 00:55:23,280 --> 00:55:29,000 about how when people gave money to the guy playing 909 00:55:29,000 --> 00:55:33,197 guitar or whatever, other people behind the guy who gave-- 910 00:55:33,197 --> 00:55:34,530 SUSAN SILBEY: So those are two-- 911 00:55:34,530 --> 00:55:36,370 those are two different norms. 912 00:55:36,370 --> 00:55:39,660 Yes, those are-- norms are 913 00:55:39,660 --> 00:55:42,510 observed patterns of action. 914 00:55:42,510 --> 00:55:43,500 That's what a norm is. 915 00:55:43,500 --> 00:55:46,620 They're looking to see if they can find 916 00:55:46,620 --> 00:55:49,555 the pattern in human action. 917 00:55:49,555 --> 00:55:50,490 Yes, Vivian. 918 00:55:50,490 --> 00:55:51,660 AUDIENCE: So they did find a pattern. 919 00:55:51,660 --> 00:55:54,510 And I think they just want to show that this tactic isn't 920 00:55:54,510 --> 00:55:57,270 being used. 921 00:55:57,270 --> 00:55:59,430 Whereas, because this is something 922 00:55:59,430 --> 00:56:03,170 that we can use to help people start 923 00:56:03,170 --> 00:56:05,900 making environmentally conscious decisions, 924 00:56:05,900 --> 00:56:08,540 and it's not being used. 925 00:56:08,540 --> 00:56:11,270 SUSAN SILBEY: I don't want to be personally critical. 926 00:56:11,270 --> 00:56:13,490 But they don't want to show anything. 927 00:56:13,490 --> 00:56:15,770 They want to discover something. 928 00:56:15,770 --> 00:56:21,020 OK, let's talk about it in a way that is fair to them. 929 00:56:21,020 --> 00:56:24,830 They have an aspiration to understand. 930 00:56:24,830 --> 00:56:27,230 They want to understand the basis 931 00:56:27,230 --> 00:56:34,750 upon which decisions about energy use can be affected. 
932 00:56:34,750 --> 00:56:39,110 OK, they want to understand the basis of decision making. 933 00:56:39,110 --> 00:56:42,470 OK, and the assumption they start with, 934 00:56:42,470 --> 00:56:47,630 the hypothesis is that people mimic. 935 00:56:47,630 --> 00:56:50,690 They do what other people do. 936 00:56:50,690 --> 00:56:55,670 And they care about whether they are a member 937 00:56:55,670 --> 00:56:57,470 of one group or another. 938 00:56:57,470 --> 00:56:59,780 It seems they have an identity. 939 00:56:59,780 --> 00:57:01,430 So let's walk through it. 940 00:57:01,430 --> 00:57:06,050 All right, they're trying to use, as I have on the screen, 941 00:57:06,050 --> 00:57:11,090 social norms, that's patterns, to motivate conservation. 942 00:57:11,090 --> 00:57:12,690 Is it possible? 943 00:57:12,690 --> 00:57:18,350 So the question was, the first one, 944 00:57:18,350 --> 00:57:23,750 would people not put out their towels for laundry 945 00:57:23,750 --> 00:57:25,850 but would reuse their towels? 946 00:57:25,850 --> 00:57:31,550 And the standard message was, help save the environment. 947 00:57:31,550 --> 00:57:38,460 But the message that they offered was slightly different. 948 00:57:38,460 --> 00:57:44,520 They said, you should do this, because everybody else 949 00:57:44,520 --> 00:57:46,142 is doing it. 950 00:57:46,142 --> 00:57:48,420 Does that appeal to you? 951 00:57:48,420 --> 00:57:50,970 Have you seen these signs? 952 00:57:50,970 --> 00:57:52,875 You've never seen these signs in hotels? 953 00:57:56,517 --> 00:57:58,850 AUDIENCE: I've definitely seen the signs, more in Europe 954 00:57:58,850 --> 00:57:59,780 than in America. 955 00:57:59,780 --> 00:58:02,930 But it's also not only that other people are doing it. 956 00:58:02,930 --> 00:58:05,810 I think in this case, it speaks to anti-environmental movement 957 00:58:05,810 --> 00:58:09,690 that says one towel hanging is not going to make a difference. 
958 00:58:09,690 --> 00:58:11,853 So a lot of people, their motivation 959 00:58:11,853 --> 00:58:13,520 is brought down, because they're saying, 960 00:58:13,520 --> 00:58:14,978 I'm going through all this trouble. 961 00:58:14,978 --> 00:58:15,830 Nobody is around me. 962 00:58:15,830 --> 00:58:16,997 I'm not making a difference. 963 00:58:16,997 --> 00:58:19,490 Why am I going to keep the self-sacrifice? 964 00:58:19,490 --> 00:58:22,010 So by telling them, it's not only the bandwagon idea. 965 00:58:22,010 --> 00:58:23,210 But it's also the-- 966 00:58:23,210 --> 00:58:24,860 look we're actually doing it in Europe, 967 00:58:24,860 --> 00:58:27,440 and it's actually going to go a long way. 968 00:58:27,440 --> 00:58:30,650 SUSAN SILBEY: So trying to say that each effort aggregates 969 00:58:30,650 --> 00:58:32,553 to a larger effort. 970 00:58:32,553 --> 00:58:33,470 AUDIENCE: [INAUDIBLE]. 971 00:58:33,470 --> 00:58:36,300 SUSAN SILBEY: And that other people are doing it, too. 972 00:58:36,300 --> 00:58:40,540 So it's not just one person's effort, right? 973 00:58:40,540 --> 00:58:45,410 OK, so they got 75% more reuse by simply 974 00:58:45,410 --> 00:58:49,070 saying join your fellow guests. 975 00:58:49,070 --> 00:58:53,960 So then they tried to specify, is it just all guests? 976 00:58:53,960 --> 00:58:56,835 Or could they get an even greater result? 977 00:59:00,400 --> 00:59:03,640 So in our Weberian concepts, this 978 00:59:03,640 --> 00:59:08,630 is sort of a little bit of be like others. 979 00:59:08,630 --> 00:59:12,230 Enjoy your group membership. 980 00:59:12,230 --> 00:59:16,260 But it's hardly efficiency, right? 981 00:59:16,260 --> 00:59:19,690 It's not a calculation of what would be rational action. 982 00:59:19,690 --> 00:59:25,620 So then they first ask the people who they identify with. 
983 00:59:25,620 --> 00:59:31,590 Do you identify with the people who are male or female, 984 00:59:31,590 --> 00:59:34,920 or are citizens of the state, or are you 985 00:59:34,920 --> 00:59:37,770 an environmentally concerned person, 986 00:59:37,770 --> 00:59:42,000 with other hotel guests, or with guests in this room? 987 00:59:42,000 --> 00:59:46,230 I have to confess I find this an extraordinary result. 988 00:59:46,230 --> 00:59:53,470 As you see, in the bottom set, those 989 00:59:53,470 --> 00:59:58,300 who identified with the standard environmental message 990 00:59:58,300 --> 01:00:04,960 had less reuse than those where they said people in this room 991 01:00:04,960 --> 01:00:06,535 didn't reuse their towels. 992 01:00:11,080 --> 01:00:11,800 They did. 993 01:00:11,800 --> 01:00:14,140 I'm sorry, did reuse their towels. 994 01:00:14,140 --> 01:00:16,675 So you should be like people in this room. 995 01:00:22,000 --> 01:00:24,370 I think it's bizarre. 996 01:00:24,370 --> 01:00:25,690 Yes. 997 01:00:25,690 --> 01:00:27,580 AUDIENCE: Well, it's the proximity. 998 01:00:27,580 --> 01:00:31,570 You don't want to be seen as that one guy who's not 999 01:00:31,570 --> 01:00:34,045 doing his part to contribute. 1000 01:00:34,045 --> 01:00:35,920 SUSAN SILBEY: You think it's a smaller group. 1001 01:00:35,920 --> 01:00:37,960 And therefore, you'll be more outstanding. 1002 01:00:40,840 --> 01:00:42,550 AUDIENCE: The effect of you being 1003 01:00:42,550 --> 01:00:45,910 seen as a pariah is more immediate, 1004 01:00:45,910 --> 01:00:48,283 because of the small room. 1005 01:00:48,283 --> 01:00:50,200 SUSAN SILBEY: But you don't know those people. 1006 01:00:53,620 --> 01:00:55,800 AUDIENCE: So there's no one in this room. 1007 01:00:55,800 --> 01:00:57,700 SUSAN SILBEY: He's alone. 1008 01:00:57,700 --> 01:00:58,930 He's alone. 1009 01:00:58,930 --> 01:01:03,080 It's a hotel room, not a dormitory. 
1010 01:01:03,080 --> 01:01:04,580 AUDIENCE: I can't really articulate. 1011 01:01:04,580 --> 01:01:07,122 I still think proximity has something to do with it, the fact 1012 01:01:07,122 --> 01:01:08,320 that it was this room. 1013 01:01:08,320 --> 01:01:10,540 SUSAN SILBEY: Yeah, it obviously has something, 1014 01:01:10,540 --> 01:01:14,580 but I find it unpersuasive. 1015 01:01:14,580 --> 01:01:15,510 Oh, sorry. 1016 01:01:15,510 --> 01:01:18,990 AUDIENCE: [INAUDIBLE] identify themselves 1017 01:01:18,990 --> 01:01:20,890 with the select group of people who are 1018 01:01:20,890 --> 01:01:22,612 more likely to be in that room. 1019 01:01:22,612 --> 01:01:24,070 SUSAN SILBEY: I think that's right. 1020 01:01:24,070 --> 01:01:26,110 I think that this room-- 1021 01:01:26,110 --> 01:01:29,350 and maybe that's what he was getting at. 1022 01:01:29,350 --> 01:01:31,850 That's a smaller group of people. 1023 01:01:31,850 --> 01:01:34,990 I'm more like a person who would be in this room 1024 01:01:34,990 --> 01:01:42,940 than I am like 350 million citizens or half of them, male 1025 01:01:42,940 --> 01:01:45,240 or female. 1026 01:01:45,240 --> 01:01:49,440 Or even the number of environmentally concerned 1027 01:01:49,440 --> 01:01:52,170 citizens. 1028 01:01:52,170 --> 01:01:55,170 So people in this room is a smaller group. 1029 01:01:55,170 --> 01:01:59,760 And I'm more like those who stay in this kind of hotel room. 1030 01:01:59,760 --> 01:02:00,835 Yes, Matthew. 1031 01:02:00,835 --> 01:02:04,290 AUDIENCE: Is it saying that the people who identify themselves 1032 01:02:04,290 --> 01:02:06,300 as environmentally concerned are least 1033 01:02:06,300 --> 01:02:07,950 likely to recycle the towels? 1034 01:02:07,950 --> 01:02:12,630 SUSAN SILBEY: Than when they offered the other identities 1035 01:02:12,630 --> 01:02:13,470 to choose. 1036 01:02:13,470 --> 01:02:17,280 AUDIENCE: I think that's the most surprising on there. 
1037 01:02:17,280 --> 01:02:19,080 PROFESSOR: But the message was presumably 1038 01:02:19,080 --> 01:02:22,980 be environmentally responsible, recycle, as opposed 1039 01:02:22,980 --> 01:02:25,820 to recycle like people who stayed in the room. 1040 01:02:25,820 --> 01:02:27,900 AUDIENCE: So it's not what they identify with. 1041 01:02:27,900 --> 01:02:30,028 It's what they're given. 1042 01:02:30,028 --> 01:02:31,320 SUSAN SILBEY: It's the message. 1043 01:02:31,320 --> 01:02:34,695 Oh, it's the message, which message got a boost. 1044 01:02:34,695 --> 01:02:36,570 PROFESSOR: The standard environmental message 1045 01:02:36,570 --> 01:02:40,260 said help save the environment, reuse your towels. 1046 01:02:40,260 --> 01:02:42,810 SUSAN SILBEY: And this one, and for people who-- 1047 01:02:42,810 --> 01:02:46,320 OK, but the other one was be like people 1048 01:02:46,320 --> 01:02:47,682 who were in this room. 1049 01:02:47,682 --> 01:02:49,140 AUDIENCE: I think you can attribute 1050 01:02:49,140 --> 01:02:51,900 the discrepancy between the standard environmental message 1051 01:02:51,900 --> 01:02:52,780 and the same room. 1052 01:02:52,780 --> 01:02:55,130 And the whole-- 1053 01:02:55,130 --> 01:02:58,350 I think the one to the extreme right appeals to something 1054 01:02:58,350 --> 01:03:01,860 that's more common among humans, just the need or the desire 1055 01:03:01,860 --> 01:03:06,975 to appear more fit, or more civilized, or more respectable, 1056 01:03:06,975 --> 01:03:08,970 even when nobody's watching, as opposed to just 1057 01:03:08,970 --> 01:03:10,637 standard environmental message that only 1058 01:03:10,637 --> 01:03:17,720 a small subset of the public even, I guess, care about, 1059 01:03:17,720 --> 01:03:18,470 acknowledge. 1060 01:03:21,890 --> 01:03:23,030 SUSAN SILBEY: Charlotte? 
1061 01:03:23,030 --> 01:03:24,020 AUDIENCE: Could also be like what 1062 01:03:24,020 --> 01:03:26,280 you were talking about earlier, with what you register 1063 01:03:26,280 --> 01:03:27,947 and what you just kind of see something 1064 01:03:27,947 --> 01:03:29,030 that's there all the time. 1065 01:03:29,030 --> 01:03:31,520 So standard environmental messages are everywhere. 1066 01:03:31,520 --> 01:03:34,130 So it's possible people just don't really register it. 1067 01:03:34,130 --> 01:03:36,380 But if you see a message that says something different 1068 01:03:36,380 --> 01:03:39,110 than you're used to seeing, you may just notice it more 1069 01:03:39,110 --> 01:03:41,000 and respond to it because of that. 1070 01:03:41,000 --> 01:03:42,950 SUSAN SILBEY: Possible, disrupted the routine. 1071 01:03:42,950 --> 01:03:44,450 Rachel? 1072 01:03:44,450 --> 01:03:49,290 AUDIENCE: The other thing, too, is the argument about people 1073 01:03:49,290 --> 01:03:51,760 that have been in that room doing the same thing, I think 1074 01:03:51,760 --> 01:03:53,460 Cialdini, in his book, or whatever, 1075 01:03:53,460 --> 01:03:55,752 he talks about people being more likely to do something 1076 01:03:55,752 --> 01:03:57,270 if someone's watching them. 1077 01:03:57,270 --> 01:03:59,200 And the fact that they're saying, 1078 01:03:59,200 --> 01:04:01,575 oh, people who've stayed in this room have been doing it, 1079 01:04:01,575 --> 01:04:05,910 implies the hotel is watching which guests are. 1080 01:04:05,910 --> 01:04:08,760 SUSAN SILBEY: Yes, that implies they know. 1081 01:04:08,760 --> 01:04:12,030 Very good, so maybe that suggested surveillance. 1082 01:04:12,030 --> 01:04:14,730 There was another hand up over here? 1083 01:04:14,730 --> 01:04:15,285 Yeah. 1084 01:04:15,285 --> 01:04:17,702 AUDIENCE: I was going to mention, all these were good points. 1085 01:04:17,702 --> 01:04:20,010 But also people like to be a part of a team. 
1086 01:04:20,010 --> 01:04:22,560 That kind of makes you feel part of a group that's being 1087 01:04:22,560 --> 01:04:24,025 environmentally conscious. 1088 01:04:24,025 --> 01:04:26,400 SUSAN SILBEY: Yeah, I think that's what Ovema and Everson 1089 01:04:26,400 --> 01:04:27,602 were saying, too. 1090 01:04:27,602 --> 01:04:28,560 You're all on the same team. 1091 01:04:28,560 --> 01:04:32,100 This business of possible surveillance is a new insight. 1092 01:04:32,100 --> 01:04:33,090 Yes, Andrew? 1093 01:04:35,730 --> 01:04:36,810 AUDIENCE: [INAUDIBLE]. 1094 01:04:48,823 --> 01:04:50,240 SUSAN SILBEY: And so why would you 1095 01:04:50,240 --> 01:04:55,830 get more reuse of the towels by saying this room? 1096 01:04:55,830 --> 01:04:57,155 AUDIENCE: [INAUDIBLE]. 1097 01:05:10,637 --> 01:05:12,970 SUSAN SILBEY: We don't have a lot of evidence about that 1098 01:05:12,970 --> 01:05:13,595 is the problem. 1099 01:05:16,150 --> 01:05:19,090 That's the individualist fallacy, 1100 01:05:19,090 --> 01:05:21,820 that there are always people who want to be different. 1101 01:05:21,820 --> 01:05:24,050 But then there's people who want to be different. 1102 01:05:24,050 --> 01:05:26,020 It's a group who wants to be different. 1103 01:05:26,020 --> 01:05:29,973 This search for the unique is the problem. 1104 01:05:29,973 --> 01:05:32,265 PROFESSOR: Back in the '60s, there were a lot of rebels 1105 01:05:32,265 --> 01:05:34,420 all together. 1106 01:05:34,420 --> 01:05:37,000 SUSAN SILBEY: All wanting to be different. 1107 01:05:37,000 --> 01:05:37,940 That's the problem. 1108 01:05:37,940 --> 01:05:39,320 It's social. 1109 01:05:39,320 --> 01:05:40,640 It's not individual. 1110 01:05:40,640 --> 01:05:43,090 That's the issue. 1111 01:05:43,090 --> 01:05:44,410 Yes, Scott? 1112 01:05:44,410 --> 01:05:48,310 AUDIENCE: Somebody pointed out earlier that someone might not 1113 01:05:48,310 --> 01:05:49,570 want to reuse a towel. 
1114 01:05:49,570 --> 01:05:52,030 Because they think that I'm only one person, 1115 01:05:52,030 --> 01:05:53,030 what's this going to do? 1116 01:05:53,030 --> 01:05:56,620 But I think if you have the info that, oh, well, everyone 1117 01:05:56,620 --> 01:05:58,630 in this room is using a towel. 1118 01:05:58,630 --> 01:06:00,760 So that you think, oh, if I'm the one person who's 1119 01:06:00,760 --> 01:06:03,160 not doing it, then, I am making a difference, 1120 01:06:03,160 --> 01:06:04,510 but it's in a bad way. 1121 01:06:04,510 --> 01:06:07,090 So if you contribute to it, then you're 1122 01:06:07,090 --> 01:06:08,560 helping make a difference. 1123 01:06:08,560 --> 01:06:11,110 Whereas, it's not you're just thinking, oh, well, 1124 01:06:11,110 --> 01:06:14,780 one person what's that going to do? 1125 01:06:14,780 --> 01:06:20,060 SUSAN SILBEY: All right, so what other norms or incentives might 1126 01:06:20,060 --> 01:06:23,100 we use for energy efficient-- 1127 01:06:23,100 --> 01:06:24,410 Oh, let me see. 1128 01:06:24,410 --> 01:06:26,630 Let's take the takeaway message and then consider 1129 01:06:26,630 --> 01:06:28,050 some alternatives. 1130 01:06:28,050 --> 01:06:31,700 So what this cognitive science research 1131 01:06:31,700 --> 01:06:34,160 and the experimental research shows 1132 01:06:34,160 --> 01:06:38,990 is that people do follow the norms of others 1133 01:06:38,990 --> 01:06:42,320 with whom they feel associated, even if that association 1134 01:06:42,320 --> 01:06:46,040 doesn't seem, on the outside, terribly meaningful 1135 01:06:46,040 --> 01:06:48,470 as some group identity. 
1136 01:06:48,470 --> 01:06:51,770 And in some circumstances, individuals 1137 01:06:51,770 --> 01:06:57,350 follow norms of a meaningless and unimportant identity 1138 01:06:57,350 --> 01:07:01,400 rather than a meaningful and important one if the connection 1139 01:07:01,400 --> 01:07:05,840 is based on an uncommon characteristic, which maybe 1140 01:07:05,840 --> 01:07:08,390 helps Andrew's point a little bit, 1141 01:07:08,390 --> 01:07:11,030 that this is a smaller group, it's 1142 01:07:11,030 --> 01:07:13,730 an uncommon characteristic. 1143 01:07:13,730 --> 01:07:16,910 And it's the same as Jacob, and Ovema, and Everson. 1144 01:07:16,910 --> 01:07:20,210 So I can identify with this group. 1145 01:07:20,210 --> 01:07:26,750 Now, maybe if they had some other sign, 1146 01:07:26,750 --> 01:07:28,160 they could have done better. 1147 01:07:28,160 --> 01:07:31,580 Have you got any suggestions of what other norms 1148 01:07:31,580 --> 01:07:33,890 they might try to invoke? 1149 01:07:36,860 --> 01:07:40,715 How might you get people, if not only in towel reuse? 1150 01:07:44,760 --> 01:07:47,790 How can we get people? 1151 01:07:47,790 --> 01:07:49,260 Yes, Sid. 1152 01:07:49,260 --> 01:07:51,180 AUDIENCE: Give them a financial statistic, 1153 01:07:51,180 --> 01:07:52,960 nothing coming to me. 1154 01:07:52,960 --> 01:07:55,260 If I have like a number or something, telling me, 1155 01:07:55,260 --> 01:08:00,120 how much better choice A is than choice B. 1156 01:08:00,120 --> 01:08:01,740 SUSAN SILBEY: Tell them how much? 1157 01:08:01,740 --> 01:08:04,418 AUDIENCE: Well, I'd tell them how much they're saving. 1158 01:08:04,418 --> 01:08:06,210 SUSAN SILBEY: Well, now that's interesting. 1159 01:08:06,210 --> 01:08:10,110 Because there was a story in the news. 
1160 01:08:10,110 --> 01:08:11,790 Remember the story about there was 1161 01:08:11,790 --> 01:08:15,210 a story about how the hotels were only 1162 01:08:15,210 --> 01:08:19,470 doing this to save money, that they weren't, in fact, saving 1163 01:08:19,470 --> 01:08:21,090 energy. 1164 01:08:21,090 --> 01:08:24,010 And that had a negative effect. 1165 01:08:24,010 --> 01:08:24,899 So it has to be-- 1166 01:08:24,899 --> 01:08:27,060 AUDIENCE: I don't mean in this hotel instance, 1167 01:08:27,060 --> 01:08:31,470 I mean when you're just talking about energy saving. 1168 01:08:31,470 --> 01:08:33,420 If you can find that discrepancy and you 1169 01:08:33,420 --> 01:08:39,707 can show that choice A is energy efficient and is saving you-- 1170 01:08:39,707 --> 01:08:41,790 SUSAN SILBEY: So you're arguing that if you put it 1171 01:08:41,790 --> 01:08:44,069 into dollars, people will behave. 1172 01:08:44,069 --> 01:08:46,859 AUDIENCE: I mean, for me, a dollar amount says a lot more 1173 01:08:46,859 --> 01:08:50,010 than, perhaps, trying to I guess group 1174 01:08:50,010 --> 01:08:53,240 me into one group or the other. 1175 01:08:53,240 --> 01:08:54,240 SUSAN SILBEY: That sign? 1176 01:08:56,819 --> 01:08:58,620 It's not too clear. 1177 01:08:58,620 --> 01:09:01,590 But it's how many miles per gallon 1178 01:09:01,590 --> 01:09:04,590 you'll get is the car stickers. 1179 01:09:04,590 --> 01:09:08,040 So if you buy one car, in city driving, 1180 01:09:08,040 --> 01:09:10,600 you'll get 18 miles per gallon. 1181 01:09:10,600 --> 01:09:13,425 But on the other car, you'll get 23 miles per gallon. 1182 01:09:16,149 --> 01:09:19,990 Will you buy the car with the 23 and the 30? 1183 01:09:19,990 --> 01:09:22,995 Is that kind of number helpful? 1184 01:09:22,995 --> 01:09:23,620 AUDIENCE: Sure. 
1185 01:09:27,428 --> 01:09:29,470 AUDIENCE: To kind of bounce off what he's saying, 1186 01:09:29,470 --> 01:09:31,060 it's a number, but it doesn't really 1187 01:09:31,060 --> 01:09:33,550 tell you dollars and cents. 1188 01:09:33,550 --> 01:09:35,177 Oh, bigger number is better. 1189 01:09:35,177 --> 01:09:36,760 SUSAN SILBEY: So you think that should 1190 01:09:36,760 --> 01:09:44,109 say you will save $500 this year on your gasoline, 1191 01:09:44,109 --> 01:09:47,710 on an average of 10,000 miles a year or something like that? 1192 01:09:53,220 --> 01:09:53,720 Yeah? 1193 01:09:53,720 --> 01:09:56,180 AUDIENCE: And I think it's important that you're 1194 01:09:56,180 --> 01:09:57,937 talking about their savings. 1195 01:09:57,937 --> 01:09:59,520 But that kind of sounds like the stuff 1196 01:09:59,520 --> 01:10:03,800 that you were saying before, how people act, they might have-- 1197 01:10:03,800 --> 01:10:08,490 habitually or traditionally and perhaps 1198 01:10:08,490 --> 01:10:11,040 the willingness to change their habits 1199 01:10:11,040 --> 01:10:15,252 is not as great as the willingness to-- 1200 01:10:15,252 --> 01:10:17,210 SUSAN SILBEY: And so why do people buy Priuses? 1201 01:10:21,180 --> 01:10:22,650 Yeah, Matt. 1202 01:10:22,650 --> 01:10:25,690 AUDIENCE: I think it talks to the point about style 1203 01:10:25,690 --> 01:10:27,330 and a need for individuality. 1204 01:10:27,330 --> 01:10:31,290 That people have Priuses because they 1205 01:10:31,290 --> 01:10:33,430 think that they're environmentally friendly, 1206 01:10:33,430 --> 01:10:34,593 but they also want-- 1207 01:10:34,593 --> 01:10:36,990 SUSAN SILBEY: Everybody else to think they are. 1208 01:10:36,990 --> 01:10:38,550 So it's not individuality. 1209 01:10:38,550 --> 01:10:41,400 Let's just get the terminology clear, because it's-- 1210 01:10:41,400 --> 01:10:43,710 because we talk about individuals a lot. 1211 01:10:43,710 --> 01:10:45,150 People buy Priuses. 
1212 01:10:45,150 --> 01:10:47,490 And what you're trying to say is because they 1213 01:10:47,490 --> 01:10:51,060 want to be seen to be environmentally? 1214 01:10:51,060 --> 01:10:51,810 No? 1215 01:10:51,810 --> 01:10:54,310 AUDIENCE: Yeah, but in a way that makes them look effortless 1216 01:10:54,310 --> 01:10:55,450 like they're doing it. 1217 01:10:55,450 --> 01:10:57,480 SUSAN SILBEY: Because it's just routine. 1218 01:10:57,480 --> 01:10:58,770 I've made a commitment. 1219 01:10:58,770 --> 01:11:01,690 I'm a good person. 1220 01:11:01,690 --> 01:11:03,650 Jessica, and then Charles. 1221 01:11:03,650 --> 01:11:05,650 AUDIENCE: It's got to be coupled with you go in, 1222 01:11:05,650 --> 01:11:11,075 and you see something that's got off the charts mileage, MPG 1223 01:11:11,075 --> 01:11:13,023 mileage, and the salesperson there is 1224 01:11:13,023 --> 01:11:14,440 going to tell you this is the best 1225 01:11:14,440 --> 01:11:15,648 investment you're going to make. 1226 01:11:15,648 --> 01:11:17,543 I think if they actually ran the numbers, 1227 01:11:17,543 --> 01:11:19,210 like you did in class, in front of them, 1228 01:11:19,210 --> 01:11:23,060 and they told them it's never going to break even, never 1229 01:11:23,060 --> 01:11:23,950 pay for itself. 1230 01:11:23,950 --> 01:11:29,830 I think they might not want to still be seen [INAUDIBLE].. 1231 01:11:29,830 --> 01:11:33,040 SUSAN SILBEY: I didn't think the argument was, about the Prius, 1232 01:11:33,040 --> 01:11:34,870 that it wouldn't be environmentally good. 1233 01:11:34,870 --> 01:11:38,990 They just never get the savings back. 1234 01:11:38,990 --> 01:11:41,740 But so you two are not disagreeing. 1235 01:11:41,740 --> 01:11:44,353 You're offering two different motivations. 1236 01:11:44,353 --> 01:11:47,020 AUDIENCE: I just think their own economical well-being 1237 01:11:47,020 --> 01:11:49,810 would overpower their want to be environmental. 
1238 01:11:49,810 --> 01:11:51,880 SUSAN SILBEY: Well, that's a hypothesis. 1239 01:11:51,880 --> 01:11:56,140 That's a hypothesis which is challenged by all the Priuses. 1240 01:11:56,140 --> 01:11:57,730 Oh, you're saying they don't know. 1241 01:11:57,730 --> 01:11:58,397 They don't know. 1242 01:11:58,397 --> 01:11:59,855 PROFESSOR: There's also a question, 1243 01:11:59,855 --> 01:12:02,110 there are hybrids available in other cars that 1244 01:12:02,110 --> 01:12:03,940 aren't distinctly styled. 1245 01:12:03,940 --> 01:12:06,850 You could buy hybrids and various other vehicles, 1246 01:12:06,850 --> 01:12:09,400 as opposed to the Prius, which is distinctive; 1247 01:12:09,400 --> 01:12:11,260 a hybrid Corolla looks like a Corolla. 1248 01:12:11,260 --> 01:12:14,030 You can buy hybrids in various Ford and Chevy models 1249 01:12:14,030 --> 01:12:16,990 that are indistinguishable from a distance. 1250 01:12:16,990 --> 01:12:20,680 SUSAN SILBEY: And that would not serve Max's reason, 1251 01:12:20,680 --> 01:12:23,990 because everybody has to know that you are-- 1252 01:12:23,990 --> 01:12:25,240 PROFESSOR: Not many people do. 1253 01:12:25,240 --> 01:12:26,657 You don't buy it for Max's reason, 1254 01:12:26,657 --> 01:12:28,830 because nobody can tell you made the purchase. 1255 01:12:28,830 --> 01:12:30,900 And the sales are less. 1256 01:12:30,900 --> 01:12:32,580 SUSAN SILBEY: Jacob and then Charlotte. 1257 01:12:32,580 --> 01:12:34,770 AUDIENCE: I would say there's kind of a threshold. 1258 01:12:34,770 --> 01:12:36,670 Once so many Priuses were sold, they 1259 01:12:36,670 --> 01:12:38,580 became a thing in advertising. 1260 01:12:38,580 --> 01:12:40,482 When someone thought, I need to buy a hybrid. 1261 01:12:40,482 --> 01:12:41,440 They thought of a Prius. 1262 01:12:41,440 --> 01:12:42,945 They didn't think of other hybrids. 1263 01:12:42,945 --> 01:12:44,070 SUSAN SILBEY: I'll confess. 
1264 01:12:44,070 --> 01:12:46,740 I didn't know, for a long time, there were others. 1265 01:12:46,740 --> 01:12:48,600 I just saw the Priuses. 1266 01:12:48,600 --> 01:12:51,690 So I was one of those people who don't recognize the ones that 1267 01:12:51,690 --> 01:12:55,620 are not styled differently. 1268 01:12:55,620 --> 01:12:56,630 That's right. 1269 01:12:56,630 --> 01:12:57,880 That's right. 1270 01:12:57,880 --> 01:12:59,255 AUDIENCE: I was just going to say 1271 01:12:59,255 --> 01:13:02,400 that I agree that I think that economical position comes 1272 01:13:02,400 --> 01:13:03,510 into a lot of part. 1273 01:13:03,510 --> 01:13:06,850 Because the way that Priuses are advertised, is they're advertised 1274 01:13:06,850 --> 01:13:08,650 that gas is really expensive. 1275 01:13:08,650 --> 01:13:10,930 And this is also environmentally friendly. 1276 01:13:10,930 --> 01:13:11,820 But it's a lot of-- 1277 01:13:11,820 --> 01:13:13,810 people think that they're going to save a lot of money. 1278 01:13:13,810 --> 01:13:15,000 And if you show them the numbers, they wouldn't. 1279 01:13:15,000 --> 01:13:17,160 But whenever I was getting a car, 1280 01:13:17,160 --> 01:13:19,192 I was considering buying a Prius, because it's 1281 01:13:19,192 --> 01:13:20,650 environmentally friendly, whatever. 1282 01:13:20,650 --> 01:13:23,130 But also because I live in Oklahoma. 1283 01:13:23,130 --> 01:13:24,040 So I drive a lot. 1284 01:13:24,040 --> 01:13:26,757 And I wanted to not have to pay a lot of money for gas. 1285 01:13:26,757 --> 01:13:27,840 And that was my reasoning. 1286 01:13:27,840 --> 01:13:30,663 And I would have never considered running the numbers. 1287 01:13:30,663 --> 01:13:32,580 SUSAN SILBEY: You just believe what they told you. 1288 01:13:32,580 --> 01:13:34,200 AUDIENCE: Yeah, and then also, just 1289 01:13:34,200 --> 01:13:36,283 because when you think of hybrid, 1290 01:13:36,283 --> 01:13:38,700 you think of Prius, because of their advertising strategy. 
1291 01:13:38,700 --> 01:13:41,040 It's not like I wasn't wanting to buy it 1292 01:13:41,040 --> 01:13:43,622 just because I thought it would make people think that I 1293 01:13:43,622 --> 01:13:44,830 was environmentally friendly. 1294 01:13:44,830 --> 01:13:47,310 It was because I thought, oh, maybe I wanted a hybrid car. 1295 01:13:47,310 --> 01:13:49,840 I should get a Prius, because that's the only one I know of. 1296 01:13:49,840 --> 01:13:52,920 SUSAN SILBEY: I want to get on Max's side for just a moment. 1297 01:13:52,920 --> 01:13:57,060 Because I think that people don't just 1298 01:13:57,060 --> 01:14:00,210 buy things because they're efficient, 1299 01:14:00,210 --> 01:14:03,430 or because they're a better economic deal. 1300 01:14:03,430 --> 01:14:07,050 You cannot explain the fashion industry. 1301 01:14:07,050 --> 01:14:12,270 You cannot explain celebrity. 1302 01:14:12,270 --> 01:14:15,180 It's not about economic efficiency. 1303 01:14:15,180 --> 01:14:19,410 I mean, you don't need to have the number of sweaters 1304 01:14:19,410 --> 01:14:20,580 and shoes. 1305 01:14:20,580 --> 01:14:23,085 They don't wear out. 1306 01:14:23,085 --> 01:14:26,640 You don't have to change the hems from long to short 1307 01:14:26,640 --> 01:14:28,650 for efficiency. 1308 01:14:28,650 --> 01:14:31,080 Why do people change their clothes, 1309 01:14:31,080 --> 01:14:33,300 change their hairstyles? 1310 01:14:33,300 --> 01:14:36,270 It's not because it's economically efficient. 1311 01:14:36,270 --> 01:14:37,875 There is something else. 1312 01:14:37,875 --> 01:14:42,060 PROFESSOR: Cadillac Escalades, they don't carry much, 1313 01:14:42,060 --> 01:14:46,470 their performance is lousy, terrible gas mileage. 1314 01:14:46,470 --> 01:14:48,960 SUSAN SILBEY: And why do people buy them? 1315 01:14:48,960 --> 01:14:51,240 Because 1316 01:14:51,240 --> 01:14:54,252 AUDIENCE: Ford F150 is still the best selling car in America. 
1317 01:14:54,252 --> 01:14:55,710 And people like trucks, people like 1318 01:14:55,710 --> 01:14:59,540 to be seen in high, big vehicles to make them look wealthy. 1319 01:14:59,540 --> 01:15:01,200 And they go outdoors-- 1320 01:15:01,200 --> 01:15:03,960 SUSAN SILBEY: It's called status and approval. 1321 01:15:03,960 --> 01:15:06,630 Sid, and then Andrew. 1322 01:15:06,630 --> 01:15:10,080 AUDIENCE: I think it has to do with everyone's personal utility, 1323 01:15:10,080 --> 01:15:12,260 [INAUDIBLE]. 1324 01:15:12,260 --> 01:15:14,010 SUSAN SILBEY: So where did that personal-- 1325 01:15:14,010 --> 01:15:16,440 let's get to the basis of this lecture. 1326 01:15:16,440 --> 01:15:19,860 Where did that personal utility come from? 1327 01:15:19,860 --> 01:15:23,602 AUDIENCE: I mean, I'm sure it has to do with group theory, 1328 01:15:23,602 --> 01:15:25,560 and which group you want to be associated with. 1329 01:15:25,560 --> 01:15:28,500 But there are different groups. 1330 01:15:28,500 --> 01:15:30,570 And different people might buy different things 1331 01:15:30,570 --> 01:15:31,810 for different reasons. 1332 01:15:31,810 --> 01:15:36,290 SUSAN SILBEY: And do we choose all the groups we belong to? 1333 01:15:36,290 --> 01:15:37,400 Come on. 1334 01:15:37,400 --> 01:15:39,560 I mean, no. 1335 01:15:39,560 --> 01:15:41,750 You can't say, just look around this room, 1336 01:15:41,750 --> 01:15:45,710 you can't say we all choose the places, the groups we're 1337 01:15:45,710 --> 01:15:47,390 identified with. 1338 01:15:47,390 --> 01:15:48,710 Come on. 1339 01:15:48,710 --> 01:15:52,580 We wouldn't have the wars we've been having if people all 1340 01:15:52,580 --> 01:15:55,760 chose their group identities. 1341 01:15:55,760 --> 01:15:56,960 Think about it. 1342 01:15:56,960 --> 01:15:59,750 Think what's driving the world. 1343 01:15:59,750 --> 01:16:02,090 And the price of oil. 
1344 01:16:02,090 --> 01:16:08,150 Group identities, not rational decision making, folks. 1345 01:16:08,150 --> 01:16:11,030 It's group identities. 1346 01:16:11,030 --> 01:16:15,320 Habitual, historic, group identities 1347 01:16:15,320 --> 01:16:19,400 have raised the price of oil this week. 1348 01:16:19,400 --> 01:16:22,970 Not rational decision making. 1349 01:16:22,970 --> 01:16:25,820 The stock market went down yesterday, 1350 01:16:25,820 --> 01:16:27,500 and what did the analysts say? 1351 01:16:27,500 --> 01:16:30,650 Fear of oil reserves. 1352 01:16:30,650 --> 01:16:33,050 The oil reserves did not change yesterday. 1353 01:16:36,110 --> 01:16:38,690 I am sorry to push back at you. 1354 01:16:38,690 --> 01:16:40,370 Make the rest of the argument. 1355 01:16:40,370 --> 01:16:41,420 Part of it's right. 1356 01:16:41,420 --> 01:16:46,860 But those utilities come from someplace. 1357 01:16:46,860 --> 01:16:48,900 AUDIENCE: Yes, but what I was trying to say 1358 01:16:48,900 --> 01:16:54,610 is that I think everyone can reason 1359 01:16:54,610 --> 01:16:56,080 as to why they make a decision. 1360 01:16:56,080 --> 01:16:57,520 And it's like very-- 1361 01:16:57,520 --> 01:17:00,100 yes, everyone's utility is affected 1362 01:17:00,100 --> 01:17:01,840 by the groups around them. 1363 01:17:01,840 --> 01:17:04,750 But all I'm trying to say is someone 1364 01:17:04,750 --> 01:17:08,380 will buy that Prius to make a fashion statement. 1365 01:17:08,380 --> 01:17:11,260 But the utility they get out of making that fashion statement 1366 01:17:11,260 --> 01:17:14,833 is worth more for them than calculating 1367 01:17:14,833 --> 01:17:17,000 the savings they're going to make and knowing that-- 1368 01:17:17,000 --> 01:17:18,375 SUSAN SILBEY: So you want to do-- 1369 01:17:18,375 --> 01:17:21,250 OK, you want to do a cost benefit analysis 1370 01:17:21,250 --> 01:17:27,160 and add to the benefit status rewards and emotional rewards. 
1371 01:17:27,160 --> 01:17:29,860 Fine, we're in total, heated agreement. 1372 01:17:29,860 --> 01:17:37,120 OK, however, not everybody can think it through. 1373 01:17:37,120 --> 01:17:41,200 There are-- you have to have the categories to think with. 1374 01:17:41,200 --> 01:17:42,760 That's where culture comes from. 1375 01:17:42,760 --> 01:17:45,145 There are cultures-- and I don't want you to forget this-- 1376 01:17:45,145 --> 01:17:48,910 that do not have certain categories in them. 1377 01:17:48,910 --> 01:17:50,500 They just don't have them. 1378 01:17:50,500 --> 01:17:53,500 You're sitting in the institution that 1379 01:17:53,500 --> 01:17:56,170 makes categories to think with. 1380 01:17:56,170 --> 01:17:58,150 But not everybody has them. 1381 01:17:58,150 --> 01:18:02,800 So we have to deal with the fact that we live in an integrated 1382 01:18:02,800 --> 01:18:04,660 world, interconnected. 1383 01:18:04,660 --> 01:18:09,430 And we do not get to choose all the time. 1384 01:18:09,430 --> 01:18:11,642 So Andrew had his hand up. 1385 01:18:11,642 --> 01:18:12,830 AUDIENCE: [INAUDIBLE]. 1386 01:18:44,790 --> 01:18:47,310 SUSAN SILBEY: Save the dollar. 1387 01:18:47,310 --> 01:18:49,920 Saving the dollar is a good idea. 1388 01:18:49,920 --> 01:18:51,050 AUDIENCE: [INAUDIBLE]. 1389 01:18:57,900 --> 01:18:59,490 SUSAN SILBEY: To care about that. 1390 01:18:59,490 --> 01:19:00,570 AUDIENCE: [INAUDIBLE]. 1391 01:19:02,970 --> 01:19:05,500 SUSAN SILBEY: To care about little bits of money. 1392 01:19:05,500 --> 01:19:06,570 AUDIENCE: [INAUDIBLE]. 1393 01:19:10,800 --> 01:19:16,020 SUSAN SILBEY: Well, this has been observed 1394 01:19:16,020 --> 01:19:18,540 about American society. 
1395 01:19:18,540 --> 01:19:20,310 If you want, I'll give you the source. 1396 01:19:20,310 --> 01:19:24,000 Thorstein Veblen was an observer at the beginning 1397 01:19:24,000 --> 01:19:26,490 of the 20th century and described 1398 01:19:26,490 --> 01:19:30,580 Americans' preoccupation with these kinds of things. 1399 01:19:30,580 --> 01:19:33,140 OK. 1400 01:19:33,140 --> 01:19:35,780 PROFESSOR: [INAUDIBLE]. 1401 01:19:35,780 --> 01:19:37,470 Dallas, Texas, you would not do that. 1402 01:19:37,470 --> 01:19:41,170 Houston, Texas, you would not do that. 1403 01:19:41,170 --> 01:19:43,150 SUSAN SILBEY: Well, see you on-- 1404 01:19:43,150 --> 01:19:43,700 next week. 1405 01:19:43,700 --> 01:19:45,480 Thank you.