[SQUEAKING]
[RUSTLING]
[CLICKING]

NANCY KANWISHER: All right. So I'm going to finish up some of the things that I talked about with experimental design last time. And then we're going to get on and talk about category-selective regions in the cortex, which, of course, we've been talking about in various ways all along. But I'll raise some general controversies about that, some alternative views from the kind of one I've been foisting on you, and what I consider to be some of the strongest, most important evidence against the view that I've been putting forth here. And then we'll talk about decoding signals from brains. That's the agenda. Here we go.

So last time, I had you guys work in groups to think about experimental design because, really, most decisions about experimental design, once you know the bare basics of the measurement methods, are just applying common sense: thinking about what it's like for the subject, and how you're going to get the data you need.

So in terms of what exact conditions to run in any experiment, I talked about the idea of a minimal pair, this kind of Platonic ideal of the perfect contrast, which never exists in reality but which you aspire toward. Ideally, you want two conditions that are identical except for the one little thing that you're interested in. And you don't want other things to co-vary with the thing you're manipulating, other than the thing you're interested in. That's the crux of the matter in experimental design.

You guys talked about what kind of tasks to have subjects do in the scanner. There's a trade-off between doing the most natural thing, which is that they're just lying there and stimuli come in-- visual, auditory, whatever-- versus the fact that subjects might fall asleep if they have nothing to do. And if they fall asleep, you won't know. And that's not good. So it's sometimes better to have a task to keep them awake and to tell you that they're awake.

Key important point: don't have one task for one stimulus condition and a different task for a different stimulus condition. If you did, that would be a--

AUDIENCE: Confounder.

NANCY KANWISHER: Sorry?

AUDIENCE: Confounder.

NANCY KANWISHER: Confound. Exactly. That would be a confound. Don't do that.

We talked about baseline conditions. So for example, in a vision experiment, staring at a dot or a cross is about as far as you can go in turning off your visual system. Why would you bother with that? Well, it's sometimes useful to have that kind of baseline, because we sometimes want to look at more than a difference between two conditions. Remember, one condition alone in MRI tells you not a damn thing. All we can see is differences. But even just two conditions showing you a difference-- that's something, but it can be ambiguous.

So for example, suppose some brain region gives a response to the red condition here and the green condition there. They're just two numbers-- that's all you have-- and they differ. That's kind of meh. There's a difference, but it's meh. But if you have a good baseline and you really know that zero is zero, or as close to zero as you can get-- now imagine zero was here. That would be like, wow, a really strong effect. Especially in neuroscience, where we care a lot, as you may have noticed, about selectivity-- about how much more of a response we get in one condition than another. And selectivities are usually more interesting as a ratio than as a difference, as I'm illustrating here. And you can't get a ratio unless you have a third condition, usually a baseline.
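[A minimal numerical sketch, in Python, of the ratio point above. The responses and the hidden offset are invented for illustration:]

# Made-up responses in arbitrary scanner units. Assume the raw numbers
# carry an unknown offset of 0.8 on top of the true responses.
face_true, obj_true = 1.2, 0.9
offset = 0.8
face_raw, obj_raw, fix_raw = face_true + offset, obj_true + offset, offset

# The difference survives the offset...
print(face_raw - obj_raw)                          # ~0.3, same as face_true - obj_true

# ...but the ratio does not, unless a baseline is subtracted first.
print(face_raw / obj_raw)                          # ~1.18, misleading
print((face_raw - fix_raw) / (obj_raw - fix_raw))  # ~1.33, the real selectivity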
All right. A few other things. We talked about how you allocate subjects to conditions. You could have one half of your subjects do the face condition for an hour in the scanner and the other half do the object condition for an hour in the scanner. That's no good. We don't want to do that. We want a within-subjects design-- all the conditions within a subject, whenever we can do that.

Why? Well, my best analogy is this: suppose we decided to grade your assignments as follows. A third of the class is going to be graded only by Heather. This third of the class is going to be graded only by Dana. Across the whole semester, you guys are Heather people, you guys are Dana people, you guys are Anya people. Is that fair? No, that's dumb. What if Heather's a hard-ass? And she is kind of a hard-ass-- not that you guys aren't. They're all a pretty tough crew there. I stand here just waiting for the gong to go, "wrong." And you guys should do that. I'm sure I've already said wrong things and you knew it. So next time, sound the gong and correct me.

Anyway, that wouldn't be fair in grading exams, and neither is it good in experimental design. So for all the same reasons-- and you can hopefully get the intuition here-- you want to have all the conditions within a person, because maybe one person's brain just activates more than another person's brain. Maybe this person had more coffee. Coffee increases your BOLD response. We give away free chocolate espresso beans before scans in my lab to increase the MRI response. All of that: do designs within subjects whenever possible.

How do you allocate conditions to runs-- those subsets of a whole hour-long experiment where you scan people for maybe five minutes at a time, give them a break, then another five minutes? Well, the same logic applies. Imagine you're in a scanner for an hour. You're getting sleepy. You're getting bored. You're thinking about other things. You're kind of not on the ball. Those things change over slow periods of time. And so you want to get all the conditions together within a run, just as you want to get conditions together within a subject whenever possible.

And then-- we didn't really get into this, though I think you did in your groups-- how do we stick all these conditions together within a run? Do we clump them together in a batch, or do we interleave them? I think most of you realize that there's a deep set of trade-offs there. So here's what's sometimes called a block design, where you clump a whole bunch of trials of one condition, then a whole bunch of trials of another, with, in this case, some kind of baseline in between-- versus a mixed, interleaved design, which is called event-related for uninteresting historical reasons. And if it's event-related, you can have it slow or fast.

So what are the reasons to do this rather than that? Many of you came up with this last time.

AUDIENCE: Biases.

NANCY KANWISHER: Sorry?

AUDIENCE: Minimize the biases.

NANCY KANWISHER: Yeah. What kind of biases?

AUDIENCE: In a blocked experiment, they might be biased by what they're looking for.

NANCY KANWISHER: Yeah, all kinds of biases. Consider this trial here in the yellow condition. Well, you just did a bunch of yellow trials. So maybe your yellow system is adapted out, or biased somehow. But you also know that the next one's going to be yellow. So there's all that previous stuff and anticipatory stuff, all on top of the actual effect of a single yellow trial. Yeah-- was that what you were going to say, as well?

AUDIENCE: I was going to say that you also have effects from previous trials.

NANCY KANWISHER: Yeah, all of those things: the effects of recent history doing the same thing, and anticipation of the future, all on top of what actually happens in this trial.
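[A sketch of the two orderings in Python, with hypothetical trial counts and condition labels:]

import random

random.seed(0)

# Block design: runs of one condition with baseline rests in between.
blocked = []
for cond in ["face", "house", "face", "house"]:
    blocked += [cond] * 8 + ["fixation"] * 4   # 8 trials, then rest

# Event-related design: the same trials, randomly interleaved.
event_related = ["face"] * 16 + ["house"] * 16
random.shuffle(event_related)

print(blocked[:12])
print(event_related[:12])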
So those are not deal killers, but they're things to be aware of. Those are reasons why you might want to go with this design or that design. Why wouldn't you always do this-- randomize the order of conditions over time and bunch them in together? Why is that not always a great idea? People do it sometimes. It's not a terrible idea. But there are things to keep in mind here. What's the challenge with that? Yeah.

AUDIENCE: One possible challenge is that the BOLD response has a 10-second window. So it's scraggly.

NANCY KANWISHER: Exactly. So the BOLD responses here are going to be massively on top of each other. That's why people sometimes do this: OK, we'll have a random order, and we'll put a big chunk of time in between. But if you have to stick 10 seconds in between trials, your subject is going to fall asleep, and you're spending all that expensive scan time not collecting enough trials. So none of these is right or wrong. They're right or wrong in different circumstances.

So as Eke-- am I saying it right?-- as Eke mentioned, the challenge here-- let me just give you my crude depiction of this. Suppose this is time, a series of trials with a house, a dot, a face, a dot, a dot-- I don't know where that dot went-- a face, a house, et cetera. And each of those trials is one second long. Well, let's imagine the response in the fusiform face area to that first house. You get some kind of middling low response that's going to take many seconds to peak. Let's look at the response to this face. Well, it's going to be higher, and it's going to peak out there. And so you can look at the response to each of these things, and you get this whole series of BOLD responses from each of those different trials.

But now, here's the problem. What we observe when we measure the response of a little voxel-- a little three-dimensional pixel in the brain-- is the sum of all of that, something like this. It'll be higher than that, but some big blurry sum of all that. So now we want to go backwards from observing this to seeing the difference between that and that. And that's a problem. So that's not great.

But here's the crazy thing. It's not impossible. It's not impossible because, by weird, mysterious, to me still kind of unfathomable physiological mechanisms, these things add up approximately linearly. It's really counterintuitive. Who would think a big sloppy biological system with many different causal steps could produce something that is approximately linear? But it does. And because they add up linearly, if you have enough trials, you can take this thing and recover that and that. We're not going to go through the math of it. It's basically addition-- solving a system of equations, because you have all these different time points.

Did everybody get the gist of the idea that even if you're observing something really slowly varying and weakly varying, because it's massively blurred, you could, in principle, with enough trials, go backwards and solve for that and that? Everybody get that idea?

So what that means is that it's a bit of an uphill battle to do a fast event-related thing. You can't just look at the response. You have to actually do a lot of math. And you may or may not have enough trials to pull it out. But under some circumstances where you really need things to be interleaved, you can pull that off.

All right. So that's what I just said. Blah, blah, blah.
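[The "go backwards and solve for it" step is, in standard fMRI analyses, linear regression. Below is a minimal simulation in Python-- the response shape, onsets, and amplitudes are invented, not the lab's actual pipeline. Recovery works only because overlapping BOLD responses sum approximately linearly:]

import numpy as np
from scipy.stats import gamma

rng = np.random.default_rng(0)
n = 200  # time points, one per second

# A rough double-gamma hemodynamic response shape (a common approximation).
t = np.arange(0, 25)
hrf = gamma.pdf(t, 6) - 0.35 * gamma.pdf(t, 12)

# Randomly interleaved, closely spaced onsets for the two conditions.
times = rng.permutation(np.arange(0, n - 25, 3))
face_onsets, house_onsets = times[:20], times[20:40]

def regressor(onsets):
    """Predicted BOLD time course for unit-amplitude events at the onsets."""
    stick = np.zeros(n)
    stick[onsets] = 1.0
    return np.convolve(stick, hrf)[:n]

X = np.column_stack([regressor(face_onsets),
                     regressor(house_onsets),
                     np.ones(n)])              # constant baseline term

# Simulate an FFA-like voxel: big face response, small house response.
true_betas = np.array([2.0, 0.5, 0.0])
y = X @ true_betas + 0.3 * rng.standard_normal(n)  # the blurry observed sum

# "Going backwards": solve the linear system for per-condition amplitudes.
betas, *_ = np.linalg.lstsq(X, y, rcond=None)
print(betas[:2])  # close to [2.0, 0.5] despite the overlap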
A few other design things that I didn't really talk about in detail. One I've mentioned glancingly, but I want to be more explicit about it: this whole idea, which we've talked about a few times, of defining a region of the brain that we're going to look at with a localizer scan with functional MRI. We talked about that in the case of characterizing face areas: go run a face-versus-object scan, find the face area in that subject, and then do new experiments and testing. Or when you guys proposed your snake experiments, you said, first localize a candidate snake-specific region with snakes versus non-snakes, and then do repeated tests in that region that you found in each subject.

Why do we have to do all that within each subject? You don't technically have to. Lots of people don't. But the reason I think it's important-- and the reason we do it in my lab, and all of my intellectual descendants do it, and lots of other people do, too-- is that the region is not in exactly the same place in each subject.

So I have a dopey analogy. Brains are physically different from one person to the next. If we scan you guys just anatomically and look at the structure of your brains, your brains are as different from each other as your faces are. That is, you all have the same basic structure-- the same major lobes and sulci, just as you all have eyes and a nose and a mouth-- but they're in slightly different positions. And that's just the anatomy. The function on top of that is even more variable.

So it's like trying to align faces. If you have a bunch of photographs of faces and you try to align them on top of each other and superimpose them, even if you allow a few degrees of stretch, you can't do it perfectly. You'll get some kind of mess like this. They're just different, so they don't perfectly superimpose. Well, it's the same deal with brains. You try to align them perfectly from one person to the next, but they're physically different. They do not perfectly superimpose.

So now imagine-- this is a totally crazy analogy, but it's the best I could come up with. Suppose you're a dermatologist, and you're interested in skin cancers that arise in the upper lip. Well, it could happen-- there's more sunlight hitting the upper lip, whatever. And you're studying photographs to try to see how many people have it, or something like that. So you could take a whole bunch of photographs and just say, OK, I'm going to look right there. It's usually going to be the upper lip. But it's not always going to be the upper lip. And so you're really throwing away a lot of information by choosing the wrong location. For this person down here, you missed it. You're looking at the wrong thing.

So in the same way, if you want to study that region, you've got to find it on each individual photograph. And similarly, if you want to study the fusiform face area or the snake area-- which doesn't exist, but whatever-- you've got to go find that thing in each person individually. Otherwise, you're really blurring your data, just as those data are blurred there. Make sense? All right. Good.
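[In code, the subject-specific localizer logic amounts to something like this sketch in Python; the arrays are random stand-ins and the threshold is an arbitrary assumption:]

import numpy as np

rng = np.random.default_rng(1)
n_vox = 5000

# Localizer run: each voxel's faces-minus-objects contrast in THIS subject.
loc_contrast = rng.standard_normal(n_vox)

# Define the subject's own ROI from the localizer, not from group coordinates.
roi = loc_contrast > 2.5

# Independent test runs: each voxel's mean response to new conditions.
resp_a = rng.standard_normal(n_vox)
resp_b = rng.standard_normal(n_vox)

# All later hypothesis tests look only inside this subject's ROI. Using
# separate data to define and to test the ROI avoids circularity.
print(roi.sum(), "voxels in ROI")
print(resp_a[roi].mean(), resp_b[roi].mean())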
Different topic about design-- these are just kind of different topics; I couldn't find a good segue. So far, we have been talking about the most rudimentary, simplest possible experimental design. That means two conditions-- faces and objects, snakes and non-snakes, moving or stationary, whatever-- two conditions where you contrast them and look in the brain: is there a higher response to A than to B? Nothing wrong with that. You can get pretty far with this.

But first of all, of course, we can have more than two conditions. So you can have one factor-- in this case, stimulus category-- with many different conditions: faces, bodies, objects, scenes, whatever. That's not rocket science. We've just added a few more conditions of the same factor. Here, a factor is the dimension you're varying. In this case, it's stimulus type.

But we could get fancy. We could have four conditions that are two factors varied orthogonally, like this. This is sometimes called a 2 by 2 design. We're going to vary one thing on this axis and another thing on that axis. Why would we want to do that?

Well, let's look at an example. Suppose that you were going to compare faces to objects-- in this case, chairs. But beyond just those two conditions, comparing the response in the brain when people are looking at faces versus objects, we could now ask: does the response in the brain to faces and objects depend on whether you're paying attention to the faces and objects? What if you're paying attention to something else? What if we have little colored letters right in the middle of the display, and they're changing rapidly over time, and your task is to monitor for a repetition of a letter-- a one-back task? And it's going really fast, so it's very demanding. You're just looking at those letters. They're flashing up. Oh, two B's-- you hit a button. It's very demanding. The information hitting your retina is still coming in from the face, because the little letter is tiny. It's not hiding much of the face.

What do you think? If you're doing the letter task, do you think you'll still get a response in the fusiform face area when the face comes up? And will it be higher than when the chairs come up? Any intuitions? Yes? Talk to me about that.

AUDIENCE: It should, because the signal still comes in and hits the retina. There'll be some processing of it.

NANCY KANWISHER: It's still coming in. Yeah. Will it be just as high? What do you think?

AUDIENCE: No, I don't think it'd be as high. But I think there'd be some response that's higher than that for chairs.

NANCY KANWISHER: Does everybody see how this is kind of an interesting question? The machinery is the same. All the feed-forward stuff is the same. When I tell you, now you're doing the letter task, now you're doing the face task-- when you switch to the letter task, the wiring in your brain doesn't change. All the same wiring is there. The stimulus is still hitting your retina. It's still going up the system. So it becomes interesting to ask: how could it be different? Would it be different? All right. I just wanted you all in the grip of this as a question that we might ask.

So how could we ask this question? Well, as I just said, we can have subjects in one case do their standard object task: look for consecutive repetitions of a face or of a chair. We can have all different kinds of chairs, but every once in a while, two in a row are the same. Or we could have this other task, where they're monitoring for letter repetitions.

So does everybody get this 2 by 2 design? On one factor, we're varying the stimulus. Is it faces or objects? Faces or objects-- those are the two conditions. That's just terminology. And on the other factor, we're varying task. Are you doing the face-object task or the letter task? Yeah. Ben, is it?

AUDIENCE: So what is it that this task-- what conclusions does it allow you to draw that the simpler task won't?

NANCY KANWISHER: Good question. Good question. Anybody have an intuition here? You mean other than just doing that, never mind the letters?

AUDIENCE: Right.

NANCY KANWISHER: Yes, exactly. Exactly the right question. You guys, what do you think? Is there any reason to do this? Does anybody care about this? What would it tell us? Yeah. I forget your name.

LAUREN: Lauren.

NANCY KANWISHER: Lauren. Yeah.

LAUREN: The effect of attention on perception.

NANCY KANWISHER: Yeah. So if we want to know not just whether there is some bit that responds more to faces than objects-- we've been doing that for weeks. Enough already. We know there is. Now we want to know: does it matter what you're paying attention to? Is that thing just a little machine that's going to do its thing no matter what? Or do you, the perceiver, have any control over it?

Here's another version of that question. You guys can all sit there and look bright-eyed and bushy-tailed and look at me and smile and nod, and think about whatever you want to think about. And I won't know. You could be bored out of your mind, thinking about what you did last night, whatever. And I won't know. And that's great. Isn't that nice, that we human beings are not trapped by the stimulus that's in front of us at any moment? Instead, we can control our mental processes to some degree. And if you choose to think about something else, go for it. You have good judgment. That is fine. It happens to me all the time. You have that ability. I have that ability-- not really when I'm lecturing. I kind of have to stay on task. That's why it's exhausting. But anyway, we are not trapped. We are not completely controlled by the sensory world impinging on us. And that's a good thing.

And so if you wanted to find out how that works, and study how well we can control our own mental processes, you would do something like this. Make sense?

All right. So this design enables us to ask a whole bunch of things. One: does the response in some region or voxel, or wherever we're looking, depend on stimulus category? This is what we've been talking about for a couple of weeks now. To do that, you could just ask: is there an overall higher response to these two conditions than to those two conditions? You wouldn't worry about task. You'd just ask, overall, is there a bit that likes faces more than objects? Everybody got that? That's one thing. That's sort of what we've been doing so far-- just comparing two levels of one factor. That's called a main effect-- in this case, a main effect of the factor stimulus type.

Or we could ask a different question: does the response of a region of the brain depend on attention? So overall-- never mind whether it's faces or objects; there are photographs flashing up there-- does it matter if you're paying attention to those photographs or paying attention to something else? For that, we compare the average of these two versus the average of those two. That would be a main effect of task. Make sense? It's just terminology. But it's important to see that we can ask these different questions of a 2 by 2 design. Everybody with me? Anybody want to ask me something? This main effect isn't very interesting-- it's kind of a weird one, but you could do it.

So that's the main effect of attention, or task. Now we could ask, as someone else said a moment ago-- was that you, Lauren? Yes-- does the effect of stimulus category depend on attention? That's what a 2 by 2 design is for-- it's that kind of question that a 2 by 2 design enables you to ask. To ask that question, what we would do, essentially, is look at this row and then that row, and then compare them. So we might ask how much higher a response you get for faces than objects when you're paying attention to them, and we get some number in that cell. And how much higher when you're not paying attention to them-- when you're paying attention to the letters? And then we could say, oh, how selective is the face response when it's attended versus unattended? In other words, how does the response to stimuli depend on task?

It's not rocket science. But it's important to see how this humble little 2 by 2 enables you to ask these very different questions. This question-- how the effect of one factor depends on the level you're at with the other factor-- is called an interaction. And it's often the most interesting kind of question to ask of any kind of data, whether it's MRI or anything else. You can think of it as a difference of differences or, more directly, as how the effect of one factor depends on the level of the other factor. In this case, the terminology would be that we're looking at an interaction of stimulus category by task. Make sense? Everybody with the program about how this question is different from the two main effect questions?
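[A sketch of the arithmetic in Python; the cell means are invented percent-signal-change values for the hypothetical FFA experiment above:]

import numpy as np

#                   faces  objects
# attended           2.0     0.8
# unattended         1.0     0.7
cells = np.array([[2.0, 0.8],
                  [1.0, 0.7]])   # rows: task; columns: stimulus

main_stimulus = cells[:, 0].mean() - cells[:, 1].mean()  # faces vs objects, ~0.75
main_task = cells[0].mean() - cells[1].mean()            # attended vs not, ~0.55

# Interaction: a difference of differences. Does the face advantage
# shrink when attention is directed at the letters instead?
interaction = (cells[0, 0] - cells[0, 1]) - (cells[1, 0] - cells[1, 1])  # ~0.9

print(main_stimulus, main_task, interaction)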
To get some practice with this, I'm going to have you guys come up here and draw some data. Just to get experience with main effects and interactions, we're going to consider a main effect of factor X, which is an overall effect of X-- the difference between condition one and condition two within X. And we're going to consider interactions of factor X and factor Y-- that is, how the effect of X depends on Y, and vice versa.

So I'm going to have you guys draw data. I need my first volunteer. This is not hard. How do I put this thing up? I forgot to check if I have red and black pens. Hopefully, I do. If you don't volunteer, I'm going to pick randomly. And that could be worse. It's not too awful. Is it Carrie? Unfortunately, I remember your name. So come on up here.

So you got an easy one. This doesn't write very well, but it will do. This is your red pen. That's your black pen. So we have here the response-- in red, or orange, will be the attended case. We're looking at a response in the fusiform face area, a possible response-- in this case, a pretty unlikely one, but never mind. So there's the attended one, and there's an unattended one. And there's a response to objects and faces. And what I want you to draw is a pattern of data in which there's no main effect of stimulus type, no main effect of attention, and no interaction of stimulus type by attention. So you're going to draw four dots. You can do X's and O's, or whatever.

CARRIE: So no effect of the stimulus-- so that means it doesn't matter if it was a face or an object.

NANCY KANWISHER: Uh-huh. Exactly.

CARRIE: So I guess--

NANCY KANWISHER: You could do that for one. Do that for the attended task first. Do that in red. Yeah.

CARRIE: So about midway.

NANCY KANWISHER: You have to really lean on it. Oh, it worked for me. Sorry. We'll have the extremely counterintuitive thing of-- this is attention. There we go. Here you go. Perfect-- no main effect of stimulus type. Good. Now, we've got no main effect of attention. So take the blue pen.

CARRIE: So this is like--

NANCY KANWISHER: And no interaction--

CARRIE: No effect of attention relative to attention?

NANCY KANWISHER: That's right. No main effect of attention means no difference for attended and unattended.

CARRIE: OK. But stimulus type is important now.

NANCY KANWISHER: No, no, no. We're still-- this is all the same. We're drawing all the same situation. Yeah, exactly. It's a little bit of a-- there you go. Beautiful. Nicely done, Carrie. So that's kind of a dopey case. Well done. You can sit down.

CARRIE: Hopefully, I seemed right.

NANCY KANWISHER: So we're just starting basic here. That's what it looks like if you have no main effects and no interactions. Everything's the same. All right. That's not going to happen if you're in the fusiform face area. If you get that, there's something wrong with your scan, or something went way wrong. But we're just fleshing out the logical possibilities.

I need the next volunteer. Who's going to do a main effect of stimulus type, no main effect of attention, and no interaction of stimulus type by attention? Yes. Come on up here. Is it-- what's your name?

AKWILE: Akwile.

NANCY KANWISHER: Sorry?

AKWILE: Akwile.

NANCY KANWISHER: Yes. Right. Great. So go ahead and draw that for me. I'm just going to clarify that this-- what did we do? This is unattended-- wait a minute. Yeah, unattended here. Here you go.

AKWILE: Oh, static. So you start with the--

NANCY KANWISHER: There's a main effect of stimulus type. It's probably easiest if you start-- yeah, start with attended. There's a main effect of stimulus type. Great. You're in the FFA. The faces are going to be higher than the objects. And a main effect of stimulus type says you're going to get a difference. Good.

AKWILE: Is that good?

NANCY KANWISHER: Beautiful. Well done. Make sense to everyone? Thank you, Akwile. Does that make sense, everyone? So what would this mean if you got this? OK, Akwile, you're not quite done. So you get that. What's that telling you?

AKWILE: It tells you that it responds to the stimulus, but the attention doesn't make any difference to the energy.

NANCY KANWISHER: Yeah. The selectivity you get doesn't depend on attention in this case. Again, we're just making up data. We're just considering the different ways the data could come out and what they would tell us. Everybody got that?

Now the plot is going to thicken a little bit. Now we're going to have a main effect of stimulus and a main effect of attention, and no interaction of stimulus by attention. Let's go ahead up here. Is it Talia? Come on up here. Here. Mm-hmm. Beautiful. Thank you.

Everybody see how this is a main effect of stimulus type? Faces are higher than objects. A main effect of attention: attended is higher than unattended. But no interaction: the effect of stimulus type is the same at each level. Yeah? Ben.

BEN: Just something that maybe was unclear for me-- does attention usually affect the selectivity or the average response?

NANCY KANWISHER: These are great questions. Right now, we're just considering the logical possibilities. We'll talk about that later. Yeah, it's a good-- you should be wondering. Yeah. So Talia, tell us. If you found that, what would that mean?

TALIA: So because the difference in effect between attended and unattended objects and faces is the same, that does show that attention plays an effect and the stimulus plays an effect. But there's no interaction between them, because the difference is the same.

NANCY KANWISHER: It's like there are these two different things. There's face selectivity. And then there's just a big overall effect: if you're looking at the stuff, you get higher responses than if you're looking at the letters. Yeah, exactly.

All right. One more-- I need a volunteer. David. That's not a volunteer, I realize. It's different than a volunteer. So draw me a case where you have a main effect of stimulus, a main effect of attention, and an interaction of stimulus by attention.

DAVID: This one. And then it goes like this.

NANCY KANWISHER: Yeah. Beautiful. So here, we have-- oh, wait. Actually, hang on. Hang on. Hang on. Wait a second. You got a main effect of stimulus-- actually, you don't have a main effect of stimulus here.

DAVID: Did I get rid of that?

NANCY KANWISHER: Yeah, you got rid of that.

DAVID: I'll come back.

NANCY KANWISHER: Ah. Now you have a main effect of attention.

DAVID: Wait.

NANCY KANWISHER: Wait. Oh, maybe I said it wrong. You didn't have-- OK. Wait a second.

DAVID: Oh, yeah. You're right.

AUDIENCE: No, he has a main effect.

NANCY KANWISHER: I think I screwed you up here. We want a main effect of stimulus. Yeah.

DAVID: Yeah. And then let's just-- we move it a little bit, like that. And we get--

NANCY KANWISHER: Exactly. So don't go away. Does everybody see how this is a main effect of stimulus? Those guys are higher than those guys. A main effect of attention: the green guys are higher than the blue guys. But an interaction: that difference is bigger than that difference. Now, don't go away, David. If you got that, what would you conclude about the fusiform face area if you got those data?

DAVID: Well, the FFA, if it was like this-- not only does it depend on attention, but it kind of depends on attention more than-- maybe the object response doesn't depend on attention so much.

NANCY KANWISHER: That's right. What your data show is that the response to faces is more strongly affected by attention than the response to objects. But another way of saying the same thing is that the selectivity is greater when you're attending than when you're not attending. Make sense? Or the differential response is greater. Great. Thank you.

Everybody got these basic ideas? They're pretty rudimentary. I don't want to insult your intelligence. But I've really found that people often don't get main effects and interactions. And often, the crux of an interesting design is an interaction, and keeping it straight from the main effects sometimes takes a little doing. So let's consider: what is the key sign of an interaction? Well, people often draw an interaction where the lines cross. But they don't need to cross. David just showed you a nice interaction where the lines don't cross.

All right. Moving on-- that was all leftovers. That's bad planning. Oh, sorry. What? Oh, yes, put the thing up. Good point-- or down. Thank you, Chris.

Let's talk about category-selective regions of the visual cortex. We have been talking about these all along, but it's time to get a little more critical. So first, I've been talking about how there's a patch in there that responds pretty selectively to faces. There's a patch out there on the lateral surface that responds pretty selectively to bodies. And we haven't mentioned it much, but next week you'll hear more than you want to hear about a patch smack in the middle there that responds selectively to images of scenes. So you just look at that, and it's really damn near impossible not to wonder what else is lurking in there. What else is in there?

And of course, we wondered that many years ago, me and Paul Downing, who did the body area paper. He was my postdoc at the time. And we said, well, let's just scan people looking at 20 different categories of objects. And we put all kinds of silly stuff in there. I'm phobic about snakes, so I wanted snakes. He's phobic about spiders. We compromised in our creepies condition-- threw them both in there. It was kind of sloppy. But we had food and plants, because we figured those are biologically important. We had weapons, because those are-- and tools, because those are important in other ways. We had flowers, because Steve Pinker has this line in one of his books saying that "a flower is a veritable microfiche of biologically relevant information." And he hypothesized, based on that, that people might have special-purpose neural machinery for flowers. It sounded like a crock to me, but it's an empirical question. So we threw flowers in there for Steve Pinker.

And so then we scanned people looking at all of these things. And we replicated, in every subject, the existence of selective regions for places, faces, and bodies. And we didn't find anything else. None of these other categories produced clear, whopping selectivities in systematic regions of the kind that you see in every subject for faces, places, and bodies.

Now, I hasten to say that there are lots of ways, with any method, to not see something that's actually there. You might not have enough statistical power to see it. It might be that there's a whole bunch of neurons that do that, but they're scattered all over the brain, spatially interleaved with neurons that do other things, in which case MRI will never see it. There are big black holes in MRI images where there are artifacts, and you can't see anything.
935 00:36:11,922 --> 00:36:13,380 And if the soul was right there, we 936 00:36:13,380 --> 00:36:15,180 wouldn't have discovered it yet because we can't see it 937 00:36:15,180 --> 00:36:17,130 in our MRI images, not that I know what 938 00:36:17,130 --> 00:36:18,360 the contrast is for the soul. 939 00:36:18,360 --> 00:36:19,895 You could work on that. 940 00:36:19,895 --> 00:36:20,895 Did you have a question? 941 00:36:20,895 --> 00:36:22,270 AUDIENCE: Yeah, I'm just curious. 942 00:36:22,270 --> 00:36:24,205 Did you try it on text? 943 00:36:24,205 --> 00:36:25,080 NANCY KANWISHER: Yes. 944 00:36:25,080 --> 00:36:26,330 And we will get to that later. 945 00:36:26,330 --> 00:36:28,680 And there is absolutely a specialized region for text. 946 00:36:28,680 --> 00:36:30,490 And we'll talk about that in a few weeks. 947 00:36:30,490 --> 00:36:31,530 Yeah. 948 00:36:31,530 --> 00:36:34,020 We didn't in this experiment, but we and lots of others 949 00:36:34,020 --> 00:36:35,835 have in other cases. 950 00:36:40,140 --> 00:36:41,700 So don't take this too seriously. 951 00:36:41,700 --> 00:36:43,680 My main point is just that you don't 952 00:36:43,680 --> 00:36:46,770 find a little patch of brain for any damn thing you test. 953 00:36:46,770 --> 00:36:49,770 Mostly, you don't find it. 954 00:36:49,770 --> 00:36:52,170 There is some disagreement in the field about the case 955 00:36:52,170 --> 00:36:53,960 of tools and hands. 956 00:36:53,960 --> 00:36:56,460 There are many reports that if you look at pictures of tools 957 00:36:56,460 --> 00:36:58,290 or look at pictures of hands, you 958 00:36:58,290 --> 00:37:00,170 can get a nice little selective blob. 959 00:37:00,170 --> 00:37:01,920 I have looked at both of those many times. 960 00:37:01,920 --> 00:37:02,670 I don't see it. 961 00:37:02,670 --> 00:37:04,080 I don't know what everyone else is on about. 962 00:37:04,080 --> 00:37:05,130 I'm confused about that. 963 00:37:05,130 --> 00:37:06,570 I just leave that as in play. 964 00:37:06,570 --> 00:37:08,470 I don't know. 965 00:37:08,470 --> 00:37:11,710 But with that exception, there's good agreement that faces, 966 00:37:11,710 --> 00:37:14,520 places, and bodies, everyone replicates. 967 00:37:14,520 --> 00:37:17,220 And most of these others, no one replicates. 968 00:37:17,220 --> 00:37:21,510 And so in particular, nobody reports selective patches 969 00:37:21,510 --> 00:37:24,390 of brain that respond selectively to cars, chairs, 970 00:37:24,390 --> 00:37:26,070 food, or lots of other things. 971 00:37:26,070 --> 00:37:27,750 We have tested snakes, by the way, 972 00:37:27,750 --> 00:37:31,990 and not found anything, at least in the cortex. 973 00:37:31,990 --> 00:37:33,070 So what does that mean? 974 00:37:33,070 --> 00:37:36,870 That implies, kinda sorta, that some categories are 975 00:37:36,870 --> 00:37:40,830 special in the brain, at least at this crude grain 976 00:37:40,830 --> 00:37:43,560 that we can see with functional MRI. 977 00:37:43,560 --> 00:37:46,320 And that seems pretty interesting and important. 978 00:37:46,320 --> 00:37:47,443 Yes. 979 00:37:47,443 --> 00:37:49,110 AUDIENCE: I had a question about places. 980 00:37:49,110 --> 00:37:53,412 Did you distinguish between human-made places and naturals? 981 00:37:53,412 --> 00:37:54,870 NANCY KANWISHER: We'll get into all 982 00:37:54,870 --> 00:37:56,860 of that in excruciating detail next week. 983 00:37:56,860 --> 00:37:57,360 Yeah. 984 00:37:57,360 --> 00:37:59,193 It doesn't really make much of a difference. 
985 00:37:59,193 --> 00:38:00,480 It likes all of those things. 986 00:38:00,480 --> 00:38:00,980 Yeah. 987 00:38:04,170 --> 00:38:07,020 So I've been going around for 20 years saying, see, 988 00:38:07,020 --> 00:38:09,660 these categories are really special in the brain 989 00:38:09,660 --> 00:38:10,320 and the mind. 990 00:38:10,320 --> 00:38:11,460 And that's what we're getting from this. 991 00:38:11,460 --> 00:38:12,990 And that's deep and fundamental. 992 00:38:12,990 --> 00:38:16,870 It's telling us something about who we are as human beings 993 00:38:16,870 --> 00:38:17,370 or whatever. 994 00:38:17,370 --> 00:38:20,190 Sometimes, I go off the deep end with huge claims. 995 00:38:20,190 --> 00:38:22,600 But not everybody buys this. 996 00:38:22,600 --> 00:38:24,900 And so what I want to do is allude briefly 997 00:38:24,900 --> 00:38:27,420 to the general kinds of ways you could argue against this 998 00:38:27,420 --> 00:38:31,260 and then talk in some detail about one main one. 999 00:38:31,260 --> 00:38:35,940 So ongoing controversies-- this view here 1000 00:38:35,940 --> 00:38:37,260 is highly caricaturized. 1001 00:38:37,260 --> 00:38:38,980 And this is actually not right. 1002 00:38:38,980 --> 00:38:41,170 The brain doesn't have completely discrete 1003 00:38:41,170 --> 00:38:41,860 little regions. 1004 00:38:41,860 --> 00:38:43,780 It's a mucky biological system. 1005 00:38:43,780 --> 00:38:45,700 If you actually look at the part, 1006 00:38:45,700 --> 00:38:48,880 the face-selective regions, they have ratty edges 1007 00:38:48,880 --> 00:38:52,570 and little kind of archipelagos of sub-blobs and stuff. 1008 00:38:52,570 --> 00:38:54,820 It's kind of a bit of a mess. 1009 00:38:54,820 --> 00:38:57,910 There's a general cluster in that vicinity in most subjects, 1010 00:38:57,910 --> 00:39:01,840 but it isn't always a discrete blob unless you blur your data. 1011 00:39:01,840 --> 00:39:03,850 You take any data and blur it enough, 1012 00:39:03,850 --> 00:39:05,982 it looks nice and clean. 1013 00:39:05,982 --> 00:39:08,440 But if you want to know the actual native form in the brain 1014 00:39:08,440 --> 00:39:10,930 unblurred, it's kind of mucky. 1015 00:39:10,930 --> 00:39:13,950 So one could react to that in different ways. 1016 00:39:13,950 --> 00:39:15,950 My reaction to that is like, what do you expect? 1017 00:39:15,950 --> 00:39:17,200 It's a biological system. 1018 00:39:17,200 --> 00:39:20,080 Does it really need to be perfectly 1019 00:39:20,080 --> 00:39:21,940 oval-shaped with a perfectly sharp edge? 1020 00:39:21,940 --> 00:39:24,550 I don't really care if it's interleaved with other stuff 1021 00:39:24,550 --> 00:39:26,660 around the edges. 1022 00:39:26,660 --> 00:39:28,100 But people react different ways. 1023 00:39:28,100 --> 00:39:30,490 And one kind of important alternative view 1024 00:39:30,490 --> 00:39:33,820 is, look, how do we know that these are really 1025 00:39:33,820 --> 00:39:34,870 things in the brain? 1026 00:39:34,870 --> 00:39:37,570 I'm talking about them as things, pieces, 1027 00:39:37,570 --> 00:39:40,300 parts of brain and mind. 1028 00:39:40,300 --> 00:39:43,780 And maybe they're just kind of peaks 1029 00:39:43,780 --> 00:39:47,470 in a broader landscape of responses across the cortex 1030 00:39:47,470 --> 00:39:48,880 that are fluctuating. 1031 00:39:48,880 --> 00:39:50,260 And empirically, that's true. 
1032 00:39:50,260 --> 00:39:53,200 There isn't just one butte and then nothing else 1033 00:39:53,200 --> 00:39:54,310 in the cortex around it. 1034 00:39:54,310 --> 00:39:56,590 There's some kind of profile. 1035 00:39:56,590 --> 00:39:58,960 So it's a bit of a judgment call how excited 1036 00:39:58,960 --> 00:40:04,660 you want to be about a big peak in a fluctuating background. 1037 00:40:04,660 --> 00:40:06,730 And so there's much discussion about that. 1038 00:40:06,730 --> 00:40:10,510 Is it really just a peak in a broader spatial organization? 1039 00:40:10,510 --> 00:40:15,153 And if so, what is that broader spatial organization all about? 1040 00:40:15,153 --> 00:40:16,570 It just pushes that question back. 1041 00:40:16,570 --> 00:40:19,990 It says, we're wrong to think about discrete things. 1042 00:40:19,990 --> 00:40:21,970 But that still leaves many mysteries 1043 00:40:21,970 --> 00:40:25,180 about what that continuous gradient is. 1044 00:40:25,180 --> 00:40:28,060 So that's kind of one line of response, which I 1045 00:40:28,060 --> 00:40:31,220 think is completely legitimate. 1046 00:40:31,220 --> 00:40:33,280 Any sort of version of that kind of blurs 1047 00:40:33,280 --> 00:40:35,770 into this next view, which we've talked about a little bit. 1048 00:40:35,770 --> 00:40:38,170 And that is to what extent can these things, 1049 00:40:38,170 --> 00:40:40,750 if I'm calling them things, be accounted for just 1050 00:40:40,750 --> 00:40:42,530 by their perceptual features? 1051 00:40:42,530 --> 00:40:45,550 So we've grappled with that in a number of ways so far. 1052 00:40:45,550 --> 00:40:48,010 One of the first things we asked about the face area 1053 00:40:48,010 --> 00:40:51,970 is, is it just responding to curvy stuff or round 1054 00:40:51,970 --> 00:40:53,680 things or whatever? 1055 00:40:53,680 --> 00:40:56,020 And so there are many lines of work 1056 00:40:56,020 --> 00:40:58,905 where me and many other people have asked that question. 1057 00:40:58,905 --> 00:41:00,280 And for the most part, the answer 1058 00:41:00,280 --> 00:41:05,200 seems to be there are some featural selectivities 1059 00:41:05,200 --> 00:41:08,680 in these regions, but probably not enough to account 1060 00:41:08,680 --> 00:41:10,660 for their category selectivity. 1061 00:41:10,660 --> 00:41:12,670 But that one, too, is still in play. 1062 00:41:12,670 --> 00:41:15,130 And there's this dude in England who 1063 00:41:15,130 --> 00:41:17,970 publishes several papers a year saying, 1064 00:41:17,970 --> 00:41:19,870 no, this thing isn't category-selective. 1065 00:41:19,870 --> 00:41:22,237 It's just that. 1066 00:41:22,237 --> 00:41:24,070 I'm going to try to assign one of his papers 1067 00:41:24,070 --> 00:41:26,890 to you because I want to expose you to alternate views. 1068 00:41:26,890 --> 00:41:29,620 But I haven't yet taught you the key methods 1069 00:41:29,620 --> 00:41:30,910 you need for that paper. 1070 00:41:30,910 --> 00:41:33,580 Anyway, so there's room for debate in that question, 1071 00:41:33,580 --> 00:41:35,990 as well. 1072 00:41:35,990 --> 00:41:39,130 Then there's just a continuum of OK, exactly how selective 1073 00:41:39,130 --> 00:41:40,660 are these regions. 1074 00:41:40,660 --> 00:41:42,610 I'm excited if a face area responds 1075 00:41:42,610 --> 00:41:44,620 like this to faces and like that to objects. 1076 00:41:44,620 --> 00:41:46,690 But hey, it responds like that to objects. 1077 00:41:46,690 --> 00:41:48,130 Is that selective enough?
1078 00:41:48,130 --> 00:41:50,980 So there's a lot of debate about what that means. 1079 00:41:50,980 --> 00:41:53,200 So there's a lot of room to push back 1080 00:41:53,200 --> 00:41:57,303 on the simpleminded story I've been serving up to you guys. 1081 00:41:57,303 --> 00:41:58,720 But what I want to do next is talk 1082 00:41:58,720 --> 00:42:01,870 about what I take to be the most smart and serious challenge, 1083 00:42:01,870 --> 00:42:05,230 which is somewhat different from all of these. 1084 00:42:05,230 --> 00:42:07,030 And this comes from a guy up at Dartmouth 1085 00:42:07,030 --> 00:42:09,820 named Jim Haxby, who published the paper that 1086 00:42:09,820 --> 00:42:12,070 was assigned for today. 1087 00:42:12,070 --> 00:42:15,430 And I intended for you to struggle with it a little bit 1088 00:42:15,430 --> 00:42:16,550 and try to understand it. 1089 00:42:16,550 --> 00:42:17,740 But if you didn't understand it fully, 1090 00:42:17,740 --> 00:42:19,073 I'm going to talk about it here. 1091 00:42:19,073 --> 00:42:22,570 And hopefully, that'll make it more intelligible. 1092 00:42:22,570 --> 00:42:25,390 So here's the big idea that Haxby-- there 1093 00:42:25,390 --> 00:42:26,620 are many ideas in that paper. 1094 00:42:26,620 --> 00:42:30,250 But the part of it that's most relevant to us for now 1095 00:42:30,250 --> 00:42:31,900 is the following. 1096 00:42:31,900 --> 00:42:35,680 Even if the fusiform face area responds weakly 1097 00:42:35,680 --> 00:42:40,870 to chairs and cars, in contrast with strong response to faces, 1098 00:42:40,870 --> 00:42:43,780 that doesn't mean that it doesn't hold information 1099 00:42:43,780 --> 00:42:45,040 about chairs and cars. 1100 00:42:48,000 --> 00:42:50,460 So all along, I've been just talking about one dimension. 1101 00:42:50,460 --> 00:42:52,680 Does it respond like this or like that? 1102 00:42:52,680 --> 00:42:54,360 And that's gotten us pretty far. 1103 00:42:54,360 --> 00:42:56,430 But the essence of Haxby's idea is 1104 00:42:56,430 --> 00:42:57,900 that we should care not just about 1105 00:42:57,900 --> 00:42:59,880 the overall mean response. 1106 00:42:59,880 --> 00:43:02,520 We should ask if there's information present 1107 00:43:02,520 --> 00:43:04,980 in the pattern of response across voxels. 1108 00:43:07,800 --> 00:43:11,310 And his point is that even if there's a low mean response, 1109 00:43:11,310 --> 00:43:13,530 you could still have information in the pattern 1110 00:43:13,530 --> 00:43:18,210 across voxels, even if it averages to some low number. 1111 00:43:18,210 --> 00:43:19,890 And that pattern of information could 1112 00:43:19,890 --> 00:43:23,760 enable you to distinguish different categories. 1113 00:43:23,760 --> 00:43:25,950 So let's get very particular. 1114 00:43:25,950 --> 00:43:27,600 So how exactly would you tell? 1115 00:43:27,600 --> 00:43:29,580 So here's what Haxby did, essentially. 1116 00:43:29,580 --> 00:43:31,950 Or here's the subset of the assigned paper 1117 00:43:31,950 --> 00:43:34,410 that's relevant to the current question. 1118 00:43:34,410 --> 00:43:37,830 If we want to know, does the fusiform face area hold 1119 00:43:37,830 --> 00:43:40,170 information about cars and chairs, 1120 00:43:40,170 --> 00:43:44,983 thereby arguing against its selectivity for faces-- 1121 00:43:44,983 --> 00:43:46,900 we should care about information in the brain, 1122 00:43:46,900 --> 00:43:49,198 not just magnitude of response. 
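Haxby's point in miniature, with made-up numbers: two categories can evoke the same weak mean response while the patterns across voxels differ completely.

```python
# Made-up responses of four FFA voxels (percent signal change):
chairs = [0.1, 0.3, 0.1, 0.3]
cars = [0.3, 0.1, 0.3, 0.1]

# Same weak mean response to both categories...
print(sum(chairs) / 4, sum(cars) / 4)   # 0.2 and 0.2
# ...but the patterns across voxels are completely different,
# so the region could still carry chairs-vs.-cars information.
```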
1123 00:43:49,198 --> 00:43:51,240 If the brain is an information processing system, 1124 00:43:51,240 --> 00:43:53,032 we care what information the parts contain, 1125 00:43:53,032 --> 00:43:55,950 not just how much the neurons are firing. 1126 00:43:55,950 --> 00:43:56,640 All right. 1127 00:43:56,640 --> 00:43:59,380 So if we want to know this, here's what you can do. 1128 00:43:59,380 --> 00:44:01,350 Here's a version of what Haxby did. 1129 00:44:01,350 --> 00:44:05,520 You scan subjects while they're looking at chairs and cars. 1130 00:44:05,520 --> 00:44:07,410 You've localized the fusiform face area 1131 00:44:07,410 --> 00:44:09,060 so you know where it is. 1132 00:44:09,060 --> 00:44:11,010 So now, you get the response. 1133 00:44:11,010 --> 00:44:13,560 This is highly schematic. 1134 00:44:13,560 --> 00:44:16,260 This is an idealized version of the cortical surface. 1135 00:44:16,260 --> 00:44:17,710 Remember, the cortex is a surface. 1136 00:44:17,710 --> 00:44:19,500 So we can mathematically unfold it 1137 00:44:19,500 --> 00:44:21,750 and look at the magnitude of response 1138 00:44:21,750 --> 00:44:23,340 of each voxel in the FFA. 1139 00:44:23,340 --> 00:44:26,280 FFA isn't square, but we're idealizing it here. 1140 00:44:26,280 --> 00:44:29,130 Everybody get how that could be a pattern of response 1141 00:44:29,130 --> 00:44:34,410 across voxels in the FFA when the subject looks at chairs? 1142 00:44:34,410 --> 00:44:36,600 And maybe you have some other pattern when 1143 00:44:36,600 --> 00:44:38,940 the subject is looking at cars. 1144 00:44:38,940 --> 00:44:41,100 Now, certainly, the pattern when they're 1145 00:44:41,100 --> 00:44:44,130 looking at faces, all of these bars would be much higher. 1146 00:44:44,130 --> 00:44:46,710 But our point is that even if these are low, 1147 00:44:46,710 --> 00:44:50,280 they're different across voxels. 1148 00:44:50,280 --> 00:44:51,930 So that's step one. 1149 00:44:51,930 --> 00:44:55,110 So then what Haxby says is you do the same thing 1150 00:44:55,110 --> 00:44:56,070 in the same subject. 1151 00:44:56,070 --> 00:44:59,040 You do it again, hopefully in the same scanning session. 1152 00:44:59,040 --> 00:45:03,270 And you get another pattern, like this and this. 1153 00:45:06,127 --> 00:45:07,335 Now, here's the key question. 1154 00:45:11,380 --> 00:45:14,950 If those patterns are systematic for chairs 1155 00:45:14,950 --> 00:45:18,010 and systematically different for cars, 1156 00:45:18,010 --> 00:45:20,140 then there is information in that region 1157 00:45:20,140 --> 00:45:23,650 about the difference between chairs and cars. 1158 00:45:23,650 --> 00:45:25,090 And chairs and cars aren't faces. 1159 00:45:25,090 --> 00:45:28,030 So that's an important challenge to my story about how 1160 00:45:28,030 --> 00:45:30,640 that region only does faces. 1161 00:45:30,640 --> 00:45:31,910 So how do you measure that? 1162 00:45:31,910 --> 00:45:34,150 Well, there's lots of ways. 1163 00:45:34,150 --> 00:45:36,850 Haxby's is the lowest tech and most intuitive. 1164 00:45:36,850 --> 00:45:40,810 He just says, let's look at the similarity of this pattern 1165 00:45:40,810 --> 00:45:43,430 to that pattern, repeated measures on cars-- 1166 00:45:43,430 --> 00:45:46,420 I'm sorry-- chairs, same subject-- chairs on the even 1167 00:45:46,420 --> 00:45:49,210 runs and chairs on the odd runs. 1168 00:45:49,210 --> 00:45:53,110 By the way, why do you split your data like this rather than 1169 00:45:53,110 --> 00:45:56,140 like this? 
1170 00:45:56,140 --> 00:45:57,670 He does eight runs. 1171 00:45:57,670 --> 00:46:00,520 We could take the first half of the runs, put those data here, 1172 00:46:00,520 --> 00:46:02,620 and the second half and put them over there. 1173 00:46:02,620 --> 00:46:04,240 Or we could take the data like this 1174 00:46:04,240 --> 00:46:06,370 and take even runs and odd runs. 1175 00:46:06,370 --> 00:46:09,400 Why is even and odd better than first half, second half? 1176 00:46:13,390 --> 00:46:14,673 Yeah. 1177 00:46:14,673 --> 00:46:16,090 AUDIENCE: I guess it doesn't allow 1178 00:46:16,090 --> 00:46:20,350 the subjects to get used to one particular thing 1179 00:46:20,350 --> 00:46:21,640 one after the other. 1180 00:46:21,640 --> 00:46:22,880 NANCY KANWISHER: Well, they're doing the same thing. 1181 00:46:22,880 --> 00:46:24,080 It's all the same data. 1182 00:46:24,080 --> 00:46:27,860 It's just how you analyze it. 1183 00:46:27,860 --> 00:46:28,360 Yes. 1184 00:46:28,360 --> 00:46:30,690 What's your name? 1185 00:46:30,690 --> 00:46:31,312 BAYLA: Bayla. 1186 00:46:31,312 --> 00:46:32,270 NANCY KANWISHER: Bayla. 1187 00:46:32,270 --> 00:46:34,227 Yeah. 1188 00:46:34,227 --> 00:46:35,060 BAYLA: I'm not sure. 1189 00:46:35,060 --> 00:46:41,840 I think it's probably easier to compare between one 1190 00:46:41,840 --> 00:46:44,155 face and the other, I guess. 1191 00:46:44,155 --> 00:46:46,280 NANCY KANWISHER: You can actually do it either way. 1192 00:46:46,280 --> 00:46:48,080 It's like you scan these eight runs. 1193 00:46:48,080 --> 00:46:48,920 Here they are. 1194 00:46:48,920 --> 00:46:49,700 You can do that. 1195 00:46:49,700 --> 00:46:51,200 I don't know if you can see what I'm 1196 00:46:51,200 --> 00:46:52,550 doing here with my whole crew. 1197 00:46:52,550 --> 00:46:55,490 Or you can do this. 1198 00:46:55,490 --> 00:46:58,230 Why is this better than this? 1199 00:46:58,230 --> 00:46:58,730 Yeah. 1200 00:46:58,730 --> 00:46:59,300 Isabel. 1201 00:46:59,300 --> 00:47:02,223 ISABEL: It could be a subject was really tired. 1202 00:47:02,223 --> 00:47:03,140 NANCY KANWISHER: Yeah. 1203 00:47:03,140 --> 00:47:06,860 Maybe they fell asleep halfway through the scan. 1204 00:47:06,860 --> 00:47:09,830 Then if you do like this, the odd and even 1205 00:47:09,830 --> 00:47:13,220 are going to be better compared to each other than first half, 1206 00:47:13,220 --> 00:47:14,510 second half. 1207 00:47:14,510 --> 00:47:16,070 Make sense? 1208 00:47:16,070 --> 00:47:18,530 It's another version of why you do things within subjects. 1209 00:47:18,530 --> 00:47:19,822 It's the same kind of argument. 1210 00:47:19,822 --> 00:47:21,200 Yeah. 1211 00:47:21,200 --> 00:47:24,200 So he splits into even and odd. 1212 00:47:24,200 --> 00:47:27,020 And so you ask, how similar are they within a category, 1213 00:47:27,020 --> 00:47:30,380 within chairs and within cars? 1214 00:47:30,380 --> 00:47:32,420 You get two different correlation values. 1215 00:47:32,420 --> 00:47:33,920 Just how similar are those patterns? 1216 00:47:33,920 --> 00:47:35,300 You get an r value. 1217 00:47:35,300 --> 00:47:38,570 And we compare that with how similar 1218 00:47:38,570 --> 00:47:45,830 the patterns are between chairs, even, to cars, odd, and cars, 1219 00:47:45,830 --> 00:47:49,190 even, to chairs, odd. 
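Here is a minimal sketch of that even/odd split-half correlation analysis, with simulated patterns standing in for real FFA voxels; the voxel count, pattern values, and noise levels are all invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated mean response of each FFA voxel: low overall, but patterned.
# One "true" pattern per category, plus independent noise in even and odd runs.
n_voxels = 100
chair_pattern = rng.normal(0.3, 0.1, n_voxels)   # weak mean response...
car_pattern = rng.normal(0.3, 0.1, n_voxels)     # ...but a different shape across voxels

def observe(pattern, noise=0.05):
    """One split-half estimate of the pattern (the even or the odd runs)."""
    return pattern + rng.normal(0, noise, n_voxels)

chairs_even, chairs_odd = observe(chair_pattern), observe(chair_pattern)
cars_even, cars_odd = observe(car_pattern), observe(car_pattern)

def r(a, b):
    """Pearson correlation between two voxel patterns."""
    return np.corrcoef(a, b)[0, 1]

within = (r(chairs_even, chairs_odd) + r(cars_even, cars_odd)) / 2
between = (r(chairs_even, cars_odd) + r(cars_even, chairs_odd)) / 2
print(f"within-category r = {within:.2f}, between-category r = {between:.2f}")
# within > between  =>  the pattern carries chairs-vs.-cars information,
# even though the mean response is low for both categories.
```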
1220 00:47:49,190 --> 00:47:51,320 And so the key question you ask-- 1221 00:47:51,320 --> 00:47:55,790 if there's information about chairs and cars in this pattern 1222 00:47:55,790 --> 00:47:57,740 of responses, then the correlations 1223 00:47:57,740 --> 00:48:01,890 will be higher within-category than between-category. 1224 00:48:01,890 --> 00:48:03,680 In other words, two different times 1225 00:48:03,680 --> 00:48:05,750 you scan looking at chairs, those patterns 1226 00:48:05,750 --> 00:48:09,770 are more similar than chairs are to cars. 1227 00:48:09,770 --> 00:48:11,240 Make sense? 1228 00:48:11,240 --> 00:48:12,050 It's pretty basic. 1229 00:48:12,050 --> 00:48:15,680 But it's one of these things that's simple and yet subtle 1230 00:48:15,680 --> 00:48:16,920 at the same time. 1231 00:48:16,920 --> 00:48:20,030 Does everybody get this? 1232 00:48:20,030 --> 00:48:22,040 So you just do these repeated measures. 1233 00:48:22,040 --> 00:48:24,020 And you look at these patterns of correlations. 1234 00:48:24,020 --> 00:48:26,480 And if the patterns are more similar or more 1235 00:48:26,480 --> 00:48:29,910 correlated within a category than between categories, 1236 00:48:29,910 --> 00:48:32,540 then you have information in that pattern 1237 00:48:32,540 --> 00:48:35,780 that enables you to distinguish those categories. 1238 00:48:35,780 --> 00:48:38,090 Yeah? 1239 00:48:38,090 --> 00:48:41,120 So that's what Haxby did. 1240 00:48:41,120 --> 00:48:42,065 Yes? 1241 00:48:42,065 --> 00:48:43,950 AUDIENCE: So if this information doubles, 1242 00:48:43,950 --> 00:48:45,950 so would it be difficult to look at correlation? 1243 00:48:45,950 --> 00:48:47,840 NANCY KANWISHER: Nothing, really. 1244 00:48:47,840 --> 00:48:48,710 Well, wait a second. 1245 00:48:48,710 --> 00:48:50,120 Oh, that's the same. 1246 00:48:50,120 --> 00:48:51,980 That's essentially like this. 1247 00:48:51,980 --> 00:48:54,380 It's just that since we're going from even 1248 00:48:54,380 --> 00:48:56,750 to odd in the within case, we're going 1249 00:48:56,750 --> 00:48:59,030 to go even to odd in the between case. 1250 00:48:59,030 --> 00:49:01,410 You could have done it this way, but then-- 1251 00:49:01,410 --> 00:49:01,910 yeah. 1252 00:49:05,330 --> 00:49:08,580 So that's the method. 1253 00:49:08,580 --> 00:49:10,120 What does Haxby find? 1254 00:49:10,120 --> 00:49:11,870 Well, you guys could all look at the paper 1255 00:49:11,870 --> 00:49:13,190 some more so you get a sense of it, 1256 00:49:13,190 --> 00:49:14,630 because it's actually really nicely written, 1257 00:49:14,630 --> 00:49:15,710 even though it's dense. 1258 00:49:15,710 --> 00:49:18,530 Those science papers are very dense. 1259 00:49:18,530 --> 00:49:20,340 But basically, here's what happened. 1260 00:49:20,340 --> 00:49:22,040 So in that paper, he says, yes, he 1261 00:49:22,040 --> 00:49:26,120 can distinguish between cars and chairs in the FFA. 1262 00:49:26,120 --> 00:49:29,360 And therefore, to quote from his paper, 1263 00:49:29,360 --> 00:49:31,220 "regions such as the 'FFA'"-- 1264 00:49:31,220 --> 00:49:34,580 notice the scare quotes he's putting there to diss me. 1265 00:49:34,580 --> 00:49:35,850 I hear you. 1266 00:49:35,850 --> 00:49:37,550 I hear you, Jim. 1267 00:49:37,550 --> 00:49:39,920 "Regions such as the 'FFA' are not 1268 00:49:39,920 --> 00:49:42,800 dedicated to representing only human faces. 
1269 00:49:42,800 --> 00:49:45,710 Rather, they're part of a more extended representation 1270 00:49:45,710 --> 00:49:47,630 for all objects." 1271 00:49:47,630 --> 00:49:49,610 Them's fighting words. 1272 00:49:49,610 --> 00:49:51,530 Everybody see how this is a serious challenge 1273 00:49:51,530 --> 00:49:54,170 with a very elegant method? 1274 00:49:54,170 --> 00:49:56,810 So when I first read that paper, I was like, huh. 1275 00:49:56,810 --> 00:49:57,670 All right. 1276 00:49:57,670 --> 00:49:58,873 I'm paying attention. 1277 00:49:58,873 --> 00:50:00,290 But he didn't do everything right. 1278 00:50:00,290 --> 00:50:02,060 I didn't like the way he defined the FFA. 1279 00:50:02,060 --> 00:50:04,130 I found a million reasons to diss it. 1280 00:50:04,130 --> 00:50:05,400 And I ran my own version. 1281 00:50:05,400 --> 00:50:07,207 And in my paper that we published, 1282 00:50:07,207 --> 00:50:08,540 we could not discriminate those. 1283 00:50:08,540 --> 00:50:09,360 So we said, ha. 1284 00:50:09,360 --> 00:50:09,860 You can. 1285 00:50:09,860 --> 00:50:10,412 We can't. 1286 00:50:10,412 --> 00:50:11,120 You did it wrong. 1287 00:50:11,120 --> 00:50:13,250 We did it right. 1288 00:50:13,250 --> 00:50:17,030 Then a few years later, Jim publishes a paper 1289 00:50:17,030 --> 00:50:19,640 with a collaborator in which they re-analyzed their old data 1290 00:50:19,640 --> 00:50:21,807 and said, actually, you really can't discriminate it 1291 00:50:21,807 --> 00:50:22,310 very well. 1292 00:50:22,310 --> 00:50:25,730 It was significantly above chance, but really lousy. 1293 00:50:25,730 --> 00:50:28,160 And so they concluded, "Preferred 1294 00:50:28,160 --> 00:50:30,290 regions for faces and houses"-- that is, regions 1295 00:50:30,290 --> 00:50:32,480 that respond preferentially to faces or houses-- 1296 00:50:32,480 --> 00:50:35,090 "are not well-suited to object classifications 1297 00:50:35,090 --> 00:50:39,260 that do not involve faces and houses, respectively." 1298 00:50:39,260 --> 00:50:41,000 But I didn't get to gloat because right 1299 00:50:41,000 --> 00:50:43,550 about the same time, we were redoing our experiments 1300 00:50:43,550 --> 00:50:44,600 at higher resolution. 1301 00:50:44,600 --> 00:50:47,540 And actually, we could distinguish 1302 00:50:47,540 --> 00:50:51,000 two different non-faces in the fusiform face area. 1303 00:50:51,000 --> 00:50:53,900 So that was the little drama that unfolded. 1304 00:50:53,900 --> 00:50:57,170 And so the current status is yes, 1305 00:50:57,170 --> 00:51:00,140 you really can discriminate two different non-face categories 1306 00:51:00,140 --> 00:51:03,410 within the fusiform face area, even if you do it right. 1307 00:51:03,410 --> 00:51:05,585 Even if I do it right and I don't want that result 1308 00:51:05,585 --> 00:51:07,760 and I do it right, I can get that result. 1309 00:51:07,760 --> 00:51:09,800 So that's true empirically. 1310 00:51:09,800 --> 00:51:12,013 The ability to discriminate is feeble. 1311 00:51:12,013 --> 00:51:14,180 It's not very strong, but it's significantly greater 1312 00:51:14,180 --> 00:51:15,210 than chance. 1313 00:51:15,210 --> 00:51:19,730 So does that mean I'm toast and I wasted the last few weeks 1314 00:51:19,730 --> 00:51:23,360 telling you guys a bunch of BS that has been disproven 1315 00:51:23,360 --> 00:51:26,910 and that I should not have been telling you? 1316 00:51:26,910 --> 00:51:27,410 Yeah. 1317 00:51:27,410 --> 00:51:27,740 David. 
1318 00:51:27,740 --> 00:51:29,210 DAVID: Isn't it kind of like saying 1319 00:51:29,210 --> 00:51:31,760 that you could use a vending machine like a clock 1320 00:51:31,760 --> 00:51:36,510 and then ask the question, then what is this thing for? 1321 00:51:36,510 --> 00:51:38,780 Well, it's obviously the office clock. 1322 00:51:38,780 --> 00:51:40,830 NANCY KANWISHER: That's a great analogy. 1323 00:51:40,830 --> 00:51:41,330 I love that. 1324 00:51:41,330 --> 00:51:43,400 Absolutely. 1325 00:51:43,400 --> 00:51:44,660 Absolutely. 1326 00:51:44,660 --> 00:51:48,290 So now, to me, the central question-- and here's 1327 00:51:48,290 --> 00:51:51,470 another example that I think is exactly like that, 1328 00:51:51,470 --> 00:51:54,380 but even more on point. 1329 00:51:54,380 --> 00:51:57,320 And that is that there are deep nets that people 1330 00:51:57,320 --> 00:51:58,610 have trained on faces. 1331 00:51:58,610 --> 00:52:01,670 VGG Face, it's really good at face recognition. 1332 00:52:01,670 --> 00:52:03,560 It has only ever seen faces. 1333 00:52:03,560 --> 00:52:05,360 It has only been trained on faces. 1334 00:52:05,360 --> 00:52:07,700 That is all it's about. 1335 00:52:07,700 --> 00:52:10,940 And if you feed it chairs or-- what 1336 00:52:10,940 --> 00:52:13,880 do I have-- chairs and cars, it can discriminate 1337 00:52:13,880 --> 00:52:15,980 between chairs and cars. 1338 00:52:15,980 --> 00:52:18,710 So even if you have this perfect representation 1339 00:52:18,710 --> 00:52:21,980 that's only been trained on faces, that has only evolved-- 1340 00:52:21,980 --> 00:52:22,580 if it evolved. 1341 00:52:22,580 --> 00:52:23,690 We'll get to that later-- 1342 00:52:23,690 --> 00:52:26,510 to deal with faces, it can still give you 1343 00:52:26,510 --> 00:52:30,290 a somewhat different response to chairs and cars. 1344 00:52:30,290 --> 00:52:32,885 And that doesn't mean that that's what it's doing. 1345 00:52:35,510 --> 00:52:39,080 So I think this is a really important challenge. 1346 00:52:39,080 --> 00:52:41,310 But I think centrally, crucially, 1347 00:52:41,310 --> 00:52:43,310 what we really need to be thinking about-- maybe 1348 00:52:43,310 --> 00:52:45,190 Akwile has a contribution. 1349 00:52:45,190 --> 00:52:46,070 Yeah. 1350 00:52:46,070 --> 00:52:47,990 AKWILE: So if it's only been trained on faces 1351 00:52:47,990 --> 00:52:52,010 and you feed it a chair, what's the outcome? 1352 00:52:52,010 --> 00:52:53,990 What's it say? 1353 00:52:53,990 --> 00:52:57,650 NANCY KANWISHER: So it's just a bunch of feed-forward layers 1354 00:52:57,650 --> 00:53:00,200 that are connected with boatloads 1355 00:53:00,200 --> 00:53:02,120 of units at each thing and connected 1356 00:53:02,120 --> 00:53:03,540 in the systematic pattern. 1357 00:53:03,540 --> 00:53:07,340 And once you train it up, you can feed it any stimulus. 1358 00:53:07,340 --> 00:53:10,670 And you can collect a response out the top. 1359 00:53:10,670 --> 00:53:13,070 So even though it was designed for and only 1360 00:53:13,070 --> 00:53:15,560 been trained on faces, you can feed it in non-faces 1361 00:53:15,560 --> 00:53:18,320 and get the response out the top and see. 1362 00:53:18,320 --> 00:53:20,930 That is not the category, not the top layer 1363 00:53:20,930 --> 00:53:22,670 where it says that's Joe or that's Bob. 
1364 00:53:22,670 --> 00:53:24,470 But just before that layer, there's 1365 00:53:24,470 --> 00:53:27,138 a whole bunch of units that have some representation distributed 1366 00:53:27,138 --> 00:53:27,680 across units. 1367 00:53:27,680 --> 00:53:30,290 You can take that and try to read it out and ask 1368 00:53:30,290 --> 00:53:31,860 if there's information there. 1369 00:53:31,860 --> 00:53:34,110 I'm not giving you all the details of how you do that. 1370 00:53:34,110 --> 00:53:36,020 But hopefully, you can get at least the gist. 1371 00:53:36,020 --> 00:53:37,892 And later in the semester, Katharina Dobs 1372 00:53:37,892 --> 00:53:39,350 is going to tell you more about how 1373 00:53:39,350 --> 00:53:40,600 you do all this kind of stuff. 1374 00:53:44,360 --> 00:53:48,470 I spent a lot of time in the last few weeks talking 1375 00:53:48,470 --> 00:53:51,530 about a key difference between two different kinds of methods, 1376 00:53:51,530 --> 00:53:55,550 one set of methods that allows this kind of inference 1377 00:53:55,550 --> 00:53:58,383 and another set of methods that allows that kind of inference. 1378 00:53:58,383 --> 00:54:00,050 I'm trying to give you guys a clue here. 1379 00:54:02,400 --> 00:54:04,900 Actually, what I'm going to do is let you percolate on this. 1380 00:54:04,900 --> 00:54:06,040 I don't think this is obvious. 1381 00:54:06,040 --> 00:54:07,332 I worried about this for years. 1382 00:54:07,332 --> 00:54:08,980 I think there are many answers to it. 1383 00:54:08,980 --> 00:54:10,630 It's not cut and dried. 1384 00:54:10,630 --> 00:54:13,480 I will say I have already presented to you at least 1385 00:54:13,480 --> 00:54:19,060 two different lines of work that provide 1386 00:54:19,060 --> 00:54:21,790 an important counterargument to this. 1387 00:54:21,790 --> 00:54:24,790 One of the people who gave me crappy teaching evaluations 1388 00:54:24,790 --> 00:54:28,120 last year said, she told us about counterarguments 1389 00:54:28,120 --> 00:54:30,940 and then made us tell her how they could in fact, 1390 00:54:30,940 --> 00:54:32,680 after all, be consistent with her data. 1391 00:54:32,680 --> 00:54:33,790 I thought that was weird. 1392 00:54:36,790 --> 00:54:39,170 I was just trying to teach people to think about data. 1393 00:54:39,170 --> 00:54:41,378 But anyway, I won't make you do that because somebody 1394 00:54:41,378 --> 00:54:42,400 didn't like that before. 1395 00:54:42,400 --> 00:54:43,525 But you can think about it. 1396 00:54:43,525 --> 00:54:44,530 And we'll talk later. 1397 00:54:44,530 --> 00:54:46,780 And it's actually good to think about. 1398 00:54:46,780 --> 00:54:48,370 And we will come back to it. 1399 00:54:48,370 --> 00:54:50,897 But I want to get on with the rest. 1400 00:54:50,897 --> 00:54:53,230 I mention all this because it is an important challenge. 1401 00:54:53,230 --> 00:54:53,950 Yeah. 1402 00:54:53,950 --> 00:54:56,950 AUDIENCE: I'm wondering if objects are not 1403 00:54:56,950 --> 00:55:00,140 processed in FFA, they must be processed somewhere else. 1404 00:55:00,140 --> 00:55:01,210 NANCY KANWISHER: Totally. 1405 00:55:01,210 --> 00:55:02,380 AUDIENCE: Somewhere else. 1406 00:55:02,380 --> 00:55:02,830 NANCY KANWISHER: Totally. 1407 00:55:02,830 --> 00:55:04,540 I had a whole piece of this lecture on that. 1408 00:55:04,540 --> 00:55:05,770 And then I thought, for once, I'm 1409 00:55:05,770 --> 00:55:06,890 not going to go over my time. 1410 00:55:06,890 --> 00:55:08,390 So I'm not going to talk about that. 
1411 00:55:08,390 --> 00:55:12,280 But remember, there's all those other bits of cortex. 1412 00:55:12,280 --> 00:55:14,440 I've just identified a few particular ones. 1413 00:55:14,440 --> 00:55:16,030 There's lots of cortex in between. 1414 00:55:16,030 --> 00:55:17,710 And the simple statement is there's 1415 00:55:17,710 --> 00:55:20,110 a lot of nearby cortex near the FFA 1416 00:55:20,110 --> 00:55:23,440 and the PPA that seems to respond generically 1417 00:55:23,440 --> 00:55:24,820 to object shape. 1418 00:55:24,820 --> 00:55:27,010 And the first-pass guess is that there's 1419 00:55:27,010 --> 00:55:30,130 a general-purpose visual machine in there, 1420 00:55:30,130 --> 00:55:33,580 in addition to some more specialized ones. 1421 00:55:33,580 --> 00:55:35,870 But I'm going to not say more at the moment. 1422 00:55:35,870 --> 00:55:38,710 And I'll just say, actually, you may read it in papers. 1423 00:55:38,710 --> 00:55:40,960 It's sometimes called LO or LOC. 1424 00:55:40,960 --> 00:55:43,360 That's kind of a shape-selective region which 1425 00:55:43,360 --> 00:55:47,410 is arguably the kind of generic, let's process everything else 1426 00:55:47,410 --> 00:55:49,667 system. 1427 00:55:49,667 --> 00:55:51,250 Only if it's a clarification question. 1428 00:55:55,290 --> 00:55:56,008 Ask it. 1429 00:55:56,008 --> 00:55:56,550 AUDIENCE: No. 1430 00:55:56,550 --> 00:55:58,550 I was just wondering which came first, this work 1431 00:55:58,550 --> 00:55:59,932 or the transcranial stuff. 1432 00:55:59,932 --> 00:56:00,890 NANCY KANWISHER: Sorry. 1433 00:56:00,890 --> 00:56:01,550 This work or-- 1434 00:56:01,550 --> 00:56:03,374 AUDIENCE: Or the transcranial. 1435 00:56:03,374 --> 00:56:04,952 NANCY KANWISHER: Ah, good question. 1436 00:56:04,952 --> 00:56:06,410 The transcranial stuff has actually 1437 00:56:06,410 --> 00:56:07,460 been going on for a long time. 1438 00:56:07,460 --> 00:56:09,460 But the relevant kind that I talked to you about 1439 00:56:09,460 --> 00:56:10,130 is more recent. 1440 00:56:10,130 --> 00:56:10,838 And you're right. 1441 00:56:10,838 --> 00:56:12,470 It is one of the very strong answers 1442 00:56:12,470 --> 00:56:13,640 to this kind of critique. 1443 00:56:13,640 --> 00:56:14,660 There's several. 1444 00:56:14,660 --> 00:56:17,620 Actually, I've told you about three, so far, answers to this. 1445 00:56:17,620 --> 00:56:18,560 But think about it. 1446 00:56:22,880 --> 00:56:25,250 So what we're going to do now is talk about not just 1447 00:56:25,250 --> 00:56:27,950 this particular use of this method 1448 00:56:27,950 --> 00:56:31,670 to ask a serious question about the selectivity of regions 1449 00:56:31,670 --> 00:56:32,990 in the ventral visual pathway. 1450 00:56:32,990 --> 00:56:36,540 Now, what I'm going to do is argue that, actually-- 1451 00:56:36,540 --> 00:56:39,230 I think I just said all of this-- 1452 00:56:39,230 --> 00:56:41,870 what Haxby has given us is also a method 1453 00:56:41,870 --> 00:56:44,480 to ask what information is present in this little patch 1454 00:56:44,480 --> 00:56:45,560 of the brain. 1455 00:56:45,560 --> 00:56:47,190 And that's an awesome thing. 1456 00:56:47,190 --> 00:56:49,040 So let's go on and talk about that. 1457 00:56:49,040 --> 00:56:52,470 Let's talk about neural decoding with functional MRI. 1458 00:56:52,470 --> 00:56:54,200 So that was an instance of it, but I'm 1459 00:56:54,200 --> 00:56:56,700 going to cash it out in another way more generally. 
1460 00:56:56,700 --> 00:56:58,610 So let's take the case where there's 1461 00:56:58,610 --> 00:57:00,380 a person with a patch of their brain 1462 00:57:00,380 --> 00:57:03,050 and a pattern of response across voxels 1463 00:57:03,050 --> 00:57:04,610 in that patch of their brain when 1464 00:57:04,610 --> 00:57:06,440 they look at some stimulus. 1465 00:57:06,440 --> 00:57:09,170 Let's suppose you're given this. 1466 00:57:09,170 --> 00:57:12,530 And you want to know what was that person looking at 1467 00:57:12,530 --> 00:57:15,350 to produce that pattern. 1468 00:57:15,350 --> 00:57:17,120 What was the stimulus out in the world 1469 00:57:17,120 --> 00:57:19,370 that produced that pattern? 1470 00:57:19,370 --> 00:57:21,410 Can you do that? 1471 00:57:21,410 --> 00:57:25,788 So more generally, can you read the mind with functional MRI? 1472 00:57:25,788 --> 00:57:27,830 Or maybe a little more honestly, can you at least 1473 00:57:27,830 --> 00:57:30,200 tell what the person saw from their pattern of brain 1474 00:57:30,200 --> 00:57:31,070 response? 1475 00:57:31,070 --> 00:57:33,830 Everybody get the question here? 1476 00:57:33,830 --> 00:57:35,100 How can we try this? 1477 00:57:35,100 --> 00:57:37,550 Well, they're all variations of that Haxby method 1478 00:57:37,550 --> 00:57:39,470 that I just told you about. 1479 00:57:39,470 --> 00:57:41,070 But let's walk through this. 1480 00:57:41,070 --> 00:57:43,790 So the first thing you need is you have this pattern. 1481 00:57:43,790 --> 00:57:45,290 And you're trying to figure out what 1482 00:57:45,290 --> 00:57:49,040 stimulus produced that pattern in that part of this person's 1483 00:57:49,040 --> 00:57:50,120 brain. 1484 00:57:50,120 --> 00:57:51,800 Well, you need a decoder. 1485 00:57:51,800 --> 00:57:54,483 You need to know what those voxels respond 1486 00:57:54,483 --> 00:57:56,150 like when they look at different things, 1487 00:57:56,150 --> 00:57:58,640 where you know the answer. 1488 00:57:58,640 --> 00:58:00,980 So what you do is you scan the subject 1489 00:58:00,980 --> 00:58:04,350 on a bunch of different conditions to get your decoder. 1490 00:58:04,350 --> 00:58:06,050 And then you can take your unknown data 1491 00:58:06,050 --> 00:58:08,390 and compare it to those decoder data. 1492 00:58:08,390 --> 00:58:11,760 So in particular, you have to train your decoder. 1493 00:58:11,760 --> 00:58:14,240 So you scan the person looking at, say, shoes, 1494 00:58:14,240 --> 00:58:16,220 and you get pattern. 1495 00:58:16,220 --> 00:58:19,350 You scan them looking at cats, and you get a pattern. 1496 00:58:19,350 --> 00:58:22,172 Maybe you scan them looking at five, 10, 100 other things-- 1497 00:58:22,172 --> 00:58:22,880 probably not 100. 1498 00:58:22,880 --> 00:58:25,040 You don't have enough scan time-- 1499 00:58:25,040 --> 00:58:26,362 but some number of things. 1500 00:58:26,362 --> 00:58:27,237 And so now, you know. 1501 00:58:27,237 --> 00:58:29,898 You know this is how those voxels respond 1502 00:58:29,898 --> 00:58:31,190 when the person looks at shoes. 1503 00:58:31,190 --> 00:58:35,630 And this is how those voxels respond when they look at cats. 1504 00:58:35,630 --> 00:58:40,310 Now, you test your decoder with your mystery pattern. 1505 00:58:40,310 --> 00:58:42,640 Now, you have your mystery unknown pattern. 1506 00:58:42,640 --> 00:58:47,420 And you want to know, was that shoes or cats? 1507 00:58:47,420 --> 00:58:49,550 Well, you can just look. 
1508 00:58:49,550 --> 00:58:51,008 What is it most similar to? 1509 00:58:51,008 --> 00:58:52,550 All the methods are versions of that. 1510 00:58:52,550 --> 00:58:54,842 They're just fancy mathematical versions of that. 1511 00:58:54,842 --> 00:58:56,300 So what do you think that pattern-- 1512 00:58:56,300 --> 00:58:57,425 what produced that pattern? 1513 00:58:57,425 --> 00:58:58,132 AUDIENCE: Shoes. 1514 00:58:58,132 --> 00:58:59,090 NANCY KANWISHER: Shoes. 1515 00:58:59,090 --> 00:59:00,950 It's more similar to the shoe pattern. 1516 00:59:00,950 --> 00:59:01,820 Exactly. 1517 00:59:01,820 --> 00:59:05,120 You guys just did neural decoding. 1518 00:59:05,120 --> 00:59:07,590 So that's exactly how you do this. 1519 00:59:07,590 --> 00:59:10,220 There are all kinds of ways of doing this, from just saying, 1520 00:59:10,220 --> 00:59:12,500 is this more correlated with that than that? 1521 00:59:12,500 --> 00:59:14,060 That's Haxby's version. 1522 00:59:14,060 --> 00:59:17,180 Or you can put a whole big fancy machine learning rigmarole 1523 00:59:17,180 --> 00:59:19,363 in there to do pattern classification, 1524 00:59:19,363 --> 00:59:21,530 because that is, after all, what machine learning is 1525 00:59:21,530 --> 00:59:23,870 so awesome at is pattern classification. 1526 00:59:23,870 --> 00:59:26,210 And this is just a straightforward pattern 1527 00:59:26,210 --> 00:59:27,710 classification task. 1528 00:59:27,710 --> 00:59:28,580 Train on these. 1529 00:59:28,580 --> 00:59:30,500 Test on that. 1530 00:59:30,500 --> 00:59:34,340 Is that sort of intuitive, what we're doing here? 1531 00:59:34,340 --> 00:59:35,840 So that's the agenda. 1532 00:59:35,840 --> 00:59:38,820 That's the logic of how we do this. 1533 00:59:38,820 --> 00:59:41,790 And so does that work? 1534 00:59:41,790 --> 00:59:44,490 Well, a little bit. 1535 00:59:44,490 --> 00:59:49,340 But you don't have to worry, at least at the moment, 1536 00:59:49,340 --> 00:59:50,810 because there are a million ways. 1537 00:59:50,810 --> 00:59:52,790 About 10 years ago, I was getting called up 1538 00:59:52,790 --> 00:59:57,010 by legal types all the time because, 1539 00:59:57,010 --> 01:00:00,938 are people going to detect lies with functional MRI? 1540 01:00:00,938 --> 01:00:02,480 And I thought this was a total crock. 1541 01:00:02,480 --> 01:00:04,730 And I was going around giving talks on all the reasons 1542 01:00:04,730 --> 01:00:08,240 why nobody has to worry that they're going to be compelled 1543 01:00:08,240 --> 01:00:10,640 to testify by being shoved in a scanner 1544 01:00:10,640 --> 01:00:12,560 and have their brains read. 1545 01:00:12,560 --> 01:00:15,260 It's not a totally stupid thing to worry about. 1546 01:00:15,260 --> 01:00:17,298 But lest anybody-- 1547 01:00:17,298 --> 01:00:18,590 I don't think this will happen. 1548 01:00:18,590 --> 01:00:21,500 But lest anybody try to read your mind against your will 1549 01:00:21,500 --> 01:00:24,050 while you're in an MRI scanner, you can totally 1550 01:00:24,050 --> 01:00:25,710 foil it in any number of ways. 1551 01:00:25,710 --> 01:00:27,050 One, move your head. 1552 01:00:27,050 --> 01:00:30,200 Two, if they've got your head bolted down, move your tongue. 1553 01:00:30,200 --> 01:00:34,430 You totally mess up your whole signal if you move your tongue. 1554 01:00:34,430 --> 01:00:36,553 Three, do mental arithmetic. 
1555 01:00:36,553 --> 01:00:37,970 You can totally shut down whatever 1556 01:00:37,970 --> 01:00:40,262 they're trying to do if you think about something else. 1557 01:00:40,262 --> 01:00:42,270 Anyway, so we don't need to worry about it. 1558 01:00:42,270 --> 01:00:45,890 It's not good for insidious kind of legal efforts. 1559 01:00:45,890 --> 01:00:49,790 But it is pretty good for science sometimes. 1560 01:00:49,790 --> 01:00:53,240 So there are lots of versions of neural decoding with functional 1561 01:00:53,240 --> 01:00:54,570 MRI. 1562 01:00:54,570 --> 01:00:57,620 So we've been talking so far about decoding 1563 01:00:57,620 --> 01:01:01,500 functional MRI patterns of response across voxels. 1564 01:01:01,500 --> 01:01:05,447 That's called MVPA, Multiple Voxel Pattern Analysis. 1565 01:01:05,447 --> 01:01:06,780 You don't need to memorize that. 1566 01:01:06,780 --> 01:01:08,880 But when you see MVPA in a paper, 1567 01:01:08,880 --> 01:01:12,180 this is what it's talking about. 1568 01:01:12,180 --> 01:01:15,550 But you can also do it with lots of other kinds of neural data. 1569 01:01:15,550 --> 01:01:16,050 Oh, sorry. 1570 01:01:16,050 --> 01:01:20,340 Within MVPA, you can ask it of a particular ROI in the brain, 1571 01:01:20,340 --> 01:01:23,370 Region Of Interest, like V1 or the face area 1572 01:01:23,370 --> 01:01:25,890 or the body area or something else. 1573 01:01:25,890 --> 01:01:28,620 But you can also apply it to the whole damn pile of data 1574 01:01:28,620 --> 01:01:30,090 from the whole brain and say, can I 1575 01:01:30,090 --> 01:01:31,740 tell what this person is thinking by looking 1576 01:01:31,740 --> 01:01:32,615 at their whole brain? 1577 01:01:35,580 --> 01:01:37,980 Beyond functional MRI, you can apply it 1578 01:01:37,980 --> 01:01:39,520 to lots of other kinds of data. 1579 01:01:39,520 --> 01:01:41,400 So you can do monkey neurophysiology, 1580 01:01:41,400 --> 01:01:43,230 as we discussed briefly last time, 1581 01:01:43,230 --> 01:01:48,090 where you have actual firing rates from individual neurons. 1582 01:01:48,090 --> 01:01:51,990 And you can look at the response of each stimulus class 1583 01:01:51,990 --> 01:01:54,000 to each neuron in a region of the brain. 1584 01:01:54,000 --> 01:01:57,480 And you can do the same deal, running a pattern classifier 1585 01:01:57,480 --> 01:02:00,660 or a simple correlation method on the pattern of response 1586 01:02:00,660 --> 01:02:02,340 across neurons, rather than voxels. 1587 01:02:02,340 --> 01:02:05,055 Everybody see how that's sort of the same deal, just better? 1588 01:02:08,220 --> 01:02:10,860 Or you can do magnetoencephalography, 1589 01:02:10,860 --> 01:02:11,820 as we've talked about. 1590 01:02:11,820 --> 01:02:14,370 Stick your head in the big expensive hairdryer. 1591 01:02:14,370 --> 01:02:16,200 Collect magnetic signals from all 1592 01:02:16,200 --> 01:02:18,330 around the head, 300 channels. 1593 01:02:18,330 --> 01:02:21,700 And now, those magnetic signals are changing over time. 1594 01:02:21,700 --> 01:02:25,080 So the cool thing about neural decoding with MEG 1595 01:02:25,080 --> 01:02:28,560 is you can say, OK, let's take the data from just exactly 80 1596 01:02:28,560 --> 01:02:31,410 milliseconds after the stimulus flashed on. 1597 01:02:31,410 --> 01:02:33,960 And let's ask, what can you decode then? 1598 01:02:33,960 --> 01:02:37,350 What can you decode at 100 milliseconds, 120 milliseconds? 
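Here is a toy sketch of that time-resolved decoding logic on simulated MEG-style data, using the same simple train/test idea as the shoes-and-cats decoder above: a nearest-correlation classifier run separately at each time point. The trial counts, channel count, and onset latency are all made up.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy MEG-style data: trials x channels x time points, two stimulus classes.
n_trials, n_channels, n_times = 40, 300, 120    # made-up sizes
labels = np.repeat([0, 1], n_trials // 2)       # 0 = shoes, 1 = cats (say)

# Class-specific signal that only "turns on" after time point 30 (a fake latency).
signal = rng.normal(0, 1, (2, n_channels))
data = rng.normal(0, 1, (n_trials, n_channels, n_times))
data[:, :, 30:] += signal[labels][:, :, None]

train = np.arange(0, n_trials, 2)   # even trials -> train the decoder
test = np.arange(1, n_trials, 2)    # odd trials  -> test it

accuracy = []
for t in range(n_times):
    # "Decoder" = the mean training pattern for each class at this time point.
    templates = [data[train][labels[train] == c, :, t].mean(axis=0) for c in (0, 1)]
    correct = 0
    for i in test:
        # Guess whichever class template the test pattern correlates with best.
        rs = [np.corrcoef(data[i, :, t], tmpl)[0, 1] for tmpl in templates]
        correct += int(np.argmax(rs) == labels[i])
    accuracy.append(correct / len(test))

# Accuracy hovers near 0.5 (chance) early on, then climbs once the signal
# appears -- the growth of information over time.
print([f"{a:.2f}" for a in accuracy[::20]])
```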
1599 01:02:37,350 --> 01:02:39,600 You can see the growth of information 1600 01:02:39,600 --> 01:02:42,990 over time as neural information processing proceeds 1601 01:02:42,990 --> 01:02:47,430 by running the decoder separately at each time point. 1602 01:02:47,430 --> 01:02:50,170 I'm going to try to squeeze into a future lecture 1603 01:02:50,170 --> 01:02:52,170 more talk about that, because I think it's cool. 1604 01:02:52,170 --> 01:02:54,170 And we're doing a lot of it in my lab right now. 1605 01:02:54,170 --> 01:02:57,875 Does everybody get the gist of this, at least? 1606 01:02:57,875 --> 01:02:59,250 So that gives you the time course 1607 01:02:59,250 --> 01:03:02,610 of information extraction. 1608 01:03:02,610 --> 01:03:05,700 Similarly, there are lots of different decoding methods. 1609 01:03:05,700 --> 01:03:07,470 You can use, as I mentioned, the kind 1610 01:03:07,470 --> 01:03:10,260 of simple, low-tech Haxby-style correlations. 1611 01:03:10,260 --> 01:03:13,020 Or you can use something called linear support vector 1612 01:03:13,020 --> 01:03:16,230 machines or various other kinds of fancy machine 1613 01:03:16,230 --> 01:03:22,460 learning math to do those classifiers. 1614 01:03:22,460 --> 01:03:26,370 Let's take-- do I have time to do this? 1615 01:03:26,370 --> 01:03:27,740 I'm going to skip this. 1616 01:03:27,740 --> 01:03:29,490 Yeah, we're going to skip that. 1617 01:03:29,490 --> 01:03:31,272 It's cool, but we're going to cut to the-- 1618 01:03:31,272 --> 01:03:31,980 oh, I don't know. 1619 01:03:31,980 --> 01:03:32,813 No, we'll do it all. 1620 01:03:32,813 --> 01:03:33,850 We've got time. 1621 01:03:33,850 --> 01:03:34,350 All right. 1622 01:03:36,900 --> 01:03:39,330 So now that I've wasted all that time 1623 01:03:39,330 --> 01:03:41,550 deciding whether we had time, we're 1624 01:03:41,550 --> 01:03:44,160 going to compare how well this works when you do it 1625 01:03:44,160 --> 01:03:46,620 on MRI versus how well it works when you do it 1626 01:03:46,620 --> 01:03:47,910 on neurons in monkey brains. 1627 01:03:51,570 --> 01:03:53,850 So there was a beautiful paper a few years ago 1628 01:03:53,850 --> 01:03:54,865 that looked at this. 1629 01:03:54,865 --> 01:03:57,240 So the question is here are these face patches in monkeys 1630 01:03:57,240 --> 01:03:59,940 that I told you about and that David Leopold will be talking 1631 01:03:59,940 --> 01:04:02,170 about at 4 o'clock today. 1632 01:04:02,170 --> 01:04:04,740 And so the question is this particular one, AM, 1633 01:04:04,740 --> 01:04:07,440 one of the nice face patches up there, 1634 01:04:07,440 --> 01:04:10,230 these guys wanted to know what information is represented 1635 01:04:10,230 --> 01:04:12,990 up there in face patch AM. 1636 01:04:12,990 --> 01:04:15,720 Is there information about different individual face 1637 01:04:15,720 --> 01:04:16,560 identities? 1638 01:04:16,560 --> 01:04:20,940 Can you use it to decode which face the monkey saw? 1639 01:04:20,940 --> 01:04:24,550 And so they did this experiment two ways. 1640 01:04:24,550 --> 01:04:26,880 One, they did monkey neurophysiology. 1641 01:04:26,880 --> 01:04:30,810 They recorded from 167 different individual neurons 1642 01:04:30,810 --> 01:04:31,950 in that region. 1643 01:04:31,950 --> 01:04:34,440 And for each neuron, they measured its response 1644 01:04:34,440 --> 01:04:35,490 to five different faces. 
1645 01:04:38,580 --> 01:04:41,340 In another condition, they popped the very same monkeys 1646 01:04:41,340 --> 01:04:42,000 in the scanner. 1647 01:04:42,000 --> 01:04:43,830 And they scanned them with functional MRI. 1648 01:04:43,830 --> 01:04:45,570 And they did the same experiment. 1649 01:04:45,570 --> 01:04:48,270 And they measured the magnitude of response 1650 01:04:48,270 --> 01:04:51,360 of each of 100 voxels in that same patch of brain 1651 01:04:51,360 --> 01:04:52,620 in that same monkey. 1652 01:04:52,620 --> 01:04:57,030 And they got the MRI response of each of those 100 voxels, 1653 01:04:57,030 --> 01:04:59,400 and for each voxel, its response to each of those five faces. 1654 01:04:59,400 --> 01:05:02,280 Everybody get that this is asking the same question? 1655 01:05:02,280 --> 01:05:04,950 How well can you decode face identity 1656 01:05:04,950 --> 01:05:08,040 from individual neurons or from functional MRI 1657 01:05:08,040 --> 01:05:09,990 in the same animal? 1658 01:05:09,990 --> 01:05:12,930 And the answer is damn depressing. 1659 01:05:12,930 --> 01:05:15,420 The answer is you can decode identity really well 1660 01:05:15,420 --> 01:05:16,560 from neurophysiology. 1661 01:05:16,560 --> 01:05:20,160 And you can't do it worth a damn with functional MRI-- 1662 01:05:20,160 --> 01:05:21,820 big bummer. 1663 01:05:21,820 --> 01:05:23,040 Yeah. 1664 01:05:23,040 --> 01:05:24,690 So that's a drag. 1665 01:05:24,690 --> 01:05:25,950 It's just what it is. 1666 01:05:25,950 --> 01:05:30,690 Presumably, remember, each MRI voxel has hundreds of thousands 1667 01:05:30,690 --> 01:05:32,010 of neurons in it. 1668 01:05:32,010 --> 01:05:34,990 So the real miracle is that we ever see anything at all. 1669 01:05:34,990 --> 01:05:37,770 And when we can't see the neural code with the resolution 1670 01:05:37,770 --> 01:05:40,890 that we need to tell whether it's got information about face 1671 01:05:40,890 --> 01:05:42,450 identity, that's just because we're 1672 01:05:42,450 --> 01:05:45,150 averaging over so many neurons. 1673 01:05:45,150 --> 01:05:48,780 That was my lament at the end of the lecture on Monday, 1674 01:05:48,780 --> 01:05:51,703 that there are so many limitations in human methods. 1675 01:05:51,703 --> 01:05:52,995 And here's one of the key ones. 1676 01:05:56,097 --> 01:05:57,180 What are the implications? 1677 01:05:57,180 --> 01:05:57,680 It sucks. 1678 01:05:57,680 --> 01:06:00,600 Anyway, I want to get one more idea out. 1679 01:06:00,600 --> 01:06:02,260 And that is-- yeah. 1680 01:06:02,260 --> 01:06:02,760 Question? 1681 01:06:02,760 --> 01:06:04,300 AUDIENCE: Is that limited to fMRI? 1682 01:06:04,300 --> 01:06:06,157 Or does it also translate to EEG? 1683 01:06:06,157 --> 01:06:07,740 NANCY KANWISHER: Oh, EEG's much worse. 1684 01:06:07,740 --> 01:06:08,070 AUDIENCE: Worse. 1685 01:06:08,070 --> 01:06:08,970 NANCY KANWISHER: Much worse. 1686 01:06:08,970 --> 01:06:09,520 Oh my god. 1687 01:06:09,520 --> 01:06:10,020 Yeah. 1688 01:06:13,422 --> 01:06:14,880 The only thing that might be better 1689 01:06:14,880 --> 01:06:17,790 someday is intracranial recording. 1690 01:06:17,790 --> 01:06:20,550 But even there, you usually don't get enough electrodes.
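A toy simulation makes that averaging point vivid. None of these numbers come from the actual study; the tuning curves, pool sizes, and noise levels are all invented, purely to show how pooling washes out identity information that individual neurons carry easily.

```python
# A toy simulation (not the study's data) of why voxel averaging hurts decoding:
# identity tuning that is easy to read from single neurons cancels out when
# each measured unit pools thousands of neurons with mixed tuning.
import numpy as np

rng = np.random.default_rng(0)
n_neurons, n_faces = 100_000, 5
tuning = rng.normal(size=(n_neurons, n_faces))   # hypothetical per-neuron tuning

def decode(units, noise=1.0):
    """Measure each face's pattern twice with independent noise; count how
    often the test pattern correlates best with the right training pattern."""
    train = units + rng.normal(scale=noise, size=units.shape)
    test = units + rng.normal(scale=noise, size=units.shape)
    hits = 0
    for f in range(n_faces):
        r = [np.corrcoef(test[:, f], train[:, g])[0, 1] for g in range(n_faces)]
        hits += int(np.argmax(r) == f)
    return hits / n_faces

# 167 single neurons, as in the recordings: identity decodes essentially perfectly
neurons = tuning[rng.choice(n_neurons, size=167, replace=False)]
print("single neurons:", decode(neurons))

# 100 "voxels," each crudely modeled as the average of a big random pool of
# neurons: the tuning averages toward zero and decoding falls to chance (0.2)
pools = rng.integers(0, n_neurons, size=(100, 5000))
voxels = tuning[pools].mean(axis=1)
print("voxels:", decode(voxels))
```

With these made-up numbers, the 167 simulated neurons identify all five faces, while the simulated voxels sit at or near the one-in-five chance level.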
1691 01:06:20,550 --> 01:06:22,230 So you need these very rare cases 1692 01:06:22,230 --> 01:06:24,330 where you have very-high-density grids 1693 01:06:24,330 --> 01:06:27,300 of intracranial electrodes that some surgeon has decided 1694 01:06:27,300 --> 01:06:30,090 by chance to put on a part of the brain 1695 01:06:30,090 --> 01:06:31,800 that you happen to have hypotheses about, 1696 01:06:31,800 --> 01:06:35,440 so you're incredibly lucky and can actually test your hypothesis. 1697 01:06:35,440 --> 01:06:37,277 And that's very rare. 1698 01:06:37,277 --> 01:06:38,610 Did you have a question, Akwile? 1699 01:06:38,610 --> 01:06:40,860 No. 1700 01:06:40,860 --> 01:06:43,470 So I've been talking about neural decoding. 1701 01:06:43,470 --> 01:06:45,510 And that's a way of asking what information 1702 01:06:45,510 --> 01:06:47,310 is present in this batch of neurons 1703 01:06:47,310 --> 01:06:48,600 or this bunch of voxels. 1704 01:06:48,600 --> 01:06:50,100 And that's a really deep question 1705 01:06:50,100 --> 01:06:51,892 to ask for cognitive science, because we're 1706 01:06:51,892 --> 01:06:53,434 interested in information processing. 1707 01:06:53,434 --> 01:06:55,684 And we want to know what's represented in each region. 1708 01:06:55,684 --> 01:06:57,180 It's really the crux of the matter 1709 01:06:57,180 --> 01:06:59,100 in cognitive neuroscience. 1710 01:06:59,100 --> 01:07:01,830 But we can also use it to ask in a richer 1711 01:07:01,830 --> 01:07:06,150 way about the nature of that information in each region. 1712 01:07:06,150 --> 01:07:09,790 So suppose we want to know what exactly is represented there. 1713 01:07:09,790 --> 01:07:13,950 We want to know not just that it can distinguish shoes 1714 01:07:13,950 --> 01:07:15,450 from cats. 1715 01:07:15,450 --> 01:07:16,950 That's OK. 1716 01:07:16,950 --> 01:07:20,730 But suppose we want to know how it's doing shoes versus cats. 1717 01:07:20,730 --> 01:07:23,340 Does it just know, for example, that shoes are elongated 1718 01:07:23,340 --> 01:07:25,080 this way and cats are roundish? 1719 01:07:25,080 --> 01:07:27,502 And that's all it's using to do its classification. 1720 01:07:27,502 --> 01:07:29,460 In other words, it's not really shoes and cats. 1721 01:07:29,460 --> 01:07:33,810 It's this versus that or something. 1722 01:07:33,810 --> 01:07:36,870 If we want to know how abstract those representations are 1723 01:07:36,870 --> 01:07:39,720 or how invariant they are to variations 1724 01:07:39,720 --> 01:07:41,730 in viewing conditions, then we can 1725 01:07:41,730 --> 01:07:44,310 do the following cool thing. 1726 01:07:44,310 --> 01:07:47,670 We can train the decoder on one set of stimuli 1727 01:07:47,670 --> 01:07:51,300 and test it on a different kind of stimulus. 1728 01:07:51,300 --> 01:07:57,270 So for example, we can ask, are there 1729 01:07:57,270 --> 01:08:00,180 representations of shoes that are invariant to, 1730 01:08:00,180 --> 01:08:02,490 for example, color and viewpoint, 1731 01:08:02,490 --> 01:08:05,690 chosen just because that was the nicest shoe I could find when 1732 01:08:05,690 --> 01:08:08,040 I was searching an hour ago? 1733 01:08:08,040 --> 01:08:12,513 So if we train on these and test on that, is that going to work? 1734 01:08:12,513 --> 01:08:15,180 Is it going to know that this is the same kind of thing as that? 1735 01:08:15,180 --> 01:08:20,120 If it does, what have we learned about that shoe representation? 1736 01:08:20,120 --> 01:08:20,620 Yeah.
1737 01:08:20,620 --> 01:08:22,029 AUDIENCE: It's kind of generalizable. 1738 01:08:22,029 --> 01:08:23,770 NANCY KANWISHER: It's very generalizable. 1739 01:08:23,770 --> 01:08:24,180 Yeah. 1740 01:08:24,180 --> 01:08:24,970 AUDIENCE: Different perspective. 1741 01:08:24,970 --> 01:08:26,500 NANCY KANWISHER: Totally. 1742 01:08:26,500 --> 01:08:28,120 It's not just this. 1743 01:08:28,120 --> 01:08:29,800 It's something closer to shoeness. 1744 01:08:29,800 --> 01:08:31,840 And we don't know exactly how far it is 1745 01:08:31,840 --> 01:08:33,100 until we test more conditions. 1746 01:08:33,100 --> 01:08:34,850 But exactly-- we've shown that it's really 1747 01:08:34,850 --> 01:08:36,069 abstract and generalizable. 1748 01:08:36,069 --> 01:08:37,300 That makes it more useful. 1749 01:08:37,300 --> 01:08:39,700 That makes it more cognitively interesting. 1750 01:08:39,700 --> 01:08:42,740 We could even go off the deep end and say, OK, 1751 01:08:42,740 --> 01:08:44,830 is it the concept of a shoe? 1752 01:08:44,830 --> 01:08:47,979 We could scan people reading the word "shoe" and ask, 1753 01:08:47,979 --> 01:08:50,630 is that going to work? 1754 01:08:50,630 --> 01:08:52,660 Anya's doing experiments like that. 1755 01:08:52,660 --> 01:08:56,060 There are various people looking at this kind of thing. 1756 01:08:56,060 --> 01:08:59,319 And so you can ask at any level how general or invariant 1757 01:08:59,319 --> 01:09:01,750 is that representation. 1758 01:09:01,750 --> 01:09:04,149 So neural decoders are not just gimmicks 1759 01:09:04,149 --> 01:09:07,149 to try to say, oh, I can see and I can read out 1760 01:09:07,149 --> 01:09:08,479 what this person saw. 1761 01:09:08,479 --> 01:09:10,569 They're powerful methods in science 1762 01:09:10,569 --> 01:09:12,819 to characterize mental representations 1763 01:09:12,819 --> 01:09:16,260 and to characterize how abstract they are.
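Here is a sketch of that train-on-one-thing, test-on-another logic, using scikit-learn's linear SVM as the classifier (one of the options mentioned earlier). The data are fabricated: a "category" pattern shared across viewing conditions plus a view-specific pattern, with all the sizes chosen arbitrarily for illustration.

```python
# A sketch of cross-decoding for invariance, on fabricated voxel patterns.
# If a classifier trained on one version of the stimuli transfers to a
# different version, the representation generalizes across that change.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n_trials, n_voxels = 200, 100

category = rng.normal(size=(2, n_voxels))   # shoe-vs-cat signal shared across views
view_a = rng.normal(size=(2, n_voxels))     # view-specific components
view_b = rng.normal(size=(2, n_voxels))     # (e.g., a different color and viewpoint)

def patterns(view, labels, noise=1.0):
    """Noisy trial patterns: shared category signal plus a view-bound part."""
    clean = category[labels] + 0.5 * view[labels]
    return clean + rng.normal(scale=noise, size=clean.shape)

y_train = rng.integers(0, 2, size=n_trials)  # 0 = shoe, 1 = cat
y_test = rng.integers(0, 2, size=n_trials)

clf = SVC(kernel="linear")
clf.fit(patterns(view_a, y_train), y_train)                   # train on view A...
print("cross-view accuracy:", clf.score(patterns(view_b, y_test), y_test))  # ...test on view B
```

In this simulation, transfer succeeds because a category component is shared across views. If that shared component were removed, within-view decoding would still work while cross-view transfer fell to chance, which is exactly the dissociation that separates an image-bound code from something closer to shoeness.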