1 00:00:00,000 --> 00:00:02,415 [SQUEAKING] 2 00:00:02,415 --> 00:00:04,347 [RUSTLING] 3 00:00:04,347 --> 00:00:07,245 [CLICKING] 4 00:00:09,535 --> 00:00:10,660 NANCY KANWISHER: All right. 5 00:00:10,660 --> 00:00:11,530 It's 11:05. 6 00:00:11,530 --> 00:00:14,540 I'm going to try to start promptly at 11:05 each time. 7 00:00:14,540 --> 00:00:16,059 So welcome. 8 00:00:16,059 --> 00:00:17,800 Is everybody psyched? 9 00:00:17,800 --> 00:00:20,020 I'm psyched. 10 00:00:20,020 --> 00:00:22,120 This is 9.13, the Human Brain. 11 00:00:22,120 --> 00:00:23,330 I'm Nancy Kanwisher. 12 00:00:23,330 --> 00:00:25,120 I'm the prof for this class. 13 00:00:25,120 --> 00:00:28,840 And lest you were wondering, I have a brain, and there it is. 14 00:00:28,840 --> 00:00:31,420 That's me, with some bits colored in that you 15 00:00:31,420 --> 00:00:33,280 will learn about in this class. 16 00:00:33,280 --> 00:00:34,750 OK. 17 00:00:34,750 --> 00:00:36,250 What I'm going to do today is I'm 18 00:00:36,250 --> 00:00:39,520 going to tell you a brief story for around 10 minutes. 19 00:00:39,520 --> 00:00:42,800 And then I'm going to talk about the why, how, 20 00:00:42,800 --> 00:00:45,850 and what of studying the human brain, why it's a cool thing 21 00:00:45,850 --> 00:00:48,320 to do, how you do it, and what in particular 22 00:00:48,320 --> 00:00:50,050 we're going to learn about in here, 23 00:00:50,050 --> 00:00:52,780 and then we'll do some mechanics and details of the course, 24 00:00:52,780 --> 00:00:54,550 and allocation of grades, and all that. 25 00:00:54,550 --> 00:00:56,770 It's on the syllabus anyway. 26 00:00:56,770 --> 00:00:57,940 Cool? 27 00:00:57,940 --> 00:01:00,640 That's the agenda. 28 00:01:00,640 --> 00:01:01,360 All right. 29 00:01:01,360 --> 00:01:04,360 So let's start with that story. 30 00:01:04,360 --> 00:01:07,210 And for this, I'm going to sit up here. 31 00:01:09,885 --> 00:01:11,260 The story isn't that long, but it 32 00:01:11,260 --> 00:01:13,450 has a lot of interesting little weird bits. 33 00:01:13,450 --> 00:01:15,370 So I have cue cards to remind myself 34 00:01:15,370 --> 00:01:18,400 of all the bits I want to remember to say. 35 00:01:18,400 --> 00:01:21,105 So you can put away your phones and your computers. 36 00:01:21,105 --> 00:01:22,480 And you don't need to take notes. 37 00:01:22,480 --> 00:01:24,700 This is just a story. 38 00:01:24,700 --> 00:01:27,340 It's going to foreshadow a lot of the themes in the course, 39 00:01:27,340 --> 00:01:29,800 but it's not stuff you're going to be tested on. 40 00:01:29,800 --> 00:01:31,240 OK. 41 00:01:31,240 --> 00:01:34,960 So this is a true story, and I've 42 00:01:34,960 --> 00:01:38,080 changed only a few tiny little bits 43 00:01:38,080 --> 00:01:41,000 to protect the identity of the people involved. 44 00:01:41,000 --> 00:01:44,600 But otherwise, it's an absolutely true story. 45 00:01:44,600 --> 00:01:47,710 It's a story about a scary medical situation 46 00:01:47,710 --> 00:01:50,990 that happened to a friend of mine a few years ago. 47 00:01:50,990 --> 00:01:53,230 But at the same time, it's a story 48 00:01:53,230 --> 00:01:55,450 about the nature of the human mind, 49 00:01:55,450 --> 00:01:59,630 about the organization of the human brain. 50 00:01:59,630 --> 00:02:04,030 And it's also a story about the ability or lack 51 00:02:04,030 --> 00:02:07,810 thereof to recover after brain damage. 
52 00:02:07,810 --> 00:02:15,220 It's also incidentally a story about resilience, privilege, 53 00:02:15,220 --> 00:02:17,710 expertise, and all of those things that 54 00:02:17,710 --> 00:02:23,830 are characteristic of many people in Cambridge society, 55 00:02:23,830 --> 00:02:29,050 not so relevant for the course, but, all right, here goes. 56 00:02:29,050 --> 00:02:32,080 So a few years ago, a friend of mine 57 00:02:32,080 --> 00:02:34,210 was staying over at my house in Cambridge 58 00:02:34,210 --> 00:02:37,420 en route to a conference in a nearby state. 59 00:02:37,420 --> 00:02:42,020 And this guy, I'll call him Bob, was a close friend of mine. 60 00:02:42,020 --> 00:02:45,280 I'd known him for years and years. 61 00:02:45,280 --> 00:02:46,570 We talked regularly. 62 00:02:46,570 --> 00:02:48,550 We went on hiking trips together. 63 00:02:48,550 --> 00:02:50,360 We were pretty close. 64 00:02:50,360 --> 00:02:52,030 So he's en route to this conference. 65 00:02:52,030 --> 00:02:54,340 He's staying over at my house the night before. 66 00:02:56,752 --> 00:02:58,960 The plan was for him to get up early the next morning 67 00:02:58,960 --> 00:03:00,127 and drive to the conference. 68 00:03:00,127 --> 00:03:02,650 So we hung out the night before and chatted. 69 00:03:02,650 --> 00:03:05,380 And the next morning, he's sleeping in the next room 70 00:03:05,380 --> 00:03:06,640 over from mine. 71 00:03:06,640 --> 00:03:09,640 And early in the morning, I hear some shuffling. 72 00:03:09,640 --> 00:03:12,850 I think yep, OK, Bob is packing to leave and thank, 73 00:03:12,850 --> 00:03:14,050 God, I don't need to get up. 74 00:03:14,050 --> 00:03:15,700 I'm only dimly awake. 75 00:03:15,700 --> 00:03:18,610 And so I'm not paying that much attention. 76 00:03:18,610 --> 00:03:22,005 Shuffle, shuffle, shuffle in the background. 77 00:03:22,005 --> 00:03:23,005 And then I hear a crash. 78 00:03:26,420 --> 00:03:29,400 And I think, what the hell is that? 79 00:03:29,400 --> 00:03:32,900 And I get up and I go into the next room. 80 00:03:32,900 --> 00:03:36,680 And Bob is lying on the floor, not moving. 81 00:03:36,680 --> 00:03:40,130 I say, Bob, and there's no answer. 82 00:03:40,130 --> 00:03:44,210 And then I shout, Bob, and there's no answer. 83 00:03:44,210 --> 00:03:47,360 And then I dialed 911. 84 00:03:47,360 --> 00:03:50,270 While we were sitting there waiting for the ambulance 85 00:03:50,270 --> 00:03:53,720 to arrive, Bob starts to wake up. 86 00:03:53,720 --> 00:03:56,180 And he's very woozy, but he's alive. 87 00:03:56,180 --> 00:03:58,680 And he's making a little bit of sense. 88 00:03:58,680 --> 00:04:00,500 And he can't figure out what's going on, 89 00:04:00,500 --> 00:04:04,940 and neither can I. And so we're talking and chatting, 90 00:04:04,940 --> 00:04:07,670 and he's making a little more sense, 91 00:04:07,670 --> 00:04:10,170 but we still don't know what's happening. 92 00:04:10,170 --> 00:04:13,235 So then the ambulance arrives incredibly fast. 93 00:04:13,235 --> 00:04:15,110 I felt like three minutes, boom. 94 00:04:15,110 --> 00:04:18,019 There's three EMTs rushing in the front door, 95 00:04:18,019 --> 00:04:20,089 rushing up to the room where Bob was. 96 00:04:20,089 --> 00:04:22,310 And they take all his vitals. 97 00:04:22,310 --> 00:04:25,050 And they can't find anything wrong. 98 00:04:25,050 --> 00:04:26,990 And so they're really casual. 99 00:04:26,990 --> 00:04:29,640 I guess they confront stuff like this all the time. 
100 00:04:29,640 --> 00:04:30,470 I don't. 101 00:04:30,470 --> 00:04:33,380 Bob doesn't, but they're very calm about it. 102 00:04:33,380 --> 00:04:35,330 And they're saying, well, go take him 103 00:04:35,330 --> 00:04:38,720 to the hospital or not. 104 00:04:38,720 --> 00:04:41,510 And I was like, I think we need to know what just happened, 105 00:04:41,510 --> 00:04:42,560 even though he seems OK. 106 00:04:42,560 --> 00:04:44,518 We kind of need to know what this is all about. 107 00:04:44,518 --> 00:04:45,470 Don't you think? 108 00:04:45,470 --> 00:04:48,170 They're like, yeah, you could take him to the ER. 109 00:04:48,170 --> 00:04:51,557 And I said, well, do we need to waste ambulance resources, 110 00:04:51,557 --> 00:04:54,140 or do you think it's safe if I drive him myself, since there's 111 00:04:54,140 --> 00:04:55,490 a hospital not far away? 112 00:04:55,490 --> 00:04:57,620 They say you could drive him yourself. 113 00:04:57,620 --> 00:05:01,670 So I drive Bob to the Mount Auburn Hospital ER, 114 00:05:01,670 --> 00:05:04,370 which is like less than a mile from my house. 115 00:05:04,370 --> 00:05:06,020 And we do the usual ER thing, which 116 00:05:06,020 --> 00:05:08,180 is mostly waiting, and waiting, and waiting, 117 00:05:08,180 --> 00:05:09,890 but various docs come by. 118 00:05:09,890 --> 00:05:11,330 And they take all these tests. 119 00:05:11,330 --> 00:05:13,353 And they take all these history questions, 120 00:05:13,353 --> 00:05:14,270 and it goes on and on. 121 00:05:14,270 --> 00:05:17,700 And basically, they're just not finding anything. 122 00:05:17,700 --> 00:05:19,577 So after about an hour or two of this, 123 00:05:19,577 --> 00:05:20,660 they're still doing tests. 124 00:05:20,660 --> 00:05:22,340 They don't want to quite let him go yet, because they 125 00:05:22,340 --> 00:05:23,580 don't know what happened. 126 00:05:23,580 --> 00:05:25,580 Everybody's calm about it. 127 00:05:25,580 --> 00:05:29,420 I figure, OK, fine, I got work to do. 128 00:05:29,420 --> 00:05:33,080 And I tell Bob, text me throughout the day, 129 00:05:33,080 --> 00:05:37,130 and I'll come get you whenever they're ready to release you. 130 00:05:37,130 --> 00:05:42,950 And so I go into work, but just before I go into work, 131 00:05:42,950 --> 00:05:45,950 a thought flashes through my mind, and I say to the ER 132 00:05:45,950 --> 00:05:51,560 doc, you should check Bob's brain. 133 00:05:51,560 --> 00:05:54,140 And the reason that thought flashed through my mind 134 00:05:54,140 --> 00:05:57,470 is that actually I had been worrying about Bob 135 00:05:57,470 --> 00:05:58,880 for a number of years. 136 00:05:58,880 --> 00:06:00,230 And I hadn't really-- 137 00:06:00,230 --> 00:06:02,270 it hadn't quite registered consciously. 138 00:06:02,270 --> 00:06:06,170 It was kind of too horrifying a thought for me to really allow 139 00:06:06,170 --> 00:06:08,540 myself to realize I was worried about Bob's brain, 140 00:06:08,540 --> 00:06:10,550 but I was worried about a very particular thing 141 00:06:10,550 --> 00:06:12,830 and that is that Bob had been showing 142 00:06:12,830 --> 00:06:16,070 these weird signs that he often got 143 00:06:16,070 --> 00:06:19,502 lost and didn't know where he was. 144 00:06:19,502 --> 00:06:21,710 And on the one hand, this just didn't make any sense, 145 00:06:21,710 --> 00:06:24,510 because he was fine in every other way, 146 00:06:24,510 --> 00:06:26,250 but it was really pretty striking. 
147 00:06:26,250 --> 00:06:29,750 So one time, I was over at Bob's house 148 00:06:29,750 --> 00:06:31,430 with some other friends of ours. 149 00:06:31,430 --> 00:06:33,620 And the friend asked, Bob, how do 150 00:06:33,620 --> 00:06:36,980 we get-- how do I drive from your house into Cambridge? 151 00:06:36,980 --> 00:06:39,230 And Bob said, well, you go to the end of the driveway, 152 00:06:39,230 --> 00:06:41,000 and you turn left. 153 00:06:41,000 --> 00:06:45,908 My friend and I looked at each other like, Bob, what? 154 00:06:45,908 --> 00:06:47,700 And Bob thinks about it for a minute, yeah, 155 00:06:47,700 --> 00:06:50,280 end of the driveway, turn left. 156 00:06:50,280 --> 00:06:53,880 I just had this like sinking feeling of dread 157 00:06:53,880 --> 00:06:58,290 in the pit of my stomach, but we sort of made light of it, 158 00:06:58,290 --> 00:06:59,940 and made fun of it, and it went by. 159 00:06:59,940 --> 00:07:03,610 It was like, no, you turn right, and we gave the directions. 160 00:07:03,610 --> 00:07:05,640 Another time a friend of mine was driving 161 00:07:05,640 --> 00:07:08,250 with Bob in Bob's hometown. 162 00:07:08,250 --> 00:07:11,255 And noticed that like Bob didn't seem 163 00:07:11,255 --> 00:07:12,630 to know how to get to the grocery 164 00:07:12,630 --> 00:07:15,240 store in his hometown, where he'd 165 00:07:15,240 --> 00:07:19,140 lived for a really long time, a trip he'd 166 00:07:19,140 --> 00:07:22,320 made hundreds of times. 167 00:07:22,320 --> 00:07:25,980 Another time, I was at a conference in Germany. 168 00:07:25,980 --> 00:07:29,370 And I saw there are these arrays of posters 169 00:07:29,370 --> 00:07:32,610 of people presenting usually pretty dry scientific things. 170 00:07:32,610 --> 00:07:35,430 And out of the corner of my eye I see the title of a poster 171 00:07:35,430 --> 00:07:39,030 and it says navigational deficits colon, 172 00:07:39,030 --> 00:07:41,310 an early sign of Alzheimer's. 173 00:07:41,310 --> 00:07:44,370 And I saw that, and I just saw ah, 174 00:07:44,370 --> 00:07:47,550 and I just kind of suppress the thought. 175 00:07:47,550 --> 00:07:49,990 I thought, oh my God, Bob wasn't that old. 176 00:07:49,990 --> 00:07:52,585 I know Alzheimer's can very rarely strike early. 177 00:07:52,585 --> 00:07:54,210 I didn't want to think about it, but it 178 00:07:54,210 --> 00:07:57,270 was like rattling around in the back 179 00:07:57,270 --> 00:07:59,200 of my consciousness. 180 00:07:59,200 --> 00:08:01,830 So there had been these signs, but as I 181 00:08:01,830 --> 00:08:05,190 say, it didn't make sense, because Bob was holding down 182 00:08:05,190 --> 00:08:06,750 a very high-powered job. 183 00:08:06,750 --> 00:08:08,970 He was writing beautiful prose. 184 00:08:08,970 --> 00:08:12,780 He was the life of every party he was at, witty, funny. 185 00:08:12,780 --> 00:08:16,770 Everybody's like favorite life of the party. 186 00:08:16,770 --> 00:08:18,060 So how could that be? 187 00:08:18,060 --> 00:08:20,310 It just didn't make sense that there would be anything 188 00:08:20,310 --> 00:08:21,930 wrong with Bob's brain. 189 00:08:21,930 --> 00:08:24,870 So I managed for a few years to notice these signs 190 00:08:24,870 --> 00:08:27,270 and ignore them and not pay any attention. 191 00:08:33,580 --> 00:08:40,110 So the killer thing is, I should have known better. 
192 00:08:40,110 --> 00:08:44,790 My research for the last 20 years has been on the very fact 193 00:08:44,790 --> 00:08:46,800 that there are different parts of the brain that 194 00:08:46,800 --> 00:08:48,760 do different things. 195 00:08:48,760 --> 00:08:50,910 And one of the corollaries of that 196 00:08:50,910 --> 00:08:52,350 is you can have a problem with one 197 00:08:52,350 --> 00:08:55,990 of those parts and the other parts can work just fine. 198 00:08:55,990 --> 00:08:58,980 And so I, if anyone, should have realized, 199 00:08:58,980 --> 00:09:02,280 yes, there's something really wrong with Bob's navigation 200 00:09:02,280 --> 00:09:03,150 abilities. 201 00:09:03,150 --> 00:09:05,940 And the fact that he's smart, and witty, 202 00:09:05,940 --> 00:09:08,460 and funny and holding down a high-powered job 203 00:09:08,460 --> 00:09:11,280 doesn't mean there isn't something wrong with his brain, 204 00:09:11,280 --> 00:09:14,140 with a part of his brain. 205 00:09:14,140 --> 00:09:16,770 But I didn't realize that. 206 00:09:16,770 --> 00:09:21,150 But then, as I'm leaving the ER, they kind of all collect. 207 00:09:21,150 --> 00:09:25,980 And I said to the ER doc, you better check his brain. 208 00:09:25,980 --> 00:09:28,080 I thought Bob was out of earshot when I said that. 209 00:09:28,080 --> 00:09:28,680 He heard it. 210 00:09:28,680 --> 00:09:29,820 He's like what? 211 00:09:29,820 --> 00:09:32,490 I was like, oh, never mind. 212 00:09:32,490 --> 00:09:35,190 Anyway, the ER doc with the kind of confidence 213 00:09:35,190 --> 00:09:38,880 that only docs can muster said, no, not a brain thing. 214 00:09:38,880 --> 00:09:44,040 This is a heart thing, which wasn't exactly reassuring, 215 00:09:44,040 --> 00:09:47,850 but I set aside the brain thought. 216 00:09:47,850 --> 00:09:48,810 And I went off to work. 217 00:09:51,500 --> 00:09:58,070 So throughout the day, I texted with Bob a few times. 218 00:09:58,070 --> 00:09:59,080 Things seem to be fine. 219 00:09:59,080 --> 00:10:00,080 They've done more tests. 220 00:10:00,080 --> 00:10:02,090 They weren't finding anything. 221 00:10:02,090 --> 00:10:04,010 We just got calmer and calmer about it. 222 00:10:04,010 --> 00:10:08,030 I guess sometimes weird stuff happens, and you just move on. 223 00:10:08,030 --> 00:10:11,000 But then that night around 7:00 or 8:00 224 00:10:11,000 --> 00:10:12,860 at night, I was over at a friend's house, 225 00:10:12,860 --> 00:10:14,480 and the phone rang. 226 00:10:14,480 --> 00:10:15,740 And it was Bob. 227 00:10:15,740 --> 00:10:19,940 I picked it up and Bob says, get over here. 228 00:10:19,940 --> 00:10:23,150 They found something in my brain. 229 00:10:23,150 --> 00:10:25,640 So I ran out of the house, grabbed my phone. 230 00:10:25,640 --> 00:10:29,630 And as I'm driving to the Mount Auburn ER, 231 00:10:29,630 --> 00:10:33,830 I called my trusty lab tech, an amazing guy, 232 00:10:33,830 --> 00:10:37,140 who keeps track of all kinds of things much better than I do, 233 00:10:37,140 --> 00:10:40,520 and I said, I remember that we scanned 234 00:10:40,520 --> 00:10:44,780 Bob a bunch of years ago for a regular experiment in my lab. 235 00:10:44,780 --> 00:10:46,760 And I don't remember the date. 236 00:10:46,760 --> 00:10:50,930 I don't remember anything about it, but dig around in the files 237 00:10:50,930 --> 00:10:52,400 and see if you can figure it out. 238 00:10:52,400 --> 00:10:56,300 It might be useful to have that scan. 
239 00:10:56,300 --> 00:10:59,330 So by the time I get to the ER, my lab tech 240 00:10:59,330 --> 00:11:03,030 has already texted me back and said found the scans. 241 00:11:03,030 --> 00:11:05,510 I'm putting them in a Dropbox for you. 242 00:11:05,510 --> 00:11:09,380 So I go into the ER, and there's Bob and the ER doc. 243 00:11:09,380 --> 00:11:15,770 And Bob says to me, do you want to see it? 244 00:11:15,770 --> 00:11:18,440 The ER doc or the radiologist has already 245 00:11:18,440 --> 00:11:21,090 shown Bob the picture of his brain. 246 00:11:21,090 --> 00:11:23,000 And so they take me in there. 247 00:11:23,000 --> 00:11:24,320 And I look at it. 248 00:11:24,320 --> 00:11:25,340 And I gulped. 249 00:11:25,340 --> 00:11:27,470 There was a thing the size of a lime 250 00:11:27,470 --> 00:11:31,060 smack in the middle of his brain. 251 00:11:31,060 --> 00:11:33,460 Pretty terrifying. 252 00:11:33,460 --> 00:11:42,340 So this lime in the middle of Bob's brain 253 00:11:42,340 --> 00:11:44,980 was right next to a region that my lab 254 00:11:44,980 --> 00:11:46,570 had studied in great detail. 255 00:11:46,570 --> 00:11:51,220 In fact, my lab had discovered that a brain region right next 256 00:11:51,220 --> 00:11:54,160 to where that lime was located was specifically 257 00:11:54,160 --> 00:11:57,500 involved in navigation. 258 00:11:57,500 --> 00:11:59,730 How could I not have put all this together? 259 00:11:59,730 --> 00:12:03,600 But I didn't until that moment I thought, of course, of course, 260 00:12:03,600 --> 00:12:05,060 there's a thing in his brain right 261 00:12:05,060 --> 00:12:08,840 next to the parahippocampal place area, which I discovered, 262 00:12:08,840 --> 00:12:12,260 and a nearby related region called retrosplenial cortex, 263 00:12:12,260 --> 00:12:13,220 of course. 264 00:12:13,220 --> 00:12:16,170 And how the hell could I not have known? 265 00:12:16,170 --> 00:12:17,030 But I didn't know. 266 00:12:19,580 --> 00:12:23,780 In that earlier work, it had been nearly 20 years ago, 267 00:12:23,780 --> 00:12:26,090 I had a postdoc named Russell Epstein. 268 00:12:26,090 --> 00:12:28,280 And Russell was a computer vision guy. 269 00:12:28,280 --> 00:12:32,150 And he wanted to understand how we see by writing code 270 00:12:32,150 --> 00:12:34,970 to duplicate the algorithms that he thought go on 271 00:12:34,970 --> 00:12:38,030 in the human brain when we understand visual images. 272 00:12:38,030 --> 00:12:40,130 And that's a very respectable cool line 273 00:12:40,130 --> 00:12:43,290 of work, which we'll learn a little bit about in here. 274 00:12:43,290 --> 00:12:45,200 And Russell was really a coding guy. 275 00:12:45,200 --> 00:12:48,650 At the time, we were just starting doing brain imaging, 276 00:12:48,650 --> 00:12:50,718 but Russell was like pooh-poohing it all. 277 00:12:50,718 --> 00:12:52,010 It's like the flash in the pan. 278 00:12:52,010 --> 00:12:52,940 It's going to go by. 279 00:12:52,940 --> 00:12:53,750 It's trashy. 280 00:12:53,750 --> 00:12:55,950 So you guys get nice blobs on the brain. 281 00:12:55,950 --> 00:12:57,620 I'm not having any of it. 282 00:12:57,620 --> 00:13:01,010 And I kept saying, Russell, you need to get a job. 283 00:13:01,010 --> 00:13:03,890 Just do one experiment so you can show in your job talk 284 00:13:03,890 --> 00:13:05,180 that you can do brain imaging. 285 00:13:05,180 --> 00:13:06,350 It might help you. 286 00:13:06,350 --> 00:13:07,760 You don't need to do a lot of it. 
287 00:13:07,760 --> 00:13:10,310 Just do one dumb experiment. 288 00:13:10,310 --> 00:13:13,340 Russell was interested in how we recognize scenes, not just 289 00:13:13,340 --> 00:13:16,190 objects, and faces, and words, but how do we know where we are 290 00:13:16,190 --> 00:13:19,670 and how do we recognize if the scene is a city, or a beach, 291 00:13:19,670 --> 00:13:21,620 or whatever it is? 292 00:13:21,620 --> 00:13:24,080 I said, OK, Russell, we'll just scan people, 293 00:13:24,080 --> 00:13:26,270 looking at pictures of scenes, and looking 294 00:13:26,270 --> 00:13:27,830 at other kinds of pictures. 295 00:13:27,830 --> 00:13:29,540 And we'll just kind of see if there's 296 00:13:29,540 --> 00:13:32,240 any part of the brain that responds a lot to scenes. 297 00:13:32,240 --> 00:13:34,037 It really was not well thought out. 298 00:13:34,037 --> 00:13:35,870 This is not how you should do an experiment. 299 00:13:35,870 --> 00:13:39,470 It shouldn't be based on political calculations, lack 300 00:13:39,470 --> 00:13:41,190 of theory, any of the above. 301 00:13:41,190 --> 00:13:43,340 But the fact is that's why we did that experiment. 302 00:13:43,340 --> 00:13:45,790 Russell needed to be able to show a brain image in his job 303 00:13:45,790 --> 00:13:46,290 talk. 304 00:13:46,290 --> 00:13:49,160 So we scan some people looking at scenes. 305 00:13:49,160 --> 00:13:51,230 And the results knocked our socks off. 306 00:13:51,230 --> 00:13:54,470 We found a part of the brain that responds very selectively 307 00:13:54,470 --> 00:13:56,630 when you look at images of scenes, 308 00:13:56,630 --> 00:13:59,120 not when you look at faces, objects, words, 309 00:13:59,120 --> 00:14:01,350 or pretty much anything else. 310 00:14:01,350 --> 00:14:03,900 And so we'll learn more about that later in the course. 311 00:14:03,900 --> 00:14:06,110 We called it the parahippocampal place area. 312 00:14:06,110 --> 00:14:09,260 And that launched a whole major line of work in my lab 313 00:14:09,260 --> 00:14:11,540 and now dozens of other labs around the world. 314 00:14:13,700 --> 00:14:15,650 Backtrack-- we'd already found that region. 315 00:14:15,650 --> 00:14:17,540 And here's this lime in my friend 316 00:14:17,540 --> 00:14:21,090 Bob's brain, sitting right next to the parahippocampal place 317 00:14:21,090 --> 00:14:21,590 area. 318 00:14:28,420 --> 00:14:33,790 Then I remembered, let's look at the scans from my lab 319 00:14:33,790 --> 00:14:37,600 from a few years ago in Bob's brain. 320 00:14:37,600 --> 00:14:42,370 I fiddled around and managed to download the files. 321 00:14:42,370 --> 00:14:43,600 And there it was. 322 00:14:43,600 --> 00:14:45,850 You could see that same blob. 323 00:14:45,850 --> 00:14:47,710 But in the scans from a few years 324 00:14:47,710 --> 00:14:48,970 before, it was much smaller. 325 00:14:48,970 --> 00:14:50,095 It was the size of a grape. 326 00:14:52,540 --> 00:14:54,820 That told us a bunch of things. 327 00:14:54,820 --> 00:14:57,760 Most importantly, it told us this thing is growing really 328 00:14:57,760 --> 00:14:59,140 slowly. 329 00:14:59,140 --> 00:15:02,350 And that was hugely important, because brain tumors 330 00:15:02,350 --> 00:15:04,210 are very bad news. 331 00:15:04,210 --> 00:15:05,890 And they usually grow really fast. 
332 00:15:05,890 --> 00:15:08,050 And the fact that it grew really slowly 333 00:15:08,050 --> 00:15:11,650 told us that this was not one of the kind of worst, most 334 00:15:11,650 --> 00:15:13,300 invasive, most horrible ones. 335 00:15:13,300 --> 00:15:14,950 It was clearly a problem. 336 00:15:14,950 --> 00:15:16,510 It was big. 337 00:15:16,510 --> 00:15:21,360 But at least it wasn't growing hugely fast. 338 00:15:21,360 --> 00:15:25,050 But how poignant that there was in my own damn data, 339 00:15:25,050 --> 00:15:27,780 and I hadn't seen it in my friend's brain. 340 00:15:27,780 --> 00:15:30,330 Well, I'm not a radiologist. 341 00:15:30,330 --> 00:15:31,680 I'm a basic researcher. 342 00:15:31,680 --> 00:15:35,520 And I didn't look, and I didn't see it. 343 00:15:35,520 --> 00:15:38,850 Indeed, the next day, the docs told us 344 00:15:38,850 --> 00:15:41,647 that they thought this was meningioma, not cancer. 345 00:15:41,647 --> 00:15:43,980 Who knew that you could have tumors that weren't cancer? 346 00:15:43,980 --> 00:15:45,690 But you can. 347 00:15:45,690 --> 00:15:48,030 And they still need to come out, if they're big enough. 348 00:15:48,030 --> 00:15:49,140 And that's very serious. 349 00:15:49,140 --> 00:15:54,663 But it's not as bad as having a cancer in your brain. 350 00:15:54,663 --> 00:15:56,580 As we're collecting information, the next day, 351 00:15:56,580 --> 00:15:59,040 I'm hanging out in the hospital room. 352 00:15:59,040 --> 00:16:01,950 And there was an amusing moment when one of the residents 353 00:16:01,950 --> 00:16:02,450 came by. 354 00:16:02,450 --> 00:16:04,033 And he's taking the history and asking 355 00:16:04,033 --> 00:16:05,200 all of the basic questions. 356 00:16:05,200 --> 00:16:07,020 And I said kind of sheepishly-- because you 357 00:16:07,020 --> 00:16:09,265 don't want to seem like more than the residents. 358 00:16:09,265 --> 00:16:10,890 And in fact, I didn't really know more, 359 00:16:10,890 --> 00:16:13,057 but I just thought I'd provide a little information. 360 00:16:13,057 --> 00:16:16,800 And I said, he's actually had symptoms for a bunch of years, 361 00:16:16,800 --> 00:16:18,660 and there's a region of the brain 362 00:16:18,660 --> 00:16:21,660 nearby that I've actually studied a little bit. 363 00:16:21,660 --> 00:16:23,880 And the resident says, like, we know who you are. 364 00:16:27,810 --> 00:16:33,240 So much for my trying to stay under the radar. 365 00:16:33,240 --> 00:16:36,300 That afternoon, I talked to a neurosurgeon friend of mine, 366 00:16:36,300 --> 00:16:38,460 because I figured, OK, we need advice. 367 00:16:38,460 --> 00:16:39,450 We need help. 368 00:16:39,450 --> 00:16:45,250 And the neurosurgeon friend said, 369 00:16:45,250 --> 00:16:47,400 quote-- it got branded in my brain-- 370 00:16:47,400 --> 00:16:50,040 she said, "it is of paramount importance 371 00:16:50,040 --> 00:16:52,320 that you find the best neurosurgeon. 372 00:16:52,320 --> 00:16:56,040 It's the difference between whether Bob dies on the table 373 00:16:56,040 --> 00:16:59,143 or goes on to live a normal life." 374 00:16:59,143 --> 00:17:00,810 This is the privilege part of the story. 375 00:17:00,810 --> 00:17:03,600 I'm not that well connected, but I'm a little bit connected. 376 00:17:03,600 --> 00:17:06,270 And I kind of dug around, and did what I could. 377 00:17:06,270 --> 00:17:08,940 And we spent a couple of weeks, and we 378 00:17:08,940 --> 00:17:12,599 found the best neurosurgeon. 
379 00:17:12,599 --> 00:17:18,420 And the night before the surgery, 380 00:17:18,420 --> 00:17:20,730 Bob is staying over at my house, because the surgery 381 00:17:20,730 --> 00:17:22,420 was in a Boston hospital. 382 00:17:22,420 --> 00:17:25,150 And I thought, I've been dancing around this for years, 383 00:17:25,150 --> 00:17:26,579 but now it's all out in the open. 384 00:17:26,579 --> 00:17:27,662 We know there's a problem. 385 00:17:27,662 --> 00:17:28,890 And I'm going to test him. 386 00:17:28,890 --> 00:17:31,560 I'm going to find out what the hell's going on. 387 00:17:31,560 --> 00:17:35,460 This is, after all, one of the basic forms of data 388 00:17:35,460 --> 00:17:37,080 that we collect in my field-- that 389 00:17:37,080 --> 00:17:39,570 is, testing people with problems in their brain 390 00:17:39,570 --> 00:17:41,970 to try to figure out what things they can do 391 00:17:41,970 --> 00:17:43,290 and what things they can't do. 392 00:17:43,290 --> 00:17:45,570 It's a way of figuring out what the basic components 393 00:17:45,570 --> 00:17:46,950 of the mind and brain are. 394 00:17:46,950 --> 00:17:49,890 It's actually the oldest, most venerable method in our field, 395 00:17:49,890 --> 00:17:52,600 and it's still a hugely important one. 396 00:17:52,600 --> 00:17:54,180 So I thought, what the hell. 397 00:17:54,180 --> 00:17:57,630 So I said, OK, Bob, draw me a sketch map of the floor 398 00:17:57,630 --> 00:18:00,330 plan of your house. 399 00:18:00,330 --> 00:18:03,210 Bob takes a few minutes and he draws this thing. 400 00:18:03,210 --> 00:18:05,610 And it was shocking. 401 00:18:05,610 --> 00:18:09,960 There weren't even-- the rooms in a rectilinearly arranged 402 00:18:09,960 --> 00:18:12,480 house, they weren't even aligned. 403 00:18:12,480 --> 00:18:14,280 There was, like, a soup of lines. 404 00:18:14,280 --> 00:18:17,490 There was no organization from one room to the next. 405 00:18:17,490 --> 00:18:20,880 And Bob kind of realized, this isn't right, is it? 406 00:18:20,880 --> 00:18:22,590 But he didn't know how to fix it. 407 00:18:22,590 --> 00:18:24,685 And he said he just couldn't visualize 408 00:18:24,685 --> 00:18:26,310 what it looked like to be in his house, 409 00:18:26,310 --> 00:18:29,570 and so he couldn't draw the floor plan. 410 00:18:29,570 --> 00:18:32,070 And I thought, OK, he hasn't been there in a couple of days. 411 00:18:32,070 --> 00:18:33,570 So I gave him another piece of paper 412 00:18:33,570 --> 00:18:36,870 and I said, OK, draw the floor plan of my house, where 413 00:18:36,870 --> 00:18:40,290 you are right now. 414 00:18:40,290 --> 00:18:45,750 Bob took a couple of minutes and delivered a similar mess. 415 00:18:45,750 --> 00:18:49,140 He couldn't even imagine the layout of the room next 416 00:18:49,140 --> 00:18:54,230 to him, that he'd been in a few minutes before. 417 00:18:54,230 --> 00:18:57,260 And then, trying to channel my inner neuropsychologist, 418 00:18:57,260 --> 00:18:58,898 I thought OK. 419 00:18:58,898 --> 00:19:00,440 Gave him another piece of paper and I 420 00:19:00,440 --> 00:19:03,710 said, OK, Bob, draw a bicycle. 421 00:19:03,710 --> 00:19:04,850 Why did I choose a bicycle? 
422 00:19:04,850 --> 00:19:06,500 Because it's a multi-part object that 423 00:19:06,500 --> 00:19:08,120 has a bunch of different bits that 424 00:19:08,120 --> 00:19:10,250 have a particular relationship to each other, 425 00:19:10,250 --> 00:19:11,960 just as the rooms in a house have 426 00:19:11,960 --> 00:19:14,570 a particular spatial relationship to each other. 427 00:19:14,570 --> 00:19:17,150 And I wanted to know, is his problem specifically 428 00:19:17,150 --> 00:19:21,650 about places, or is it about any complex, multi-part thing 429 00:19:21,650 --> 00:19:25,040 that you have to remember the relationships to? 430 00:19:25,040 --> 00:19:27,320 Bob is no artist, to put it mildly. 431 00:19:27,320 --> 00:19:30,620 But his bicycle was clearly recognizable as a bicycle. 432 00:19:30,620 --> 00:19:32,870 It had the two wheels in the right relationship, 433 00:19:32,870 --> 00:19:34,940 and it had all of the basic parts 434 00:19:34,940 --> 00:19:37,430 in roughly the right place. 435 00:19:37,430 --> 00:19:40,760 I then had him draw a lobster, another multi-part object. 436 00:19:40,760 --> 00:19:42,830 And also, his lobster was not beautiful, 437 00:19:42,830 --> 00:19:46,250 but had everything in the right place. 438 00:19:46,250 --> 00:19:50,120 That's very telling. He had a specific problem in-- 439 00:19:50,120 --> 00:19:52,700 I don't know-- imagining, reproducing, remembering? 440 00:19:52,700 --> 00:19:54,350 It's not totally clear. 441 00:19:54,350 --> 00:19:56,450 The arrangements of parts in a room, 442 00:19:56,450 --> 00:20:00,455 but not the arrangements of parts in an object. 443 00:20:00,455 --> 00:20:02,900 And we'll get back to that more in a few weeks. 444 00:20:09,770 --> 00:20:11,690 What do I want to say here? 445 00:20:11,690 --> 00:20:13,970 I said all of that. 446 00:20:13,970 --> 00:20:17,990 The next day, Bob has an 11-hour surgery. 447 00:20:17,990 --> 00:20:21,560 Major, hardcore, extreme neurosurgery. 448 00:20:21,560 --> 00:20:24,990 Remove a huge piece of bone from the back of your head, 449 00:20:24,990 --> 00:20:27,440 pull apart the hemispheres of the brain like this, 450 00:20:27,440 --> 00:20:31,580 go in multiple inches and remove a lime. 451 00:20:31,580 --> 00:20:33,980 Holy crap, right? 452 00:20:33,980 --> 00:20:38,090 Said lime was right near the vein of Galen. 453 00:20:38,090 --> 00:20:40,830 Galen lived, what, a couple of thousand years ago? 454 00:20:40,830 --> 00:20:42,380 The fact that there's a vein of Galen 455 00:20:42,380 --> 00:20:44,600 means it's a big-ass vein-- the kind of vein 456 00:20:44,600 --> 00:20:47,930 that even Galen would have found with dissection 457 00:20:47,930 --> 00:20:49,910 2,000 years ago. 458 00:20:49,910 --> 00:20:52,700 This lime was all wrapped around and interleaved 459 00:20:52,700 --> 00:20:53,930 with the vein of Galen. 460 00:20:53,930 --> 00:20:55,350 Not good. 461 00:20:55,350 --> 00:20:58,325 But because we found the best neurosurgeon, 462 00:20:58,325 --> 00:21:02,460 and because we have extreme privilege and all 463 00:21:02,460 --> 00:21:04,400 of the possible medical resources 464 00:21:04,400 --> 00:21:06,860 and expertise you could possibly hope for, 465 00:21:06,860 --> 00:21:08,660 Bob sailed through the surgery. 466 00:21:08,660 --> 00:21:11,690 And an hour after the surgery, I'm chatting with him 467 00:21:11,690 --> 00:21:13,070 and he's making sense. 468 00:21:13,070 --> 00:21:14,640 Amazing, right? 469 00:21:18,290 --> 00:21:22,230 And literally, two days later, they sent him home. 
470 00:21:22,230 --> 00:21:24,230 And a few days after that, he's back at work. 471 00:21:24,230 --> 00:21:25,880 No problem. 472 00:21:25,880 --> 00:21:27,990 Totally fine. 473 00:21:27,990 --> 00:21:31,730 But now we get to the question you're probably thinking about. 474 00:21:31,730 --> 00:21:34,880 What about his navigational abilities? 475 00:21:34,880 --> 00:21:38,270 The sad answer is, nothing doing. 476 00:21:38,270 --> 00:21:41,840 None of it came back at all. 477 00:21:41,840 --> 00:21:43,760 Thank god for iPhones. 478 00:21:43,760 --> 00:21:46,910 If Bob lived 30 years ago, he wouldn't be able to function. 479 00:21:46,910 --> 00:21:50,570 But he goes everywhere using his iPhone GPS-- 480 00:21:50,570 --> 00:21:53,450 everywhere. 481 00:21:53,450 --> 00:21:55,970 And this fact that he didn't recover 482 00:21:55,970 --> 00:21:58,880 his navigational abilities is consistent 483 00:21:58,880 --> 00:22:00,380 with the whole literature that we'll 484 00:22:00,380 --> 00:22:02,840 consider later in the course-- 485 00:22:02,840 --> 00:22:05,660 that, often-- not always, but often, 486 00:22:05,660 --> 00:22:07,970 if you have brain damage, especially 487 00:22:07,970 --> 00:22:09,590 to some of these very specialized 488 00:22:09,590 --> 00:22:13,430 circuits that we'll talk about, you don't recover later. 489 00:22:13,430 --> 00:22:16,970 If the damage is early, you may well recover-- early in life, 490 00:22:16,970 --> 00:22:18,560 you may well recover. 491 00:22:18,560 --> 00:22:21,050 Children have much more plastic brains that 492 00:22:21,050 --> 00:22:23,030 can adjust after brain damage. 493 00:22:23,030 --> 00:22:24,860 Adults, not so good. 494 00:22:29,840 --> 00:22:31,520 Bob's doing fine. 495 00:22:31,520 --> 00:22:33,410 That's my story. 496 00:22:33,410 --> 00:22:35,970 Any thoughts or questions? 497 00:22:35,970 --> 00:22:36,470 Yeah? 498 00:22:36,470 --> 00:22:39,507 AUDIENCE: Can he tell the difference between right 499 00:22:39,507 --> 00:22:40,340 to left [INAUDIBLE]? 500 00:22:40,340 --> 00:22:41,540 NANCY KANWISHER: Yes. 501 00:22:41,540 --> 00:22:42,830 Yes. 502 00:22:42,830 --> 00:22:44,690 And it's very interesting. 503 00:22:44,690 --> 00:22:46,820 There are many of his spatial abilities that 504 00:22:46,820 --> 00:22:51,860 are absolutely intact, and yet the ones related to navigation 505 00:22:51,860 --> 00:22:53,780 are not. 506 00:22:53,780 --> 00:22:54,280 Yeah? 507 00:22:54,280 --> 00:22:55,238 AUDIENCE: Can he drive? 508 00:22:55,238 --> 00:22:57,520 NANCY KANWISHER: Yeah, no problem. 509 00:22:57,520 --> 00:23:00,580 But he's always looking at his damn phone to get directions, 510 00:23:00,580 --> 00:23:06,503 or to listen to the GPS directions system. 511 00:23:06,503 --> 00:23:07,420 Driving is no problem. 512 00:23:07,420 --> 00:23:09,378 It's another kind of left-right-- the immediate 513 00:23:09,378 --> 00:23:12,320 spatial orientation abilities are absolutely fine. 514 00:23:12,320 --> 00:23:15,370 But knowing, where am I now, and how would I get there 515 00:23:15,370 --> 00:23:18,933 from here, is blitzed. 516 00:23:18,933 --> 00:23:19,600 Other questions? 517 00:23:19,600 --> 00:23:20,100 Yeah? 518 00:23:20,100 --> 00:23:22,570 AUDIENCE: Can he recognize familiar places? 519 00:23:22,570 --> 00:23:24,070 NANCY KANWISHER: Great question. 520 00:23:24,070 --> 00:23:26,950 Yes, he can recognize familiar places. 
521 00:23:26,950 --> 00:23:29,950 What he can't do is, he can say, oh, right, 522 00:23:29,950 --> 00:23:33,940 that's the front of our house, or that's 523 00:23:33,940 --> 00:23:36,910 such-and-such cafe that's near our house. 524 00:23:36,910 --> 00:23:38,890 What he can't do is say, which way 525 00:23:38,890 --> 00:23:41,840 would you turn from there to go home? 526 00:23:41,840 --> 00:23:45,747 AUDIENCE: Can he string together multiple [INAUDIBLE]? 527 00:23:45,747 --> 00:23:47,080 NANCY KANWISHER: Great question. 528 00:23:47,080 --> 00:23:47,980 Great question. 529 00:23:47,980 --> 00:23:49,960 A little bit. 530 00:23:49,960 --> 00:23:52,270 He can navigate a little bit with his GPS. 531 00:23:52,270 --> 00:23:54,130 And because he's learned certain routes 532 00:23:54,130 --> 00:23:56,922 as a series of almost verbal commands-- 533 00:23:56,922 --> 00:23:59,380 if you're here, turn right, then there, nur, nur, nur, nur. 534 00:23:59,380 --> 00:24:00,770 That whole kind of thing. 535 00:24:00,770 --> 00:24:02,920 It's not what any of you guys could do. 536 00:24:02,920 --> 00:24:05,800 If you guys are driving around in Cambridge 537 00:24:05,800 --> 00:24:07,820 or walking around campus-- 538 00:24:07,820 --> 00:24:10,630 remember when they blocked off this whole middle of campus 539 00:24:10,630 --> 00:24:11,740 a couple of years ago? 540 00:24:11,740 --> 00:24:12,910 It was so irritating. 541 00:24:12,910 --> 00:24:14,230 I would like go there, and it's like, oh god, 542 00:24:14,230 --> 00:24:15,188 they've blocked it off. 543 00:24:15,188 --> 00:24:17,200 I can't get over to Lobby 7. 544 00:24:17,200 --> 00:24:19,450 Well, you immediately come up with an alternate route. 545 00:24:19,450 --> 00:24:21,250 It's like, OK, I guess we're going to have to do this. 546 00:24:21,250 --> 00:24:22,750 You come up with an alternate route. 547 00:24:22,750 --> 00:24:25,600 This is what a normal navigation system can do. 548 00:24:25,600 --> 00:24:27,070 Bob can't do that at all. 549 00:24:27,070 --> 00:24:28,840 He's like, route blocked? 550 00:24:28,840 --> 00:24:29,560 No idea. 551 00:24:29,560 --> 00:24:32,680 Get out the phone. 552 00:24:32,680 --> 00:24:33,180 Yeah? 553 00:24:33,180 --> 00:24:35,640 AUDIENCE: Is he good at estimating distances? 554 00:24:35,640 --> 00:24:38,220 Does he know something is a certain number of miles 555 00:24:38,220 --> 00:24:39,296 away, or? 556 00:24:39,296 --> 00:24:40,650 NANCY KANWISHER: Yes. 557 00:24:40,650 --> 00:24:41,970 Yes, he is. 558 00:24:41,970 --> 00:24:44,340 And that's very interesting. 559 00:24:44,340 --> 00:24:46,530 But that seems to be kind of a different thing. 560 00:24:46,530 --> 00:24:48,863 You could think about all of the different kinds of cues 561 00:24:48,863 --> 00:24:53,040 you have for distance beyond your kind of literal navigation 562 00:24:53,040 --> 00:24:54,880 skills. 563 00:24:54,880 --> 00:24:55,380 Yeah? 564 00:24:55,380 --> 00:24:56,380 AUDIENCE: [INAUDIBLE]? 565 00:25:00,380 --> 00:25:01,640 NANCY KANWISHER: A little bit. 566 00:25:01,640 --> 00:25:03,680 A couple of minutes, yes. 567 00:25:03,680 --> 00:25:04,760 The next day-- 568 00:25:04,760 --> 00:25:06,830 I mean, it would be kind of like this thing. 569 00:25:06,830 --> 00:25:09,470 It's like, I sort of vaguely remember that when I was here, 570 00:25:09,470 --> 00:25:11,485 I turned right, so I'd better do that again. 571 00:25:11,485 --> 00:25:12,610 Yes, did you have a question? 
572 00:25:12,610 --> 00:25:17,080 AUDIENCE: Can he navigate within buildings? 573 00:25:17,080 --> 00:25:18,560 NANCY KANWISHER: No, not very well. 574 00:25:18,560 --> 00:25:22,390 And this is a problem, because iPhones don't usually-- yeah. 575 00:25:22,390 --> 00:25:25,330 New hotels, big problem. 576 00:25:25,330 --> 00:25:28,120 Finding the bathroom down the hall, or the front door 577 00:25:28,120 --> 00:25:31,240 in a hotel, big problem. 578 00:25:31,240 --> 00:25:32,395 Yeah. 579 00:25:32,395 --> 00:25:33,895 I mean, these are problems you can-- 580 00:25:33,895 --> 00:25:36,310 you can come up with workarounds. 581 00:25:36,310 --> 00:25:40,030 It's not life-threatening, but it's extremely inconvenient. 582 00:25:40,030 --> 00:25:41,050 Yeah? 583 00:25:41,050 --> 00:25:45,050 AUDIENCE: Is it the case that those navigational skills that 584 00:25:45,050 --> 00:25:48,770 develop long-term, like a long time ago are stronger? 585 00:25:48,770 --> 00:25:51,462 So he has a harder time developing-- 586 00:25:51,462 --> 00:25:53,420 for example, you said new hotels are a problem. 587 00:25:53,420 --> 00:25:56,505 But if it is places that are more familiar, like his home, 588 00:25:56,505 --> 00:25:57,880 is it easier for him to navigate? 589 00:25:57,880 --> 00:25:59,505 NANCY KANWISHER: It's a great question. 590 00:25:59,505 --> 00:26:01,000 And you might think that the kind 591 00:26:01,000 --> 00:26:04,102 of navigational maps you laid down long ago would be intact. 592 00:26:04,102 --> 00:26:05,935 So is it just that you can't learn new ones? 593 00:26:08,610 --> 00:26:10,130 It's a great question. 594 00:26:10,130 --> 00:26:12,460 The answer is kind of complicated in this case. 595 00:26:12,460 --> 00:26:15,802 For routes that he's memorized-- 596 00:26:15,802 --> 00:26:18,010 there's a whole different system for knowing a route, 597 00:26:18,010 --> 00:26:20,620 and really having an abstract knowledge 598 00:26:20,620 --> 00:26:23,380 of a place that enables you to devise a new route if something 599 00:26:23,380 --> 00:26:24,760 is blocked on that route. 600 00:26:24,760 --> 00:26:27,880 For highly over-learned routes, he's OK. 601 00:26:27,880 --> 00:26:29,410 He remembers the [INAUDIBLE]. 602 00:26:29,410 --> 00:26:32,470 It's like a memorized motor sequence. 603 00:26:32,470 --> 00:26:35,020 You do A, and then B, and then C, and then D. 604 00:26:35,020 --> 00:26:37,840 He's OK with those, with routes he learned long ago. 605 00:26:37,840 --> 00:26:42,550 But he is not good at coming up with a new route in a place 606 00:26:42,550 --> 00:26:43,690 that he learned long ago. 607 00:26:46,300 --> 00:26:48,678 We'll take one last question. 608 00:26:48,678 --> 00:26:50,715 AUDIENCE: Does he have conscious access 609 00:26:50,715 --> 00:26:55,676 to past knowledge that [INAUDIBLE] 610 00:26:55,676 --> 00:26:57,968 And does he have conscious knowledge that [INAUDIBLE]?? 611 00:27:03,076 --> 00:27:04,860 NANCY KANWISHER: No, he knows-- 612 00:27:04,860 --> 00:27:06,420 well, he knows, because when he tries 613 00:27:06,420 --> 00:27:08,820 to figure out which way to head, he has no idea. 614 00:27:08,820 --> 00:27:12,300 He's extremely aware of it, and very articulate 615 00:27:12,300 --> 00:27:14,310 on precisely what happens. 616 00:27:14,310 --> 00:27:18,608 What he says is, if he's looking at a place-- 617 00:27:18,608 --> 00:27:19,650 here's something he says. 618 00:27:19,650 --> 00:27:20,718 He's looking at a place. 
619 00:27:20,718 --> 00:27:22,260 He knows where he is, because there's 620 00:27:22,260 --> 00:27:23,820 all kinds of other bits of information 621 00:27:23,820 --> 00:27:25,410 that tell you where you are, because you intended 622 00:27:25,410 --> 00:27:27,510 to go there, and the relevant things are happening, and all. 623 00:27:27,510 --> 00:27:28,620 So he knows where he is. 624 00:27:28,620 --> 00:27:31,110 And it looks familiar. 625 00:27:31,110 --> 00:27:35,350 If he tries to imagine what's behind him, 626 00:27:35,350 --> 00:27:39,180 he says that he starts to get it and it just kind of vaporizes. 627 00:27:39,180 --> 00:27:40,630 He just can't hang on to it. 628 00:27:40,630 --> 00:27:44,400 He can't kind of construct a stable mental image 629 00:27:44,400 --> 00:27:46,518 of nearby places. 630 00:27:46,518 --> 00:27:48,060 I don't know exactly what that means, 631 00:27:48,060 --> 00:27:51,390 but he's very articulate, and can report what happens-- 632 00:27:51,390 --> 00:27:54,060 what he experiences when he tries to access 633 00:27:54,060 --> 00:27:56,640 this kind of information. 634 00:27:56,640 --> 00:27:57,940 What you guys-- we'll go on. 635 00:27:57,940 --> 00:27:59,815 But what I want to say is, what you guys just 636 00:27:59,815 --> 00:28:02,370 did is exactly what we do in my field. 637 00:28:02,370 --> 00:28:05,820 We try to take a mental ability and tease it apart 638 00:28:05,820 --> 00:28:08,250 and say, is it exactly this or is it exactly that? 639 00:28:08,250 --> 00:28:12,090 And you guys all just did it beautifully. 640 00:28:12,090 --> 00:28:14,370 A lot of what we do in my field is 641 00:28:14,370 --> 00:28:18,480 kind of this common-sense parsing of mental abilities. 642 00:28:18,480 --> 00:28:21,480 What is a particular mental ability-- 643 00:28:21,480 --> 00:28:23,100 how does it relate to some other one? 644 00:28:23,100 --> 00:28:24,300 Are these things separable? 645 00:28:24,300 --> 00:28:26,130 Can you lose one and not the other? 646 00:28:26,130 --> 00:28:30,240 Do they live in different parts of the brain, et cetera? 647 00:28:30,240 --> 00:28:31,350 All right. 648 00:28:31,350 --> 00:28:33,060 That's the story. 649 00:28:33,060 --> 00:28:40,020 I'm going to cash out some of the particular themes that 650 00:28:40,020 --> 00:28:43,380 came out from the story that will echo through this course. 651 00:28:43,380 --> 00:28:45,090 And the first and most obvious one 652 00:28:45,090 --> 00:28:48,330 is, the brain isn't just a big bunch of mush. 653 00:28:48,330 --> 00:28:49,710 It has structure. 654 00:28:49,710 --> 00:28:51,400 It has organization. 655 00:28:51,400 --> 00:28:53,145 The different bits do different things. 656 00:28:55,800 --> 00:28:59,040 Importantly, when Bob had this big lime in his head, 657 00:28:59,040 --> 00:29:01,420 he didn't just get a little bit stupid. 658 00:29:01,420 --> 00:29:01,920 No. 659 00:29:01,920 --> 00:29:05,100 His IQ, if he'd taken an IQ test, would be unchanged. 660 00:29:05,100 --> 00:29:09,210 He lost a very specific mental ability. 661 00:29:09,210 --> 00:29:12,750 And that is fascinating, but it's also 662 00:29:12,750 --> 00:29:13,860 good news for science. 
663 00:29:13,860 --> 00:29:16,740 Because often, when you try to understand a complicated thing, 664 00:29:16,740 --> 00:29:19,050 a great way to make progress is to first figure out 665 00:29:19,050 --> 00:29:21,480 what the parts are, and then later try to figure out, 666 00:29:21,480 --> 00:29:23,160 how does each individual part work 667 00:29:23,160 --> 00:29:24,510 and how do they work together? 668 00:29:24,510 --> 00:29:26,100 But if there's part structure, there's 669 00:29:26,100 --> 00:29:27,183 at least a place to start. 670 00:29:30,960 --> 00:29:33,870 Second theme is that some parts of the brain 671 00:29:33,870 --> 00:29:36,840 do extremely specific things. 672 00:29:36,840 --> 00:29:37,765 Not all of them. 673 00:29:37,765 --> 00:29:39,390 Some of them are quite general, and are 674 00:29:39,390 --> 00:29:41,550 engaged in lots of different mental processes. 675 00:29:41,550 --> 00:29:43,770 But some are remarkably specific. 676 00:29:43,770 --> 00:29:46,380 We'll talk a lot about that. 677 00:29:46,380 --> 00:29:48,090 Third big theme. 678 00:29:48,090 --> 00:29:51,630 The organization of the brain echoes the architecture 679 00:29:51,630 --> 00:29:52,185 of the mind. 680 00:29:55,530 --> 00:29:58,110 And I would say, the fundamental pieces of the brain 681 00:29:58,110 --> 00:30:01,560 are telling us what are fundamental parts of the mind. 682 00:30:01,560 --> 00:30:03,120 And that's why I'm in this field. 683 00:30:03,120 --> 00:30:04,350 That's what I think is cool. 684 00:30:04,350 --> 00:30:06,163 The brain is just a bunch of cells. 685 00:30:06,163 --> 00:30:07,080 It's a physical thing. 686 00:30:07,080 --> 00:30:08,610 Who cares about a physical thing? 687 00:30:08,610 --> 00:30:11,640 The reason we care about it is, that's where our mind lives. 688 00:30:11,640 --> 00:30:13,140 And if we study that physical thing, 689 00:30:13,140 --> 00:30:15,580 we can learn something about our minds. 690 00:30:15,580 --> 00:30:18,480 And that's pretty cosmic, I think. 691 00:30:18,480 --> 00:30:20,160 The point of all of this kind of work 692 00:30:20,160 --> 00:30:23,340 is not to say, oh, that mental process is here, not there. 693 00:30:23,340 --> 00:30:24,167 Who cares? 694 00:30:24,167 --> 00:30:25,000 I don't really care. 695 00:30:25,000 --> 00:30:27,000 I mean, at some point, you need to have a ballpark sense. 696 00:30:27,000 --> 00:30:28,620 You need to know to study the things. 697 00:30:28,620 --> 00:30:30,720 But the interesting question is not 698 00:30:30,720 --> 00:30:32,370 where these things are in the brain, 699 00:30:32,370 --> 00:30:35,130 but which mental processes have their own specialized 700 00:30:35,130 --> 00:30:36,630 machinery, and why those? 701 00:30:40,000 --> 00:30:41,410 Another important theme. 702 00:30:41,410 --> 00:30:43,420 How do brains change? 703 00:30:43,420 --> 00:30:45,820 Bob didn't recover after his brain damage, 704 00:30:45,820 --> 00:30:49,420 in that very particular mental function that he lost. 705 00:30:49,420 --> 00:30:52,480 If all of that had happened when he was five years old, 706 00:30:52,480 --> 00:30:55,090 he probably would have. 707 00:30:55,090 --> 00:30:58,160 How do brains change over normal development? 708 00:30:58,160 --> 00:31:01,090 How do they change from learning and experience? 709 00:31:01,090 --> 00:31:03,610 How do they change after injury? 
710 00:31:03,610 --> 00:31:06,760 And the final theme echoed in that story 711 00:31:06,760 --> 00:31:08,710 is, there are lots and lots of different ways 712 00:31:08,710 --> 00:31:10,660 to study the brain. 713 00:31:10,660 --> 00:31:12,940 There are the simple behavioral observations. 714 00:31:12,940 --> 00:31:15,190 Bob can't navigate, but he can do everything else. 715 00:31:15,190 --> 00:31:17,800 OK, that's really deep and informative-- low-tech, 716 00:31:17,800 --> 00:31:20,950 but really powerful. 717 00:31:20,950 --> 00:31:23,470 The anatomical brain images that showed 718 00:31:23,470 --> 00:31:25,718 where the lime was in Bob's brain, 719 00:31:25,718 --> 00:31:27,510 that gives you another kind of information. 720 00:31:27,510 --> 00:31:30,380 What's the physical structure of the brain? 721 00:31:30,380 --> 00:31:32,110 The functional images that we had 722 00:31:32,110 --> 00:31:36,610 done in my lab to discover the parahippocampal place area, 723 00:31:36,610 --> 00:31:39,970 and the studies of what mental abilities are preserved 724 00:31:39,970 --> 00:31:41,860 and which are lost in people who have 725 00:31:41,860 --> 00:31:44,265 alterations of their brain. 726 00:31:44,265 --> 00:31:45,640 Those are just a few of the kinds 727 00:31:45,640 --> 00:31:47,680 of methods in our field, each of which 728 00:31:47,680 --> 00:31:52,690 tells us about a different kind of thing about the brain. 729 00:31:52,690 --> 00:31:57,040 Those are the themes I was trying to get at here. 730 00:31:57,040 --> 00:32:00,760 Let's move on to the why, how, and what 731 00:32:00,760 --> 00:32:02,430 of exploring the brain. 732 00:32:02,430 --> 00:32:06,190 I'm going to assign the TAs to get me to shut up at-- 733 00:32:06,190 --> 00:32:06,760 let's see. 734 00:32:06,760 --> 00:32:07,930 We're supposed to end at 5 minutes 735 00:32:07,930 --> 00:32:09,430 before the end of class, is that right? 736 00:32:09,430 --> 00:32:10,570 Is that the MIT tradition? 737 00:32:10,570 --> 00:32:14,770 OK, so at-- oh, my, shockingly soon-- 738 00:32:14,770 --> 00:32:18,100 11:45, you're going to-- 739 00:32:18,100 --> 00:32:19,900 AUDIENCE: [INAUDIBLE] 740 00:32:19,900 --> 00:32:21,940 NANCY KANWISHER: Oh, great, thank you. 741 00:32:21,940 --> 00:32:22,570 Thank you. 742 00:32:22,570 --> 00:32:24,940 This is one of the many things TAs are for. 743 00:32:24,940 --> 00:32:27,640 They pick up the hundreds of typos and "mindos" 744 00:32:27,640 --> 00:32:28,360 and all of that. 745 00:32:28,360 --> 00:32:28,930 Excellent. 746 00:32:28,930 --> 00:32:31,060 I'm thinking, how the hell did I so mis-time this? 747 00:32:31,060 --> 00:32:31,930 Thank you, Heather. 748 00:32:31,930 --> 00:32:32,650 OK, good. 749 00:32:32,650 --> 00:32:34,240 We'll go on. 750 00:32:34,240 --> 00:32:36,640 Why should we study the brain? 751 00:32:36,640 --> 00:32:39,280 First, most obvious reason, know thyself. 752 00:32:39,280 --> 00:32:42,800 Know what this thing is that's operating in our heads. 753 00:32:42,800 --> 00:32:46,330 This is who you are, is your brain. 754 00:32:46,330 --> 00:32:51,760 There are lots of very fine and important organs in the body, 755 00:32:51,760 --> 00:32:54,200 but the brain is special. 756 00:32:54,200 --> 00:32:56,200 So, a heart is important. 757 00:32:56,200 --> 00:32:58,390 You'd die without it. 758 00:32:58,390 --> 00:33:02,020 But it's the brain that's your identity. 759 00:33:02,020 --> 00:33:05,350 There's a reason that surgeons do heart transplants. 
760 00:33:05,350 --> 00:33:06,130 That makes sense. 761 00:33:06,130 --> 00:33:09,460 Something wrong with your heart, you need another heart, OK. 762 00:33:09,460 --> 00:33:11,650 But why don't they do brain transplants? 763 00:33:11,650 --> 00:33:13,420 That wouldn't make sense. 764 00:33:13,420 --> 00:33:15,130 If there's something wrong with my brain, 765 00:33:15,130 --> 00:33:17,770 it doesn't make sense to take someone else's brain 766 00:33:17,770 --> 00:33:20,290 and put it in here, because then I'd be that other person. 767 00:33:20,290 --> 00:33:23,808 It doesn't make sense, because the brain is who you are. 768 00:33:23,808 --> 00:33:25,100 So the brain is really special. 769 00:33:25,100 --> 00:33:27,730 It's not just another organ. 770 00:33:27,730 --> 00:33:31,030 That's why, a few years ago, we had the decade of the brain-- 771 00:33:31,030 --> 00:33:33,853 not the decade of the pancreas, or the liver, or the kidney. 772 00:33:33,853 --> 00:33:35,270 People need to study these things. 773 00:33:35,270 --> 00:33:37,120 They need to know how to fix them. 774 00:33:37,120 --> 00:33:38,105 They're important. 775 00:33:38,105 --> 00:33:39,730 But they're not as cosmic as the brain. 776 00:33:44,710 --> 00:33:48,430 Second reason why we should understand brains, 777 00:33:48,430 --> 00:33:52,450 and that is to understand the limits of human knowledge. 778 00:33:52,450 --> 00:33:54,790 The more we understand about the human mind, 779 00:33:54,790 --> 00:33:58,060 the more we can actually evaluate 780 00:33:58,060 --> 00:33:59,950 how good our knowledge is. 781 00:33:59,950 --> 00:34:03,370 Are there things that we might not be able to think? 782 00:34:03,370 --> 00:34:05,470 Possible through scientific theories 783 00:34:05,470 --> 00:34:08,139 we might not be able to understand, ever? 784 00:34:08,139 --> 00:34:09,880 You can think of studying the mind 785 00:34:09,880 --> 00:34:12,820 as a kind of empirical epistemology, a way 786 00:34:12,820 --> 00:34:14,949 to actually know about the knower 787 00:34:14,949 --> 00:34:17,080 so we can figure out how good the knowledge is 788 00:34:17,080 --> 00:34:19,060 in that knower. 789 00:34:19,060 --> 00:34:23,199 That's another reason. 790 00:34:23,199 --> 00:34:26,739 A third reason is to advance AI. 791 00:34:26,739 --> 00:34:29,830 Up until a few years ago, I used to give lectures on vision, 792 00:34:29,830 --> 00:34:33,460 and they would all start with some version of this. 793 00:34:33,460 --> 00:34:37,300 You guys all have amazing visual abilities in the back 794 00:34:37,300 --> 00:34:38,645 of your brain that does vision. 795 00:34:38,645 --> 00:34:40,270 You can do all of this incredible stuff 796 00:34:40,270 --> 00:34:42,580 that no machine can touch. 797 00:34:42,580 --> 00:34:43,690 Hats off to you. 798 00:34:43,690 --> 00:34:46,210 You have an amazing visual system back here, 799 00:34:46,210 --> 00:34:48,280 and those guys in AI-- 800 00:34:48,280 --> 00:34:49,270 it is mostly guys. 801 00:34:49,270 --> 00:34:50,230 Guys, gals, whatever. 802 00:34:50,230 --> 00:34:53,590 Those people in AI could only dream 803 00:34:53,590 --> 00:34:55,300 of coming up with algorithms as good 804 00:34:55,300 --> 00:34:59,500 as the one that's running in the back of your head. 805 00:34:59,500 --> 00:35:02,363 You can't quite start the lectures that way anymore. 
806 00:35:02,363 --> 00:35:04,030 If any of you have been living in a cave 807 00:35:04,030 --> 00:35:06,970 and not heard about deep nets, there's 808 00:35:06,970 --> 00:35:09,250 been a massive revolution. 809 00:35:09,250 --> 00:35:12,550 And all of a sudden, deep nets are 810 00:35:12,550 --> 00:35:15,070 doing things that are really close to human abilities, 811 00:35:15,070 --> 00:35:17,230 particularly in vision. 812 00:35:17,230 --> 00:35:20,050 For example, in visual object recognition, 813 00:35:20,050 --> 00:35:23,050 machines were way far behind human vision 814 00:35:23,050 --> 00:35:28,750 until very recently, especially when this paper here came out-- 815 00:35:28,750 --> 00:35:31,720 was published in 2012. 816 00:35:31,720 --> 00:35:33,970 First author, Krizhevsky. 817 00:35:33,970 --> 00:35:37,502 It has now been cited an astonishing 33,000 times. 818 00:35:37,502 --> 00:35:39,460 Actually made this slide a couple of weeks ago. 819 00:35:39,460 --> 00:35:42,130 It's probably been cited 36,000 times by now. 820 00:35:42,130 --> 00:35:44,680 You could look it up on Google Scholar and find out. 821 00:35:44,680 --> 00:35:46,930 That is a huge number of citations. 822 00:35:46,930 --> 00:35:48,845 The influence of this paper is ginormous. 823 00:35:48,845 --> 00:35:51,220 Probably half of you have already heard about this paper. 824 00:35:51,220 --> 00:35:52,870 Raise your hand if you've heard about this paper. 825 00:35:52,870 --> 00:35:53,500 Oh, OK. 826 00:35:53,500 --> 00:35:54,970 All right. 827 00:35:54,970 --> 00:35:57,790 Major, big news. 828 00:35:57,790 --> 00:36:00,190 What's so important about this paper? 829 00:36:00,190 --> 00:36:02,580 Well, they trained-- as, probably, most of you 830 00:36:02,580 --> 00:36:06,760 know-- they trained a deep net on the over 1 million images 831 00:36:06,760 --> 00:36:11,140 in ImageNet, a massive computer database of images. 832 00:36:11,140 --> 00:36:14,200 And they basically taught it to do object recognition. 833 00:36:14,200 --> 00:36:16,720 And it performed much more accurately 834 00:36:16,720 --> 00:36:21,550 than any previous system, and it approaches human abilities. 835 00:36:21,550 --> 00:36:22,320 This is major. 836 00:36:22,320 --> 00:36:24,400 This is a radical change in the situation 837 00:36:24,400 --> 00:36:26,320 that we were in five years ago. 838 00:36:26,320 --> 00:36:28,000 Things have changed radically. 839 00:36:28,000 --> 00:36:31,420 Just as an example, here's one of the figures 840 00:36:31,420 --> 00:36:34,030 from that seminal paper. 841 00:36:34,030 --> 00:36:36,880 Here is one of the images from ImageNet 842 00:36:36,880 --> 00:36:40,360 that AlexNet, this trained network, was tested on. 843 00:36:40,360 --> 00:36:42,460 And the correct answer, according to ImageNet, 844 00:36:42,460 --> 00:36:44,230 is that that's a mite. 845 00:36:44,230 --> 00:36:45,820 And here's what AlexNet says. 846 00:36:45,820 --> 00:36:48,220 Its number one first answer is mite, 847 00:36:48,220 --> 00:36:50,110 and its second, third, fourth answers 848 00:36:50,110 --> 00:36:53,230 are black widow, cockroach, et cetera. 849 00:36:53,230 --> 00:36:53,938 Pretty damn good. 850 00:36:53,938 --> 00:36:56,105 The mite is even sticking off the edge of the frame, 851 00:36:56,105 --> 00:36:56,980 and it gets it. 852 00:36:56,980 --> 00:36:57,970 Container ship. 853 00:36:57,970 --> 00:36:59,650 First choice, container ship. 854 00:36:59,650 --> 00:37:00,520 Pretty good. 
855 00:37:00,520 --> 00:37:01,780 Second choice makes sense. 856 00:37:01,780 --> 00:37:02,470 Lifeboat. 857 00:37:02,470 --> 00:37:04,030 Not bad. 858 00:37:04,030 --> 00:37:05,408 Look at that-- motor scooter. 859 00:37:05,408 --> 00:37:07,450 I can barely even see the motor scooter in there, 860 00:37:07,450 --> 00:37:09,670 but AlexNet, awesome. 861 00:37:09,670 --> 00:37:10,240 Right? 862 00:37:10,240 --> 00:37:11,260 Leopard. 863 00:37:11,260 --> 00:37:12,790 Awesome. 864 00:37:12,790 --> 00:37:15,250 Even when AlexNet makes a mistake, 865 00:37:15,250 --> 00:37:17,440 the mistake is totally understandable. 866 00:37:17,440 --> 00:37:20,650 Like, according to ImageNet, that is a picture of a grille, 867 00:37:20,650 --> 00:37:23,020 and AlexNet calls it a convertible. 868 00:37:23,020 --> 00:37:26,710 I'm siding with AlexNet on this one. 869 00:37:26,710 --> 00:37:30,130 This, the correct answer is mushroom, 870 00:37:30,130 --> 00:37:32,050 and AlexNet says agaric. 871 00:37:32,050 --> 00:37:33,200 I had to look that up. 872 00:37:33,200 --> 00:37:36,250 It's a particular kind of mushroom. 873 00:37:36,250 --> 00:37:38,230 This one's pretty funny. 874 00:37:38,230 --> 00:37:40,570 ImageNet says that's pictures of cherry. 875 00:37:40,570 --> 00:37:42,040 There's cherries in the foreground. 876 00:37:42,040 --> 00:37:43,870 But AlexNet says dalmatian. 877 00:37:43,870 --> 00:37:46,570 I'm siding with AlexNet on this. 878 00:37:46,570 --> 00:37:48,970 And Madagascar cat, et cetera. 879 00:37:48,970 --> 00:37:50,110 Pretty amazing. 880 00:37:50,110 --> 00:37:54,430 And nothing even close to this was possible before 2012. 881 00:37:54,430 --> 00:37:56,830 This is very recent history, and it has totally shaken up 882 00:37:56,830 --> 00:37:58,090 the field in lots of ways. 883 00:38:00,705 --> 00:38:03,080 That's been transformative not just for computer science, 884 00:38:03,080 --> 00:38:04,870 but it's also been transformative 885 00:38:04,870 --> 00:38:07,180 for cognitive science and neuroscience. 886 00:38:07,180 --> 00:38:09,460 Because now, we have algorithms-- like, 887 00:38:09,460 --> 00:38:12,040 here's this deep net, and it does this thing. 888 00:38:12,040 --> 00:38:14,590 That's a possible theory of how humans do it. 889 00:38:14,590 --> 00:38:18,070 It's a possible, computationally precise theory 890 00:38:18,070 --> 00:38:19,218 of what's going on in here. 891 00:38:19,218 --> 00:38:21,010 And we didn't use to have those, and now we 892 00:38:21,010 --> 00:38:22,810 have those for a number of domains. 893 00:38:22,810 --> 00:38:24,450 And that's shaking up the field. 894 00:38:24,450 --> 00:38:26,200 There will be a whole lecture on deep nets 895 00:38:26,200 --> 00:38:28,348 and how you can use them to think about minds 896 00:38:28,348 --> 00:38:30,640 and brains toward the end of the course-- guest lecture 897 00:38:30,640 --> 00:38:32,950 by my postdoc Katharina Dobbs. 898 00:38:32,950 --> 00:38:36,340 And we'll hear more about that. 899 00:38:36,340 --> 00:38:38,230 But let's first step back a second 900 00:38:38,230 --> 00:38:40,930 and say, OK, do they really perform as well as humans, even 901 00:38:40,930 --> 00:38:43,660 on just object recognition? 902 00:38:43,660 --> 00:38:47,952 Well, what if we tested it on images not in ImageNet? 903 00:38:47,952 --> 00:38:50,410 ImageNet is a pretty good test because these things, as you 904 00:38:50,410 --> 00:38:51,863 can see, are highly variable. 905 00:38:51,863 --> 00:38:52,780 They have backgrounds. 
906 00:38:52,780 --> 00:38:53,613 They're complicated. 907 00:38:53,613 --> 00:38:55,120 They're real-world images. 908 00:38:55,120 --> 00:38:58,780 But they were photographs taken by people in a particular way, 909 00:38:58,780 --> 00:39:01,593 with a particular goal. 910 00:39:01,593 --> 00:39:03,760 And most of the photographs you take, you throw out. 911 00:39:03,760 --> 00:39:05,770 They don't end up in ImageNet. 912 00:39:05,770 --> 00:39:08,920 ImageNet is a weird little idiosyncratic subset 913 00:39:08,920 --> 00:39:11,380 of the kind of visual experience that we have. 914 00:39:11,380 --> 00:39:14,140 So would this really generalize? 915 00:39:14,140 --> 00:39:17,650 It so happens that Boris Katz and Andrei Barbu, 916 00:39:17,650 --> 00:39:19,900 across the street in CSAIL, have been doing 917 00:39:19,900 --> 00:39:21,160 some very interesting studies. 918 00:39:21,160 --> 00:39:23,260 This stuff isn't published yet, but I got their permission 919 00:39:23,260 --> 00:39:25,260 to tell you about this cool stuff they're doing. 920 00:39:25,260 --> 00:39:27,250 And they're saying, hey, let's test 921 00:39:27,250 --> 00:39:31,930 AlexNet and other similar deep nets since then on a more 922 00:39:31,930 --> 00:39:34,420 realistic, harder version of object recognition 923 00:39:34,420 --> 00:39:37,360 that's more characteristic of what humans do. 924 00:39:37,360 --> 00:39:40,540 They're generating this huge data set of stimuli 925 00:39:40,540 --> 00:39:42,850 that they crowdsource. 926 00:39:42,850 --> 00:39:44,560 Workers on Mechanical Turk go on there 927 00:39:44,560 --> 00:39:45,910 and create images for them. 928 00:39:45,910 --> 00:39:49,420 They get instructions like, hold an object 929 00:39:49,420 --> 00:39:51,550 in this particular location, or at this angle, 930 00:39:51,550 --> 00:39:54,040 or move it here, and send us the images. 931 00:39:54,040 --> 00:39:57,340 They are getting, I think, hundreds of thousands of images 932 00:39:57,340 --> 00:39:58,480 to test this on. 933 00:39:58,480 --> 00:40:00,740 And they're much more variable in the location 934 00:40:00,740 --> 00:40:04,310 of the object in the image, and its orientation, and so forth. 935 00:40:04,310 --> 00:40:06,380 For example, you guys have no problem 936 00:40:06,380 --> 00:40:07,880 telling what that thing is, but it's 937 00:40:07,880 --> 00:40:10,380 a slightly atypical example. 938 00:40:10,380 --> 00:40:12,530 Likewise, what's the object on the floor there? 939 00:40:12,530 --> 00:40:14,090 You can tell what it is, but it's 940 00:40:14,090 --> 00:40:15,470 a slightly atypical example. 941 00:40:18,140 --> 00:40:20,383 What Boris and Andrei are finding 942 00:40:20,383 --> 00:40:21,800 is that human performance is still 943 00:40:21,800 --> 00:40:25,370 pretty good on these images, but the deep nets 944 00:40:25,370 --> 00:40:28,880 are terrible at this stuff. 945 00:40:28,880 --> 00:40:31,940 ResNet, one of the more recent ones, 946 00:40:31,940 --> 00:40:35,750 drops from 71% correct on ImageNet to around 25% 947 00:40:35,750 --> 00:40:38,180 correct on these images. 948 00:40:38,180 --> 00:40:41,960 And the other similar, fancy, more recent networks 949 00:40:41,960 --> 00:40:44,990 do similarly badly. 950 00:40:44,990 --> 00:40:49,010 On the one hand, AI, the deep nets, 951 00:40:49,010 --> 00:40:50,810 are awesome and transformative. 952 00:40:50,810 --> 00:40:52,260 No question about it.
953 00:40:52,260 --> 00:40:55,430 But on the other hand, despite all the hype, 954 00:40:55,430 --> 00:40:58,350 they're still not quite like human object recognition. 955 00:40:58,350 --> 00:41:00,350 They're a whole lot closer than they used to be, 956 00:41:00,350 --> 00:41:03,740 but they're not really there. 957 00:41:03,740 --> 00:41:06,770 And more generally, what about harder problems, 958 00:41:06,770 --> 00:41:08,930 like image understanding-- not just 959 00:41:08,930 --> 00:41:11,270 labeling and classification, but understanding 960 00:41:11,270 --> 00:41:13,820 what's going on in the image? 961 00:41:13,820 --> 00:41:17,132 You guys have probably seen image captioning bots. 962 00:41:17,132 --> 00:41:18,590 There are lots of these around now. 963 00:41:18,590 --> 00:41:22,280 This kind of hit the scene in 2016, 964 00:41:22,280 --> 00:41:25,330 when Google AI came out with a captioning algorithm. 965 00:41:25,330 --> 00:41:27,080 And of course, right around the same time, 966 00:41:27,080 --> 00:41:29,360 Microsoft had a captioning algorithm. 967 00:41:29,360 --> 00:41:31,310 And let's see how they do. 968 00:41:31,310 --> 00:41:32,210 This is an example. 969 00:41:32,210 --> 00:41:35,090 You give this algorithm this picture here, 970 00:41:35,090 --> 00:41:39,650 and it says, that's a dinosaur on top of a surfboard. 971 00:41:39,650 --> 00:41:42,050 That's pretty damn good, right? 972 00:41:42,050 --> 00:41:44,153 OK, wow. 973 00:41:44,153 --> 00:41:46,070 Let's look more generally, how well this thing 974 00:41:46,070 --> 00:41:48,470 works at other examples. 975 00:41:48,470 --> 00:41:49,970 It looks at this and it says, that's 976 00:41:49,970 --> 00:41:52,400 a group of people on the field playing football. 977 00:41:52,400 --> 00:41:53,630 Like, wow. 978 00:41:53,630 --> 00:41:55,250 OK. 979 00:41:55,250 --> 00:41:57,080 A snow-covered field. 980 00:41:57,080 --> 00:41:59,510 Pretty good. 981 00:41:59,510 --> 00:42:03,680 Liu Shiwen and Ding Ning posing for a picture. 982 00:42:03,680 --> 00:42:05,190 I don't know, but these things are 983 00:42:05,190 --> 00:42:06,440 very good at face recognition. 984 00:42:06,440 --> 00:42:10,190 That's probably exactly those two people. 985 00:42:10,190 --> 00:42:13,250 A car parked in a parking lot. 986 00:42:13,250 --> 00:42:14,720 Pretty good. 987 00:42:14,720 --> 00:42:16,070 A large ship in the water. 988 00:42:16,070 --> 00:42:17,990 Pretty good. 989 00:42:17,990 --> 00:42:20,210 A clock tower lit up at night. 990 00:42:20,210 --> 00:42:23,120 Awesome, right? 991 00:42:23,120 --> 00:42:25,620 A vintage photo of a pond. 992 00:42:25,620 --> 00:42:26,748 Well, the vintage part. 993 00:42:26,748 --> 00:42:28,040 I don't know where the pond is. 994 00:42:28,040 --> 00:42:29,270 There's a little water in there. 995 00:42:29,270 --> 00:42:29,840 I don't know. 996 00:42:29,840 --> 00:42:30,515 Not way off. 997 00:42:33,100 --> 00:42:35,170 A group of people that are standing 998 00:42:35,170 --> 00:42:38,230 in the grass near a bridge. 999 00:42:38,230 --> 00:42:39,190 Not really. 1000 00:42:39,190 --> 00:42:40,570 There's grass. 1001 00:42:40,570 --> 00:42:41,680 There's a bridge, sort of. 1002 00:42:41,680 --> 00:42:42,310 There's people. 1003 00:42:42,310 --> 00:42:46,030 But not really, right? 1004 00:42:46,030 --> 00:42:49,060 A group of people standing on top of a boat. 1005 00:42:49,060 --> 00:42:49,810 Definitely not. 1006 00:42:52,990 --> 00:42:54,970 A building with a cake. 1007 00:42:54,970 --> 00:42:57,880 What? 
1008 00:42:57,880 --> 00:43:00,280 A person holding a cell phone. 1009 00:43:00,280 --> 00:43:02,950 Not. 1010 00:43:02,950 --> 00:43:04,330 A group of stuffed animals. 1011 00:43:07,670 --> 00:43:08,740 I love this one. 1012 00:43:08,740 --> 00:43:11,200 A necklace made of bananas. 1013 00:43:11,200 --> 00:43:11,920 Wow. 1014 00:43:11,920 --> 00:43:13,720 We've really landed on Mars here. 1015 00:43:16,300 --> 00:43:19,210 A sign sitting on the grass. 1016 00:43:19,210 --> 00:43:20,440 Talk about missing the boat. 1017 00:43:23,065 --> 00:43:24,690 Now, look at this picture for a second. 1018 00:43:24,690 --> 00:43:26,232 Just figure out what's going on here. 1019 00:43:29,230 --> 00:43:30,780 Takes a couple of seconds. 1020 00:43:30,780 --> 00:43:33,270 Everyone got it? 1021 00:43:33,270 --> 00:43:36,935 There's a lot going on here. 1022 00:43:36,935 --> 00:43:38,310 This algorithm says, I think it's 1023 00:43:38,310 --> 00:43:42,570 a group of people standing next to a man in a suit and tie. 1024 00:43:42,570 --> 00:43:45,150 And the algorithm is correct, but the algorithm 1025 00:43:45,150 --> 00:43:46,920 has profoundly missed the boat. 1026 00:43:49,590 --> 00:43:51,030 I'm channeling-- actually, I stole 1027 00:43:51,030 --> 00:43:52,290 these slides from Josh Tenenbaum. 1028 00:43:52,290 --> 00:43:53,670 But let me channel him for a moment 1029 00:43:53,670 --> 00:43:55,470 and say what his big idea is, which I think 1030 00:43:55,470 --> 00:43:57,060 is really important. 1031 00:43:57,060 --> 00:43:59,430 And that is that both humans and deep nets 1032 00:43:59,430 --> 00:44:01,860 are very good at pattern recognition-- pattern 1033 00:44:01,860 --> 00:44:02,920 classification. 1034 00:44:02,920 --> 00:44:06,750 This is a cat, or a dog, or a car, or a toaster. 1035 00:44:06,750 --> 00:44:08,400 What they're not good at-- 1036 00:44:08,400 --> 00:44:11,040 what humans are good at, but the deep nets are not, 1037 00:44:11,040 --> 00:44:14,640 is building models to understand the world. 1038 00:44:14,640 --> 00:44:16,175 When you look at this picture, there 1039 00:44:16,175 --> 00:44:18,300 are all kinds of things that are crucial for really 1040 00:44:18,300 --> 00:44:21,450 understanding, at a deep level, what's going on in here. 1041 00:44:21,450 --> 00:44:23,880 We need to know why some people-- 1042 00:44:23,880 --> 00:44:26,610 what some people here know, but the guy on the scale 1043 00:44:26,610 --> 00:44:28,650 does not know. 1044 00:44:28,650 --> 00:44:30,600 Namely, even if you don't recognize 1045 00:44:30,600 --> 00:44:32,430 that that's James Comey-- 1046 00:44:32,430 --> 00:44:33,630 I think it is-- 1047 00:44:33,630 --> 00:44:36,090 here's Obama with his foot on the scale. 1048 00:44:36,090 --> 00:44:39,180 You need to know that people find it embarrassing 1049 00:44:39,180 --> 00:44:40,590 if they weigh too much. 1050 00:44:40,590 --> 00:44:43,800 You need to know that he can't see that Obama's doing it. 1051 00:44:43,800 --> 00:44:45,690 You need to know that they can see it, 1052 00:44:45,690 --> 00:44:48,840 even though he can't, and that's kind of the essence of humor. 1053 00:44:48,840 --> 00:44:52,950 There's just a whole universe of rich structural information 1054 00:44:52,950 --> 00:44:55,110 going in here that is part of what it means 1055 00:44:55,110 --> 00:44:57,120 to understand this picture. 1056 00:44:57,120 --> 00:45:02,010 And no deep net is even close to doing that kind of thing. 
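
If you want to try the kind of top-5 labeling shown in the AlexNet figure discussed above, here is a minimal sketch, assuming the ImageNet-pretrained AlexNet that ships with torchvision (version 0.13 or later) rather than the original 2012 training pipeline; the file name "photo.jpg" is a placeholder you would supply yourself.

import torch
from PIL import Image
from torchvision import models

# Load AlexNet with ImageNet-pretrained weights (torchvision >= 0.13 API).
weights = models.AlexNet_Weights.IMAGENET1K_V1
model = models.alexnet(weights=weights).eval()

# The weights object carries the matching preprocessing (resize, crop, normalize).
preprocess = weights.transforms()
img = preprocess(Image.open("photo.jpg").convert("RGB")).unsqueeze(0)  # placeholder image path

# Forward pass, then report the five most probable ImageNet labels,
# analogous to the "mite, black widow, cockroach, ..." lists in the figure.
with torch.no_grad():
    probs = model(img).softmax(dim=1)
top5 = probs.topk(5)
for p, idx in zip(top5.values[0].tolist(), top5.indices[0].tolist()):
    print(f"{weights.meta['categories'][idx]}: {p:.3f}")

Note that this is pure pattern classification-- it assigns labels and probabilities-- and nothing in it builds the kind of model of the scene that understanding the Obama-and-the-scale photo requires.
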
1057 00:45:05,640 --> 00:45:07,380 Bottom line of all this is-- 1058 00:45:07,380 --> 00:45:10,350 or let me just go on more generally-- 1059 00:45:10,350 --> 00:45:14,160 AI systems can't navigate new situations, 1060 00:45:14,160 --> 00:45:19,020 infer what others believe, use language to communicate, 1061 00:45:19,020 --> 00:45:22,710 write poetry and music to express how they feel, 1062 00:45:22,710 --> 00:45:26,760 or create math to build bridges, devices, 1063 00:45:26,760 --> 00:45:28,980 and lifesaving medicines. 1064 00:45:28,980 --> 00:45:31,290 That's a quote from our leader, Jim DiCarlo, 1065 00:45:31,290 --> 00:45:33,420 head of this department, published in Wired 1066 00:45:33,420 --> 00:45:35,970 a year ago in a beautiful article on the limitations 1067 00:45:35,970 --> 00:45:37,710 of deep nets. 1068 00:45:37,710 --> 00:45:40,470 But more generally, the point is that, yes, 1069 00:45:40,470 --> 00:45:42,565 AI is taking a massive leap now. 1070 00:45:42,565 --> 00:45:44,940 We're right in the middle of it, and it's super exciting, 1071 00:45:44,940 --> 00:45:48,540 and it's helpful to neuroscience and cognitive science. 1072 00:45:48,540 --> 00:45:51,507 But AI has a lot to learn from us too-- 1073 00:45:51,507 --> 00:45:53,340 a lot to learn from what's going on in here, 1074 00:45:53,340 --> 00:45:56,190 and how this thing works that those AI systems 1075 00:45:56,190 --> 00:45:59,130 still can't touch. 1076 00:45:59,130 --> 00:46:01,350 All of that was my third reason for studying-- 1077 00:46:01,350 --> 00:46:03,990 we're still in the, why are we studying the human brain? 1078 00:46:03,990 --> 00:46:05,970 The fourth reason to study the human brain 1079 00:46:05,970 --> 00:46:07,950 is the one most compelling to me, 1080 00:46:07,950 --> 00:46:11,310 and that is that it is just simply the greatest 1081 00:46:11,310 --> 00:46:14,260 intellectual quest of all time. 1082 00:46:14,260 --> 00:46:16,043 We could fight about cosmology. 1083 00:46:16,043 --> 00:46:18,210 I'm not going to fight with you about anything else. 1084 00:46:18,210 --> 00:46:19,860 I don't think there's any contest. 1085 00:46:19,860 --> 00:46:22,750 It's the greatest intellectual quest of all time. 1086 00:46:22,750 --> 00:46:24,330 And that's why I'm in it, and that's 1087 00:46:24,330 --> 00:46:27,210 why I hope it'll be fun for you. 1088 00:46:27,210 --> 00:46:29,430 That was the why. 1089 00:46:29,430 --> 00:46:32,070 How are we going to study the human brain? 1090 00:46:32,070 --> 00:46:33,060 Here's this thing. 1091 00:46:33,060 --> 00:46:34,900 How are we going to figure out how it works? 1092 00:46:34,900 --> 00:46:37,920 Kind of daunting, not totally obvious. 1093 00:46:37,920 --> 00:46:39,600 The first thing to realize is that there 1094 00:46:39,600 --> 00:46:42,360 are lots of levels of organization in this thing, 1095 00:46:42,360 --> 00:46:44,430 and hence, lots of ways of studying it. 1096 00:46:44,430 --> 00:46:47,070 We could look at molecules and their interactions. 1097 00:46:47,070 --> 00:46:49,200 Lots of people in this building do that. 1098 00:46:49,200 --> 00:46:52,480 We could look at properties of individual neurons. 1099 00:46:52,480 --> 00:46:54,240 We could look at circuits of neurons 1100 00:46:54,240 --> 00:46:56,400 interacting with each other. 1101 00:46:56,400 --> 00:46:58,530 We could look at entire brain regions 1102 00:46:58,530 --> 00:47:00,270 and what their functions are. 
1103 00:47:00,270 --> 00:47:03,030 We could look at networks of multiple brain regions 1104 00:47:03,030 --> 00:47:06,450 interacting with each other. 1105 00:47:06,450 --> 00:47:08,777 All of those things are possible. 1106 00:47:08,777 --> 00:47:10,860 But actually, what we're going to do in the course 1107 00:47:10,860 --> 00:47:12,990 is none of those things in particular. 1108 00:47:12,990 --> 00:47:15,820 Instead, we're going to ask a somewhat different question. 1109 00:47:15,820 --> 00:47:18,000 And that question is, how does the brain 1110 00:47:18,000 --> 00:47:20,340 give rise to the mind? 1111 00:47:20,340 --> 00:47:22,410 And to understand that question, we're 1112 00:47:22,410 --> 00:47:25,365 going to do more at this level, and less at the upper level. 1113 00:47:28,627 --> 00:47:30,960 To answer this question, we need to start with the mind. 1114 00:47:30,960 --> 00:47:32,710 We need to-- if we're going to understand, 1115 00:47:32,710 --> 00:47:35,220 how does this thing produce a mind, we need to first figure 1116 00:47:35,220 --> 00:47:36,300 out, what is a mind? 1117 00:47:36,300 --> 00:47:37,467 What do we know about minds? 1118 00:47:41,057 --> 00:47:43,140 We need to start with the various mental functions 1119 00:47:43,140 --> 00:47:46,510 that minds carry out-- things like perception, vision, 1120 00:47:46,510 --> 00:47:50,790 hearing, aspects of cognition, like understanding language, 1121 00:47:50,790 --> 00:47:56,220 thinking about people, thinking about things, et cetera. 1122 00:47:56,220 --> 00:47:59,670 For each mental function, what we're going to do in here 1123 00:47:59,670 --> 00:48:03,000 is start by trying to understand how it works in minds 1124 00:48:03,000 --> 00:48:05,940 as well as we can, or what it is that we're trying to understand 1125 00:48:05,940 --> 00:48:07,320 that minds can do. 1126 00:48:07,320 --> 00:48:09,262 What is computed and how? 1127 00:48:09,262 --> 00:48:11,220 And then we're going to look at its brain basis 1128 00:48:11,220 --> 00:48:13,080 and try to figure out what we can figure out 1129 00:48:13,080 --> 00:48:16,530 about how that mental function is implemented in a brain. 1130 00:48:16,530 --> 00:48:18,840 The first question we'll ask for all of these domains 1131 00:48:18,840 --> 00:48:22,710 is, is there specialized machinery to do that thing? 1132 00:48:22,710 --> 00:48:24,420 And then we'll ask, what information 1133 00:48:24,420 --> 00:48:27,100 is represented in the relevant parts of the brain, 1134 00:48:27,100 --> 00:48:33,180 and when is that information represented, and how? 1135 00:48:33,180 --> 00:48:35,490 How are we going to answer those questions? 1136 00:48:35,490 --> 00:48:39,540 Well, there's lots and lots of methods in our field. 1137 00:48:39,540 --> 00:48:42,120 The first set of methods-- if we want to understand minds, 1138 00:48:42,120 --> 00:48:44,760 the first set of methods are the basic stuff 1139 00:48:44,760 --> 00:48:46,920 of cognitive science, psychophysics. 1140 00:48:46,920 --> 00:48:49,590 That means showing people visual stimuli, 1141 00:48:49,590 --> 00:48:51,270 or playing them sounds, and asking them 1142 00:48:51,270 --> 00:48:52,410 what they see or hear. 1143 00:48:52,410 --> 00:48:54,300 Nice and low tech, but lots has been 1144 00:48:54,300 --> 00:48:55,560 learned from those methods. 
1145 00:48:55,560 --> 00:48:57,635 You collect reaction time and accuracy, 1146 00:48:57,635 --> 00:48:59,010 and it's amazing how much you can 1147 00:48:59,010 --> 00:49:01,020 learn from these methods that have been around 1148 00:49:01,020 --> 00:49:03,660 for a hundred years or more. 1149 00:49:03,660 --> 00:49:06,630 Perceptual illusions are similarly very informative 1150 00:49:06,630 --> 00:49:08,280 about how minds work. 1151 00:49:08,280 --> 00:49:12,237 Now, let me say an important thing that arises here. 1152 00:49:12,237 --> 00:49:14,320 Last year was the first time I taught this course, 1153 00:49:14,320 --> 00:49:15,720 and I would say it went so-so. 1154 00:49:15,720 --> 00:49:18,450 I'm aiming for it to be much better this year. 1155 00:49:18,450 --> 00:49:19,950 And one of the ways I'm trying to do 1156 00:49:19,950 --> 00:49:22,620 that is to be responsive to the student 1157 00:49:22,620 --> 00:49:24,420 evals I got last year, which were not 1158 00:49:24,420 --> 00:49:26,130 fabulous across the board. 1159 00:49:26,130 --> 00:49:27,420 Hurt my feelings badly. 1160 00:49:27,420 --> 00:49:30,240 But once I got over myself, I decided to just listen to them 1161 00:49:30,240 --> 00:49:31,570 and try to fix it. 1162 00:49:31,570 --> 00:49:33,930 And one way to fix it is to be honest with you today 1163 00:49:33,930 --> 00:49:36,750 about what this course is going to cover. 1164 00:49:36,750 --> 00:49:41,520 In my evals, student 50458, bless them, 1165 00:49:41,520 --> 00:49:43,020 offered this comment. 1166 00:49:43,020 --> 00:49:45,013 "This class was not sold in the correct way. 1167 00:49:45,013 --> 00:49:46,680 It should not be called the Human Brain, 1168 00:49:46,680 --> 00:49:48,750 because it was basically just a cognitive science, not 1169 00:49:48,750 --> 00:49:49,380 a brain class. 1170 00:49:49,380 --> 00:49:52,655 I expected to learn very different material." 1171 00:49:52,655 --> 00:49:54,030 I don't know who this student is. 1172 00:49:54,030 --> 00:49:55,900 I wish I could apologize to them. 1173 00:49:55,900 --> 00:50:00,720 But I will say to you, sorry, student 50458-- 1174 00:50:00,720 --> 00:50:02,670 sorry I didn't make that clear. 1175 00:50:02,670 --> 00:50:04,890 The fundamental reason the brain is cool 1176 00:50:04,890 --> 00:50:07,180 is that it gives rise to the mind. 1177 00:50:07,180 --> 00:50:10,290 And that means that studying the biological properties 1178 00:50:10,290 --> 00:50:12,810 of the brain without considering the mental functions it 1179 00:50:12,810 --> 00:50:15,450 implements would be kind of like trying 1180 00:50:15,450 --> 00:50:17,280 to study the physical properties of a book 1181 00:50:17,280 --> 00:50:20,813 without considering the meaning of its text. 1182 00:50:20,813 --> 00:50:22,230 We're going to spend a lot of time 1183 00:50:22,230 --> 00:50:23,890 doing cognitive science in here. 1184 00:50:23,890 --> 00:50:26,825 And if you had a different impression, sorry about that. 1185 00:50:26,825 --> 00:50:28,200 But that's what we're doing here. 1186 00:50:30,617 --> 00:50:31,950 How are we going to answer this? 1187 00:50:31,950 --> 00:50:34,740 Lots of cognitive science. 1188 00:50:34,740 --> 00:50:37,038 How are we going to look at the brain basis? 1189 00:50:37,038 --> 00:50:39,330 Well, we're going to look at neuropsychology patients-- 1190 00:50:39,330 --> 00:50:41,470 people like Bob who have damage to the brain 1191 00:50:41,470 --> 00:50:44,490 and what functions get preserved and lost.
1192 00:50:44,490 --> 00:50:47,010 We'll look at a lot of studies with functional MRI. 1193 00:50:47,010 --> 00:50:49,110 Neurophysiology, where you can record 1194 00:50:49,110 --> 00:50:51,930 from individual neurons in animal brains, 1195 00:50:51,930 --> 00:50:55,170 and in rare cases, even in human brains-- 1196 00:50:55,170 --> 00:50:56,940 under clinical situations where they 1197 00:50:56,940 --> 00:51:00,540 need to have electrodes in their brain anyway for neurosurgery. 1198 00:51:00,540 --> 00:51:05,640 We will look at EEG recorded from electrodes on the scalp 1199 00:51:05,640 --> 00:51:08,280 and MEG recording from magnetic fields 1200 00:51:08,280 --> 00:51:11,340 from SQUIDs placed next to the scalp. 1201 00:51:11,340 --> 00:51:13,170 We'll look at connectivity measures 1202 00:51:13,170 --> 00:51:16,890 with a method called diffusion tractography, et cetera. 1203 00:51:16,890 --> 00:51:19,050 Lots of methods. 1204 00:51:19,050 --> 00:51:21,780 Which mental functions will we cover? 1205 00:51:21,780 --> 00:51:23,700 Well, to tell you about that, I need 1206 00:51:23,700 --> 00:51:25,890 to tell you about the huge progress that 1207 00:51:25,890 --> 00:51:28,830 has happened in our field in the last 20 years. 1208 00:51:28,830 --> 00:51:32,130 All of this is quite recent. 1209 00:51:32,130 --> 00:51:34,440 Let's back up to 1990. 1210 00:51:34,440 --> 00:51:37,350 Here is approximately what we knew about the organization 1211 00:51:37,350 --> 00:51:39,810 of the human brain in 1990. 1212 00:51:39,810 --> 00:51:41,850 The black ovals are the bits that 1213 00:51:41,850 --> 00:51:44,700 are primary sensory and motor regions that have been known 1214 00:51:44,700 --> 00:51:47,700 for a long time, even by 1990. 1215 00:51:47,700 --> 00:51:49,170 And the colored bits are the bits 1216 00:51:49,170 --> 00:51:51,900 where we had some idea that face recognition might 1217 00:51:51,900 --> 00:51:54,180 go on somewhere in the back end of the bottom 1218 00:51:54,180 --> 00:51:56,280 of the right hemisphere because of people 1219 00:51:56,280 --> 00:51:59,040 who had damage back there and lost their face recognition 1220 00:51:59,040 --> 00:52:00,000 ability-- 1221 00:52:00,000 --> 00:52:02,400 sometimes, preserving their ability 1222 00:52:02,400 --> 00:52:06,300 to visually recognize words and scenes and objects, 1223 00:52:06,300 --> 00:52:10,050 only losing their ability to recognize faces. 1224 00:52:10,050 --> 00:52:11,520 The language regions we had known 1225 00:52:11,520 --> 00:52:14,460 about for nearly 200 years, from Broca and Wernicke 1226 00:52:14,460 --> 00:52:16,950 and others, who had studied patients with damage 1227 00:52:16,950 --> 00:52:19,140 in those regions and noted that they had problems 1228 00:52:19,140 --> 00:52:20,850 with language function. 1229 00:52:20,850 --> 00:52:22,950 And similarly, many people had reported 1230 00:52:22,950 --> 00:52:25,590 that if you have damage up here in the parietal lobes, 1231 00:52:25,590 --> 00:52:27,900 you sometimes lose your ability to direct 1232 00:52:27,900 --> 00:52:30,870 your attention to different places in the visual scene. 1233 00:52:30,870 --> 00:52:35,640 That was approximately what was known in 1990. 1234 00:52:35,640 --> 00:52:38,190 And here's what we know now.
1235 00:52:38,190 --> 00:52:42,990 We now know, thanks largely to functional MRI, 1236 00:52:42,990 --> 00:52:47,310 that for dozens of regions in the brain, in every one of you, 1237 00:52:47,310 --> 00:52:51,480 we have a pretty good idea of the function of that region. 1238 00:52:51,480 --> 00:52:53,490 This is major progress. 1239 00:52:53,490 --> 00:52:57,150 This is a kind of rough sketch of the organization 1240 00:52:57,150 --> 00:52:59,310 of the human mind and brain that we have now, 1241 00:52:59,310 --> 00:53:01,320 that we didn't have 20 years ago. 1242 00:53:01,320 --> 00:53:04,830 And that's awesome. 1243 00:53:04,830 --> 00:53:08,010 That has made possible a lot of progress, 1244 00:53:08,010 --> 00:53:09,960 building with other methods. 1245 00:53:09,960 --> 00:53:13,020 What we'll study in this course is, 1246 00:53:13,020 --> 00:53:14,880 we'll focus on those mental functions 1247 00:53:14,880 --> 00:53:18,030 where the brain bases are best understood. 1248 00:53:18,030 --> 00:53:21,000 And that will include things like the visual perception 1249 00:53:21,000 --> 00:53:25,030 of color, shape, and motion, visual recognition of faces, 1250 00:53:25,030 --> 00:53:27,360 places, bodies, and words-- 1251 00:53:27,360 --> 00:53:28,050 and scenes. 1252 00:53:28,050 --> 00:53:29,770 Didn't make it on the slide. 1253 00:53:29,770 --> 00:53:30,510 Oh, yes, it did. 1254 00:53:30,510 --> 00:53:33,750 Perceiving scenes and navigating. 1255 00:53:33,750 --> 00:53:36,043 Understanding numbers. 1256 00:53:36,043 --> 00:53:37,710 Yes, there's a whole lot about the brain 1257 00:53:37,710 --> 00:53:39,450 basis of understanding numbers. 1258 00:53:39,450 --> 00:53:42,870 Perceiving speech and perceiving music. 1259 00:53:42,870 --> 00:53:44,910 Understanding language. 1260 00:53:44,910 --> 00:53:48,818 Understanding other people and their minds. 1261 00:53:48,818 --> 00:53:50,610 Those are the kinds of topics where there's 1262 00:53:50,610 --> 00:53:53,730 been a lot of progress recently in understanding the brain 1263 00:53:53,730 --> 00:53:55,830 basis of those mental functions. 1264 00:53:55,830 --> 00:53:57,690 Those are the ones we'll focus on. 1265 00:53:57,690 --> 00:54:00,060 And that means there's going to be a lot on perception, 1266 00:54:00,060 --> 00:54:03,960 high-level vision and high-level audition, 1267 00:54:03,960 --> 00:54:07,300 because that's one where a lot of progress has been made, 1268 00:54:07,300 --> 00:54:08,822 and it's also a lot of the cortex. 1269 00:54:08,822 --> 00:54:11,280 As I mentioned a moment ago, the whole back part of your brain 1270 00:54:11,280 --> 00:54:13,540 does vision, construed broadly. 1271 00:54:13,540 --> 00:54:15,540 Some people might say, well, why is she spending 1272 00:54:15,540 --> 00:54:16,380 all of this time in vision? 1273 00:54:16,380 --> 00:54:18,297 Well, it's a big part of what your brain does. 1274 00:54:18,297 --> 00:54:19,470 We are very visual animals. 1275 00:54:19,470 --> 00:54:21,095 So we'll spend a lot of time on vision. 1276 00:54:23,670 --> 00:54:28,650 For each of these functions, we will ask, to what extent 1277 00:54:28,650 --> 00:54:30,270 is this mental function implemented 1278 00:54:30,270 --> 00:54:33,570 in its own specialized brain machinery? 1279 00:54:33,570 --> 00:54:35,460 Are there multiple different brain regions 1280 00:54:35,460 --> 00:54:38,250 that carry out that function? 1281 00:54:38,250 --> 00:54:39,450 What does each one do?
1282 00:54:39,450 --> 00:54:40,980 Is there a division of labor between 1283 00:54:40,980 --> 00:54:42,750 those different regions? 1284 00:54:42,750 --> 00:54:47,580 How does that system arise in development? 1285 00:54:47,580 --> 00:54:49,770 Does it have homologues in other species? 1286 00:54:49,770 --> 00:54:54,330 Are these things uniquely human, or which of them are? 1287 00:54:54,330 --> 00:54:56,460 And also, along the way, other cool 1288 00:54:56,460 --> 00:54:59,220 side questions will come up. 1289 00:54:59,220 --> 00:55:01,920 What, if anything, is special about the human brain? 1290 00:55:01,920 --> 00:55:06,030 How come we are taking over-- and largely destroying-- 1291 00:55:06,030 --> 00:55:10,200 the planet, and other species are not? 1292 00:55:10,200 --> 00:55:11,700 Besides destroying the planet, we're 1293 00:55:11,700 --> 00:55:14,310 doing some other cool things, like inventing science, 1294 00:55:14,310 --> 00:55:17,683 and engineering, and medicine, and architecture, and poetry, 1295 00:55:17,683 --> 00:55:19,350 and literature, and all of these other-- 1296 00:55:19,350 --> 00:55:21,300 and music-- all of these other awesome things 1297 00:55:21,300 --> 00:55:23,200 that other species aren't doing. 1298 00:55:23,200 --> 00:55:27,060 How come our brains are doing that and other species aren't? 1299 00:55:27,060 --> 00:55:28,950 Where does knowledge come from? 1300 00:55:28,950 --> 00:55:31,030 You guys know all of this stuff. 1301 00:55:31,030 --> 00:55:34,860 How much of that stuff was wired in at birth and how much of it 1302 00:55:34,860 --> 00:55:36,330 did you get from experience? 1303 00:55:39,170 --> 00:55:41,930 How much can our minds and brains change over time? 1304 00:55:41,930 --> 00:55:44,240 Can we go study a new thing and get a whole new brain 1305 00:55:44,240 --> 00:55:45,440 region for that thing? 1306 00:55:48,650 --> 00:55:52,040 Can we change the basic structure just by training, 1307 00:55:52,040 --> 00:55:53,960 or after brain damage? 1308 00:55:53,960 --> 00:55:56,900 Can we think without language? 1309 00:55:56,900 --> 00:55:59,570 How many of you have wondered about that question? 1310 00:55:59,570 --> 00:56:02,210 Yeah, really basic question. 1311 00:56:02,210 --> 00:56:03,650 Anya is answering it. 1312 00:56:03,650 --> 00:56:04,590 Anya and some others. 1313 00:56:04,590 --> 00:56:06,590 But Anya is doing a lot to answer that question. 1314 00:56:06,590 --> 00:56:08,990 There are actually empirical answers 1315 00:56:08,990 --> 00:56:10,970 to these long-standing, deep questions 1316 00:56:10,970 --> 00:56:12,500 that everyone wonders about. 1317 00:56:12,500 --> 00:56:13,310 That's pretty cool. 1318 00:56:16,160 --> 00:56:18,710 Somebody back there asked a while ago about awareness. 1319 00:56:18,710 --> 00:56:21,500 Can we think, perceive, understand without awareness? 1320 00:56:21,500 --> 00:56:24,145 How much can go on in the basement of the brain 1321 00:56:24,145 --> 00:56:26,270 when we don't even know what's going on down there? 1322 00:56:26,270 --> 00:56:29,600 We'll consider all of these other cool questions. 1323 00:56:29,600 --> 00:56:33,320 There's a bunch of things we won't cover in this course 1324 00:56:33,320 --> 00:56:35,760 for various reasons, that could have been in here and just 1325 00:56:35,760 --> 00:56:36,260 aren't. 1326 00:56:36,260 --> 00:56:37,700 There's only so much time. 1327 00:56:37,700 --> 00:56:38,410 Motor control.
1328 00:56:38,410 --> 00:56:40,910 It's really important to know how you do things like pick up 1329 00:56:40,910 --> 00:56:43,220 objects and plan actions. 1330 00:56:43,220 --> 00:56:46,190 And we're just not covering that. 1331 00:56:46,190 --> 00:56:49,100 Something had to go. 1332 00:56:49,100 --> 00:56:50,430 Subcortical function. 1333 00:56:50,430 --> 00:56:52,560 This is a very corticocentric course. 1334 00:56:52,560 --> 00:56:54,680 Most of the course will deal with the cortex. 1335 00:56:54,680 --> 00:56:57,350 That's where most of conscious thinking and reasoning 1336 00:56:57,350 --> 00:56:58,437 and cognition happens. 1337 00:56:58,437 --> 00:57:01,020 There's a lot of good stuff down in the basement of the brain, 1338 00:57:01,020 --> 00:57:03,530 and it's going to get pretty short shrift. 1339 00:57:03,530 --> 00:57:04,940 Not for any good reason-- 1340 00:57:04,940 --> 00:57:06,410 just what it is. 1341 00:57:06,410 --> 00:57:07,250 Decision-making. 1342 00:57:07,250 --> 00:57:10,190 Important field, not getting much coverage in here. 1343 00:57:10,190 --> 00:57:12,950 Importantly, circuit-level mechanisms-- 1344 00:57:12,950 --> 00:57:15,200 explanations of cognition. 1345 00:57:15,200 --> 00:57:17,688 If you think that we're going to understand not only what 1346 00:57:17,688 --> 00:57:19,730 it means to understand the meaning of a sentence, 1347 00:57:19,730 --> 00:57:21,355 but that I'm going to give you a wiring 1348 00:57:21,355 --> 00:57:24,980 diagram of the neurons that implement that function, 1349 00:57:24,980 --> 00:57:26,480 sorry to be the bearer of bad news, 1350 00:57:26,480 --> 00:57:28,910 but nobody has a freaking clue how 1351 00:57:28,910 --> 00:57:30,980 you could get a bunch of neurons to understand 1352 00:57:30,980 --> 00:57:32,900 the meaning of a sentence. 1353 00:57:32,900 --> 00:57:33,890 That's exciting. 1354 00:57:33,890 --> 00:57:36,230 That means there's a field for you guys to waltz into. 1355 00:57:36,230 --> 00:57:37,855 And probably, in your lifetimes, people 1356 00:57:37,855 --> 00:57:40,100 will start to crack these things. 1357 00:57:40,100 --> 00:57:42,740 But just to know what we're headed into, 1358 00:57:42,740 --> 00:57:45,920 rarely, for almost no high-level mental functions, 1359 00:57:45,920 --> 00:57:48,650 do we have anything like a wiring 1360 00:57:48,650 --> 00:57:52,460 diagram-level understanding of any perceptual or cognitive 1361 00:57:52,460 --> 00:57:54,463 function. 1362 00:57:54,463 --> 00:57:56,130 That's not in the cards for this course, 1363 00:57:56,130 --> 00:57:59,655 because it doesn't exist in the field. 1364 00:57:59,655 --> 00:58:01,280 For that kind of thing, there are cases 1365 00:58:01,280 --> 00:58:02,447 where you can make progress. 1366 00:58:02,447 --> 00:58:05,690 You can understand, say, fear conditioning in a mouse. 1367 00:58:05,690 --> 00:58:08,420 Those circuits are being like cracked wide 1368 00:58:08,420 --> 00:58:11,000 open by people in this building, people all around the world, 1369 00:58:11,000 --> 00:58:13,070 with spectacular precision. 1370 00:58:13,070 --> 00:58:16,340 They know the specific classes of neurons, their connectivity. 1371 00:58:16,340 --> 00:58:18,110 They know every damn thing about them. 1372 00:58:18,110 --> 00:58:20,810 But it's like, how does a mouse learn that this thing is-- 1373 00:58:20,810 --> 00:58:22,340 to be afraid of this thing? 1374 00:58:22,340 --> 00:58:24,200 OK, that's important. 
1375 00:58:24,200 --> 00:58:28,670 But for more complex aspects of cognition in humans, 1376 00:58:28,670 --> 00:58:32,700 we can't usually have that kind of circuit-level understanding. 1377 00:58:32,700 --> 00:58:34,700 Lots of other things that will get short shrift. 1378 00:58:34,700 --> 00:58:36,175 Memory, not for any good reason. 1379 00:58:36,175 --> 00:58:37,550 I mean, there's a lot of coverage 1380 00:58:37,550 --> 00:58:40,940 of memory in 900 and 901, and it's just somehow 1381 00:58:40,940 --> 00:58:42,560 a blind spot for understanding-- 1382 00:58:42,560 --> 00:58:44,685 for knowing how to talk interestingly about memory. 1383 00:58:44,685 --> 00:58:47,018 So I'm not going to give you a boring lecture on memory. 1384 00:58:47,018 --> 00:58:49,370 Instead, I'm not going to give you any lecture on memory 1385 00:58:49,370 --> 00:58:52,430 until I learn how to talk about it interestingly. 1386 00:58:52,430 --> 00:58:54,410 Reinforcement learning and reward systems. 1387 00:58:54,410 --> 00:58:55,820 I'm going to try to pull some of that in, 1388 00:58:55,820 --> 00:58:57,653 but it's not going to be a major focus, even 1389 00:58:57,653 --> 00:58:59,870 though it's a really important part of cognition. 1390 00:58:59,870 --> 00:59:00,440 Attention. 1391 00:59:00,440 --> 00:59:01,732 There might be some at the end. 1392 00:59:04,850 --> 00:59:06,935 How many of you have taken 900? 1393 00:59:09,480 --> 00:59:13,050 Looks like a little over a half. 1394 00:59:13,050 --> 00:59:17,190 How many have taken 901? 1395 00:59:17,190 --> 00:59:18,360 Yeah, a little over half. 1396 00:59:18,360 --> 00:59:21,210 OK, good. 1397 00:59:21,210 --> 00:59:23,580 If you have, great. 1398 00:59:23,580 --> 00:59:24,220 Good for you. 1399 00:59:24,220 --> 00:59:26,640 This course is designed as a tier two course for people 1400 00:59:26,640 --> 00:59:28,740 who have taken 900 or 901. 1401 00:59:28,740 --> 00:59:31,180 If you haven't, you're probably OK, 1402 00:59:31,180 --> 00:59:33,840 but you might need to do a little extra work. 1403 00:59:33,840 --> 00:59:38,520 I've already posted online, and in the syllabus, information 1404 00:59:38,520 --> 00:59:40,747 about, actually, a lecture I gave a year ago 1405 00:59:40,747 --> 00:59:43,080 on some of the background stuff that is no longer taught 1406 00:59:43,080 --> 00:59:43,680 in this course. 1407 00:59:43,680 --> 00:59:45,120 People hated it when I taught them 1408 00:59:45,120 --> 00:59:46,810 stuff they'd already encountered before, 1409 00:59:46,810 --> 00:59:49,800 so I'm trying to minimize that. 1410 00:59:49,800 --> 00:59:51,300 That's a backup for those of you who 1411 00:59:51,300 --> 00:59:52,467 haven't taken these courses. 1412 00:59:52,467 --> 00:59:54,810 If you're worried about this, chat with me afterwards. 1413 00:59:54,810 --> 00:59:56,280 I think it will be OK, just count 1414 00:59:56,280 --> 00:59:58,275 on doing a little bit of extra work-- not much. 1415 01:00:01,320 --> 01:00:04,587 For those of you who have taken it, 1416 01:00:04,587 --> 01:00:06,420 there's going to be a little bit of overlap. 1417 01:00:06,420 --> 01:00:08,910 It's simply impossible to have zero overlap. 1418 01:00:08,910 --> 01:00:12,960 I mean, what does John Gabrieli in 900 and Mark Bear in 901 do? 1419 01:00:12,960 --> 01:00:14,700 They survey the whole broad field, 1420 01:00:14,700 --> 01:00:17,370 and they pick the coolest stuff out of every little bit, 1421 01:00:17,370 --> 01:00:19,590 and they teach it to you, exactly as they should.
1422 01:00:19,590 --> 01:00:21,798 But that means that when I come along and try to say, 1423 01:00:21,798 --> 01:00:24,630 I'm going to do a more intensive coverage of the coolest things, 1424 01:00:24,630 --> 01:00:26,490 there's going to be a teeny bit of overlap. 1425 01:00:26,490 --> 01:00:28,883 But I'll try to not make it too much-- 1426 01:00:28,883 --> 01:00:31,050 just because the coolest stuff is the coolest stuff. 1427 01:00:37,470 --> 01:00:39,580 Also, the spin and the goals of this course 1428 01:00:39,580 --> 01:00:43,230 are quite different from both 900 and 901. 1429 01:00:43,230 --> 01:00:46,327 You will have to memorize a few things, but not much. 1430 01:00:46,327 --> 01:00:47,910 My real goal in this course is to have 1431 01:00:47,910 --> 01:00:50,010 you understand things, not memorize 1432 01:00:50,010 --> 01:00:52,680 a sea of disjointed facts. 1433 01:00:52,680 --> 01:00:54,298 A little more on the goals. 1434 01:00:54,298 --> 01:00:56,340 Really, what I want you to get out of this course 1435 01:00:56,340 --> 01:00:58,710 is to appreciate the big questions in the field 1436 01:00:58,710 --> 01:01:03,528 and what is at stake theoretically in each. 1437 01:01:03,528 --> 01:01:05,820 I want you to understand the methods in human cognitive 1438 01:01:05,820 --> 01:01:07,650 neuroscience, what each one can tell you, 1439 01:01:07,650 --> 01:01:10,470 what it can't, how different combinations of methods 1440 01:01:10,470 --> 01:01:13,783 can work synergistically and complementarily to answer 1441 01:01:13,783 --> 01:01:15,075 different facets of a question. 1442 01:01:17,760 --> 01:01:20,105 I do want you to gain some actual knowledge about some 1443 01:01:20,105 --> 01:01:21,480 of the domains of cognition where 1444 01:01:21,480 --> 01:01:23,563 we've learned a bunch, both at the cognitive level 1445 01:01:23,563 --> 01:01:25,290 and the brain level-- 1446 01:01:25,290 --> 01:01:27,480 things like face recognition, navigation, 1447 01:01:27,480 --> 01:01:31,980 language understanding, music, stuff like that. 1448 01:01:31,980 --> 01:01:34,860 And crucially, I want you guys to be 1449 01:01:34,860 --> 01:01:37,978 able to read current papers in the field. 1450 01:01:37,978 --> 01:01:40,020 If you look in the syllabus, the first few papers 1451 01:01:40,020 --> 01:01:41,490 are, like, 20 years old, but it's 1452 01:01:41,490 --> 01:01:44,307 going to accelerate quickly and you'll be reading papers-- 1453 01:01:44,307 --> 01:01:46,140 I'm trying to choose mostly papers published 1454 01:01:46,140 --> 01:01:48,243 in the last year or two. 1455 01:01:48,243 --> 01:01:50,160 I'm trying to take you straight to the cutting 1456 01:01:50,160 --> 01:01:51,850 edge of the field. 1457 01:01:51,850 --> 01:01:52,350 Yeah? 1458 01:01:52,350 --> 01:01:54,626 AUDIENCE: Are the papers going to be straight out 1459 01:01:54,626 --> 01:01:56,000 of research labs, or are they going to be, like, 1460 01:01:56,000 --> 01:01:57,150 the annual review [INAUDIBLE]? 1461 01:01:57,150 --> 01:01:59,060 NANCY KANWISHER: No, straight out of research labs. 1462 01:01:59,060 --> 01:02:00,560 You're going to read the real deal, 1463 01:02:00,560 --> 01:02:03,707 not someone else's blurry, they just read the abstracts 1464 01:02:03,707 --> 01:02:05,540 and put in some stuff in the review article. 1465 01:02:05,540 --> 01:02:07,580 No, you're going to read the actual paper. 1466 01:02:07,580 --> 01:02:08,690 That's the whole deal. 1467 01:02:11,990 --> 01:02:14,000 Those are the goals.
1468 01:02:14,000 --> 01:02:15,288 Good. 1469 01:02:15,288 --> 01:02:15,830 A few things. 1470 01:02:15,830 --> 01:02:17,330 Why no textbook? 1471 01:02:17,330 --> 01:02:19,370 This field is moving too fast for a textbook. 1472 01:02:19,370 --> 01:02:21,530 Plus, I have strong opinions, and I don't 1473 01:02:21,530 --> 01:02:24,590 like any of the textbooks. 1474 01:02:24,590 --> 01:02:26,390 Any textbook is out of date. 1475 01:02:26,390 --> 01:02:29,060 We're going to be reading hot stuff that's hot off the press, 1476 01:02:29,060 --> 01:02:31,500 and so that's not in the textbooks yet. 1477 01:02:31,500 --> 01:02:33,535 And so we're skipping that, and you're 1478 01:02:33,535 --> 01:02:35,660 going to go straight to original research articles. 1479 01:02:35,660 --> 01:02:38,240 There will be occasional review articles where relevant, 1480 01:02:38,240 --> 01:02:40,250 but mostly, part of the agenda of this course 1481 01:02:40,250 --> 01:02:43,190 is to teach you to be not afraid of and able to read 1482 01:02:43,190 --> 01:02:45,740 current articles in the field. 1483 01:02:45,740 --> 01:02:46,580 All right. 1484 01:02:46,580 --> 01:02:48,230 You've all been waiting for this. 1485 01:02:48,230 --> 01:02:49,310 Details on the grading. 1486 01:02:49,310 --> 01:02:50,360 Pretty standard. 1487 01:02:50,360 --> 01:02:52,250 Midterm, 25% of the class-- 1488 01:02:52,250 --> 01:02:53,180 of the grade. 1489 01:02:53,180 --> 01:02:54,860 Final, 25%. 1490 01:02:54,860 --> 01:02:56,360 It will be cumulative, but weighted 1491 01:02:56,360 --> 01:02:58,440 toward the second half. 1492 01:02:58,440 --> 01:03:00,440 There's going to be a lot of reading and writing 1493 01:03:00,440 --> 01:03:07,580 assignments, approximately two papers to read per week. 1494 01:03:07,580 --> 01:03:10,400 And for, usually, one of those papers per week, 1495 01:03:10,400 --> 01:03:14,420 you will have a very short written assignment in which, 1496 01:03:14,420 --> 01:03:17,990 usually, I ask a few simple questions and maybe one 1497 01:03:17,990 --> 01:03:21,140 paragraph-level think question. 1498 01:03:21,140 --> 01:03:24,230 The essence of these tasks is not 1499 01:03:24,230 --> 01:03:25,970 the written assignment itself. 1500 01:03:25,970 --> 01:03:28,610 The essence of the task is to understand the paper. 1501 01:03:28,610 --> 01:03:30,810 If you understood the paper as you read it, 1502 01:03:30,810 --> 01:03:33,143 then you should be able to answer those questions pretty 1503 01:03:33,143 --> 01:03:34,700 straightforwardly. 1504 01:03:34,700 --> 01:03:36,920 And let me just say that understanding 1505 01:03:36,920 --> 01:03:38,660 a scientific paper is not trivial. 1506 01:03:38,660 --> 01:03:41,810 When I write a scientific paper, right in my area, 1507 01:03:41,810 --> 01:03:45,080 where I have all of the background, it takes me hours-- 1508 01:03:45,080 --> 01:03:46,310 hours. 1509 01:03:46,310 --> 01:03:47,360 It may be five pages. 1510 01:03:47,360 --> 01:03:49,310 It still takes me hours. 1511 01:03:49,310 --> 01:03:50,660 It's just how it is. 1512 01:03:50,660 --> 01:03:54,260 So when I assign a paper and you say, oh, it's only three pages, 1513 01:03:54,260 --> 01:03:55,880 I could do that in 20 minutes. 1514 01:03:55,880 --> 01:03:57,590 Oh, no, you can't. 1515 01:03:57,590 --> 01:03:59,150 No, you can't. 
1516 01:03:59,150 --> 01:04:01,460 And that's part of what I want you to learn how to do, 1517 01:04:01,460 --> 01:04:03,320 is how to really read and understand 1518 01:04:03,320 --> 01:04:04,430 the scientific paper. 1519 01:04:04,430 --> 01:04:07,790 Allocate the time it takes to really get it. 1520 01:04:07,790 --> 01:04:12,470 That's a big part of the agenda in this task. 1521 01:04:12,470 --> 01:04:13,850 All of the stuff-- 1522 01:04:13,850 --> 01:04:16,310 the assignments and the submission of the assignments-- 1523 01:04:16,310 --> 01:04:17,960 will all happen on Stellar. 1524 01:04:17,960 --> 01:04:20,180 Your first written response to a paper 1525 01:04:20,180 --> 01:04:24,082 is due February 12 at 6:00 PM on Stellar. 1526 01:04:24,082 --> 01:04:26,540 But there are other readings that are assigned before that. 1527 01:04:26,540 --> 01:04:27,830 A note about the schedule. 1528 01:04:27,830 --> 01:04:30,590 I struggled a lot trying to both have the assignments happen 1529 01:04:30,590 --> 01:04:33,048 when you had already learned enough in lectures to know how 1530 01:04:33,048 --> 01:04:36,090 to do it, but have it close enough to the topic at hand 1531 01:04:36,090 --> 01:04:38,240 so it didn't seem, like, no longer relevant. 1532 01:04:38,240 --> 01:04:39,960 It's hard to do both of those things. 1533 01:04:39,960 --> 01:04:42,350 So the compromise is, all of the assignments 1534 01:04:42,350 --> 01:04:44,780 are due at 6:00 PM the night before the class 1535 01:04:44,780 --> 01:04:46,430 in which they're assigned. 1536 01:04:46,430 --> 01:04:51,230 If you see that it's assigned on the 13th-- 1537 01:04:51,230 --> 01:04:54,170 if it's listed on the lecture for the 13th, check carefully. 1538 01:04:54,170 --> 01:04:56,353 It's probably due the night of the-- 1539 01:04:56,353 --> 01:04:57,770 I'm getting this wrong-- the 12th. 1540 01:04:57,770 --> 01:04:58,760 The night before. 1541 01:04:58,760 --> 01:05:00,897 And that's so that we and the TAs can look at it, 1542 01:05:00,897 --> 01:05:03,230 figure out what you understood, what you didn't, and how 1543 01:05:03,230 --> 01:05:05,210 to incorporate and explain whatever you 1544 01:05:05,210 --> 01:05:08,780 didn't get in the next lecture. 1545 01:05:08,780 --> 01:05:10,100 All right. 1546 01:05:10,100 --> 01:05:10,760 Quizzes. 1547 01:05:10,760 --> 01:05:12,040 I haven't done this before. 1548 01:05:12,040 --> 01:05:13,288 New thing I'm going to try. 1549 01:05:13,288 --> 01:05:15,080 There are going to be about eight of these. 1550 01:05:15,080 --> 01:05:16,520 They're going to be very brief. 1551 01:05:16,520 --> 01:05:19,820 They're going to happen at the end of class, in class. 1552 01:05:19,820 --> 01:05:22,250 And you will do them on your computer or your iPhone 1553 01:05:22,250 --> 01:05:23,330 using Google Forms. 1554 01:05:23,330 --> 01:05:25,400 If anybody doesn't have a computer or an iPhone 1555 01:05:25,400 --> 01:05:28,158 they can bring to class the days of those quizzes, 1556 01:05:28,158 --> 01:05:30,575 let us know after class and we'll come up with a solution. 1557 01:05:33,300 --> 01:05:35,210 And the idea of these is not to fish 1558 01:05:35,210 --> 01:05:38,180 out an obscure fact that was in one of the reading assignments 1559 01:05:38,180 --> 01:05:40,040 and ding you on it. 1560 01:05:40,040 --> 01:05:41,660 I'm not interested in that. 
1561 01:05:41,660 --> 01:05:44,220 The goal of this is just to keep you up to date, 1562 01:05:44,220 --> 01:05:47,340 keep you doing the readings, keep you up with the material. 1563 01:05:47,340 --> 01:05:49,238 And if you basically are understanding what 1564 01:05:49,238 --> 01:05:51,530 you're reading and understanding the lecture material-- 1565 01:05:51,530 --> 01:05:53,983 maybe you glance at it briefly before-- 1566 01:05:53,983 --> 01:05:55,400 you should do fine on the quizzes. 1567 01:05:55,400 --> 01:05:57,222 They're just kind of reality checks 1568 01:05:57,222 --> 01:05:59,180 for us to know what people are getting and not. 1569 01:06:02,210 --> 01:06:06,200 First quiz is February 20, blah, blah. 1570 01:06:06,200 --> 01:06:08,240 There is one longer written assignment 1571 01:06:08,240 --> 01:06:10,040 that is not due on the usual schedule 1572 01:06:10,040 --> 01:06:12,110 with all of the other things-- it's due near the end. 1573 01:06:12,110 --> 01:06:13,527 And in that one, you will actually 1574 01:06:13,527 --> 01:06:15,920 design an experiment in a particular area. 1575 01:06:15,920 --> 01:06:16,690 And that will be-- 1576 01:06:16,690 --> 01:06:19,190 I don't know yet-- three to five pages, something like that. 1577 01:06:19,190 --> 01:06:21,020 We'll give you more details on exactly how you 1578 01:06:21,020 --> 01:06:21,950 want to organize this. 1579 01:06:21,950 --> 01:06:23,570 And it will be very specific-- like, 1580 01:06:23,570 --> 01:06:25,790 state your exact hypothesis, state 1581 01:06:25,790 --> 01:06:27,887 your exact experimental design, et cetera. 1582 01:06:27,887 --> 01:06:30,095 And you'll get practice with those things in advance. 1583 01:06:32,600 --> 01:06:35,010 Those are the grading and requirements. 1584 01:06:35,010 --> 01:06:37,650 And this is the-- you have this all in the syllabus in front 1585 01:06:37,650 --> 01:06:38,150 of you. 1586 01:06:38,150 --> 01:06:39,680 This is the lineup of topics. 1587 01:06:39,680 --> 01:06:44,100 But very briefly, let me try to give you the arc of the class. 1588 01:06:44,100 --> 01:06:45,965 So this is the introduction. 1589 01:06:45,965 --> 01:06:47,840 Next time, we're going to do just a teeny bit 1590 01:06:47,840 --> 01:06:48,920 of neuroanatomy. 1591 01:06:48,920 --> 01:06:52,370 There will be a teeny bit of overlap with 900 and 901 there. 1592 01:06:52,370 --> 01:06:55,020 I'm going to whip through it in very superficial form. 1593 01:06:55,020 --> 01:06:58,180 I'm doing that largely because in the following class, 1594 01:06:58,180 --> 01:07:00,270 we have an amazing privilege, which 1595 01:07:00,270 --> 01:07:03,660 is that one of the greatest neuroscientists alive today, 1596 01:07:03,660 --> 01:07:07,020 Ann Graybiel, will be doing an actual brain dissection, 1597 01:07:07,020 --> 01:07:09,180 right here in this class, right in front of you. 1598 01:07:09,180 --> 01:07:10,800 It's going to be awesome. 1599 01:07:10,800 --> 01:07:11,850 I can't wait. 1600 01:07:11,850 --> 01:07:13,170 It's an incredible privilege. 1601 01:07:13,170 --> 01:07:15,378 It will be a real human brain, and you guys will be-- 1602 01:07:15,378 --> 01:07:17,160 Ann will be here with all her apparatus, 1603 01:07:17,160 --> 01:07:18,900 and you guys will be clustered around. 1604 01:07:18,900 --> 01:07:20,402 And if it's this many, god help us, 1605 01:07:20,402 --> 01:07:22,110 but we'll figure out how to make it work.
1606 01:07:22,110 --> 01:07:24,210 I may-- let me just say, if there are listeners in here, 1607 01:07:24,210 --> 01:07:26,280 I may have to tell listeners they can't come, 1608 01:07:26,280 --> 01:07:28,830 because we're very sensitive about not having too many people. 1609 01:07:28,830 --> 01:07:29,670 Stay tuned on that. 1610 01:07:29,670 --> 01:07:30,837 I haven't quite decided yet. 1611 01:07:30,837 --> 01:07:33,150 It depends how many people are taking the class. 1612 01:07:33,150 --> 01:07:35,080 But it's going to be amazing. 1613 01:07:35,080 --> 01:07:37,290 And I want to remind you of just some basics 1614 01:07:37,290 --> 01:07:40,680 so you're not asking her, like, what is the hippocampus? 1615 01:07:40,680 --> 01:07:44,292 You should all know that, but we'll just do bare basics. 1616 01:07:44,292 --> 01:07:45,750 And then we'll have the dissection. 1617 01:07:45,750 --> 01:07:46,690 That will be great. 1618 01:07:46,690 --> 01:07:48,540 And also, another thing to say is, 1619 01:07:48,540 --> 01:07:50,985 I mentioned that the subcortical regions are going to get 1620 01:07:50,985 --> 01:07:52,110 short shrift in this class. 1621 01:07:52,110 --> 01:07:53,610 That's true. 1622 01:07:53,610 --> 01:07:56,040 But a lot of what you see in the dissection 1623 01:07:56,040 --> 01:07:58,320 is the subcortical stuff. 1624 01:07:58,320 --> 01:08:00,930 Cortex is great, but it kind of all looks the same. 1625 01:08:00,930 --> 01:08:03,000 You kind of can't say, oh, that's this region. 1626 01:08:03,000 --> 01:08:03,630 That's the other region. 1627 01:08:03,630 --> 01:08:05,630 Well, you can, but it doesn't look any different 1628 01:08:05,630 --> 01:08:08,430 from any other region. 1629 01:08:08,430 --> 01:08:11,820 So the dissection is where you'll really see the subcortical stuff. 1630 01:08:11,820 --> 01:08:13,770 Then I'm going to do a couple of lectures 1631 01:08:13,770 --> 01:08:19,020 that focus on high-level vision, perceiving motion, and color, 1632 01:08:19,020 --> 01:08:21,630 and shape, and faces, and scenes, and bodies, 1633 01:08:21,630 --> 01:08:22,740 and stuff like that. 1634 01:08:22,740 --> 01:08:26,189 And we will use those both to teach you that content, 1635 01:08:26,189 --> 01:08:32,760 and also to teach you a vast array of methods in this field. 1636 01:08:32,760 --> 01:08:36,660 We will then have a lecture on the kind 1637 01:08:36,660 --> 01:08:38,580 of debates about the organization 1638 01:08:38,580 --> 01:08:40,560 of visual cortex in humans. 1639 01:08:40,560 --> 01:08:42,060 I have a particular view. 1640 01:08:42,060 --> 01:08:45,750 I'm very fond of the view that some patches of cortex 1641 01:08:45,750 --> 01:08:47,910 are very, very functionally specific. 1642 01:08:47,910 --> 01:08:49,350 Not everyone believes that. 1643 01:08:49,350 --> 01:08:51,000 So I have assigned readings from people 1644 01:08:51,000 --> 01:08:53,167 who have different views, and we will consider those. 1645 01:08:53,167 --> 01:08:56,130 I will try to expose you to the alternate views 1646 01:08:56,130 --> 01:08:58,529 and tell you why I'm teaching-- why I still believe mine, 1647 01:08:58,529 --> 01:09:02,670 but why other smart people believe different things. 1648 01:09:02,670 --> 01:09:05,640 We will then move up the system from perception, 1649 01:09:05,640 --> 01:09:08,670 and we will spend two meetings talking about scene perception 1650 01:09:08,670 --> 01:09:09,779 and navigation.
1651 01:09:09,779 --> 01:09:12,330 You got a hint about what an interesting area this is 1652 01:09:12,330 --> 01:09:13,529 from the story of Bob. 1653 01:09:13,529 --> 01:09:15,450 We'll consider more what we've learned 1654 01:09:15,450 --> 01:09:18,180 from studies of patients with brain damage, 1655 01:09:18,180 --> 01:09:21,270 from functional MRI, from physiology in animals, 1656 01:09:21,270 --> 01:09:24,180 from cognitive science, from the whole glorious menagerie 1657 01:09:24,180 --> 01:09:26,640 of methods to understand navigation. 1658 01:09:26,640 --> 01:09:30,029 It's a really fascinating area. 1659 01:09:30,029 --> 01:09:32,920 In the two lectures after that, we'll consider development. 1660 01:09:32,920 --> 01:09:34,380 How do you wire up a brain? 1661 01:09:34,380 --> 01:09:35,970 How much is present at birth? 1662 01:09:35,970 --> 01:09:37,930 What is specified in the genes? 1663 01:09:37,930 --> 01:09:39,569 What is learned? 1664 01:09:39,569 --> 01:09:42,960 And a lot of that will focus on the navigation system 1665 01:09:42,960 --> 01:09:44,850 and the face system, simply because that's 1666 01:09:44,850 --> 01:09:46,325 where there's a lot known. 1667 01:09:46,325 --> 01:09:47,700 We'll consider some other things, 1668 01:09:47,700 --> 01:09:49,710 but those are two areas where there's 1669 01:09:49,710 --> 01:09:52,890 super exciting work from just the last three or four years. 1670 01:09:52,890 --> 01:09:55,620 That's what we'll focus on there. 1671 01:09:55,620 --> 01:10:00,030 I'm then going to do a lecture on brains in blind people. 1672 01:10:00,030 --> 01:10:01,090 How are they different? 1673 01:10:01,090 --> 01:10:02,080 How are they the same? 1674 01:10:02,080 --> 01:10:04,470 What does that tell us? 1675 01:10:04,470 --> 01:10:07,060 And then you have the midterm. 1676 01:10:07,060 --> 01:10:09,240 Then we're going to move on and consider number. 1677 01:10:09,240 --> 01:10:12,210 How do you instantly know that that's three fingers 1678 01:10:12,210 --> 01:10:14,040 and that's two without having to do 1679 01:10:14,040 --> 01:10:16,170 anything all that complicated? 1680 01:10:16,170 --> 01:10:18,390 And if I had 25 fingers and held them up, 1681 01:10:18,390 --> 01:10:21,420 you would immediately get a sense that it was about 25. 1682 01:10:21,420 --> 01:10:24,180 You might not know if it was 22 or 28, 1683 01:10:24,180 --> 01:10:26,677 but you would know it was about 25. 1684 01:10:26,677 --> 01:10:28,260 And there are particular brain regions 1685 01:10:28,260 --> 01:10:29,343 that compute that for you. 1686 01:10:29,343 --> 01:10:30,810 And we will consider all of that. 1687 01:10:30,810 --> 01:10:33,600 And there's a very rich array of information 1688 01:10:33,600 --> 01:10:35,820 from studies of infant cognition, 1689 01:10:35,820 --> 01:10:39,630 from animal behavior, from brain imaging, from brain damage, 1690 01:10:39,630 --> 01:10:43,380 from single-unit physiology, and from computation, all of which 1691 01:10:43,380 --> 01:10:45,120 inform our understanding of number. 1692 01:10:45,120 --> 01:10:46,290 Those are my favorite lectures, where 1693 01:10:46,290 --> 01:10:47,748 we can take one domain of cognition 1694 01:10:47,748 --> 01:10:49,800 and inform it with all of the methods. 1695 01:10:49,800 --> 01:10:52,800 And numbers are a really great example. 1696 01:10:52,800 --> 01:10:54,630 Then we'll talk a little bit about-- 1697 01:10:54,630 --> 01:10:56,495 one of my TAs said, call it neuroeconomics.
1698 01:10:56,495 --> 01:10:57,370 That will sound good. 1699 01:10:57,370 --> 01:10:58,620 But actually, what I'm going to try to do 1700 01:10:58,620 --> 01:10:59,820 is a sort of neuroeconomics. 1701 01:10:59,820 --> 01:11:04,950 But it will be about pleasure, and pain, and reward, 1702 01:11:04,950 --> 01:11:08,280 and how we think about those things. 1703 01:11:08,280 --> 01:11:11,250 And that takes us down to April 3. 1704 01:11:11,250 --> 01:11:13,650 Just as a side note, all of these things 1705 01:11:13,650 --> 01:11:16,260 are things that are pretty similar between humans 1706 01:11:16,260 --> 01:11:17,550 and at least other primates. 1707 01:11:17,550 --> 01:11:20,850 And some of them are shared with rodents. 1708 01:11:20,850 --> 01:11:22,530 And most of the things after that 1709 01:11:22,530 --> 01:11:25,440 are things that are really uniquely human. 1710 01:11:25,440 --> 01:11:28,240 We'll really be moving away, with less animal 1711 01:11:28,240 --> 01:11:30,240 literature available to inform the stuff we're looking at, 1712 01:11:30,240 --> 01:11:33,190 because animals can't do these things. 1713 01:11:33,190 --> 01:11:36,032 And so we'll necessarily be farther from the details of individual neurons 1714 01:11:36,032 --> 01:11:37,740 and circuits, but there's still lots of cool stuff 1715 01:11:37,740 --> 01:11:41,040 that can be said about how you understand speech, 1716 01:11:41,040 --> 01:11:44,010 how you appreciate music. 1717 01:11:44,010 --> 01:11:46,140 There will be a guest lecture, just for fun, 1718 01:11:46,140 --> 01:11:49,470 on brain-machine interfaces by Michael Cohen, who's 1719 01:11:49,470 --> 01:11:50,970 working in my lab now, and who has 1720 01:11:50,970 --> 01:11:53,200 a great lecture on this topic. 1721 01:11:53,200 --> 01:11:55,840 Then we'll spend a couple of lectures on language-- 1722 01:11:55,840 --> 01:11:58,150 how you understand and produce language, 1723 01:11:58,150 --> 01:12:00,160 and what the relevant brain regions are, 1724 01:12:00,160 --> 01:12:02,080 what we know about it from cognition, 1725 01:12:02,080 --> 01:12:04,060 and lots of other methods-- 1726 01:12:04,060 --> 01:12:08,350 and what the relationship is between language and thought. 1727 01:12:08,350 --> 01:12:11,740 Then we'll think about how we think about other people. 1728 01:12:11,740 --> 01:12:13,150 This is called theory of mind-- 1729 01:12:13,150 --> 01:12:15,460 how I can look out at this lecture hall 1730 01:12:15,460 --> 01:12:18,270 and try to evaluate from your facial expressions: 1731 01:12:18,270 --> 01:12:21,910 are you bored, sleepy, overworked, fascinated, 1732 01:12:21,910 --> 01:12:22,870 excited? 1733 01:12:22,870 --> 01:12:25,030 All of this kind of stuff that all of us 1734 01:12:25,030 --> 01:12:27,100 do moment-to-moment in any conversation, 1735 01:12:27,100 --> 01:12:30,620 and that, yes, lecturers are doing all of the time, 1736 01:12:30,620 --> 01:12:32,620 even if I know that you guys have too much work, 1737 01:12:32,620 --> 01:12:34,078 and that's why you're sleepy, and I 1738 01:12:34,078 --> 01:12:36,598 shouldn't take it personally. 1739 01:12:36,598 --> 01:12:37,390 I'm still noticing. 1740 01:12:37,390 --> 01:12:42,190 Anyway, then we'll go on and consider brain networks. 1741 01:12:42,190 --> 01:12:43,570 Of course, brain function doesn't 1742 01:12:43,570 --> 01:12:46,420 happen in just a single region, even if we spend a lot of time 1743 01:12:46,420 --> 01:12:48,190 studying individual regions.
1744 01:12:48,190 --> 01:12:50,110 There's considerable work trying to figure out 1745 01:12:50,110 --> 01:12:51,850 which sets of regions work together, 1746 01:12:51,850 --> 01:12:54,280 how we could discover that, and what those broader 1747 01:12:54,280 --> 01:12:57,040 networks of brain regions are. 1748 01:12:57,040 --> 01:12:59,320 And then by May 6, you will have turned 1749 01:12:59,320 --> 01:13:01,630 in your longer written assignment 1750 01:13:01,630 --> 01:13:02,990 designing your own experiment. 1751 01:13:02,990 --> 01:13:05,890 And in class on May 6, we will work together in groups 1752 01:13:05,890 --> 01:13:08,500 to refine those experiments and really hash out the details 1753 01:13:08,500 --> 01:13:12,370 so you actually know how to design an experiment. 1754 01:13:12,370 --> 01:13:16,030 And then we will have this guest lecture from my postdoc, 1755 01:13:16,030 --> 01:13:18,730 Katharina Dobs, on deep nets and what they can tell us 1756 01:13:18,730 --> 01:13:21,190 about cognition and brains. 1757 01:13:21,190 --> 01:13:23,613 And then we'll talk about attention and awareness. 1758 01:13:23,613 --> 01:13:25,030 And then I'm not totally sure what 1759 01:13:25,030 --> 01:13:26,405 we're going to do in the last class, 1760 01:13:26,405 --> 01:13:28,900 but what I'm voting for is that the amazing TAs each 1761 01:13:28,900 --> 01:13:31,550 give a short talk on the cool stuff they're doing. 1762 01:13:31,550 --> 01:13:33,520 But that's under discussion. 1763 01:13:33,520 --> 01:13:36,580 OK, that's the arc of the class. 1764 01:13:36,580 --> 01:13:37,240 Questions? 1765 01:13:40,360 --> 01:13:41,000 All clear? 1766 01:13:44,000 --> 01:13:44,570 Great. 1767 01:13:44,570 --> 01:13:47,030 Well, if I have five more minutes, 1768 01:13:47,030 --> 01:13:48,650 maybe I'll do one other little thing. 1769 01:13:48,650 --> 01:13:50,707 Let me try this. 1770 01:13:50,707 --> 01:13:53,040 You asked-- I'm going to try to learn everybody's names, 1771 01:13:53,040 --> 01:13:54,440 but I'm not doing that yet, because some of you 1772 01:13:54,440 --> 01:13:56,300 might not show up and I will have wasted a whole piece 1773 01:13:56,300 --> 01:13:57,350 of my brain encoding it. 1774 01:13:57,350 --> 01:13:59,910 I'm just kidding. 1775 01:13:59,910 --> 01:14:01,940 But anyway, I remember that you asked, are you 1776 01:14:01,940 --> 01:14:03,210 going to read current papers? 1777 01:14:03,210 --> 01:14:04,460 Yes-- and you're right. 1778 01:14:04,460 --> 01:14:05,750 It's daunting. 1779 01:14:05,750 --> 01:14:08,510 But let me just say a little bit about how to read papers. 1780 01:14:08,510 --> 01:14:12,620 This is not a stats course, and we haven't required stats as a prerequisite. 1781 01:14:12,620 --> 01:14:17,660 Neither is it a course on the physics of MRI. 1782 01:14:17,660 --> 01:14:19,490 There will be parts of every MRI paper 1783 01:14:19,490 --> 01:14:21,020 that have a lot of gobbledygook. 1784 01:14:21,020 --> 01:14:24,980 We scanned with this scanning procedure. 1785 01:14:24,980 --> 01:14:27,890 We used this kind of scanner and this kind of blah, blah, blah. 1786 01:14:27,890 --> 01:14:28,910 Lots of gobbledygook. 1787 01:14:28,910 --> 01:14:31,100 You guys don't need to worry about that. 1788 01:14:31,100 --> 01:14:34,040 About the stats, it's kind of a judgment call. 1789 01:14:34,040 --> 01:14:37,490 Everyone in here should have an idea of what a p value means, 1790 01:14:37,490 --> 01:14:40,940 and I hope, a sense of what a t-test and an ANOVA are.
1791 01:14:40,940 --> 01:14:43,250 If you don't, I should probably tell you that offline, 1792 01:14:43,250 --> 01:14:44,840 because that's pretty basic. 1793 01:14:44,840 --> 01:14:46,370 And what a correlation is. 1794 01:14:46,370 --> 01:14:50,795 Beyond that, just use your intuitions about those things. 1795 01:14:50,795 --> 01:14:52,670 And of course, this is not about understanding 1796 01:14:52,670 --> 01:14:54,530 the details of the stats in each experiment. 1797 01:14:54,530 --> 01:14:56,863 There just isn't room to cover all of that as well as 1798 01:14:56,863 --> 01:14:58,490 the substance of the studies. 1799 01:14:58,490 --> 01:14:59,760 When you read a paper-- 1800 01:14:59,760 --> 01:15:03,410 for example, here's a paper-- 1801 01:15:03,410 --> 01:15:04,497 a very old paper. 1802 01:15:04,497 --> 01:15:06,080 You come across this, and it's like, 1803 01:15:06,080 --> 01:15:07,580 OK, here are all these words. 1804 01:15:07,580 --> 01:15:09,040 And it goes on for 20 pages. 1805 01:15:09,040 --> 01:15:10,850 And how do you even dig in? 1806 01:15:10,850 --> 01:15:13,880 Well, the way to dig in is to start by saying, 1807 01:15:13,880 --> 01:15:16,880 what question is being asked in this paper? 1808 01:15:16,880 --> 01:15:18,830 If the paper is well written, you'll 1809 01:15:18,830 --> 01:15:21,830 be able to find that in the abstract. 1810 01:15:21,830 --> 01:15:24,440 Blah, blah, blah, to study the effect of face inversion 1811 01:15:24,440 --> 01:15:26,465 on the human fusiform face area. 1812 01:15:26,465 --> 01:15:27,840 We'll talk about that more later. 1813 01:15:27,840 --> 01:15:29,540 But if you fish through the abstract, 1814 01:15:29,540 --> 01:15:31,400 you should be able to find what question is being asked. 1815 01:15:31,400 --> 01:15:33,317 And it's the first thing you should figure out 1816 01:15:33,317 --> 01:15:35,270 about a paper. 1817 01:15:35,270 --> 01:15:37,997 You don't necessarily read a paper straight through-- 1818 01:15:37,997 --> 01:15:39,080 beginning to end. 1819 01:15:39,080 --> 01:15:41,413 I think it's better to start with this list of questions 1820 01:15:41,413 --> 01:15:45,260 in your head and look for the answers to those questions. 1821 01:15:45,260 --> 01:15:46,430 Second question. 1822 01:15:46,430 --> 01:15:48,710 What did they find? 1823 01:15:48,710 --> 01:15:50,210 If the abstract is well written, you 1824 01:15:50,210 --> 01:15:52,430 can find that in the abstract as well. 1825 01:15:52,430 --> 01:15:54,830 Signal intensity from the fusiform face area 1826 01:15:54,830 --> 01:15:58,687 was reduced when grayscale faces were presented upside down. 1827 01:15:58,687 --> 01:16:00,020 Kind of boring, but there it is. 1828 01:16:00,020 --> 01:16:01,395 That's the finding of this paper. 1829 01:16:03,680 --> 01:16:05,300 What is the interpretation? 1830 01:16:05,300 --> 01:16:07,050 In other words, who cares? 1831 01:16:07,050 --> 01:16:09,095 Why-- who cares about this? 1832 01:16:11,918 --> 01:16:13,460 If you look in here, in the abstract, 1833 01:16:13,460 --> 01:16:16,550 FFA responds to faces per se rather 1834 01:16:16,550 --> 01:16:20,277 than to the low-level features present in faces. 1835 01:16:20,277 --> 01:16:21,860 We'll talk more about what that means. 1836 01:16:21,860 --> 01:16:23,480 You guys have an assignment about that-- probably 1837 01:16:23,480 --> 01:16:25,480 several assignments about that kind of question.
1838 01:16:28,327 --> 01:16:30,410 The next question you want to ask yourself is, what is 1839 01:16:30,410 --> 01:16:33,110 the design of this experiment? 1840 01:16:33,110 --> 01:16:35,840 Often, for this, you have to go beyond the abstract. 1841 01:16:35,840 --> 01:16:38,810 And I should say, even for these earlier questions, 1842 01:16:38,810 --> 01:16:40,760 sometimes you won't find them in the abstract. 1843 01:16:40,760 --> 01:16:43,280 That just means the abstract is not well written. 1844 01:16:43,280 --> 01:16:45,260 But that happens. 1845 01:16:45,260 --> 01:16:46,370 To get the design-- 1846 01:16:46,370 --> 01:16:48,200 like, what exactly did they do? 1847 01:16:48,200 --> 01:16:51,138 Usually, to figure out what exactly was done 1848 01:16:51,138 --> 01:16:52,430 and how the data were analyzed, 1849 01:16:52,430 --> 01:16:53,690 you need to fish farther. 1850 01:16:53,690 --> 01:16:57,770 You need to fish around other parts of the paper. 1851 01:16:57,770 --> 01:17:00,260 And of course, all of those questions-- 1852 01:17:00,260 --> 01:17:02,270 I just said, what question-- 1853 01:17:02,270 --> 01:17:03,350 I circled this part. 1854 01:17:03,350 --> 01:17:05,900 But there are many levels to one question. 1855 01:17:05,900 --> 01:17:10,590 You can get more on why that inversion question is important. 1856 01:17:10,590 --> 01:17:13,980 You look, usually, in the introduction to the paper. 1857 01:17:13,980 --> 01:17:16,370 Does the FFA respond to faces per se, 1858 01:17:16,370 --> 01:17:18,410 or to a confounding visual feature which 1859 01:17:18,410 --> 01:17:20,480 tends to be present in faces? 1860 01:17:20,480 --> 01:17:23,130 Second, is it true that inverted faces cannot engage 1861 01:17:23,130 --> 01:17:24,320 face-specific mechanisms? 1862 01:17:24,320 --> 01:17:25,065 Blah, blah, blah. 1863 01:17:25,065 --> 01:17:26,690 That gives you a little more background 1864 01:17:26,690 --> 01:17:28,070 on what the question is. 1865 01:17:28,070 --> 01:17:30,388 There are different levels of depth. 1866 01:17:30,388 --> 01:17:32,180 These are all things you want to be looking 1867 01:17:32,180 --> 01:17:33,710 for when you read a paper. 1868 01:17:33,710 --> 01:17:34,910 What exactly was done? 1869 01:17:34,910 --> 01:17:36,950 We measured MRI responses in the FFA 1870 01:17:36,950 --> 01:17:38,570 to upright and inverted faces. 1871 01:17:38,570 --> 01:17:39,830 I don't expect you to understand all of this. 1872 01:17:39,830 --> 01:17:41,390 This is just giving you, schematically, 1873 01:17:41,390 --> 01:17:43,223 how you proceed when you're reading a paper. 1874 01:17:45,620 --> 01:17:48,470 More on the interpretation, or who cares? 1875 01:17:48,470 --> 01:17:50,960 This result would show that face-specific mechanisms are 1876 01:17:50,960 --> 01:17:53,300 engaged only or predominantly by upright faces, 1877 01:17:53,300 --> 01:17:54,070 blah, blah, blah. 1878 01:17:54,070 --> 01:17:55,668 You can fish through for those things. 1879 01:17:55,668 --> 01:17:57,710 The point is to have those questions in your head 1880 01:17:57,710 --> 01:18:00,290 when you read a paper. 1881 01:18:00,290 --> 01:18:03,530 It's much easier and more engaging to read something 1882 01:18:03,530 --> 01:18:05,720 if you have an agenda when you read it. 1883 01:18:05,720 --> 01:18:07,795 Your agenda, in reading scientific papers, 1884 01:18:07,795 --> 01:18:09,545 is to answer those questions for yourself. 1885 01:18:12,860 --> 01:18:13,370 More stuff.
1886 01:18:13,370 --> 01:18:14,713 What was the design and logic? 1887 01:18:14,713 --> 01:18:16,130 Often, that's deep in the methods. 1888 01:18:16,130 --> 01:18:17,540 You have to fish around and find it. 1889 01:18:17,540 --> 01:18:19,582 There will be some set of conditions and designs. 1890 01:18:19,582 --> 01:18:22,340 We'll talk more about all this kind of stuff. 1891 01:18:22,340 --> 01:18:25,775 What exactly was done, blah, blah, more details. 1892 01:18:29,450 --> 01:18:31,580 And this is an example of the kind of gobbledygook 1893 01:18:31,580 --> 01:18:34,490 that you can ignore. 1894 01:18:34,490 --> 01:18:37,160 Subjects were scanned on a 1.5 T scanner. 1895 01:18:37,160 --> 01:18:38,280 And there are all these-- 1896 01:18:38,280 --> 01:18:40,940 here's an example of said gobbledygook. 1897 01:18:40,940 --> 01:18:43,078 You can ignore this, in this class. 1898 01:18:43,078 --> 01:18:45,370 Every method will have different kinds of gobbledygook. 1899 01:18:45,370 --> 01:18:46,750 This is MRI gobbledygook. 1900 01:18:46,750 --> 01:18:49,720 You can ignore it. 1901 01:18:49,720 --> 01:18:53,800 It matters a lot, but not here. 1902 01:18:53,800 --> 01:18:54,370 What else? 1903 01:18:54,370 --> 01:18:56,110 How are the data analyzed? 1904 01:18:56,110 --> 01:18:58,000 If you look in the-- 1905 01:18:58,000 --> 01:19:01,107 sometimes, there's a data analysis section, or a results 1906 01:19:01,107 --> 01:19:03,190 section, or a methods section, that will tell you. 1907 01:19:03,190 --> 01:19:07,090 You can find that and figure it out. 1908 01:19:07,090 --> 01:19:08,200 What was the finding? 1909 01:19:08,200 --> 01:19:09,490 Here's more on the finding. 1910 01:19:09,490 --> 01:19:12,640 Again, you just fish through for these things. 1911 01:19:12,640 --> 01:19:15,910 The point is just, when you're reading a paper, 1912 01:19:15,910 --> 01:19:17,530 it's not necessarily linear-- 1913 01:19:17,530 --> 01:19:20,230 what I do is, I read the title, I read the abstract. 1914 01:19:20,230 --> 01:19:22,480 And then I start answering those questions for myself. 1915 01:19:22,480 --> 01:19:24,730 And sometimes, at that point, I'm skipping to figures. 1916 01:19:24,730 --> 01:19:26,110 I'm skipping to methods. 1917 01:19:26,110 --> 01:19:28,990 Any of that is fine. 1918 01:19:28,990 --> 01:19:31,480 Don't feel like you need to understand each word, 1919 01:19:31,480 --> 01:19:33,378 especially deep in the methods. 1920 01:19:33,378 --> 01:19:33,920 I don't know. 1921 01:19:33,920 --> 01:19:35,498 Was that helpful at all? 1922 01:19:35,498 --> 01:19:37,540 We'll try it, and you guys will give me feedback, 1923 01:19:37,540 --> 01:19:39,650 and if it works, great. 1924 01:19:39,650 --> 01:19:43,330 And if not, we'll do more on how to read papers. 1925 01:19:43,330 --> 01:19:45,580 All right, it's 12:25. 1926 01:19:45,580 --> 01:19:47,940 See you on Monday.