[SQUEAKING] [RUSTLING] [CLICKING]

JAMES PAINE: Hi, everybody. My name is James Paine. I'm with the System Dynamics Group here at MIT. I'll give you a little bit more background on myself in a few minutes. But just to be sure everyone's where they're supposed to be, this is System Dynamics-- Systems Thinking and Modeling for a Complex World. This is a three-hour IAP overview session. Just making sure no one stands up and walks out at this point. OK, good, yes. I'm doing pretty good.

A little overview of what we're doing today. Right now, we're in the middle of the welcome, which is nice. We're going to do a quick overview of "What is systems thinking?" And I'm going to try to emphasize this throughout the day: there is a subtle difference between system dynamics, the method set, and systems thinking, the concept and field. They overlap, and they flow in and out of each other, but they are subtly different.
Additionally, because this is only a three-hour overview, we won't have time to dive into a lot of the specific tools, methods, and nitty-gritty. I'm happy to discuss that in more detail, and I'm even thinking about planning a future session on that.

So one of the things we thought about was, what's the way we can get people engrossed in this material as quickly and efficiently as possible? And our thought there is, we're going to do something that we like to do quite a bit, which is a management flight simulator. That's going to take up a good chunk of the middle of the afternoon. I hope everyone has been receiving my emails. I'm sorry for all of them. But we need essentially one laptop per three to five people in the entire group, and I think we're good with that. That will give us an opportunity to get hands-on with a system dynamics-fueled simulator, and afterwards, we're going to sit down and talk about how the lessons from that simulation tie into some of our earlier concepts.
And finally, we'll end with an overview of some additional resources for those of you who are interested not only in system dynamics generally but in the System Dynamics Group here at MIT specifically.

So who am I? I'll say this-- I did not realize the screen was so big, because my face is very large up there. Yeah. My name is James Paine. I'm a second-year PhD student in the System Dynamics Group. My path here to MIT was tortuous, to say the least. I worked for approximately 10 years in various industry settings: as a chemical engineer at GE Hitachi Nuclear Energy; as an industrial engineer and Lean Six Sigma specialist at a company called Inmar, which does reverse logistics; and finally, as a product manager at HanesBrands on the Maidenform and Playtex 18 Hour brands. So, all over the place. One of the reasons I found the System Dynamics Group specifically was when I decided to go back into academia as a career.
The System Dynamics Group was one of the few places where, during my interviews with different folks, when I said I've worked on everything from nuclear reactors to bras, the response was, "OK, that makes sense." And my response to that was, "Well, that doesn't really make sense to me-- I'm curious why it makes sense to you." And we kind of dove into it from there. We'll get more into that whole idea of systems-level thinking.

My specific research interests focus primarily on this concept of behavioral operations management. Some of you might be familiar with operations management as a concept; behavioral OM specifically looks at the decision-making processes that come about as a result of human beings existing within a supply chain, and at either trying to work around that or with it. My constant refrain for all of this is that, within behavioral OM, the presence of people is not a bug, it's a feature-- so what can you do to plan not only around people but with them?

So, a little bit more about the System Dynamics Group here at MIT specifically. Here are our faculty members.
We have John Sterman, Nelson Repenning, Hazhir Rahmandad, and David Keith. John Sterman is a big presence in the system dynamics field at this point. He is the author of-- and I'll talk about this later on-- this lovely textbook right here, which is a big old sucker: Business Dynamics. This has become the de facto textbook of the field. There are a number of different resources out there-- I'll talk about this again at the very end of the class-- but given limited time and limited resources, my suggestion would be to grab one of the copies out of the Dewey Library over here and read the first two chapters. That greatly overlaps what we're going to talk about today and, in my opinion, covers a lot of the fundamental concepts of system dynamics and systems thinking.

So, a very brief history of system dynamics. System dynamics is, at its core, an MIT-founded field. It came out of the mind of this man right here, Jay Forrester, and its origins are in control theory.
I want to preface this by saying this is my view of system dynamics, and it's coming from the standpoint of someone whose background is all engineering and operations management. I view system dynamics as control theory applied to social systems. You might get some slight disagreement from different people in the group, but that's at least my one-sentence summary of what system dynamics is.

The real seminal work here came in 1958, when he published "Industrial Dynamics," which was the first formalization of the concepts we're going to be talking about today. Jay Forrester himself-- if anyone here has a background in electrical engineering, you might recognize his name. He's the original inventor of RAM. So his background is primarily around computers. He was involved in Whirlwind I, which was MIT's first multipurpose digital computer. He did a lot of work in digital computing, a lot of work in the electronics space. Actually, if you go up to the fourth floor-- room 450-- it's the Jay Forrester Conference Room.
In there, there are pictures of him on the ranch he grew up on. There are chunks of memorabilia from his history with system dynamics. There's also an original piece of hand-woven RAM shoved up there. So if you have any interest in early computing history, that's a fun bookshelf to wander through and touch and feel.

So, really, how this started was, he came to Sloan with this idea: what can I do to take my background in engineering and apply it to business systems? This was born out of his view that, when faced with a problem in this space, you could only get so far treating it as a purely engineering problem. Ultimately, you would run into social issues, bureaucratic issues, and structural issues inside the organization that is trying to implement the solution-- issues that have to become part of the solution in and of itself.
So he asked himself, how can I broaden and apply some of this theory that I've learned and developed over the course of my life to this wider concept of social systems? His real test of that came in two forms. One was an early project with General Electric at an appliance plant, where they noticed they had a three-year cycle, specifically in their employment history. It was sort of boom and bust: a whole bunch of people onboarding, onboarding, onboarding, and then mass quits every three years. And the question was, where is this coming from? What's happening? So he sat down and hand-coded one of the first-- what we consider to be-- system dynamics models within this field and presented it. That led to his work in "Industrial Dynamics: A Major Breakthrough for Decision Makers." Where this got a lot of attention and spread beyond our group here at MIT was when he was invited to the Club of Rome and started applying some of this work to larger social systems in his WORLD2 simulation.
This led to a book later on-- I'm sure you've heard of it-- called The Limits to Growth, which is, in and of itself, a foundational book within system dynamics, but also a little bit controversial, in that it was somewhat misconstrued as making concrete predictions about the future. Instead, it was trying to make some comments about the overarching social structures we live in and the limits of resource renewal and regeneration.

So that's my quick background of where we are and how we got here. What's interesting is what that's led to: a group here with a wide, wide range of backgrounds. These are the current PhD students in our group, with me down there in the middle again. We have two other members in the room-- Jose back there in the corner, and Ty sitting over here. Just going to call them out. Yeah. The reason I have this slide up is partially so you can hunt us all down later on if you have questions, and also for this little line below-- where we all came from and how we all got here.
You'll notice that it's kind of all over the place. We have some engineering, some aeronautics, some electrical. We also have biology. We have geophysics. We have applied mathematics. So folks come to system dynamics through a variety of different means and methods. But ultimately-- at least my opinion on this-- it comes from someone taking a systematic approach, noticing that a systematic approach by its very nature must exist within a system, and asking how we can apply that more widely.

So, just to give you an idea of where this can lead, I'm going to give you two examples of research I'm currently working on. I'll come back to this later. This first one is what I would consider more traditional system dynamics, in the sense that-- notice this lovely little diagram, which we'll talk more about later on-- it's a stock and flow diagram, a compartmental model.
This is applying the concepts of systems thinking that we'll talk about a little more later on to the idea of nonprofit organizations and decision making within that space-- specifically, how nonprofits, within the performance landscape in which they operate, are subject to unique and different pressures that for-profits are not subject to. None of you should be surprised by that sentence; that's not a crazy claim. But by its very nature-- and as an outcome of this paper-- you get a performance landscape that says not only is it more difficult to be a nonprofit in certain settings, it is structurally impossible for you to achieve certain types of goals. So that's where this whole line came from: why is it so difficult to be a nonprofit manager under certain circumstances?

A totally different chain of research here is this idea of multi-echelon supply chains and inventory oscillation. This is built on a classic model within this space. Has anyone here ever played the Beer Game? Beer Game?
We've got a couple of hands. So this is ultimately me looking at the Beer Game and being frustrated and saying, we've been talking about the Beer Game for 60 years-- why are we still talking about it? What's going on in this space? So this is taking, again, the idea of people being people, letting their ordering behavior be what it is, and asking what we can do with an algorithmic approach to the control mechanisms that exist within a supply chain with delays. In this case, I'm not really doing a lot with what one would consider traditional system dynamics tools, but I'm spending a lot of time in TensorFlow, a lot of time in Python, a lot of time doing optimization within these spaces. And that still sits within this larger concept of system dynamics and systems thinking.

So let's dive in a little bit more. I've been saying "system dynamics" and "systems thinking" a lot. You're in a three-hour overview session, so I'm assuming most of you don't necessarily have a strong background in this. Actually, that's a good question to ask right now.
Has anyone here already taken any of the system dynamics classes at MIT? OK, good. Because otherwise this would be really repetitive for you, so that's great. OK.

So who here has ever seen a diagram that looks like this? Does it look familiar? You have a problem: you identify the problem, you gather data, you evaluate alternatives, you select a solution, and you implement it. Anyone here ever worked in project management, or in that sort of environment? Yeah. My background is Lean Six Sigma-- once upon a time-- and this is a highly simplified version of the idea of DMAIC. Though, for those of you who have a background in Lean Six Sigma and know DMAIC, don't jump ahead of me, because I'm going to get to something important here.

So within this concept of identifying a problem, finding a solution, and implementing it, ultimately what we're saying is that we have some sort of goal-- we want to fix a problem. We're going to make some sort of decision. And by making that decision, we're going to change the state of our system.
Great, we did it. Congratulations. I'm assuming-- it's partially given away by the title of this section of the slides-- you can tell that this is not complete. Also, just by the nature of how I've arranged these variables on the board, I'm kind of giving away where I'm going with some of this. Because at the end of the day, this is not true in and of itself. This is all embedded in a larger system, and that matters even in the short term. When you change the state of your system, your decisions have to change. Suddenly, when you've changed how you implement or control a specific structural process within, I don't know, your supply chain, the way you go about managing that supply chain on a daily basis is now different. Additionally, because of that change, maybe now your goals are a little bit different.
Before-- again, I'm going to talk in the language of supply chains-- you were shooting for that semi-arbitrary number of 3.4 defects per million opportunities-- and I have a whole other soapbox about why that is not the appropriate number to choose. Now you've done it, so you want to do something else; your goals have changed. But all of a sudden, by doing this, you realize that you're affecting not only your one process but other processes, other things within your system. And that can lead to this thing we call side effects. And maybe, by changing the state of my system, suddenly a supplier has to change how they're acting for me. Suddenly, maybe an order stream has to change. Somebody outside of myself and my team has to change their decisions based upon the state of the system. And that state of the system feeds right back in there. So you can see where I'm going with this.
Additionally, when everything is said and done and you've built this lovely process up, with every piece having its own side effects and building off everything else-- here's the fun bit-- none of this happens instantaneously. There are delays after delays after delays. So you, making a choice today that affects your system and achieves that goal within the open-loop thinking model, can ultimately come right back around and affect that original input again. This is the fundamental concept of systems-level thinking. No decision you make exists in isolation. No process that you affect exists in isolation. The only difference is the boundaries of your system. In fact, I go so far as to say there is no such thing as side effects-- there are simply effects that you have not thought about yet.

So what is a system? A system is any set of interdependent parts with a common purpose.
Now, that common purpose-- I'll be honest-- is a bit of a caveat, because as you can probably tell, if you take the systems-level concept and apply it ad nauseam, suddenly you have a system of every atom in the universe, and that ultimately is not feasible. You do eventually have to draw a boundary around your system. So for our purposes, very generally, we're going to say: define a purpose, and then your system is all those parts that exist within that purpose.

And-- I put my animations out of order there, apparently-- social and economic systems are highly complex. They are dynamic. They are tightly coupled-- by that, I mean that when one thing happens, you can trace it to another thing occurring somewhere else. They are governed by feedback-- ultimately, the decision you make affects something that then affects that same decision process in the future. They are nonlinear, you have limited information, and there's ambiguity. And typically, they're more complex than human-made physical systems.
And so, ultimately, when you take a lovely little diagram like this of one thing-- water flowing down, water flowing down, water flowing down-- the part that people often forget is the feedback loop that goes right back up to the top again. So when we talk about systems thinking and system dynamics, it's really a framework to help close the loops. I talked before about this idea of open-loop thinking. This is our method of taking that process and saying, what are the choices I make downstream, and how do they come back and affect that first starting point at the beginning of the day?

Ultimately, within system dynamics, we have a general framework for figuring this sort of process out as best we can. A big part of it is eliciting mental models. Eliciting mental models, in this sense, means walking up to somebody and saying: I know you're acting rationally. I know the choices you make are the ones you think are best given the information set you have right now. I want to know why.
432 00:16:13,960 --> 00:16:16,660 I want to build a model of your mental process 433 00:16:16,660 --> 00:16:19,755 that when I run it and I look at it, it makes complete sense, 434 00:16:19,755 --> 00:16:21,130 and then take that model and then 435 00:16:21,130 --> 00:16:22,930 pop it in the context of a larger system 436 00:16:22,930 --> 00:16:25,388 and see how those decisions start coming back and affecting 437 00:16:25,388 --> 00:16:25,930 the outcome. 438 00:16:25,930 --> 00:16:27,370 And then-- that's exactly right-- 439 00:16:27,370 --> 00:16:29,860 taking that mental model and expanding it outwards 440 00:16:29,860 --> 00:16:31,660 and saying, OK, now you've made choices, 441 00:16:31,660 --> 00:16:34,630 how can we bring feedback into account? 442 00:16:34,630 --> 00:16:35,315 Simulation. 443 00:16:35,315 --> 00:16:36,940 So this is one thing-- for those of you 444 00:16:36,940 --> 00:16:39,580 who might have a small inkling of the System Dynamics Group 445 00:16:39,580 --> 00:16:41,835 here at MIT, a huge chunk of what we do is simulation. 446 00:16:41,835 --> 00:16:43,960 Because at the end of the day, it's a heck of a lot 447 00:16:43,960 --> 00:16:48,520 cheaper, quicker to simulate a social process, and changes, 448 00:16:48,520 --> 00:16:53,230 and tweaks, or impacts, or effects 449 00:16:53,230 --> 00:16:55,360 than actually going out there and doing it directly 450 00:16:55,360 --> 00:16:56,512 on social systems. 451 00:16:56,512 --> 00:16:57,970 It turns out that takes a long time 452 00:16:57,970 --> 00:17:00,137 and often makes people grumpy if you do it directly. 453 00:17:00,137 --> 00:17:02,420 So simulation is really, really helpful. 
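To make the appeal of simulation concrete, here is a toy word-of-mouth adoption model in Python-- a sketch in the spirit of the lecture, with every parameter value invented for illustration rather than taken from any real study:

```python
# Toy word-of-mouth adoption model (Bass-style diffusion), integrated with
# a simple Euler step. Every number here is invented for illustration.

def simulate_adoption(population=10_000, contact_rate=100.0,
                      adoption_fraction=0.02, dt=0.25, years=10):
    """Return a list of (time, adopters) points."""
    adopters = 10.0  # the "stock": people who have adopted so far
    history = []
    for step in range(int(years / dt) + 1):
        history.append((step * dt, adopters))
        potential = population - adopters
        # The "flow": contacts between adopters and potential adopters,
        # a small fraction of which convert.
        adoption_rate = (contact_rate * adoption_fraction
                         * adopters * potential / population)
        adopters += adoption_rate * dt  # integrate the flow into the stock
    return history

# The characteristic S-curve: slow start, explosive middle, saturation.
for t, adopters in simulate_adoption()[::8]:
    print(f"t = {t:5.2f} years   adopters = {adopters:8.0f}")
```

Tweaking `contact_rate` or `adoption_fraction` and re-running takes seconds-- the "cheaper and quicker than experimenting on real people" point in miniature.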
454 00:17:02,420 --> 00:17:04,869 So simulation also helps improve mental models 455 00:17:04,869 --> 00:17:08,200 and serves as a teaching tool, 456 00:17:08,200 --> 00:17:11,890 showing folks self-contained models and simulations 457 00:17:11,890 --> 00:17:15,339 of a process that both show how their mental model fits 458 00:17:15,339 --> 00:17:17,319 and makes sense and is perfectly rational, 459 00:17:17,319 --> 00:17:19,359 but then also, in a larger context, 460 00:17:19,359 --> 00:17:23,982 can affect and have feedback. 461 00:17:23,982 --> 00:17:24,940 So this is the big one. 462 00:17:24,940 --> 00:17:27,910 The simulation's purpose is not to be right. 463 00:17:27,910 --> 00:17:31,090 I put this up here partially for my own edification 464 00:17:31,090 --> 00:17:33,080 and my own learning. 465 00:17:33,080 --> 00:17:35,530 I spend way too much time building these models, 466 00:17:35,530 --> 00:17:37,883 both within the core system dynamics tools 467 00:17:37,883 --> 00:17:39,550 that we'll talk about a little bit later 468 00:17:39,550 --> 00:17:43,750 and also within my own side environments, saying, OK, I 469 00:17:43,750 --> 00:17:47,200 want to get this simulation to match some reference 470 00:17:47,200 --> 00:17:49,780 mode of data as closely as possible. 471 00:17:49,780 --> 00:17:52,570 But the point there, though, is that being right 472 00:17:52,570 --> 00:17:55,040 is not the same as being useful.
473 00:17:55,040 --> 00:17:56,560 So you want to make sure that you're 474 00:17:56,560 --> 00:17:59,380 identifying the high-leverage policy choices 475 00:17:59,380 --> 00:18:04,247 within your model, the portions of that multi-loop mode 476 00:18:04,247 --> 00:18:06,580 of thinking we talked about earlier that actually affect 477 00:18:06,580 --> 00:18:09,280 the outcome that you care about, not necessarily 478 00:18:09,280 --> 00:18:12,940 have a simulation that directly matches the physical universe 479 00:18:12,940 --> 00:18:15,310 around us. 480 00:18:15,310 --> 00:18:18,250 These next are what I consider to be the three core 481 00:18:18,250 --> 00:18:20,460 points of systems thinking and system dynamics, 482 00:18:20,460 --> 00:18:21,460 and this is the big one. 483 00:18:21,460 --> 00:18:22,840 Structure generates behavior. 484 00:18:22,840 --> 00:18:24,820 If you walk into a system dynamics conference, 485 00:18:24,820 --> 00:18:26,740 if you walk into any room with enough people 486 00:18:26,740 --> 00:18:29,380 who do system dynamics for more than five minutes, 487 00:18:29,380 --> 00:18:31,330 eventually someone's going to say this. 488 00:18:31,330 --> 00:18:32,622 Structure generates behavior. 489 00:18:32,622 --> 00:18:34,330 And I'm going to spend a little more time 490 00:18:34,330 --> 00:18:35,290 on this in a few minutes. 491 00:18:35,290 --> 00:18:36,832 The whole idea of structure generates 492 00:18:36,832 --> 00:18:41,470 behavior is that the actions that people take 493 00:18:41,470 --> 00:18:45,130 are possible, feasible, and rational because 494 00:18:45,130 --> 00:18:47,230 of the universe in which they exist, 495 00:18:47,230 --> 00:18:49,060 and it's important to realize that. 496 00:18:49,060 --> 00:18:51,430 And that actually ties into the idea that mental models 497 00:18:51,430 --> 00:18:52,540 matter a lot.
498 00:18:52,540 --> 00:18:54,830 So physical structure-- when we talk about structure, 499 00:18:54,830 --> 00:18:56,600 it's not just the physical structure of your system, 500 00:18:56,600 --> 00:18:58,808 it's the structure that exists within people's heads. 501 00:18:58,808 --> 00:19:01,270 It's the choices that they make when they take an input 502 00:19:01,270 --> 00:19:02,470 and then do an output. 503 00:19:02,470 --> 00:19:05,920 Again, my background is all process control and process 504 00:19:05,920 --> 00:19:06,680 engineering. 505 00:19:06,680 --> 00:19:09,880 So one way you can think of this as little tiny PID controllers 506 00:19:09,880 --> 00:19:12,440 floating around in the universe taking inputs and outputs. 507 00:19:12,440 --> 00:19:15,010 However, figuring out what those inputs and outputs are 508 00:19:15,010 --> 00:19:16,780 and how people convert that information 509 00:19:16,780 --> 00:19:20,770 is difficult and should not be casually assumed away. 510 00:19:20,770 --> 00:19:23,230 And then, finally, the fundamental attribution error. 511 00:19:23,230 --> 00:19:25,660 This is something that I added back in here, 512 00:19:25,660 --> 00:19:28,463 because this idea that our first instinct is 513 00:19:28,463 --> 00:19:30,130 to blame the people and not the system-- 514 00:19:30,130 --> 00:19:31,963 and this goes back to this idea of structure 515 00:19:31,963 --> 00:19:32,930 generates behavior. 516 00:19:32,930 --> 00:19:35,770 So like, for example, if you're on the highway and suddenly 517 00:19:35,770 --> 00:19:40,090 someone cuts you off, what's your initial reaction? 518 00:19:40,090 --> 00:19:41,050 Screw that guy! 519 00:19:41,050 --> 00:19:42,190 What the heck! 520 00:19:42,190 --> 00:19:43,900 Honk, honk, honk. 
521 00:19:43,900 --> 00:19:45,910 Now suddenly, you're on the highway and-- 522 00:19:45,910 --> 00:19:48,130 I don't know-- you're late to get somewhere-- for me, 523 00:19:48,130 --> 00:19:50,350 I got a call that my kid is throwing up at preschool, 524 00:19:50,350 --> 00:19:52,142 I got to get there in the next 10 minutes-- 525 00:19:52,142 --> 00:19:52,760 what do I do? 526 00:19:52,760 --> 00:19:56,560 I cut somebody off saying, sorry, it's OK, and I zoom off. 527 00:19:56,560 --> 00:19:59,230 So in that situation-- that first situation, when I'm honk, 528 00:19:59,230 --> 00:20:01,810 honk, honk, that guy-- that guy who just cut me off-- 529 00:20:01,810 --> 00:20:02,830 what's wrong with them? 530 00:20:02,830 --> 00:20:05,510 Something is wrong with that person. 531 00:20:05,510 --> 00:20:06,010 No. 532 00:20:06,010 --> 00:20:07,630 There's an underlying reason, there's 533 00:20:07,630 --> 00:20:10,177 an underlying rationality behind what they're doing. 534 00:20:10,177 --> 00:20:12,010 I will argue that there might be rationality 535 00:20:12,010 --> 00:20:13,270 or that person is just really bad at driving, 536 00:20:13,270 --> 00:20:14,478 but hopefully, that's not it. 537 00:20:14,478 --> 00:20:16,660 But the idea of the fundamental attribution error 538 00:20:16,660 --> 00:20:19,090 is that our first instinct is to blame the people and not 539 00:20:19,090 --> 00:20:19,780 the system. 540 00:20:19,780 --> 00:20:21,640 Back to structure generates behavior. 541 00:20:21,640 --> 00:20:24,760 It is the structure of the system that matters, 542 00:20:24,760 --> 00:20:26,110 less so the individual behavior. 543 00:20:26,110 --> 00:20:28,480 And even within our own group here at MIT, 544 00:20:28,480 --> 00:20:30,220 we have to remind ourselves of this idea 545 00:20:30,220 --> 00:20:31,450 of the fundamental attribution error. 546 00:20:31,450 --> 00:20:33,867 And we try to break away from it, but we're not perfect at it.
547 00:20:33,867 --> 00:20:36,340 This is written on the whiteboard up in our room. 548 00:20:36,340 --> 00:20:38,590 This is one of the first things I saw when I walked up 549 00:20:38,590 --> 00:20:40,360 to that group, and at the time, I just 550 00:20:40,360 --> 00:20:41,530 thought it was a nice saying and I 551 00:20:41,530 --> 00:20:43,322 didn't realize this was actually addressing 552 00:20:43,322 --> 00:20:45,752 one of the fundamental decision-making fallacies 553 00:20:45,752 --> 00:20:47,710 that we have to overcome as system dynamicists. 554 00:20:47,710 --> 00:20:49,360 So this is the basic assumption. 555 00:20:49,360 --> 00:20:51,220 "We believe that everyone in this community 556 00:20:51,220 --> 00:20:54,250 is intelligent and capable, cares about doing their best, 557 00:20:54,250 --> 00:20:56,915 and acts with integrity, and wants to learn." 558 00:20:56,915 --> 00:20:58,540 So the assumption is that, every person 559 00:20:58,540 --> 00:21:00,640 you meet, every person you sit down with 560 00:21:00,640 --> 00:21:02,743 has this operating in the background 561 00:21:02,743 --> 00:21:04,660 and you have to assume that they are rational, 562 00:21:04,660 --> 00:21:07,690 caring human beings who are doing what they think is best. 563 00:21:07,690 --> 00:21:10,210 And the moment that you accept that, then, suddenly, it's 564 00:21:10,210 --> 00:21:11,900 not the person you have to change, 565 00:21:11,900 --> 00:21:14,400 it's the system in which they're operating that needs to get 566 00:21:14,400 --> 00:21:15,560 shifted. 567 00:21:15,560 --> 00:21:18,070 So this drills down some more 568 00:21:18,070 --> 00:21:19,990 into this whole idea that structure generates behavior. 569 00:21:19,990 --> 00:21:24,528 So events are the most visible. 570 00:21:24,528 --> 00:21:26,570 Underneath events, you have patterns of behavior. 571 00:21:26,570 --> 00:21:28,260 Under that, you have structure.
572 00:21:28,260 --> 00:21:31,520 So when you look out into the universe and you see something, 573 00:21:31,520 --> 00:21:32,695 those are the events. 574 00:21:32,695 --> 00:21:34,820 You see those things happening over and over again, 575 00:21:34,820 --> 00:21:36,237 those become patterns of behavior. 576 00:21:36,237 --> 00:21:38,770 And then, the system under which it's generating it 577 00:21:38,770 --> 00:21:39,840 is the actual structure. 578 00:21:39,840 --> 00:21:41,592 So here's an example. 579 00:21:41,592 --> 00:21:42,800 This is all about oil prices. 580 00:21:42,800 --> 00:21:44,990 "Drunk trader caused a spike in oil prices." 581 00:21:44,990 --> 00:21:46,910 "Oil prices keep falling-- this is why." 582 00:21:46,910 --> 00:21:50,300 "OPEC rumors continue to pull oil prices higher." 583 00:21:50,300 --> 00:21:52,400 OPEC turns for high oil prices. 584 00:21:52,400 --> 00:21:54,800 "Another sign of economic worry-- tumbling oil prices." 585 00:21:54,800 --> 00:21:57,573 Oil prices after tanker attack in the Gulf of Oman. 586 00:21:57,573 --> 00:21:59,240 So in this case, each one of these ones, 587 00:21:59,240 --> 00:22:01,580 you notice they're talking about oil prices jumping up, 588 00:22:01,580 --> 00:22:03,710 oil prices going down, oil prices jumping up, 589 00:22:03,710 --> 00:22:05,143 oil prices going down. 590 00:22:05,143 --> 00:22:06,560 These are the events that you see. 591 00:22:06,560 --> 00:22:08,150 These are the points of data. 592 00:22:08,150 --> 00:22:10,872 The pattern is a little bit more interesting. 593 00:22:10,872 --> 00:22:12,330 You take each one of those effects, 594 00:22:12,330 --> 00:22:14,540 you can't just draw a line out ad infinitum, 595 00:22:14,540 --> 00:22:15,810 you want to look at something like this. 
596 00:22:15,810 --> 00:22:17,685 So in this case, you can see the oil prices-- 597 00:22:17,685 --> 00:22:21,260 and I would see this as sort of a stereotypical boom and bust 598 00:22:21,260 --> 00:22:22,070 pattern. 599 00:22:22,070 --> 00:22:24,203 It goes up, it goes down for quite some time. 600 00:22:24,203 --> 00:22:25,370 Goes up, and then goes down. 601 00:22:25,370 --> 00:22:26,960 Goes up, and then goes down. 602 00:22:26,960 --> 00:22:28,760 I would actually even go further here 603 00:22:28,760 --> 00:22:31,115 and say that it looks like the amount of noise 604 00:22:31,115 --> 00:22:32,990 around this up-and-down pattern is increasing 605 00:22:32,990 --> 00:22:34,365 over time, which in and of itself 606 00:22:34,365 --> 00:22:36,410 is a little bit interesting. 607 00:22:36,410 --> 00:22:38,628 But ultimately, it's that structure 608 00:22:38,628 --> 00:22:39,920 in the background that matters. 609 00:22:39,920 --> 00:22:42,800 There's the physical structure, there's the information 610 00:22:42,800 --> 00:22:44,720 availability, and then there's also 611 00:22:44,720 --> 00:22:49,740 the actual mental model of the actors involved in the process. 612 00:22:49,740 --> 00:22:50,763 So we talk about this. 613 00:22:50,763 --> 00:22:51,680 This is a whole thing. 614 00:22:51,680 --> 00:22:54,500 Whenever you talk about mental models within this space, 615 00:22:54,500 --> 00:22:56,000 one of the things that's easy to say 616 00:22:56,000 --> 00:22:58,250 is, OK, well, you tell people about this. 617 00:22:58,250 --> 00:22:59,720 You tell them to consider feedback, 618 00:22:59,720 --> 00:23:02,210 you tell them to incorporate this information around them, 619 00:23:02,210 --> 00:23:04,310 and that'll fix it, that'll fix it. 620 00:23:04,310 --> 00:23:04,940 No.
621 00:23:04,940 --> 00:23:07,580 It is incredibly difficult to learn and change 622 00:23:07,580 --> 00:23:09,230 in a dynamically complex environment, 623 00:23:09,230 --> 00:23:11,700 and these are just 624 00:23:11,700 --> 00:23:13,858 some of the reasons why. 625 00:23:13,858 --> 00:23:15,650 I'm not going to walk through all of these. 626 00:23:15,650 --> 00:23:16,850 I think everyone in this room, probably, 627 00:23:16,850 --> 00:23:18,060 is on the same page with me on this one. 628 00:23:18,060 --> 00:23:19,643 One thing I'll point out, right there, 629 00:23:19,643 --> 00:23:22,340 is limited information and time delays. 630 00:23:22,340 --> 00:23:23,570 Gigantic reason. 631 00:23:23,570 --> 00:23:25,818 People can only make the decisions 632 00:23:25,818 --> 00:23:27,860 given the information they have available to them 633 00:23:27,860 --> 00:23:29,943 in the structure in which they're able to operate. 634 00:23:29,943 --> 00:23:32,450 To expect someone to operate otherwise 635 00:23:32,450 --> 00:23:34,950 is to expect them to be omniscient in some way, shape, 636 00:23:34,950 --> 00:23:38,000 or form, and ultimately, that is not feasible within a larger 637 00:23:38,000 --> 00:23:38,893 system. 638 00:23:38,893 --> 00:23:41,060 So system dynamics-- in my mind, the System Dynamics 639 00:23:41,060 --> 00:23:43,760 Group here at MIT is really applied systems thinking. 640 00:23:43,760 --> 00:23:44,580 It is not a method. 641 00:23:44,580 --> 00:23:45,950 It's not a model necessarily. 642 00:23:45,950 --> 00:23:48,625 It is taking the concepts of systems thinking and applying 643 00:23:48,625 --> 00:23:50,750 them in ways where we can get research outputs that 644 00:23:50,750 --> 00:23:59,845 ultimately improve some sort of social system for the world.
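One way to see "structure generates behavior" in miniature: a single negative feedback loop with a delay-- supply chasing price, but only after a lag-- produces boom-and-bust cycles like the oil-price pattern on its own. This sketch is purely illustrative; the variable names and numbers are invented, and nothing is calibrated to real oil data:

```python
# One negative feedback loop plus one delay is enough for boom and bust.
# Price falls when supply exceeds demand, and supply tracks price--
# but only with a lag. All numbers are invented for illustration.

def simulate_price(steps=120, delay=4, adjust=0.25, demand=100.0):
    """Price adjusts toward balancing supply with demand, but supply
    reflects the price from `delay` steps ago -- the delay is what
    turns a stabilizing loop into cycles."""
    prices = [120.0]  # start with a shock above the equilibrium price of 100
    for t in range(steps - 1):
        # Supply decisions were made `delay` steps ago, at the price then.
        supply = prices[max(t - delay, 0)]
        # Negative feedback: excess supply pushes the price down, and vice versa.
        prices.append(prices[-1] + adjust * (demand - supply))
    return prices

cycle = simulate_price()
print("peak: %.1f  trough: %.1f" % (max(cycle), min(cycle)))
```

Set `delay=0` and the cycles vanish-- the price just glides to equilibrium. Same decision rule, different structure, different behavior.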
645 00:23:59,845 --> 00:24:01,970 So now having said all that about systems thinking, 646 00:24:01,970 --> 00:24:05,540 I'm now going to dive into some specific, actual tools 647 00:24:05,540 --> 00:24:08,030 that we can go ahead and use here within this space. 648 00:24:08,030 --> 00:24:10,030 Let me check how we're doing here with time. 649 00:24:10,030 --> 00:24:10,820 We're doing great. 650 00:24:10,820 --> 00:24:11,360 All right. 651 00:24:11,360 --> 00:24:14,900 So systems thinking, ultimately, and system modeling 652 00:24:14,900 --> 00:24:16,040 is an iterative approach. 653 00:24:16,040 --> 00:24:18,410 There is no right answer at the very beginning, 654 00:24:18,410 --> 00:24:20,630 there is only the answer that you 655 00:24:20,630 --> 00:24:23,180 think is good enough to solve or address 656 00:24:23,180 --> 00:24:24,470 the problem you care about. 657 00:24:24,470 --> 00:24:25,670 It's a spiral approach. 658 00:24:25,670 --> 00:24:27,350 Just like everything else we'll talk about with open loop 659 00:24:27,350 --> 00:24:29,475 thinking, the process of building a system dynamics 660 00:24:29,475 --> 00:24:31,650 model is an attempt to close the loop. 661 00:24:31,650 --> 00:24:33,727 And it's not just compartmental models. 662 00:24:33,727 --> 00:24:35,810 So when people talk about system dynamics models-- 663 00:24:35,810 --> 00:24:37,880 if you hear that phrase-- what they're likely talking about 664 00:24:37,880 --> 00:24:39,172 is this lovely thing over here. 665 00:24:39,172 --> 00:24:41,060 This is from one of my papers, and I'm 666 00:24:41,060 --> 00:24:42,530 working on it right now. 667 00:24:42,530 --> 00:24:44,572 We'll talk about this in more detail in a second. 668 00:24:44,572 --> 00:24:47,360 It's called a stock and flow diagram or compartmental model. 669 00:24:47,360 --> 00:24:51,650 This is often conflated with system dynamics, the field.
670 00:24:51,650 --> 00:24:54,950 It's partially because this is the go-to modeling 671 00:24:54,950 --> 00:24:56,810 choice for many people in system dynamics, 672 00:24:56,810 --> 00:24:59,630 and this was also the first modeling choice 673 00:24:59,630 --> 00:25:02,210 back in the '50s, '60s, and '70s when approaching 674 00:25:02,210 --> 00:25:04,823 these sorts of systems. 675 00:25:04,823 --> 00:25:06,990 And then, of course, there's that saying: all models are wrong, 676 00:25:06,990 --> 00:25:09,240 but some models are useful at the end of the day. 677 00:25:09,240 --> 00:25:12,200 So when you're taking a spiral approach 678 00:25:12,200 --> 00:25:14,390 to systems modeling, what you want to do 679 00:25:14,390 --> 00:25:16,730 is you pick something within this setup. 680 00:25:16,730 --> 00:25:18,530 You think, OK, what's a reference mode? 681 00:25:18,530 --> 00:25:22,130 So by reference mode, I mean a mode of behavior 682 00:25:22,130 --> 00:25:24,740 that you can observe and pin down. 683 00:25:24,740 --> 00:25:28,970 This person made this decision when exposed to these inputs 684 00:25:28,970 --> 00:25:29,900 and had this output. 685 00:25:29,900 --> 00:25:31,610 Great, we have a reference mode. 686 00:25:31,610 --> 00:25:34,850 Can I build a model that reflects that reference mode? 687 00:25:34,850 --> 00:25:35,540 Awesome. 688 00:25:35,540 --> 00:25:36,830 Now that I've done that, let's go ahead 689 00:25:36,830 --> 00:25:38,038 and add a new reference mode. 690 00:25:38,038 --> 00:25:39,830 Can I make this reference mode exist 691 00:25:39,830 --> 00:25:41,000 within my existing diagram? 692 00:25:41,000 --> 00:25:41,310 Nope. 693 00:25:41,310 --> 00:25:42,690 Let's go ahead and add more structure. 694 00:25:42,690 --> 00:25:44,510 And as you add that structure, maybe it comes back 695 00:25:44,510 --> 00:25:46,052 and it feeds back into itself and you 696 00:25:46,052 --> 00:25:48,723 have to go back and kind of rework from the very beginning.
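For readers who want the stock-and-flow idea pinned down: a stock simply accumulates its inflows minus its outflows over time. A minimal sketch with one stock (inventory), one inflow (production), and one outflow (shipments)-- all values invented for illustration, not taken from the paper on the slide:

```python
# A single stock (inventory) accumulating an inflow (production) minus
# an outflow (shipments). All values are invented for illustration.

def simulate_inventory(inventory=50.0, dt=1.0, weeks=12):
    """Return the inventory level at the start of each week."""
    production = 100.0  # units/week, constant inflow
    levels = []
    for week in range(weeks):
        # Outflow steps up partway through, as if demand jumped.
        shipments = 90.0 if week < 6 else 110.0
        levels.append(inventory)
        # The stock integrates net flow: d(inventory)/dt = production - shipments.
        inventory += (production - shipments) * dt
    return levels

print(simulate_inventory())
```

Inventory climbs while production outpaces shipments, peaks, then drains-- the trajectory lives nowhere except in the stock itself, which is exactly what makes these models natural to simulate.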
697 00:25:48,723 --> 00:25:50,390 This is where this concept of simulation 698 00:25:50,390 --> 00:25:52,280 becomes really, really helpful, because when 699 00:25:52,280 --> 00:25:53,660 you interview somebody-- 700 00:25:53,660 --> 00:25:56,330 as I've done with some of the work in, especially, 701 00:25:56,330 --> 00:25:57,260 the non-profit space-- 702 00:25:57,260 --> 00:25:59,870 you might get six different stories describing 703 00:25:59,870 --> 00:26:03,440 three different outcomes across two different possible modes 704 00:26:03,440 --> 00:26:04,005 of behavior. 705 00:26:04,005 --> 00:26:05,630 And the question there is, how the heck 706 00:26:05,630 --> 00:26:08,330 do I incorporate this all into a dynamic model 707 00:26:08,330 --> 00:26:09,440 at the very beginning? 708 00:26:09,440 --> 00:26:10,850 My argument there is, you don't. 709 00:26:10,850 --> 00:26:11,788 Pick one. 710 00:26:11,788 --> 00:26:14,330 Pick one to model, then when you're done, pick another one 711 00:26:14,330 --> 00:26:15,960 and model that one and then see, OK, 712 00:26:15,960 --> 00:26:18,170 how do they overlap, how do they fit into each other, 713 00:26:18,170 --> 00:26:19,988 how can you make this one, unified model? 714 00:26:19,988 --> 00:26:21,530 And the big thing is, as you're going 715 00:26:21,530 --> 00:26:24,110 through this whole process, doing your sensitivity analysis 716 00:26:24,110 --> 00:26:25,985 and your testing, you're going back through 717 00:26:25,985 --> 00:26:28,975 and updating your own assumptions and mental models 718 00:26:28,975 --> 00:26:31,100 about what you thought the problem was to begin with. 719 00:26:31,100 --> 00:26:34,302 This is the idea of the spiral approach. 720 00:26:34,302 --> 00:26:36,760 So, OK, now we're actually going to jump into some actual-- 721 00:26:36,760 --> 00:26:38,510 I said before that system dynamics is not just 722 00:26:38,510 --> 00:26:39,503 compartmental models.
723 00:26:39,503 --> 00:26:41,920 I'm now going to teach you guys about compartmental models 724 00:26:41,920 --> 00:26:43,180 because we use them a lot in system dynamics. 725 00:26:43,180 --> 00:26:43,755 Heh. 726 00:26:43,755 --> 00:26:44,380 So this is fun. 727 00:26:44,380 --> 00:26:45,850 So causal links. 728 00:26:45,850 --> 00:26:48,610 A causal link is a fundamental tool. 729 00:26:48,610 --> 00:26:50,530 A causal link is simply saying, if one thing 730 00:26:50,530 --> 00:26:52,240 changes, another thing changes. 731 00:26:52,240 --> 00:26:53,292 This is super simple. 732 00:26:53,292 --> 00:26:54,500 So let's start with this one. 733 00:26:54,500 --> 00:26:55,630 You have production, you have inventory, 734 00:26:55,630 --> 00:26:56,590 and you have shipments. 735 00:26:56,590 --> 00:27:00,100 If I increase my production of a good, 736 00:27:00,100 --> 00:27:03,220 what should happen to my inventory? 737 00:27:03,220 --> 00:27:04,750 This is not a trick question. 738 00:27:04,750 --> 00:27:05,710 AUDIENCE: Goes up. 739 00:27:05,710 --> 00:27:06,190 JAMES PAINE: It's going to go up. 740 00:27:06,190 --> 00:27:06,700 Woo! 741 00:27:06,700 --> 00:27:08,920 If I increase my shipments of a good, 742 00:27:08,920 --> 00:27:10,690 what's going to happen to my inventory? 743 00:27:10,690 --> 00:27:11,848 It's going to go down. 744 00:27:11,848 --> 00:27:12,640 You guys are-- woo! 745 00:27:12,640 --> 00:27:13,432 You guys are great. 746 00:27:13,432 --> 00:27:14,470 So there you go. 747 00:27:14,470 --> 00:27:17,637 So these are causal links. 748 00:27:17,637 --> 00:27:18,970 Let's say I have orders booked-- 749 00:27:18,970 --> 00:27:20,730 and now, you could argue that there's 750 00:27:20,730 --> 00:27:22,480 some missing components in here, but we're 751 00:27:22,480 --> 00:27:23,980 keeping this really high-level.
752 00:27:23,980 --> 00:27:26,950 If I have more salesforce in general, 753 00:27:26,950 --> 00:27:29,200 what do you think would happen to the number of orders 754 00:27:29,200 --> 00:27:31,180 I have booked? 755 00:27:31,180 --> 00:27:31,960 Increase, yep. 756 00:27:31,960 --> 00:27:34,760 If I increase my price, how many orders 757 00:27:34,760 --> 00:27:36,010 do you think I'm going to get? 758 00:27:36,010 --> 00:27:37,240 They're going to go down, exactly, yeah. 759 00:27:37,240 --> 00:27:38,380 Pretty straightforward. 760 00:27:38,380 --> 00:27:40,840 One more and then I'll leave you guys alone with this one. 761 00:27:40,840 --> 00:27:41,870 Population. 762 00:27:41,870 --> 00:27:44,135 More people are born, it's going to go up. 763 00:27:44,135 --> 00:27:45,760 More people die, it's going to go down. 764 00:27:45,760 --> 00:27:46,260 Great. 765 00:27:46,260 --> 00:27:48,320 Causal links, straightforward. 766 00:27:48,320 --> 00:27:51,705 Where it gets a little tricky is when you have an ambiguity. 767 00:27:51,705 --> 00:27:53,330 If all of a sudden you say to yourself, 768 00:27:53,330 --> 00:27:55,497 well, you know, salesforce, I don't know about that. 769 00:27:55,497 --> 00:27:57,772 Maybe if I hire too many people with my salesforce, 770 00:27:57,772 --> 00:27:59,980 then suddenly they start conflicting with each other. 771 00:27:59,980 --> 00:28:01,647 Maybe they start going after each other. 772 00:28:01,647 --> 00:28:04,357 Maybe they start overlapping, and maybe customers 773 00:28:04,357 --> 00:28:06,190 get annoyed because they get too many calls, 774 00:28:06,190 --> 00:28:07,595 so maybe it will go down. 775 00:28:07,595 --> 00:28:09,970 The moment you have this idea that that causal sign could 776 00:28:09,970 --> 00:28:12,120 be a plus or minus, that implies that there's 777 00:28:12,120 --> 00:28:13,120 something in the middle. 
778 00:28:13,120 --> 00:28:15,610 That means that that loop does not exist in and of itself, 779 00:28:15,610 --> 00:28:19,688 there is structure that you have missing that you need to add. 780 00:28:19,688 --> 00:28:21,730 Now, something else along the same idea-- 781 00:28:21,730 --> 00:28:22,917 causal thinking. 782 00:28:22,917 --> 00:28:25,000 And some of you guys might have seen this already. 783 00:28:25,000 --> 00:28:27,010 Causal thinking is truly causal thinking. 784 00:28:27,010 --> 00:28:30,620 It needs to be causal, not purely correlational. 785 00:28:30,620 --> 00:28:32,560 So ice cream-- so this is generally 786 00:28:32,560 --> 00:28:36,122 true-- as ice cream sales go up, so does the murder rate. 787 00:28:36,122 --> 00:28:38,080 So next time they open up a new Baskin-Robbins, 788 00:28:38,080 --> 00:28:39,288 we should maybe change towns. 789 00:28:39,288 --> 00:28:40,810 I guess that's what that means. 790 00:28:40,810 --> 00:28:42,790 Now, as you guys can probably guess, what that really means 791 00:28:42,790 --> 00:28:44,207 is that there's something missing. 792 00:28:44,207 --> 00:28:47,367 This is not truly causal; instead 793 00:28:47,367 --> 00:28:48,700 we have the average temperature. 794 00:28:48,700 --> 00:28:50,173 As the average temperature goes up, 795 00:28:50,173 --> 00:28:52,090 people start interacting with each other more, 796 00:28:52,090 --> 00:28:53,350 they start going outside more. 797 00:28:53,350 --> 00:28:55,810 Just the number of incidents of people interacting 798 00:28:55,810 --> 00:28:57,560 goes up, and the murder rate happens to go up. 799 00:28:57,560 --> 00:28:58,880 Additionally, people want more ice cream. 800 00:28:58,880 --> 00:28:59,320 So there. 801 00:28:59,320 --> 00:29:01,112 In that case, we have something in between.
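The ice cream example is easy to reproduce: two synthetic series that never influence each other still correlate strongly once a common driver (temperature) pushes both. Everything below is made-up data for illustration:

```python
# Ice cream sales and "incidents" each depend on temperature, never on
# each other -- and still correlate strongly. All data are synthetic.

import math
import random

def pearson(xs, ys):
    """Plain Pearson correlation coefficient, no libraries needed."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

random.seed(42)
days = 365
# The common driver: a seasonal temperature cycle.
temperature = [20 + 15 * math.sin(2 * math.pi * d / days) for d in range(days)]
# Each outcome is temperature plus its own independent noise -- there is
# no causal arrow between the two outcomes themselves.
ice_cream = [5 * t + random.gauss(0, 10) for t in temperature]
incidents = [2 * t + random.gauss(0, 8) for t in temperature]

r = pearson(ice_cream, incidents)
print(f"correlation(ice cream, incidents) = {r:.2f}")  # strong, yet not causal
```

Condition on the driver-- hold temperature fixed-- and the relationship between the two outcomes evaporates, which is exactly the "something in between" point.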
802 00:29:01,112 --> 00:29:03,640 So something to keep in mind when you build a causal loop 803 00:29:03,640 --> 00:29:05,042 diagram is that word causal, when 804 00:29:05,042 --> 00:29:06,250 you're putting this together. 805 00:29:06,250 --> 00:29:07,600 And then, if you have time, I would 806 00:29:07,600 --> 00:29:09,850 encourage you guys to go just like, I'm going to try-- 807 00:29:09,850 --> 00:29:13,648 I'm going to run these slides by the faculty 808 00:29:13,648 --> 00:29:15,190 and make sure there's nothing in here 809 00:29:15,190 --> 00:29:17,350 that I'm not supposed to share, but assuming I can, 810 00:29:17,350 --> 00:29:18,820 I'm going to send this out to everybody as well so you'll 811 00:29:18,820 --> 00:29:20,110 have access to these slides. 812 00:29:20,110 --> 00:29:23,050 One of my favorite websites, Spurious Correlations. 813 00:29:23,050 --> 00:29:25,400 They have all sorts of fun little things like that. 814 00:29:25,400 --> 00:29:28,190 So, now, bringing it all together into a loop. 815 00:29:28,190 --> 00:29:31,750 So in this case, we have a basic loop of employee skill 816 00:29:31,750 --> 00:29:33,910 versus customer satisfaction, complaints, 817 00:29:33,910 --> 00:29:36,010 manager's time spent resolving customer issues, 818 00:29:36,010 --> 00:29:37,760 and manager time spent coaching employees. 819 00:29:37,760 --> 00:29:39,677 I'm going to walk through this really quickly. 820 00:29:39,677 --> 00:29:41,470 Again, we only have a few hours to talk. 821 00:29:41,470 --> 00:29:43,240 So normally, this would be a little-- if you guys end up 822 00:29:43,240 --> 00:29:44,990 taking one of the system dynamics classes, 823 00:29:44,990 --> 00:29:46,600 we step through this very carefully. 824 00:29:46,600 --> 00:29:47,710 Yeah. 825 00:29:47,710 --> 00:29:50,400 AUDIENCE: What do you mean by causal? 
826 00:29:50,400 --> 00:29:52,150 JAMES PAINE: So causal, in and of itself-- 827 00:29:52,150 --> 00:29:53,358 you raise an excellent point. 828 00:29:53,358 --> 00:29:55,540 Causal, in and of itself, is the degree to which 829 00:29:55,540 --> 00:29:57,230 you have confidence in it. 830 00:29:57,230 --> 00:29:59,560 So at the end of the day, causality 831 00:29:59,560 --> 00:30:04,270 is the realm of mathematicians and physicists, in my opinion. 832 00:30:04,270 --> 00:30:07,220 True, truly unresolved causality. 833 00:30:07,220 --> 00:30:11,080 So causality in the sense that one thing 834 00:30:11,080 --> 00:30:13,028 occurs if, and only if, 835 00:30:13,028 --> 00:30:15,070 the other thing 836 00:30:15,070 --> 00:30:18,100 occurs-- that they truly are connected in time and space. 837 00:30:18,100 --> 00:30:20,260 When one happens, the other one follows. 838 00:30:20,260 --> 00:30:23,362 Correlation is when that may still hold true, 839 00:30:23,362 --> 00:30:24,820 but the reason they're happening is 840 00:30:24,820 --> 00:30:28,060 because of an in-between mechanism that is driving both. 841 00:30:28,060 --> 00:30:31,480 So where this gets kind of tricky is, 842 00:30:31,480 --> 00:30:34,540 the lines of causality become blurred 843 00:30:34,540 --> 00:30:37,150 when you start talking about macro-social systems. 844 00:30:37,150 --> 00:30:41,600 And ultimately, this is the same idea that all models are wrong, 845 00:30:41,600 --> 00:30:42,950 but some models are useful. 846 00:30:42,950 --> 00:30:44,492 There's a certain point in time where 847 00:30:44,492 --> 00:30:46,190 you have to pump the brakes and say, 848 00:30:46,190 --> 00:30:48,978 all right, this is causal enough for my purposes 849 00:30:48,978 --> 00:30:50,020 and I'm going to move on.
850 00:30:50,020 --> 00:30:52,062 But the same thing applies-- as you said-- 851 00:30:52,062 --> 00:30:55,510 the moment that loop polarity 852 00:30:55,510 --> 00:30:58,750 can shift. Suddenly, something increasing causes 853 00:30:58,750 --> 00:30:59,952 another thing to increase, 854 00:30:59,952 --> 00:31:01,660 and then that same thing increasing 855 00:31:01,660 --> 00:31:03,430 is causing that other thing to decrease. 856 00:31:03,430 --> 00:31:05,600 Those are no longer causally related in space 857 00:31:05,600 --> 00:31:06,100 and time. 858 00:31:06,100 --> 00:31:07,160 There's something in between. 859 00:31:07,160 --> 00:31:08,535 There's some mechanism in between 860 00:31:08,535 --> 00:31:10,870 such that this thing going up 861 00:31:10,870 --> 00:31:12,880 causes this other thing to go up and/or down 862 00:31:12,880 --> 00:31:15,250 depending upon some in-between circumstance. 863 00:31:15,250 --> 00:31:16,960 The moment that degree of specificity 864 00:31:16,960 --> 00:31:19,480 is necessary, then suddenly that causal link needs to get 865 00:31:19,480 --> 00:31:22,727 split out into more detailed structure. 866 00:31:22,727 --> 00:31:24,310 Maybe I'm diving into too much detail. 867 00:31:24,310 --> 00:31:25,430 Is that helpful? 868 00:31:25,430 --> 00:31:25,930 Kind of?
869 00:31:25,930 --> 00:31:26,980 AUDIENCE: [INAUDIBLE] you mentioned 870 00:31:26,980 --> 00:31:29,140 physics, but like there's a perspective of physics 871 00:31:29,140 --> 00:31:32,050 in which you actually look at the physics equations, 872 00:31:32,050 --> 00:31:33,840 there is no real such thing as causality, 873 00:31:33,840 --> 00:31:35,590 there are just states of the universe that 874 00:31:35,590 --> 00:31:38,560 are mathematically consistent and states of the universe that 875 00:31:38,560 --> 00:31:43,270 are not, so causality-- so it's like, causality really 876 00:31:43,270 --> 00:31:46,750 depends on the assumptions of your system. 877 00:31:46,750 --> 00:31:48,160 Like, you're assuming that things 878 00:31:48,160 --> 00:31:51,050 could have been different, but in reality, they couldn't have. 879 00:31:51,050 --> 00:31:53,300 So I guess I was-- 880 00:31:53,300 --> 00:31:55,105 the operational definition of [INAUDIBLE].. 881 00:31:55,105 --> 00:31:56,140 JAMES PAINE: You're completely right. 882 00:31:56,140 --> 00:31:58,150 Is your background either physics or mathematics? 883 00:31:58,150 --> 00:31:58,720 AUDIENCE: Yeah, physics. 884 00:31:58,720 --> 00:31:59,630 JAMES PAINE: Aha, there we go. 885 00:31:59,630 --> 00:31:59,890 So yeah. 886 00:31:59,890 --> 00:32:01,432 No, no, you raise an excellent point, 887 00:32:01,432 --> 00:32:02,890 because I had the same conversation 888 00:32:02,890 --> 00:32:05,170 with a couple of other folks in the physics group. 889 00:32:05,170 --> 00:32:08,830 And within our-- what it boils down to-- within our social 890 00:32:08,830 --> 00:32:13,150 system modeling, our definition of causality is loose at best-- 891 00:32:13,150 --> 00:32:14,770 but ultimately, it boils down to, 892 00:32:14,770 --> 00:32:17,030 if one thing moves in one direction, 893 00:32:17,030 --> 00:32:18,820 another thing moves in the same direction 894 00:32:18,820 --> 00:32:21,592 consistently every time. 
895 00:32:21,592 --> 00:32:23,800 The moment that you can't say the phrase consistently 896 00:32:23,800 --> 00:32:27,567 every time, then that concept of causality breaks down. 897 00:32:27,567 --> 00:32:28,150 Just so you... 898 00:32:28,150 --> 00:32:28,840 Yeah. 899 00:32:28,840 --> 00:32:31,048 AUDIENCE: So when you're dealing with social systems, 900 00:32:31,048 --> 00:32:33,280 does it become a little bit less [INAUDIBLE] 901 00:32:33,280 --> 00:32:36,010 and more statistical? 902 00:32:36,010 --> 00:32:36,813 JAMES PAINE: Yeah. 903 00:32:36,813 --> 00:32:38,230 To be very honest with you, I have 904 00:32:38,230 --> 00:32:40,438 to be careful because I think some folks in the group 905 00:32:40,438 --> 00:32:41,740 might disagree with me on this. 906 00:32:41,740 --> 00:32:43,990 But again, my view, coming from control theory 907 00:32:43,990 --> 00:32:48,550 and operations management, is that the realm of causality 908 00:32:48,550 --> 00:32:53,290 and the realm of correlation are degrees of convenience 909 00:32:53,290 --> 00:32:55,003 at the end of the day. 910 00:32:55,003 --> 00:32:56,920 This is a gigantic asterisk in the top corner. 911 00:32:56,920 --> 00:32:58,378 This is the opinion of James Paine. 912 00:32:58,378 --> 00:33:02,860 Please do not necessarily tell other faculty that that's me. 913 00:33:02,860 --> 00:33:04,568 But yes, it is a division of convenience. 914 00:33:04,568 --> 00:33:06,277 And ultimately, what it boils down to for me 915 00:33:06,277 --> 00:33:08,380 is that, if you shove something in one direction, 916 00:33:08,380 --> 00:33:10,000 does another thing consistently move 917 00:33:10,000 --> 00:33:12,490 the same way every single time? 918 00:33:12,490 --> 00:33:15,212 The moment that it doesn't, then they're no longer 919 00:33:15,212 --> 00:33:16,420 coupled within these systems. 920 00:33:16,420 --> 00:33:19,930 That's the very, sort of, hand-wavy, simplistic way 921 00:33:19,930 --> 00:33:20,680 of approaching it. 
922 00:33:20,680 --> 00:33:22,033 AUDIENCE: All else being equal. 923 00:33:22,033 --> 00:33:23,450 JAMES PAINE: All else being equal. 924 00:33:23,450 --> 00:33:24,283 That's a good point. 925 00:33:24,283 --> 00:33:25,370 All else being equal. 926 00:33:25,370 --> 00:33:25,870 So here. 927 00:33:25,870 --> 00:33:26,890 I'm going to go through these loops real quick. 928 00:33:26,890 --> 00:33:27,220 This is good. 929 00:33:27,220 --> 00:33:29,210 So this, I think, is pretty straightforward. 930 00:33:29,210 --> 00:33:31,627 This is the whole idea that, as your employee skill-- this 931 00:33:31,627 --> 00:33:33,712 is, again, very general-- employee skill goes up, 932 00:33:33,712 --> 00:33:35,170 your customer satisfaction goes up. 933 00:33:35,170 --> 00:33:38,140 Customer satisfaction goes up, complaints go down. 934 00:33:38,140 --> 00:33:40,180 The time that your manager now spends 935 00:33:40,180 --> 00:33:42,350 resolving customer issues goes down, 936 00:33:42,350 --> 00:33:44,517 so now the manager has more time to coach employees. 937 00:33:44,517 --> 00:33:45,100 So guess what? 938 00:33:45,100 --> 00:33:46,520 Employee skill goes up. 939 00:33:46,520 --> 00:33:48,265 This is an example of a causal loop. 940 00:33:48,265 --> 00:33:50,173 You start at one variable and work your way 941 00:33:50,173 --> 00:33:52,090 all the way around to the same variable again. 942 00:33:52,090 --> 00:33:53,770 Now, I'm speeding up here just a little bit 943 00:33:53,770 --> 00:33:55,270 for the sake of time, but let's just 944 00:33:55,270 --> 00:33:58,270 go ahead and say that we somehow exogenously increased 945 00:33:58,270 --> 00:33:59,770 the amount of time that managers can 946 00:33:59,770 --> 00:34:01,710 spend coaching their employees. 947 00:34:01,710 --> 00:34:04,570 I don't know, we somehow give them extra time in their day. 948 00:34:04,570 --> 00:34:05,750 What does that mean? 
949 00:34:05,750 --> 00:34:07,490 Well, this would imply that you'd expect 950 00:34:07,490 --> 00:34:08,998 our employee skill to go up. 951 00:34:08,998 --> 00:34:10,790 And if our employee skill is expected to go up, 952 00:34:10,790 --> 00:34:12,760 then we expect our customer satisfaction to go up. 953 00:34:12,760 --> 00:34:15,070 We'd expect our complaints to correspondingly go down, 954 00:34:15,070 --> 00:34:16,870 based upon these causal diagrams. 955 00:34:16,870 --> 00:34:20,320 If our complaints go down, then the manager time 956 00:34:20,320 --> 00:34:23,625 spent resolving issues goes back down, which suddenly means 957 00:34:23,625 --> 00:34:25,750 that the amount of time they have available to them 958 00:34:25,750 --> 00:34:26,542 goes back up again. 959 00:34:26,542 --> 00:34:28,208 We've gone all the way around the loop, 960 00:34:28,208 --> 00:34:30,130 we've increased one variable by a little bit, 961 00:34:30,130 --> 00:34:32,560 and we've gotten a positive gain around that loop. 962 00:34:32,560 --> 00:34:34,870 If any of you have any experience with control theory, 963 00:34:34,870 --> 00:34:35,870 this will look familiar. 964 00:34:35,870 --> 00:34:38,630 This is a positive gain or reinforcing loop. 965 00:34:38,630 --> 00:34:41,293 This is a loop that, essentially, runs away 966 00:34:41,293 --> 00:34:42,460 in one direction or another. 967 00:34:42,460 --> 00:34:44,877 When one variable starts going a little bit up or a little 968 00:34:44,877 --> 00:34:47,860 bit down, by its action, it causes a cascade 969 00:34:47,860 --> 00:34:50,320 of events that causes it to go more up or more down 970 00:34:50,320 --> 00:34:52,000 within that space. 971 00:34:52,000 --> 00:34:53,920 Correspondingly, we can have something-- 972 00:34:53,920 --> 00:34:54,880 I think you guys can tell where this 973 00:34:54,880 --> 00:34:56,530 is going-- called the balancing loop, to get ahead 974 00:34:56,530 --> 00:34:57,440 of myself a little bit. 
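The loop tracing in this passage can be sketched numerically. Below is a minimal, illustrative Python sketch, not part of the lecture: the function name, the 10% loop gain, and the number of trips around the loop are all assumptions chosen just to show how a positive-gain (reinforcing) loop amplifies a small initial change.

```python
# Minimal sketch of a reinforcing (positive-gain) loop: a small bump
# to manager coaching time compounds each trip around the loop.
# The 10% loop gain and trip count are illustrative assumptions.

def run_reinforcing_loop(initial_bump=1.0, loop_gain=1.1, trips=5):
    """Follow a perturbation around the loop `trips` times.

    A loop gain above 1 means each full circuit (coaching time -> skill ->
    satisfaction -> fewer complaints -> less time resolving -> more
    coaching time) amplifies the original change, so the loop runs away.
    """
    bump = initial_bump
    history = [bump]
    for _ in range(trips):
        bump *= loop_gain  # each circuit multiplies the change
        history.append(bump)
    return history

history = run_reinforcing_loop()
print(history)  # each entry is larger than the last: the loop reinforces itself
```

With a loop gain below 1, the same code would show the perturbation dying away instead, which is the control-theory view of the distinction James is drawing.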
975 00:34:57,440 --> 00:34:59,440 So in this case, as the general attractiveness of your market 976 00:34:59,440 --> 00:35:01,510 goes up, the number of competitors within that space-- 977 00:35:01,510 --> 00:35:02,890 you could imagine-- would go up. 978 00:35:02,890 --> 00:35:05,827 Your product price-- this is due to competitive interactions-- 979 00:35:05,827 --> 00:35:06,410 might go down. 980 00:35:06,410 --> 00:35:10,577 This is an example of an aggregated causal diagram. 981 00:35:10,577 --> 00:35:12,910 I think you can make a really strong argument that those 982 00:35:12,910 --> 00:35:15,530 are not necessarily perfectly causally related, 983 00:35:15,530 --> 00:35:18,760 but for our conversation, it's good enough. 984 00:35:18,760 --> 00:35:20,110 Product price goes down. 985 00:35:20,110 --> 00:35:21,760 Your profits, all else being equal, 986 00:35:21,760 --> 00:35:25,140 you'd imagine would go down. 987 00:35:25,140 --> 00:35:26,180 That's-- OK. 988 00:35:26,180 --> 00:35:27,930 I think I did that one a little backwards. 989 00:35:27,930 --> 00:35:28,350 We'll find out. 990 00:35:28,350 --> 00:35:29,320 Anyway, balancing loop. 991 00:35:29,320 --> 00:35:31,028 So the whole idea with the balancing loop 992 00:35:31,028 --> 00:35:31,860 is that these 993 00:35:31,860 --> 00:35:36,330 are loops that tend to balance themselves out. 994 00:35:36,330 --> 00:35:38,250 You have some sort of external stimulus 995 00:35:38,250 --> 00:35:40,410 that pokes a variable up or pokes a variable down, 996 00:35:40,410 --> 00:35:43,050 and the loop will act in a way to return it to its initial position. 997 00:35:43,050 --> 00:35:45,300 A good example of a balancing loop that comes up a lot 998 00:35:45,300 --> 00:35:46,890 is called the goal-seeking loop. 
999 00:35:46,890 --> 00:35:49,350 In this case, you have some sort of desired performance out 1000 00:35:49,350 --> 00:35:52,230 in the universe and your system will 1001 00:35:52,230 --> 00:35:54,480 act to close a gap relative to that. 1002 00:35:54,480 --> 00:35:56,670 I'm showing you guys these structures relatively 1003 00:35:56,670 --> 00:35:59,740 quickly to give you some chunks of system dynamics 1004 00:35:59,740 --> 00:36:02,740 compartmental models that you might see frequently. 1005 00:36:02,740 --> 00:36:05,525 So the right way to tell if you have a balancing or reinforcing 1006 00:36:05,525 --> 00:36:07,650 loop is you trace the behavior, like we did before. 1007 00:36:07,650 --> 00:36:09,692 You pick a variable, you kick it in a direction-- 1008 00:36:09,692 --> 00:36:10,598 plus or minus-- 1009 00:36:10,598 --> 00:36:13,140 and you see what happens when you loop it all the way around. 1010 00:36:13,140 --> 00:36:16,620 There's a quick way to do it, which is count the negative links. 1011 00:36:16,620 --> 00:36:18,810 If there is an even number of them, 1012 00:36:18,810 --> 00:36:20,520 including zero, then it is a reinforcing loop. 1013 00:36:20,520 --> 00:36:23,190 If it is an odd number, then it's a balancing loop. 1014 00:36:23,190 --> 00:36:25,440 The part where that gets a little wonky is that if you 1015 00:36:25,440 --> 00:36:27,300 mislabeled a link in some way, shape, or form, then suddenly 1016 00:36:27,300 --> 00:36:28,425 that no longer makes sense. 1017 00:36:28,425 --> 00:36:31,140 So that's my little asterisk in the top corner there. 1018 00:36:31,140 --> 00:36:32,320 Stocks and flows. 1019 00:36:32,320 --> 00:36:33,840 So stocks and flows, ultimately-- 1020 00:36:33,840 --> 00:36:36,780 I'm going to skip ahead a little bit here-- 1021 00:36:36,780 --> 00:36:40,000 a stock is anything that has memory within this space. 
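The even/odd counting rule for loop polarity is mechanical enough to write down directly. This is an illustrative sketch, not lecture material; the '+'/'-' encoding of link polarities and the two example loops are assumptions based on the coaching and market diagrams described in the lecture.

```python
# Polarity check for a causal loop, per the counting rule:
# an even number of negative links (including zero) means reinforcing,
# an odd number means balancing. Links are encoded as '+' or '-'.

def classify_loop(link_polarities):
    negatives = sum(1 for p in link_polarities if p == '-')
    return "reinforcing" if negatives % 2 == 0 else "balancing"

# Coaching loop sketch: skill -> satisfaction (+),
# satisfaction -> complaints (-), complaints -> time resolving (+),
# time resolving -> coaching time (-), coaching time -> skill (+).
# Two negative links, so the loop reinforces.
print(classify_loop(['+', '-', '+', '-', '+']))  # reinforcing

# Market loop sketch: attractiveness -> competitors (+),
# competitors -> price (-), price -> profits (+),
# profits -> attractiveness (+). One negative link, so it balances.
print(classify_loop(['+', '-', '+', '+']))  # balancing
```

As the lecture warns, the rule is only as good as the link labels: mislabel one polarity and the classification flips.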
1022 00:36:40,000 --> 00:36:42,640 So if you go back to the origins of system dynamics 1023 00:36:42,640 --> 00:36:45,150 and its applications in control theory, 1024 00:36:45,150 --> 00:36:47,430 a stock is anything whose value depends 1025 00:36:47,430 --> 00:36:50,400 upon the accumulation of the change 1026 00:36:50,400 --> 00:36:51,580 of other variables over time. 1027 00:36:51,580 --> 00:36:54,060 So how we tend to show this in our space 1028 00:36:54,060 --> 00:36:56,130 is as a bucket with some pipes. 1029 00:36:56,130 --> 00:36:57,780 So you have an inflow coming in, you 1030 00:36:57,780 --> 00:36:59,400 have some sort of accumulation in the middle, 1031 00:36:59,400 --> 00:37:01,567 and you have some outflow coming out of the back end 1032 00:37:01,567 --> 00:37:02,080 right there. 1033 00:37:02,080 --> 00:37:04,630 This right here, now, allows your system to have memory. 1034 00:37:04,630 --> 00:37:06,922 This is what your system has when you talk about the number 1035 00:37:06,922 --> 00:37:10,200 of employees, or the number 1036 00:37:10,200 --> 00:37:12,150 of units in your inventory, something else 1037 00:37:12,150 --> 00:37:14,070 that has sort of stickiness in time, 1038 00:37:14,070 --> 00:37:18,060 something that doesn't instantaneously disappear or change 1039 00:37:18,060 --> 00:37:22,120 within one unit of delta time but rather exists and persists. 1040 00:37:22,120 --> 00:37:24,210 So we use something called the hydraulic metaphor. 1041 00:37:24,210 --> 00:37:25,770 This picture-- the quality of it is really low, 1042 00:37:25,770 --> 00:37:26,730 and I should remake it. 1043 00:37:26,730 --> 00:37:27,810 It's the idea of a bathtub. 1044 00:37:27,810 --> 00:37:30,435 So you have your pipe coming in, you have your pipe coming out. 
1045 00:37:30,435 --> 00:37:35,610 And ultimately, you can't get rid of something in the middle 1046 00:37:35,610 --> 00:37:37,655 unless you take it out of the back end 1047 00:37:37,655 --> 00:37:39,780 or turn down the amount of inflow on the front end. 1048 00:37:39,780 --> 00:37:41,700 So a good example of this is greenhouse gases 1049 00:37:41,700 --> 00:37:42,900 in the atmosphere. 1050 00:37:42,900 --> 00:37:43,920 What do you do about it? 1051 00:37:43,920 --> 00:37:46,170 There's really only two options at the end of the day. 1052 00:37:46,170 --> 00:37:48,900 You have to either increase net removal 1053 00:37:48,900 --> 00:37:50,910 or decrease net emissions. 1054 00:37:50,910 --> 00:37:52,120 Those are your two choices. 1055 00:37:52,120 --> 00:37:53,703 So that's the one thing we talk about. 1056 00:37:53,703 --> 00:37:56,838 A stock is something that accumulates over time. 1057 00:37:56,838 --> 00:37:58,380 So this is a good example right here. 1058 00:37:58,380 --> 00:38:00,600 This is just some stocks versus flows. 1059 00:38:00,600 --> 00:38:03,240 Your balance sheet versus your cash flow statement. 1060 00:38:03,240 --> 00:38:07,620 Wealth versus income and expenditures per unit time. 1061 00:38:07,620 --> 00:38:09,330 CO2 emissions, vehicle production-- those 1062 00:38:09,330 --> 00:38:10,020 are all flows. 1063 00:38:10,020 --> 00:38:11,603 Part of this-- you can tell-- is based 1064 00:38:11,603 --> 00:38:13,343 upon how you define the variables 1065 00:38:13,343 --> 00:38:14,760 at the end of the day, so you have 1066 00:38:14,760 --> 00:38:15,978 to be a little bit careful. 1067 00:38:15,978 --> 00:38:17,020 But I will give you this. 1068 00:38:17,020 --> 00:38:18,978 Here's one that I kind of consider a little bit 1069 00:38:18,978 --> 00:38:20,332 tricky-- interest rate. 1070 00:38:20,332 --> 00:38:22,040 What would you consider to be an interest 1071 00:38:22,040 --> 00:38:24,635 rate, a stock or a flow? 
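The hydraulic metaphor can be made concrete with a few lines of simulation. This is a minimal sketch, assuming constant flows and simple Euler integration; the function name, flow values, and time step are illustrative, not from the lecture.

```python
# Hydraulic-metaphor sketch: a stock (the bathtub level) accumulates
# the difference between its inflow and outflow over time.
# All numbers here are illustrative assumptions.

def simulate_stock(initial, inflow, outflow, dt=1.0, steps=10):
    """Euler-integrate a single stock: stock += (inflow - outflow) * dt.

    The stock only falls when outflow exceeds inflow; cutting the
    inflow to merely equal the outflow just holds the level constant.
    """
    stock = initial
    levels = [stock]
    for _ in range(steps):
        stock += (inflow - outflow) * dt
        levels.append(stock)
    return levels

# Inflow equal to outflow: the level persists -- the system "remembers".
steady = simulate_stock(initial=100.0, inflow=5.0, outflow=5.0)
# Outflow above inflow: only now does the accumulated stock drain.
draining = simulate_stock(initial=100.0, inflow=5.0, outflow=7.0)
print(steady[-1], draining[-1])  # 100.0 80.0
```

This is the greenhouse gas point in miniature: the atmospheric stock only comes down when net removal exceeds net emissions.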
1072 00:38:24,635 --> 00:38:25,713 AUDIENCE: [INAUDIBLE] 1073 00:38:25,713 --> 00:38:27,880 JAMES PAINE: I called it tricky, so you can probably 1074 00:38:27,880 --> 00:38:29,060 think, like, oh, take what you think it is 1075 00:38:29,060 --> 00:38:30,130 and then just flip the answer. 1076 00:38:30,130 --> 00:38:31,088 That's what I would do. 1077 00:38:31,088 --> 00:38:33,160 So it has the word rate in it, which immediately 1078 00:38:33,160 --> 00:38:35,590 makes me think flow, but it's actually a rate on money-- 1079 00:38:35,590 --> 00:38:36,523 it is a price. 1080 00:38:36,523 --> 00:38:38,440 It is something that says that, if you give me 1081 00:38:38,440 --> 00:38:39,940 x number of dollars, I will give you 1082 00:38:39,940 --> 00:38:44,410 back x plus some number of dollars over some period of time elapsing. 1083 00:38:44,410 --> 00:38:46,690 In that case, it is a price, it is a sticky quantity, 1084 00:38:46,690 --> 00:38:50,140 it is a stock, so the word rate there is misleading. 1085 00:38:50,140 --> 00:38:52,630 I'm going very quickly, guys, because I want to give you 1086 00:38:52,630 --> 00:38:54,110 time for the simulation. 1087 00:38:54,110 --> 00:38:57,130 So that right there is really quick system dynamics 1088 00:38:57,130 --> 00:38:59,837 in a nutshell, in terms of both systems thinking concepts 1089 00:38:59,837 --> 00:39:01,420 and also some of the fundamental tools 1090 00:39:01,420 --> 00:39:04,360 that we use within stock and flow diagrams. 1091 00:39:04,360 --> 00:39:06,970 What I wanted to do next was transition into actually a very 1092 00:39:06,970 --> 00:39:08,740 large group project that will give us some time 1093 00:39:08,740 --> 00:39:10,510 to really experience some of this firsthand, 1094 00:39:10,510 --> 00:39:12,020 and then ultimately, tie it back in. 1095 00:39:12,020 --> 00:39:14,228 But before I do that, I want to give us a few minutes 1096 00:39:14,228 --> 00:39:15,170 to talk about this. 
1097 00:39:15,170 --> 00:39:17,500 Anyone have any questions about either System Dynamics Group 1098 00:39:17,500 --> 00:39:19,083 at MIT-- and we'll have some more time 1099 00:39:19,083 --> 00:39:21,400 at the end of this-- or system dynamics and systems 1100 00:39:21,400 --> 00:39:24,250 thinking as methodology at this point? 1101 00:39:26,920 --> 00:39:27,910 Yeah. 1102 00:39:27,910 --> 00:39:30,100 AUDIENCE: When do you think-- or when 1103 00:39:30,100 --> 00:39:32,560 is a problem too complicated for you to say 1104 00:39:32,560 --> 00:39:34,510 system dynamics is useful? 1105 00:39:34,510 --> 00:39:36,040 What are the bounds on usefulness 1106 00:39:36,040 --> 00:39:37,983 of applying this to a problem? 1107 00:39:37,983 --> 00:39:39,400 JAMES PAINE: Well, the weird thing 1108 00:39:39,400 --> 00:39:41,050 is, I won't necessarily say-- complicated 1109 00:39:41,050 --> 00:39:41,967 is not the right word. 1110 00:39:41,967 --> 00:39:43,690 I would say that system dynamics is 1111 00:39:43,690 --> 00:39:49,750 a poor predictor of the future when it comes to precision. 1112 00:39:49,750 --> 00:39:52,540 So like, there's an example of system dynamics, I think, 1113 00:39:52,540 --> 00:39:55,150 is good at making models that predict modes of behavior 1114 00:39:55,150 --> 00:39:57,770 but not necessarily point predictions in the future. 1115 00:39:57,770 --> 00:40:02,155 So like, a good example would be an oscillatory mode. 
1116 00:40:04,720 --> 00:40:08,050 If this is your idea of-- this is like your mode of behavior 1117 00:40:08,050 --> 00:40:09,070 of something over time-- 1118 00:40:09,070 --> 00:40:12,670 I don't know-- this is like some price of something over time, 1119 00:40:12,670 --> 00:40:15,850 and you make a model that ultimately shows 1120 00:40:15,850 --> 00:40:18,100 that, given some set of inputs, 1121 00:40:18,100 --> 00:40:21,040 I can affect the amplitude and periodicity 1122 00:40:21,040 --> 00:40:24,220 of this signal, and it does show something that 1123 00:40:24,220 --> 00:40:27,760 changes with time, but for some reason 1124 00:40:27,760 --> 00:40:31,030 my model is just a little bit off. 1125 00:40:31,030 --> 00:40:33,040 The question is, which is more useful-- 1126 00:40:33,040 --> 00:40:35,920 knowing that I have this sort of up-and-down amplitude 1127 00:40:35,920 --> 00:40:38,020 and here are the things or features 1128 00:40:38,020 --> 00:40:42,400 that predict a change in that amplitude and frequency, 1129 00:40:42,400 --> 00:40:44,530 or is it more useful to know that the price is 1130 00:40:44,530 --> 00:40:47,560 going to be this versus this at this time in the future? 1131 00:40:47,560 --> 00:40:50,332 And the issue with any sort of systems model 1132 00:40:50,332 --> 00:40:52,540 is, the moment you move far enough away from whatever 1133 00:40:52,540 --> 00:40:54,437 your current sort of stab in space 1134 00:40:54,437 --> 00:40:57,490 is, you can be just as wrong as you can possibly 1135 00:40:57,490 --> 00:40:59,960 be when it comes to point predictions in the future. 
1136 00:40:59,960 --> 00:41:01,690 So my argument is that system dynamics 1137 00:41:01,690 --> 00:41:05,980 is good at coming up with models where you can start looking 1138 00:41:05,980 --> 00:41:08,860 for these modes of behavior and identifying 1139 00:41:08,860 --> 00:41:12,340 the chunks of structure that have a large policy 1140 00:41:12,340 --> 00:41:15,445 impact on affecting or adjusting these modes of behavior, 1141 00:41:15,445 --> 00:41:17,320 not necessarily figuring out whether I'm going to be 1142 00:41:17,320 --> 00:41:19,270 here or here at a given time. 1143 00:41:19,270 --> 00:41:21,070 So that's-- yeah. 1144 00:41:23,600 --> 00:41:26,350 I got to use the whiteboard, so that's fun. 1145 00:41:26,350 --> 00:41:27,220 OK, cool. 1146 00:41:27,220 --> 00:41:30,255 So what we're going to do-- this is another bit of bread 1147 00:41:30,255 --> 00:41:32,130 and butter of the MIT System Dynamics Group-- 1148 00:41:32,130 --> 00:41:33,380 a management flight simulator. 1149 00:41:33,380 --> 00:41:35,927 This is the whole idea of working on mental models. 1150 00:41:35,927 --> 00:41:38,260 Ultimately, getting someone's hands dirty with something 1151 00:41:38,260 --> 00:41:39,760 like this and getting an opportunity 1152 00:41:39,760 --> 00:41:42,070 to experience it firsthand is one of the best ways 1153 00:41:42,070 --> 00:41:44,840 to teach concepts from system dynamics and systems thinking. 1154 00:41:44,840 --> 00:41:46,030 So Fishbanks. 1155 00:41:46,030 --> 00:41:50,530 Fishbanks was originally developed quite a number 1156 00:41:50,530 --> 00:41:52,120 of years ago as a board game-- 1157 00:41:52,120 --> 00:41:53,620 and I think you guys will figure out 1158 00:41:53,620 --> 00:41:56,340 very quickly what it's trying to elicit out of you all. 1159 00:41:56,340 --> 00:41:59,530 This is a smart group of folks. 1160 00:41:59,530 --> 00:42:01,030 But it has now been turned into kind 1161 00:42:01,030 --> 00:42:02,660 of this interactive simulation. 
1162 00:42:02,660 --> 00:42:06,522 So for the next hour and some change as we go through this-- 1163 00:42:06,522 --> 00:42:08,480 I know everyone here has different backgrounds, 1164 00:42:08,480 --> 00:42:10,450 different businesses, different things you've been exposed to, 1165 00:42:10,450 --> 00:42:12,033 but this is going to be your business. 1166 00:42:12,033 --> 00:42:14,480 Your business is going to be deep-sea and coastal fisher 1167 00:42:14,480 --> 00:42:14,980 people. 1168 00:42:14,980 --> 00:42:18,192 Your job is to go out there, catch fish as best you can, 1169 00:42:18,192 --> 00:42:19,150 and make as much money as you can. 1170 00:42:19,150 --> 00:42:22,030 You are all endowed with your own small fleet 1171 00:42:22,030 --> 00:42:25,000 of boats and your own commercial fishing operation. 1172 00:42:25,000 --> 00:42:26,650 Congratulations, welcome aboard. 1173 00:42:32,870 --> 00:42:35,240 So that's all I'm going to say about Fishbanks. 1174 00:42:35,240 --> 00:42:38,000 Again, there's a lot more stuff to say about this, 1175 00:42:38,000 --> 00:42:40,880 but I'm going to go ahead and wrap this up, 1176 00:42:40,880 --> 00:42:43,340 because, again, we only have three hours in total. 1177 00:42:43,340 --> 00:42:46,100 Before I do that, any final questions about the simulation 1178 00:42:46,100 --> 00:42:46,670 itself? 1179 00:42:46,670 --> 00:42:48,740 Any accusations that I rigged the whole thing 1180 00:42:48,740 --> 00:42:49,790 and made it impossible? 1181 00:42:49,790 --> 00:42:51,860 Or anything fun like that? 1182 00:42:51,860 --> 00:42:53,240 Yeah. 1183 00:42:53,240 --> 00:42:56,130 AUDIENCE: One of the teams 1184 00:42:56,130 --> 00:42:58,710 also found out that their change happened too fast, 1185 00:42:58,710 --> 00:43:01,190 and I think that's because we didn't understand 1186 00:43:01,190 --> 00:43:03,830 the system or the equations behind it, the model. 
1187 00:43:03,830 --> 00:43:08,558 Was that information left out intentionally? 1188 00:43:08,558 --> 00:43:09,600 JAMES PAINE: To a degree. 1189 00:43:09,600 --> 00:43:12,612 So one interesting way to run this simulation-- and it's been 1190 00:43:12,612 --> 00:43:14,570 done, but it hasn't been done for a long time-- 1191 00:43:14,570 --> 00:43:18,290 is to essentially run this whole thing again with you. 1192 00:43:18,290 --> 00:43:20,450 Like, essentially let every person in this room 1193 00:43:20,450 --> 00:43:23,270 play the game again, but understandably, 1194 00:43:23,270 --> 00:43:24,770 with slightly different parameters, 1195 00:43:24,770 --> 00:43:28,012 so you can't just set it straight to what you 1196 00:43:28,012 --> 00:43:29,220 want right off the bat. 1197 00:43:29,220 --> 00:43:32,330 So the argument here is that information is imperfect, 1198 00:43:32,330 --> 00:43:34,980 but it is still present and some information 1199 00:43:34,980 --> 00:43:36,230 is better than no information. 1200 00:43:36,230 --> 00:43:37,520 So the constant question is, what 1201 00:43:37,520 --> 00:43:39,937 happens when you guys know this information ahead of time? 1202 00:43:39,937 --> 00:43:41,750 I'd be curious about that. 1203 00:43:41,750 --> 00:43:44,300 For those who did the beer game, that 1204 00:43:44,300 --> 00:43:45,350 multi-echelon supply chain game, one of the things 1205 00:43:45,350 --> 00:43:47,000 that came out of it is interesting. 1206 00:43:47,000 --> 00:43:49,580 The ultimate thing about it is that information matters, 1207 00:43:49,580 --> 00:43:52,313 information is helpful. 1208 00:43:52,313 --> 00:43:54,980 When you play the beer game with people who have played the beer 1209 00:43:54,980 --> 00:43:57,428 game, you still get the exact same outcome no matter 1210 00:43:57,428 --> 00:43:58,470 how many times you do it. 
1211 00:43:58,470 --> 00:43:59,887 And one of the classic examples is 1212 00:43:59,887 --> 00:44:02,638 we had a beer game, like, championship here at MIT 1213 00:44:02,638 --> 00:44:05,180 some years ago where not only was it people who played before 1214 00:44:05,180 --> 00:44:08,790 but it was professors who teach the beer game in the same room. 1215 00:44:08,790 --> 00:44:11,370 And we also gave information. 1216 00:44:11,370 --> 00:44:13,575 So typically in the beer game, there's some-- 1217 00:44:13,575 --> 00:44:14,450 AUDIENCE: [INAUDIBLE] 1218 00:44:14,450 --> 00:44:15,533 JAMES PAINE: I know, yeah. 1219 00:44:15,533 --> 00:44:19,970 [INAUDIBLE] But even the folks who teach the beer game didn't 1220 00:44:19,970 --> 00:44:22,160 escape the beer game. 1221 00:44:22,160 --> 00:44:24,680 So part of this is that as long as-- 1222 00:44:24,680 --> 00:44:27,320 and this is the whole idea of structure yields behavior. 1223 00:44:27,320 --> 00:44:29,030 The structure is still there. 1224 00:44:29,030 --> 00:44:31,550 The underlying structure still exists. 1225 00:44:31,550 --> 00:44:34,730 It is real hard to break your behavior patterns 1226 00:44:34,730 --> 00:44:37,380 when you don't change that underlying structure. 1227 00:44:37,380 --> 00:44:40,160 So if you want more, we're all up on the fourth floor, 1228 00:44:40,160 --> 00:44:41,810 for one thing. 1229 00:44:41,810 --> 00:44:44,180 So at least, I don't lock my door, so come on in. 1230 00:44:44,180 --> 00:44:45,680 I'll talk your ear off. 1231 00:44:45,680 --> 00:44:48,890 For classes here at MIT, a lot of what we just talked about 1232 00:44:48,890 --> 00:44:53,150 is covered really in much more detail in 15.871, Introduction 1233 00:44:53,150 --> 00:44:54,110 to System Dynamics. 1234 00:44:54,110 --> 00:44:56,360 I would say 15.871's primary purpose 1235 00:44:56,360 --> 00:44:59,690 is to get these ideas of systems thinking in your head. 
1236 00:44:59,690 --> 00:45:02,390 You'll be spending a lot of time in that class 1237 00:45:02,390 --> 00:45:05,450 up on a chalkboard, sort of talking through problems, 1238 00:45:05,450 --> 00:45:08,210 saying, OK, well, let's talk about what happens when you 1239 00:45:08,210 --> 00:45:09,627 increase customer satisfaction. 1240 00:45:09,627 --> 00:45:10,460 What does that mean? 1241 00:45:10,460 --> 00:45:13,163 And start drawing these lovely causal loop diagrams 1242 00:45:13,163 --> 00:45:15,080 and think to ourselves, OK, well, what happens 1243 00:45:15,080 --> 00:45:16,160 if we snip this loop here? 1244 00:45:16,160 --> 00:45:17,550 And what would it take to do that? 1245 00:45:17,550 --> 00:45:19,342 So it's much more about how to get yourself 1246 00:45:19,342 --> 00:45:21,620 thinking from a linear direction into more of a looping 1247 00:45:21,620 --> 00:45:22,550 direction. 1248 00:45:22,550 --> 00:45:27,590 15.872 dives a lot more into now taking those thought processes 1249 00:45:27,590 --> 00:45:30,585 and applying them to sort of much more realistic scenarios 1250 00:45:30,585 --> 00:45:32,960 and situations and getting more comfortable with modeling 1251 00:45:32,960 --> 00:45:33,680 tools. 1252 00:45:33,680 --> 00:45:37,610 Both involve you using software sets, 1253 00:45:37,610 --> 00:45:40,550 which I'll talk about a little bit later, that are commonly 1254 00:45:40,550 --> 00:45:41,960 used within system dynamics. 1255 00:45:41,960 --> 00:45:45,620 But 15.872 kind of takes the training wheels off. 1256 00:45:45,620 --> 00:45:47,330 And there is no right 1257 00:45:47,330 --> 00:45:49,580 or wrong answer, as long as you can figure out 1258 00:45:49,580 --> 00:45:53,810 a good self-contained simulation that gets to a conclusion you 1259 00:45:53,810 --> 00:45:55,260 think is interesting. 1260 00:45:55,260 --> 00:45:57,830 This one is relatively new, 15.873. 
1261 00:45:57,830 --> 00:46:01,610 This is somewhere that sits in between 15.871 and 872. 1262 00:46:01,610 --> 00:46:05,450 This is one that, I like to think, has more of an emphasis 1263 00:46:05,450 --> 00:46:07,110 on business and policy. 1264 00:46:07,110 --> 00:46:08,360 I have this example down here. 1265 00:46:08,360 --> 00:46:10,430 This is the Vasa, which I'm now giving away 1266 00:46:10,430 --> 00:46:13,340 a slide from, I think, 15.872. 1267 00:46:13,340 --> 00:46:17,030 The Vasa was an incredible ship that 1268 00:46:17,030 --> 00:46:22,190 was built by some of the best ship designers of its day, 1269 00:46:22,190 --> 00:46:24,950 was launched, and then promptly tipped completely upside down 1270 00:46:24,950 --> 00:46:26,540 and sank to the bottom of the ocean. 1271 00:46:26,540 --> 00:46:28,498 And the reason was because, at the last minute, 1272 00:46:28,498 --> 00:46:30,710 there were several late design adds. 1273 00:46:30,710 --> 00:46:34,700 Someone decided they wanted to build the captain's 1274 00:46:34,700 --> 00:46:35,840 lodge a little bit larger. 1275 00:46:35,840 --> 00:46:37,170 They wanted to add a few more sails. 1276 00:46:37,170 --> 00:46:38,450 They wanted to add a ton more cannons. 1277 00:46:38,450 --> 00:46:39,560 They wanted to make it more and more impressive. 1278 00:46:39,560 --> 00:46:42,440 It became hydrodynamically unstable, promptly tipped over. 1279 00:46:42,440 --> 00:46:44,030 So the idea being here that-- 1280 00:46:44,030 --> 00:46:45,800 I emphasize that one for this idea 1281 00:46:45,800 --> 00:46:47,897 that you have unintended consequences. 1282 00:46:47,897 --> 00:46:48,980 There are no side effects. 1283 00:46:48,980 --> 00:46:51,000 There are simply effects that you didn't plan for. 1284 00:46:51,000 --> 00:46:52,958 And that's kind of emphasized a little bit more 1285 00:46:52,958 --> 00:46:57,020 in 15.872, 15.873. 1286 00:46:57,020 --> 00:46:59,300 In terms of books, I brought a few with me. 
1287 00:46:59,300 --> 00:47:01,550 Again, I mentioned this at the beginning of the class. 1288 00:47:01,550 --> 00:47:04,700 This has become sort of the bible of system 1289 00:47:04,700 --> 00:47:06,163 dynamics: Business Dynamics. 1290 00:47:06,163 --> 00:47:07,580 The first couple of chapters cover 1291 00:47:07,580 --> 00:47:08,830 a lot of what we talked about. 1292 00:47:08,830 --> 00:47:10,987 Later on, it gets really specific. 1293 00:47:10,987 --> 00:47:13,070 There's an opening in here that says, essentially, 1294 00:47:13,070 --> 00:47:15,150 if you have a background in math, that's great. 1295 00:47:15,150 --> 00:47:16,130 If you don't, that's great too. 1296 00:47:16,130 --> 00:47:18,713 But later on, it starts talking about the specific differences 1297 00:47:18,713 --> 00:47:21,380 between choosing modes of feedback and behavior, 1298 00:47:21,380 --> 00:47:23,510 how to model specific things, really 1299 00:47:23,510 --> 00:47:29,030 how to get your hands dirty with primarily compartmental models. 1300 00:47:29,030 --> 00:47:31,070 If you want something a little more general, 1301 00:47:31,070 --> 00:47:32,778 there is The Fifth Discipline Fieldbook. 1302 00:47:32,778 --> 00:47:34,737 I wouldn't necessarily recommend the whole book 1303 00:47:34,737 --> 00:47:35,810 unless that's your thing. 1304 00:47:35,810 --> 00:47:38,360 But part two is called Systems Thinking. 1305 00:47:38,360 --> 00:47:40,820 It does a great job of summarizing the work 1306 00:47:40,820 --> 00:47:42,000 that we talked about here. 1307 00:47:42,000 --> 00:47:43,875 Both of these books are in the Dewey Library. 1308 00:47:43,875 --> 00:47:44,960 I checked earlier today. 1309 00:47:44,960 --> 00:47:47,780 I think this is one of the two copies of this book. 1310 00:47:47,780 --> 00:47:49,940 So I'll return it right after this class. 1311 00:47:49,940 --> 00:47:52,520 But there are lots of copies of John's book, Business 1312 00:47:52,520 --> 00:47:53,300 Dynamics.
1313 00:47:53,300 --> 00:47:54,842 If you want to know a little bit more 1314 00:47:54,842 --> 00:47:57,800 about sort of the history of this, Limits to Growth 1315 00:47:57,800 --> 00:48:01,370 is a book that I recommend because it has a strong history 1316 00:48:01,370 --> 00:48:02,315 in system dynamics. 1317 00:48:02,315 --> 00:48:04,190 It also has a bit of a controversial history, 1318 00:48:04,190 --> 00:48:05,757 if you dive into it. 1319 00:48:05,757 --> 00:48:07,340 This was one of those first books that 1320 00:48:07,340 --> 00:48:10,670 applied the ideas of overshoot and collapse to the earth, 1321 00:48:10,670 --> 00:48:14,540 saying that we as a species are headed towards this, 1322 00:48:14,540 --> 00:48:16,220 and we need to manage our resources 1323 00:48:16,220 --> 00:48:19,910 and we need to be a little bit more careful with our choices. 1324 00:48:19,910 --> 00:48:23,720 The simulations in there had some specific dates 1325 00:48:23,720 --> 00:48:26,450 that ultimately were held up as sort 1326 00:48:26,450 --> 00:48:28,700 of an example of why point prediction is 1327 00:48:28,700 --> 00:48:30,210 difficult in system dynamics. 1328 00:48:30,210 --> 00:48:32,810 So if you read it, remember this is not a prediction of the future. 1329 00:48:32,810 --> 00:48:34,460 It's discussing the mode of behavior. 1330 00:48:34,460 --> 00:48:36,140 It's talking about the structure that we currently 1331 00:48:36,140 --> 00:48:38,240 have that can lead to overshoot and collapse 1332 00:48:38,240 --> 00:48:40,040 within shared resources. 1333 00:48:40,040 --> 00:48:42,690 Articles, if you get a chance: these are my personal favorites. 1334 00:48:42,690 --> 00:48:45,380 System Dynamics at Sixty: The Path Forward, I think, 1335 00:48:45,380 --> 00:48:47,760 does a great job of saying where 1336 00:48:47,760 --> 00:48:49,160 system dynamics is right now.
1337 00:48:49,160 --> 00:48:50,570 A lot of material that you get out there 1338 00:48:50,570 --> 00:48:51,945 talks about where system dynamics 1339 00:48:51,945 --> 00:48:55,280 was 20, 30, even 40 years ago. 1340 00:48:55,280 --> 00:48:58,295 I think this one does a great job of saying, that's great. 1341 00:48:58,295 --> 00:48:59,420 What are we doing from now on? 1342 00:48:59,420 --> 00:49:01,490 This is essentially the article that says, 1343 00:49:01,490 --> 00:49:03,410 system dynamics is not compartmental models. 1344 00:49:03,410 --> 00:49:05,018 System dynamics is a mode of thinking. 1345 00:49:05,018 --> 00:49:06,560 Choose the software that you need to. 1346 00:49:06,560 --> 00:49:09,840 Choose the methods of communication you need to. 1347 00:49:09,840 --> 00:49:13,890 I am, again, someone who does a lot of operations management. 1348 00:49:13,890 --> 00:49:15,530 So most of these choices up there 1349 00:49:15,530 --> 00:49:21,278 about capability traps and sort of operational failure, 1350 00:49:21,278 --> 00:49:22,820 that's kind of where I'm coming from. 1351 00:49:22,820 --> 00:49:24,170 But that being said, this one right here, 1352 00:49:24,170 --> 00:49:26,390 the third one on the list, Making the Numbers, 1353 00:49:26,390 --> 00:49:29,120 applies a classic operations management concept 1354 00:49:29,120 --> 00:49:33,080 of a capability trap to the valuation of stocks. 1355 00:49:33,080 --> 00:49:36,110 And I think that's a great little connection right there. 1356 00:49:36,110 --> 00:49:37,783 If you happen to be on MIT's network, 1357 00:49:37,783 --> 00:49:39,200 these are all free and accessible. 1358 00:49:39,200 --> 00:49:41,450 Worst case scenario, I'll print out a copy for you 1359 00:49:41,450 --> 00:49:44,270 because I think these are good articles. 1360 00:49:44,270 --> 00:49:45,080 Other websites. 1361 00:49:45,080 --> 00:49:46,460 Some personal favorites here.
1362 00:49:46,460 --> 00:49:48,710 Creative Learning Exchange, this is 1363 00:49:48,710 --> 00:49:51,740 one that's primarily K-through-12 education-focused. 1364 00:49:51,740 --> 00:49:54,020 But there's a lot of fun little tools on there. 1365 00:49:54,020 --> 00:49:56,750 It uses a lot of animations, makes things very approachable, 1366 00:49:56,750 --> 00:49:58,250 especially if you're used to looking 1367 00:49:58,250 --> 00:50:00,890 at sort of cold, black and white diagrams. 1368 00:50:00,890 --> 00:50:03,350 Clicking around on there just kind of makes me happy. 1369 00:50:03,350 --> 00:50:06,002 It has lots of pink fluid flowing back and forth, 1370 00:50:06,002 --> 00:50:08,210 and you attach the little things together. 1371 00:50:08,210 --> 00:50:11,870 They have some fun iOS and Android applications. 1372 00:50:11,870 --> 00:50:14,180 Tom Fiddaman's MetaSD website is fantastic. 1373 00:50:14,180 --> 00:50:16,820 That man will just model anything 1374 00:50:16,820 --> 00:50:18,150 that he thinks is interesting. 1375 00:50:18,150 --> 00:50:20,768 So if you have a blank piece of software in front of you 1376 00:50:20,768 --> 00:50:23,060 and you don't know what to click and you want something 1377 00:50:23,060 --> 00:50:24,950 as an example, he has one about sort 1378 00:50:24,950 --> 00:50:27,650 of the self-reinforcing nature of UFO sightings. 1379 00:50:27,650 --> 00:50:29,300 So he essentially thinks to himself, 1380 00:50:29,300 --> 00:50:31,373 this is an interesting behavior I have observed. 1381 00:50:31,373 --> 00:50:33,290 I wonder what structure could have yielded it? 1382 00:50:33,290 --> 00:50:35,330 And then he makes it and puts it on his website. 1383 00:50:35,330 --> 00:50:37,730 So that's just a fun one to click around on. 1384 00:50:37,730 --> 00:50:39,650 This one, the self-study website, 1385 00:50:39,650 --> 00:50:41,540 that is dense, to be honest.
1386 00:50:41,540 --> 00:50:42,860 But it's a good spot to start. 1387 00:50:42,860 --> 00:50:45,230 It systematically walks you through the fundamentals 1388 00:50:45,230 --> 00:50:48,920 of system dynamics, from core concepts to detailed 1389 00:50:48,920 --> 00:50:49,820 modeling. 1390 00:50:49,820 --> 00:50:51,980 And it is free and open for anyone to use. 1391 00:50:51,980 --> 00:50:54,410 I run, right now, the Journal Club. 1392 00:50:54,410 --> 00:50:56,450 So that's my plug right there. 1393 00:50:56,450 --> 00:50:58,220 Stop on by. 1394 00:50:58,220 --> 00:51:01,162 Every other Friday, or maybe every Friday, 1395 00:51:01,162 --> 00:51:02,870 once the semester starts, 1396 00:51:02,870 --> 00:51:05,960 we have a seminar series that's system dynamics-focused up 1397 00:51:05,960 --> 00:51:07,490 on the fourth floor. 1398 00:51:07,490 --> 00:51:09,050 If there's not a seminar scheduled, 1399 00:51:09,050 --> 00:51:11,780 then I just kind of run the discussion and talk 1400 00:51:11,780 --> 00:51:14,730 about something I feel like talking about. 1401 00:51:14,730 --> 00:51:16,730 And then of course, the System Dynamics Society. 1402 00:51:16,730 --> 00:51:20,030 That's kind of our field's professional society. 1403 00:51:20,030 --> 00:51:22,490 I would be remiss if I did not at least mention them 1404 00:51:22,490 --> 00:51:23,720 in passing. 1405 00:51:23,720 --> 00:51:26,648 It's a lot of information, but I think especially 1406 00:51:26,648 --> 00:51:28,190 those first two are not ones that you 1407 00:51:28,190 --> 00:51:29,540 see recommended a lot. 1408 00:51:29,540 --> 00:51:30,230 Software. 1409 00:51:30,230 --> 00:51:32,480 We didn't talk at all about software. 1410 00:51:32,480 --> 00:51:34,688 If this had been a longer course, the next thing I'd 1411 00:51:34,688 --> 00:51:36,980 do is dive into some software 1412 00:51:36,980 --> 00:51:37,790 with you guys.
1413 00:51:37,790 --> 00:51:39,450 Unfortunately, we don't have the time. 1414 00:51:39,450 --> 00:51:40,880 So I want to point you 1415 00:51:40,880 --> 00:51:44,630 towards these two right here, Vensim and Stella Architect. 1416 00:51:44,630 --> 00:51:47,510 Vensim especially has a free academic license. 1417 00:51:47,510 --> 00:51:49,610 If you also have an MIT email address, 1418 00:51:49,610 --> 00:51:52,640 you can get more or less the full commercial version 1419 00:51:52,640 --> 00:51:54,317 for free, which is nice. 1420 00:51:54,317 --> 00:51:56,150 The difference between the Personal Learning 1421 00:51:56,150 --> 00:51:57,567 Edition and the commercial edition 1422 00:51:57,567 --> 00:51:59,630 is minimal until you hit it. 1423 00:51:59,630 --> 00:52:01,130 And then once you hit it, you really 1424 00:52:01,130 --> 00:52:02,713 want that commercial version to be 1425 00:52:02,713 --> 00:52:03,990 able to just do some stuff. 1426 00:52:03,990 --> 00:52:06,290 So if you have an MIT email address, 1427 00:52:06,290 --> 00:52:08,030 I suggest spending the three minutes 1428 00:52:08,030 --> 00:52:11,960 to get a Vensim Professional license. 1429 00:52:11,960 --> 00:52:15,860 Stella Architect is less used in our group, but definitely used 1430 00:52:15,860 --> 00:52:17,280 in general. 1431 00:52:17,280 --> 00:52:19,117 They have a great storytelling mode, 1432 00:52:19,117 --> 00:52:21,200 this idea of building up your models progressively 1433 00:52:21,200 --> 00:52:23,060 without scaring people away. 1434 00:52:23,060 --> 00:52:25,820 And they also have a lot of nice visuals. 1435 00:52:25,820 --> 00:52:28,340 I'm still getting used to that software myself, 1436 00:52:28,340 --> 00:52:29,960 but I like what I see. 1437 00:52:29,960 --> 00:52:32,950 Also, off to the side there, NetLogo. 1438 00:52:32,950 --> 00:52:34,870 Not used a lot in our group necessarily, 1439 00:52:34,870 --> 00:52:35,750 but I use it a lot.
1440 00:52:35,750 --> 00:52:37,760 It's a good old agent-based modeling tool. 1441 00:52:37,760 --> 00:52:40,940 So sometimes it's useful to think, OK, 1442 00:52:40,940 --> 00:52:43,000 I know what a person would do, or an agent 1443 00:52:43,000 --> 00:52:44,350 or an entity in the setup. 1444 00:52:44,350 --> 00:52:45,947 I can wrap my head around that. 1445 00:52:45,947 --> 00:52:47,530 I don't know what the system is, but I 1446 00:52:47,530 --> 00:52:50,230 can wrap my head around one person wandering through space. 1447 00:52:50,230 --> 00:52:52,117 This lets you, essentially, model that person 1448 00:52:52,117 --> 00:52:53,950 and then put a whole bunch of them in a room 1449 00:52:53,950 --> 00:52:54,825 and see what happens. 1450 00:52:54,825 --> 00:52:57,370 If anyone has done any sort of agent-based modeling 1451 00:52:57,370 --> 00:53:01,120 with any sort of scripting language, MATLAB, R, anything 1452 00:53:01,120 --> 00:53:03,160 else like that, this is kind of the same concept 1453 00:53:03,160 --> 00:53:04,535 with a nice little pretty wrapper 1454 00:53:04,535 --> 00:53:05,710 around the outside of it. 1455 00:53:05,710 --> 00:53:07,540 Speaking of which, in terms of software, 1456 00:53:07,540 --> 00:53:08,890 we use all sorts of software. 1457 00:53:08,890 --> 00:53:10,630 It's not just Vensim and Stella. 1458 00:53:10,630 --> 00:53:12,790 These are just what I have on my machine 1459 00:53:12,790 --> 00:53:13,110 upstairs right now, 1460 00:53:13,110 --> 00:53:14,818 and what I 1461 00:53:14,818 --> 00:53:17,650 use on more or less a daily basis. 1462 00:53:17,650 --> 00:53:19,330 We're not just compartmental models. 1463 00:53:19,330 --> 00:53:20,450 We're a mode of thinking. 1464 00:53:20,450 --> 00:53:22,610 So if you have a tool that gets the job done, 1465 00:53:22,610 --> 00:53:23,590 then good for you.
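[Editor's note: the agent-based idea described above, "model one person wandering through space, then put a whole bunch of them in a room," can be sketched in a few lines. This is a plain-Python illustration, not NetLogo's API; all names and parameters are made up.]

```python
import random

# One agent's rule: take a single random step in the "room".
class Walker:
    def __init__(self):
        self.x, self.y = 0.0, 0.0

    def step(self):
        dx, dy = random.choice([(-1, 0), (1, 0), (0, -1), (0, 1)])
        self.x += dx
        self.y += dy

def run(n_agents=200, n_steps=100, seed=42):
    random.seed(seed)
    agents = [Walker() for _ in range(n_agents)]
    for _ in range(n_steps):
        for a in agents:
            a.step()
    # A system-level observable that no single agent's rule mentions:
    # the mean distance of the crowd from its starting point.
    return sum((a.x ** 2 + a.y ** 2) ** 0.5 for a in agents) / n_agents

print(f"mean distance after 100 steps: {run():.1f}")
```

The point is the workflow: you only had to specify one person's behavior, and the aggregate pattern (here, diffusion) emerges when many copies run together.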
1466 00:53:23,590 --> 00:53:27,460 It's possible to do a good causal diagram using Excel 1467 00:53:27,460 --> 00:53:30,737 if you're really motivated. 1468 00:53:30,737 --> 00:53:31,820 Oh yeah, and a whiteboard. 1469 00:53:31,820 --> 00:53:33,050 A whiteboard definitely gets the job done. 1470 00:53:33,050 --> 00:53:34,970 Actually, he raises a great point. 1471 00:53:34,970 --> 00:53:37,560 The best use is just standing in front of a whiteboard 1472 00:53:37,560 --> 00:53:39,320 and starting to write some things down. 1473 00:53:39,320 --> 00:53:41,662 People often ask the question, what do you do? 1474 00:53:41,662 --> 00:53:42,620 Like, how do you start? 1475 00:53:42,620 --> 00:53:44,900 Like, what's the first thing you do? 1476 00:53:44,900 --> 00:53:47,088 I'd say walk up to a whiteboard and think 1477 00:53:47,088 --> 00:53:48,380 about the thing you care about. 1478 00:53:48,380 --> 00:53:50,930 I don't know, maybe CO2 in the atmosphere, 1479 00:53:50,930 --> 00:53:56,150 car deaths, the number of people quitting from a non-profit. 1480 00:53:56,150 --> 00:53:57,080 Write that on the board. 1481 00:53:57,080 --> 00:53:59,060 Just write it smack dab in the middle of the board. 1482 00:53:59,060 --> 00:54:01,190 Then think to yourself, what would cause that thing 1483 00:54:01,190 --> 00:54:02,390 to go up or down? 1484 00:54:02,390 --> 00:54:03,510 Write it off to the side. 1485 00:54:03,510 --> 00:54:05,510 Draw an arrow, put a plus or a minus next to it. 1486 00:54:05,510 --> 00:54:07,640 And just keep going and see what happens. 1487 00:54:07,640 --> 00:54:10,190 And let those loops come back around. 1488 00:54:10,190 --> 00:54:12,440 And that's a good spot to start. 1489 00:54:12,440 --> 00:54:14,930 So we have a few minutes. 1490 00:54:14,930 --> 00:54:16,820 I'm going to click this one thing right here 1491 00:54:16,820 --> 00:54:20,000 for another example of system dynamics in action.
1492 00:54:20,000 --> 00:54:25,130 If anyone is familiar with policy interactives, 1493 00:54:25,130 --> 00:54:27,800 this is also an idea of what the most complicated 1494 00:54:27,800 --> 00:54:29,120 sort of model can do. 1495 00:54:29,120 --> 00:54:30,860 This is En-ROADS. 1496 00:54:30,860 --> 00:54:34,490 This is a policy simulator for the climate. 1497 00:54:34,490 --> 00:54:37,047 So this right here has a full-blown climate simulation 1498 00:54:37,047 --> 00:54:37,880 running on the back end. 1499 00:54:37,880 --> 00:54:38,930 But its job at the end of the day 1500 00:54:38,930 --> 00:54:40,222 is, again, not point prediction. 1501 00:54:40,222 --> 00:54:41,300 Its job is policy. 1502 00:54:41,300 --> 00:54:43,123 So again, this is kind of a fun thing 1503 00:54:43,123 --> 00:54:45,540 to click around and see system dynamics in action. 1504 00:54:45,540 --> 00:54:47,780 This is simply saying, what are we doing right now? 1505 00:54:47,780 --> 00:54:49,863 What's our predicted temperature change over time? 1506 00:54:49,863 --> 00:54:53,280 What are the choices that we can make as a society, as a group, 1507 00:54:53,280 --> 00:54:56,050 as a series of countries to change this outcome? 1508 00:54:56,050 --> 00:54:58,910 This is being used right now in conversations with members 1509 00:54:58,910 --> 00:55:02,405 of our own government in order to help elicit mental models 1510 00:55:02,405 --> 00:55:04,280 and get people on the same page about climate 1511 00:55:04,280 --> 00:55:05,490 action in the future. 1512 00:55:05,490 --> 00:55:06,948 And so this is something else where 1513 00:55:06,948 --> 00:55:09,110 there is a big compartmental model running 1514 00:55:09,110 --> 00:55:10,430 in the background. 1515 00:55:10,430 --> 00:55:12,980 But this is an example of using system dynamics 1516 00:55:12,980 --> 00:55:16,580 to hopefully elicit real policy change in the near term.
1517 00:55:16,580 --> 00:55:18,830 So you can start clicking around, seeing what happens. 1518 00:55:18,830 --> 00:55:20,300 I'll be honest, I'm really bad at this one. 1519 00:55:20,300 --> 00:55:22,220 I click around and I realize I'm really bad at figuring out 1520 00:55:22,220 --> 00:55:24,920 how to fix the climate, which is why I do operations 1521 00:55:24,920 --> 00:55:28,910 and supply chain management, I suppose, at the end of the day. 1522 00:55:28,910 --> 00:55:31,530 So yeah, I'd encourage you guys to click around on that. 1523 00:55:31,530 --> 00:55:34,868 So that, essentially, is where we are. 1524 00:55:34,868 --> 00:55:35,910 Thank you guys very much. 1525 00:55:35,910 --> 00:55:37,370 I loved sharing system dynamics with you, 1526 00:55:37,370 --> 00:55:39,000 and I hope to see some of you guys in the future. 1527 00:55:39,000 --> 00:55:39,500 Thank you. 1528 00:55:39,500 --> 00:55:42,760 [APPLAUSE]