The following content is provided under a Creative Commons license. Your support will help MIT OpenCourseWare continue to offer high quality educational resources for free. To make a donation or to view additional materials from hundreds of MIT courses, visit MIT OpenCourseWare at ocw.mit.edu.

PROFESSOR: OK, so let's situate ourselves where we are. We're starting to really make a lot of progress here, moving in the V model. Today's topic is session five: tradespace exploration, concept selection, and PDR-- preliminary design review.

There's a lot to talk about, and I'm going to go relatively quickly through this. First I want to talk about decision analysis. Fundamentally, when you come up for PDR, you have to make a big decision, which is: what concept are you going for? What is your system architecture? That's a big decision. You don't have all the details yet. The design is not fully done, but you've chosen your key architecture. That's a big decision, and there's a whole branch of research and science called decision analysis.

So I want to tell you about that, talk about some of the issues in concept selection, and then give you some tools for doing that in an organized way. There are a couple of-- what I would call simple methods-- the Pugh matrix and multi-attribute utility-- that are relatively, I think, straightforward. And then there's a somewhat more advanced concept called non-dominance-- Pareto frontiers, getting into multi-objective optimization. At a minimum, when you choose your concept, you want to choose a non-dominated concept, and I'll explain what that means. And then we'll close with discussing: what is a PDR? What do you do at the PDR? What's the purpose of it?

So, here is another way to explain essentially the flow of things as we come up to PDR. The way you can think of it is, you start with a need, right? And your requirements. We've talked about that before. And then you have this creative phase, which we talked about last time. You come up with a lot of ideas, a lot of concepts, and the funnel that you see here opens up.

Each of these shapes here-- the circle, the triangle-- is meant to represent a fundamentally different concept, a different architecture, a different way of satisfying the requirements. There's a lot of different things you can do, and the funnel opens up. This is sort of the size of the set of options or alternatives you're considering. And then you have to filter this down. In the end, you can only build one system, right? You have to choose one architecture. And this filtering down typically happens in two steps. One is what we call concept screening, which is more qualitative. And then the actual concept selection is more quantitative-- based on models, based on data, based on metrics. And then you arrive at PDR.

So in this simplified view, we've chosen the square as our concept. And we've made that decision by looking at these different metrics. You can see on this graphic, with metric one and metric two, that the triangle is dominated, right? The triangle doesn't do as well if we want to maximize metrics one and two, so it wouldn't make sense to choose the triangle. But the circle and the square-- the circle does better on metric two, but less well on metric one. The square does better on metric one, not quite as well on metric two-- they cross over at some point. So, however you weigh this in decision making, the decision is made. We're going to go for the square. That's our concept.

And then you move to the actual system design, post-PDR, which is all the details. We're expanding the box now, the square, and designing all the details within-- the structure, the avionics, the software, the controls. That's design modeling and optimization. And so here I'm showing you metric three, which is maybe another metric, and then design variable x1, which is some pretty detailed decision that you make within that concept, and you're looking for the optimum-- in this case, minimizing metric three. It's this point here. And then that gives you your-- and I put this deliberately in quotes-- "optimal" design.
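[The dominance check described above-- the triangle losing to the square on both metrics-- can be sketched in a few lines of code. This is an editorial illustration, not from the lecture; the concept names match the slide, but the metric values are invented, and both metrics are assumed to be maximized:]

```python
def is_dominated(candidate, others):
    """A concept is dominated if some other concept is at least as good
    on every metric and strictly better on at least one (maximizing)."""
    return any(
        all(o >= c for o, c in zip(other, candidate))
        and any(o > c for o, c in zip(other, candidate))
        for other in others
        if other != candidate
    )

# Hypothetical (metric_1, metric_2) scores for the three concepts.
concepts = {"circle": (2.0, 9.0), "triangle": (3.0, 4.0), "square": (8.0, 5.0)}

non_dominated = [
    name
    for name, scores in concepts.items()
    if not is_dominated(scores, list(concepts.values()))
]
# The triangle is beaten by the square on both metrics, so only the
# circle and the square survive as non-dominated choices.
```

[With these made-up numbers, `non_dominated` comes out as the circle and the square, which is exactly the situation on the slide: two concepts left that trade off against each other, and a choice that then depends on how you weigh the metrics.]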
That's what you show at the CDR, the critical design review, and that's what you go and build. So that's roughly the flow of decision making. Any questions about that general flow? What we'll talk about today is essentially this portion here: concept screening and concept selection.

So, since you're making a decision, what is decision analysis? Decision analysis, in general, is the set of methods and tools for ranking and choosing among competing alternatives or courses of action. And I just want to comment on the word alternative. A lot of people use the words option and alternative as synonyms. They're not, OK? They're not. Options and alternatives are not the same thing. An alternative means that if I choose A, I can't choose B and C. The fact that you've chosen A, or the square, or whatever your concept is, means that you cannot choose the other ones. An option means: well, given that I've chosen A, I could also add something on top of A, right?

It's kind of like buying a new car-- are you going to get the winter package or not? That's an option, but you've chosen a particular vehicle. So, that's important. Alternatives are mutually exclusive choices or courses of action.

So you need the alternatives-- that's the first of the four components of a decision. You need criteria, which means: how are you going to evaluate and compare the alternatives? Then you need value judgments-- you apply the criteria to the alternatives to compare them. And finally you need a decision maker, either an individual or a group with preferences, to then make the choice. Those are the four ingredients of decision analysis.

And if we show this somewhat graphically-- here are our alternatives. We can do A, B, or C. The criteria in this case are cost, performance, and risk. Then we basically evaluate, in this case, alternative A-- I should say alternative, not option-- for cost, for performance, for risk. This gives us some score or number, and then we need to figure out a way to combine these to rank the alternatives.

And when it comes to ranking, there's a distinction between what's known as an ordinal scale and a cardinal scale. In an ordinal ranking, all you really do is say: what's the best choice, the second best choice, the third choice. You order the alternatives. But in an ordinal ranking, or on an ordinal scale, you don't know whether one and two are really close and then there's a big gap between two and three. You don't know that, and you don't really care. All you want is the rank order. When you put things on a cardinal scale-- so this is a continuous line-- you get, in a sense, both the order and how close the alternatives are. So that's a cardinal scale as opposed to an ordinal scale. OK? So-- yes? Please.

AUDIENCE: How do you get a discrete value for something like risk? And especially on an ordinal scale, if you have two things that are really close to each other, but the risk is kind of a blurry region, how do you--

PROFESSOR: Then they would be equivalent in terms of that criterion. And you can actually have ties, right? You can have alternatives that are tied. And I'll talk about that-- one of the issues in concept selection is: how do you do tie breaking? Good question.

Any questions at EPFL? I know this is very abstract, but fundamentally, that's what we do. Any questions over there? Is it clear?

AUDIENCE: Yes.

PROFESSOR: OK. Good. So let's keep moving. What are the issues? Multiple criteria-- how do you deal with them? We usually have multiple criteria, and eventually, when you make a decision, when you pick a concept, you're somehow combining these criteria into a single measure, whether you do that explicitly or not. So how do you deal with these multiple criteria? We'll talk about that.

Then there's the point that you brought up: what if there are ties? How do you break them? Group decision making versus individual decision making-- who gets to really make the decision? This relates to the stakeholder analysis: who has the power to make the decision in the end? And then uncertainty-- did we choose the right criteria? Did we evaluate each alternative properly? And then the big one: are the best alternatives represented? You can choose among A, B, and C, but maybe there's a much, much better architecture, concept D, out there that's not in the decision set-- because nobody thought of it, or some other thing happened, and it didn't come up in concept generation. So there are a lot of uncertainties when you're doing this. You have to be aware of them.

So let me talk about two-- again, quote, unquote-- "simple methods" for doing concept selection. The first one is called the Pugh matrix, and then I'll talk about utility analysis.
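[Before getting into those, the ordinal-versus-cardinal distinction from a moment ago can be made concrete with a quick sketch. This is an editorial illustration-- the alternatives and scores are invented, not from the lecture:]

```python
# Cardinal scores: the actual values carry information about gaps.
cardinal = {"A": 9.1, "B": 8.9, "C": 3.0}

# Ordinal ranking: only the order survives. The near-tie between
# A and B and the big drop down to C are both invisible here.
ordinal = sorted(cardinal, key=cardinal.get, reverse=True)
# ordinal == ["A", "B", "C"]

# Visible only on the cardinal scale: first and second place are
# separated by just 0.2, while second and third differ by 5.9.
gap_1_2 = cardinal[ordinal[0]] - cardinal[ordinal[1]]
gap_2_3 = cardinal[ordinal[1]] - cardinal[ordinal[2]]
```

[The point of the sketch: an ordinal ranking would treat A-over-B exactly like B-over-C, even though one is a near-tie and the other is a landslide.]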
So the Pugh matrix, essentially, is a very discrete method. It uses plus, zero, and minus to score alternatives relative to a datum. A datum is a reference decision, a reference architecture. It's named after Stuart Pugh, who was a British professor of design. It's a very simple method, and it's used extensively. It does have some pitfalls as well, but it's very well known. So that's the Pugh matrix.

And then the second one is called utility analysis. It essentially maps all the criteria to a dimensionless utility between zero and one. Zero is a completely useless system-- it has zero utility. And one is a system that has perfect utility. It can't really be improved-- you could try to improve it, but your satisfaction would not increase, OK? And this has a very deep mathematical theory behind it, particularly the von Neumann and Morgenstern utility theory that was developed in the mid-twentieth century.

Has anybody seen the movie A Beautiful Mind? Yeah? What is A Beautiful Mind about?

AUDIENCE: It's about the mathematician John Nash, who won-- I forget exactly what prize he won.

PROFESSOR: He won the Nobel Prize.

AUDIENCE: Was it the Nobel Prize? OK.

AUDIENCE: Founded game theory.

PROFESSOR: For what? For game theory, right. And one of Nash's contributions was the idea of a Nash equilibrium. So we have multiple decision makers in a game, and the question is-- you have your current strategy, but is there a better move you could make? If you could make a move that improves your own satisfaction, your utility, given what the other players are doing, then you're not at equilibrium yet. If you reach a Nash equilibrium, that means nobody can improve their own utility by unilaterally changing their move-- everybody believes that this is the best they can achieve. And how do you measure that? Well, Nash actually developed his Nash equilibrium and game theory building upon the theories of von Neumann and Morgenstern. So there's a very deep theory behind it.
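[To give a flavor of what the zero-to-one mapping looks like in practice, here is a minimal editorial sketch-- not from the lecture. It maps one raw attribute onto a dimensionless utility with a saturating curve, so that past some point further improvement barely raises satisfaction; the exponential shape and the bandwidth example are assumptions, one common choice among many:]

```python
import math

def utility(x, x_min, x_half):
    """Map a raw attribute value x (larger is better) onto [0, 1).
    x_min is the value of a completely useless system (utility 0);
    x_half sets how quickly satisfaction saturates toward perfect
    utility 1. The exponential form is illustrative, not canonical."""
    if x <= x_min:
        return 0.0
    return 1.0 - math.exp(-(x - x_min) / x_half)

# Hypothetical attribute: data-link bandwidth in Mbit/s.
u_low = utility(10, x_min=5, x_half=20)    # modest utility
u_high = utility(200, x_min=5, x_half=20)  # very close to 1
# Going from 10 to 200 Mbit/s helps, but with diminishing returns:
# near the top of the curve, extra bandwidth barely moves utility.
```

[The shape captures the remark above: at utility one you could keep improving the raw attribute, but your satisfaction would not meaningfully increase.]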
I'm not going to go very deep into it, but I want you to know that there's a lot behind utility analysis.

OK, so let me talk you through the basic steps for doing a Pugh matrix analysis, and I'll show you a very simple example. First step: you choose or develop the criteria for comparison. So, what's important? How are we going to decide this selection? And obviously this should be based on the set of system requirements and goals.

Now, there are two flavors of requirements. Remember-- let's ask here at EPFL. Do you remember the two flavors of requirements? What were the two flavors of requirements?

AUDIENCE: [INAUDIBLE]

PROFESSOR: I mean, there are actually six types, right? There are functional requirements. There are constraints. There are interface requirements, and so forth. What I'm trying to get at is that there are requirements that use shall statements and requirements that use should statements, right? So what do the shall statements imply?

AUDIENCE: So basically, the shall is compulsory, and the should is a goal of the system.

PROFESSOR: That's right. Exactly. So, in some sense, the criteria associated with shall statements don't necessarily make the best criteria for the Pugh matrix, because by definition, the concepts that you're evaluating should all satisfy the shall statements. If they don't, they're essentially infeasible, right? And you shouldn't be selecting among them in the first place. Architectures, or concepts, or alternatives that violate shall requirements do not satisfy the must-have requirements. So you shouldn't use criteria that are associated with hard constraints-- that's what I'm trying to say. Now, the requirements that are associated with should statements-- those are good criteria for the Pugh matrix, because the degree to which you satisfy those goals is somewhat variable. Therefore, you can compare amongst alternatives on them. OK? So that's number one.

Number two-- thank you, that was good-- select the alternatives to be compared. Those are coming out of concept generation, and one slightly tricky aspect here is that when you're putting together the alternatives, the concepts to be selected from, they should be represented at a similar level of detail or abstraction. It is not good to have one concept that's really detailed and well understood-- often it's an existing system or something you've done before-- and then another concept that is very fuzzy. Why shouldn't you compare something very detailed with something very poorly defined? Why do you think that's not a good practice? What do you guys think here? Yes.

AUDIENCE: Is it like-- it's not fair, because--

PROFESSOR: That's right. It's a fairness issue, because with concepts that are poorly defined, you're often too optimistic about them. That's the typical pattern-- this looks really good, very promising-- but it looks very promising because it's ill-defined. So that's important. Try to represent all the concepts at about the same level of detail when you make the comparison. That's number two.

Number three-- you actually go through and generate the scores. And the key thing here-- maybe the most important thing about the Pugh matrix-- is that you always use a datum. The datum is a reference: one of the alternatives that you're going to compare is your reference. What you do is always compare every other alternative in the set against the datum, and for each criterion you ask: is it better, about the same, or worse than the reference, the datum? So you don't compare all the alternatives pairwise against each other. You always compare against the datum.

Then you compute the total scores-- I'll show you this in the example. And there are variations on the scoring: rather than better, equal, or worse, you could go to a five-point scale-- much better, a little better, about the same, a little worse, much worse. But the classic method just uses better, the same, or worse.
395 00:18:27,804 --> 00:18:28,470 So here's some-- 396 00:18:28,470 --> 00:18:30,260 AUDIENCE: Excuse me, just a question. 397 00:18:30,260 --> 00:18:32,157 PROFESSOR: Yeah, go ahead. 398 00:18:32,157 --> 00:18:33,740 AUDIENCE: Just about the number three, 399 00:18:33,740 --> 00:18:36,270 about generic scores and user datum-- 400 00:18:36,270 --> 00:18:39,496 by doing so, don't you exclude a solution that could have, 401 00:18:39,496 --> 00:18:44,450 under the generic picture, be better then others? 402 00:18:44,450 --> 00:18:47,219 PROFESSOR: What do you mean by general picture? 403 00:18:47,219 --> 00:18:49,593 AUDIENCE: You should take all the solution that you have, 404 00:18:49,593 --> 00:18:53,280 and at the end, one performs slightly better than the other, 405 00:18:53,280 --> 00:18:56,820 but you couldn't spot it by using the datum? 406 00:18:56,820 --> 00:18:57,540 PROFESSOR: Yeah. 407 00:18:57,540 --> 00:19:00,100 So it's a very good question. 408 00:19:00,100 --> 00:19:02,550 And so this is a question of resolution right? 409 00:19:02,550 --> 00:19:05,150 How good is your resolution? 410 00:19:05,150 --> 00:19:07,380 The main purpose of the Pugh matrix 411 00:19:07,380 --> 00:19:11,160 is not for making the final choice right away. 412 00:19:11,160 --> 00:19:12,930 The main purpose of the Pugh matrix 413 00:19:12,930 --> 00:19:17,750 is actually to eliminate concepts that are-- 414 00:19:17,750 --> 00:19:23,130 you can tell already that they're not competitive. 415 00:19:23,130 --> 00:19:26,640 And you don't typically do Pugh matrix one time 416 00:19:26,640 --> 00:19:28,530 and then say that's the best one. 417 00:19:28,530 --> 00:19:31,106 You use it, essentially, to eliminate 418 00:19:31,106 --> 00:19:32,996 purely inferior solutions. 419 00:19:32,996 --> 00:19:33,870 Does that make sense? 420 00:19:37,550 --> 00:19:42,500 AUDIENCE: I'm really sorry but the video cut just 421 00:19:42,500 --> 00:19:44,640 during your explanation. 
422 00:19:44,640 --> 00:19:48,060 PROFESSOR: OK, so let me repeat this. 423 00:19:48,060 --> 00:19:51,930 So the purpose of the Pugh matrix 424 00:19:51,930 --> 00:19:54,750 is not to select the final concept 425 00:19:54,750 --> 00:19:57,120 as a result of doing the Pugh matrix, 426 00:19:57,120 --> 00:20:00,510 but to identify inferior concepts 427 00:20:00,510 --> 00:20:02,100 to eliminate them from the set. 428 00:20:06,270 --> 00:20:07,790 OK? 429 00:20:07,790 --> 00:20:08,626 AUDIENCE: OK. 430 00:20:08,626 --> 00:20:09,500 PROFESSOR: All right. 431 00:20:09,500 --> 00:20:13,460 So here's a kind of generic example 432 00:20:13,460 --> 00:20:15,040 to explain how this works. 433 00:20:15,040 --> 00:20:20,090 So here's our Pugh matrix, our evaluation matrix. 434 00:20:20,090 --> 00:20:22,820 On the rows, we have our criteria. 435 00:20:22,820 --> 00:20:30,680 It could be cost, performance, resilience, effectiveness, 436 00:20:30,680 --> 00:20:33,890 and so those are called A, B, C, D, E, and F. 437 00:20:33,890 --> 00:20:36,920 So we have six criteria. 438 00:20:36,920 --> 00:20:38,030 And then the column-- 439 00:20:38,030 --> 00:20:40,260 each column represents a concept, 440 00:20:40,260 --> 00:20:42,380 a design, an architecture. 441 00:20:42,380 --> 00:20:44,090 And so you know we're designing-- 442 00:20:44,090 --> 00:20:46,640 I guess we're designing beams here, or levers, 443 00:20:46,640 --> 00:20:48,210 or something like that. 444 00:20:48,210 --> 00:20:49,790 So this is a solid cylinder. 445 00:20:49,790 --> 00:20:52,220 This is a cylinder that's hollow. 446 00:20:52,220 --> 00:20:54,500 This is a square, it's cross-sectioned. 447 00:20:54,500 --> 00:20:56,646 This is a triangular cross-section. 448 00:20:56,646 --> 00:20:57,770 You get the picture, right? 449 00:20:57,770 --> 00:21:01,010 There they're qualitatively different. 450 00:21:01,010 --> 00:21:05,360 And then you note which one is our is our reference here? 
451 00:21:05,360 --> 00:21:08,370 Which concept? 452 00:21:08,370 --> 00:21:09,510 Number seven, right? 453 00:21:09,510 --> 00:21:16,140 If you look at seven, it's this sort of shallow triangle. 454 00:21:16,140 --> 00:21:17,520 Seven, it says datum. 455 00:21:17,520 --> 00:21:19,040 You see that? 456 00:21:19,040 --> 00:21:21,140 Which means that this is our baseline, 457 00:21:21,140 --> 00:21:23,180 our reference, our datum. 458 00:21:23,180 --> 00:21:27,930 And then what you do is, as you fill this in-- 459 00:21:27,930 --> 00:21:29,360 and you can do it two ways. 460 00:21:29,360 --> 00:21:35,210 You can do-- pick, say, concept one, the solid cylinder, 461 00:21:35,210 --> 00:21:39,230 and then compare it against seven for all criteria. 462 00:21:39,230 --> 00:21:45,650 So A, so concept one is better than concept seven in criterion 463 00:21:45,650 --> 00:21:47,150 A. So it gets a plus. 464 00:21:47,150 --> 00:21:49,820 It's also better in B. It gets a plus. 465 00:21:49,820 --> 00:21:55,190 It's worse in C and D, so this is a minus compared to seven, 466 00:21:55,190 --> 00:21:57,040 and so forth. 467 00:21:57,040 --> 00:21:58,850 So the two ways of filling it in are 468 00:21:58,850 --> 00:22:01,820 you do concept by concept, column-wise, 469 00:22:01,820 --> 00:22:05,330 or you can do it row-wise. 470 00:22:05,330 --> 00:22:09,570 And there's some arguments in favor of one or the other. 471 00:22:09,570 --> 00:22:12,650 But essentially, you fill in this matrix 472 00:22:12,650 --> 00:22:17,390 by always comparing against number seven. 473 00:22:17,390 --> 00:22:19,117 Alexis, what's your question? 474 00:22:19,117 --> 00:22:20,450 AUDIENCE: Yeah, just a question. 475 00:22:20,450 --> 00:22:22,273 Shouldn't we weight the criteria? 476 00:22:22,273 --> 00:22:26,890 I mean, for example, if criterion A is more important, 477 00:22:26,890 --> 00:22:28,440 shall we put more weight on it? 478 00:22:28,440 --> 00:22:29,610 PROFESSOR: No.
479 00:22:29,610 --> 00:22:30,450 No. 480 00:22:30,450 --> 00:22:33,600 At this point, in the Pugh matrix, 481 00:22:33,600 --> 00:22:35,670 the criteria are unweighted. 482 00:22:35,670 --> 00:22:37,410 This is very important. 483 00:22:37,410 --> 00:22:40,900 You don't say B is twice as important as C. 484 00:22:40,900 --> 00:22:43,830 They're unweighted when you first do this. 485 00:22:43,830 --> 00:22:46,320 Eventually, when you do the final selection, 486 00:22:46,320 --> 00:22:48,120 you will probably weight them. 487 00:22:48,120 --> 00:22:50,710 But when you first do this, there's no weighting. 488 00:22:50,710 --> 00:22:52,320 They're equal. 489 00:22:52,320 --> 00:22:52,930 OK? 490 00:22:52,930 --> 00:22:53,430 Yes? 491 00:22:53,430 --> 00:22:54,395 Go ahead. 492 00:22:54,395 --> 00:22:56,520 AUDIENCE: Are there any rules in picking your datum 493 00:22:56,520 --> 00:22:58,230 or do you just pick arbitrarily? 494 00:22:58,230 --> 00:23:01,740 PROFESSOR: So I'll get into that. 495 00:23:01,740 --> 00:23:07,040 It should be a design that you think is competitive, 496 00:23:07,040 --> 00:23:10,140 that [INAUDIBLE] maybe you know it, 497 00:23:10,140 --> 00:23:12,110 maybe it's an existing sort of system, 498 00:23:12,110 --> 00:23:16,614 there's data on it, a known quantity. 499 00:23:16,614 --> 00:23:18,030 But if you don't 500 00:23:18,030 --> 00:23:20,020 have any information, if they're all equal, 501 00:23:20,020 --> 00:23:22,320 then it's a random choice. 502 00:23:22,320 --> 00:23:22,980 OK? 503 00:23:22,980 --> 00:23:26,670 So, then once you've filled in this matrix-- 504 00:23:26,670 --> 00:23:29,190 so S here is for same, right? 505 00:23:29,190 --> 00:23:30,510 About the same. 506 00:23:30,510 --> 00:23:33,465 Then you essentially sum all the pluses, 507 00:23:33,465 --> 00:23:36,600 you sum all the minuses, and then 508 00:23:36,600 --> 00:23:39,820 you have the net result, unweighted. 509 00:23:39,820 --> 00:23:40,320 Right?
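[The tally step described here can be sketched in a few lines of Python. This is an editor's illustration, not part of the lecture: the plus/same/minus marks follow the generic example being discussed (concept one better in three criteria and worse in three; concept two better in two, worse in three, same in one), and the data structure and function name are made up.]

```python
from collections import Counter

# One row of marks per concept, judged criterion by criterion against the
# datum (concept 7): '+' means better, 's' about the same, '-' worse.
# Criteria A-F are unweighted at this stage.
matrix = {
    "concept 1": ["+", "+", "-", "-", "+", "-"],  # better in 3, worse in 3
    "concept 2": ["+", "+", "-", "-", "-", "s"],  # better in 2, worse in 3, same in 1
}

def tally(marks):
    """Sum the pluses, sames, and minuses for one concept (no weighting)."""
    counts = Counter(marks)
    return counts["+"], counts["s"], counts["-"]

for name, marks in matrix.items():
    plus, same, minus = tally(marks)
    print(f"{name}: {plus} plus, {same} same, {minus} minus")
```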
510 00:23:40,320 --> 00:23:43,650 So, here you can essentially compare 511 00:23:43,650 --> 00:23:47,820 that, you can see that concept one is better in three 512 00:23:47,820 --> 00:23:51,900 and worse in three than the datum. 513 00:23:51,900 --> 00:23:57,810 You can see that concept two is inferior in three 514 00:23:57,810 --> 00:24:01,180 criteria, better in two, and the same in one. 515 00:24:01,180 --> 00:24:03,180 But we're not actually summing. 516 00:24:03,180 --> 00:24:04,980 We're not actually summing at this point, 517 00:24:04,980 --> 00:24:07,260 because we're not weighting these, OK? 518 00:24:07,260 --> 00:24:11,310 But it gives you a basis of understanding. 519 00:24:11,310 --> 00:24:16,050 OK so let's do a quick partner exercise. 520 00:24:16,050 --> 00:24:18,240 What do you see as the main advantages 521 00:24:18,240 --> 00:24:22,960 and potential disadvantages or pitfalls of this method? 522 00:24:22,960 --> 00:24:27,030 So turn to your partner, discuss this for a few minutes, 523 00:24:27,030 --> 00:24:29,316 and then we'll see what you came up with. 524 00:24:29,316 --> 00:24:31,220 [MICROPHONE NOISE] 525 00:24:31,220 --> 00:24:54,900 AUDIENCE: [INAUDIBLE] 526 00:24:54,900 --> 00:24:55,790 PROFESSOR: All right. 527 00:24:55,790 --> 00:24:56,289 Good. 528 00:24:56,289 --> 00:24:59,380 So let's see advantages. 529 00:24:59,380 --> 00:25:01,020 Let's get a couple here, and then we'll 530 00:25:01,020 --> 00:25:03,570 sort of go back and forth between here and EPFL. 531 00:25:03,570 --> 00:25:05,700 So what are-- what's good about this method? 532 00:25:09,820 --> 00:25:11,099 Yes, go ahead. 533 00:25:11,099 --> 00:25:12,640 AUDIENCE: So, Nate and I were talking 534 00:25:12,640 --> 00:25:16,600 that it is a relatively simple decision making process. 
535 00:25:16,600 --> 00:25:19,380 So instead of just going with some sort of gut feeling, 536 00:25:19,380 --> 00:25:22,830 or not looking at everything, you can get stuff down on paper 537 00:25:22,830 --> 00:25:25,017 and actually see these criteria and how 538 00:25:25,017 --> 00:25:26,100 they relate to each other. 539 00:25:26,100 --> 00:25:26,830 PROFESSOR: Yeah. 540 00:25:26,830 --> 00:25:28,050 So simplicity. 541 00:25:28,050 --> 00:25:28,945 What else? 542 00:25:28,945 --> 00:25:30,430 Go ahead, Sam. 543 00:25:30,430 --> 00:25:34,320 AUDIENCE: It allows you to look at a large array of concepts 544 00:25:34,320 --> 00:25:36,050 very quickly. 545 00:25:36,050 --> 00:25:39,580 PROFESSOR: OK, large samples. 546 00:25:44,310 --> 00:25:45,918 Sorry. 547 00:25:45,918 --> 00:25:46,530 Quick, right? 548 00:25:46,530 --> 00:25:47,620 You said quick. 550 00:25:48,120 --> 00:25:49,140 OK. 551 00:25:49,140 --> 00:25:52,090 What advantages, EPFL, what did you guys come up with? 552 00:25:52,090 --> 00:25:54,375 So simple, it allows us to look at a lot of concepts. 553 00:25:54,375 --> 00:25:55,810 It's quick. 554 00:25:55,810 --> 00:25:56,310 What else? 555 00:25:59,585 --> 00:26:03,310 AUDIENCE: [INAUDIBLE] we got [INAUDIBLE] the same things. 556 00:26:03,310 --> 00:26:08,795 This method is simple, and it's pretty fast and pretty 557 00:26:08,795 --> 00:26:09,670 straightforward, too. 558 00:26:09,670 --> 00:26:16,281 So this is a good way to get rid of some of the concepts 559 00:26:16,281 --> 00:26:16,780 pretty fast. 560 00:26:16,780 --> 00:26:17,410 PROFESSOR: OK. 561 00:26:17,410 --> 00:26:22,150 Anything else that we didn't [INAUDIBLE]? 562 00:26:22,150 --> 00:26:24,460 There's one more that I would like to put on this list. 563 00:26:24,460 --> 00:26:26,524 Veronica, go ahead. 564 00:26:26,524 --> 00:26:27,870 AUDIENCE: [INAUDIBLE] 565 00:26:27,870 --> 00:26:29,510 PROFESSOR: Qualitative.
566 00:26:29,510 --> 00:26:39,140 AUDIENCE: [INAUDIBLE] I kind of like [INAUDIBLE] improvements, 567 00:26:39,140 --> 00:26:44,060 rather than this is better because the requirement is-- 568 00:26:44,060 --> 00:26:46,030 AUDIENCE: Please put on the microphone. 569 00:26:46,030 --> 00:26:49,960 PROFESSOR: I think-- can you just quickly repeat that? 570 00:26:49,960 --> 00:26:51,860 AUDIENCE: Just that it's qualitative. 571 00:26:51,860 --> 00:26:54,380 PROFESSOR: And that you see that as an advantage, actually. 572 00:26:54,380 --> 00:26:55,870 AUDIENCE: I think you can see it both ways. 573 00:26:55,870 --> 00:26:56,700 PROFESSOR: Yeah. 574 00:26:56,700 --> 00:27:00,870 So I think what you implied is that it stimulates 575 00:27:00,870 --> 00:27:02,802 the discussion, right? 576 00:27:02,802 --> 00:27:03,907 Stimulates debate. 577 00:27:03,907 --> 00:27:08,176 AUDIENCE: There may be another criterion on the positive side. 578 00:27:08,176 --> 00:27:09,713 PROFESSOR: Go ahead. 579 00:27:09,713 --> 00:27:14,100 AUDIENCE: And it would be that it's extremely easy to explain 580 00:27:14,100 --> 00:27:18,003 to [INAUDIBLE], to [INAUDIBLE], to whoever your customer is, 581 00:27:18,003 --> 00:27:19,895 because-- and they can relate to it. 582 00:27:19,895 --> 00:27:24,090 If the criteria are mostly physical or scientific, I mean, 583 00:27:24,090 --> 00:27:27,480 larger, stronger, lighter, cheaper, 584 00:27:27,480 --> 00:27:30,750 it's probably very intuitive, yes, as I say. 585 00:27:30,750 --> 00:27:32,330 PROFESSOR: OK, great. 586 00:27:32,330 --> 00:27:34,180 Now, downsides. 587 00:27:34,180 --> 00:27:37,510 What are the downsides of the method? 588 00:27:37,510 --> 00:27:39,390 Go ahead. 589 00:27:39,390 --> 00:27:41,060 Make sure you-- 590 00:27:41,060 --> 00:27:43,690 AUDIENCE: I think it definitely depends 591 00:27:43,690 --> 00:27:46,210 on which you pick as a datum.
592 00:27:46,210 --> 00:27:48,869 Like, if you pick the clearly best one as the datum, 593 00:27:48,869 --> 00:27:50,910 then it's not going to give you much information. 594 00:27:50,910 --> 00:27:53,955 Or if you pick the worst one as the datum, then all of them 595 00:27:53,955 --> 00:27:54,580 will be better. 596 00:27:54,580 --> 00:27:57,670 And you kind of don't really gain much information 597 00:27:57,670 --> 00:27:58,430 from that. 598 00:27:58,430 --> 00:28:01,305 So you have to like iterate with different datums. 599 00:28:01,305 --> 00:28:01,930 PROFESSOR: Yes. 600 00:28:01,930 --> 00:28:03,610 There's actually research on this. 601 00:28:03,610 --> 00:28:06,820 There's some research-- this matrix, this Pugh matrix 602 00:28:06,820 --> 00:28:09,580 method, by the way, has been studied 603 00:28:09,580 --> 00:28:11,300 scientifically quite a bit. 604 00:28:11,300 --> 00:28:12,910 There's quite a literature on it. 605 00:28:12,910 --> 00:28:16,370 And the point that you brought up was studied. 606 00:28:16,370 --> 00:28:18,550 They would basically give the same criteria 607 00:28:18,550 --> 00:28:21,460 and the same alternatives to different groups of people, 608 00:28:21,460 --> 00:28:23,880 but then give them a different datum. 609 00:28:23,880 --> 00:28:27,130 And it's a little tricky-- is the result different 610 00:28:27,130 --> 00:28:29,870 because it's a different group dynamic? 611 00:28:29,870 --> 00:28:33,690 Or is it-- but this has been done statistically, 612 00:28:33,690 --> 00:28:37,300 and the choice of datum has an impact on the outcome. 613 00:28:37,300 --> 00:28:38,320 Very good. 614 00:28:38,320 --> 00:28:40,450 That's actually a pretty subtle point. 615 00:28:40,450 --> 00:28:42,010 That was excellent. 616 00:28:42,010 --> 00:28:44,830 What are some other-- yes, please go ahead. 617 00:28:44,830 --> 00:28:47,140 AUDIENCE: While it's very easy to implement 618 00:28:47,140 --> 00:28:50,510 and anyone can do it, it is very subjective. 
619 00:28:50,510 --> 00:28:53,200 So if you have somebody that doesn't 620 00:28:53,200 --> 00:28:57,820 have a whole lot of experience in a certain area, 621 00:28:57,820 --> 00:29:00,290 they may get a completely different answer 622 00:29:00,290 --> 00:29:02,592 than somebody who is an expert looking 623 00:29:02,592 --> 00:29:03,800 at the same type of criteria. 624 00:29:03,800 --> 00:29:08,070 PROFESSOR: Subjectivity-- which means, basically, repeatability 625 00:29:08,070 --> 00:29:09,340 may be low, right? 626 00:29:09,340 --> 00:29:11,890 You want to have robustness of the method such 627 00:29:11,890 --> 00:29:14,590 that if you repeated this multiple times 628 00:29:14,590 --> 00:29:17,020 or you gave it to different groups of people, 629 00:29:17,020 --> 00:29:20,230 you want to have some confidence that the results will come up 630 00:29:20,230 --> 00:29:21,250 similarly. 631 00:29:21,250 --> 00:29:22,510 Very good. 632 00:29:22,510 --> 00:29:25,900 Let's see at EPFL, any downsides? 633 00:29:25,900 --> 00:29:29,680 So datum dependent, subjective-- 634 00:29:29,680 --> 00:29:31,366 what else? 635 00:29:31,366 --> 00:29:32,740 AUDIENCE: So what we are thinking 636 00:29:32,740 --> 00:29:38,910 is that the criteria depend from group to group. 637 00:29:38,910 --> 00:29:41,490 It doesn't seem to be clearly defined what you need. 638 00:29:41,490 --> 00:29:44,260 Although, there are some that are straightforward 639 00:29:44,260 --> 00:29:47,319 and our other point-- 640 00:29:47,319 --> 00:29:48,444 AUDIENCE: Datum dependence? 641 00:29:48,444 --> 00:29:49,110 That's what we-- 642 00:29:49,110 --> 00:29:52,320 AUDIENCE: Yeah, datum dependence. 643 00:29:52,320 --> 00:29:58,750 If you take a very easy solution, or a very bad solution, 644 00:29:58,750 --> 00:30:02,020 as a reference, then all the others look good.
645 00:30:02,020 --> 00:30:04,440 And finally, if you have a-- 646 00:30:04,440 --> 00:30:07,850 if you judge on the overall concept, 647 00:30:07,850 --> 00:30:10,640 and maybe you discard one or two solutions, 648 00:30:10,640 --> 00:30:14,620 but inside those concepts you have ideas 649 00:30:14,620 --> 00:30:22,120 that you could keep or use, then they might be lost. 650 00:30:22,120 --> 00:30:24,600 PROFESSOR: OK, so loss of sub-concepts. 651 00:30:24,600 --> 00:30:26,760 OK. 652 00:30:26,760 --> 00:30:27,360 Good. 653 00:30:27,360 --> 00:30:28,690 Anything else here? 654 00:30:28,690 --> 00:30:30,207 Mike. 655 00:30:30,207 --> 00:30:32,290 AUDIENCE: We were talking about how it's very easy 656 00:30:32,290 --> 00:30:35,110 to tune your metrics to be-- 657 00:30:35,110 --> 00:30:36,022 like to be biased. 658 00:30:36,022 --> 00:30:36,730 PROFESSOR: I see. 659 00:30:36,730 --> 00:30:38,980 So gaming, right? 660 00:30:38,980 --> 00:30:41,020 Gaming-- gaming the-- 661 00:30:41,020 --> 00:30:45,640 basically producing a matrix to give you the answer you want, 662 00:30:45,640 --> 00:30:47,510 that you already had preordained, sort of. 663 00:30:47,510 --> 00:30:48,010 Right? 664 00:30:48,010 --> 00:30:50,450 That's essentially what you're saying. 666 00:30:50,950 --> 00:30:52,230 OK. 667 00:30:52,230 --> 00:30:52,790 Very good. 668 00:30:52,790 --> 00:30:54,250 So these are all-- 669 00:30:54,250 --> 00:30:54,750 go ahead.
670 00:30:57,306 --> 00:30:59,610 AUDIENCE: Just sort of one last disadvantage 671 00:30:59,610 --> 00:31:02,880 of the method that I think is a bit interesting, 672 00:31:02,880 --> 00:31:06,510 because I think the way of grading everything with 673 00:31:06,510 --> 00:31:09,330 plus or minus can be [INAUDIBLE], because you can 674 00:31:09,330 --> 00:31:13,892 imagine a system, for instance, that is very, very good in one 675 00:31:13,892 --> 00:31:17,550 criterion, and then pretty bad in all the others, 676 00:31:17,550 --> 00:31:19,930 and it could be a good concept that 677 00:31:19,930 --> 00:31:22,410 is disregarded nonetheless. 678 00:31:22,410 --> 00:31:25,050 PROFESSOR: What's your name? 679 00:31:25,050 --> 00:31:25,800 AUDIENCE: Bastian. 680 00:31:25,800 --> 00:31:28,536 PROFESSOR: Pasqual? 681 00:31:28,536 --> 00:31:29,490 AUDIENCE: Sorry? 682 00:31:29,490 --> 00:31:30,281 PROFESSOR: Pasqual. 683 00:31:30,281 --> 00:31:31,365 Is your name Pasqual? 684 00:31:31,365 --> 00:31:32,115 AUDIENCE: Bastian. 685 00:31:32,115 --> 00:31:33,265 PROFESSOR: [INAUDIBLE]? 686 00:31:33,265 --> 00:31:34,690 AUDIENCE: Bastian. 687 00:31:34,690 --> 00:31:36,860 PROFESSOR: OK. 688 00:31:36,860 --> 00:31:38,810 Keep that point until-- we're going 689 00:31:38,810 --> 00:31:42,320 to talk about exactly that point when it comes to non-dominance, 690 00:31:42,320 --> 00:31:43,430 OK? 691 00:31:43,430 --> 00:31:47,060 So please reserve-- this is great. 692 00:31:47,060 --> 00:31:50,420 Just, we'll come back to that in a few minutes, OK? 693 00:31:50,420 --> 00:31:52,790 All right, so this is great. 694 00:31:52,790 --> 00:31:54,710 I think you really got it. 695 00:31:54,710 --> 00:31:58,310 This is exact-- this is a very good list. 696 00:31:58,310 --> 00:32:01,820 So, you may say well, OK this is good for beams 697 00:32:01,820 --> 00:32:03,780 and, you know, very simple things.
698 00:32:03,780 --> 00:32:05,600 So, I want to show you an example 699 00:32:05,600 --> 00:32:09,080 of an application of the Pugh matrix to what I would argue 700 00:32:09,080 --> 00:32:13,670 is a fairly complex architectural decision. 701 00:32:13,670 --> 00:32:16,820 And what I'll talk to you about is a thesis 702 00:32:16,820 --> 00:32:21,900 that was done here at MIT in 2003, so a while ago, 703 00:32:21,900 --> 00:32:24,950 12 years ago, by Brian Smith. 704 00:32:24,950 --> 00:32:30,020 He was an SDM student, and was one of the easiest students 705 00:32:30,020 --> 00:32:31,460 ever to advise. 706 00:32:31,460 --> 00:32:36,230 Very knowledgeable in nuclear propulsion and nuclear power. 707 00:32:36,230 --> 00:32:39,710 He's now the branch chief for nuclear power propulsion 708 00:32:39,710 --> 00:32:43,670 at NASA Glenn Research Center in Ohio. 709 00:32:43,670 --> 00:32:46,790 And so, at the time, NASA had a mission 710 00:32:46,790 --> 00:32:50,420 that was kind of high priority called JIMO-- 711 00:32:50,420 --> 00:32:52,790 the Jupiter Icy Moons Orbiter. 712 00:32:52,790 --> 00:32:55,730 And when you go out to Jupiter, and you 713 00:32:55,730 --> 00:32:57,950 want to not just do a flyby but you actually 714 00:32:57,950 --> 00:32:59,990 want to go to different orbits, you 715 00:32:59,990 --> 00:33:03,240 want to do high power imaging, you need a lot of power. 716 00:33:03,240 --> 00:33:05,050 You're far from the sun. 717 00:33:05,050 --> 00:33:08,870 RTGs only give you 100 watts or so, per RTG. 718 00:33:08,870 --> 00:33:12,777 So nuclear power is pretty much the way to go. 719 00:33:12,777 --> 00:33:14,360 But there's a lot of different choices 720 00:33:14,360 --> 00:33:16,640 of different nuclear reactor architectures 721 00:33:16,640 --> 00:33:18,770 that would potentially-- 722 00:33:18,770 --> 00:33:21,320 and then couple that with the propulsion system.
723 00:33:21,320 --> 00:33:26,830 So the challenge was sifting through the large space 724 00:33:26,830 --> 00:33:29,680 of possibilities for nuclear electric power-- 725 00:33:29,680 --> 00:33:32,620 exploration-class nuclear electric power and propulsion 726 00:33:32,620 --> 00:33:33,700 systems. 727 00:33:33,700 --> 00:33:38,350 And you know, here's the way the study was set up: 728 00:33:38,350 --> 00:33:40,900 the expansion phase, the filtering, and then 729 00:33:40,900 --> 00:33:43,620 the screening phase. 730 00:33:43,620 --> 00:33:47,310 So here's the architectural space the way it was defined. 731 00:33:47,310 --> 00:33:51,480 So nuclear electric power and propulsion is what we're after. 732 00:33:51,480 --> 00:33:55,070 And there's different ways to generate these alternatives. 733 00:33:55,070 --> 00:33:57,690 So the most important is the design vector: 734 00:33:57,690 --> 00:34:00,200 the type of reactor, the operating temperature, 735 00:34:00,200 --> 00:34:02,780 the power conversion scheme, the heat exchange, and then 736 00:34:02,780 --> 00:34:04,340 the fuel type. 737 00:34:04,340 --> 00:34:06,790 And then there were requirements. 738 00:34:06,790 --> 00:34:10,040 So this is the power range, the delivery timeline, 739 00:34:10,040 --> 00:34:13,070 the fact that you should be able to do it in a single launch, 740 00:34:13,070 --> 00:34:15,630 and then the operational lifetime of the system. 741 00:34:15,630 --> 00:34:17,580 So those are the shall requirements. 742 00:34:17,580 --> 00:34:20,989 Every architecture had to meet these requirements, otherwise 743 00:34:20,989 --> 00:34:22,550 it would not be considered. 744 00:34:22,550 --> 00:34:25,174 And then there were some things that were assumed as constants. 745 00:34:25,174 --> 00:34:27,110 They're shown here.
746 00:34:27,110 --> 00:34:28,940 There was a policy vector in terms 747 00:34:28,940 --> 00:34:31,909 of funding profiles, international partnerships, 748 00:34:31,909 --> 00:34:36,260 at what altitude you could insert the system into orbit, 749 00:34:36,260 --> 00:34:38,030 and then influence on future missions. 750 00:34:38,030 --> 00:34:41,730 And the objective vector is the criteria for the decision. 751 00:34:41,730 --> 00:34:45,080 Technology-- TRL is Technology Readiness Level. 752 00:34:45,080 --> 00:34:46,800 How much infrastructure do you need? 753 00:34:46,800 --> 00:34:47,659 Complexity. 754 00:34:47,659 --> 00:34:49,900 Strategic value to the nation. 755 00:34:49,900 --> 00:34:53,100 This is a little fuzzy, but basically what this means is, 756 00:34:53,100 --> 00:34:55,250 could you use it for other applications 757 00:34:55,250 --> 00:34:57,580 than exploring Jupiter? 758 00:34:57,580 --> 00:34:58,220 The schedule. 759 00:34:58,220 --> 00:34:59,431 The launch packaging. 760 00:34:59,431 --> 00:34:59,930 Power. 761 00:34:59,930 --> 00:35:04,710 Specific mass, which is watts per kilogram. 762 00:35:04,710 --> 00:35:06,870 The lifetime. 763 00:35:06,870 --> 00:35:09,680 So it's interesting you have lifetime here and here. 764 00:35:09,680 --> 00:35:13,040 So there's a minimum lifetime, right, and if you satisfy this, 765 00:35:13,040 --> 00:35:15,380 the extra life you get can actually 766 00:35:15,380 --> 00:35:17,750 be used as a decision criterion. 767 00:35:17,750 --> 00:35:19,480 How does it interact with the payload? 768 00:35:19,480 --> 00:35:21,410 Do you need a lot of shielding? 769 00:35:21,410 --> 00:35:22,730 And then adaptability? 770 00:35:22,730 --> 00:35:24,510 Can you adapt it for different missions? 771 00:35:24,510 --> 00:35:25,010 Yeah?
772 00:35:25,010 --> 00:35:30,051 AUDIENCE: In the previous slide, it listed both possible-- 773 00:35:30,051 --> 00:35:33,370 on the previous slide, we had both possible and feasible 774 00:35:33,370 --> 00:35:37,410 on there as different kind of criteria. 775 00:35:37,410 --> 00:35:39,630 What do you have for a difference for those? 776 00:35:39,630 --> 00:35:42,990 PROFESSOR: So essentially, possible is, this 777 00:35:42,990 --> 00:35:45,480 is sort of the combinatorial space. 778 00:35:45,480 --> 00:35:49,940 And then, applying the hard constraints 779 00:35:49,940 --> 00:35:52,640 based on the requirements vector gets you 780 00:35:52,640 --> 00:35:55,300 from possible to feasible. 781 00:35:55,300 --> 00:35:57,920 They satisfy your must-have requirements, 782 00:35:57,920 --> 00:35:59,960 your shall requirements, and then 783 00:35:59,960 --> 00:36:03,700 in screening, you use the objective vector-- 784 00:36:03,700 --> 00:36:10,120 these criteria-- to go from here to here. 785 00:36:10,120 --> 00:36:12,870 So let me show you what this looks like, 786 00:36:12,870 --> 00:36:15,210 and I'll post this thesis, if you're interested. 787 00:36:15,210 --> 00:36:20,110 It's really-- the whole thesis is about doing this process. 788 00:36:20,110 --> 00:36:21,600 So here we have-- 789 00:36:21,600 --> 00:36:25,470 each column represents a different reactor architecture. 790 00:36:25,470 --> 00:36:28,450 So you can see in the legend what these things mean. 791 00:36:28,450 --> 00:36:35,610 So for the reactor type, we have a liquid metal, gas cooled, 792 00:36:35,610 --> 00:36:38,670 or heat pipe type reactor. 793 00:36:38,670 --> 00:36:42,360 These are three types of nuclear fission reactors. 794 00:36:42,360 --> 00:36:45,940 For fuel, we have two types of fuel, if I remember correctly. 795 00:36:45,940 --> 00:36:48,810 We have UO2 or UN.
796 00:36:48,810 --> 00:36:52,080 For the temperature at which the reactor would be run, 797 00:36:52,080 --> 00:36:54,390 we have medium or high. 798 00:36:54,390 --> 00:36:57,870 For the conversion-- this is the thermodynamics. 799 00:36:57,870 --> 00:37:00,450 How do you get the heat out of the reactor? 800 00:37:00,450 --> 00:37:03,960 We have a Brayton or a Rankine cycle, 801 00:37:03,960 --> 00:37:08,070 or thermoelectric conversion, which is basically 802 00:37:08,070 --> 00:37:11,190 directly converting the heat to electricity 803 00:37:11,190 --> 00:37:13,710 without a working fluid. 804 00:37:13,710 --> 00:37:20,820 And then, D is the heat exchange architecture, which 805 00:37:20,820 --> 00:37:23,190 is either direct or indirect. 806 00:37:23,190 --> 00:37:26,100 And so you can see that each of these columns 807 00:37:26,100 --> 00:37:29,490 represents a different kind of architecture, 808 00:37:29,490 --> 00:37:32,940 and they're colored here by-- 809 00:37:32,940 --> 00:37:42,330 the dark is a direct heat pipe with liquid metal reactor, 810 00:37:42,330 --> 00:37:44,040 and so forth. 811 00:37:44,040 --> 00:37:47,640 So that's essentially-- what you're seeing here 812 00:37:47,640 --> 00:37:51,540 is the filtered set that is deemed to be feasible. 813 00:37:51,540 --> 00:37:53,430 Potentially feasible. 814 00:37:53,430 --> 00:37:57,870 And then the actual screening is done based on this. 815 00:37:57,870 --> 00:38:00,790 So this is the actual Pugh matrix here. 816 00:38:00,790 --> 00:38:04,420 So we have the concept combinations, 817 00:38:04,420 --> 00:38:06,040 which are the columns. 818 00:38:06,040 --> 00:38:09,150 The datum here is the one that's shaded in gray.
819 00:38:09,150 --> 00:38:13,350 So it's a liquid metal reactor with a thermal-- 820 00:38:13,350 --> 00:38:16,770 it's a liquid metal thermoelectric reactor 821 00:38:16,770 --> 00:38:21,570 with indirect heat exchange, UN fuel type and a high operating 822 00:38:21,570 --> 00:38:22,950 temperature. 823 00:38:22,950 --> 00:38:25,230 And why was this chosen? 824 00:38:25,230 --> 00:38:30,000 Well if you look here, it says SP100 reference. 825 00:38:30,000 --> 00:38:33,450 So the US actually has launched-- at least, 826 00:38:33,450 --> 00:38:35,760 officially, as far as we know-- 827 00:38:35,760 --> 00:38:39,270 one nuclear reactor into space, and that's the SP100. 828 00:38:39,270 --> 00:38:41,070 You can look it up. 829 00:38:41,070 --> 00:38:43,180 There's a history behind it. 830 00:38:43,180 --> 00:38:46,950 And that's the architecture of the SP100. 831 00:38:46,950 --> 00:38:48,450 So it was launched. 832 00:38:48,450 --> 00:38:49,680 We had data on it. 833 00:38:49,680 --> 00:38:51,510 It's a known quantity. 834 00:38:51,510 --> 00:38:53,580 The Russians, it turns out-- 835 00:38:53,580 --> 00:38:56,730 Russia has launched a lot of nuclear reactors 836 00:38:56,730 --> 00:38:58,550 over the years. 837 00:38:58,550 --> 00:39:01,500 But this is sort of the US baseline, 838 00:39:01,500 --> 00:39:03,640 if you want to call it that. 839 00:39:03,640 --> 00:39:05,880 And then all of these other combinations 840 00:39:05,880 --> 00:39:10,680 are compared against the SP100, the reference reactor. 841 00:39:10,680 --> 00:39:17,020 And so, you can see here that the comparison is drawn again. 842 00:39:17,020 --> 00:39:20,200 So zero means it's about-- 843 00:39:20,200 --> 00:39:23,440 so for TRL, it means it's about equally mature. 844 00:39:23,440 --> 00:39:26,140 A plus means it's actually more mature. 845 00:39:26,140 --> 00:39:30,720 A negative means it's less mature than the SP100. 
846 00:39:30,720 --> 00:39:34,000 And at the end of this, you can actually 847 00:39:34,000 --> 00:39:37,620 look at the total pluses and zeros and minuses, 848 00:39:37,620 --> 00:39:40,260 and then in this case, what Brian did, 849 00:39:40,260 --> 00:39:42,600 he did calculate a net score. 850 00:39:42,600 --> 00:39:44,130 That's an unweighted score. 851 00:39:44,130 --> 00:39:48,510 Basically, if you give one point-- so let's 852 00:39:48,510 --> 00:39:50,020 look at the first one. 853 00:39:50,020 --> 00:39:53,260 So it has four pluses, right? 854 00:39:53,260 --> 00:39:58,240 It has four pluses, and it has five equal to the SP100, 855 00:39:58,240 --> 00:40:00,350 to the datum, and has two minuses. 856 00:40:00,350 --> 00:40:03,340 So it's worse in two, better in four criteria, 857 00:40:03,340 --> 00:40:04,840 and the same in five. 858 00:40:04,840 --> 00:40:07,630 So the net score would be plus two, 859 00:40:07,630 --> 00:40:11,770 because it's better in two criteria, net. 860 00:40:11,770 --> 00:40:13,690 But it's unweighted. 861 00:40:13,690 --> 00:40:17,980 And so in order to figure out what-- 862 00:40:17,980 --> 00:40:21,880 based on this net score, so a zero net score 863 00:40:21,880 --> 00:40:24,600 would say it's about equal to the datum, 864 00:40:24,600 --> 00:40:27,440 but it may be in other, different criteria. 865 00:40:27,440 --> 00:40:30,620 But essentially it's about the same. 866 00:40:30,620 --> 00:40:35,080 If it's a plus two, then it is potentially better 867 00:40:35,080 --> 00:40:36,310 than the SP100. 868 00:40:36,310 --> 00:40:38,230 But you don't know that 100%, because you 869 00:40:38,230 --> 00:40:39,640 haven't weighted the criteria. 870 00:40:39,640 --> 00:40:42,010 It's really used for screening. 871 00:40:42,010 --> 00:40:47,950 So the ones that are circled are better, essentially, 872 00:40:47,950 --> 00:40:49,930 than the datum.
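[As a concrete check of that net-score arithmetic, here is a small Python sketch. This is an editor's illustration, not from the thesis: the first row reproduces the four-plus/five-same/two-minus tally against the SP100 datum, the second mirrors the weakest architecture discussed (same in six criteria, worse in five), and the architecture names and the screening threshold are invented for the example.]

```python
def net_score(marks):
    """Unweighted net score: number of '+' marks minus number of '-' marks."""
    return marks.count("+") - marks.count("-")

# Four pluses, five sames, two minuses against the SP100 datum -> net +2.
arch_a = ["+"] * 4 + ["s"] * 5 + ["-"] * 2
# Same in six criteria, worse in five -> net -5, a clear candidate to eliminate.
arch_b = ["s"] * 6 + ["-"] * 5

# Screening keeps concepts that at least match the datum (whose own net
# score is zero, since the datum is compared against itself).
candidates = {"arch_a": arch_a, "arch_b": arch_b}
survivors = [name for name, marks in candidates.items() if net_score(marks) >= 0]
print(net_score(arch_a), net_score(arch_b), survivors)
```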
873 00:40:49,930 --> 00:40:51,710 And the ones that are-- 874 00:40:51,710 --> 00:40:53,410 this is-- I'm sorry, the rank. 875 00:40:53,410 --> 00:40:54,970 This is the rank among the set. 876 00:40:54,970 --> 00:40:59,290 So if you have a one here, you're rank one. 877 00:40:59,290 --> 00:41:03,490 And there's two architectures, the first two are rank one. 878 00:41:03,490 --> 00:41:05,050 And then we have this architecture, 879 00:41:05,050 --> 00:41:06,370 which is ranked two. 880 00:41:06,370 --> 00:41:10,300 The datum itself, plus two other architectures, are rank three. 881 00:41:10,300 --> 00:41:11,800 And then you see that there's some-- 882 00:41:11,800 --> 00:41:14,260 like, for example, this architecture here 883 00:41:14,260 --> 00:41:15,580 is rank eight, right? 884 00:41:15,580 --> 00:41:17,350 So it-- why is that? 885 00:41:17,350 --> 00:41:20,350 Because it's not better in any criterion than the datum. 886 00:41:20,350 --> 00:41:21,910 It's the same in six. 887 00:41:21,910 --> 00:41:23,840 And it's worse in five, right? 888 00:41:23,840 --> 00:41:26,980 So that gives you an indication that this particular concept 889 00:41:26,980 --> 00:41:32,570 probably is not very attractive, and you would eliminate that. 890 00:41:32,570 --> 00:41:33,640 OK? 891 00:41:33,640 --> 00:41:40,090 So that's essentially an application of the Pugh matrix 892 00:41:40,090 --> 00:41:45,170 to a pretty real-world, complex problem. 893 00:41:45,170 --> 00:41:49,150 So what the Pugh matrix is for is essentially structured-- 894 00:41:49,150 --> 00:41:52,060 structuring and representing the evaluation procedure. 895 00:41:52,060 --> 00:41:54,130 It serves as a common visual.
896 00:41:54,130 --> 00:41:56,530 It provides some amount of discipline, 897 00:41:56,530 --> 00:41:59,470 and helps break down self-sealing behavior, 898 00:41:59,470 --> 00:42:00,520 meaning-- 899 00:42:00,520 --> 00:42:03,100 what that means is people defending their concepts 900 00:42:03,100 --> 00:42:05,500 without sort of seeing the bigger picture. 901 00:42:05,500 --> 00:42:07,130 Encourages teamwork. 902 00:42:07,130 --> 00:42:09,370 It helps you eliminate weaker ideas, 903 00:42:09,370 --> 00:42:11,950 retain a set of stronger concepts. 904 00:42:11,950 --> 00:42:14,000 And then this is interesting-- divergence. 905 00:42:14,000 --> 00:42:17,260 It helps to identify opportunities for combination. 906 00:42:17,260 --> 00:42:20,300 There's also a way to use this Pugh matrix 907 00:42:20,300 --> 00:42:22,301 method multiple times. 908 00:42:22,301 --> 00:42:23,800 And the way you would do this is you 909 00:42:23,800 --> 00:42:27,880 would say, OK so these are the stronger concepts here, right? 910 00:42:27,880 --> 00:42:30,670 The ones that do equally well or maybe even 911 00:42:30,670 --> 00:42:32,550 a little better than the datum. 912 00:42:32,550 --> 00:42:35,200 What are the strengths and weaknesses of each concept 913 00:42:35,200 --> 00:42:37,210 and can we hybridize them? 914 00:42:37,210 --> 00:42:40,330 Can we create some hybrid concepts? 915 00:42:40,330 --> 00:42:43,930 And so you eliminate the weaker ones. 916 00:42:43,930 --> 00:42:47,770 The surviving concepts are kept, and then maybe hybridized 917 00:42:47,770 --> 00:42:48,700 with each other. 918 00:42:48,700 --> 00:42:52,000 So you expand and create more concepts again. 919 00:42:52,000 --> 00:42:53,560 Maybe a little bit more detailed. 920 00:42:53,560 --> 00:42:56,020 And then you apply the Pugh matrix again.
921 00:42:56,020 --> 00:43:00,220 So it's iterative application of the Pugh matrix, where 922 00:43:00,220 --> 00:43:04,570 you eliminate, you keep strong concepts, you hybridize them, 923 00:43:04,570 --> 00:43:06,790 and then you repeat it two or three times. 924 00:43:06,790 --> 00:43:08,800 And there's actually research on this, as well. 925 00:43:08,800 --> 00:43:11,140 Professor Dan Frey, who is here at MIT 926 00:43:11,140 --> 00:43:14,260 in mechanical engineering, has done research on that 927 00:43:14,260 --> 00:43:18,050 and shown that that can actually lead to very good outcomes. 928 00:43:18,050 --> 00:43:19,870 So this sort of gradual-- 929 00:43:19,870 --> 00:43:23,740 so then, you don't have just one expansion and contraction 930 00:43:23,740 --> 00:43:28,060 of your concept space, but it sort of goes multiple times. 931 00:43:28,060 --> 00:43:29,960 But eventually it does converge. 932 00:43:29,960 --> 00:43:33,700 So this is a more refined application of the method. 933 00:43:33,700 --> 00:43:38,650 What Pugh matrix is not good for is automatic decision-making, 934 00:43:38,650 --> 00:43:40,390 right? 935 00:43:40,390 --> 00:43:42,310 Completely controlling the process. 936 00:43:42,310 --> 00:43:47,810 So, you know, the idea of automating the Pugh matrix-- 937 00:43:47,810 --> 00:43:49,660 it should not be automatic. 938 00:43:49,660 --> 00:43:51,460 And it's not really done-- it's not 939 00:43:51,460 --> 00:43:55,100 really good for trade studies. 940 00:43:55,100 --> 00:43:58,460 Challenges-- [LAUGHS] so here's some quotes. 941 00:43:58,460 --> 00:44:00,350 "People who have a lot of experience 942 00:44:00,350 --> 00:44:02,630 exhibit an impatience. 943 00:44:02,630 --> 00:44:04,880 Get on with it. 944 00:44:04,880 --> 00:44:06,800 This procedure holds us back." 945 00:44:06,800 --> 00:44:07,430 Right? 
946 00:44:07,430 --> 00:44:10,160 "Strong-willed individuals who have a lot of experience 947 00:44:10,160 --> 00:44:13,310 and whose initial concepts have not emerged in the final 948 00:44:13,310 --> 00:44:18,260 selection commence a defense based on emotion, experience, 949 00:44:18,260 --> 00:44:19,120 and bluster." 950 00:44:19,120 --> 00:44:19,620 Right? 951 00:44:19,620 --> 00:44:23,360 So there's this social dynamics that gets unfolded here. 952 00:44:23,360 --> 00:44:26,540 So, therefore, it is recommended to do Pugh matrix 953 00:44:26,540 --> 00:44:27,980 with a facilitator. 954 00:44:27,980 --> 00:44:31,820 Somebody who-- similar to brainstorming, right, 955 00:44:31,820 --> 00:44:33,720 where you have a facilitator. 956 00:44:33,720 --> 00:44:36,230 So you have a facilitator for Pugh matrix. 957 00:44:36,230 --> 00:44:38,300 So somebody who controls the flow and pace 958 00:44:38,300 --> 00:44:40,910 of the session, records the results, 959 00:44:40,910 --> 00:44:44,120 tries to maintain some discipline. 960 00:44:44,120 --> 00:44:45,890 Always compare against the datum. 961 00:44:45,890 --> 00:44:50,000 I've seen Pugh matrix sessions where you start with the datum, 962 00:44:50,000 --> 00:44:51,770 and then by the third or fourth concept, 963 00:44:51,770 --> 00:44:54,470 people are starting to do pairwise comparisons, 964 00:44:54,470 --> 00:44:56,630 and the datum is sort of lost. 965 00:44:56,630 --> 00:44:59,210 That should not happen, right? 966 00:44:59,210 --> 00:45:02,450 Preventing tangents, but encourage clarification. 967 00:45:02,450 --> 00:45:05,000 What do we really mean by criteria? 968 00:45:05,000 --> 00:45:08,180 Clarification of the concepts, and then opportunities 969 00:45:08,180 --> 00:45:11,760 for divergence in hybrids. 970 00:45:11,760 --> 00:45:14,319 So the critique of the method is-- 971 00:45:14,319 --> 00:45:16,610 I think a lot of these things you've already mentioned. 
972 00:45:16,610 --> 00:45:19,220 The ranking depends on the choice of datum. 973 00:45:19,220 --> 00:45:20,240 The weighting. 974 00:45:20,240 --> 00:45:21,860 You know, there's no real weighting 975 00:45:21,860 --> 00:45:24,390 in the classic method. 976 00:45:24,390 --> 00:45:26,931 However, you can implement a weighted version of the Pugh 977 00:45:26,931 --> 00:45:29,180 method, but then you have to have the whole discussion 978 00:45:29,180 --> 00:45:33,350 about the weightings, a priori. 979 00:45:33,350 --> 00:45:36,500 Next, I'm going to talk about multi-attribute utility. 980 00:45:36,500 --> 00:45:39,760 And so Pugh matrix and multi-attribute utility 981 00:45:39,760 --> 00:45:42,010 could give you a different rank order of alternatives. 982 00:45:42,010 --> 00:45:45,110 So how do you reconcile that? 983 00:45:45,110 --> 00:45:48,080 The most important criteria may be intangible and missing 984 00:45:48,080 --> 00:45:49,650 from the list. 985 00:45:49,650 --> 00:45:52,670 And my personal opinion on this is that the Pugh matrix 986 00:45:52,670 --> 00:45:54,840 is useful and simple to use. 987 00:45:54,840 --> 00:45:58,700 It stimulates discussion about the criteria, the alternatives, 988 00:45:58,700 --> 00:46:02,480 but it shouldn't be your only means of concept selection. 989 00:46:02,480 --> 00:46:04,250 But it's usually one of the first things 990 00:46:04,250 --> 00:46:07,860 you should do because you learn a lot from doing it. 991 00:46:07,860 --> 00:46:08,720 OK? 992 00:46:08,720 --> 00:46:12,920 Any questions about the Pugh matrix before I move on? 993 00:46:12,920 --> 00:46:14,690 Yes, Sam? 994 00:46:14,690 --> 00:46:16,130 AUDIENCE: Should it always be done 995 00:46:16,130 --> 00:46:20,450 with a group that's working together and discussing 996 00:46:20,450 --> 00:46:21,110 the concepts? 997 00:46:21,110 --> 00:46:24,410 Or could you also have people individually 998 00:46:24,410 --> 00:46:27,170 filling them out and then combining their results? 
999 00:46:27,170 --> 00:46:30,650 PROFESSOR: So yeah, usually you do it as a group 1000 00:46:30,650 --> 00:46:33,410 because in the discussion with the group, 1001 00:46:33,410 --> 00:46:36,140 there's a lot of clarification that happens. 1002 00:46:36,140 --> 00:46:40,550 Now if you do it separately, and you firewall people 1003 00:46:40,550 --> 00:46:43,550 from each other and have people do this individually and then 1004 00:46:43,550 --> 00:46:48,140 try to combine the results later, that's actually called-- 1005 00:46:48,140 --> 00:46:52,606 that's closer to a method called the Delphi method. 1006 00:46:52,606 --> 00:46:54,230 Which I'm not going to talk about here. 1007 00:46:54,230 --> 00:46:56,490 But the Delphi method means you're 1008 00:46:56,490 --> 00:47:01,070 asking these kinds of questions to experts. 1009 00:47:01,070 --> 00:47:04,520 They give you the answers, and then you anonymize the answers 1010 00:47:04,520 --> 00:47:08,210 and reflect the combined answers back to the group. 1011 00:47:08,210 --> 00:47:11,720 And then you eliminate the social dynamics, 1012 00:47:11,720 --> 00:47:15,590 and you can see-- was my assessment an outlier, 1013 00:47:15,590 --> 00:47:17,690 am I close to the center of the group? 1014 00:47:17,690 --> 00:47:20,550 So the Delphi method has a lot of benefits, 1015 00:47:20,550 --> 00:47:22,130 but it's not usually-- 1016 00:47:22,130 --> 00:47:24,680 you don't have the social dynamic 1017 00:47:24,680 --> 00:47:28,370 in the same way, which is sort of essential to make this work. 1018 00:47:28,370 --> 00:47:29,900 Good question though. 1019 00:47:29,900 --> 00:47:33,510 Any questions at EPFL about Pugh matrix before I move on? 1020 00:47:36,688 --> 00:47:37,823 AUDIENCE: No, it's clear. 1021 00:47:37,823 --> 00:47:38,470 PROFESSOR: OK. 1022 00:47:38,470 --> 00:47:39,970 Go ahead. 
1023 00:47:39,970 --> 00:47:44,000 AUDIENCE: It is useful to almost have an independent group do 1024 00:47:44,000 --> 00:47:46,880 this, as opposed to everyone generates their ideas 1025 00:47:46,880 --> 00:47:52,050 and then everyone's trying to fill out the matrix so 1026 00:47:52,050 --> 00:47:53,510 that their idea wins. 1027 00:47:53,510 --> 00:47:56,780 PROFESSOR: Usually not, because if you give it 1028 00:47:56,780 --> 00:47:59,120 to an independent group, they'd know 1029 00:47:59,120 --> 00:48:00,410 much less about the problem. 1030 00:48:00,410 --> 00:48:04,190 They probably weren't involved in generating the alternatives, 1031 00:48:04,190 --> 00:48:07,100 they weren't involved in selecting the criteria. 1032 00:48:07,100 --> 00:48:09,380 They're probably not as knowledgeable. 1033 00:48:09,380 --> 00:48:13,850 And so there is an advantage of doing it within the team. 1034 00:48:13,850 --> 00:48:16,370 But then, you know, before you actually go and pull 1035 00:48:16,370 --> 00:48:18,440 the trigger on the final-- that's the PDR. 1036 00:48:18,440 --> 00:48:21,620 You get vetted by independent people. 1037 00:48:21,620 --> 00:48:24,300 That's what happens at the PDR. 1038 00:48:24,300 --> 00:48:25,325 OK. 1039 00:48:25,325 --> 00:48:29,750 AUDIENCE: One point here is that you can invite external people 1040 00:48:29,750 --> 00:48:31,130 to join your discussion. 1041 00:48:31,130 --> 00:48:32,980 So you have your design team. 1042 00:48:32,980 --> 00:48:35,508 And it's very good to have the experts to come, 1043 00:48:35,508 --> 00:48:38,300 sit in the meeting, and then give their opinion, 1044 00:48:38,300 --> 00:48:40,264 and with the Pugh matrix, as it's 1045 00:48:40,264 --> 00:48:41,680 pretty straightforward and simple, 1046 00:48:41,680 --> 00:48:43,820 you probably shouldn't then bias the result, 1047 00:48:43,820 --> 00:48:45,487 but you could probably enrich it. 1048 00:48:45,487 --> 00:48:46,820 PROFESSOR: That's a great point. 
1049 00:48:46,820 --> 00:48:48,560 I agree with that, [INAUDIBLE]. 1050 00:48:48,560 --> 00:48:49,160 Absolutely. 1051 00:48:49,160 --> 00:48:50,360 Yeah. 1052 00:48:50,360 --> 00:48:50,960 OK. 1053 00:48:50,960 --> 00:48:54,860 So let's switch gears to utility theory 1054 00:48:54,860 --> 00:48:57,080 or multi-attribute utility theory. 1055 00:48:57,080 --> 00:48:58,710 So what's that all about? 1056 00:48:58,710 --> 00:49:02,870 So like I said, utility is a very deeply rooted concept 1057 00:49:02,870 --> 00:49:04,610 in economics. 1058 00:49:04,610 --> 00:49:07,190 And so it's defined as, "Utility is 1059 00:49:07,190 --> 00:49:11,060 a measure of relative happiness, or satisfaction, 1060 00:49:11,060 --> 00:49:15,140 or gratification gained by consuming different bundles 1061 00:49:15,140 --> 00:49:16,470 of goods and services." 1062 00:49:16,470 --> 00:49:18,470 This is sort of economic-- very-- 1063 00:49:18,470 --> 00:49:21,560 this is how economists talk. 1064 00:49:21,560 --> 00:49:25,040 So it's-- the idea is you want to choose a concept 1065 00:49:25,040 --> 00:49:28,220 or alternative that will maximize your happiness, 1066 00:49:28,220 --> 00:49:31,530 your satisfaction, your gratification. 1067 00:49:31,530 --> 00:49:36,620 And whenever you buy something, when you go to Anna's Taqueria, 1068 00:49:36,620 --> 00:49:39,500 or wherever you guys have lunch, or at EPFL you 1069 00:49:39,500 --> 00:49:43,430 go to the Rolex Learning Center, and you pick from the menu, 1070 00:49:43,430 --> 00:49:46,160 you're actually doing just that, right? 1071 00:49:46,160 --> 00:49:49,400 You have-- how much money do I have in my pocket? 1072 00:49:49,400 --> 00:49:52,370 How hungry am I? 1073 00:49:52,370 --> 00:49:54,020 Am I a vegetarian or not? 1074 00:49:54,020 --> 00:49:57,440 That would filter out some menu items right away, right? 1075 00:49:57,440 --> 00:49:59,720 And you pick, every day. 1076 00:49:59,720 --> 00:50:01,340 You pick your lunch. 
1077 00:50:01,340 --> 00:50:03,249 Well, you're doing exactly this. 1078 00:50:03,249 --> 00:50:05,040 You don't really know that you're doing it, 1079 00:50:05,040 --> 00:50:09,020 but in your mind, as you're picking from the menu, 1080 00:50:09,020 --> 00:50:10,880 and then making your choice, you're 1081 00:50:10,880 --> 00:50:14,540 maximizing your utility at that moment. 1082 00:50:14,540 --> 00:50:16,410 So we do this every day. 1083 00:50:16,410 --> 00:50:20,000 Here, in this class, we talk about designing 1084 00:50:20,000 --> 00:50:21,920 complex systems. 1085 00:50:21,920 --> 00:50:25,160 So we need to do it a little bit more deliberately, 1086 00:50:25,160 --> 00:50:28,110 but this is not just some abstract thing. 1087 00:50:28,110 --> 00:50:30,750 We do this every day. 1088 00:50:30,750 --> 00:50:34,700 And so the idea is, essentially, we have this Consumption Set 1089 00:50:34,700 --> 00:50:37,040 X, which are our alternatives-- mutually 1090 00:50:37,040 --> 00:50:39,080 exclusive alternatives-- and then we 1091 00:50:39,080 --> 00:50:41,870 map that to the real scale. 1092 00:50:41,870 --> 00:50:44,120 We rank each member of the Consumption Set. 1093 00:50:44,120 --> 00:50:48,110 So we map the alternatives on this utility scale. 1094 00:50:48,110 --> 00:50:52,130 In this case, the cardinal scale between zero and one. 1095 00:50:52,130 --> 00:50:54,690 So, the way this is done, in order 1096 00:50:54,690 --> 00:51:00,060 to do the mapping from the criteria to utility, 1097 00:51:00,060 --> 00:51:02,190 you need these mapping functions. 1098 00:51:02,190 --> 00:51:04,350 And we call these utility functions. 1099 00:51:04,350 --> 00:51:06,180 They're called utility functions. 1100 00:51:06,180 --> 00:51:09,300 So basically we have-- 1101 00:51:09,300 --> 00:51:12,390 I'm going to use J here for-- 1102 00:51:12,390 --> 00:51:17,040 J means, essentially, your attribute, your objective. 1103 00:51:17,040 --> 00:51:19,290 And then U is your utility. 
1104 00:51:19,290 --> 00:51:23,040 And so there are different shapes of utility functions. 1105 00:51:23,040 --> 00:51:25,750 And different scholars call them differently. 1106 00:51:25,750 --> 00:51:27,180 So this is Cook. 1107 00:51:27,180 --> 00:51:30,080 He says, this is smaller is better, or larger is better. 1108 00:51:30,080 --> 00:51:32,940 And you see they're essentially monotonically increasing 1109 00:51:32,940 --> 00:51:34,770 or decreasing curves. 1110 00:51:34,770 --> 00:51:39,450 Messac, another scholar in multi-objective design, 1111 00:51:39,450 --> 00:51:42,570 calls this class 1S, class 2S. 1112 00:51:42,570 --> 00:51:47,070 So what would be an example of a smaller 1113 00:51:47,070 --> 00:51:49,024 is better utility function? 1114 00:51:52,260 --> 00:51:54,300 What would be an example of that? 1115 00:51:54,300 --> 00:51:54,800 Go ahead. 1116 00:51:58,620 --> 00:52:01,260 AUDIENCE: An example would be the weight of a launch vehicle. 1117 00:52:01,260 --> 00:52:03,026 It's decreasing. 1118 00:52:03,026 --> 00:52:04,150 PROFESSOR: The launch mass? 1119 00:52:04,150 --> 00:52:04,560 AUDIENCE: Yes. 1120 00:52:04,560 --> 00:52:04,920 PROFESSOR: OK. 1121 00:52:04,920 --> 00:52:05,510 Launch mass. 1122 00:52:05,510 --> 00:52:06,720 OK. 1123 00:52:06,720 --> 00:52:07,980 What about EPFL? 1124 00:52:07,980 --> 00:52:09,100 Smaller is better. 1125 00:52:13,820 --> 00:52:14,780 AUDIENCE: Cost. 1126 00:52:14,780 --> 00:52:16,120 PROFESSOR: Cost, right. 1127 00:52:16,120 --> 00:52:19,250 So, larger is better? 1128 00:52:27,160 --> 00:52:27,910 AUDIENCE: Revenue. 1129 00:52:27,910 --> 00:52:29,530 PROFESSOR: Revenue. 1130 00:52:29,530 --> 00:52:36,270 Maybe range, endurance, reliability, right? 1131 00:52:36,270 --> 00:52:40,020 Then there's-- the next one is strictly concave or strictly 1132 00:52:40,020 --> 00:52:42,660 convex, which is nominal is better. 1133 00:52:42,660 --> 00:52:46,640 So the idea there is there's a sweet spot, right? 
1134 00:52:46,640 --> 00:52:49,350 You want the performance or the attribute 1135 00:52:49,350 --> 00:52:53,160 to be a pretty specific value, and if you 1136 00:52:53,160 --> 00:52:58,200 deviate from that on the up or down side then 1137 00:52:58,200 --> 00:53:00,180 the utility decreases. 1138 00:53:00,180 --> 00:53:03,660 So what would be an example of that? 1139 00:53:03,660 --> 00:53:05,310 AUDIENCE: Size of a meal. 1140 00:53:05,310 --> 00:53:06,990 PROFESSOR: Size of a meal, OK? 1141 00:53:06,990 --> 00:53:10,410 Well, some restaurants may beg to differ, right? 1142 00:53:10,410 --> 00:53:11,178 All you can eat. 1143 00:53:11,178 --> 00:53:11,677 [LAUGHS] 1144 00:53:11,677 --> 00:53:12,591 AUDIENCE: [INAUDIBLE] what their objective is. 1145 00:53:12,591 --> 00:53:13,410 PROFESSOR: Yes. 1146 00:53:13,410 --> 00:53:14,310 All right. 1147 00:53:14,310 --> 00:53:14,910 Very good. 1148 00:53:14,910 --> 00:53:15,409 Yes, Sam? 1149 00:53:15,409 --> 00:53:17,229 AUDIENCE: [INAUDIBLE] has to interface 1150 00:53:17,229 --> 00:53:20,043 with something else [INAUDIBLE] you're not 1151 00:53:20,043 --> 00:53:22,860 [INAUDIBLE] it [INAUDIBLE]. 1152 00:53:22,860 --> 00:53:25,740 PROFESSOR: Yup, so if there is a very specific interface 1153 00:53:25,740 --> 00:53:26,360 condition. 1154 00:53:26,360 --> 00:53:27,090 Yeah. 1155 00:53:27,090 --> 00:53:29,240 OK, go ahead at EPFL. 1156 00:53:29,240 --> 00:53:32,640 Nominal is better. 1157 00:53:32,640 --> 00:53:34,790 AUDIENCE: Like, for example, ambient temperature. 1158 00:53:34,790 --> 00:53:35,456 PROFESSOR: Yeah. 1159 00:53:35,456 --> 00:53:37,920 So temperature-- ambient temperature, right? 1160 00:53:37,920 --> 00:53:41,580 Humans are pretty-- we have a fairly narrow range where 1161 00:53:41,580 --> 00:53:43,110 we say, this is good. 1162 00:53:43,110 --> 00:53:45,150 Now we can put on a sweater or-- 1163 00:53:45,150 --> 00:53:47,500 but it's a fairly narrow range. 
1164 00:53:47,500 --> 00:53:49,650 So, great example. 1165 00:53:49,650 --> 00:53:52,770 Then the next one is range is better, 1166 00:53:52,770 --> 00:53:55,090 which is concave or convex. 1167 00:53:55,090 --> 00:53:59,670 So this is the idea that as long as you're within this interval, 1168 00:53:59,670 --> 00:54:03,260 within the interval itself, you're sort of indifferent. 1169 00:54:03,260 --> 00:54:03,760 Right? 1170 00:54:03,760 --> 00:54:06,160 But then when you drop outside the interval, 1171 00:54:06,160 --> 00:54:09,280 then utility decreases or increases. 1172 00:54:09,280 --> 00:54:13,270 And then the last one is very exotic-- non-monotonic utility 1173 00:54:13,270 --> 00:54:15,520 functions that have multiple peaks. 1174 00:54:15,520 --> 00:54:19,340 They exist in theory, but in practice, you almost never 1175 00:54:19,340 --> 00:54:19,840 see them. 1176 00:54:19,840 --> 00:54:23,290 I can't give you a good example of a multi-modal one. 1177 00:54:23,290 --> 00:54:26,754 But in theory they do exist. 1178 00:54:26,754 --> 00:54:29,214 AUDIENCE: Well, there is one, and it's clearly 1179 00:54:29,214 --> 00:54:30,805 the landing sites on a planet. 1180 00:54:30,805 --> 00:54:32,200 PROFESSOR: OK. 1181 00:54:32,200 --> 00:54:35,260 AUDIENCE: Because you have the one injection trajectory, 1182 00:54:35,260 --> 00:54:38,710 and then you have the primary sites of the primary ellipse, 1183 00:54:38,710 --> 00:54:40,480 then the secondary. 1184 00:54:40,480 --> 00:54:45,010 And it's non-monotonic, mostly [INAUDIBLE].. 1185 00:54:45,010 --> 00:54:46,720 It doesn't go up, it goes down. 1186 00:54:46,720 --> 00:54:47,790 So you have ideal. 1187 00:54:47,790 --> 00:54:49,900 Then you are out of it. 1188 00:54:49,900 --> 00:54:51,530 Then you would land on rocks. 1189 00:54:51,530 --> 00:54:54,160 And then you have the next plateau, and maybe a third one, 1190 00:54:54,160 --> 00:54:56,104 on one pass by. 
1191 00:54:56,104 --> 00:54:57,520 PROFESSOR: That's an interesting-- 1192 00:54:57,520 --> 00:55:00,450 that's an interesting comment. 1193 00:55:00,450 --> 00:55:03,550 I'm trying to figure out what's the attribute-- what's 1194 00:55:03,550 --> 00:55:05,890 the engineering attribute that goes with that? 1195 00:55:09,526 --> 00:55:12,550 AUDIENCE: Well, it's the tolerance on the injection. 1196 00:55:12,550 --> 00:55:14,550 I know that, at JPL-- 1197 00:55:14,550 --> 00:55:17,830 by the way, thanks for organizing the [INAUDIBLE]---- 1198 00:55:17,830 --> 00:55:19,440 they made this presentation, also, 1199 00:55:19,440 --> 00:55:21,850 of the landing of the rovers. 1200 00:55:21,850 --> 00:55:23,680 And they have the primary ellipse 1201 00:55:23,680 --> 00:55:25,180 where they try to land-- 1202 00:55:25,180 --> 00:55:29,860 well, try-- it depends on the entry. 1203 00:55:29,860 --> 00:55:32,800 And then they have secondary-- if they miss the entry, 1204 00:55:32,800 --> 00:55:36,042 then they have very bad option. 1205 00:55:36,042 --> 00:55:37,500 And then, maybe a little bit later, 1206 00:55:37,500 --> 00:55:39,370 they maybe can land on the next plateau, 1207 00:55:39,370 --> 00:55:41,700 or in the next crater, which is useful. 1208 00:55:41,700 --> 00:55:46,602 So they have this non-monotonic, but degrading, function. 1209 00:55:46,602 --> 00:55:47,560 The first time is best. 1210 00:55:47,560 --> 00:55:48,808 The second would still work. 1211 00:55:48,808 --> 00:55:51,480 But in between, there's a part that doesn't go. 1212 00:55:51,480 --> 00:55:53,760 PROFESSOR: OK. 1213 00:55:53,760 --> 00:55:55,040 Got to think about that one. 1214 00:55:55,040 --> 00:55:56,590 But that's an interesting-- that's 1215 00:55:56,590 --> 00:55:58,950 a pretty interesting example. 
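[The curve shapes discussed above can be sketched as simple one-attribute utility functions. The particular functional forms below are illustrative assumptions, not the specific curves Cook or Messac prescribe.]

```python
import math

# Illustrative single-attribute utility functions mapping an attribute J
# to a utility U in [0, 1]. Functional forms are assumptions for sketching.

def smaller_is_better(j, j_worst):
    """Monotonically decreasing: full utility at j = 0, none at j_worst."""
    return max(0.0, 1.0 - j / j_worst)

def larger_is_better(j, j_best):
    """Monotonically increasing, saturating at 1 for j >= j_best."""
    return min(1.0, j / j_best)

def nominal_is_better(j, j_sweet, width):
    """Peaked at a sweet spot; utility decays on either side."""
    return math.exp(-((j - j_sweet) / width) ** 2)

# e.g. cost (smaller is better), revenue (larger is better),
# ambient temperature (nominal is better)
print(smaller_is_better(2.0, j_worst=10.0))              # 0.8
print(larger_is_better(5.0, j_best=10.0))                # 0.5
print(nominal_is_better(21.0, j_sweet=21.0, width=3.0))  # 1.0 at the peak
```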
1216 00:55:58,950 --> 00:56:03,790 So, the main point here is that in order to calculate utility, 1217 00:56:03,790 --> 00:56:08,440 you need a translation function between the engineering 1218 00:56:08,440 --> 00:56:13,750 or financial attributes of each alternative and utility. 1219 00:56:13,750 --> 00:56:16,750 If it's not-- if it was directly that, 1220 00:56:16,750 --> 00:56:18,130 you'd just have a linear, right? 1221 00:56:18,130 --> 00:56:20,470 It would be a 45 degree line, right? 1222 00:56:20,470 --> 00:56:22,750 Or negative 45. 1223 00:56:22,750 --> 00:56:27,040 But typically the mapping from your attribute values 1224 00:56:27,040 --> 00:56:29,320 to the utility is non-linear. 1225 00:56:29,320 --> 00:56:30,460 And that's what this says. 1226 00:56:33,520 --> 00:56:37,240 So then the challenge is well, how do you 1227 00:56:37,240 --> 00:56:38,770 get these utility functions? 1228 00:56:38,770 --> 00:56:40,400 Where do they come from? 1229 00:56:40,400 --> 00:56:42,890 And the answer is you've got to do interviews, 1230 00:56:42,890 --> 00:56:44,200 you have to survey people-- 1231 00:56:44,200 --> 00:56:47,240 the decision-makers, the stakeholders we talked about. 1232 00:56:47,240 --> 00:56:49,420 They're the ones who have these utility 1233 00:56:49,420 --> 00:56:52,520 functions in their minds, even if they're not-- 1234 00:56:52,520 --> 00:56:55,060 So you've got to make those explicit. 1235 00:56:55,060 --> 00:56:56,600 So here's an example. 1236 00:56:56,600 --> 00:56:58,990 You may have three different customers for your system. 1237 00:56:58,990 --> 00:57:01,960 User attribute, some performance attribute. 1238 00:57:01,960 --> 00:57:06,610 Customer one is shown here. 1239 00:57:06,610 --> 00:57:08,890 They need a minimum amount of performance, 1240 00:57:08,890 --> 00:57:11,590 but once you reach this threshold, after that there's 1241 00:57:11,590 --> 00:57:13,320 no more utility. 
1242 00:57:13,320 --> 00:57:16,270 But then customer two and customer three are different. 1243 00:57:16,270 --> 00:57:19,750 They want-- their utility increases gradually, 1244 00:57:19,750 --> 00:57:23,590 and in fact, customer three doesn't see much utility 1245 00:57:23,590 --> 00:57:25,940 until you hit this much higher level. 1246 00:57:25,940 --> 00:57:28,930 So one of the challenges, then, in designing these utility 1247 00:57:28,930 --> 00:57:33,160 functions when you have multiple customers is to solicit this-- 1248 00:57:33,160 --> 00:57:37,180 and there's interview and survey techniques for doing that-- 1249 00:57:37,180 --> 00:57:39,380 and then combining those. 1250 00:57:39,380 --> 00:57:41,500 And then the other challenge in combining 1251 00:57:41,500 --> 00:57:44,590 them is that, remember at the system level, 1252 00:57:44,590 --> 00:57:47,980 utility is still between zero and one, right? 1253 00:57:47,980 --> 00:57:51,550 So if you had, for example, two attributes, 1254 00:57:51,550 --> 00:57:54,220 and they both give you perfect utility, 1255 00:57:54,220 --> 00:57:59,400 and you add them together you'd get a utility of what? 1256 00:57:59,400 --> 00:58:01,850 You have two attributes. 1257 00:58:01,850 --> 00:58:02,610 Two. 1258 00:58:02,610 --> 00:58:06,690 But that can't be because the total system utility can never 1259 00:58:06,690 --> 00:58:08,040 be better than one. 1260 00:58:08,040 --> 00:58:09,330 So you have to normalize. 1261 00:58:09,330 --> 00:58:11,100 As you combine the utilities, you 1262 00:58:11,100 --> 00:58:15,450 have to normalize them, and actually weight them as well. 1263 00:58:15,450 --> 00:58:19,800 So here's the equation when you have two utilities. 1264 00:58:19,800 --> 00:58:24,030 You basically have the utility of the system just from J1-- 1265 00:58:24,030 --> 00:58:28,470 your first attribute-- times the utility of J2. 1266 00:58:28,470 --> 00:58:31,710 This is the mixed term, right, the combined term. 
1267 00:58:31,710 --> 00:58:37,800 Plus the utility of just J1 plus the utility of just J2. 1268 00:58:37,800 --> 00:58:41,910 Then you have this K factor here-- this capital K factor-- 1269 00:58:41,910 --> 00:58:45,090 which re-normalizes everything to zero and one. 1270 00:58:45,090 --> 00:58:46,700 And if you have more than two, then 1271 00:58:46,700 --> 00:58:49,570 it becomes a matrix calculation. 1272 00:58:49,570 --> 00:58:52,230 So it's a little tricky for how to do this properly, 1273 00:58:52,230 --> 00:58:54,270 but it's well-known how to do this. 1274 00:58:54,270 --> 00:58:58,020 So the steps in multi-attribute utility analysis 1275 00:58:58,020 --> 00:59:01,170 are to identify your objectives and attributes, 1276 00:59:01,170 --> 00:59:03,060 you develop an interview questionnaire, 1277 00:59:03,060 --> 00:59:05,040 you administer that questionnaire. 1278 00:59:05,040 --> 00:59:08,040 You then develop your aggregate utility functions, 1279 00:59:08,040 --> 00:59:10,440 you determine the utility of your alternatives, 1280 00:59:10,440 --> 00:59:13,170 and then you analyze the results. 1281 00:59:13,170 --> 00:59:16,210 And there is one word of caution I want to give you, 1282 00:59:16,210 --> 00:59:18,120 which is that utility is essentially 1283 00:59:18,120 --> 00:59:21,000 the surrogate of value, but value often we 1284 00:59:21,000 --> 00:59:24,120 express in dollars, in monetary terms. 1285 00:59:24,120 --> 00:59:27,010 But utility is unit-less. 1286 00:59:27,010 --> 00:59:31,980 So let me just comment about utility maximization. 1287 00:59:31,980 --> 00:59:34,025 It's very common and generally well accepted. 1288 00:59:36,910 --> 00:59:40,440 It's a non-linear combination of your criteria-- your decision 1289 00:59:40,440 --> 00:59:41,580 criteria. 
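[The two-attribute combination just described can be sketched numerically: the mixed term plus the two single-attribute terms, with the capital K factor chosen so that the total renormalizes to one. The weights k1 and k2 below are illustrative values, not from the lecture.]

```python
# Sketch of two-attribute utility aggregation as described in the lecture:
# U = K*k1*k2*U1*U2 + k1*U1 + k2*U2, where K is chosen so that U = 1
# when both single-attribute utilities are 1 (i.e. K*k1*k2 + k1 + k2 = 1).
# Assumes k1, k2 > 0 and k1 + k2 != 1 exactly.

def combined_utility(u1, u2, k1, k2):
    K = (1.0 - k1 - k2) / (k1 * k2)  # renormalizing constant
    return K * k1 * k2 * u1 * u2 + k1 * u1 + k2 * u2

# Perfect utility in both attributes combines to 1, not 2:
print(combined_utility(1.0, 1.0, k1=0.6, k2=0.3))  # 1.0
print(combined_utility(0.5, 0.8, k1=0.6, k2=0.3))  # somewhere in (0, 1)
```

With more than two attributes this generalizes, and as the lecture notes, solving for K becomes a matrix-style calculation.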
1290 00:59:41,580 --> 00:59:44,370 The downside of it is the physical meaning 1291 00:59:44,370 --> 00:59:48,990 is often lost, because the engineering or customer focused 1292 00:59:48,990 --> 00:59:50,970 attributes all get combined. 1293 00:59:50,970 --> 00:59:54,960 So you say this system has a utility 0.83, 1294 00:59:54,960 --> 00:59:58,210 and this one has a utility 0.75. 1295 00:59:58,210 --> 01:00:00,950 So one is better than the other, but immediately you 1296 01:00:00,950 --> 01:00:02,250 want to say well why is that? 1297 01:00:02,250 --> 01:00:03,270 What does that mean? 1298 01:00:03,270 --> 01:00:06,360 So then you have to backtrack and reverse engineer 1299 01:00:06,360 --> 01:00:10,110 how those utilities were calculated. 1300 01:00:10,110 --> 01:00:12,510 You need to obtain a mathematical representation 1301 01:00:12,510 --> 01:00:14,850 for all utility functions, and-- 1302 01:00:14,850 --> 01:00:18,780 this is probably not just a US-centric comment, 1303 01:00:18,780 --> 01:00:21,120 this is probably everywhere in the world more or less-- 1304 01:00:21,120 --> 01:00:24,780 but the utility function can vary drastically depending 1305 01:00:24,780 --> 01:00:26,290 on the decision maker. 1306 01:00:26,290 --> 01:00:29,340 This is a big issue in government programs. 1307 01:00:29,340 --> 01:00:32,770 So you guys, it's uniform day, I guess, today. 1308 01:00:32,770 --> 01:00:35,550 So how long is a typical tour of duty 1309 01:00:35,550 --> 01:00:39,000 of a program manager in the Pentagon for a big program? 1310 01:00:39,000 --> 01:00:41,850 What would you say? 1311 01:00:41,850 --> 01:00:44,774 AUDIENCE: It's usually three to four years. 1312 01:00:44,774 --> 01:00:47,190 PROFESSOR: So three to four years, like it's written here. 1313 01:00:47,190 --> 01:00:52,680 And what does that mean in practice for these utility 1314 01:00:52,680 --> 01:00:53,960 functions and for programs? 
1315 01:01:01,090 --> 01:01:04,660 Sorry I'm putting you on the spot here. 1316 01:01:04,660 --> 01:01:08,320 AUDIENCE: Well, you don't usually 1317 01:01:08,320 --> 01:01:10,780 have a lot of overlap between the PM. 1318 01:01:10,780 --> 01:01:12,880 So, as one goes out, another one's 1319 01:01:12,880 --> 01:01:17,050 coming in, so you're losing a lot of experience, 1320 01:01:17,050 --> 01:01:20,790 gaining somebody who doesn't have much experience. 1321 01:01:20,790 --> 01:01:22,322 So there's a learning curve here. 1322 01:01:22,322 --> 01:01:23,030 PROFESSOR: Right. 1323 01:01:27,759 --> 01:01:28,300 That's right. 1324 01:01:28,300 --> 01:01:30,980 What I'm getting at is-- so there's definitely that 1325 01:01:30,980 --> 01:01:32,070 effect-- 1326 01:01:32,070 --> 01:01:34,580 but the priorities may be different. 1327 01:01:34,580 --> 01:01:38,620 You know, whereas the prior program manager really valued 1328 01:01:38,620 --> 01:01:44,410 a lot performance or quick response, 1329 01:01:44,410 --> 01:01:47,590 and the next program manager-- because maybe the context has 1330 01:01:47,590 --> 01:01:48,530 changed-- 1331 01:01:48,530 --> 01:01:51,150 is very, very cost conscious. 1332 01:01:51,150 --> 01:01:53,500 Really wants a system that is very, very affordable, 1333 01:01:53,500 --> 01:01:57,160 and is willing to sacrifice performance for that. 1334 01:01:57,160 --> 01:02:01,870 So the way you can think of this is the shape of the utility 1335 01:02:01,870 --> 01:02:05,820 curves for those decision makers has now shifted, 1336 01:02:05,820 --> 01:02:08,940 and because of that the architecture 1337 01:02:08,940 --> 01:02:11,620 or the choice of concept that you would go for 1338 01:02:11,620 --> 01:02:13,385 may be different. 1339 01:02:13,385 --> 01:02:15,510 AUDIENCE: This happens to NASA all the time, right? 1340 01:02:15,510 --> 01:02:16,218 PROFESSOR: Right. 1341 01:02:16,218 --> 01:02:18,430 And so this is not just a DOD issue. 
1342 01:02:18,430 --> 01:02:19,690 And in commercial world. 1343 01:02:19,690 --> 01:02:24,280 You know, a new CEO comes in, new CTO comes in, 1344 01:02:24,280 --> 01:02:27,580 and these utility curves actually shift. 1345 01:02:27,580 --> 01:02:29,440 So you've got to be to be aware of this. 1346 01:02:29,440 --> 01:02:32,470 So one of the big topics there is-- 1347 01:02:32,470 --> 01:02:35,230 particularly now with NASA is-- 1348 01:02:35,230 --> 01:02:38,920 how do you choose an architecture, a concept, 1349 01:02:38,920 --> 01:02:44,260 that is robust to changing utilities and decision makers. 1350 01:02:44,260 --> 01:02:46,600 So maybe you don't go for the super duper 1351 01:02:46,600 --> 01:02:49,540 best, most exciting, most capable 1352 01:02:49,540 --> 01:02:51,850 concept, but you go for the one that's 1353 01:02:51,850 --> 01:02:56,020 least likely to be overturned or disrupted 1354 01:02:56,020 --> 01:02:58,820 with the next administration. 1355 01:02:58,820 --> 01:03:01,940 And this is a discussion-- active discussion-- 1356 01:03:01,940 --> 01:03:03,650 right now at NASA headquarters. 1357 01:03:03,650 --> 01:03:05,450 You know, what are the investments 1358 01:03:05,450 --> 01:03:08,360 you can make that are going to provide utility, 1359 01:03:08,360 --> 01:03:12,110 even if the utility function of the future decision makers 1360 01:03:12,110 --> 01:03:13,440 changes? 1361 01:03:13,440 --> 01:03:16,110 It's a big, big topic. 1362 01:03:16,110 --> 01:03:18,260 The other thing is, of course, this 1363 01:03:18,260 --> 01:03:20,820 requires your formulation of preferences. 1364 01:03:20,820 --> 01:03:23,720 Those K factors, those weightings, 1365 01:03:23,720 --> 01:03:26,000 a priori-- before you've actually 1366 01:03:26,000 --> 01:03:28,410 scored the alternatives. 1367 01:03:28,410 --> 01:03:28,910 All right. 
So the example I want to talk you through is essentially a space tug. We're going to do trade space exploration. There's a trade space, a design space, of conceptual alternatives, and we're going to try to understand it using utility theory. Let me explain what we mean by a space tug. A space tug is essentially a satellite that has the ability to change the orbital elements of a target satellite by a predefined amount, without degrading its functionality in the process. (There are six orbital elements: semi-major axis, eccentricity, inclination, right ascension of the ascending node, and so forth.) So here's a picture of the Earth with two orbits. The space tug is in the black orbit, and our target satellite is in the orange orbit. The typical process is that the space tug waits in its parking orbit, the black one. It gets tasked.
It transfers to the other orbit, searches for the target, identifies it, performs rendezvous and approach, then docking and capture, does an orbital transfer, releases the target satellite in the new orbit, verifies its status, and then either goes directly to the next target or returns to the parking orbit. As you can imagine, depending on the plane changes that are required, this can be quite expensive. It's less expensive in geosynchronous orbit, and there are some capabilities, which we have heard about publicly, that the US and maybe other countries have to do this. But there are also commercial applications, for example space debris removal. So that's what we mean by space tug. So what are the attributes? This is based on a paper from 2003 called Understanding the Orbital Transfer Vehicle Trade Space. Here are three attributes that are combined into utility. The first is total delta V capability; delta V is the change in velocity. This essentially tells you where it can go, and you calculate it from the rocket equation.
The second is response time: how fast can it get there after it has been tasked? There, there's a big distinction, almost a binary one, between electric propulsion, which is very efficient but slow, and chemical propulsion. The third is the mass of the observation or grappling equipment. This tells you what it can do when it gets there: the size of the target satellites it can actually interact with. Those are the three attributes that define utility, and we combine them into a single utility between zero and one. In this case, cost, or a surrogate for cost, is kept separate. That surrogate is the vehicle wet and dry mass. Those are the cost drivers, and they are calculated from some simple scaling relationships. What we're interested in is looking at the trade space of utility versus cost of the system. So how is utility defined for a space tug? We have response time, which is bad for electric systems. Total utility is a weighted sum, and then we estimate the cost from wet and dry mass. The delta V utility is shown here.
This is in terms of meters per second, so two, four, six, eight, ten kilometers per second. These are pretty large delta Vs. What's interesting is that this is a larger-is-better attribute, but it's a step curve, not a smooth curve. Why is that? Because as you reach certain delta Vs, you enable operating just in GEO, or a LEO-GEO transfer, or a LEO-GEO return. The more delta V you have, the more times you can use the space tug. And it's a step curve because as you hit each of those values of delta V, you can suddenly operate in a different regime. Then there's the capability. The payload mass utility is discretized into low, medium, high, and extreme. Low is for small satellites; medium is for satellites up to one metric ton; high is up to three metric tons; and extreme is more than five metric tons, the big satellites in geosynchronous orbit. And for the weighting factors, we're giving capability 30%, delta V 60%, and the time responsiveness of the system 10%. So time responsiveness is not that important compared to the other two.
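Putting the pieces above together, the multi-attribute utility calculation might look like the following minimal sketch. Only the 30/60/10 weights come from the lecture; the step thresholds, utility levels, and capability scores are illustrative placeholders, not the paper's actual values:

```python
def delta_v_utility(dv_mps: float) -> float:
    # Step ("larger is better") curve: utility jumps as each new operating
    # regime (e.g. GEO-only, LEO-GEO transfer, LEO-GEO return) becomes
    # reachable. Thresholds and levels here are made-up placeholders.
    steps = [(2000, 0.2), (4000, 0.5), (6000, 0.8), (10000, 1.0)]
    u = 0.0
    for threshold, level in steps:
        if dv_mps >= threshold:
            u = level
    return u

def capability_utility(grade: str) -> float:
    # Discretized low/medium/high/extreme, as in the lecture; scores assumed.
    return {"low": 0.25, "medium": 0.5, "high": 0.75, "extreme": 1.0}[grade]

def total_utility(dv_mps: float, grade: str, time_utility: float) -> float:
    # Weighted sum with the lecture's weights:
    # capability 30%, delta V 60%, response time 10%.
    return (0.3 * capability_utility(grade)
            + 0.6 * delta_v_utility(dv_mps)
            + 0.1 * time_utility)

print(round(total_utility(4500, "medium", 0.2), 2))  # -> 0.47
```

Each single-attribute utility lands between zero and one, so the weighted sum does too.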
So once we have this, and I'm not going to go through all the details of the calculations, that's in the paper, which I'll post, you get this. This is a cloud of points. Each point here represents a particular alternative or architecture, and we have the cost of the system in millions of dollars. So this is not cheap: this is a billion dollars, two billion, three billion. And then we have this dimensionless utility. Of course, what we're particularly interested in is this lower right corner: high utility, lower cost systems. What we did here is identify some particularly interesting architectures, as shown in this space. Let me just explain two of them. We gave them recognizable names. This point here is below 0.4, so it's a rather low utility system, but it's also relatively affordable. We call it the bi-prop low Earth orbit tender.
It's intended for use in low Earth orbit, and it has a dry mass of about 680 kilograms and a wet mass of 1,400 kilograms. The difference is propellant, about 800 kilograms of it, and it has a reasonable size and mass fraction. And it's fairly responsive. Another alternative is what we call the electric geocruiser. Sounds cool, doesn't it? This is basically a space tug that operates only in geosynchronous orbit. You launch it into geosynchronous orbit, but once it's there, it can do a lot. It's electric, though, so it's kind of slow: 711 kilograms dry mass, 1,100 kilograms wet mass. And it includes return of the tug to a safe orbit. This is sort of a versatile space tug in the GEO belt. So that's the idea. You calculate utility, shown on the x-axis; you have the cost of the system; you have this cloud of points; and you start understanding what the interesting architectures in that trade space are. This is the same trade space, but now what we've shown is that the choice of propulsion system is critical.
And what you can see here is that it's really almost impossible to get to a utility of one. It's very, very hard. The reason for this is the rocket equation. The blue points are the bi-propellant architectures. The purple are the cryogenic ones, LOX/hydrogen propulsion. The yellow ones are the electric propulsion systems, and then we also have nuclear propulsion here, which clearly is challenging and has policy implications. You can see that nuclear propulsion gets you close to much higher utility, but it's also much, much more expensive. In all cases, though, as you try to get more and more utility, at some point you hit the wall because of the rocket equation, which is very non-linear. In order to carry more fuel, you need more dry mass; more fuel and dry mass require more fuel to push; and the system blows up on you. You can see this in these curves. So you get some physical intuition about the shape of the space. Yeah?
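The "wall" described above can be sketched numerically with the Tsiolkovsky rocket equation. The specific impulse and mass numbers below are illustrative assumptions, not values from the paper:

```python
from math import exp, log

G0 = 9.80665  # standard gravity, m/s^2

def delta_v(isp_s: float, wet_mass_kg: float, dry_mass_kg: float) -> float:
    # Tsiolkovsky rocket equation: dv = Isp * g0 * ln(m_wet / m_dry).
    return isp_s * G0 * log(wet_mass_kg / dry_mass_kg)

def wet_mass(dry_mass_kg: float, isp_s: float, dv_mps: float) -> float:
    # Inverting it: the wet mass needed for a given delta V grows
    # exponentially, which is the non-linearity that closes off the
    # high-utility corner of the trade space.
    return dry_mass_kg * exp(dv_mps / (isp_s * G0))

# A hypothetical 700 kg dry bus with storable bipropellant, Isp ~ 300 s:
for dv in (2000, 5000, 10000):
    print(f"{dv} m/s needs ~{wet_mass(700, 300, dv):,.0f} kg wet mass")
```

Because of the exponential, doubling the delta V target much more than doubles the propellant load, which is why every propulsion family's curve bends upward before it ever reaches a utility of one.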
AUDIENCE: Is there a reason for this layering of the cost?

PROFESSOR: Yeah. It's essentially, if I remember correctly, the capability of the system. Remember the small, medium, large discretization, the size of the grappling equipment? I think that's the tiering you see.

AUDIENCE: That doesn't change the utility?

PROFESSOR: Well, it depends on the weighting, right? As you change the weighting among the attributes, that space can get scrambled. This space is valid for that particular weighting. So, any questions about this? This is utility theory, multi-attribute utility theory, applied to trade space exploration. Any questions? Go ahead. EPFL, do you have a question? OK. So what you see here, implicitly, is that there's nothing in the lower right corner. There is no system that has perfect utility and is low cost, right? There's an empty space. We'd love to be there, but there's nothing there. So the best we can do is get close to it.
And that's what I want to talk about next, which is this concept of non-dominance. What do we mean by non-dominance? What is a Pareto frontier, and what is multi-objective optimization? The key point is that when you do non-dominance filtering, when you look for Pareto frontiers, you do not need to express your preferences ahead of time. When you do a weighted Pugh matrix, or when you apply utility theory, you had to define ahead of time, before you did the scoring, what's more important: fuel efficiency is twice as important to me as cost, for example. The weightings, the preferences, had to be expressed a priori, before you did the scoring. There are arguments that that's not the best thing to do, but you can do it. With non-dominance and multi-objective optimization, you eventually have to express your preferences too, but you do it after you've scored the options. That's a really important distinction. So let me give you a little history on this. And what's really interesting is that there's a Lausanne connection.
There's a connection with Lausanne. The term Pareto frontier is named after an economist who actually started as an engineer: Vilfredo Pareto. He was born in Paris in 1848 and graduated from the University of Turin in 1870, very much in engineering, civil engineering. His thesis title was The Fundamental Principles of Equilibrium in Solid Bodies, something that's pretty basic now: forces and torques in equilibrium on a body. He worked as a civil engineer in Florence, and then he got interested in philosophy, politics, and economics, and he started to think about how this concept of equilibrium applies to economics. In 1893, he became a professor at the University of Lausanne, which is located right next to EPFL, and started applying this to economics and societal theory. Some of his work was a bit controversial, but the idea I want to talk about here is the Pareto optimum. Let me just read this quote to you.
"The optimum allocation of the resources of a society is not attained so long as it is possible to make at least one individual better off in his or her own estimation, while keeping others as well off as before in their own estimation." What this means is that if you can make an investment or a change in society that makes a particular individual or group of individuals better off in their own estimation, without having to take resources from another group and make them less well off, then you have not reached an optimal allocation of resources. If there can be a win-win, if everybody can be better off by making certain investments, you haven't yet found the optimal strategy. Only when the sole way to make somebody better off in their own estimation is by taking from one group and redistributing resources to the other (and this is a big political debate still today) are you at the Pareto optimal point. OK? So that's the key idea underlying this.
So what this means mathematically is that a solution x* is optimal if and only if, first of all, it is feasible. And this is the case where you have multiple criteria; we're trying to do vector optimization. Second, x* must be what we call an efficient solution, and it is efficient only if its objective vector J(x*) is non-dominated. What this means is that a point is efficient only when it is not possible to move from that point to another point and improve some objective without degrading at least one of the other objectives. If you can move from a particular design point to another design point and all the objectives get better, you're not yet efficient. You're not yet at a Pareto optimal point. And this gets us to the notion of dominance. Let's say we have two designs, two alternatives, with objective vectors J1 and J2. This is for maximization; we're trying to maximize.
It means that J1 weakly dominates J2 if J1 is better than or equal to J2 in every objective, and strictly better than J2 in at least one objective i. So you could have two alternatives. This gets back to the Pugh matrix and the question about ties. You asked that, right? It feels so long ago; it was like an hour ago. So we have two alternatives. They're tied in all criteria except for one, where J1 is better than J2. That means J1 dominates J2, but weakly, because there are some ties. And then there's a stronger definition: J1 strongly dominates J2 if and only if it is strictly better in all attributes. So for J1 to dominate J2 strongly, it has to be better than J2 in every attribute. Does that make sense? OK. I know this is a bit abstract, so let's do a concept question based on a hypothetical but realistic example.
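The two definitions just given translate almost directly into code. Here is a minimal sketch for maximization, with each design's objective vector represented as a tuple:

```python
def weakly_dominates(j1, j2):
    # J1 weakly dominates J2 if it is >= in every objective
    # and strictly > in at least one.
    return (all(a >= b for a, b in zip(j1, j2))
            and any(a > b for a, b in zip(j1, j2)))

def strongly_dominates(j1, j2):
    # J1 strongly dominates J2 only if it is strictly > in every objective.
    return all(a > b for a, b in zip(j1, j2))

# Tied in the first objective, better in the second:
print(weakly_dominates((3, 5), (3, 4)))    # True
print(strongly_dominates((3, 5), (3, 4)))  # False (the tie blocks it)
```

The tie in one objective is exactly what separates the two cases: weak dominance tolerates ties, strong dominance does not.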
So assume, let's say, you're in charge of designing a new commercial aircraft, and there are four criteria. You want to maximize range. You want to minimize the cost, dollars per kilometer flown. You want to maximize the capacity of the airplane, the number of passengers. And you want to maximize cruise speed, in kilometers per hour. So it's a multi-objective aircraft design. You do all the work. You do concept generation. You come up with eight concepts for airplanes, and you evaluate them. They're shown here: one, two, three, et cetera, through eight. Let's just look at number one. What does this mean? Airplane concept number one has a range of 7,587 kilometers. It has an operating cost of $321 per kilometer. It can carry 112 passengers. And it has a cruise speed, a max speed, of 950 kilometers per hour. So I'm going to leave this up.
This is like a five-minute exercise. On the next slide, I'm going to ask you, and we'll come back to this, which of these eight airplane designs are non-dominated, meaning that they're not dominated by any of the others. And we're going to apply weak dominance here. That was going to be your question, right? Weak dominance. I don't think it actually matters; I don't think there are too many ties in here. But apply weak dominance. So which of these are non-dominated, meaning that you wouldn't discard them right off the bat because of some strong features they have? So work through this. You might want to write down which ones are non-dominated. Then we'll do the concept question, and then we'll show you the answer.

AUDIENCE: I just want to make sure. So non-dominated as in there is not one that could be considered better than it?

PROFESSOR: Yeah. So, this is the definition. All right. So please submit your answers. Here are the choices. Which airplane designs are non-dominated?
Five, six, and seven. One, three, four, and eight. One, two, three, four, and eight. Two, three, five, and six. One, three, four, and seven. Or, this is not answerable: you need more information to answer the question. OK? So we have a pretty good distribution here. The wisdom of crowds, I think, prevails. The correct answer is the third one: one, two, three, four, and eight. So 36% of you got it right. Now let me show you the solution to this. It is, in fact, possible to answer this without additional information. The way to do it is pairwise comparisons. Now, what did you say your strategy was, early on?

AUDIENCE: I was looking at maximums.

PROFESSOR: You were looking at maximums. So you were asking: who's the best performer among the concepts for each criterion? And you found those pretty easily, I would assume, right? So what--

AUDIENCE: [INAUDIBLE]

PROFESSOR: OK.
So you've got to get the sign right. Do you have the mic on, by the way? So, if a concept is the best performer in a criterion, what does that tell you?

AUDIENCE: It means that none of the other concepts has a value greater than that performer's for that criterion.

PROFESSOR: And this relates to the question from, is it Martin? In the red shirt, first row? I didn't get your name right. I'm sorry.

AUDIENCE: Bastian.

PROFESSOR: Bastian. You remember, at the start of the lecture, what did you say?

AUDIENCE: I said that if you evaluate all the concepts based on just plus or minus for the criteria, you might have one with a bunch of criteria that are negative but just one that is positive. And that one might be very, very positive.

PROFESSOR: Right. So that's exactly the point here. That's exactly this point.
1804 01:24:38,910 --> 01:24:42,750 If you are-- if this concept is the best performer 1805 01:24:42,750 --> 01:24:44,870 in just one of the criteria. 1806 01:24:44,870 --> 01:24:48,120 Like it has the best speed, or the lowest cost, 1807 01:24:48,120 --> 01:24:51,990 but it's terrible, terrible in everything else, 1808 01:24:51,990 --> 01:24:52,890 it doesn't matter. 1809 01:24:52,890 --> 01:24:56,610 In terms of dominance, it is non-dominated. 1810 01:24:56,610 --> 01:24:59,920 It cannot be dominated by any other concept. 1811 01:24:59,920 --> 01:25:03,570 If you are the best in class, even just for one criterion-- 1812 01:25:03,570 --> 01:25:05,460 does that make sense? 1813 01:25:05,460 --> 01:25:11,000 You cannot be dominated by any other design if you're the best 1814 01:25:11,000 --> 01:25:14,879 performer on just one criterion. 1815 01:25:14,879 --> 01:25:16,420 So let's think about track and field. 1816 01:25:19,090 --> 01:25:21,790 So the people-- who's doing track and field? 1817 01:25:21,790 --> 01:25:23,020 Any athletes here? 1818 01:25:23,020 --> 01:25:25,780 Like runners or javelin or-- 1819 01:25:25,780 --> 01:25:26,300 Veronica. 1820 01:25:26,300 --> 01:25:29,320 I didn't know that about you. 1821 01:25:29,320 --> 01:25:30,510 Cross country and track. 1822 01:25:30,510 --> 01:25:31,320 OK. 1823 01:25:31,320 --> 01:25:32,270 So 5K? 1824 01:25:35,130 --> 01:25:35,870 All right. 1825 01:25:35,870 --> 01:25:37,810 So let's say-- 1826 01:25:37,810 --> 01:25:40,820 I'm sure you were very good at 5K. 1827 01:25:40,820 --> 01:25:42,752 And how was your shot put? 1828 01:25:42,752 --> 01:25:45,690 AUDIENCE: I never competed in the shot put. 1829 01:25:45,690 --> 01:25:46,630 PROFESSOR: OK. 1830 01:25:46,630 --> 01:25:49,240 So then we don't know, right?
1831 01:25:49,240 --> 01:25:53,110 But the point I'm trying to make is you could be a super duper 1832 01:25:53,110 --> 01:25:56,840 specialist in one criterion, and that's-- you can win 1833 01:25:56,840 --> 01:25:59,010 a gold medal with that. 1834 01:25:59,010 --> 01:26:02,320 But you're maybe not very good at other things. 1835 01:26:02,320 --> 01:26:04,170 So that's the point here: you cannot 1836 01:26:04,170 --> 01:26:07,650 be dominated if you're the best performer-- 1837 01:26:07,650 --> 01:26:10,770 so, right off the bat, in this exercise 1838 01:26:10,770 --> 01:26:12,930 here, in our little exercise, you 1839 01:26:12,930 --> 01:26:17,310 can remove all the concepts that are best in class from the set. 1840 01:26:17,310 --> 01:26:19,870 Because they cannot be dominated. 1841 01:26:19,870 --> 01:26:22,320 So they have to be non-dominated. 1842 01:26:22,320 --> 01:26:25,080 OK, so the way you do the scoring 1843 01:26:25,080 --> 01:26:29,250 with pairwise comparisons is you say, let's compare one and two. 1844 01:26:29,250 --> 01:26:31,200 Concept one and two. 1845 01:26:31,200 --> 01:26:36,060 Well, one is better in range, right? 1846 01:26:36,060 --> 01:26:38,640 And it's better in speed. 1847 01:26:38,640 --> 01:26:42,240 But then the second concept is better in criteria two and three. 1848 01:26:42,240 --> 01:26:45,330 So the score here is two versus two, 1849 01:26:45,330 --> 01:26:50,220 which means neither one nor two dominates the other, right? 1850 01:26:50,220 --> 01:26:53,340 Even if it's-- what if it's three to one? 1851 01:26:53,340 --> 01:26:57,000 What if it was a three to one score? 1852 01:26:57,000 --> 01:26:58,740 It would still be true, right? 1853 01:26:58,740 --> 01:27:01,440 That one doesn't-- one is, in a sense, 1854 01:27:01,440 --> 01:27:03,600 better because it's better on more criteria than the other. 1855 01:27:03,600 --> 01:27:06,840 But it doesn't dominate it, because dominance 1856 01:27:06,840 --> 01:27:10,110 is a very crisp definition.
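That crisp definition can be sketched in a few lines of Python. This is a minimal sketch: the score vectors below are made up for illustration (not the values from the slide), and all criteria are assumed to be oriented so that larger is better.

```python
def dominates(a, b):
    """True if concept a dominates concept b: a is at least as good as b
    on every criterion AND strictly better on at least one.
    Assumes all criteria are oriented so that larger is better."""
    return all(x >= y for x, y in zip(a, b)) and any(x > y for x, y in zip(a, b))

# Hypothetical criteria scores for concepts one and two
# (e.g. range, payload, speed, cost score)
one = [9, 4, 8, 5]
two = [6, 9, 5, 7]

# A 2-2 (or even a 3-1) split in pairwise wins means neither concept dominates:
print(dominates(one, two), dominates(two, one))  # → False False
```

The asymmetry matters: to win at dominance you must win everywhere, so a single loss on any criterion is enough to make the test fail in that direction.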
1857 01:27:10,110 --> 01:27:12,750 Now this is different, right? 1858 01:27:12,750 --> 01:27:14,670 Compare one versus number six. 1859 01:27:14,670 --> 01:27:17,310 Concept one versus concept six. 1860 01:27:17,310 --> 01:27:21,180 One is better than six in all four criteria. 1861 01:27:21,180 --> 01:27:25,020 So clearly, solution one dominates solution six, 1862 01:27:25,020 --> 01:27:27,330 and as a rational decision maker, 1863 01:27:27,330 --> 01:27:30,840 you can eliminate concept six from the set 1864 01:27:30,840 --> 01:27:34,550 because it is completely dominated by concept one. 1865 01:27:34,550 --> 01:27:35,430 OK? 1866 01:27:35,430 --> 01:27:37,350 So in order to be dominated, a solution 1867 01:27:37,350 --> 01:27:42,970 must have essentially a score of zero in a pairwise comparison. 1868 01:27:42,970 --> 01:27:45,930 Now, if we apply this to the full set of eight, 1869 01:27:45,930 --> 01:27:47,940 we can actually show this as a matrix. 1870 01:27:47,940 --> 01:27:50,760 I call that the domination matrix. 1871 01:27:50,760 --> 01:27:52,920 It shows which solutions dominate 1872 01:27:52,920 --> 01:27:55,290 which other solutions-- the rows 1873 01:27:55,290 --> 01:27:56,910 versus the columns. 1874 01:27:56,910 --> 01:28:00,240 So the way you read this is, where you see these dots, 1875 01:28:00,240 --> 01:28:01,980 these are dominance relationships 1876 01:28:01,980 --> 01:28:03,940 in the pairwise comparison. 1877 01:28:03,940 --> 01:28:11,140 So this tells you that solution two dominates solution five. 1878 01:28:11,140 --> 01:28:13,630 This tells us, if we look column-wise, 1879 01:28:13,630 --> 01:28:17,350 solution seven is dominated by solution two and solution 1880 01:28:17,350 --> 01:28:18,550 eight.
1881 01:28:18,550 --> 01:28:23,000 So, if we do the row sum-- if we sum this along the rows, 1882 01:28:23,000 --> 01:28:25,840 the row indicates how many solutions 1883 01:28:25,840 --> 01:28:29,110 this particular solution dominates. 1884 01:28:29,110 --> 01:28:31,540 And-- but the question was about non-dominance, 1885 01:28:31,540 --> 01:28:33,280 so we need to look at the column sum. 1886 01:28:33,280 --> 01:28:35,800 The column sum indicates by how many 1887 01:28:35,800 --> 01:28:40,390 other solutions the kth solution or concept is dominated. 1888 01:28:40,390 --> 01:28:42,760 And when you do this column sum, you 1889 01:28:42,760 --> 01:28:47,740 can see that concepts one, two, three, and four have a zero. 1890 01:28:47,740 --> 01:28:50,210 They're not dominated by any other design. 1891 01:28:50,210 --> 01:28:54,460 Column eight also has a zero, and concepts five, six, and seven 1892 01:28:54,460 --> 01:28:58,360 are dominated each by at least one other concept, which 1893 01:28:58,360 --> 01:29:03,340 means that they're dominated, and the others 1894 01:29:03,340 --> 01:29:05,480 are non-dominated. 1895 01:29:05,480 --> 01:29:07,990 So when you do your concept selection, 1896 01:29:07,990 --> 01:29:10,830 you can actually apply this to a much bigger set. 1897 01:29:10,830 --> 01:29:13,630 You filter out all the dominated solutions. 1898 01:29:13,630 --> 01:29:17,560 And it turns out the bigger your number of alternatives 1899 01:29:17,560 --> 01:29:21,340 or architectures, the smaller percentage-wise the set 1900 01:29:21,340 --> 01:29:23,920 will be of non-dominated solutions. 1901 01:29:23,920 --> 01:29:27,270 And those are the ones you really want to focus on. 1902 01:29:27,270 --> 01:29:29,190 Is that clear? 1903 01:29:29,190 --> 01:29:32,520 I know this takes a little thinking about, 1904 01:29:32,520 --> 01:29:33,780 what does this really mean? 1905 01:29:33,780 --> 01:29:38,300 But that's a very rigorous way to do it.
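The domination matrix and its row and column sums can be sketched like this. A rough illustration: the eight score vectors are hypothetical stand-ins for the slide's table (all four criteria oriented so larger is better), and with these made-up numbers the same set, concepts one through four plus eight, happens to survive the filter.

```python
def dominates(a, b):
    # a dominates b: at least as good on every criterion, strictly better on one
    return all(x >= y for x, y in zip(a, b)) and any(x > y for x, y in zip(a, b))

# Hypothetical scores for eight concepts on four criteria (larger is better)
concepts = {1: [9, 4, 8, 5], 2: [6, 9, 5, 7], 3: [7, 6, 9, 4], 4: [5, 8, 6, 9],
            5: [5, 7, 4, 6], 6: [7, 3, 6, 4], 7: [5, 4, 5, 6], 8: [8, 5, 7, 8]}

ids = sorted(concepts)
# Domination matrix: entry (i, j) is 1 if concept i dominates concept j
D = [[int(i != j and dominates(concepts[i], concepts[j])) for j in ids] for i in ids]

row_sums = [sum(row) for row in D]        # how many concepts each one dominates
col_sums = [sum(col) for col in zip(*D)]  # by how many concepts each one is dominated
non_dominated = [j for j, s in zip(ids, col_sums) if s == 0]
print(non_dominated)  # → [1, 2, 3, 4, 8]
```

Each pairwise comparison is O(number of criteria), so the whole filter is cheap even for large sets of alternatives, which is why it works as a first-pass screen before any preference weighting.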
1906 01:29:38,300 --> 01:29:41,010 And then, eventually, this 1907 01:29:41,010 --> 01:29:44,430 gets us to the notion of Pareto optimality. 1908 01:29:44,430 --> 01:29:46,460 So what's the relationship between Pareto 1909 01:29:46,460 --> 01:29:48,680 optimality and non-dominance? 1910 01:29:48,680 --> 01:29:51,470 So let's say we have two objectives, J1-- 1911 01:29:51,470 --> 01:29:53,150 we want to maximize J1. 1912 01:29:53,150 --> 01:29:57,020 Maybe that's performance, endurance. 1913 01:29:57,020 --> 01:29:58,550 We want to minimize J2-- 1914 01:29:58,550 --> 01:30:00,410 maybe cost, for example. 1915 01:30:00,410 --> 01:30:05,010 So the Utopian point, or the optimal corner to be in, 1916 01:30:05,010 --> 01:30:06,290 is the lower right. 1917 01:30:06,290 --> 01:30:09,196 We want to maximize J1, minimize J2. 1918 01:30:09,196 --> 01:30:10,070 There's nothing here. 1919 01:30:10,070 --> 01:30:10,740 It's empty. 1920 01:30:10,740 --> 01:30:12,440 There's nothing feasible here. 1921 01:30:12,440 --> 01:30:15,290 So the best we can do is get close to it. 1922 01:30:15,290 --> 01:30:17,630 So when you look at a set of discrete points, 1923 01:30:17,630 --> 01:30:21,170 discrete concepts, those concepts 1924 01:30:21,170 --> 01:30:23,480 can have three properties. 1925 01:30:23,480 --> 01:30:26,660 They can be D, meaning dominated. 1926 01:30:26,660 --> 01:30:31,340 So all the red points shown here are dominated. 1927 01:30:31,340 --> 01:30:34,400 And the way you can see this if you draw a little-- 1928 01:30:34,400 --> 01:30:40,130 let me try to draw a box here. 1929 01:30:44,970 --> 01:30:46,422 Where's the eraser? 1930 01:30:54,180 --> 01:30:56,765 So let's pick a point, a particular point-- 1931 01:30:59,800 --> 01:31:01,780 Let's pick this point right here. 1932 01:31:04,420 --> 01:31:07,000 Why do we know that this point is dominated, 1933 01:31:07,000 --> 01:31:09,895 just looking at it graphically?
1934 01:31:09,895 --> 01:31:10,770 How do you know that? 1935 01:31:13,420 --> 01:31:14,710 Go ahead. 1936 01:31:14,710 --> 01:31:16,755 AUDIENCE: The point is not at the front. 1937 01:31:16,755 --> 01:31:18,130 PROFESSOR: It's not at the front, 1938 01:31:18,130 --> 01:31:20,930 but the way to find, first of all, 1939 01:31:20,930 --> 01:31:23,230 whether it's dominated, which is relative 1940 01:31:23,230 --> 01:31:26,950 to the other solutions, is we can just draw a horizontal line 1941 01:31:26,950 --> 01:31:29,720 here, and a vertical line here. 1942 01:31:29,720 --> 01:31:31,510 And so in a multi-dimensional space, 1943 01:31:31,510 --> 01:31:34,480 you're essentially drawing a hyper-cube. 1944 01:31:34,480 --> 01:31:37,570 And down here is our utopia. 1945 01:31:37,570 --> 01:31:40,090 This is where we would love to be, but we can't be. 1946 01:31:40,090 --> 01:31:41,930 It's not feasible. 1947 01:31:41,930 --> 01:31:45,880 And you see in this box, between the point and the utopia, 1948 01:31:45,880 --> 01:31:47,900 there is this other guy here. 1949 01:31:47,900 --> 01:31:50,210 This point here. 1950 01:31:50,210 --> 01:31:53,240 And because it's there, it's closer to the utopia, 1951 01:31:53,240 --> 01:31:57,830 and it dominates this point in both objectives. 1952 01:31:57,830 --> 01:32:00,510 Let's take this point here. 1953 01:32:00,510 --> 01:32:01,700 Let's do the same thing. 1954 01:32:01,700 --> 01:32:04,670 Draw a vertical line and a horizontal line. 1955 01:32:04,670 --> 01:32:06,560 And there's a whole bunch of points, 1956 01:32:06,560 --> 01:32:10,970 all these points right here are in the box, 1957 01:32:10,970 --> 01:32:12,680 closer to the Utopian point. 1958 01:32:12,680 --> 01:32:15,470 So this point here is actually dominated by one, two, three, 1959 01:32:15,470 --> 01:32:19,610 four, five, six, seven other points. 1960 01:32:19,610 --> 01:32:22,250 So that's essentially dominated.
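The box test drawn on the board can be written down directly: a point is dominated if some other point lies in the box between it and the utopia, i.e. is at least as good in both objectives and strictly better in one. A minimal sketch, maximizing J1 and minimizing J2, with made-up points:

```python
def is_dominated(p, points):
    # p = (J1, J2); we maximize J1 and minimize J2, so any dominating point q
    # sits in the box between p and the utopia (the lower-right corner).
    return any(q != p and q[0] >= p[0] and q[1] <= p[1]
               and (q[0] > p[0] or q[1] < p[1]) for q in points)

# Hypothetical (J1 = performance, J2 = cost) values for six concepts
pts = [(1, 2), (3, 1), (2, 4), (4, 3), (2, 2), (5, 4)]
non_dominated = [p for p in pts if not is_dominated(p, pts)]
print(non_dominated)  # → [(3, 1), (4, 3), (5, 4)]
```

In more than two objectives the same test applies with the box becoming the hyper-cube mentioned in the lecture: compare coordinate-wise in every objective, with the inequality direction set by whether that objective is maximized or minimized.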
1961 01:32:22,250 --> 01:32:23,848 Let me erase this again. 1962 01:32:27,520 --> 01:32:31,540 Then we have these darker red, purple points. 1963 01:32:31,540 --> 01:32:34,770 Let me just circle them real quick. 1964 01:32:34,770 --> 01:32:42,000 So we have this point here, this point, this point, this point, 1965 01:32:42,000 --> 01:32:44,130 this one, and this one. 1966 01:32:44,130 --> 01:32:46,980 And those are what we call non-dominated, 1967 01:32:46,980 --> 01:32:49,230 like we just did the filtering, meaning there's 1968 01:32:49,230 --> 01:32:53,730 no other point in the set that's better than it in all criteria. 1969 01:32:53,730 --> 01:32:57,510 So you would consider them as concepts to choose from. 1970 01:32:57,510 --> 01:33:03,160 But they're not exactly on that dashed line, 1971 01:33:03,160 --> 01:33:05,670 which means that there is a way to improve them. 1972 01:33:05,670 --> 01:33:08,850 There's a way to tweak them, to optimize them further, 1973 01:33:08,850 --> 01:33:13,560 if you could, to get them right onto that dashed line. 1974 01:33:13,560 --> 01:33:17,640 They're close, but they're not exactly on that dashed line. 1975 01:33:17,640 --> 01:33:19,800 So you could tweak these concepts to get them 1976 01:33:19,800 --> 01:33:22,810 closer to the Pareto front. 1977 01:33:22,810 --> 01:33:27,110 But they're still pretty good, because they're non-dominated. 1978 01:33:27,110 --> 01:33:31,370 And then we have those grey points, which are shown here. 1979 01:33:31,370 --> 01:33:33,320 There's three of them. 1980 01:33:33,320 --> 01:33:36,062 Let me use a different color for these. 1981 01:33:42,230 --> 01:33:44,015 Let me use a different color, maybe green. 1982 01:33:49,140 --> 01:33:49,740 Here. 1983 01:33:49,740 --> 01:33:55,890 This point, this point, and this point, they are Pareto optimal. 1984 01:33:55,890 --> 01:33:58,660 Meaning they're on the Pareto front.
1985 01:33:58,660 --> 01:34:01,600 There's no way to improve them, such 1986 01:34:01,600 --> 01:34:04,310 that you could make both objectives better. 1987 01:34:04,310 --> 01:34:04,810 Right? 1988 01:34:04,810 --> 01:34:09,130 If you want to make J1 better, you have to sacrifice on J2. 1989 01:34:09,130 --> 01:34:12,280 So the only way to modify these points 1990 01:34:12,280 --> 01:34:15,640 is to slide along the Pareto front, 1991 01:34:15,640 --> 01:34:18,410 meaning you're trading off the objectives. 1992 01:34:18,410 --> 01:34:19,850 Do you see that? 1993 01:34:19,850 --> 01:34:22,600 And so, in my spring class that I teach, 1994 01:34:22,600 --> 01:34:25,090 called Multi-disciplinary System Optimization, 1995 01:34:25,090 --> 01:34:26,560 we get into this pretty deeply. 1996 01:34:26,560 --> 01:34:30,190 Like how do you actually do this? 1997 01:34:30,190 --> 01:34:33,600 But the point I want you to keep for now 1998 01:34:33,600 --> 01:34:37,800 is that when you select the concept, 1999 01:34:37,800 --> 01:34:40,810 do not select a dominated one. 2000 01:34:40,810 --> 01:34:42,390 Get rid of the dominated ones. 2001 01:34:42,390 --> 01:34:43,690 Filter them out. 2002 01:34:43,690 --> 01:34:45,880 And then you still have to choose among the others, 2003 01:34:45,880 --> 01:34:48,340 and that's where your preferences come in. 2004 01:34:48,340 --> 01:34:51,970 Right, so given the blue and green choices here, 2005 01:34:51,970 --> 01:34:54,520 which one are you going to recommend at the end? 2006 01:34:54,520 --> 01:34:59,440 And this is sort of the preference between maximizing 2007 01:34:59,440 --> 01:35:01,590 J1 and minimizing J2. 2008 01:35:01,590 --> 01:35:04,240 That's a social process of coming up 2009 01:35:04,240 --> 01:35:05,680 with your preferences. 2010 01:35:05,680 --> 01:35:08,860 But you now determine those preferences 2011 01:35:08,860 --> 01:35:11,515 after you've scored all the concepts.
2012 01:35:11,515 --> 01:35:13,420 Do you see the difference? 2013 01:35:13,420 --> 01:35:16,720 That's pretty fundamental. 2014 01:35:16,720 --> 01:35:17,580 OK. 2015 01:35:17,580 --> 01:35:23,190 So, one more thing and then we'll wrap it up. 2016 01:35:23,190 --> 01:35:24,990 So now you've gone through all this work. 2017 01:35:24,990 --> 01:35:27,900 You've done your concept generation, concept selections, 2018 01:35:27,900 --> 01:35:32,330 scoring, and you found the one concept 2019 01:35:32,330 --> 01:35:35,760 that you're going to go forward with. 2020 01:35:35,760 --> 01:35:37,080 That's when you do PDR. 2021 01:35:37,080 --> 01:35:38,250 So what is the PDR? 2022 01:35:38,250 --> 01:35:39,570 The Preliminary Design Review. 2023 01:35:39,570 --> 01:35:43,530 You explain what concept and system architecture was chosen. 2024 01:35:43,530 --> 01:35:48,030 You explain why it was chosen, and you 2025 01:35:48,030 --> 01:35:51,030 want this to be approved as the design baseline 2026 01:35:51,030 --> 01:35:54,540 for going forward and doing more detailed design work. 2027 01:35:54,540 --> 01:35:57,000 You compare it against the rejected alternatives. 2028 01:35:57,000 --> 01:36:00,720 Typically at the PDR, you will spend some time on saying, 2029 01:36:00,720 --> 01:36:03,420 these are the other interesting concepts 2030 01:36:03,420 --> 01:36:05,760 that almost were selected, but we didn't 2031 01:36:05,760 --> 01:36:08,220 because of these reasons. 2032 01:36:08,220 --> 01:36:13,870 You show some quantitative analysis that gives confidence, 2033 01:36:13,870 --> 01:36:17,640 especially you want to show that the requirements that you had 2034 01:36:17,640 --> 01:36:20,280 defined and agreed upon at the SRR 2035 01:36:20,280 --> 01:36:23,890 can be satisfied by your chosen concept. 2036 01:36:23,890 --> 01:36:26,640 There's maybe still some risk or some uncertainty. 2037 01:36:26,640 --> 01:36:29,660 So any risk reduction experiments or prototypes. 
2038 01:36:29,660 --> 01:36:32,750 So you can actually do prototyping at this stage. 2039 01:36:32,750 --> 01:36:34,490 You don't want to spend too much on that, 2040 01:36:34,490 --> 01:36:38,030 but you want to do enough of it so you can reduce risks. 2041 01:36:38,030 --> 01:36:40,970 And then you do a preview of the detailed design 2042 01:36:40,970 --> 01:36:44,700 phase between PDR leading up to CDR. So here-- 2043 01:36:44,700 --> 01:36:47,870 I'm not going to read this-- but this is the description 2044 01:36:47,870 --> 01:36:50,480 of what the PDR is all about. 2045 01:36:50,480 --> 01:36:54,160 In the NASA system engineering handbook on page 177, 2046 01:36:54,160 --> 01:36:55,970 there's a very detailed description 2047 01:36:55,970 --> 01:37:00,780 of entrance criteria, exit criteria, et cetera, et cetera. 2048 01:37:00,780 --> 01:37:05,982 So here's a description from a PDR from a European program, 2049 01:37:05,982 --> 01:37:07,940 and you can see it's a pretty big group, right? 2050 01:37:07,940 --> 01:37:10,340 This is like 30, 40 people. 2051 01:37:10,340 --> 01:37:15,560 And because the PDR is such an important decision, 2052 01:37:15,560 --> 01:37:18,320 that's when you also invite external stakeholders. 2053 01:37:18,320 --> 01:37:21,560 And you want to make sure that people come away from the PDR 2054 01:37:21,560 --> 01:37:25,070 with confidence that, yeah, we have a good system 2055 01:37:25,070 --> 01:37:25,920 architecture. 2056 01:37:25,920 --> 01:37:28,650 We've picked the good concept, and it's worthwhile 2057 01:37:28,650 --> 01:37:30,900 now putting all this work into getting it 2058 01:37:30,900 --> 01:37:33,880 to a detailed design that can actually be built. 2059 01:37:33,880 --> 01:37:34,720 OK. 2060 01:37:34,720 --> 01:37:39,590 Any questions about PDR, or any experiences from PDRs 2061 01:37:39,590 --> 01:37:41,510 that people want to share? 2062 01:37:41,510 --> 01:37:44,004 AUDIENCE: I have a question about the PDR. 
2063 01:37:44,004 --> 01:37:45,920 So you said there's 30 people in that picture. 2064 01:37:45,920 --> 01:37:50,380 How do you reconcile that with the seven, plus or minus two rule? 2065 01:37:50,380 --> 01:37:53,960 PROFESSOR: Well, it's seven, plus or minus two, squared, 2066 01:37:53,960 --> 01:37:54,740 right? 2067 01:37:54,740 --> 01:37:59,990 So you have a hierarchy of people. 2068 01:37:59,990 --> 01:38:03,044 Not everybody is a decision maker here. 2069 01:38:03,044 --> 01:38:03,710 AUDIENCE: I see. 2070 01:38:03,710 --> 01:38:05,293 PROFESSOR: So you probably have people 2071 01:38:05,293 --> 01:38:07,670 who did some of the thermal design, 2072 01:38:07,670 --> 01:38:09,800 some people did the cost modeling. 2073 01:38:09,800 --> 01:38:13,310 So, it's a great question actually. 2074 01:38:13,310 --> 01:38:16,340 You want many, many voices and people 2075 01:38:16,340 --> 01:38:19,070 who contributed to make sure the PDR presents 2076 01:38:19,070 --> 01:38:21,650 all the information you have. 2077 01:38:21,650 --> 01:38:24,620 You don't want to do a PDR with just seven people. 2078 01:38:24,620 --> 01:38:27,230 AUDIENCE: But I guess you could say that in the presentation, 2079 01:38:27,230 --> 01:38:30,590 only seven, plus or minus two, people will 2080 01:38:30,590 --> 01:38:32,880 be driving at that point in time. 2081 01:38:32,880 --> 01:38:36,740 PROFESSOR: So there will be a leadership, right? 2082 01:38:36,740 --> 01:38:40,340 There is an organizational issue. 2083 01:38:40,340 --> 01:38:46,670 But the point here is that it's a big review. 2084 01:38:46,670 --> 01:38:47,600 It's a big milestone. 2085 01:38:47,600 --> 01:38:49,691 Please, go ahead.
2086 01:38:49,691 --> 01:38:53,030 AUDIENCE: The big issue here is that all the people who are not 2087 01:38:53,030 --> 01:38:55,650 decision makers, that are in the picture, they 2088 01:38:55,650 --> 01:38:57,980 will have to implement the detailed design, 2089 01:38:57,980 --> 01:39:01,250 so it's very important to have them in the process. 2090 01:39:01,250 --> 01:39:03,914 Otherwise, you have to go back to your organization and do 2091 01:39:03,914 --> 01:39:07,186 the PDR again to explain the what's and the why's. 2092 01:39:07,186 --> 01:39:10,024 This is why you try to have these very large meetings. 2093 01:39:10,024 --> 01:39:13,772 When you do a pre-PDR, you do a dry-run, 2094 01:39:13,772 --> 01:39:16,490 and you send stuff in advance and make sure 2095 01:39:16,490 --> 01:39:19,274 that you're really all agreeing; finally, 2096 01:39:19,274 --> 01:39:25,453 if possible, you get the discrepancies off the table. 2097 01:39:25,453 --> 01:39:27,661 But then you all agree that this is how to go forward 2098 01:39:27,661 --> 01:39:32,580 and then the worker bees can go and do it. 2099 01:39:32,580 --> 01:39:35,450 PROFESSOR: Yes, that's another reason for being inclusive. 2100 01:39:35,450 --> 01:39:36,570 I agree with that. 2101 01:39:36,570 --> 01:39:41,480 Any other-- who has gone through a PDR in their prior work? 2102 01:39:41,480 --> 01:39:42,610 Anybody want to share? 2103 01:39:42,610 --> 01:39:44,670 Marissa. 2104 01:39:44,670 --> 01:39:46,060 Tell us about how that went. 2105 01:39:46,060 --> 01:39:46,750 I mean as much-- 2106 01:39:46,750 --> 01:39:49,090 I know you probably can't share everything, but-- 2107 01:39:49,090 --> 01:39:51,850 AUDIENCE: For us, I mean, it was a pretty big milestone. 2108 01:39:51,850 --> 01:39:53,766 You work a lot leading up to a PDR 2109 01:39:53,766 --> 01:39:56,140 and then you finally have all the really big stakeholders 2110 01:39:56,140 --> 01:39:56,620 in the room.
2111 01:39:56,620 --> 01:39:58,300 And a lot of them have been involved during the design 2112 01:39:58,300 --> 01:40:00,800 phase, but when you actually have them all sitting there. 2113 01:40:00,800 --> 01:40:03,270 And I think getting the buy-in was also really-- 2114 01:40:03,270 --> 01:40:05,290 I think that's a really big part of PDR. 2115 01:40:05,290 --> 01:40:06,250 Like normally you're feeling pretty 2116 01:40:06,250 --> 01:40:08,375 confident in your designs, but then actually making 2117 01:40:08,375 --> 01:40:11,530 sure whoever the agency is that you're working with, 2118 01:40:11,530 --> 01:40:15,215 then your stakeholders are also kind of all on the same page. 2119 01:40:15,215 --> 01:40:16,090 It's pretty critical. 2120 01:40:16,090 --> 01:40:18,670 PROFESSOR: Did you discover problems or discrepancies 2121 01:40:18,670 --> 01:40:19,570 during the PDR? 2122 01:40:19,570 --> 01:40:20,195 AUDIENCE: Yeah. 2123 01:40:20,195 --> 01:40:23,830 I definitely think-- just generally we've had delta PDRs, 2124 01:40:23,830 --> 01:40:25,900 if things weren't completely thought through. 2125 01:40:25,900 --> 01:40:27,820 Or you'd have a fair number of action items 2126 01:40:27,820 --> 01:40:30,840 that come out of PDR to go and address 2127 01:40:30,840 --> 01:40:33,770 before you really are allowed to start moving towards CDR. 2128 01:40:33,770 --> 01:40:34,670 PROFESSOR: Good. 2129 01:40:34,670 --> 01:40:37,776 Any comments or experiences with PDRs at EPFL? 2130 01:40:45,079 --> 01:40:46,870 AUDIENCE: I did a PDR at the European Space 2131 01:40:46,870 --> 01:40:49,900 Agency during the REXUS BEXUS program 2132 01:40:49,900 --> 01:40:52,240 three years ago, and as far as I remember 2133 01:40:52,240 --> 01:40:56,740 it was very appreciated to bring as many details as possible. 2134 01:40:56,740 --> 01:41:00,520 And to be very close to that critical design review 2135 01:41:00,520 --> 01:41:03,710 stage in the PDR. 
2136 01:41:03,710 --> 01:41:06,590 PROFESSOR: That's interesting. 2137 01:41:06,590 --> 01:41:09,290 So, that's good. 2138 01:41:09,290 --> 01:41:11,850 I think it's good to have detail and so forth. 2139 01:41:11,850 --> 01:41:13,400 But what's the downside? 2140 01:41:13,400 --> 01:41:15,620 Can you think of a negative-- what would 2141 01:41:15,620 --> 01:41:22,550 be not so good about almost being at CDR right 2142 01:41:22,550 --> 01:41:23,420 there? 2143 01:41:23,420 --> 01:41:25,730 Can you think about a downside to it? 2144 01:41:29,570 --> 01:41:31,650 AUDIENCE: Probably not taking too much time 2145 01:41:31,650 --> 01:41:34,680 in order to clarify the concept and the requirement 2146 01:41:34,680 --> 01:41:40,740 because you are in a rush to finish the product. 2147 01:41:40,740 --> 01:41:42,710 PROFESSOR: That's a good point. 2148 01:41:42,710 --> 01:41:44,762 So it takes more time. 2149 01:41:44,762 --> 01:41:46,220 But what I'm thinking about is it's 2150 01:41:46,220 --> 01:41:49,930 very difficult to then undo the decision or backtrack. 2151 01:41:49,930 --> 01:41:52,850 Let's say that at the PDR there's some-- 2152 01:41:52,850 --> 01:41:54,950 it could be that you really forgot 2153 01:41:54,950 --> 01:41:56,420 something very important. 2154 01:41:56,420 --> 01:42:00,620 And that your concept is actually flawed. 2155 01:42:00,620 --> 01:42:04,160 But you didn't know it as you were working inside your team. 2156 01:42:04,160 --> 01:42:07,820 And then if you have a design that's almost CDR quality, 2157 01:42:07,820 --> 01:42:09,930 it's very difficult to undo that. 2158 01:42:09,930 --> 01:42:12,790 So I would say that a PDR-- 2159 01:42:12,790 --> 01:42:18,020 I actually believe that at a PDR you should have some detail, 2160 01:42:18,020 --> 01:42:20,700 but not too much yet.
2161 01:42:20,700 --> 01:42:25,460 I think a true PDR should be at the system architectural level, 2162 01:42:25,460 --> 01:42:28,970 with some detail but not all the details. 2163 01:42:28,970 --> 01:42:29,570 But thank you. 2164 01:42:29,570 --> 01:42:33,110 That's a great comment. 2165 01:42:33,110 --> 01:42:35,250 Thank you very much. 2166 01:42:35,250 --> 01:42:37,560 So let me close it here. 2167 01:42:37,560 --> 01:42:42,050 So in terms of concept selection during conceptual design, 2168 01:42:42,050 --> 01:42:45,830 we can use Pugh matrix selection. 2169 01:42:45,830 --> 01:42:49,130 And that's essentially useful when we don't yet 2170 01:42:49,130 --> 01:42:51,470 have those detailed mathematical models. 2171 01:42:51,470 --> 01:42:53,030 It's qualitative. 2172 01:42:53,030 --> 01:42:56,780 And then as we move more into preliminary design, 2173 01:42:56,780 --> 01:43:00,740 we use utility analysis, especially for noncommercial 2174 01:43:00,740 --> 01:43:04,210 systems; in commercial companies, you will often 2175 01:43:04,210 --> 01:43:06,500 do a financial analysis-- a net present value-- 2176 01:43:06,500 --> 01:43:10,010 as a measure of utility. 2177 01:43:10,010 --> 01:43:12,290 And then this concept of non-dominance 2178 01:43:12,290 --> 01:43:13,700 is very, very important. 2179 01:43:13,700 --> 01:43:16,550 Generating many designs and then applying 2180 01:43:16,550 --> 01:43:21,050 Pareto filters to try to come down to the non-dominated set. 2181 01:43:21,050 --> 01:43:23,940 So that's essentially the essence of it. 2182 01:43:23,940 --> 01:43:27,390 So I hope that was clear. 2183 01:43:27,390 --> 01:43:30,440 A-4 will be posted later today.