The following content is provided under a Creative Commons license. Your support will help MIT OpenCourseWare continue to offer high quality educational resources for free. To make a donation or to view additional materials from hundreds of MIT courses, visit MIT OpenCourseWare at ocw.mit.edu.

PROFESSOR: So this is our last 10.34 lecture of the year. And we're just going to use it for review. So I'm going to give a brief recap of the course. We've done a lot, actually. I hope you've learned a lot over the course of the term. We've had lots of opportunities to practice with different numerical methods. I've checked in from time to time on the homework submissions. I've been really impressed with the quality of the work that the class has produced as a whole. It tells me that you guys do understand what you're doing, that you can apply these methods to problems of engineering interest. And I hope that the things you've learned over the course of the term can be applied to your research down the road. Right, that's sort of the goal here. Whether you do experiments or modeling or theory, at some point, computation is going to play a role. After the things that we've done in this course, you should feel comfortable reaching back to these tools and utilizing them to either solve simple transport models of a drug delivery problem that you're working on, or fit data reliably, doing hypothesis testing of models versus data. You have the tools now to do that. The homework assignments that you've completed over the course of the term sort of lay out the framework in which that can be done. And you seem very competent.

We'll test that competence on the final exam, which is Monday from 9:00 AM to 12:00. It's not in Walker, though, OK. It's in 56-154. I'll remind you of that again at the end. But it's not in Walker, so don't show up there at 9:00 AM. You'll be taking the wrong final exam. It's in Building 56. There's going to be a review session with the TAs on Friday.
It's going to be in this room. It's going to be from 10:00 AM to 12:00. And the TAs asked me to ask you to review your grade on Stellar. Make sure that everything is entered accurately. So check that your assignment grades match the grades that are entered there. Check that your quiz grades match the grades that are entered there. You're going to take your final on Monday. And we're going to grade it on Tuesday and submit your final grades, because some of us have to travel right after the final exam period. So we want to make sure everything is as accurate as possible when we enter the final data. OK? Good.

OK, so a course recap. We've done a lot this term, actually, right? You should be really pleased with yourself for having made it this far. Any one of these subjects could comprise an entire course in and of itself. And you've gotten a broad overview of all of them. So I'll remind you. We started with linear algebra. I told you the first day that linear algebra would be the underpinning of all the numerical methods that we utilized along the way. And that was true. I didn't lie. You really wanted to understand linear algebra well if you were going to solve any problems numerically. We moved on to things that were more interesting from an engineering context, systems of nonlinear equations and optimization. So being able to solve engineering problems, being able to do engineering design by optimizing over various uncontrolled degrees of freedom is important. And we did both constrained and unconstrained optimization. Then we moved on to dynamics. We did ODE initial value problems and DAEs, which are the combination of nonlinear equations and initial value problems. We discussed boundary value problems and partial differential equations, which largely are a class of boundary value problems as well, at least the ones that chemical engineers are typically interested in. And we switched a little bit.
We went into probability and models and data. And then we came back to more modeling in the form of stochastic simulation. And all along, this was threaded through with lots and lots of MATLAB. So hopefully, you learned to do programming if you didn't know that before.

MATLAB's actually a very useful programming tool to know. It's not so specific. There aren't specific quirks and details you need to understand like you would with a lower-level programming language like C or Fortran. It's very high level, and the concepts that you utilize there can be translated to other programming languages as well. So you may find at some point in your research you want to do a lot of data handling. MATLAB's not great for handling data, actually. Parsing large data sets is quite difficult in MATLAB. But there are toolkits that are built into programming environments like Python that are very efficient at doing that. And Python is also underpinned by similar numerical methods that are readily accessible. Everything you learned in MATLAB can be applied there as well. So you should have a broad understanding of how to draft an algorithm, how to modularize code, how to integrate several different numerical methods together to form a solution to a more complex problem.

So we did linear algebra. This was the basis of all the problem solving we did. Principally, it was concerned with the solution of systems of linear equations, though oftentimes that's sort of disguised. It isn't a direct system of linear equations we're trying to solve. But somehow we need to decompose a matrix. And in the process of decomposing it, we're essentially doing solution of systems of linear equations. One thing we were really concerned with was how linear algebra operations scale with the size of the problem that we're interested in. The bigger the system of equations gets, the slower the calculation is going to be. And how slow can that get?
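As a rough, minimal way to see that scaling for yourself in MATLAB, you can time the backslash solve on a few random systems of increasing size (the sizes below are arbitrary and the timings are machine dependent): dense Gaussian elimination costs on the order of n^3 operations, so doubling n should roughly multiply the solve time by eight.

    % Rough illustration of O(n^3) scaling for a dense solve (machine dependent).
    for n = [500 1000 2000]
        A = rand(n) + n*eye(n);    % random, well-conditioned test matrix
        b = rand(n, 1);
        tic; x = A \ b; t = toc;   % Gaussian elimination via backslash
        fprintf('n = %4d   solve time = %.3f s\n', n, t);
    end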
This introduces fundamental limits on what we can do computationally. When we try to solve partial differential equations, and we decompose them into linear equations, and we have huge numbers of states that we'd like to be able to resolve with a partial differential equation, the problem can get so complex so fast that a computer can't approach it. So being able to estimate the computational cost of these operations is critical for deciding whether a problem is feasible or infeasible. You'd like to know that before you start writing down a numerical solution to the problems. We focused a lot on that. There were fundamentals like arithmetic, solving linear equations numerically using Gaussian elimination, for example. We discussed decomposition in the form of eigenvalues and eigenvectors and, more generally, the singular value decomposition. And we discussed briefly iterative solutions of linear equations. That was sort of our segue into iterative solutions of nonlinear equations as well. So that was things like the Jacobi method or the Gauss-Seidel method, which are really operator-splitting methods of the same sort of class as the ones we talked about for PDEs, where we took that matrix and we split it into different parts. We split it in a way that would make it convenient to solve a simpler system of linear equations. And depending on the properties of the matrix, that solution might be achievable or not achievable. The iterative method may not converge. But for certain classes of problems, we could guarantee that it would converge, and it might do it very quickly.

Things we didn't cover, but that are important, I think: there's a notion called linear operator theory, which is an extension of linear algebra to infinite dimensions. You've seen some of this already in your transport class, for example, where you solve differential equations with sets of functions.
You have some infinite set of functions that you solve a differential equation with, an ordinary differential equation or a partial differential equation. And all of the underpinnings of that fall under linear operator theory, which is an extension of what you've learned already for linear algebra.

We talked about Gaussian elimination and the LU decomposition. You'll find in the literature there are lots of matrix decomposition methods. And the ones that are chosen are chosen often for problem-specific reasons. So there are things like the QR decomposition or the Cholesky decomposition or the Schur decomposition. These are technical terms. You can look them up. And you should feel confident that, if you read about them, you should be able to understand them given the basis in linear algebra you have now. But we didn't look at those in particular. And the state of the art in linear algebra and solving systems of linear equations lies in what I refer to as Krylov subspace methods. We talked about conjugate gradient as an iterative method for linear equations. There's a broader view of that that says Krylov subspace method. You may see that in the literature from time to time. It's a more nuanced discussion of the linear dependence, or linear independence, of columns in a matrix, and how you can take advantage of that to rapidly generate iterative solutions to those equations.

Any questions about linear algebra? No? Things are clear. You've done a lot of it this term.

Systems of nonlinear equations. So these are, of course, essential for any engineering problems of practical importance. Linear equations tend to be easy to solve overall. Most engineering problems that are interesting are nonlinear by nature. That's what makes them interesting. And we also saw later on that solving systems of nonlinear equations was intimately associated with optimization.
And so we discussed fundamentals like convergence of sequences and the rate of convergence of those sequences, stopping criteria for iterative methods. The best sort of nonlinear equation solvers with multiple equations and multiple unknowns are Newton-Raphson-type methods. They're the ones that are going to, if we're close enough to a solution, give us quadratic convergence, which is often more than we deserve. It's exceptionally fast, but it's not a globally convergent sort of methodology. So we also discussed quasi-Newton-Raphson methods that can help enhance the convergence of Newton-Raphson. And then we knew that we needed good initial guesses for solving these problems. We don't know how many solutions there are. We have no idea where they're located. And so we talked about homotopy and bifurcation as ways of generating those initial guesses and ways of tracking multiple roots or multiple solutions to the system of equations. I mentioned briefly during that lecture the concept of arclength continuation, which is a more advanced sort of homotopy method. And we didn't get to discuss that in detail because, actually, the underlying equation for arclength continuation is a DAE. OK, it's a fully implicit DAE that one has to solve in order to track the solution path as a function of the homotopy parameter. So we didn't cover this. We didn't try to solve any arclength continuation problems. But it's a very common method that's applied to solutions of nonlinear equations.

Then we moved on to optimization. So we need this for design. We use this for model fitting. And we discussed both unconstrained and constrained optimization. In the unconstrained context, we derived optimality conditions. How do we know that we found the solution to this problem, at least a locally optimal solution? And then we discussed steepest descent methods that could get us to that solution with linear convergence.
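As a minimal sketch of that steepest descent idea (the quadratic objective, fixed step size, and tolerance below are assumed for illustration, not taken from the lecture):

    % Steepest descent on an assumed toy objective f(x) = (x1-1)^2 + 10*(x2+2)^2.
    f     = @(x) (x(1) - 1)^2 + 10*(x(2) + 2)^2;
    gradf = @(x) [2*(x(1) - 1); 20*(x(2) + 2)];
    x     = [0; 0];                     % initial guess
    alpha = 0.04;                       % fixed step size (must be small enough)
    for k = 1:500
        g = gradf(x);
        if norm(g) < 1e-8, break; end   % stopping criterion on the gradient
        x = x - alpha*g;                % move along the negative gradient
    end
    fprintf('x = (%.4f, %.4f), f = %.2e\n', x(1), x(2), f(x))  % approaches (1, -2)

With a fixed step the convergence is only linear, which is exactly the behavior being contrasted with Newton-Raphson next.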
We discussed Newton-Raphson methods that can get us to that solution with quadratic convergence. We discussed dogleg methods that are heuristic by nature, but meant to mix the properties of steepest descent and Newton-Raphson to give us reliable convergence no matter how far we are from the solution.

We moved on to constrained optimization. Those problems came in two types, right, equality constraints and inequality constraints. And we discussed the method of Lagrange multipliers to handle equality constraints. We discussed interior point methods to handle inequality constraints, and their mixture to handle a combination of equality and inequality constraints. We left out a lot. Optimization is a huge field. So there are special classes of problems called linear programs, which are solved in a particular fashion. These are problems that have a linear dependence on the design variables. There are problems called integer programs where the design variables can only take on integer values. These might be very important. You can imagine trying to design a plant and asking what integer number of heat exchangers or separators is optimal.

Associated with constrained optimization and inequality constraints, there is an equivalent set of optimality conditions that we didn't try to derive. They're called the KKT conditions. So this comes up a lot in the literature. It's sort of the fundamentals of knowing that you found the optimum to the inequality-constrained problem. They're not difficult to derive. But we have a limited amount of time in the class. It's something fundamental that's sort of missing. And then these sorts of methods all find local optima. But oftentimes, we're interested in global optimization instead. And that's not something that we discussed at all. That's quite complicated and beyond the scope of the course. But things like genetic algorithms, that's a term that will come up a lot.
That's one way of trying to seek out global optima in a landscape.

ODE-IVPs, right, so now we want to move from static engineering situations to dynamics. And we were really concerned, when modeling dynamics, with both the accuracy and the stability of these methods. We were concerned about stability with other algorithms, too, for solutions of nonlinear equations. We were concerned with whether the Newton-Raphson method would converge or not. That's a stability issue in a sense. OK, so we were concerned with stability there. But here we really focused on it in detail, in part because some of these differential equations are inherently unstable. In engineering circumstances, that happens. And we would be unwilling to accept unstable solutions in situations where the equations themselves are stable. In the case of Newton-Raphson, we can do these sorts of quasi-Newton-Raphson methods to change the path the solution follows. And we didn't really care what path it followed. But here, the path is important. We're not free to change the path. So the method itself has to be inherently stable.

So we discussed numerical integration. We discussed two classes of methods, explicit methods and implicit methods. So explicit methods, we mentioned these explicit Runge-Kutta formulas that have been derived for all sorts of accuracy levels, both local truncation error and global truncation error. Explicit methods, unfortunately, you can't guarantee that they're always stable. They lack a property called A-stability. Implicit methods, on the other hand, can be made to be stable under all circumstances. So one of those is the backward Euler method. That's a nice one to use. The penalty for using these implicit methods is you likely have to solve a nonlinear equation at each time step. So there is a cost to pay associated with enhancing stability. We discussed local and global truncation error.
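To make the implicit idea concrete, here is a minimal backward Euler sketch for an assumed stiff scalar test equation, with a small Newton iteration solving the implicit equation at each step (the equation, step size, and tolerance are illustrative only):

    % Backward Euler for an assumed stiff test problem dy/dt = -50*(y - cos(t)).
    f  = @(t, y) -50*(y - cos(t));
    df = @(t, y) -50;                      % df/dy, used by the Newton iteration
    h = 0.05;  t = 0;  y = 1;
    for n = 1:round(2/h)                   % integrate out to t = 2
        tn = t + h;
        yn = y;                            % initial guess: previous value
        for it = 1:20                      % Newton on g(yn) = yn - y - h*f(tn,yn) = 0
            g = yn - y - h*f(tn, yn);
            if abs(g) < 1e-10, break; end
            yn = yn - g/(1 - h*df(tn, yn));
        end
        t = tn;  y = yn;
    end
    fprintf('y(%.1f) is approximately %.4f\n', t, y)

Even at a step size for which forward Euler would be unstable on this problem (|1 - 50h| > 1 here), the implicit update stays stable; the price is the nonlinear solve inside each step.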
And we talked about stiffness and linear stability criteria. So, linearizing the ordinary differential equation, assessing what the eigenvalues associated with the Jacobian of that linearization are, and using those eigenvalues to tell us, for example, what kind of time steps we need with our method in order to achieve stable integration in time.

Things we didn't cover that are sort of interesting and maybe important in an engineering context: one of those is event location. You may want to know the time at which something happens. So I may want to know when the solution hits a certain point. Maybe I even want to change my dynamics when the solution hits a particular point in time. There's a broader class of methods that can do that. They're built into MATLAB, too, actually. You can set up these events and have the dynamics change or have a report that an event occurred. But we didn't really discuss that at all.

And in numerical computing, parallelization is important for studying classes of big problems. It's sort of funny, but you can even do parallelization in time. So this is something that's been discussed for a long time. And for certain classes of problems, it offers computational advantages over serial integration of the dynamics of an ODE-IVP. So you can imagine, how do you parallelize in time? So you can take the time window that you're interested in and cut it up into different slices. And you could try to solve in parallel the dynamics over each of those slices. But for each slice, you need an initial condition that has to be precisely prescribed. And you don't know those initial conditions. That's what the serial integration would tell you. One way of doing parallel in time is to say, well, those initial conditions are unknown. And they match up to the terminal conditions of the previous slice. So why not wrap this serial integration process in a Newton-Raphson solver to determine each of these points?
With certain classes of problems, certain ODE-IVPs that are inherently stable, you can do that. And you can get a computational speedup over serial integration just by doing the integration for each of the slices on different computers and bringing all of the results together. The bigger the farm of computers you have, the more slices you can utilize. That's sort of interesting, right? Parallel in time, it's bizarre, because we usually think of time integration as being sort of causal. I have to know where I was to predict where I'm going. But you can actually divide these things up in more sophisticated ways.

We talked about BVPs and PDEs. We treated them as the same class of problem broadly. Usually, when chemical engineers are looking at these problems, we're interested in spatial variation of some sort of a conserved quantity, so momentum or energy or concentration of a species or mass. And here we were also concerned with both the accuracy and the stability of the methods, now for when we also had time integration of the solution in addition to spatial variation.

And so for BVPs, we talked about shooting methods. So can I recast a boundary value problem in one dimension as an initial value problem with an unknown? That's a very common way of solving these problems. And they are sometimes stable and sometimes not. When they're unstable, you can do what's called multiple shooting, where you divide the domain up into subdomains, and you shoot from the initial condition of each of those subdomains. It looks exactly like the parallel-in-time integration that I described for you before. And then you try to match up the initial conditions with the terminal conditions to get a continuous solution.

Then we discussed relaxation methods for solution of BVPs and PDEs. So collocation, Galerkin. Collocation, you'll recall, was where we want to try to satisfy the governing equation at various points in the solution domain.
So we might write our solution as the superposition of a set of functions, and then try to satisfy the equation at various points that we've laid out in the domain. Maybe there's even an optimal place to put all these points in order to minimize the error in the solution. Galerkin, instead, we tried to make the projection of the error in our equations onto different orthogonal functions that represented the solution be zero. So that was sort of a global error that we tried to minimize over different orthogonal functions. Whereas collocation is sort of a local error that we're trying to minimize, a local truncation error that we're trying to minimize.

We discussed finite difference and finite volume methods as well. So, ways of discretizing the equations. Finite difference, I would say, is the easiest one to reach for. If you had to pick a quick and dirty method to try to get a solution, finite difference can be really easy to go to. It's easy to figure out how big the errors are in your finite difference method. But it may not be the optimal approach to the problem. For certain geometries, finite volume is really far more preferable, because it's easy to estimate the fluxes through surfaces of funny shapes where your grid shape now conforms to your geometry. Finite volume also had the advantage of being strictly conservative. So I can never gain or lose a conserved quantity in the finite volume method, at least to within the numerical error. And that isn't true of finite difference or finite element. So if I'm concerned about those things, finite volume is really the approach to use. And you'll find one fluid mechanical solver that's freely available, that you can go download and use today, and that is widely used in research: OpenFOAM. And that's all based on the finite volume method. So they're trying to conserve momentum, conserve mass, and conserve energy associated with the fluid.
OpenFOAM is good at drawing bizarre grids of the domain and giving you very accurate integration of fluid mechanical equations. And it's good at doing that because it uses finite volume.

We discussed the method of lines, which was a way of discretizing space, but leaving the time dimension undiscretized and then applying ODE-IVP methods for solving in time. It's an incredibly useful way to approach those problems, because it leverages expertise in two different techniques. There are specialized methods for doing time integration with some spatial discretization that are vetted for their stability properties and their accuracy. And those are fine, but I would say they're a little antiquated. They're reliable for particular classes of problems. But for general problems, you may not have any idea whether they're going to work or not. Whereas method of lines may be a really good approach to a parabolic partial differential equation you're working with, where you can just rely on the adaptive time stepping and the error control in an ODE-IVP solver to keep everything stable in time as you integrate forward.

We did application of commercial software. So you used COMSOL to solve a problem. I don't know. Would you say COMSOL was easier to use than writing your own MATLAB code? Harder to use than writing your own MATLAB code, do you think? No.

AUDIENCE: Easier.

PROFESSOR: Easier to use. But the result, I would say, wasn't as good without some careful refinement of the COMSOL solution. So for that particular problem, if you wanted a solution that looked like your MATLAB solution that was carefully resolved in space, you really had to go in and refine the grid in particular places. So it gets harder and harder the more detail you want. But it's an easy way to get an answer. You should check whether that answer is right by comparing it with other methods. You can't guarantee that it's giving you the right result either.
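As a minimal sketch of the method-of-lines idea described above, here is an assumed example (the 1-D heat equation u_t = u_xx on [0,1] with zero boundary values, not a problem from the lecture): discretize in space with central differences and hand the resulting ODE system to a stiff ODE-IVP solver.

    % Method of lines for an assumed 1-D heat equation u_t = u_xx, u(0)=u(1)=0.
    N  = 50;                                       % interior grid points
    x  = linspace(0, 1, N+2)';
    dx = x(2) - x(1);
    e  = ones(N, 1);
    L  = spdiags([e -2*e e], -1:1, N, N) / dx^2;   % second-difference operator
    u0 = sin(pi*x(2:end-1));                       % initial condition
    [tout, U] = ode15s(@(t,u) L*u, [0 0.5], u0);   % stiff integrator handles time
    plot(x, [0; U(end,:)'; 0])                     % solution profile at the final time

The time stepping, error control, and stability are delegated to ode15s, which is exactly the division of labor the method of lines is meant to exploit.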
To give you an example of how that can go wrong, I had mentioned an unsteady advection-diffusion problem to one of my grad students. And he's very good with COMSOL. I mentioned that it was a hard problem because there were boundary layers. And the boundary layers changed thickness in time, and resolving those things can be quite challenging without any physical insight. And he said, well, I think I can just do it in COMSOL. And he tried it, and COMSOL was happy to report a solution in time and space, but the solution was nonsensical. We could visualize it and see that it didn't look like what other numerical solutions look like, and in certain limits, didn't correspond to what those solutions are known to be in those limits. So the black box is great in that it reports an answer. But it can also be really problematic as well. So maybe you have expertise with COMSOL. You can get a solution to converge for a difficult problem. But you should really have other methodologies that let you assess the quality of that solution. For simple problems, simple elliptic problems and simple parabolic problems, it's going to do great, because it's built to eat those for breakfast. But for complicated engineering problems, you've got to vet your solutions.

Oh, what didn't we cover? We didn't talk about hyperbolic equations. Not really. We talked a little bit about advection equations, which have a hyperbolic character associated with them. So you have, like, a pulse of solute, and it's advected along in time. And you want that advection to be stable, so you want to use things like upwind differencing to ensure that stability. But there's a broader class of hyperbolic problems that relate to things like the motion of waves in elastic solids. We don't typically encounter those in chemical engineering, but chemical engineering is a broad discipline. And maybe you're engineering soft materials for some sort of device, and you want to look at the elastic response of that material. The PDEs that govern that are a completely different class than the ones that we typically look at. They're not parabolic. They're not elliptic. They're hyperbolic, and they have these wave-like solutions associated with them. They require their own particular solution methods in order to be stable and accurate.

And then there are all sorts of specialized methods that are dreamed up to apply to, maybe, a broad class of problems, but it's kind of hard to ensure that they're going to work. But nonetheless, you'll see a lot of this out in the literature. So there are things called multigrid methods. They try to discretize a partial differential equation with different levels of fineness. And you use coarse solutions as initial guesses for finer and finer solutions in space. For some problems, this works great. And in fact, you can show that it can work as well as is possible. So depending on the size of the problem, say you have n points at which you want to find the solution, you can find it in order n time with some of these methods, depending on the particular problem you're looking at. That's good. But oftentimes, engineering problems aren't of that sort. There's something more complicated going on. And it's hard to ensure that you get those sorts of rates of convergence. But nonetheless, they exist, and they're something that we didn't cover that you should be aware of.

We did DAEs. So now, we coupled ODE-IVPs with nonlinear algebraic equations. Most dynamic engineering problems are of this type. We showed that DAEs are really an issue with model formulation ultimately. We write down conservation equations, and we write down certain constraints on those conservation equations. The constraints can be specifications, like we want particular flow rates in certain places. Or they can be fundamental.
They can be thermodynamic constraints on the coexistence of different phases of a particular material in a particular unit operation. If we could solve those constraints for certain unknowns and substitute them into the equations, we might be able to produce ODE-IVPs. But probably we can't come up with those solutions. So instead, we've always got this coupled system of initial value problems with nonlinear equations.

So you have that sort of a circumstance. How should you think about that? One way to think about it was to say that the algebraic equations are essentially infinitely stiff. They have to be imposed exactly at every time step. They never relax in time with a finite time constant. And so we knew that those sorts of problems are hard to resolve with typical ODE solvers. And we tried to assess that by computing something called the differential index. So we asked how many times we have to take derivatives of the algebraic equations, or any new algebraic constraints, in order to get a system of ordinary differential equations to represent the same model. I may want to solve that system of ordinary differential equations. Or I may just want to know the differential index so that I can choose a particular method to solve this problem. We saw the higher the index was, the more sensitive the solution was to perturbations in different input variables. So the differential index played an important role in telling us something physical about the problem and the responses that could be elicited by DAE systems.

We also talked about consistent initialization. I need to prescribe the initial conditions for all the differential and algebraic variables. Can I prescribe any of those independently? Well, the answer is actually no. Depending on the differential index, there are underlying algebraic constraints that have to be satisfied at all points in time. And if they're not, then that can break the method that's trying to integrate the DAE.
690 00:31:12,385 --> 00:31:15,690 So you have to prescribe the initial conditions 691 00:31:15,690 --> 00:31:17,400 consistently. 692 00:31:17,400 --> 00:31:19,740 If they're not prescribed consistently, 693 00:31:19,740 --> 00:31:22,000 and you go to a piece of commercial software, 694 00:31:22,000 --> 00:31:23,310 it may give an error. 695 00:31:23,310 --> 00:31:25,370 It may tell you nothing. 696 00:31:25,370 --> 00:31:28,190 You may get a reasonable result or an unreasonable result. 697 00:31:28,190 --> 00:31:30,210 It really depends on the methodology. 698 00:31:30,210 --> 00:31:33,420 So it's up to you to know in advance 699 00:31:33,420 --> 00:31:37,730 that this is an important aspect of the problem. 700 00:31:37,730 --> 00:31:41,190 We solved index-1 DAEs typically. 701 00:31:41,190 --> 00:31:44,370 Index-2 DAEs can be converted into index-1 DAEs 702 00:31:44,370 --> 00:31:45,480 and solved pretty easily. 703 00:31:45,480 --> 00:31:49,470 Index-3 and bigger DAEs are difficult because 704 00:31:49,470 --> 00:31:51,150 of their high sensitivities. 705 00:31:51,150 --> 00:31:53,434 So there's special software that's out there. 706 00:31:53,434 --> 00:31:55,350 I mentioned some of it during our DAE lecture. 707 00:31:55,350 --> 00:31:57,474 And that's really what you want to reach for if you 708 00:31:57,474 --> 00:31:59,970 have a high index DAE. 709 00:31:59,970 --> 00:32:03,510 Examples, we were able to craft all sorts of examples 710 00:32:03,510 --> 00:32:09,670 where we tried to do funny things with causality, where 711 00:32:09,670 --> 00:32:15,450 you tried to change the input by affecting the output, which 712 00:32:15,450 --> 00:32:16,890 doesn't make a lot of sense. 713 00:32:16,890 --> 00:32:20,040 But that led to very high index DAEs. 714 00:32:20,040 --> 00:32:22,920 Mechanical systems often have very high index DAEs 715 00:32:22,920 --> 00:32:24,780 associated with the mechanical constraints, 716 00:32:24,780 --> 00:32:27,640 typically lead to index-3 DAEs. 717 00:32:27,640 --> 00:32:30,720 So these pop up in places that are important. 718 00:32:30,720 --> 00:32:32,430 And being able to look at the model 719 00:32:32,430 --> 00:32:36,202 and assess the index of it is kind of essential. 720 00:32:39,300 --> 00:32:40,890 We did probability. 721 00:32:40,890 --> 00:32:44,492 So the physical world has inherent uncertainty. 722 00:32:44,492 --> 00:32:46,950 It's not just uncertainty in our ability to measure things, 723 00:32:46,950 --> 00:32:49,140 but fundamental, physical uncertainty 724 00:32:49,140 --> 00:32:51,550 is built into the world around us. 725 00:32:51,550 --> 00:32:53,580 So measurements have uncontrolled or even 726 00:32:53,580 --> 00:32:56,100 uncontrollable sensitivities to this uncertainty. 727 00:32:56,100 --> 00:32:58,370 And we wanted to know how to handle it. 728 00:32:58,370 --> 00:33:01,290 So we talked about things like sources of randomness. 729 00:33:01,290 --> 00:33:02,700 We did fundamentals. 730 00:33:02,700 --> 00:33:04,310 Maybe this overlapped with 10.40. 731 00:33:04,310 --> 00:33:07,530 But I think overlap is good, because humans 732 00:33:07,530 --> 00:33:11,470 tend to have bad intuition about probability in general. 733 00:33:11,470 --> 00:33:14,970 So we discuss probability densities, means, covariances, 734 00:33:14,970 --> 00:33:17,010 expected values, joint probabilities, 735 00:33:17,010 --> 00:33:19,750 conditional probabilities. 736 00:33:19,750 --> 00:33:22,110 We talked about common probability densities. 
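For the index-1 case, a semi-explicit DAE can be handed to ode15s with a singular mass matrix; here is a minimal sketch on an assumed toy system (one differential equation plus one algebraic constraint), started from a consistent initial condition:

    % Assumed semi-explicit index-1 DAE:  y1' = -y1 + y2,   0 = y1 + y2 - 1.
    M = [1 0; 0 0];                          % singular mass matrix: row 2 is algebraic
    f = @(t, y) [-y(1) + y(2);               % differential equation
                  y(1) + y(2) - 1];          % algebraic constraint
    opts = odeset('Mass', M, 'RelTol', 1e-8, 'AbsTol', 1e-10);
    y0 = [0.2; 0.8];                         % consistent: the constraint holds at t = 0
    [t, y] = ode15s(f, [0 5], y0, opts);
    plot(t, y)                               % y1 and y2 relax while y1 + y2 stays at 1

Starting from an inconsistent y0 is exactly the situation described above, and what happens then depends on the solver and its settings.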
We did probability. So the physical world has inherent uncertainty. It's not just uncertainty in our ability to measure things, but fundamental, physical uncertainty is built into the world around us. So measurements have uncontrolled or even uncontrollable sensitivities to this uncertainty. And we wanted to know how to handle it. So we talked about things like sources of randomness. We did fundamentals. Maybe this overlapped with 10.40. But I think overlap is good, because humans tend to have bad intuition about probability in general. So we discussed probability densities, means, covariances, expected values, joint probabilities, conditional probabilities. We talked about common probability densities. You covered the central limit theorem. You talked about the difference between sample averages and the population mean.

One thing that's important that I don't think gets covered much in detail: if you're going to be working with randomness computationally, oftentimes you have to generate random numbers. And you'd like to generate good random numbers. It turns out a computer doesn't generate proper random numbers, only pseudorandom numbers. And these generators have various properties associated with them. And some of them are good, and some of them are bad. And you want to use good ones. So if you have to generate random numbers for your research, you'd like to know that they are high quality, that the sequence of numbers that's being generated doesn't repeat over the time that you're using it, that it won't overlap over many different uses of this methodology. So high-quality random numbers are important.

We discussed models versus data. So we wanted to regress, estimate parameters from the data. We wanted to do things like hypothesis testing. I have a model. Is the model consistent with the data, or is it not consistent with the data? So you covered things like linear and nonlinear least squares, maximum likelihood estimates, confidence intervals found using the chi-square distribution. We did Bayes' theorem and Bayesian parameter estimation as well, so a way of using prior information about parameter values to better estimate the probability that those parameters will take on values given the data. We didn't cover design of experiments. That's a 10.551 topic. But there are specific ways that one can design the experiment. Which data values do I take in order to make the fitting of parameters optimal? And there are other problems besides regression that are important. But we don't typically utilize those in chemical engineering. So one of those is classification.
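As a minimal sketch of an ordinary linear least-squares fit, with synthetic data drawn from a seeded pseudorandom generator (the model, noise level, and true parameter values are assumed for illustration):

    % Fit y = b1 + b2*x to assumed synthetic data by linear least squares.
    rng(0);                                    % seed the pseudorandom generator
    x = linspace(0, 1, 20)';
    y = 2 + 3*x + 0.1*randn(size(x));          % "data" with true parameters 2 and 3
    A = [ones(size(x)) x];                     % design matrix
    beta = A \ y;                              % least-squares estimate of [b1; b2]
    r  = y - A*beta;
    s2 = (r'*r) / (numel(y) - 2);              % estimated noise variance
    C  = s2 * inv(A'*A);                       % approximate parameter covariance
    fprintf('b1 = %.3f +/- %.3f,  b2 = %.3f +/- %.3f\n', ...
            beta(1), sqrt(C(1,1)), beta(2), sqrt(C(2,2)))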
782 00:35:33,530 --> 00:35:36,430 So if I have a series of points, and I do regression, 783 00:35:36,430 --> 00:35:38,440 I'm in some sense asking, what's 784 00:35:38,440 --> 00:35:42,070 the best curve, according to this model, that can be drawn 785 00:35:42,070 --> 00:35:43,326 through those points, right? 786 00:35:43,326 --> 00:35:44,950 What parameter values give me the curve 787 00:35:44,950 --> 00:35:47,540 that's closest to all these points? That's least squares. 788 00:35:47,540 --> 00:35:49,330 Classification might ask instead, 789 00:35:49,330 --> 00:35:53,050 what's the curve that divides these points most evenly, 790 00:35:53,050 --> 00:35:56,370 or sits as far as possible from all of these points 791 00:35:56,370 --> 00:35:57,280 instead? 792 00:35:57,280 --> 00:36:01,450 Which side of this line do the points sit on? 793 00:36:01,450 --> 00:36:03,300 Are they of type A or type B? 794 00:36:03,300 --> 00:36:05,700 It's kind of a related problem to regression. 795 00:36:05,700 --> 00:36:07,900 But it's a different sort of problem. 796 00:36:07,900 --> 00:36:10,420 The same sorts of methods can be applied to classification 797 00:36:10,420 --> 00:36:12,940 that you apply to doing regression and parameter 798 00:36:12,940 --> 00:36:14,137 estimation. 799 00:36:14,137 --> 00:36:15,220 It's an important problem. 800 00:36:20,620 --> 00:36:23,410 And then we finished up with stochastic simulation. 801 00:36:23,410 --> 00:36:27,610 So you learned that sampling of random processes 802 00:36:27,610 --> 00:36:30,040 can be used for engineering calculations. 803 00:36:30,040 --> 00:36:32,260 And there are even inherently random physical 804 00:36:32,260 --> 00:36:36,280 processes that you might want to model reliably. 805 00:36:36,280 --> 00:36:38,500 So we did Metropolis Monte Carlo. 806 00:36:38,500 --> 00:36:41,950 This was a way of basically computing 807 00:36:41,950 --> 00:36:43,360 high dimensional integrals. 808 00:36:43,360 --> 00:36:45,610 That's what it boiled down to, where 809 00:36:45,610 --> 00:36:48,790 we wanted to sample from different places 810 00:36:48,790 --> 00:36:53,630 in the domain, weighted by some probability density function. 811 00:36:53,630 --> 00:36:56,560 You did kinetic Monte Carlo, which 812 00:36:56,560 --> 00:37:00,280 is really a sort of event-driven algorithm. 813 00:37:00,280 --> 00:37:02,920 You are expecting that certain events are going 814 00:37:02,920 --> 00:37:05,320 to occur with certain rates. 815 00:37:05,320 --> 00:37:07,420 There are actually nonstochastic versions 816 00:37:07,420 --> 00:37:09,532 of that, the same sort of event-driven algorithm. 817 00:37:09,532 --> 00:37:10,990 I know certain things should happen 818 00:37:10,990 --> 00:37:12,690 at certain points in time. 819 00:37:12,690 --> 00:37:15,040 And rather than integrating the dynamics of the system 820 00:37:15,040 --> 00:37:17,860 all along these points in time, I just advance the dynamics 821 00:37:17,860 --> 00:37:19,630 until the next event occurs. 822 00:37:19,630 --> 00:37:21,460 That event occurs. 823 00:37:21,460 --> 00:37:24,760 I keep a list of what the next event is going to be. 824 00:37:24,760 --> 00:37:26,110 And I follow those events along. 825 00:37:26,110 --> 00:37:29,320 So it could be something like billiard balls on a table. 826 00:37:29,320 --> 00:37:32,320 I know the trajectories of the billiard balls, 827 00:37:32,320 --> 00:37:34,390 given their positions and their momenta.
828 00:37:34,390 --> 00:37:37,324 And all I need to do is keep track of when they collide 829 00:37:37,324 --> 00:37:38,740 and where they're going to go next 830 00:37:38,740 --> 00:37:39,870 and what the next events are going to be. 831 00:37:39,870 --> 00:37:41,480 Well, kinetic Monte Carlo is the same thing, 832 00:37:41,480 --> 00:37:43,270 except the events are now stochastic. 833 00:37:43,270 --> 00:37:46,900 They occur with prescribed rates. 834 00:37:46,900 --> 00:37:50,380 And so I just need to sample from this distribution of rates 835 00:37:50,380 --> 00:37:53,002 to get the right sequence of events. 836 00:37:53,002 --> 00:37:54,960 There are funny things about kinetic Monte Carlo. 837 00:37:54,960 --> 00:37:59,140 So applied to chemical kinetics, you can estimate these rates 838 00:37:59,140 --> 00:38:00,091 reasonably well. 839 00:38:00,091 --> 00:38:02,590 You've got to know the rates for this process to look right. 840 00:38:02,590 --> 00:38:04,180 If you don't have the rates correct, 841 00:38:04,180 --> 00:38:06,780 or you don't have the relative magnitudes of the rates right, 842 00:38:06,780 --> 00:38:10,300 then the simulation is kind of meaningless. 843 00:38:10,300 --> 00:38:14,170 The result is going to be nonsense or garbage. 844 00:38:14,170 --> 00:38:16,420 And so there are going to be limitations on your ability 845 00:38:16,420 --> 00:38:20,080 to do this based upon how accurately you 846 00:38:20,080 --> 00:38:22,300 can estimate these rates. 847 00:38:22,300 --> 00:38:25,240 It also means that if you know that there are disparities 848 00:38:25,240 --> 00:38:27,010 in the rates, say one rate is very high 849 00:38:27,010 --> 00:38:29,500 and one rate is very low, you can make certain assumptions 850 00:38:29,500 --> 00:38:30,630 about the process. 851 00:38:30,630 --> 00:38:32,770 So things that happen very quickly 852 00:38:32,770 --> 00:38:35,020 you may treat as though they are equilibrated 853 00:38:35,020 --> 00:38:38,170 and not try to resolve them at all, and look more at infrequent 854 00:38:38,170 --> 00:38:39,920 processes that occur at slower rates. 855 00:38:39,920 --> 00:38:46,230 So the precision with which you have to know these things 856 00:38:46,230 --> 00:38:47,140 is important. 857 00:38:47,140 --> 00:38:48,640 The relative rates are what's really 858 00:38:48,640 --> 00:38:49,720 important in these problems. 859 00:38:49,720 --> 00:38:51,345 And you want to get them right in order 860 00:38:51,345 --> 00:38:54,040 to resolve these physical processes correctly. 861 00:38:54,040 --> 00:38:55,540 Then we discussed molecular dynamics 862 00:38:55,540 --> 00:38:59,590 after that, which is another type of stochastic simulation 863 00:38:59,590 --> 00:39:00,250 methodology. 864 00:39:00,250 --> 00:39:02,170 Here you just integrate the equations 865 00:39:02,170 --> 00:39:03,880 of motion associated with whatever 866 00:39:03,880 --> 00:39:07,210 particles you're interested in. 867 00:39:07,210 --> 00:39:09,340 Kinetic Monte Carlo and molecular dynamics 868 00:39:09,340 --> 00:39:12,970 are sort of fundamentally far from equilibrium, 869 00:39:12,970 --> 00:39:14,740 but may relax towards equilibrium, 870 00:39:14,740 --> 00:39:18,760 depending on what constraints you put on the particular rate 871 00:39:18,760 --> 00:39:20,590 processes that you're modeling.
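As a minimal sketch of that event-driven idea, here is a kinetic Monte Carlo loop for a toy reversible reaction A <-> B; the rate constants and initial counts are made up for illustration.

    % Minimal kinetic Monte Carlo (Gillespie-type) sketch for A <-> B.
    % Rate constants and molecule counts are illustrative, not from the lecture.
    rng(2, 'twister');
    kf = 1.0;  kr = 0.3;          % assumed forward/reverse rate constants
    nA = 100;  nB = 0;            % initial molecule counts
    t  = 0;    tEnd = 20;
    while t < tEnd
        a  = [kf*nA, kr*nB];      % propensities of the two events
        a0 = sum(a);
        if a0 == 0, break; end
        t = t - log(rand)/a0;     % time to the next event (exponential)
        if rand*a0 < a(1)         % pick which event fires
            nA = nA - 1;  nB = nB + 1;   % A -> B
        else
            nA = nA + 1;  nB = nB - 1;   % B -> A
        end
    end
    fprintf('t = %.2f: nA = %d, nB = %d\n', t, nA, nB);

The key point is that the waiting time to the next event is drawn from an exponential distribution set by the total rate, and which event fires is chosen in proportion to the individual rates, which is why getting the relative rates right matters so much.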
872 00:39:20,590 --> 00:39:22,660 Metropolis Monte Carlo, if we're 873 00:39:22,660 --> 00:39:26,530 applying it 874 00:39:26,530 --> 00:39:28,660 in the statistical mechanical sense, 875 00:39:28,660 --> 00:39:31,480 is usually sampling an underlying equilibrium 876 00:39:31,480 --> 00:39:34,120 Boltzmann distribution. 877 00:39:34,120 --> 00:39:38,530 So they often get used in fundamentally different ways 878 00:39:38,530 --> 00:39:40,360 as well. 879 00:39:40,360 --> 00:39:43,930 We didn't cover stochastic differential equations. 880 00:39:43,930 --> 00:39:47,770 So depending on which degrees of freedom 881 00:39:47,770 --> 00:39:52,161 you want to represent in a model like molecular dynamics, 882 00:39:52,161 --> 00:39:53,660 there may be some degrees of freedom 883 00:39:53,660 --> 00:39:56,500 you simply don't care about. 884 00:39:56,500 --> 00:39:58,460 But they have an influence on the problem. 885 00:39:58,460 --> 00:40:01,270 So in my research, for example, we model 886 00:40:01,270 --> 00:40:04,000 nanoparticles and particulates. 887 00:40:04,000 --> 00:40:07,840 We're not very interested in the details of the solvent 888 00:40:07,840 --> 00:40:09,010 surrounding these particles. 889 00:40:09,010 --> 00:40:12,190 There's a huge disparity in length scales between 890 00:40:12,190 --> 00:40:15,682 the nanoparticles and the molecular dimensions 891 00:40:15,682 --> 00:40:17,140 of the solvent that surrounds them. 892 00:40:17,140 --> 00:40:19,330 The solvent has an influence nonetheless. 893 00:40:19,330 --> 00:40:21,082 It's fluctuating all the time. 894 00:40:21,082 --> 00:40:23,290 Solvent molecules are colliding with these particles, 895 00:40:23,290 --> 00:40:25,480 and they're imparting momentum. 896 00:40:25,480 --> 00:40:27,730 One way of resolving a system like that 897 00:40:27,730 --> 00:40:30,070 without having to look explicitly at the solvent 898 00:40:30,070 --> 00:40:33,430 is to try to integrate out the solvent degrees of freedom 899 00:40:33,430 --> 00:40:37,120 and treat those collisions of the solvent with these 900 00:40:37,120 --> 00:40:40,120 particles in a stochastic sense. 901 00:40:40,120 --> 00:40:42,910 So at various points in time, there's 902 00:40:42,910 --> 00:40:47,680 momentum that's randomly transferred to particles 903 00:40:47,680 --> 00:40:48,970 dispersed in a solvent. 904 00:40:48,970 --> 00:40:52,454 The physical manifestation of that is Brownian motion. 905 00:40:52,454 --> 00:40:54,370 When you look at small particles in the fluid, 906 00:40:54,370 --> 00:40:56,209 you see them diffuse around, and that's 907 00:40:56,209 --> 00:40:57,250 where that's coming from. 908 00:40:57,250 --> 00:40:59,710 So you can simulate processes like that 909 00:40:59,710 --> 00:41:02,320 without having to resolve molecular scales. 910 00:41:02,320 --> 00:41:04,930 So as you go up in scale, you integrate out 911 00:41:04,930 --> 00:41:07,510 degrees of freedom, but oftentimes, that 912 00:41:07,510 --> 00:41:10,210 introduces inherent stochasticity 913 00:41:10,210 --> 00:41:12,190 in the underlying dynamics of whatever 914 00:41:12,190 --> 00:41:13,290 you're trying to model. 915 00:41:13,290 --> 00:41:14,836 That's very interesting. 916 00:41:14,836 --> 00:41:16,210 And then for Monte Carlo methods, 917 00:41:16,210 --> 00:41:18,280 there are advanced sampling methods. 918 00:41:18,280 --> 00:41:20,510 And we didn't discuss any of these in detail.
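As a rough illustration of that idea, and not something covered in the course, here is a minimal Euler-Maruyama sketch of overdamped Brownian dynamics in one dimension, with nondimensional, made-up parameters; the solvent enters only through a friction coefficient and random kicks.

    % Minimal sketch (made-up, nondimensional parameters): overdamped
    % Brownian dynamics, dx = -(1/gamma)*U'(x)*dt + sqrt(2*D*dt)*noise,
    % integrated with Euler-Maruyama. The solvent appears only through
    % the friction coefficient gamma and the thermal energy kT.
    rng(3, 'twister');
    kT    = 1.0;   gamma = 1.0;        % assumed thermal energy and friction
    D     = kT/gamma;                  % diffusivity (fluctuation-dissipation)
    k     = 2.0;                       % assumed spring constant, U = k*x^2/2
    dt    = 1e-3;  nSteps = 1e5;
    x     = zeros(nSteps, 1);
    for i = 2:nSteps
        drift = -(k/gamma) * x(i-1);                      % deterministic force term
        x(i)  = x(i-1) + drift*dt + sqrt(2*D*dt)*randn;   % random solvent kicks
    end
    % Long-time variance should approach kT/k (equipartition), 0.5 here
    fprintf('sample variance = %.3f (kT/k = %.3f)\n', var(x(end/2:end)), kT/k);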
919 00:41:20,510 --> 00:41:23,920 But you saw some of these advanced sampling ideas 920 00:41:23,920 --> 00:41:26,920 with the problem you approached in your homework, 921 00:41:26,920 --> 00:41:30,790 where you had two populations, or two peaks, 922 00:41:30,790 --> 00:41:33,670 in the probability distribution that were widely separated. 923 00:41:33,670 --> 00:41:35,980 And it may be hard for a Monte Carlo method 924 00:41:35,980 --> 00:41:37,960 to traverse between those peaks. 925 00:41:37,960 --> 00:41:41,920 And umbrella sampling is a way of biasing the probability 926 00:41:41,920 --> 00:41:44,290 distribution to make those paths from one 927 00:41:44,290 --> 00:41:47,430 peak to another more probable. 928 00:41:47,430 --> 00:41:50,650 So it's just a way of seeking out rare events, 929 00:41:50,650 --> 00:41:52,900 like the transitions between these two 930 00:41:52,900 --> 00:41:54,700 different populations. 931 00:41:54,700 --> 00:41:56,950 And it's pretty important, ultimately, 932 00:41:56,950 --> 00:42:00,400 for exploring complex high dimensional spaces. 933 00:42:00,400 --> 00:42:02,530 But you have to understand something 934 00:42:02,530 --> 00:42:04,930 about the underlying physics in order 935 00:42:04,930 --> 00:42:08,230 to figure out the biases needed to enhance the sampling. 936 00:42:08,230 --> 00:42:12,190 So it's quite complicated. 937 00:42:12,190 --> 00:42:13,150 So I'll remind you. 938 00:42:13,150 --> 00:42:15,460 Your final exam, right, it's Monday. 939 00:42:15,460 --> 00:42:18,110 It's in 56-154 at 9:00 AM. 940 00:42:18,110 --> 00:42:19,110 Don't show up to Walker. 941 00:42:19,110 --> 00:42:20,460 It's not going to be there. 942 00:42:20,460 --> 00:42:22,439 It's three hours. 943 00:42:22,439 --> 00:42:23,730 It's going to be comprehensive. 944 00:42:23,730 --> 00:42:26,830 You can expect roughly one problem for each thematic topic. 945 00:42:26,830 --> 00:42:27,705 We've written it now. 946 00:42:27,705 --> 00:42:30,660 I think it's nine problems long. 947 00:42:30,660 --> 00:42:32,760 So we're in the process of revising it 948 00:42:32,760 --> 00:42:34,740 for difficulty and time. 949 00:42:34,740 --> 00:42:36,870 It's going to be short answer format. 950 00:42:36,870 --> 00:42:41,580 We sort of aim at 10 to 15 minutes per problem. 951 00:42:41,580 --> 00:42:44,100 I think that's the sort of time frame you should be looking at. 952 00:42:44,100 --> 00:42:45,808 The problems are structured in such a way 953 00:42:45,808 --> 00:42:48,930 that that's about how long they should take. 954 00:42:48,930 --> 00:42:53,940 You should use that to guide your temporal budget 955 00:42:53,940 --> 00:42:55,200 on the exam. 956 00:42:55,200 --> 00:42:59,140 If one problem is taking too long, switch to something else. 957 00:42:59,140 --> 00:43:03,420 Don't let yourself get hung up focusing on one thing. 958 00:43:03,420 --> 00:43:05,490 Move around and come back to it later. 959 00:43:05,490 --> 00:43:07,145 There will be some fundamentals. 960 00:43:07,145 --> 00:43:09,270 There's going to be some translation of engineering 961 00:43:09,270 --> 00:43:10,290 into numerics. 962 00:43:10,290 --> 00:43:12,570 And there's going to be some MATLAB. 963 00:43:12,570 --> 00:43:15,180 So it'll look like a blend of the first exam 964 00:43:15,180 --> 00:43:16,153 and the second exam. 965 00:43:19,104 --> 00:43:19,604 Yeah. 966 00:43:19,604 --> 00:43:20,786 AUDIENCE: When you say it's on MATLAB, 967 00:43:20,786 --> 00:43:22,077 do you mean [INAUDIBLE] MATLAB?
968 00:43:22,077 --> 00:43:23,055 PROFESSOR: Yeah. 969 00:43:23,055 --> 00:43:24,264 AUDIENCE: [INAUDIBLE]. 970 00:43:24,264 --> 00:43:24,930 PROFESSOR: Yeah. 971 00:43:24,930 --> 00:43:26,030 AUDIENCE: No computers. 972 00:43:26,030 --> 00:43:27,030 PROFESSOR: No computers. 973 00:43:27,030 --> 00:43:30,490 It's just write a little script. 974 00:43:30,490 --> 00:43:32,520 We've asked problems like that before. 975 00:43:32,520 --> 00:43:35,804 It'll be of the same type. 976 00:43:35,804 --> 00:43:36,304 Yes. 977 00:43:36,304 --> 00:43:37,679 AUDIENCE: You said nine problems. 978 00:43:37,679 --> 00:43:38,760 [INAUDIBLE] 979 00:43:38,760 --> 00:43:40,980 PROFESSOR: I think there's nine, yeah. 980 00:43:40,980 --> 00:43:43,641 Well, that's how many we've written now. 981 00:43:43,641 --> 00:43:45,140 I'm not planning on writing any more. 982 00:43:45,140 --> 00:43:45,750 AUDIENCE: Plus or minus one. 983 00:43:45,750 --> 00:43:47,208 PROFESSOR: Plus or minus one, yeah. 984 00:43:47,208 --> 00:43:50,249 AUDIENCE: [INAUDIBLE]. 985 00:43:50,249 --> 00:43:51,040 PROFESSOR: Oh sure. 986 00:43:56,070 --> 00:44:00,010 Listen, if we gave you nine problems of the same length 987 00:44:00,010 --> 00:44:02,590 as on the first two quizzes, you wouldn't 988 00:44:02,590 --> 00:44:04,670 get through them. 989 00:44:04,670 --> 00:44:07,060 So they're not written to be so long. 990 00:44:07,060 --> 00:44:08,980 But there are multiple parts. 991 00:44:08,980 --> 00:44:11,110 But the multiple parts are usually intended 992 00:44:11,110 --> 00:44:14,200 to guide you to the solution we're looking for. 993 00:44:14,200 --> 00:44:17,314 Usually, we're testing for a particular understanding. 994 00:44:17,314 --> 00:44:19,480 It also helps you out when there are multiple parts. 995 00:44:19,480 --> 00:44:21,460 So there's lots of partial credit available. 996 00:44:21,460 --> 00:44:23,043 You don't have to get the whole thing. 997 00:44:23,043 --> 00:44:25,580 If you just get two out of the three parts, 998 00:44:25,580 --> 00:44:26,671 you did pretty well. 999 00:44:26,671 --> 00:44:29,617 OK, more questions? 1000 00:44:34,036 --> 00:44:36,000 AUDIENCE: So when you say short answer, 1001 00:44:36,000 --> 00:44:41,646 does that just mean similar types of problems 1002 00:44:41,646 --> 00:44:43,365 to what we had on the first two quizzes, 1003 00:44:43,365 --> 00:44:44,240 just shorter? 1004 00:44:44,240 --> 00:44:45,730 PROFESSOR: Yes, that's right. 1005 00:44:45,730 --> 00:44:46,384 Yeah. 1006 00:44:46,384 --> 00:44:48,080 AUDIENCE: [INAUDIBLE]. 1007 00:44:48,080 --> 00:44:52,995 PROFESSOR: Yeah, they're not multiple choice, true or false. 1008 00:44:52,995 --> 00:44:53,870 They're short answer. 1009 00:45:01,760 --> 00:45:04,390 You want to hit a level of detail that shows us 1010 00:45:04,390 --> 00:45:08,680 you understand what you're talking about, right. 1011 00:45:08,680 --> 00:45:12,400 We don't need pages of information on these things. 1012 00:45:12,400 --> 00:45:14,590 But we need to have enough information that we can 1013 00:45:14,590 --> 00:45:16,480 see you know what you're doing. 1014 00:45:16,480 --> 00:45:19,240 We're trying to assess your understanding of the material. 1015 00:45:19,240 --> 00:45:19,938 That's all. 1016 00:45:23,842 --> 00:45:25,306 More questions? 1017 00:45:30,186 --> 00:45:33,470 No more questions, OK. 1018 00:45:33,470 --> 00:45:33,970 Well, good. 1019 00:45:33,970 --> 00:45:36,850 It's been a real pleasure teaching you guys.
1020 00:45:36,850 --> 00:45:39,754 I'm really pleased with the outcome of this course. 1021 00:45:39,754 --> 00:45:41,170 This is my third time teaching it. 1022 00:45:41,170 --> 00:45:43,090 I think this is the strongest group that I've 1023 00:45:43,090 --> 00:45:45,610 seen come through 10.34. 1024 00:45:45,610 --> 00:45:48,190 So you guys should be really proud of yourselves. 1025 00:45:51,580 --> 00:45:55,020 Good, all right, good luck.