ANNOUNCER: Open content is provided under a Creative Commons license. Your support will help MIT OpenCourseWare continue to offer high-quality educational resources for free. To make a donation, or view additional materials from hundreds of MIT courses, visit MIT OpenCourseWare at ocw.mit.edu.

PROFESSOR JOHN GUTTAG: All right. That said, let's continue. If you remember, last time we ended up looking at this thing I called square root bi. This was using something called the bisection method, which is related to something called binary search, which we'll see lots more of later, to find square roots. And the basic idea was that we had some sort of a line, and we knew the answer was somewhere between this point and this point. The line is totally ordered. And what that means is that anything here is smaller than anything to its right. So the integers are totally ordered, the reals are totally ordered, the rationals are totally ordered; lots of things are. And the idea was, we make a guess in the middle and we test it, so this is kind of a guess and check, and if the answer was too big, then we knew that we should be looking over here. If it was too small, we knew we should be looking over here, and then we would repeat.
So this is very similar to the kind of recursive thinking we talked about earlier, where we take our problem and we make it smaller, we solve a smaller problem, et cetera. All right. So now I've got the code up for you. I want you to notice the specification to start. We're assuming that x is greater than or equal to 0 and epsilon is strictly greater than 0, and we're going to return some value y such that y squared is within epsilon of x. Last time I talked about the two assert statements. In some sense, strictly speaking, they shouldn't be necessary, because the fact that my specification starts with an assumption says: hey you, who might call square root, make sure that the things you call me with obey the assumption. On the other hand, as I said, never trust a programmer to do the right thing, so we're going to check it. And just in case the assumptions are not true, we're just going to stop dead in our tracks. All right. Then we're going to set low and high, and we're going to perform exactly the process I talked about. And along the way, I'm keeping track of how many iterations; at the end I'll print how many iterations I took before I return the final guess. All right, let's test it.
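A sketch of the routine as described so far — the function and variable names here are reconstructions, not necessarily the lecture's exact code — might look like this (note the initial bounds, which will matter in a moment):

```python
def square_root_bi(x, epsilon):
    """Assumes x >= 0 and epsilon > 0.
    Returns y such that y**2 is within epsilon of x."""
    assert x >= 0, 'x must be non-negative'
    assert epsilon > 0, 'epsilon must be positive'
    low = 0.0
    high = x                      # the answer is assumed to lie in [low, high]
    guess = (low + high) / 2.0
    ctr = 1
    while abs(guess**2 - x) > epsilon and ctr <= 100:
        if guess**2 < x:          # guess too small: search the right half
            low = guess
        else:                     # guess too big: search the left half
            high = guess
        guess = (low + high) / 2.0
        ctr += 1
    assert ctr <= 100, 'iteration count exceeded'
    print('number of iterations =', ctr, 'estimate =', guess)
    return guess
```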
So one of the things I want you to observe here is that instead of sitting there and typing away a bunch of test cases, I took the trouble to write a function, called test bi in this case. All right, so what that's doing is taking the things I would normally type and putting them in a function, which I can then call. Why is that better than typing them? Why was it worth creating a function to do this? Pardon?

STUDENT: [INAUDIBLE]

PROFESSOR JOHN GUTTAG: Then I can use it again and again and again. Exactly. By putting it in a function, if I find a bug and I change my program, I can just run the function again. The beauty of this is, it keeps me from getting lazy, and not only testing my program on the thing that found the bug, but on all the things that used to work. We'll talk more about this later, but it often happens that when you change your program to solve one problem, you break it, and things that used to work don't work anymore. And so what you want to do, and again we'll come back to this later in the term, is something called regression testing. This has nothing to do with linear regression. It's basically trying to make sure our program has not regressed, that is to say, gone backwards in how well it works. And so we always test it on everything.
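A test function along the lines described — the names and cases here are illustrative guesses — just packages up the calls so the whole suite can be rerun after every change:

```python
def square_root_bi(x, epsilon):
    """Bisection search for a y with y**2 within epsilon of x."""
    assert x >= 0 and epsilon > 0
    low, high = 0.0, x
    guess = (low + high) / 2.0
    ctr = 1
    while abs(guess**2 - x) > epsilon and ctr <= 100:
        if guess**2 < x:
            low = guess
        else:
            high = guess
        guess = (low + high) / 2.0
        ctr += 1
    assert ctr <= 100, 'iteration count exceeded'
    return guess

def test_bi():
    """Regression-style test: rerun every case after each change,
    not just the case that exposed the latest bug."""
    for x in [4, 9, 2]:
        result = square_root_bi(x, 0.01)
        print('square_root_bi(%s, 0.01) =' % x, result)
        assert abs(result**2 - x) <= 0.01

test_bi()
```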
All right? So I've created this function, let's give it a shot and see what happens. We'll run test bi. Whoops! All right, well, let's look at our answers. I first tested it on the square root of 4, and in one iteration it found 2. I like that answer. I then tested it on the square root of 9, and as I mentioned last time, I didn't find 3. I was not crushed. You know, I was not really disappointed; it found something close enough to 3 that I'm happy. All right. I tried it on 2. I surely didn't expect a precise and exact answer to that, but I got something, and if you square this, you'll find the answer came pretty darn close to 2. I then tried it on 0.25, one quarter. And what happened was not what I wanted. As you'll see, it crashed. It didn't really crash; it hit an assert statement. So if you look at the bottom of the function, you'll see that, in fact, I checked for that. I assert that the counter hasn't exceeded its limit. I'm checking that I didn't leave my program because I didn't find an answer.
Well, this is a good thing, it's better than my program running forever, but it's a bad thing because I don't have the square root of 0.25. What went wrong here? Well, let's think about it for a second. You look like-- someone looks like they're dying to give an answer. No, you're just scratching your head? All right. Remember, I said when we do a bisection method, we're assuming the answer lies somewhere between the lower bound and the upper bound. Well, what is the square root of a quarter? It is a half. Well, where did I tell my program to look for an answer? Between 0 and x. So the problem was, the answer was over here somewhere, and so I'm never going to find it cleverly searching in this region, right? So the basic idea was fine, but I failed to satisfy the initial condition that the answer had to be between the lower bound and the upper bound. Right? And why did I do that? Because I forgot what happens when you look at fractions. So what should I do? Actually I lied, by the way, when I said the answer was over there. Where was the answer? Somebody? It was over here.
Because the square root of a quarter is not smaller than a quarter, it's bigger than a quarter. Right? A half is strictly greater than a quarter. So it wasn't in the region. So what's the fix? It should be a pretty simple fix; in fact, we should be able to do it on the fly, here. What should I change? Do I need to change the lower bound? Is the square root ever going to be less than 0? It doesn't need to be, so what should I do about the upper bound here? Oh, I could cheat and make the upper bound a half, but that wouldn't be very honest. What would be a good thing to do here? Pardon? I could square x, but maybe I should just do something pretty simple here. Suppose-- whoops. Suppose I make it the max of x and 1. Then if I'm looking for the square root of something less than 1, I know it will be in my region, right? All right, let's save this, and run it and see what happens. Sure enough, it worked, and we got the right answer, 0.5. All right? And by the way, I checked all of my previous ones, and they work too. All right. Any questions about bisection search?
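With that one-line change — making the upper bound the max of x and 1 — a sketch of the repaired routine (again, the names are reconstructions) is:

```python
def square_root_bi(x, epsilon):
    """Assumes x >= 0 and epsilon > 0.
    Returns y such that y**2 is within epsilon of x."""
    assert x >= 0 and epsilon > 0
    low = 0.0
    high = max(x, 1.0)   # the fix: sqrt(x) > x when 0 < x < 1,
                         # but sqrt(x) <= max(x, 1) always holds
    guess = (low + high) / 2.0
    while abs(guess**2 - x) > epsilon:
        if guess**2 < x:
            low = guess
        else:
            high = guess
        guess = (low + high) / 2.0
    return guess

print(square_root_bi(0.25, 0.01))   # now lands on 0.5
```

The earlier cases still work, since for x greater than or equal to 1 the upper bound is x, just as before.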
One of the things I want you to notice here is that the number of iterations is certainly not constant. Yeah, when I looked at 4, it was a nice number like 1; 9 looked like it took me 18; 2 took me 14. If we try some big numbers, it might take even longer. These numbers are small, but sometimes when we look at really harder problems, we get ourselves into a position where we do care about the number of iterations, and we care about something called the speed of convergence. Bisection methods were known to the ancient Greeks, and it is believed by many that they were known even to the Babylonians. And as I mentioned last time, this was the state of the art until the 17th century. At which point, things got better. So let's think about it, and let's think about what we're actually doing when we solve this. When we look for something like the square root of x, what we're really doing is solving an equation. We're looking at the equation f of guess equals the guess squared minus x. Right, that's what that is equal to, and we're trying to solve the equation f of guess equals 0. We're looking for the root of this equation. So if we looked at it pictorially, what we've got here is, we're looking at f of x, I've plotted it here, and we're asking where it crosses the x axis.
Sorry for the overloading of the word x. And I'm looking here at 16, the square root of 16, and my plot basically shows it crosses at 4 and-- well, I think that's minus 4. The perspective is tricky-- and so we're trying to find the roots. Now Isaac Newton and/or Joseph Raphson figured out how to do this kind of thing for all differentiable functions. Don't worry about what that means. The basic idea is, you take a guess, and you-- whoops-- and you find the tangent at that guess. So let's say I guessed 3. I look for the tangent of the curve at 3. All right, so I've got the tangent, and then my next guess is going to be where the tangent crosses the x axis. So instead of dividing it in half, I'm using a different method to find the next guess. The utility of this relies upon the observation that, most of the time-- and I want to emphasize this, most of the time, which implies not all of the time-- the tangent line is a good approximation to the curve for values near the solution. And therefore, the x intercept of the tangent will be closer to the right answer than the current guess. Is that always true, by the way? Show me a place where that's not true, where the tangent line will be really bad. Yeah.
Suppose I choose it right down there, I guess 0. Well, the tangent there will not even have an x intercept. So I'm really going to be dead in the water. This is the sort of thing that people who do numerical programming worry about all the time. And there are a lot of little tricks they use to deal with that; they'll perturb it a little bit, things like that. You should not, at this point, be worrying about those things. This method, interestingly enough, is actually the method used in most hand calculators. So if you've got a calculator that has a square root button, the calculator is actually running Newton's method. Now I know you thought it was going to do that thing you learned in high school for finding square roots, which I never could quite understand, but no. It uses Newton's method to do it. So how do we find the intercept of the tangent, the x intercept? Well, this is where derivatives come in. What we know is that the slope of the tangent is given by the first derivative of the function f at the point of the guess. So the slope at the guess is the first derivative. Right. Which is dy over dx, change in y divided by change in x.
So we can use some algebra, which I won't go through here, and what we would find is that for square root, the derivative, written f prime of the i'th guess, is equal to two times the i'th guess. Well, I should have left myself a little more room, sorry about that. All right? You could work this out. Right? The derivative of the square root is not a complicated thing. Therefore, and here's the key thing we need to keep in mind, we know that we can choose guess i plus 1 to be equal to the old guess, guess i, minus the value of f at the old guess, divided by twice the old guess. All right, again, this is a straightforward kind of algebraic manipulation to get here. So let's look at an example. Suppose we start looking for the square root of 16 with the guess 3. What's the value of the function f of 3? Well, it's going to be, we look at our function there, guess squared, 3 times 3 is 9 I think, minus 16, that's what x is in this case, which equals minus 7. That being the case, what's my next guess?
Well, I start with my old guess, 3, minus f of my old guess, which is minus 7, divided by twice my old guess, which is 6; minus the minus, and I get as my new guess 4.1666 or thereabouts. So you can see I've missed, but I am closer. And then I would reiterate this process using that as guess i, and do it again. One way to think about this intuitively: if the derivative is very large, the function is changing quickly, and therefore we want to take small steps. All right. If the derivative is small, it's not changing much, and maybe we want to take a larger step, but let's not worry about that, all right? Does this method work all the time? Well, we already saw no: if my initial guess is zero, I don't get anywhere. In fact, my program crashes because I end up trying to divide by zero, a really bad thing. Hint: if you implement Newton's method, do not make your first guess zero. All right, so let's look at the code for that. All right so-- yeah, how do I get to the code for that? That's interesting. All right. So we have square root NR. NR for Newton-Raphson. The first thing I want you to observe is that its specification is identical to the specification of square root bi.
What that's telling me is that if you're a user of this, you don't care how it's implemented; you care what it does. And therefore, it's fine that the specifications are identical. In fact, it's a good thing: that means if someday Professor Grimson invents something that's better than Newton-Raphson, we can all re-implement our square root functions and none of the programs that use them will have to change, as long as the specification is the same. All right, so, not much to see about this. As I said, the specification is the same, same assertions, and it's basically the same program as the one we were just looking at, but I'm starting with a different guess, in this case x over 2. Well, there are a couple of different guesses we can start with; we can experiment with different guesses and see whether we get the same answer. And in fact we wouldn't get the same answers, we would get different answers, but correct answers. Actually now, we'll just comment that out. I'm going to compute the difference, just as I did on the board, and off we'll go. All right. Now, let's try and compare these things.
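A sketch of square root NR with the same specification as the bisection version (this is a reconstruction under the assumptions stated in the lecture, not necessarily its exact code):

```python
def square_root_nr(x, epsilon):
    """Assumes x >= 0 and epsilon > 0.
    Returns y such that y**2 is within epsilon of x."""
    assert x >= 0 and epsilon > 0
    guess = x / 2.0 if x >= 1 else 1.0   # anything but zero (see the hint above)
    ctr = 1
    while abs(guess**2 - x) > epsilon:
        # Newton's update for f(g) = g**2 - x, with f'(g) = 2*g:
        # guess_{i+1} = guess_i - f(guess_i) / f'(guess_i)
        guess = guess - (guess**2 - x) / (2 * guess)
        ctr += 1
    print('number of iterations =', ctr, 'estimate =', guess)
    return guess
```

Starting from a guess of 3 for the square root of 16, the first update gives 3 - (9 - 16)/6 = 4.1666..., exactly the step worked on the board.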
And what we're going to look at is another procedure. You have the code for these things on your handout, so we don't need to show you the code, but let's look at how we're going to test it. I'm doing a little trick, by the way: I'm using raw input in my function here, just as a way to stop the display. This way I can torture you between tests by asking you questions. Making it stop. All right, so, we'll try some things. We'll see what it does. Starting with that, well, let's look at some of the things it will do. Yeah, I'll save it. It's a little bit annoying, but it makes the font bigger. All right, so we've tested it-- and we haven't tested it yet, we have tested it but we haven't seen it-- well, you know what I'm going to do? I'm going to make the font smaller so we can see more. Sorry about this. Those of you in the back, feel free to move forward. All right. So we've got it, now let's test it. So what we're going to do here is run compare methods. Well, we're seeing that famous "computers are no damn good." All right.
So we're going to try it on 2, and at least we'll notice for 2 that the bisection method took eight iterations and Newton-Raphson only took three, so it was more efficient. They came up with slightly different answers, but both answers are within .01, which is what I gave it here for epsilon, so we're OK. So even though they have different answers, they both satisfy the same specification, so we have no problem. All right? Try it again, just for fun. I gave it here a different epsilon, and you'll note we get different answers. Again, that's OK. Notice here, when I asked for a more precise answer, bisection took a lot more iterations, but Newton-Raphson took only one extra iteration to get that extra precision in the answer. So we're sort of getting the notion that Newton-Raphson maybe is considerably better on harder problems. Which, by the way, it is. We'll make it an even harder problem, by giving it an even smaller epsilon, and again, what you'll see is, Newton-Raphson just crept up by one, didn't take long, and got the better answer, where bisection gets worse and worse. So as you can see, as we escalate the problem difficulty, the difference between the good method and the not-quite-as-good method gets bigger and bigger and bigger.
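The trend can be reproduced with a small comparison harness (my own sketch; the exact iteration counts depend on details like the starting bounds and first guess, so they won't match the lecture's numbers exactly, but the gap grows the same way as epsilon shrinks):

```python
def bisection_iters(x, epsilon):
    # count bisection steps needed to get guess**2 within epsilon of x
    low, high = 0.0, max(x, 1.0)
    guess = (low + high) / 2.0
    ctr = 0
    while abs(guess**2 - x) > epsilon:
        if guess**2 < x:
            low = guess
        else:
            high = guess
        guess = (low + high) / 2.0
        ctr += 1
    return ctr

def newton_iters(x, epsilon):
    # count Newton-Raphson steps for the same tolerance
    guess = x / 2.0 if x >= 1 else 1.0
    ctr = 0
    while abs(guess**2 - x) > epsilon:
        guess = guess - (guess**2 - x) / (2 * guess)
        ctr += 1
    return ctr

for eps in [0.01, 0.0001, 0.000001]:
    print('epsilon =', eps,
          'bisection:', bisection_iters(2, eps),
          'newton:', newton_iters(2, eps))
```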
That's an important observation, and when we get to the part of the course where we talk about computational complexity, you'll see that what we really care about is not how efficient the program is on easy problems, but how efficient it is on hard problems. All right. Look at another example. All right, here I gave it a big number, 123456789. And again, I don't want to bore you, but you can see what's going on here with this trend. So here's an interesting question. You may notice that it's always printing out the same number of digits. Why should this be? If you look at it here, what's going on? Something very peculiar is happening here. We're looking at it, and we're getting some funny answers. This gets back to what I talked about before, about the precision of floating point numbers. And the thing I'm trying to drive home to you here is perhaps the most important lesson we'll talk about all semester. Which is, answers can be wrong. People tend to think that because the computer says it's so, it must be so. That the computer speaks for God. And therefore it's infallible. Maybe it speaks for the Pope. It speaks for something that's infallible. But in fact, it is not.
And so, something I find myself repeating over and over again, to myself, to my graduate students, is: when you get an answer from the computer, always ask yourself, why do I believe it? Do I think it's the right answer? Because it isn't necessarily. So if we look at what we've got here, we've got something rather peculiar, right? What's peculiar about what the computer is now printing for us? Why should I be really suspicious about what I see on the screen here?

STUDENT: [INAUDIBLE]

PROFESSOR JOHN GUTTAG: Well, not only is it different, it's really different, right? If it were just a little bit different, I could say, all right, I have a different approximation. But when it's this different, something is wrong. Right? Later in the term, when we get to more detailed numerical things, we'll look at what's wrong. You can run into issues of things like overflow and underflow with floating point numbers, and when you see a whole bunch of ones, it's a particularly good time to be suspicious. Anyway, the only point I'm making here is, paranoia is a healthy human trait. All right. We can look at some other things, which will work better. And we'll now move on. OK.
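A tiny illustration of why floating point answers deserve that suspicion — this example isn't from the lecture, just a standard demonstration of binary floating point:

```python
# Ten copies of 0.1 do not sum to exactly 1.0, because 0.1 has no
# exact binary floating point representation.
total = 0.0
for _ in range(10):
    total += 0.1

print(total == 1.0)   # False
print(total)          # very close to 1.0, but not equal to it
```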
So we've looked at how to compute square roots; we've looked at two problems; I've tried to instill in you this sense of paranoia, which is so valuable; and now we're going to pull back and return to something much simpler than numbers, and that's Python. All right? Numbers are hard. That's why we teach whole semesters' worth of courses in number theory. Python is easy, which is why we do it in about four weeks. All right. I want to return to some non-scalar types. So we've been looking, the last couple of lectures, at floating point numbers and integers. We've looked so far at really two non-scalar types. And those were tuples, written with parentheses, and strings. The key thing about both of them is that they were immutable. And I responded to at least one email about this issue; someone quite correctly said tuples are immutable, so how can I change one? My answer is, you can't change one, but you can create a new one that is almost like the old one but differs in a little bit. Well, now we're going to talk about some mutable types. Things you can change. And we're going to start with one that many of you have already bumped into, perhaps by accident, which is lists.
489 00:34:53 --> 00:34:58 Lists differ from strings in two ways; one way is that they're 490 00:34:58 --> 00:35:03 mutable, the other way is that the values need not 491 00:35:03 --> 00:35:05 be characters. 492 00:35:05 --> 00:35:08 They can be numbers, they can be characters, they 493 00:35:08 --> 00:35:13 can be strings, they can even be other lists. 494 00:35:13 --> 00:35:18 So let's look at some examples here. 495 00:35:18 --> 00:35:27 What we'll do, is we'll work on two boards at once. 496 00:35:27 --> 00:35:36 So I could write a statement like, techs, a variable, is 497 00:35:36 --> 00:35:39 equal to the list, written with a square brace, not a 498 00:35:39 --> 00:35:55 parenthesis, MIT, Cal Tech, closed square brace. 499 00:35:55 --> 00:36:03 What that basically does, is it takes the variable techs, and 500 00:36:03 --> 00:36:15 it now makes it point to a list with two items in it. 501 00:36:15 --> 00:36:25 One is the string MIT and one is the string Cal Tech. 502 00:36:25 --> 00:36:29 So let's look at it. 503 00:36:29 --> 00:36:37 And we'll now run another little test program, show 504 00:36:37 --> 00:36:49 lists, and I printed it, and it prints the list MIT, Cal Tech. 505 00:36:49 --> 00:36:55 Now suppose I introduce a new variable, we'll call it ivys, 506 00:36:55 --> 00:37:07 and we say that is equal to the list Harvard, Yale, Brown. 507 00:37:07 --> 00:37:11 Three of the ivy league colleges. 508 00:37:11 --> 00:37:16 What that does is, I have a new variable, ivys, and it's now 509 00:37:16 --> 00:37:24 pointing to another, what we call, object. In Python and 510 00:37:24 --> 00:37:28 Java, and many other languages, think of these things that are 511 00:37:28 --> 00:37:31 sitting there in memory somewhere as objects. 512 00:37:31 --> 00:37:34 And I won't write it all out, I'll just write that it's got 513 00:37:34 --> 00:37:39 Harvard as one item in it, and then it's got Yale, and 514 00:37:39 --> 00:37:43 then it's got Brown. 
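The two assignments from the board, written out as a small sketch of what the show lists program presumably starts with:

```python
# Lists are written with square braces, not parentheses
techs = ['MIT', 'Cal Tech']
ivys = ['Harvard', 'Yale', 'Brown']

print(techs)
print(ivys)
```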
515 00:37:43 --> 00:37:49 And I can now print ivys. 516 00:37:49 --> 00:37:55 And it sure enough prints what we expected it to print. 517 00:37:55 --> 00:38:02 Now, let's say I have univs, for universities, 518 00:38:02 --> 00:38:09 equals the empty list. 519 00:38:09 --> 00:38:12 That would create something over here called univs, another 520 00:38:12 --> 00:38:18 variable, and it will point to the list, an object that 521 00:38:18 --> 00:38:19 contains nothing in it. 522 00:38:19 --> 00:38:22 This is not the same as None. 523 00:38:22 --> 00:38:25 It does have a value, it just happens to be the list 524 00:38:25 --> 00:38:29 that has nothing in it. 525 00:38:29 --> 00:38:50 And the next thing I'm going to write is univs dot append techs. 526 00:38:50 --> 00:38:54 What is this going to do? 527 00:38:54 --> 00:39:03 It's going to take this list and add to it something else. 528 00:39:03 --> 00:39:17 Let's look at the code. 529 00:39:17 --> 00:39:21 I'm going to print it, and let's see what it prints. 530 00:39:21 --> 00:39:24 It's kind of interesting. 531 00:39:24 --> 00:39:25 Whoops. 532 00:39:25 --> 00:39:27 Why did it do that? 533 00:39:27 --> 00:39:36 That's not what I expected. 534 00:39:36 --> 00:39:37 It's going to print that. 535 00:39:37 --> 00:39:39 The reason it printed that is I accidentally had my finger on 536 00:39:39 --> 00:39:46 the control key, which said print the last thing you had. 537 00:39:46 --> 00:39:56 Why does it start with square brace square brace? 538 00:39:56 --> 00:39:58 I take it-- yes, go ahead. 539 00:39:58 --> 00:40:01 STUDENT: So you're adding a list to a list? 540 00:40:01 --> 00:40:03 PROFESSOR JOHN GUTTAG: So I'm adding a list to a list. 541 00:40:03 --> 00:40:07 What have I-- what I've appended to the empty list 542 00:40:07 --> 00:40:15 is not the elements MIT and Cal Tech but the list that 543 00:40:15 --> 00:40:21 contains those elements. 544 00:40:21 --> 00:40:24 So I've appended this whole object. 
545 00:40:24 --> 00:40:28 Since that object is itself a list, what I 546 00:40:28 --> 00:40:37 get is a list of lists. 547 00:40:37 --> 00:40:49 Now I should mention, this notation here, append, is what 548 00:40:49 --> 00:40:54 is in Python called a method. 549 00:40:54 --> 00:40:58 Now we'll hear lots more about methods when we get to classes 550 00:40:58 --> 00:41:03 and inheritance, but really, a method is just a fancy word 551 00:41:03 --> 00:41:08 for a function with different syntax. 552 00:41:08 --> 00:41:13 Think of this as a function that takes two arguments, the 553 00:41:13 --> 00:41:21 first of which is univs and the second of which is techs. 554 00:41:21 --> 00:41:25 And this is just a different syntax for writing 555 00:41:25 --> 00:41:27 that function call. 556 00:41:27 --> 00:41:33 Later in the term, we'll see why we have this syntax and why 557 00:41:33 --> 00:41:37 it wasn't just a totally arbitrary brain-dead decision 558 00:41:37 --> 00:41:42 by the designers of Python, and many languages before Python, 559 00:41:42 --> 00:41:45 but in fact is a pretty sensible thing. 560 00:41:45 --> 00:41:48 But for now, think of this as just another way to 561 00:41:48 --> 00:41:54 write a function call. 562 00:41:54 --> 00:42:00 All right, people with me so far? 563 00:42:00 --> 00:42:07 Now, as the next thing we'll do, 564 00:42:07 --> 00:42:15 we're going to append the ivys to univs. 565 00:42:15 --> 00:42:18 Stick another list on it. 566 00:42:18 --> 00:42:20 All right. 567 00:42:20 --> 00:42:28 So we'll do that, and now we get MIT, Cal Tech, followed 568 00:42:28 --> 00:42:32 by that list followed by the list Harvard, Yale, Brown. 569 00:42:32 --> 00:42:41 So now we have a list containing two lists. 570 00:42:41 --> 00:42:45 What are we going to try next? 571 00:42:45 --> 00:42:49 Well, just to see that we know what we're doing, let's 572 00:42:49 --> 00:42:56 look at this code here. 
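The appends just described, plus the point about method syntax, might be sketched as:

```python
techs = ['MIT', 'Cal Tech']
ivys = ['Harvard', 'Yale', 'Brown']

univs = []           # the empty list -- a real value, not the same as None
univs.append(techs)  # appends the list itself, not its elements
univs.append(ivys)

print(univs)         # a list containing two lists

# The method notation is just different syntax for a function call on
# two arguments; this line does the same thing as more.append(techs):
more = []
list.append(more, techs)
```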
573 00:42:56 --> 00:43:01 I've written a little for loop, which is going to iterate over 574 00:43:01 --> 00:43:03 all of the elements in the list. 575 00:43:03 --> 00:43:10 So remember, before we wrote things like for i in range 10, 576 00:43:10 --> 00:43:17 which iterated over a list or tuple of numbers; here you can 577 00:43:17 --> 00:43:20 iterate over any list, and so we're just going to 578 00:43:20 --> 00:43:26 take the list called univs and iterate over it. 579 00:43:26 --> 00:43:30 So the first thing we'll do is, we'll print the element, in 580 00:43:30 --> 00:43:32 this case it will be a list, right? 581 00:43:32 --> 00:43:35 Because it's a list with two lists in it. 582 00:43:35 --> 00:43:38 Then the next thing in the loop, we're going to enter a 583 00:43:38 --> 00:43:45 nested loop, and say for every college in the list e, 584 00:43:45 --> 00:43:51 we're going to print the name of the college. 585 00:43:51 --> 00:43:59 So now if we look at what we get-- do you not want to try and 586 00:43:59 --> 00:44:07 execute that?-- it'll first print the list containing 587 00:44:07 --> 00:44:12 MIT and Cal Tech, and then separately the strings MIT and 588 00:44:12 --> 00:44:15 Cal Tech, and then the list containing Harvard, Yale, and 589 00:44:15 --> 00:44:21 Brown, and then the strings Harvard, Yale, and Brown. 590 00:44:21 --> 00:44:25 So we're beginning to see this is a pretty powerful notion, 591 00:44:25 --> 00:44:28 these lists, and that we can do a lot of interesting 592 00:44:28 --> 00:44:31 things with them. 593 00:44:31 --> 00:44:37 Suppose I don't want all of this structure, and I 594 00:44:37 --> 00:44:49 want to do what's called flattening the list. 595 00:44:49 --> 00:44:56 Well, I can do that by, instead of using the method append, using 596 00:44:56 --> 00:45:00 the concatenation operator. 
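The nested loop described a moment ago can be sketched as follows; the extra flat list is just there to make the traversal order explicit:

```python
techs = ['MIT', 'Cal Tech']
ivys = ['Harvard', 'Yale', 'Brown']
univs = [techs, ivys]    # the list containing two lists

flat = []
for e in univs:
    print(e)             # e is itself a list here
    for college in e:
        print(college)   # the individual strings
        flat.append(college)
```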
597 00:45:00 --> 00:45:07 So I can concatenate techs plus ivys and assign that result to 598 00:45:07 --> 00:45:13 univs, and then when I print it you'll notice I just get 599 00:45:13 --> 00:45:19 a list of five strings. 600 00:45:19 --> 00:45:24 So plus and append do very different things. 601 00:45:24 --> 00:45:29 Append sticks the list on the end of the list; 602 00:45:29 --> 00:45:32 plus flattens it, one level of course. 603 00:45:32 --> 00:45:36 If I had lists of lists of lists, then it would only take 604 00:45:36 --> 00:45:41 out the first level of it. 605 00:45:41 --> 00:45:45 OK, very quiet here. 606 00:45:45 --> 00:45:49 Any questions about any of this? 607 00:45:49 --> 00:45:49 All right. 608 00:45:49 --> 00:45:53 Because we're about to get to the hard part. Sigh. 609 00:45:53 --> 00:45:54 All right. 610 00:45:54 --> 00:45:58 Let's look at the-- well, suppose I want to, 611 00:45:58 --> 00:46:00 quite understandably, eliminate Harvard. 612 00:46:00 --> 00:46:05 All right, I then get down here, where I'm 613 00:46:05 --> 00:46:09 going to remove it. 614 00:46:09 --> 00:46:13 So this is again another method; this is remove, which takes 615 00:46:13 --> 00:46:17 two arguments, the first is ivys, the second is 616 00:46:17 --> 00:46:21 the string Harvard. 617 00:46:21 --> 00:46:25 It's going to search through the list until the first time 618 00:46:25 --> 00:46:39 it finds Harvard, and then it's going to yank it away. 619 00:46:39 --> 00:46:42 So what happened here? 620 00:46:42 --> 00:46:45 Did I jump to the wrong place? 621 00:46:45 --> 00:46:45 STUDENT: You hit two returns. 622 00:46:45 --> 00:46:46 PROFESSOR JOHN GUTTAG: I hit two returns. 623 00:46:46 --> 00:46:46 Pardon? 624 00:46:46 --> 00:46:47 STUDENT: You hit two returns. 625 00:46:47 --> 00:46:47 One was at-- 626 00:46:47 --> 00:46:53 STUDENT: Pardon? 627 00:46:53 --> 00:46:54 PROFESSOR JOHN GUTTAG: This one. 628 00:46:54 --> 00:46:55 STUDENT: No, up one. 
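The contrast between plus and append, and the remove call, as a sketch:

```python
techs = ['MIT', 'Cal Tech']
ivys = ['Harvard', 'Yale', 'Brown']

# + concatenates, flattening one level: a single list of five strings,
# not a list of lists as append would give us
univs = techs + ivys
print(univs)

# remove searches for the first occurrence and yanks it out, in place
ivys.remove('Harvard')
print(ivys)

# univs was built by + before the remove, so it still contains Harvard
print(univs)
```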
629 00:46:55 --> 00:46:56 PROFESSOR JOHN GUTTAG: Up one. 630 00:46:56 --> 00:46:57 STUDENT: Right. 631 00:46:57 --> 00:46:57 PROFESSOR JOHN GUTTAG: But why is Harvard there? 632 00:46:57 --> 00:47:00 STUDENT: I'm sorry, I didn't write it down. 633 00:47:00 --> 00:47:00 PROFESSOR JOHN GUTTAG: Let's look at it again. 634 00:47:00 --> 00:47:04 All right, it's time to interrupt the world, and we'll 635 00:47:04 --> 00:47:06 just type into the shell. 636 00:47:06 --> 00:47:16 Let's see what we get here. 637 00:47:16 --> 00:47:21 All right, so let's just see what we got, we'll print univs. 638 00:47:21 --> 00:47:22 Nope, not defined. 639 00:47:22 --> 00:47:26 All right, well let's do a list equals, and we'll put some 640 00:47:26 --> 00:47:29 interesting things in it, we'll put a number in it, because we 641 00:47:29 --> 00:47:33 can put a number, we'll put MIT in it, because we can put 642 00:47:33 --> 00:47:39 strings, we'll put another number in it, 3.3, because we 643 00:47:39 --> 00:47:42 can put floating points, we can put all sorts of 644 00:47:42 --> 00:47:43 things in this list. 645 00:47:43 --> 00:47:47 We can put a list in the list again, as we've seen before. 646 00:47:47 --> 00:47:53 So let's put in the list containing the string a, and 647 00:47:53 --> 00:47:57 I'll print it out, so now we see something pretty interesting 648 00:47:57 --> 00:48:01 about a list, that we can mix up all sorts of things 649 00:48:01 --> 00:48:06 in it, and that's OK. 650 00:48:06 --> 00:48:09 You'll notice I have the string with the number 1, a string 651 00:48:09 --> 00:48:13 with MIT, and then just a plain old number, not a string, 652 00:48:13 --> 00:48:17 again it didn't quite give me 3.3 for reasons we've talked 653 00:48:17 --> 00:48:21 about before, and then it has the list containing a. 654 00:48:21 --> 00:48:27 So, suppose I want to remove something. 655 00:48:27 --> 00:48:32 What should we try and remove from this list? 656 00:48:32 --> 00:48:37 Anybody want to vote? 
657 00:48:37 --> 00:48:38 Pardon? 658 00:48:38 --> 00:48:42 All right, someone wants to remove MIT. 659 00:48:42 --> 00:48:45 Sad but true. 660 00:48:45 --> 00:48:47 Now what do we get if we print l? 661 00:48:47 --> 00:48:51 MIT is gone. 662 00:48:51 --> 00:48:54 How do I talk about the different pieces of l? 663 00:48:54 --> 00:49:04 Well, I can do this. l sub 0-- whoops-- will give me the first 664 00:49:04 --> 00:49:10 element of the list, just as we could do with strings, and I 665 00:49:10 --> 00:49:13 can look at l sub minus 1 to get the last element of the 666 00:49:13 --> 00:49:17 list, so I can do all the things that 667 00:49:17 --> 00:49:20 I could do with strings. 668 00:49:20 --> 00:49:23 It's extremely powerful, but what we haven't 669 00:49:23 --> 00:49:29 seen yet is mutation. 670 00:49:29 --> 00:49:32 Well, we have seen mutation, right? 671 00:49:32 --> 00:49:36 Because notice that what remove did was actually 672 00:49:36 --> 00:49:39 change the list. 673 00:49:39 --> 00:49:40 It didn't create a new list. 674 00:49:40 --> 00:49:45 The old l is still there, but it's different 675 00:49:45 --> 00:49:47 than it used to be. 676 00:49:47 --> 00:49:51 So this is very different from what we did with slicing, where 677 00:49:51 --> 00:49:54 we got a new copy of something. 678 00:49:54 --> 00:49:59 Here we took the old one and we just changed it. 679 00:49:59 --> 00:50:05 On Thursday, we'll look at why that allows you to do lots of 680 00:50:05 --> 00:50:10 things more conveniently than you can do without mutation. 681 00:50:10 --> 00:50:10
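Indexing, remove, and the contrast with slicing might be sketched like this; the list contents are a guess at what was typed into the shell:

```python
l = ['1', 'MIT', 3.3, ['a']]

print(l[0])      # l sub 0: the first element
print(l[-1])     # l sub minus 1: the last element

snapshot = l[:]  # slicing makes a new copy of the list
l.remove('MIT')  # remove mutates l itself; no new list is created

print(l)         # MIT is gone from l ...
print(snapshot)  # ... but the slice we took earlier still has it
```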