
The discussion above in two dimensions demonstrates an important fact about derivatives, one responsible both for the usefulness of calculus and for our ability to compute derivatives of all the standard functions, and others as well.
The fact is this: in the linear approximation, which is the realm of differentials and derivatives, changes in a differentiable function from two different causes, such as a change in x and a change in y, do not interact; they simply add to one another, as described in equation (A) above.
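This additivity is easy to check numerically. The sketch below uses an arbitrarily chosen function f(x, y) = xy^{2} (any differentiable function would do) and compares the change produced by moving x and y together with the sum of the changes produced by moving each alone:

```python
# A numerical check that, to first order, changes from different
# causes simply add.  The function and point are chosen arbitrarily.
def f(x, y):
    return x * y**2

x, y = 1.0, 2.0
dx, dy = 1e-5, 1e-5

change_x_alone = f(x + dx, y) - f(x, y)        # change due to x alone
change_y_alone = f(x, y + dy) - f(x, y)        # change due to y alone
change_both    = f(x + dx, y + dy) - f(x, y)   # change when both vary

# The two numbers printed agree to within second-order terms in dx, dy
print(change_both, change_x_alone + change_y_alone)
```

The discrepancy between the two printed values is of second order (it involves the product dx dy), which is exactly what the linear approximation ignores.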
One implication of this fact is that in computing derivatives you can break the function into simple parts, compute the derivative contribution arising from the change in each part, and sum these contributions to get the derivative of the whole function.
For example, the identity function f(x) = x obviously obeys df = dx and f ' = 1, and if c is a constant and f = cx we have df = c dx and f ' = c: the factor c is the slope of the tangent line, which in this case is f itself.
In other words, when acting on x or cx, the action of taking the derivative is to replace the factor x by 1.
Suppose now we consider f = x^{n} for some positive integer n. This is the product of n factors, each of which is x. The effect of taking the derivative with respect to any one factor, leaving the others fixed, is to replace that factor by 1, leaving the product x^{n-1} of the remaining factors. The sum obtained by doing this to each factor is, by our fact, the derivative of f with respect to x. We get f ' = nx^{n-1}.
The n comes from replacing an x by a 1 for each of n different factors.
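A difference quotient with a small step lets us check the power rule numerically; the exponent and point below are chosen arbitrarily for illustration:

```python
# Numerical check of (x^n)' = n x^(n-1), here with n = 5 at x = 1.3
n, x, h = 5, 1.3, 1e-6

numerical = ((x + h)**n - x**n) / h   # difference quotient
exact     = n * x**(n - 1)           # the power rule

# The two values agree to many decimal places
print(numerical, exact)
```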
As a second example, suppose f = g * h, so that f is the product of the functions g and h.
Then the differential change in f caused by a change in g alone is (df)_{h fixed} = (dg) * h, and that caused by a change in h alone is (df)_{g fixed} = g * dh. By our fact, the change in f in general is the sum of these: df = dg * h + g * dh, which implies f ' = g' h + gh'.
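The product rule can be checked the same way; the factors g = sin and h = exp and the point below are arbitrary choices for illustration:

```python
import math

# Check f' = g'h + gh' numerically for g = sin, h = exp at x = 0.7
x, dx = 0.7, 1e-6

def f(t):
    return math.sin(t) * math.exp(t)

numerical = (f(x + dx) - f(x)) / dx   # difference quotient for f'
# product rule, using the known derivatives g' = cos and h' = exp
exact = math.cos(x) * math.exp(x) + math.sin(x) * math.exp(x)

print(numerical, exact)
```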
Another important implication of this same fact is that when you are confronted with an unknown function depending on many parameters, you can model the change in the function with respect to each parameter separately, and thus get a relatively simple model of how the function changes under differential changes in any of its variables.
You can then hope to discover the actual behavior of the function under real changes in the parameters by "integrating" the differential changes you have modeled.
The alternative to this approach, modeling the effects of real changes in the parameters directly, is far more difficult. It is complicated by the fact that changes from different sources interact and can become very hard to model.
Though we study the derivatives of known functions to develop an understanding of the subject, the profound uses of calculus are in helping us determine unknown functions through this "analytic" process: modeling their differential changes and then "integrating" them.
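The "integrating" step can be sketched in a few lines of code. Suppose all we know is a model of the differential change, say df = cos(x) dx; accumulating those small changes recovers the function itself (here the true answer is sin x, if f(0) = 0). The step size below is an arbitrary illustrative choice:

```python
import math

# Recover f from its modeled differential change df = cos(x) dx,
# by adding up the small changes step by step (a crude Euler scheme)
dx = 1e-4
x, f = 0.0, 0.0
while x < 1.0:
    f += math.cos(x) * dx   # add the differential change df
    x += dx

# f should now be close to sin(1), the true value
print(f, math.sin(1.0))
```

The smaller the step dx, the closer the accumulated changes come to the true function, which is the basic idea behind both numerical integration and the fundamental theorem of calculus.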
Newton invented calculus to do this in studying the motion of planets and moving
bodies subject to various forces, with incredible success.
Exercises:
6.1 The derivative applet exhibits the derivative of any function you can enter.
Enter the function (sin x)^{2} between -4 and 4 and see if you
can locate the points at which its derivative is 1/2.
6.2 What derivative do you find at x = 1? Where is the derivative 0?
6.3 Does the fact that a function of two variables has partial derivatives with respect to x and y at a point imply that it is differentiable there?
6.4 Does the fact that a function has directional derivatives in all directions
at a point imply that it is differentiable there?
(Hint: consider
at x = y = 0.)
6.5 Given that (sin x)' = cos x, what can we define to be at x = 0? How about at x = 0?
6.6 What is the gradient of the function (x^{2}+y^{2}) sin x? What is its directional derivative in the direction of the vector (1, 1) at the point x = 1, y = 2?
