7.1 Introduction: the Obvious Approximation: $f'(x) \approx (f(x+d) - f(x))/d$

Suppose we have a given function, $f$, and we seek its derivative at argument $x_0$.

One way to estimate it is to evaluate $f$ at two points, $x_1$ and $x_2$, and examine the slope of the line from $(x_1, f(x_1))$ to $(x_2, f(x_2))$. But what should we use for $x_1$ and $x_2$, and what will we learn about $f'(x_0)$?

The choice that first occurs to people is to set $x_1 = x_0$ and $x_2 = x_0 + d$ for some very small $d$. So one can compute

$$\frac{f(x_0+d)-f(x_0)}{d}$$

This is not a horrible thing to do, but it is not very good, as we shall see.
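As a sketch, the difference quotient above translates directly into code (the function name `forward_diff` and the test function $f(x) = x^2$ are illustrative choices, not from the text):

```python
def forward_diff(f, x0, d):
    """Estimate f'(x0) by the slope of the line from (x0, f(x0)) to (x0+d, f(x0+d))."""
    return (f(x0 + d) - f(x0)) / d

# Example: f(x) = x^2, whose derivative at 3 is exactly 6.
estimate = forward_diff(lambda x: x * x, 3.0, 1e-5)
```

With $d = 10^{-5}$ the estimate comes out close to 6, but not exact: the next section of the argument explains why the choice of $d$ matters.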

What's wrong with it?

Well, if $d$ is too big, the linear approximation won't be accurate; and if $d$ is too small, round-off error in your calculation tools may screw up your answer. And the transition from too big to too small may be difficult to find.
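A short experiment makes this trade-off visible (a sketch; the choice of $\sin$ at $x_0 = 1$, where the true derivative is $\cos(1)$, is ours, not the text's):

```python
import math

def forward_diff(f, x0, d):
    """Forward-difference estimate of f'(x0)."""
    return (f(x0 + d) - f(x0)) / d

true_value = math.cos(1.0)  # exact derivative of sin at 1.0

# Sweep d from large to tiny: the error first shrinks as the linear
# approximation improves, then grows again as round-off in the
# subtraction f(x0+d) - f(x0) dominates.
for k in range(1, 16):
    d = 10.0 ** (-k)
    err = abs(forward_diff(math.sin, 1.0, d) - true_value)
    print(f"d = 1e-{k:02d}   error = {err:.2e}")
```

Run in ordinary double precision, the error bottoms out around $d \approx 10^{-8}$ and then climbs back up, so there is only a narrow window of good choices for $d$.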