To approximate the change $\Delta f$ in a [[Derivative of a function|differentiable]] [[Function]] $f(x)$ from a point $x=a$ to a nearby point $x=a+\Delta x$, recognize that the [[Linearization]] becomes increasingly accurate as $\Delta x$ shrinks. While $\Delta f$ is the true change, the **differential estimate** is $df = f^{\prime}(a)\,\Delta x$. To measure the accuracy, subtract the estimate from the true change.

$$\begin{array}{ll}\text{The true change:} & \Delta f = f(a+\Delta x)-f(a) \\ \text{The differential estimate:} & df = f^{\prime}(a)\,\Delta x\end{array}$$

$$\begin{aligned} \text{Approximation error} &= \Delta f - df \\ &= \Delta f - f^{\prime}(a)\,\Delta x \\ &= \underbrace{f(a+\Delta x)-f(a)}_{\Delta f} - f^{\prime}(a)\,\Delta x \\ &= \underbrace{\left(\frac{f(a+\Delta x)-f(a)}{\Delta x} - f^{\prime}(a)\right)}_{\text{Call this part } \epsilon} \cdot \Delta x \\ &= \epsilon \cdot \Delta x \end{aligned}$$

As $\Delta x \rightarrow 0$, the difference quotient approaches the derivative,

$$\frac{f(a+\Delta x)-f(a)}{\Delta x} \rightarrow f^{\prime}(a),$$

which means $\epsilon$ (the quantity in the parentheses) approaches $0$:

$$\epsilon \rightarrow f^{\prime}(a) - f^{\prime}(a) = 0$$

So the approximation error $\epsilon \cdot \Delta x$, being a product of two factors that each go to zero, approaches zero even faster than $\epsilon$ or $\Delta x$ separately. Perhaps best summarized as

$$\underbrace{\Delta f}_{\begin{array}{c}\text{true} \\ \text{change}\end{array}} = \underbrace{f^{\prime}(a)\,\Delta x}_{\begin{array}{c}\text{estimated} \\ \text{change}\end{array}} + \underbrace{\epsilon\,\Delta x}_{\text{error}}$$

Leading to [[Change in y near x=a]].
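
As a quick worked check (an illustration not in the original note), take $f(x)=x^2$ at $a=1$. Here the error $\epsilon\,\Delta x$ works out to exactly $(\Delta x)^2$, which shrinks much faster than $\Delta x$ itself:

$$\begin{aligned} \Delta f &= (1+\Delta x)^2 - 1^2 = 2\,\Delta x + (\Delta x)^2 \\ df &= f^{\prime}(1)\,\Delta x = 2\,\Delta x \\ \Delta f - df &= (\Delta x)^2 = \underbrace{\Delta x}_{\epsilon} \cdot \Delta x \end{aligned}$$

With $\Delta x = 0.1$, for instance, $\Delta f = 0.21$ and $df = 0.2$, so the error is $0.01$, only a tenth the size of $\Delta x$.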