OpenStudy (anonymous):

Suppose that y = f(x) is differentiable at x = a and that g(x) = m(x - a) + c is a linear function, where m and c are constants. If the error E(x) = f(x) - g(x) were small enough near x = a, we might think of using g as a linear approximation of f instead of the linearization L(x) = f(a) + f'(a)(x - a). Show that if we impose on g the two conditions

1. E(a) = 0
2. lim as x --> a of E(x)/(x - a) = 0

then g(x) = f(a) + f'(a)(x - a). Thus the linearization L(x) gives the only linear approximation whose error is both zero at x = a and negligible in comparison with x - a.
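One way to argue this (a sketch, applying the two conditions in order: condition 1 determines c, condition 2 determines m):

```latex
% Condition 1: E(a) = 0 pins down c.
E(a) = f(a) - g(a) = f(a) - \bigl(m(a-a) + c\bigr) = f(a) - c = 0
  \quad\Longrightarrow\quad c = f(a).

% Condition 2: substitute c = f(a) and split the difference quotient.
\frac{E(x)}{x-a} = \frac{f(x) - f(a) - m(x-a)}{x-a}
                 = \frac{f(x) - f(a)}{x-a} - m
  \;\xrightarrow[x \to a]{}\; f'(a) - m,

% since f is differentiable at a. Setting this limit equal to 0 forces
m = f'(a),
  \quad\text{hence}\quad
g(x) = f(a) + f'(a)(x-a) = L(x).
```

Because each condition determines one of the two constants uniquely, no other choice of m and c can satisfy both, which is exactly the uniqueness claim in the problem.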
