Can someone please show me where I went wrong in solving the following problem? Find the local linear approximation of the function f(x)=sqrt(1+x) at x0=0, and use it to approximate sqrt(0.9) and sqrt(1.1).
your derivative is \(\frac{1}{2\sqrt{x+1}}\)
Here is what I did... Approximation at x=0: f(0)=sqrt(1+0)=1, f'(0)=1/[2sqrt(1+0)]=1/2, so the point is (0,1)
at \(x=0\) your slope is \(\frac{2(x^2+1)-2x(2x)1}{2}\) and your line is \[y=\frac{1}{2}x+1\]
what on earth???
y-1=.5(x-0), y=.5(.9-0)+1=1.45, y=.5(1.1-0)+1=1.55
i meant your slope is \(\frac{1}{2}\)
so you get \[y=\frac{1}{2}x+1\] now you want to approximate \(\sqrt{.9}\) so if \(x+1=.9\) then \(x=-.1\)
i see the mistake, you used \(x=.9\) not \(x=-.1\)
Ok, so I was plugging in the wrong value for x
yeah, it is \(1+x=.9\) and \(1+x=1.1\) and so \(x=-.1\) and \(x=.1\) respectively
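A quick numeric sanity check (a minimal Python sketch; the name `L` for the tangent line is just my label, not from the problem) plugging in the corrected x values:

```python
import math

# Local linear approximation of f(x) = sqrt(1 + x) at x0 = 0:
# L(x) = f(0) + f'(0)*x = 1 + x/2   (the line y = (1/2)x + 1 from above)
def L(x):
    return 1 + x / 2

# sqrt(0.9): 1 + x = 0.9  ->  x = -0.1
# sqrt(1.1): 1 + x = 1.1  ->  x =  0.1
for target, x in [(0.9, -0.1), (1.1, 0.1)]:
    print(f"sqrt({target}) ~ L({x}) = {L(x):.4f}   (actual {math.sqrt(target):.4f})")
```

It gives sqrt(0.9) ≈ 0.95 against the true 0.9487... and sqrt(1.1) ≈ 1.05 against the true 1.0488..., so the corrected substitutions land close to the real values.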
Ok, then why did they bother giving me the value for \(x_0\)? It is unnecessary, correct?
I think that value kinda confused me
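For what it's worth, \(x_0\) isn't unnecessary: it is the point the approximation is built around. The general local linear approximation is \[L(x)=f(x_0)+f'(x_0)(x-x_0),\] and with \(x_0=0\) that is where the \(f(0)=1\) and \(f'(0)=\frac{1}{2}\) in the line \(y=\frac{1}{2}x+1\) come from; the inputs \(x=-.1\) and \(x=.1\) are then measured relative to that \(x_0\).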