Why, in a Taylor series, do we assume that x = a?
A Taylor series is used primarily for approximation. If the series has (x - a) raised to increasing powers, then it's the series about the point x = a. The gist of it is that we approximate functions by cutting the Taylor series off at some point, once we reach a desired accuracy. Say we want to approximate a function at some point using a Taylor series about x = a; then the series will converge faster the closer your x-value is to a.
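For reference, the standard form being described here is the Taylor series of f about the point x = a,

\[ f(x) = \sum_{n=0}^{\infty} \frac{f^{(n)}(a)}{n!}\,(x-a)^n, \]

and "cutting it off" means keeping only the terms up to some degree N:

\[ f(x) \approx \sum_{n=0}^{N} \frac{f^{(n)}(a)}{n!}\,(x-a)^n. \]

When a = 0, the series is often called a Maclaurin series.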
Which also implies that the closer your x-value is to a, the earlier you can cut the rest of the series off.
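A quick numerical sketch of that claim, using an example I've picked for illustration (sin(x) expanded about a = 0, truncated at degree 5, so none of these specifics come from the thread itself): the truncation error is tiny near the expansion point and grows as x moves away from a.

    import math

    def taylor_sin(x, a=0.0, n_terms=6):
        """Truncated Taylor series of sin about x = a.
        Derivatives of sin cycle with period 4: sin, cos, -sin, -cos, ...
        """
        derivs = [math.sin(a), math.cos(a), -math.sin(a), -math.cos(a)]
        total = 0.0
        for n in range(n_terms):
            total += derivs[n % 4] / math.factorial(n) * (x - a) ** n
        return total

    # The error of the degree-5 approximation grows as x moves away from a = 0.
    for x in [0.1, 0.5, 1.0, 2.0, 3.0]:
        approx = taylor_sin(x)
        print(f"x={x:4.1f}  approx={approx:+.6f}  error={abs(approx - math.sin(x)):.2e}")

Running this shows errors on the order of 1e-10 at x = 0.1 but around 1e-1 by x = 3, which is exactly why you can cut the series off earlier when x is close to a.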
OK, but then why do we say "find the Taylor series about x = 0" when it's actually a = 0? I hope you see what I mean.
Semantics... they kill me. Better consult an expert on this one :D Sorry I couldn't be of more help, though :)
Semantics?