OpenStudy (kirbykirby):

Why do Taylor polynomials of increasing degree give progressively worse approximations for some functions?

OpenStudy (anonymous):

are they?

OpenStudy (kirbykirby):

Well, usually they don't, but for some functions the error grows as more terms are included in Tn. In my case this happened only over a certain range of x.

OpenStudy (jamesj):

I'd like to see an example of this.

OpenStudy (kirbykirby):

Like with the function 1/(x^2 + 3x - 4): evaluating f(-1.5) - Tn(-1.5) gives progressively larger errors as n increases, but for f(0.5) - Tn(0.5) the error gets smaller as n increases.
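A quick way to reproduce this observation (my own sketch, not posted in the thread): partial fractions give 1/(x^2 + 3x - 4) = (1/5)[1/(x-1) - 1/(x+4)], so the Maclaurin coefficients are c_k = -(1/5) - (1/20)(-1/4)^k, and the Taylor polynomials can be evaluated directly:

```python
# Sketch: Taylor polynomials T_n of f(x) = 1/(x^2 + 3x - 4) centered at 0.
# The coefficients c_k come from the partial-fraction decomposition
# f(x) = (1/5)/(x - 1) - (1/5)/(x + 4) expanded as geometric series.

def f(x):
    return 1.0 / (x**2 + 3*x - 4)

def taylor_poly(x, n):
    """Evaluate T_n(x), the degree-n Taylor polynomial of f at 0."""
    total = 0.0
    for k in range(n + 1):
        c_k = -(1/5) - (1/20) * (-0.25)**k   # k-th Maclaurin coefficient
        total += c_k * x**k
    return total

# Compare errors at a point outside (x = -1.5) and inside (x = 0.5)
# the interval of convergence as the degree n grows.
for x in (-1.5, 0.5):
    print(f"x = {x}")
    for n in (2, 5, 10, 20):
        err = abs(f(x) - taylor_poly(x, n))
        print(f"  n = {n:2d}, |f - T_n| = {err:.6g}")
```

Running this shows the error at x = 0.5 shrinking with n while the error at x = -1.5 blows up, matching the behavior described above.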

OpenStudy (jamesj):

Interesting. For your example, where is the expansion centered? For the record, the usual way to measure error is the distance between the function and the approximation over an interval, rather than at a single point. But it's still interesting, if your example holds up, that the approximation at one point gets farther and farther away.

OpenStudy (kirbykirby):

Oh, I'm using the expansion centred at x = 0.

OpenStudy (jamesj):

Ah ... the issue is that x^2 + 3x - 4 = (x - 1)(x + 4) has a zero at x = 1, which is the singularity of f nearest the center. So the Taylor series is only guaranteed to converge for x such that \( |x - 0| < 1 \). Hence for x = -3/2 we are outside that interval of convergence, and that's why there is no guarantee that \( T_n(-3/2) \) converges to \( f(-3/2) \).
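A sketch of why the radius is exactly 1 (my derivation, not from the thread): the partial-fraction decomposition expands into two geometric series,
\[
\frac{1}{x^2 + 3x - 4}
= \frac{1}{5}\left(\frac{1}{x-1} - \frac{1}{x+4}\right)
= -\frac{1}{5}\sum_{n \ge 0} x^n \;-\; \frac{1}{20}\sum_{n \ge 0}\left(-\frac{x}{4}\right)^n,
\]
where the first series converges only for \( |x| < 1 \) and the second for \( |x| < 4 \). The smaller radius, 1, governs the combined series, which is why x = 0.5 converges while x = -3/2 does not.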

OpenStudy (kirbykirby):

Ah ok I see what you mean. That was helpful thanks :)!
