Test convergence of series
So I know that this converges, because a^k + 1 is always bigger than 0 (in fact bigger than one), which means the fraction is always below 1... but I am not sure how to actually solve it :/
What happens when a=1 ?
I get 1^k + 1 in the denominator, so for k = 1 the term is 1/2
We get \[\frac12+\frac12+\frac12+\cdots=\infty,\] so the series diverges at a=1.
Doesn't diverging mean that the series adds up to something bigger than 1?
Because like this, no matter what a is, the series will always look like 1/m + 1/m + 1/m ...
Divergent just means not convergent. A series is convergent when, as you add up the terms, the partial sums approach a specific number.
At a = 1, if you keep adding 1/2 you will head off to infinity, not toward a specific number, so your series diverges at a = 1.
Also check what happens when 0<a<1 and when a > 1.
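The exact series isn't written out in the thread, but assuming it is \(\sum_{k\ge 1} \frac{1}{a^k+1}\) (consistent with the 1/2 terms at a = 1), a quick numeric sketch of the partial sums shows the three regimes side by side. The function name `partial_sum` is just for illustration:

```python
# Numeric sketch, assuming the series is sum_{k=1}^inf 1/(a^k + 1).
# For a > 1 the partial sums level off; for a = 1 they grow like n/2;
# for 0 < a < 1 the terms approach 1, so the sums grow without bound.

def partial_sum(a, n):
    """Partial sum of 1/(a^k + 1) for k = 1..n."""
    return sum(1.0 / (a**k + 1.0) for k in range(1, n + 1))

for a in (0.5, 1.0, 2.0):
    sums = [round(partial_sum(a, n), 4) for n in (10, 100, 1000)]
    print(f"a = {a}: partial sums at n = 10, 100, 1000 -> {sums}")
```

Running this, only the a = 2 column stabilizes; the other two keep growing, matching the argument above that the series diverges for 0 < a ≤ 1.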
You can use the ratio test for convergence. https://en.wikipedia.org/wiki/Convergent_series#Convergence_tests
okay, but is there a way to calculate this? like some method? or do i only explain it?
The ratio test isn't much help here, since I get [drawing attachment not shown]
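The attached drawing isn't recoverable, but assuming the series is \(\sum_{k\ge 1} \frac{1}{a^k+1}\), the ratio of consecutive terms is \(\frac{a^k+1}{a^{k+1}+1}\), which tends to 1 for 0 < a ≤ 1 (so the ratio test is inconclusive there, matching the comment above) and to 1/a < 1 for a > 1 (where it does prove convergence). A small check, with the helper name `ratio` chosen here for illustration:

```python
# Ratio of consecutive terms a_{k+1}/a_k for a_k = 1/(a^k + 1),
# i.e. (a^k + 1) / (a^{k+1} + 1), evaluated at a large k to
# approximate the limit used by the ratio test.

def ratio(a, k):
    term = lambda j: 1.0 / (a**j + 1.0)
    return term(k + 1) / term(k)

for a in (0.5, 1.0, 2.0):
    print(f"a = {a}: ratio at k = 50 -> {round(ratio(a, 50), 6)}")
```

For a = 0.5 and a = 1 the ratio is (essentially) 1, so the test says nothing; for a = 2 it is about 0.5, which proves convergence for that case. The divergent cases still need the separate term-size argument given earlier in the thread.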