OpenStudy (anonymous):

Suppose that \(0<a<b\). Determine whether the sequence \(\{(a^n+b^n)^{1/n}\}\) is convergent. (The original post was cut off after the "<"; the condition and sequence are restored from the answers below.)

OpenStudy (kinggeorge):

I'm pretty sure that's convergent. Since \(0<a<b\), as n goes to infinity \(a^n\) becomes much smaller than \(b^n\), so the \(a^n\) term stops mattering. To make that precise, factor out \(b^n\): \[(a^n+b^n)^{1/n}=\left(b^n\left(1+(a/b)^n\right)\right)^{1/n}=b\left(1+(a/b)^n\right)^{1/n}\] Since \(0<a/b<1\), we have \((a/b)^n\rightarrow 0\), so the factor on the right tends to 1. Thus \[\lim_{n\rightarrow \infty} (a^n +b^n)^{1/n}=b\]
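A quick numeric sanity check (not part of the thread; `seq_term`, `a = 2`, and `b = 3` are illustrative choices). Factoring out \(b^n\) as above also avoids overflow for large n:

```python
def seq_term(a, b, n):
    # (a^n + b^n)^(1/n), computed as b * (1 + (a/b)^n)^(1/n)
    # so that b**n never overflows for large n
    return b * (1.0 + (a / b) ** n) ** (1.0 / n)

a, b = 2.0, 3.0
for n in [1, 10, 100, 1000]:
    print(n, seq_term(a, b, n))
```

The printed values should settle toward b = 3, consistent with the limit computed above.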

OpenStudy (anonymous):

To show that the sequence converges, we have to use the monotonic sequence theorem: if the sequence is decreasing and bounded below, then it is convergent. I already showed that the sequence is decreasing, but how do I show that it is bounded below?

OpenStudy (kinggeorge):

Both a and b are positive. Since they're positive, any power \(a^n\) or \(b^n\) is also positive. Therefore, \[a^n+b^n\]is positive. By a similar argument, \[(a^n+b^n)^{1/n}\]must also be positive. Therefore, the sequence is bounded below by 0.
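A numeric check of both hypotheses of the monotonic sequence theorem (illustrative only, with a = 2 and b = 3; in fact each term is at least b, which is an even tighter lower bound than 0, since \(a^n+b^n>b^n\)):

```python
a, b = 2.0, 3.0
# first 50 terms, computed stably by factoring out b^n
terms = [b * (1.0 + (a / b) ** n) ** (1.0 / n) for n in range(1, 51)]

decreasing = all(x > y for x, y in zip(terms, terms[1:]))
bounded_below = all(t >= b for t in terms)  # hence also bounded below by 0
print(decreasing, bounded_below)
```

Both checks pass, matching the argument above: decreasing and bounded below, so the sequence converges.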

OpenStudy (anonymous):

ok....thanks...=)

OpenStudy (kinggeorge):

You're welcome.
