\[a_n=\frac{(-3)^n}{n!} = \frac{(-3)\times(-3)\times(-3)\times\cdots\times(-3)}{1\times2\times3\times\cdots\times n} \ge \frac{-3}{1}\times\frac{(-3)^{n-2}}{3^{n-2}} = \frac{-3}{1}\left(\frac{-3}{3}\right)^{n-2}\] \[-1<1\] converges to \(-1\)?
0
say what?
maple gave me that lol took limit as n->infinity
what's wrong with my thought process @KingGeorge?
He's using the (not so obvious) fact that \(f(x)=x!\) trumps \(f(x)=c^x\) as \(x\to\infty\).
I'm not actually sure what you've done wrong. However, I can't tell why it would prove convergence to anything either. Also, @timo86m don't just give out answers like that.
If you wanted to prove convergence, the alternating series test would be a good bet.
here is the question from my book. "Determine whether the sequence converges or diverges. If it converges, find the limit."
First, use the alternating series test to determine if it converges. When you use that, you'll easily see what the limit is.
but it's sequences
In this case, it's basically the same thing. Look at \[\lim_{n\to\infty}(-1)^{n}\frac{3^n}{n!}\]
Using the same concept of the alternating series test, you can just look at \[\lim_{n\to\infty}\frac{3^n}{n!}\]
here is what I did...I took Satellite's train of thought and applied it to this question. I guess it's not that simple? http://openstudy.com/users/mathsofiya#/updates/4ff89c0be4b058f8b7631d1b
If that limit I just posted above is anything but 0, it diverges. Otherwise, it converges to 0.
Here's a good intuition for why \(x!\) increases faster than \(c^x\). Think about a large value of \(x\), and think about what increasing it by one does to each expression. For the \(x!\), you multiply the old value by \(x+1\), whereas for \(c^x\) you multiply by \(c\). Since \(c\) is fixed, and \(x\) tends to infinity, \(x!\) will always eventually increase faster than \(c^x\) as \(x\to\infty\).
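That per-step comparison is easy to check numerically. This is not from the thread, just a quick sketch in plain Python for \(c=3\):

```python
# Compare how n! and c**n grow step by step for c = 3.
# Each step, n! is multiplied by n, while c**n is multiplied by a fixed c,
# so once n exceeds c the factorial's growth factor wins every step.
c = 3
factorial, power = 1, 1
for n in range(1, 11):
    factorial *= n   # n! picks up a factor of n
    power *= c       # c**n picks up a fixed factor of c
    print(n, factorial, power, factorial > power)
```

By n = 7 the factorial (5040) has already passed 3^7 (2187) and only pulls further ahead.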
makes sense @nbouscal
it's zero cuz n! grows larger than even \(|3^n|\)
and, loosely speaking, \[\frac{3^n}{\infty}=0\]
silly question...so this sequence is finite?
don't shoot me
No no no, the sequence is still infinite. We're just looking at how these things act as they tend towards infinity by looking at examples of them increasing at specific points.
ok :P
n! grows a heck of a lot faster than some integer^n
So basically the key point to look at is where the factorial starts growing faster than the exponential. Once n exceeds c, each new factor in the factorial is bigger than the factor c in the exponential, so the factorial grows faster from there on, and it's just a matter of time for the factorial to catch up and pass the exponential in value. Then it will continue to grow faster as you approach infinity, which means the sequence converges to zero.
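You can watch exactly this happen in the terms \(|a_n|=3^n/n!\). A quick numeric check (mine, not from the thread):

```python
from math import factorial

# |a_n| = 3**n / n! rises while n is small, peaks at n = 2 and n = 3
# (both give 4.5), then shrinks toward 0 once the factorial takes over.
terms = [3**n / factorial(n) for n in range(15)]
print(terms[:5])   # 1.0, 3.0, 4.5, 4.5, 3.375
print(terms[14])   # already well under 1e-4
```

So the absolute values go up at first, but the factorial in the denominator wins in the end and drags the terms to 0.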
ok...let's see if I understand this. The sequence is infinite, but eventually converges to zero.
right.
yes
Hmm I don't like the use of the word eventually there.
That implies that there is a point at which the terms of the sequence are zero, and that is not the case.
We say that the sequence converges to zero, but that does not mean that it actually gets there.
how would you rephrase this sentence? "The sequence is infinite, but eventually converges to zero." the sequence is infinite, and converges to zero?
Yes. The word eventually implies that there is a point at which the sequence converges to zero, but that's not what convergence means. No matter how big an \(n\) you choose, \(\dfrac{(-3)^n}{n!}\) will always be nonzero.
I think this is an important distinction to understand when talking about sequence convergence. Sequence convergence simply means that you can get as close as you like to zero by going far enough up the sequence, it doesn't mean the sequence ever actually reaches zero.
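"As close as you like, by going far enough" can be made concrete: for any epsilon you pick, there's an N past which every term is within epsilon of 0. A small sketch (plain Python; the helper name `first_N` is mine):

```python
from math import factorial

def first_N(eps):
    """Find an N such that |(-3)**n / n!| < eps for all n >= N.

    The terms |a_n| = 3**n / n! are strictly decreasing for n >= 3,
    so scanning forward from n = 3 for the first term below eps works.
    """
    n = 3
    while 3**n / factorial(n) >= eps:
        n += 1
    return n

print(first_N(0.01))   # -> 11
```

No matter how tiny an epsilon you hand it, some N exists; that's all "converges to 0" asserts, and no term is ever actually 0.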
I don't see that the term "eventually" is incorrect. Converging to 0 does not mean that it eventually equals 0.
Eventually implies temporality to the convergence, and convergence does not have temporality.
In this case then, eventually means "immediately" since it immediately starts to converge to 0. The two aren't mutually exclusive.
"Starts to converge" is still attributing temporality to the convergence. Convergence is not temporal. A sequence either converges or it does not. There is not a point at which it starts to converge.
Well it starts at \(n=0\) (or 1 in some cases).
What are you going to do, define a value of epsilon where we can say the convergence has begun? Convergence is on-off, there is no notion of time involved.
Either there's an N for every epsilon or there is not.
I could also argue that for the first few terms for \(n=0,1,2\) the absolute value of the sequence is increasing, and is thus not converging to 0.
If you were to argue as such, you would be misunderstanding what convergence means.
Convergence isn't an action, it is an attribute.
Maybe that's why I don't like analysis very much :P
It can be convenient to think about convergence as an action that the sequence takes, but it is not really proper. A sequence converges if there's an N for every epsilon. It doesn't converge if there isn't.
Haha, I don't blame you for that. I think most people don't like analysis very much.
And now we've probably confused MathSofiya much more than we have helped :P
no I like the discussion:P
thanks guys!!!
you're welcome.
While the results alternate between negative and positive, they are still ALWAYS getting closer and closer to 0. (hand-drawn sketch of the terms zig-zagging toward zero omitted)