I came across an interesting limit; tell me what you think.
\[\lim_{x \to 0} \left(\frac{ 1 }{ x^2 } - \frac{ \sin x }{ x^3 }\right)\] What's the answer?
sum property
is the answer -1?
you can just use l'hospital a few times
write as one fraction of course
then do the whole l'hospital thing
The answer's not -1 @Flop
can you not multiply by x^2 so you would get 1-sin(x)/x = 1-1=0
its actually 0?
It is zero, yeah.
Nope @Jemurray3
Oh my fault.. 1
hint: it is something between 0 and 1
i agree with @myininaya 's idea: we have a 0/0 form by substitution, so l'hopital's rule looks good (applied a few times)
the divergence cancels out, was the point.
1/3
not 1/3
but getting warmer
l'hospital would work but there has to be a shorter way
l'hospital isn't long here it is actually very short and cute
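For reference, here's how the short-and-cute l'Hospital route plays out once the sum is written as one fraction (every step before the last is 0/0): \[\lim_{x \to 0}\frac{x-\sin x}{x^{3}}=\lim_{x \to 0}\frac{1-\cos x}{3x^{2}}=\lim_{x \to 0}\frac{\sin x}{6x}=\lim_{x \to 0}\frac{\cos x}{6}=\frac{1}{6}\]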
oh balls I can't divide. 1/6
that's great
Despite my apparent inability to do fractions, the main principle is what I meant -- expanding in a power series cancels the divergence and you just pick up the leftovers.
\[-\sin x=-x+\frac{ x^3 }{ 3! }-...\] \[-\sin x+x=\frac{ x^3 }{ 3! }-...\] \[\frac{-\sin x+x}{x^3}=\frac{ 1 }{ 3! }-...\] The rest of the terms are dependent on x so they go away. =P
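If anyone wants to sanity-check the series argument numerically, here's a quick sketch in Python with sympy (assuming it's installed):

```python
import sympy as sp

x = sp.symbols('x')

# The original limit, written as a single fraction: (x - sin x) / x^3
expr = (x - sp.sin(x)) / x**3

# sympy confirms the limit is 1/6
print(sp.limit(expr, x, 0))  # -> 1/6

# The series expansion shows why: the x and -x terms cancel,
# leaving x^3/3! as the leading term of x - sin(x)
print(sp.series(x - sp.sin(x), x, 0, 6))  # -> x**3/6 - x**5/120 + O(x**6)
```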
^ That's what I get for trying to do eight things at once... but I'm a big supporter of power series over l'Hospital when possible.
now that is creative
Just playing around with trig stuff and came across that and thought I'd share it. If anyone else wants to show interesting/fun trig/powerseries/limit stuff go for it!
I've got a nice integration trick, something cool about power series, and an unusual limit/iterated function/approximation trick. Take your pick. Too much of that rhymed.
ALL!
Or your favorite, I have time lol. Integration if you still can't pick.
lets go with the integral first.....i should be doing physics
i do like integral tricks, although all of the above is still as good a choice as any. >.>
Alrighty -- I'll just keep going in this thread until I get yelled at by the establishment.
So we are imagining that we have an integral of the form \[ \int dx \space f(g(x)) \cdot g(x) \] Notice that the second function is g(x), not g'(x) -- otherwise this would be a straightforward antiderivative.
What I'm going to do is invent a parameter lambda, so I write my integral as a function of lambda: \[ I(\lambda) = \int dx \space f(\lambda \cdot g(x)) \cdot g(x) \]
I'm listening... =)
I notice something lovely - that I can rewrite the integrand as follows: \[ I(\lambda) = \int dx \frac{\partial}{\partial \lambda} f(\lambda\cdot g(x)) \] and since the parameter lambda and the integration variable x are completely independent, I'm free to exchange the integral sign and the partial derivative: \[I(\lambda) = \frac{d}{d\lambda} \int dx \space f(\lambda\cdot g(x)) \]
Wait a sec... Are you going to take the derivative wrt lambda and ultimately set lambda = 1?
That's the idea, yes! Though it should be noted that you can do other things with this as well.
Yeah, I'm curious how that goes, since you'll have to integrate and solve for a constant of integration... and wow, this keyboard is broken.
The point is, if you are able to perform the remaining integral, then differentiation wrt lambda and setting lambda = 1 solves the problem. It may perhaps seem artificial, but allow me to give you an example:
there's one way to do it without series or l'hopital, change x = 3y
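To spell that substitution out: call the limit L (the series argument already shows it exists) and use the identity sin 3y = 3 sin y - 4 sin^3 y: \[L=\lim_{y \to 0}\frac{3y-\sin 3y}{27y^{3}}=\lim_{y \to 0}\frac{3(y-\sin y)+4\sin^{3} y}{27y^{3}}=\frac{1}{9}L+\frac{4}{27}\lim_{y \to 0}\left(\frac{\sin y}{y}\right)^{3}=\frac{1}{9}L+\frac{4}{27}\] Solving \[\frac{8}{9}L=\frac{4}{27}\] gives L = 1/6, matching the other answers.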
Hmm that's really clever, I was thinking of something slightly different.
It's well known that \[ \int_{-\infty}^\infty dx \space e^{-x^2} = \sqrt{\pi} \] It can similarly be shown straightforwardly that \[ \int_{-\infty}^\infty dx \space e^{-ax^2} = \sqrt{\pi/a} \] So what happens when we have (suppressing the limits, which go from - infinity to infinity) \[ \int dx \space x^2 e^{-x^2} \] ?
You can use integration by parts to solve this, but what I'm gonna show in a second will demonstrate IBP's limitations in this problem.
\[ I(\lambda) = \int dx \space x^2 \cdot e^{-\lambda x^2} = -\frac{d}{d\lambda} \sqrt{\pi/\lambda} = \frac{\sqrt{\pi}}{2\lambda^{3/2}}\] so \[I(1) = \frac{\sqrt{\pi}}{2} \] Just like you said. Now let's stretch our wings a little bit -- what about \[ \int dx \space x^n \cdot e^{-\lambda x^2} \]
Because of the parity of the problem, it's clear that if n is odd, the integral is equal to zero. But if it's even, and we define m = n/2, then we can write down the answer (each derivative with respect to lambda pulls down a factor of -x^2, hence the sign): \[ I(\lambda) = \left(-\frac{d}{d\lambda}\right)^{m} \sqrt{\frac{\pi}{\lambda}} \]
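A quick sympy check of that formula for m = 1 and m = 2 (i.e. n = 2 and n = 4), a sketch assuming sympy is available:

```python
import sympy as sp

x, lam = sp.symbols('x lambda', positive=True)

for m in (1, 2):
    # Direct evaluation of the integral of x^(2m) * e^(-x^2)
    direct = sp.integrate(x**(2*m) * sp.exp(-x**2), (x, -sp.oo, sp.oo))
    # The parameter trick: (-d/dlambda)^m applied to sqrt(pi/lambda), at lambda = 1
    trick = (-1)**m * sp.diff(sp.sqrt(sp.pi / lam), lam, m).subs(lam, 1)
    assert sp.simplify(direct - trick) == 0

print("both methods agree")
```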
Which you can simplify however you like. Now let's go for the big one: What about \[\int dx \space f(x) e^{-x^2} \] ? Let's assume we know the power series of f(x) around the point x = 0. I.e. \[ f(x) = \sum_{n = 0}^\infty \frac{f^{(n)}(0)}{n!}x^n \] We can immediately write down our integral: \[\int dx \space f(x) e^{-x^2} = \sum_{n = 0,2,4,...} \frac{f^{(n)}(0)}{n!}\frac{d^{n/2}}{d\lambda^{n/2}} \sqrt{\frac{\pi}{\lambda}} \Big|_{\lambda = 1}\]
Oops! I dropped the sign from the last line -- with m = n/2, it should read \[ \int dx \space f(x)e^{-x^2} = \sum_{m = 0}^\infty (-1)^m\frac{f^{(2m)}(0)}{(2m)!} \frac{d^m}{d\lambda^m} \sqrt{\frac{\pi}{\lambda}}\Big|_{\lambda = 1} \] Whew! Sorry for the numerous typos, but that should be right.
Well damn, that's clever!
But does the power series simplify further into something nicer?
It depends entirely on the function f(x), which we haven't specified.
That's the nice part -- that represents the integral for *any* function f(x) that has a convergent power series. So really, just analytic functions.... but still, that's quite a variety you could have there.
I get the reasoning though; it makes perfect sense, since the power series turns it into a polynomial and only the even terms survive, so we're good there. What about if we know its Fourier series -- can we do anything interesting there?
Hmm... if we knew the Fourier series, we'd still end up with a bunch of sines and cosines, which would be difficult to integrate.
Here's an interesting question which I just thought of, and will be spending some time on ... if f has a power series with a particular radius of convergence, i.e. it DIVERGES past a certain point, could we still use it here? The exponential part would kill off all of the polynomial terms as they go to infinity, so almost everything would converge in principle.
Have you seen this before? Let a be a constant greater than zero. \[1=\int\limits_{0}^{\infty} ae^{-ax}dx\] This integral is equal to 1 by simply solving it. Pull out the constant a and you get: \[a^{-1}=\int\limits_{0}^{\infty} e^{-ax}dx\] Differentiate both sides with respect to a (the negative signs cancel!): \[a^{-2}=\int\limits_{0}^{\infty} xe^{-ax}dx\] \[2a^{-3}=\int\limits_{0}^{\infty} x^2e^{-ax}dx\] \[3!\,a^{-4}=\int\limits_{0}^{\infty} x^3e^{-ax}dx\] Keep going... \[\frac{ n! }{ a^{n+1} }=\int\limits_{0}^{\infty} x^n e^{-ax}dx\] and of course for a=1: \[n!=\int\limits_{0}^{\infty} x^n e^{-x}dx\]
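That differentiate-under-the-integral ladder is easy to verify with sympy; here's a sketch checking n!/a^{n+1} = \[\int_0^\infty x^n e^{-ax}dx\] for the first few n:

```python
import sympy as sp

x, a = sp.symbols('x a', positive=True)

for n in range(5):
    # Direct evaluation of the integral of x^n * e^(-a x) from 0 to infinity
    integral = sp.integrate(x**n * sp.exp(-a*x), (x, 0, sp.oo))
    assert sp.simplify(integral - sp.factorial(n) / a**(n + 1)) == 0

# And at a = 1 this is the factorial (gamma function): 4! = 24
print(sp.integrate(x**4 * sp.exp(-x), (x, 0, sp.oo)))  # -> 24
```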
I've never seen that before -- that's an awesome way to arrive at the gamma function.
@Jemurray3 Actually isn't that kind of the idea behind the Laplace transform? I think you're right as long as you can show that e^{-x^2} dies out faster.
Have you ever heard of a Padé approximant?
Nope, what's that?
Let me start another thread and introduce it to you quickly. It's an active area of research and pretty damn mysterious, but it is related to the subtlety of the question I asked.
Hmm sounds awesome, thanks!