OpenStudy (idku):

One question about the domain of a Laplace of a Particular function.

OpenStudy (idku):

I have started to read something about laplaces on my own:)

OpenStudy (idku):

\(\large\displaystyle {\cal L}\{\sinh(\alpha t)\}(s)=\lim_{N\to\infty}\int\limits_{0}^{N} e^{-st}\sinh(\alpha t)dt\)
\(\large\displaystyle {\cal L}\{\sinh(\alpha t)\}(s)=\frac{1}{2}\lim_{N\to\infty}\int\limits_{0}^{N} (e^{-st}e^{\alpha t}-e^{-st}e^{-\alpha t})dt\)
\(\large \displaystyle {\cal L}\{\sinh(\alpha t)\}(s)=\frac{1}{2}\lim_{N\to\infty}\int\limits_{0}^{N} (e^{(\alpha-s)t}-e^{-(\alpha+s) t})dt\)

OpenStudy (idku):

\(\large \displaystyle {\cal L}\{\sinh(\alpha t)\}(s)=\frac{1}{2}\lim_{N\to\infty} \left[\frac{e^{(\alpha-s)t}}{\alpha-s}+\frac{e^{-(\alpha+s)t}}{\alpha+s}\right]_{t=0}^{t=N} \) I will paste my question referring to here ****

OpenStudy (idku):

\(\large \displaystyle =\frac{1}{2}\lim_{N\to\infty} \left[\frac{e^{(\alpha-s)t}}{\alpha-s}+\frac{e^{-(\alpha+s)t}}{\alpha+s}\right]_{t=0}^{t=N} \)
\(\large \displaystyle =-\frac{1}{2} \left[\frac{1}{\alpha-s}+\frac{1}{\alpha+s}\right] \)
\(\large \displaystyle =\frac{1}{2} \left[\frac{1}{s-\alpha}-\frac{1}{s+\alpha}\right] \)
\(\large \displaystyle =\frac{1}{2} \left[\frac{(s+\alpha)-(s-\alpha)}{s^2-\alpha^2}\right] \)
\(\large \displaystyle {\cal L}\{\sinh(\alpha t)\}(s)= \frac{\alpha}{s^2-\alpha^2} \)
Just to finish the transform.
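As a sanity check on the closed form, the truncated integral can be evaluated numerically. This is just an illustrative sketch (plain Python; the helper name and sample values are mine, not from the thread), assuming some s > |α|:

```python
import math

def truncated_laplace_sinh(alpha, s, N, steps=200000):
    """Trapezoidal approximation of the integral from 0 to N
    of e^{-s t} * sinh(alpha t) dt."""
    h = N / steps
    # Only the t = N endpoint contributes: the integrand is 0 at t = 0.
    total = 0.5 * math.exp(-s * N) * math.sinh(alpha * N)
    for k in range(1, steps):
        t = k * h
        total += math.exp(-s * t) * math.sinh(alpha * t)
    return total * h

alpha, s = 3.0, 5.0                    # s > |alpha|, so the limit exists
approx = truncated_laplace_sinh(alpha, s, N=40.0)
exact = alpha / (s**2 - alpha**2)      # 3 / (25 - 9) = 0.1875
print(approx, exact)
```

For N = 40 both decaying exponentials are far below machine precision, so the truncated integral is essentially indistinguishable from the limit.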

OpenStudy (idku):

Now, the question ...

OpenStudy (idku):

For the limit to converge, every coefficient of t must be negative (and not zero, because of the denominators that result after integration), and that means:
\(\large \displaystyle \alpha-s<0 \) ---> \(\large \displaystyle s>\alpha \)
\(\large \displaystyle -(\alpha+s) <0 \) ---> \(\large \displaystyle s>-\alpha \)
And so it would be that \(\large \displaystyle s>| \alpha| \).

OpenStudy (idku):

And everywhere I look I keep seeing just \(\large \displaystyle s> \alpha \).

OpenStudy (idku):

(by the way, the same question comes up for \(\large \displaystyle \cosh(\alpha t)\).... just as it does for sinh)

OpenStudy (kainui):

Ahh, I am going to guess that \(\alpha\) is always just assumed to be greater than 0, since its being negative ultimately results in the same answer, like you have there: \(|\alpha| <s\). For instance, if you picked an alpha less than 0, you'd be able to pick \(\alpha = - \beta\) and make it positive: \[\sinh(\alpha t) = \sinh(-\beta t) = -\sinh (\beta t)\] Probably not a very satisfying answer, sorry haha, but at least you seem to be doing a pretty thorough job of self-studying, which is impressive. :D
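Spelling out this point (a quick check, with \(\alpha=-\beta\), \(\beta>0\)): the closed form already absorbs the sign flip, so only the region of convergence needs the absolute value.

```latex
\mathcal{L}\{\sinh(\alpha t)\}(s)
  = \mathcal{L}\{-\sinh(\beta t)\}(s)
  = -\frac{\beta}{s^{2}-\beta^{2}}
  = \frac{\alpha}{s^{2}-\alpha^{2}},
\qquad s > \beta = |\alpha|.
```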

OpenStudy (mimi_x3):

sinh(X)= sin(ix)?

OpenStudy (idku):

Well, I thought the hyperbolic route was easier.

OpenStudy (idku):

Generically, you just have e^(at+b).

OpenStudy (mimi_x3):

It is. I'm trying to determine the hyperbolic equation, but I wanna do it myself. That's how you get sinh from sin, right?

OpenStudy (idku):

It might not work the imaginary way.

OpenStudy (kainui):

Hey, stop trying to hijack idku's question lol

OpenStudy (mimi_x3):

lol sry :D

OpenStudy (idku):

I won't derive Euler's Formula through the Taylor series for e^x; it's pretty obvious without me. \(\large\displaystyle {\cal L}\{\sinh(\alpha t)\}(s)=\lim_{N\to\infty}\int\limits_{0}^{N} e^{-st}(-i)\sin(i\alpha t)\,dt\) (the i actually sits on the outside as \(-i\), since \(\sinh(x)=-i\sin(ix)\))
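For the record, the identity being used here follows directly from the exponential definitions of sin and sinh:

```latex
\sin(ix) = \frac{e^{i(ix)} - e^{-i(ix)}}{2i}
         = \frac{e^{-x} - e^{x}}{2i}
         = i\,\frac{e^{x} - e^{-x}}{2}
         = i\sinh(x),
\qquad\text{so}\qquad \sinh(x) = -i\,\sin(ix).
```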

OpenStudy (idku):

yes, I have seen that previously, while doing Dirichlet's test.

OpenStudy (idku):

(for series conv)

OpenStudy (idku):

But, then you need integration by parts twice to solve for this integral. Not the best way:)

OpenStudy (idku):

In any case, Kainui, I guess, yes, \(\alpha>0\) is the presumption. Otherwise, I don't see a lot of sense....

OpenStudy (kainui):

There's actually a third way of solving this integral without going to Euler's identity or doing integration by parts twice. You can invert a 2x2 matrix haha :P

OpenStudy (idku):

Matrix? I haven't really done anything in Linear Algebra. I read about transformation, linear dependence, of course Gaussian elimination, "does S span R^n" and all introductory stuff...

OpenStudy (idku):

Can you show me?

OpenStudy (idku):

(what do I have to know to view this? I might not be eligible knowledge-wise)

OpenStudy (kainui):

Well, if you can invert a 2x2 matrix, that's about all you need to know. It's sort of a strange trick if you're not used to it, but once you know it, it gets comfortable fast, and you can use it to solve some differential equations as well.

OpenStudy (idku):

Oh, take the inverse of. Yes, I do.

OpenStudy (idku):

I can take the inverse of any square-matrix!

OpenStudy (kainui):

Differentiation is a linear operator, and the span of \(e^{ax}\cos (bx)\) and \(e^{ax}\sin(bx)\) is closed under it; it's a 2D space, if you wish to think of it that way, so you can use these as the basis vectors for your space. So the column vector \[ [ 3,-7]^\top \] represents the function \[3 e^{ax} \cos(bx) -7 e^{ax}\sin(bx)\] So here's where the fun comes in.
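This identification can be written out concretely; a small sketch (Python; the helper name and sample parameters are mine, not from the thread) mapping a coordinate vector to the function it represents:

```python
import math

a, b = 1.0, 2.0   # sample parameters (assumed for illustration)

def as_function(vec):
    """[c1, c2] stands for c1*e^{ax}cos(bx) + c2*e^{ax}sin(bx)."""
    c1, c2 = vec
    return lambda x: math.exp(a * x) * (c1 * math.cos(b * x) + c2 * math.sin(b * x))

f = as_function([3.0, -7.0])   # the example vector [3, -7]^T
print(f(0.0))                  # at x = 0: 3*1 - 7*0 = 3.0
```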

hartnn (hartnn):

Sorry to interrupt, but we got s > a and s > -a; then how did that become s > |a|? For s > |a|, we must have s>a and s< -a, right? Since a is assumed to be positive, s>a will eventually cover s>-a too.

OpenStudy (kainui):

If you know how the basis vectors transform, you can find the derivative matrix, since that's the main power of linear operators. Knowing how they change the basis vectors, you can change any linear combination with the transformation. Once you know that, then you invert that derivative matrix, and that's your integral matrix, since differentiation is the inverse of integration. So multiply this inverted 2x2 integration matrix by the vector representing the function you wanna integrate and that's it... Hahaha maybe easier said than done?

OpenStudy (idku):

Hartnn:
s>a
s>-a
If you just say s>a, then you will get a diverging Laplace for all a<0. If you just say s<a, then you will get a diverging Laplace for all a>0. So, from those two conditions, the domain of s becomes s>|a|. And that was before we concluded that, assumingly, s>a. (And I think they should write s>|a|, without that assumption, but what do I know?)

OpenStudy (idku):

let me fix it.

OpenStudy (idku):

assuming a>0.

OpenStudy (kainui):

I originally thought what @hartnn was thinking too and then at the end of writing my original response I had to delete it cause I saw what @idku was saying

hartnn (hartnn):

What even? :O Where did s<a come from??? Why would we say that? If we had s<-a and s>a, only then would s > |a| hold true! Go to basics, draw a number line if you need to! s>a, s>-a implies s should be > a. That's what I know for sure... I STILL DON'T SEE IT!! omg

OpenStudy (idku):

Look at the limit, where I said "paste the question here" .... The coefficients of t, all, must be negative.

OpenStudy (idku):

(otherwise your laplace won't exist)

hartnn (hartnn):

a-s < 0 ---> a < s, s > a
-(a+s) < 0 ---> a+s > 0 ---> s > -a
s>a AND s>-a ---> s > a >.<

OpenStudy (idku):

Yes, that is assuming that a>0, but my question was before we made this assumption.

hartnn (hartnn):

what will change in what i replied last if a was negative? still s>a would hold good

hartnn (hartnn):

a is negative:
a-s < 0 ---> a < s, s > a
-(a+s) < 0 ---> a+s > 0 ---> s > -a
s>a AND s>-a ---> s > a

OpenStudy (idku):

-(a+s): say, for example, s=7, a=-8 (notice s>a). Then -(-8+7)=1>0, and the limit diverges.
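This counterexample can be checked numerically; a sketch (plain Python, helper name is mine) showing the truncated integral blowing up as N grows when s > α but s < |α|:

```python
import math

def truncated_laplace_sinh(alpha, s, N, steps=100000):
    """Trapezoidal approximation of the integral from 0 to N
    of e^{-s t} * sinh(alpha t) dt."""
    h = N / steps
    total = 0.5 * math.exp(-s * N) * math.sinh(alpha * N)
    for k in range(1, steps):
        t = k * h
        total += math.exp(-s * t) * math.sinh(alpha * t)
    return total * h

alpha, s = -8.0, 7.0   # s > alpha but s < |alpha|: -(alpha + s) = 1 > 0
vals = [truncated_laplace_sinh(alpha, s, N) for N in (5.0, 10.0, 15.0)]
print(vals)            # magnitudes grow roughly like e^N / 2
```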

hartnn (hartnn):

sorry for so much trouble, i just don't see it :P

OpenStudy (idku):

lim N-->∞, e^(-(a+s)N) converges only when -(a+s)<0.
lim N-->∞, e^((a-s)N) converges only when a-s<0.

OpenStudy (idku):

So, for both limits to converge, you need s>|a|, if the assumption a>0 isn't made.

OpenStudy (kainui):

@hartnn if you just say \(\alpha <s\) then this doesn't imply \(-\alpha < s\) since \(\alpha\) could be negative.

OpenStudy (kainui):

Specifically (lol) this could be true: \(\alpha < - \alpha\)

hartnn (hartnn):

alright so for a negative, s>a AND s>-a ---> s >a doesn't hold true...

OpenStudy (kainui):

So that's why I basically just said in my response that they probably just assume \(\alpha>0\). That's because now YOU are saying \(|\alpha| < s\), because of your AND condition.

OpenStudy (idku):

Why do I want to avoid this assumption a>0? Well, sinh(at) is a generic hyperbolic sine function (pretty much, except that it lacks a generic coefficient/1st-deg-poly in front). Why would I want to neglect all the cases where a<0? Just because some "professors" are too lazy to put |..| around a (so that s>|a|, just as it should be for the limits to converge, for all a).

OpenStudy (kainui):

This just went backwards hahaha

OpenStudy (idku):

oh, 0-deg-poly (not 1st)

OpenStudy (idku):

In any case, we have got off track:)

OpenStudy (idku):

We kind of resolved the question already.

OpenStudy (idku):

I need to refresh.

hartnn (hartnn):

You're right, s > |a|. Where did you see s > a only? Wiki says s > |a|: https://en.wikipedia.org/wiki/Laplace_transform

OpenStudy (idku):

Oh, I have looked in the printed sources.

OpenStudy (idku):

Another proof why they suck:) xD

OpenStudy (idku):

So, Kainui, you want to set up a transformation: \(\large \displaystyle T(z_1, z_2) = (z_1e^{ax}\cos(bx)~,~z_2e^{ax}\sin(bx))\)?

OpenStudy (idku):

(using z for variables)

hartnn (hartnn):

its always good to cross-verify with reliable sources like wiki (though sometimes wiki is also not that reliable)

OpenStudy (idku):

Yeah, some of my instructors told me that they purposely posted fallacious info on wiki and waited to see how long it took them to figure it out and fix it. I will tell you, it was a long, long, long while:)

OpenStudy (idku):

Wait, Kainui, so the matrix would be? \[\begin{pmatrix} e^{ax}\cos(bx) & e^{ax}\sin(bx) \\ \frac{d}{dx}e^{ax}\cos(bx) & \frac{d}{dx}e^{ax}\sin(bx)\end{pmatrix}\]

OpenStudy (idku):

The only thing I have done with matrices in calc/DE applications so far is Cramer's Rule for Variation of Parameters, v_1' and v_2'.

OpenStudy (kainui):

Yeah so we have some derivative matrix with arbitrarily filled entries to solve for: \[D=\begin{pmatrix} w & x \\ y & z \end{pmatrix}\] We know the derivative of: \[D (e^{ax} \cos(bx)) = ae^{ax} \cos(bx)-be^{ax} \sin(bx)\] So we can write this in terms of our vectors as: \[\begin{pmatrix} w & x \\ y & z \end{pmatrix} \begin{pmatrix}1 \\ 0 \end{pmatrix}=\begin{pmatrix}a \\ -b \end{pmatrix} \] So that's the first column of our matrix solved for. Try the next column or ask questions if you're lost.

OpenStudy (kainui):

Yeah you're on the exact right path I already had this typed up halfway and didn't see you had started so I kinda reexplained some stuff you're already figuring out haha

OpenStudy (idku):

No, I just guessed that, because that is a similar set up for det(W) in Var of Par.

OpenStudy (idku):

\[\begin{pmatrix} e^{ax}\cos(bx) & e^{ax}\sin(bx) \\ ae^{ax}\cos(bx)-be^{ax}\sin(bx) & ae^{ax}\sin(bx)+be^{ax}\cos(bx)\end{pmatrix}\]

OpenStudy (idku):

Like that?

OpenStudy (kainui):

Yeah, like that. This is our linear transformation of the derivative from our basis onto itself again.

OpenStudy (idku):

And what do I do to this matrix now?

OpenStudy (kainui):

I am just showing a way to solve for the entries of this matrix. Once you have it, you invert it to get the 'integration' matrix since you can invert a linear operator like this. Do you have the derivative matrix yet?

OpenStudy (kainui):

I'll just post it again; these are the steps I'm using to find the first column of our derivative matrix. --- Yeah so we have some derivative matrix with arbitrarily filled entries to solve for: \[D=\begin{pmatrix} w & x \\ y & z \end{pmatrix}\] We know the derivative of: \[D (e^{ax} \cos(bx)) = ae^{ax} \cos(bx)-be^{ax} \sin(bx)\] So we can write this in terms of our vectors as: \[\begin{pmatrix} w & x \\ y & z \end{pmatrix} \begin{pmatrix}1 \\ 0 \end{pmatrix}=\begin{pmatrix}a \\ -b \end{pmatrix} \] So that's the first column of our matrix solved for. Try the next column or ask questions if you're lost.
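To close the loop on this trick, here is a sketch of the full computation (Python; variable names and the sample check are mine, not from the thread): build the derivative matrix in the basis \(\{e^{ax}\cos(bx),\ e^{ax}\sin(bx)\}\), invert it, and read off an antiderivative of \(e^{ax}\cos(bx)\).

```python
import math

a, b = 2.0, 3.0   # sample parameters (assumed, not from the thread)

# Derivative matrix in the basis f1 = e^{ax}cos(bx), f2 = e^{ax}sin(bx):
# d/dx f1 = a*f1 - b*f2 and d/dx f2 = b*f1 + a*f2, so the columns are
# the coordinate images of the basis vectors.
D = [[a, b],
     [-b, a]]

# Invert the 2x2 matrix -- this is the 'integration' matrix.
det = D[0][0] * D[1][1] - D[0][1] * D[1][0]   # = a^2 + b^2
Dinv = [[ D[1][1] / det, -D[0][1] / det],
        [-D[1][0] / det,  D[0][0] / det]]

# Antiderivative of f1: apply Dinv to [1, 0]^T (first column of Dinv).
c1, c2 = Dinv[0][0], Dinv[1][0]   # = a/(a^2+b^2), b/(a^2+b^2)

def F(x):
    """Antiderivative of e^{ax}cos(bx) found via the matrix trick."""
    return math.exp(a * x) * (c1 * math.cos(b * x) + c2 * math.sin(b * x))

# Cross-check against a direct trapezoidal integral of e^{ax}cos(bx) on [0, 1].
steps = 100000
h = 1.0 / steps
g = lambda x: math.exp(a * x) * math.cos(b * x)
numeric = (0.5 * (g(0.0) + g(1.0)) + sum(g(k * h) for k in range(1, steps))) * h
print(F(1.0) - F(0.0), numeric)   # the two values should agree closely
```

This recovers the standard formula \(\int e^{ax}\cos(bx)\,dx = \frac{e^{ax}(a\cos(bx)+b\sin(bx))}{a^2+b^2}\) with no integration by parts at all.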

OpenStudy (idku):

Alright, I will look at this. I think I need to study transformations more before doing this particular example. Also, it's 3am here, kind of tired. Kainui, thank you \(\displaystyle \lim_{x\to\infty} !^x\)

OpenStudy (idku):

Have a good morning!

OpenStudy (idku):

gtg

OpenStudy (kainui):

Thanks you too lol

OpenStudy (idku):

:)
