OpenStudy (anonymous):

Show that assuming \(y=e^{\lambda t}\) we are led to the equation \(a\lambda^2+b\lambda+c=0\).

OpenStudy (anonymous):

I'm assuming this is for Diff Eq? Suppose you have the ODE \[ay''+by'+cy=0\] Assume \(y=e^{\lambda t}\), which gives you \[y'=\lambda e^{\lambda t}\text{ and }y''=\lambda^2e^{\lambda t}\] The ODE then changes to \[a\left(\lambda^2e^{\lambda t}\right)+b\left(\lambda e^{\lambda t}\right)+c\left(e^{\lambda t}\right)=0\\ e^{\lambda t}\left(a\lambda^2+b\lambda +c\right)=0\] \(e^{\lambda t}\neq0\text{ for all }t\) (even when \(\lambda\) turns out to be complex), so you can divide both sides by it, leaving you with \[a\lambda^2+b\lambda +c=0\]
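If it helps, here's a quick sketch of the same substitution done symbolically with sympy, so you can see the characteristic polynomial fall out; the symbol names (`lam`, `a`, `b`, `c`) are just my choices:

```python
import sympy as sp

# Symbols: independent variable t, the unknown exponent lambda,
# and the constant ODE coefficients a, b, c
t, lam, a, b, c = sp.symbols('t lam a b c')

# Trial solution y = e^(lambda * t)
y = sp.exp(lam * t)

# Substitute into the left-hand side of a*y'' + b*y' + c*y = 0
expr = a * y.diff(t, 2) + b * y.diff(t) + c * y

# Dividing out the common (never-zero) factor e^(lambda*t)
# leaves the characteristic polynomial a*lam^2 + b*lam + c
char_poly = sp.simplify(expr / y)
print(char_poly)
```

Running this prints `a*lam**2 + b*lam + c`, matching the quadratic derived above.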

OpenStudy (anonymous):

Yes you are completely correct, it is for Differential Equation. :) Thanks for the help! :D
