Suppose \(f(x)\) satisfies \[f''(x)=-f(x).\] Show that \[(f(x))^2+(f'(x))^2=\text{constant}\] for all \(x\).
@SithsAndGiggles
This question is evil.
Have you taken differential equations yet? That'd probably be the easiest way to figure this one out.
No, this is for Calc 1.
I doubt this is gonna be on the final I have in an hour.
We didn't go over it in class, but it's on the review.
Maybe it's a matter of pattern recognition, in which case you should recall that \(\dfrac{d}{dx}\sin x=\cos x\) and \(\dfrac{d^2}{dx^2}\sin x=-\sin x\), which satisfies the condition \(f''(x)=-f(x)\). The same applies for \(f(x)=\cos x\).
As well as for \(f(x)=\sin x+\cos x\).
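For a concrete check, take \(f(x)=\sin x\), so \(f'(x)=\cos x\). Then
\[(f(x))^2+(f'(x))^2=\sin^2 x+\cos^2 x=1,\]
which is indeed constant.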
So the constant is 1?
Ya, interesting.
In this case, yes, but you can see that the condition is met for all \(k\) with \(f(x)=k(\sin x+\cos x)\).
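Note that the constant itself depends on which \(f\) you pick. For \(f(x)=k(\sin x+\cos x)\), we have \(f'(x)=k(\cos x-\sin x)\), so
\[(f(x))^2+(f'(x))^2=k^2\left[(\sin x+\cos x)^2+(\cos x-\sin x)^2\right]=2k^2,\]
which is still constant, but equals \(1\) only when \(k=\pm\tfrac{1}{\sqrt{2}}\).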
The question is how to arrive at these functions for \(f\). The DE way is easy enough, but it's probably beyond what you've covered if you haven't been exposed to differential equations yet.
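That said, for a Calc 1 proof you don't actually need to find \(f\) at all: just differentiate the quantity in question and use the hypothesis. Let \(g(x)=(f(x))^2+(f'(x))^2\). Then by the chain rule,
\[g'(x)=2f(x)f'(x)+2f'(x)f''(x)=2f'(x)\bigl(f(x)+f''(x)\bigr)=2f'(x)\bigl(f(x)-f(x)\bigr)=0,\]
so \(g\) has zero derivative everywhere and is therefore constant for all \(x\).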
It seems like a pretty unique question, so at least I know the answer. Thanks!