let c and s both be continuous functions. if \[s'(x)=c(x),\qquad c'(x)=-s(x)\qquad\text{and}\qquad s(0)=0,~c(0)=1\] prove that \[[s(x)]^2+[c(x)]^2=1\]
i know these are sin and cos, i'm stuck on how to show that
sin and cos are the functions satisfying these conditions
i was thinking of adding the two derivatives
\[\begin{cases}s'(x)=c(x)&(1)\\c'(x)=-s(x)&(2)\end{cases}\] Differentiating (1) gives \(s''(x)=c'(x)\). Substituting into (2) gives \[s''(x)=-s(x)~~\iff~~s''(x)+s(x)=0\] which is a second-order ODE with characteristic equation \(r^2+1=0\) and roots \(r=\pm i\), which generate the fundamental solution set \(\{\cos x,\sin x\}\), like you said. Alternatively, you can solve this as you would a system of ODEs. \[\begin{pmatrix}s(x)\\c(x)\end{pmatrix}'=\begin{pmatrix}0&1\\-1&0\end{pmatrix}\begin{pmatrix}s(x)\\c(x)\end{pmatrix}\] The coefficient matrix has eigenvalues \(\lambda=\pm i\), which give eigenvectors \(\vec{\eta}=\begin{pmatrix}\mp i\\1\end{pmatrix}\), again giving you the same solution set.
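For completeness, if you did want to finish the DE route: the general solution of \(s''(x)+s(x)=0\) is \(s(x)=A\cos x+B\sin x\), and the initial data pin the constants down, since \(s(0)=0\) forces \(A=0\) and \(s'(0)=c(0)=1\) forces \(B=1\), so \[s(x)=\sin x,\qquad c(x)=s'(x)=\cos x\] and the identity reduces to \(\sin^2x+\cos^2x=1\). That argument leans on the existence-uniqueness theorem for the ODE, though, which is heavier machinery than the problem probably intends.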
i get what you did, but how would we prove the identity? (this is not a DE question, it's a calc 2 problem)
you know, i kind of went down that DE route too but idk how it could help that way
my prof said to do implicit differentiation on the identity we want to prove, and some other stuff, but i disagree with him: we need to prove that it holds, so we can't use the identity itself, we need to arrive at it
Suppose we try a reverse approach to what your professor suggested. \[\begin{align*}\begin{cases}s'(x)=c(x)\\c'(x)=-s(x)\end{cases}~~\implies~~-s(x)s'(x)&=c(x)c'(x)\\ -2s(x)s'(x)&=2c(x)c'(x)\\ 0&=2c(x)c'(x)+2s(x)s'(x)\\ 0&=\frac{d}{dx}\left[(c(x))^2+(s(x))^2\right]\\ C&=(c(x))^2+(s(x))^2 \end{align*}\] Now when \(x=0\), you have \[\begin{align*}C&=(c(0))^2+(s(0))^2\\ C&=1^2+0^2\\ C&=1\end{align*}\] so \((c(x))^2+(s(x))^2=1\) for all \(x\).
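Note that this only uses the chain rule and the fact that a function whose derivative is zero everywhere must be constant, so it stays within calc 2 territory. And as a quick sanity check against the pair you expected: \[s(x)=\sin x,~~c(x)=\cos x:\qquad\frac{d}{dx}\left[\sin^2x+\cos^2x\right]=2\sin x\cos x-2\cos x\sin x=0\]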
yes, that's good! that's what i was looking for. i thought of summing both equations but didn't go further
thanks
yw
i was trying a similar problem hehe \[f(x+y)=f(x)f(y)-g(x)g(y)\qquad\text{and}\qquad g(x+y)=f(x)g(y)+g(x)f(y)\] and again i have to prove \[[f(x)]^2+[g(x)]^2=1\]
they are similar, we have f(0)=1 and g(0)=0
i will see how i would solve this one!
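actually i think the values f(0)=1, g(0)=0 are forced by the equations themselves (assuming f and g aren't both identically zero): letting y=0, \[\begin{cases}f(x)=f(x)f(0)-g(x)g(0)\\ g(x)=f(x)g(0)+g(x)f(0)\end{cases}\] i.e. \(f(x)[1-f(0)]+g(x)g(0)=0\) and \(g(x)[1-f(0)]-f(x)g(0)=0\), and if \([f(x)]^2+[g(x)]^2\neq0\) at even one point, the only solution is \(f(0)=1\), \(g(0)=0\)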
I'm not too experienced with functional equations, but if I were to guess you might be able to find a form of a partial derivative if you play around with the terms.
hmm, partial derivatives! unfortunately i forgot how to play with calc 3 stuff, i will see what i can do :) thanks
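For what it's worth, here's the sketch I had in mind, though it assumes \(f\) and \(g\) are differentiable, which the problem doesn't actually state. Differentiate both equations with respect to \(y\) and then set \(y=0\): \[f'(x)=f'(0)f(x)-g'(0)g(x),\qquad g'(x)=g'(0)f(x)+f'(0)g(x)\] If you also knew \(f'(0)=0\) and \(g'(0)=1\) (again, not given), this would collapse to \(f'(x)=-g(x)\), \(g'(x)=f(x)\), which is exactly the system from the first problem with \(c=f\) and \(s=g\), and the same trick would finish it.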
my thought for now is to let x=y and then see what i can pull from that
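i.e. plugging x=y straight in gives \[f(2x)=[f(x)]^2-[g(x)]^2,\qquad g(2x)=2f(x)g(x)\] which look just like the double angle formulas, though i don't see yet how to get the identity out of them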
trying this \[\begin{cases}f(x+y)=f(x)f(y)-g(x)g(y)\\g(x+y)=g(x)f(y)+g(y)f(x)\end{cases} \Longrightarrow \begin{cases}[f(x+y)]^2=[f(x)f(y)-g(x)g(y)]^2\\ [g(x+y)]^2=[g(x)f(y)+g(y)f(x)]^2\end{cases}\]
but this is not using differentiation lol
if i add them up i should get \[[f(x+y)]^2+[g(x+y)]^2=[f(x)+g(x)][f(y)+g(y)]\] since g(0)=0 and f(0)=1 then we get 1
eh but there's a flaw, i didn't pay attention to the left hand side lol
how about using differentiation from that point?
eh that must be incorrect: if i let y=0 i get \[[g(x)]^2+[f(x)]^2=f(x)+g(x)\] which wouldn't make any sense, darn
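The cross terms actually cancel when you add the squares, so the right hand side comes out as a product rather than \([f(x)+g(x)][f(y)+g(y)]\): \[\begin{align*}[f(x+y)]^2+[g(x+y)]^2&=[f(x)f(y)-g(x)g(y)]^2+[g(x)f(y)+g(y)f(x)]^2\\ &=\left([f(x)]^2+[g(x)]^2\right)\left([f(y)]^2+[g(y)]^2\right)\end{align*}\] So if you set \(h(x)=[f(x)]^2+[g(x)]^2\), you get \(h(x+y)=h(x)h(y)\) with \(h(0)=1\), which is consistent with \(h\equiv1\) and seems like the natural place to bring differentiation back in, like you suggested.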
hello maki san :)
by the way, the question we are doing is in the 14th comment
@Kainui have a look at this :)