A question on exact DEs. Suppose I have h'(y) (after solving for Psi_x), but h'(y) depends on x. Can that happen, or should I treat x as a constant as usual? In all the examples I've seen so far, the x's cancel out, but I'm not sure whether that generalizes.
http://www.khanacademy.org/math/differential-equations/v/exact-equations-example-3 for instance, at 3:10 of this video. The x's cancel out when you take the partial derivative with respect to y and compare with \(N_x\).
Can you please post your problem so we can see what is going on?
I have this linear equation: \[x^2\frac{dy}{dx} + xy = 1 \]I know this is separable and I have already solved it. But I was trying to practice using an integrating factor to make it exact. I rewrote the equation as: \[x^2dy + (xy - 1)dx = 0 \]I found \(\mu(x) = x^{-1} \) and it seems to work: \[xdy + (y - \frac{1}{x})dx = 0 \]The derivative of the first coefficient with respect to x is 1, and of the second with respect to y is also 1. That implies that \(\Psi (x,y)_x = x \implies \Psi(x,y) = \frac{1}{2} x^2 + h(y)\). Differentiating with respect to y yields: \(\Psi (x,y)_y = h '(y) = y - \frac{1}{x}\). My question is: can h' depend on x like that, or does it mean I can't solve the DE this way?
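The integrating-factor claim above can be sanity-checked symbolically. Here's a minimal SymPy sketch (SymPy itself is my choice of tool, not part of the thread): it shows the original form fails the exactness test \(M_y = N_x\), and that multiplying through by \(\mu = 1/x\) fixes it.

```python
import sympy as sp

x, y = sp.symbols('x y', positive=True)

# Original form M dx + N dy = 0, with M = x*y - 1 and N = x**2.
M, N = x*y - 1, x**2
print(sp.diff(M, y), sp.diff(N, x))  # x vs 2*x: not exact

# After multiplying through by the integrating factor mu = 1/x:
mu = 1/x
M2 = sp.expand(mu*M)    # y - 1/x
N2 = sp.simplify(mu*N)  # x
print(sp.diff(M2, y), sp.diff(N2, x))  # both 1: exact now
```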
@amistre64 ? :-D
it looks like a Cauchy-Euler equation
assume x^r is a solution and plug it in
What I did first was I divided by x to get \[x \frac{dy}{dx} + y = \frac{1}{x}\]Then I said that \( \frac{dx}{dx} = 1 \) and rewrote the equation as: \[x \frac{dy}{dx} + y\frac{dx}{dx} = \frac{1}{x} \implies \frac{1}{dx} (xdy + ydx) = \frac{1}{x} \implies \frac{d}{dx}(xy) = \frac{1}{x}\]And solved it like that :-). But my question is about the exact diff. eq. right below. And thanks for helping me out.
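That product-rule route gives \(xy = \ln x + C\). Substituting back into the original equation confirms it; a small SymPy check (the symbol C is just the arbitrary constant):

```python
import sympy as sp

x = sp.symbols('x', positive=True)
C = sp.symbols('C')

# Candidate solution from d/dx(x*y) = 1/x, i.e. x*y = ln(x) + C.
y = (sp.log(x) + C) / x

# Plug into x**2 * y' + x*y - 1; should simplify to zero.
residual = sp.simplify(x**2 * sp.diff(y, x) + x*y - 1)
print(residual)  # 0
```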
\(x^2\,dy + (xy - 1)\,dx = 0\): here \(M_y = x\) but \(N_x = 2x\), so it's not exact
I know. The integrating factor is \( \mu (x) = x^{-1} \) :-) At least, the one I found.
I'm not conversant with all diffyQs, so if you found something that works, kudos ;)
But do you remember that when you solve an exact diff. equation, you have to differentiate the function with respect to y after integrating with respect to x? Assuming the solution is a function of x and y, that is.
I recall exacts, yes; and I was reading up the other night on line integrals that looked similar to them. But I still haven't wrapped up a good concept for them yet
Hmm. My question is: after you take the derivative with respect to y, do all the x's have to disappear from the function? Or can I end up with something like the above: h'(y) = y - 1/x?
M dx + N dy can be seen as a dot product of (M, N) with (dx, dy)
I don't think all the x's have to disappear, but that might be my novice view getting in the way
http://archive.org/details/IntroductionToLineIntegrals and a lot of other good stuff there as well
I don't think so either, but I've had the (bad) luck that in every example I've done so far the x's cancelled out after taking the partial, so seeing something like that makes my spider sense tingle
Thanks for the link, mate :-)
\[ \Psi (x,y)_x = y - \frac 1 x \\ \Psi (x,y)= x y -\ln x + h(y)\\ \Psi (x,y)_y= x + h'(y)= x \implies h'(y) = 0\\ h(y) = k \\ \text { Hence } \Psi (x,y)= x y -\ln x + k \] Note that \(\Psi (x,y)_x \) is the coefficient of dx, not of dy
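To see the bookkeeping in that derivation, the partials of \(\Psi = xy - \ln x\) really do reproduce the dx and dy coefficients of \(x\,dy + (y - \frac{1}{x})\,dx = 0\). A quick SymPy check:

```python
import sympy as sp

x, y = sp.symbols('x y', positive=True)

Psi = x*y - sp.log(x)
print(sp.diff(Psi, x))  # y - 1/x  -> the dx coefficient
print(sp.diff(Psi, y))  # x        -> the dy coefficient
```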
Ah, that makes sense. :-) In general, h(y) cannot depend on x, right?
yes, if we have exactness.
Thanks for the help :-) This doubt had been bothering me for some time.
yw