I'm trying to figure out how to prove that a function is decreasing with a little more rigor than plug-and-chug. The normal go-to would be to take the derivative: if every term is negative, you're certain. But what if the derivative has both negative and positive terms? How do I prove that the function is decreasing over an interval, again, beyond just plugging in points?
E.g., let's say I have: \[f(x) = \frac{1}{x} - \frac{1}{2x^{2}}\] \[\frac{dy}{dx} = \frac{1}{x^{3}} - \frac{1}{x^{2}}\] I can see immediately, by plugging in values, that 1/x^3 will always be less than 1/x^2, and thus the function will be decreasing. How do I prove that this is true for *all* x?
That's not a decreasing function on its whole domain, because it isn't even defined at x = 0. You can, however, find where the function is decreasing using the first derivative test.
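Not a proof, but a quick numeric sanity check of the derivative's sign is easy to run (a minimal Python sketch; the name `fprime` and the sample points are just for illustration):

```python
# Sanity check: sample the derivative dy/dx = 1/x**3 - 1/x**2
# on a few points and record where it is positive (increasing)
# versus negative (decreasing). Evidence, not a proof.

def fprime(x):
    return 1 / x**3 - 1 / x**2

# Points strictly between 0 and 1, and points above 1.
below_one = [0.1, 0.25, 0.5, 0.9]
above_one = [1.5, 2.0, 10.0, 100.0]

# On (0, 1) the derivative is positive: 1/x**3 > 1/x**2 there.
assert all(fprime(x) > 0 for x in below_one)

# On (1, inf) the derivative is negative.
assert all(fprime(x) < 0 for x in above_one)

print("sign checks passed")
```

If any sample had the wrong sign, the `assert` would fail, so this at least catches a botched sign analysis before you write up a proof.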
"1/x^3 will always be less than 1/x^2": what if 0 < x < 1? E.g., when x = 0.5, 1/x^3 = 8 but 1/x^2 = 4.
Sorry, I should specify something: I'm only concerned with the interval [1, infinity).
I already used the first derivative test, as shown. I don't know whether that's satisfactory with respect to the interval, but I feel like it isn't. @agent0smith @sourwing
1/x^3 < 1/x^2 if x > 1, so I feel like that's good enough.
Alright, cool, maybe it's just me being paranoid, lmao.
You can also multiply both sides by x^2 if you like; it's always positive, so it won't affect the inequality: 1/x < 1 if x > 1.
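If it helps, the same idea can be written up as a one-line algebraic proof (a sketch, combining the derivative's two terms over a common denominator):

```latex
% For x > 1, combine the two terms over a common denominator:
\[
  \frac{dy}{dx} = \frac{1}{x^{3}} - \frac{1}{x^{2}}
                = \frac{1 - x}{x^{3}} < 0,
\]
% since 1 - x < 0 and x^{3} > 0 whenever x > 1.
% A derivative that is negative on all of (1, \infty) means the
% function is strictly decreasing on [1, \infty).
```

Factoring the derivative like this is the standard way to make the first derivative test rigorous: each factor's sign is obvious on the interval, so no point-plugging is needed.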