An object is dropped from 9 ft above the surface of the moon. How long will it take the object to hit the surface? \(d^2s/dt^2=-5.2\) ft/sec\(^2\). A. 1.861 sec B. 3.462 sec C. 1.316 sec D. 0.865 sec
try solving it
integrate twice then solve for t
Not sure what you mean
do you know how to integrate?
The object's initial position is 9 ft, so if \(s(t)\) describes the position \(s\) at time \(t\), then you have one initial value, \(s(0)=9\). The object is dropped, which means it has no initial velocity. This gives the second initial value \(v(0)=0\), where \(v(t)\) is the velocity \(v\) at time \(t\). The velocity function is the derivative of the position function [\(v(t)=s'(t)\)], and the acceleration function is the derivative of the velocity function, i.e. the second derivative of the position function [\(a(t)=v'(t)=s''(t)\)]. \[s''(t)=-5.2\] Integrating both sides with respect to \(t\) gives \[\int s''(t)~dt=-5.2\int dt\\ s'(t)=-5.2t+C_1\] Integrating again gives the position function itself. You'll have an expression with two unknowns, \(C_1\) and \(C_2\) (if you keep with my notation). You use the initial values to determine the unknowns. Once you find those, you can set up an equation for the time needed to hit the surface of the moon: \[s(t)=0\] and you'd solve for \(t\).
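If you want to check the hand derivation, here's a small sketch of the same steps using SymPy (the variable names `C1`, `C2`, `t_hit` are just my labels, not part of the problem):

```python
import sympy as sp

t = sp.symbols('t', positive=True)
C1, C2 = sp.symbols('C1 C2')

# s''(t) = -5.2; integrate once to get velocity, again to get position
v = sp.integrate(sp.Rational(-26, 5), t) + C1   # s'(t) = -5.2*t + C1
s = sp.integrate(v, t) + C2                     # s(t) = -2.6*t**2 + C1*t + C2

# initial values: v(0) = 0 (dropped, no initial velocity), s(0) = 9
consts = sp.solve([v.subs(t, 0), s.subs(t, 0) - 9], [C1, C2])
s = s.subs(consts)                              # s(t) = 9 - 2.6*t**2

# time to hit the surface: solve s(t) = 0 for t > 0
t_hit = sp.solve(sp.Eq(s, 0), t)[0]
print(round(float(t_hit), 3))                   # 1.861 -> choice A
```

Solving \(s(t)=0\) this way gives \(t\approx1.861\) sec, which matches choice A.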
The other way to approach this problem is as you would a free fall problem in physics, using the formula \[y=\frac{1}{2}gt^2\] where \(g\) is the magnitude of the acceleration due to gravity, \(5.2\) ft/sec\(^2\) in this case, and \(y\) is the distance fallen, \(9\) ft in this case.