Two identical 100 kg objects (of negligible size) are in space 100 m away from each other. The force of gravity causes both objects to accelerate towards one another. Find the time it takes for the objects to collide.
\[\Large f=ma\]\[\Large s=ut+\frac12at^2\]
with s=50, u=0, f=9.8, m=100
I know that the acceleration (in terms of r) is: \[a(r)=G \frac{100}{r^2}\]where r is the distance between the objects. I can find the velocity in terms of r by integrating that; I get: \[v(r)=\frac{-300G}{r^3}+C\] Where do I go from here?
@kc_kennylau the problem deals with non-constant acceleration, so the second equation doesn't work (and f = 9.8 would be Earth's surface gravity g, not the gravitational force between these two objects).
lolz ok, I don't know how to do it xP I'm only in Grade 9 xP
thanks for trying.
no problem
oops... \[v(r)=-\frac{100G}{r}+C\]
Sorry, too lazy to write out the full solution, but here you go: using r as the separation, the relative acceleration is d²r/dt² = −2·G·100/r² (each object accelerates at G·100/r², and they approach from both sides). Solve that differential equation with the initial conditions r = 100 and dr/dt = 0 at t = 0; your answer is the time t at which r reaches 0.
Thanks for the reply. How do you use differential equations on the second derivative?
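One standard trick when t doesn't appear explicitly (a sketch, writing r for the separation and noting both objects move, which doubles the relative acceleration): treat v as a function of r and apply the chain rule,\[\frac{d^2r}{dt^2}=\frac{dv}{dt}=\frac{dv}{dr}\frac{dr}{dt}=v\frac{dv}{dr}\]This also shows why integrating a(r) over r gives v^2/2 rather than v:\[\int a(r)\,dr=\int v\,dv=\frac{v^2}{2}\]With v dv/dr = −200G/r² and v = 0 at r = 100,\[\frac{v^2}{2}=\frac{200G}{r}-2G\]and then integrating dt = dr/|v| from r = 100 down to r = 0 gives the collision time.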
Try this: http://en.wikipedia.org/wiki/Classical_central-force_problem#One-dimensional_problem It's really more of a physics question than a math problem.
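The linked one-dimensional problem has a closed-form answer. As a quick sanity check (a sketch, assuming the standard radial free-fall result t = (π/2)·√(r₀³/(2GM)) with M the combined 200 kg, since both objects move), here is the formula alongside an independent numerical integration:

```python
import math

G = 6.674e-11   # gravitational constant, m^3 kg^-1 s^-2
m = 100.0       # mass of each object, kg
r0 = 100.0      # initial separation, m
M = 2 * m       # the separation obeys r'' = -G*(m1 + m2)/r^2

# Closed-form radial free-fall time from rest:
# t = (pi/2) * sqrt(r0^3 / (2*G*M))
t_analytic = (math.pi / 2) * math.sqrt(r0**3 / (2 * G * M))

# Independent numerical check: leapfrog integration of r'' = -G*M/r^2,
# stopping just short of contact (the last half metre contributes a
# negligible fraction of the total time).
r, v, t, dt = r0, 0.0, 0.0, 50.0
a = -G * M / r**2
while r > 0.5:
    v += 0.5 * a * dt   # half kick
    r += v * dt         # drift
    a = -G * M / r**2
    v += 0.5 * a * dt   # half kick
    t += dt

print(f"analytic: {t_analytic:.4g} s  (~{t_analytic / 86400:.0f} days)")
print(f"numeric : {t:.4g} s")
```

Both come out near 9.6×10⁶ s, roughly 111 days, which is why the constant-acceleration kinematics above can't work: the pull between two 100 kg masses is minuscule.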