OpenStudy (anonymous):

Two semi trucks are driving loads from Chicago to Denver, a distance of 1125 miles. The first truck leaves at 8 AM and averages 50 mph. The second truck leaves at 9 AM and averages 54 mph. How much would the driver of the first truck have to increase her speed in order to arrive in Denver first? The answer is "at least 1.53 mph", but I don't know how to get to this.

OpenStudy (badhi):

First you can find the time when the second driver reaches Denver (call it \(t'\)). The first driver will have to arrive no later than \(t'\) to fulfill the requirement. If the required speed increase is \(u\), then \((50+u)(t'-8)=1125\). For certain \(u\) should be less than 4, since with an increase of 4 mph the first truck would match the second truck's speed while having left an hour earlier.

OpenStudy (anonymous):

that still doesn't explain how it gets to at least 1.53, @BAdhi

OpenStudy (kropot72):

The time taken for the journey by the second truck is 1125/54 hours. The first truck leaves an hour earlier, so it has \(1+\frac{1125}{54}\) hours of driving time before the second truck arrives. To arrive at the same time as the second truck, the first truck must travel at a speed of: \[\large \frac{1125}{1+\frac{1125}{54}}\ mph\]

OpenStudy (anonymous):

that equals 51.53

OpenStudy (kropot72):

Correct. Therefore the speed of the first truck must increase from 50 mph to more than 51.53 mph, giving a required increase in speed of more than 51.53 - 50 = 1.53 mph.
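The arithmetic above can be checked with a few lines of Python (a minimal sketch; the variable names are just illustrative):

```python
# Sanity check of the required speed increase for the first truck.
DISTANCE = 1125    # miles, Chicago to Denver
SPEED_2 = 54       # mph, second truck's average speed
HEAD_START = 1     # hours: first truck leaves at 8 AM, second at 9 AM

# Driving time available to the first truck if it ties with the second truck:
time_available = HEAD_START + DISTANCE / SPEED_2   # hours

# Speed the first truck needs in order to cover the distance in that time:
required_speed = DISTANCE / time_available

# Increase over the first truck's current 50 mph average:
increase = required_speed - 50

print(round(required_speed, 2))  # 51.53
print(round(increase, 2))        # 1.53
```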

OpenStudy (anonymous):

Oh okay. Thank you!

OpenStudy (kropot72):

You're welcome :)

OpenStudy (anonymous):

Refer to the attached solution from Mathematica 9.
