Two semi trucks are driving loads from Chicago to Denver, a distance of 1125 miles. The first truck leaves at 8 AM and averages 50 mph. The second truck leaves at 9 AM and averages 54 mph. How much would the driver of the first truck have to increase her speed in order to arrive in Denver first?
The answer is "at least 1.53 mph", but I don't know how to get to it.
You have to think about what is going on. (It helps to have already solved similar problems.) The first truck leaves at 8 AM and averages 50 mph. After one hour, at 9 AM, it is 50 miles closer to Denver, so it has 1125 - 50 = 1075 miles to go.
The second truck leaves at 9 AM, averages 54 mph, and has the full 1125 miles to go. Just to get an idea of what is going on, I would figure out how long each truck will take to reach Denver, starting from 9 AM. Use rate × time = distance, or time = distance/rate: the first truck takes 1075/50 = 21.5 hours, and the second truck takes 1125/54 ≈ 20.83 hours. So the second truck will get to Denver first, unless the first truck speeds up.
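In equation form (writing \(t_1\) and \(t_2\) for the two trucks' remaining travel times from 9 AM; the subscripts are just my labels):
\[ t_1 = \frac{1075}{50} = 21.5 \text{ hours}, \qquad t_2 = \frac{1125}{54} = \frac{125}{6} \approx 20.83 \text{ hours}. \]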
If we are the first truck, we say, "we are only allowed 20 and 5/6 hours to go 1075 miles." Now use rate × time = distance. (By the way, 20.8333..., 125/6, and 20 and 5/6 are all the same number.) Solve for the rate r: \[ r \cdot \frac{125}{6} = 1075 \] Can you do that?
51.6?
Yes, that is what I got. The first truck was going 50 mph and has to speed up to 51.6 mph, so it must increase its speed by 1.6 mph. I don't see how the answer can be 1.53 mph, though.
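For the record, here is that solve step written out:
\[ r = 1075 \cdot \frac{6}{125} = \frac{6450}{125} = 51.6 \text{ mph}, \qquad 51.6 - 50 = 1.6 \text{ mph}. \]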
Or maybe they are thinking: what should the new speed be starting immediately, i.e., at 8 AM? Then the first truck has to get to Denver in 20 and 5/6 hours. Can you figure that out?
I'll try. Thank you, though!
Wait! Because the first truck starts 1 hour earlier, it has 21 and 5/6 hours.
So the problem is: distance = 1125 miles, rate unknown, time = 21 and 5/6 hours = 131/6 hours. \[ r \cdot \frac{131}{6} = 1125 \]
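Solving that for r the same way as before:
\[ r = 1125 \cdot \frac{6}{131} = \frac{6750}{131} \approx 51.53 \text{ mph}, \qquad 51.53 - 50 \approx 1.53 \text{ mph}, \]
which matches the given answer of "at least 1.53 mph".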