For the first 60 miles of a 120-mile trip, a person drives 50 mph. What speed would he have to travel for the last half of the trip so that the average speed for the entire trip would be 61 mph? The answer is supposed to be 78 8/39 mph. How can this be?
Here's my logic: 61 = (50 + x)/2. Cross multiply: 61*2 = 50 + x, so 122 = 50 + x. Subtract 50 from 122 and I get x = 72, but this isn't correct.
I would not use logic. I would use the definition of average speed: total distance / total time = average speed. To use that equation we need some numbers. Time spent on the first half of the trip: rate * time = distance, so 50*x = 60 and x = 60/50 = 1.2 hours. He spends 1.2 hours going the first 60 miles. Next, find the time to go the last 60 miles, using the first equation:
\[ \frac{\text{total distance}}{\text{total time}} = \text{average speed} \qquad \frac{120}{x + 1.2} = 61 \]
Solve for x to get the time it takes to travel the last 60 miles: 61x + 73.2 = 120, so 61x = 46.8 and x = 46.8/61. That is the time it takes to go the second 60 miles, so the speed is 60 miles divided by x:
\[ \text{speed} = 60 \cdot \frac{61}{46.8} = \frac{3660}{46.8} = 78\tfrac{8}{39}\ \text{mph} \]
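Note that the simple average (50 + x)/2 = 61 would only be right if equal times were spent at each speed; here equal distances are driven, so the slower first leg takes up more of the time and drags the average down. As a quick check using the numbers above: 78 8/39 mph = 3050/39 mph, so the second 60 miles take 60 / (3050/39) = 234/305 hours, and
\[ \text{average speed} = \frac{120}{1.2 + \frac{234}{305}} = \frac{120}{\frac{600}{305}} = \frac{120 \cdot 305}{600} = 61\ \text{mph} \]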
@phi Thank you! I didn't think of it that way.