I have to solve this in Algebra form: Terry's plane takes off from the Big Town airport to fly 260 miles to visit Terry's mom in Little Town. Fifteen minutes later, Terry's mom starts her twenty mile drive to the Little Town airport to just meet Terry when his flight lands. If mom drives 50 miles per hour, how fast does Terry's plane fly?
Terry's plane averages 400 mph. To solve this with algebra, equate the times for Terry and Mom to reach the Little Town airport, starting the clock for both when Terry takes off. Using t = d/r, let r be Terry's flying speed and solve the following for r: \[\frac{260}{r}=\frac{20}{50}+\frac{15}{60}\]The LHS is Terry's flight time in hours. The RHS is Mom's drive time to the airport (20 miles at 50 mph) plus the quarter hour that elapsed before she got into her car. The RHS works out to \(\frac{2}{5}+\frac{1}{4}=\frac{13}{20}\) hour, so \(r = 260 \cdot \frac{20}{13} = 400\) mph.
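As a quick check, the arithmetic above can be verified with a few lines of plain Python (variable names here are just illustrative):

```python
# Verify: 260 / r = 20/50 + 15/60, solved for r.
mom_drive_hours = 20 / 50        # 20-mile drive at 50 mph -> 0.4 h
head_start_hours = 15 / 60       # Terry's 15-minute head start -> 0.25 h
flight_hours = mom_drive_hours + head_start_hours  # total 0.65 h
r = 260 / flight_hours           # plane's average speed in mph
print(r)                         # -> 400.0
```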