The time t required to drive a certain distance varies inversely with the speed r. If it takes 2 hours to drive the distance at 30 miles per hour, how long will it take to drive the same distance at 35 miles per hour?
Driving at 30 miles per hour for 2 hours covers 60 miles. Then 60 / 35 = 12/7 ≈ 1.71 hours (about 1 hour 43 minutes).
This is an inverse-proportionality problem: \[\large{t \propto \cfrac{1}{r}}\] \[\implies\large{t = \cfrac{k}{r}}\] From the first part of the problem, \[\large{2 = \cfrac{k}{30}}\tag{1}\] If the required time is \(t'\), then \[\large{t' = \cfrac{k}{35}}\tag{2}\] From equations (1) and (2), you can obtain the answer.
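To complete the steps above: equation (1) gives the constant of proportionality, and substituting it into equation (2) gives the required time. \[\large{k = 2 \times 30 = 60}\] \[\large{t' = \cfrac{60}{35} = \cfrac{12}{7} \approx 1.71 \text{ hours}}\] This agrees with the direct distance calculation in the first answer.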