If you rent a car for one day and drive it 100 miles, the cost is $40.00. If you drive it 220 miles, the cost is $46.00. Use a linear function to find out how much you will pay to rent the car for one day if you drive it 300 miles.
If you divide the $6.00 increase by the 120 extra miles, you get 5 cents per mile, so the slope of the line is m = 0.05, or 1/20. We also know it costs $40.00 at 100 miles, so when x = 100, y = 40. Multiplying 100 by 0.05 gives 5, which means the cost goes up $5.00 for every 100 miles. So if you take your original y = 40 ($40.00 for 100 miles) and subtract 5, you get y = 35 at 0 miles, and that gives you the y-intercept (b). Your equation is therefore

y = (1/20)x + 35

Now, if x = 300 miles, then y = (1/20)(300) + 35 = 15 + 35 = 50, so it would cost $50.00 to drive the rental car 300 miles.
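If it helps to check the arithmetic, here is a minimal Python sketch of the same derivation; the function name `rental_cost` is just illustrative and not part of the original problem.

```python
def rental_cost(miles: float) -> float:
    """Cost in dollars to rent the car for one day and drive the given miles."""
    # Two known points from the problem: (100 mi, $40.00) and (220 mi, $46.00).
    x1, y1 = 100, 40.00
    x2, y2 = 220, 46.00
    slope = (y2 - y1) / (x2 - x1)   # (46 - 40) / (220 - 100) = 0.05 dollars/mile
    intercept = y1 - slope * x1     # 40 - 0.05 * 100 = 35 (the fixed daily fee)
    return slope * miles + intercept

print(rental_cost(300))  # 50.0, i.e. $50.00
```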
Well, from the information given, 120 extra miles brings an increase of $6, so each mile costs 6/120 = 5 cents. 100 miles at 5 cents per mile is $5, so the fixed cost in the car rental is $40 − $5 = $35. The function is then C = 0.05M + 35; now substitute M = 300 to find the cost.
So the answer is $50.00?