A pilot flies a plane south and then 600 miles west, where she lands the plane. How far south did the pilot fly if she lands 610 miles from her starting point?
Draw a diagram of the flight path and connect it back to the 'miles from starting point': the method of solution will be revealed once you see your drawing.
I don't understand this at all D:
You need to know some basics of right triangles and the Pythagorean theorem: it's a big word, but not that difficult to understand.
I know how to use the Pythagorean theorem, I just don't know how to set up the problem >_<
I would square 600 and 610 right?
Try what I said earlier: going south, draw a line down the page; going west, draw another line.
Then square root the sum of the two numbers?
  |\
  | \
x |  \  610
  |   \
  |____\
   600
a^2 + b^2 = c^2
x^2 + 600^2 = 610^2
x^2 + 360000 = 372100
x^2 = 12100
x = 110 miles
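If you want to check the arithmetic, here is a minimal sketch of the same calculation in Python. Note that 610 is the hypotenuse (the straight-line distance back to the start), so the southward leg comes from subtracting, not adding, the squares:

```python
import math

# The 610-mile distance is the hypotenuse; the 600-mile westward
# leg is one side. Solve x^2 + 600^2 = 610^2 for x.
west = 600
distance_from_start = 610
south = math.sqrt(distance_from_start**2 - west**2)
print(south)  # 110.0
```

This confirms the pilot flew 110 miles south.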