If a ball is thrown at 30 degrees above the horizontal at a speed of 30 m/s, starting 1.85 m above the ground, with gravity g = 9.81 m/s^2:
A) How can you calculate the time until it hits the ground?
B) How far does it go horizontally before hitting the ground?
C) What is its speed when it hits the ground, and at what angle?

My try was this:
A) 20 m/s * sin(30°) = 10 m/s
1.85 + 10t - (1/2)(9.81)t^2 = 0, so t ≈ 1.554 s
B) 20 m/s * cos(30°) ≈ 17.32 m/s
17.32 m/s * 1.554 s ≈ 26.92 m
C) 10 m/s - (9.81 m/s^2)(1.554 s) ≈ -5.24 m/s
sqrt((5.24 m/s)^2 + (17.32 m/s)^2) ≈ 18.1 m/s
sin(17.32/18.1) = ?
Could anyone help with this?
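For anyone checking part A numerically, here is a minimal Python sketch, assuming the launch speed is 20 m/s (the value used in the attempt above, not the 30 m/s in the statement). The key step is keeping the factor of t on the 10t term and solving the full quadratic:

```python
import math

g = 9.81                      # gravity, m/s^2
v0 = 20.0                     # assumed launch speed, m/s
angle = math.radians(30)      # launch angle above horizontal
h0 = 1.85                     # launch height, m

vy0 = v0 * math.sin(angle)    # initial vertical velocity = 10 m/s
# Height equation: h0 + vy0*t - 0.5*g*t^2 = 0
# Rearranged as a*t^2 + b*t + c = 0 with:
a, b, c = 0.5 * g, -vy0, -h0
t = (-b + math.sqrt(b * b - 4 * a * c)) / (2 * a)  # positive root only
print(round(t, 2))            # time of flight, seconds
```

With these assumed numbers the positive root comes out near 2.21 s, not 1.554 s, because the 10 m/s term has to be multiplied by t before solving.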
Does the ball have a speed of 30 m/s or 20 m/s? In all of your calculations you use 20, but the problem as you wrote it states 30.
Also, in the last line you want to be using arcsin, not sin. Note that arcsin(vx/v) gives the angle measured from the vertical; arctan(|vy|/vx) gives the angle below the horizontal directly.
I meant 20 m/s, and I meant arcsin, so I'd get 73.1 degrees. Is that correct though?
I get 2.21 for the time
seconds, of course
then I get 38.3 for the distance
I then get -11.67 for the final velocity
I can tell that is right because it is negative, or downward
It is also slightly greater in magnitude than the initial 10 m/s, because the ball starts above the ground
oh, and the distance was in the x direction, and the final velocity was in the y direction
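Putting the whole solution together in one place, here is a short Python check, again assuming v0 = 20 m/s; it computes the time of flight, horizontal distance, final vertical velocity, impact speed, and angle below the horizontal:

```python
import math

g, v0, h0 = 9.81, 20.0, 1.85                 # gravity, assumed speed, launch height
vx = v0 * math.cos(math.radians(30))          # horizontal velocity, constant (~17.32 m/s)
vy0 = v0 * math.sin(math.radians(30))         # initial vertical velocity (10 m/s up)

t = (vy0 + math.sqrt(vy0**2 + 2 * g * h0)) / g  # positive root of 0.5*g*t^2 - vy0*t - h0 = 0
x = vx * t                                    # horizontal distance traveled
vy = vy0 - g * t                              # final vertical velocity (negative = downward)
speed = math.hypot(vx, vy)                    # impact speed
angle = math.degrees(math.atan2(-vy, vx))     # impact angle below the horizontal

print(round(t, 2), round(x, 1), round(vy, 2), round(speed, 1), round(angle, 1))
```

With these numbers the ball lands after about 2.21 s, about 38.3 m downrange, hitting at about 20.9 m/s at roughly 34.0 degrees below the horizontal.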