A baseball is hit into the air at an initial speed of 33.7 m/s and an angle of 44.1° above the horizontal. At the same time, the center fielder starts running away from the batter and catches the ball 0.894 m above the level at which it was hit. If the center fielder is initially 1.22 × 10² m from home plate, what must be his average speed?
Answers & Comments
Verified answer
Short version: 1.49 m/s
Long version:
The first thing we must do is find the ball's horizontal and vertical velocity components. To do this, we set up a right triangle with an angle of 44.1° and a hypotenuse of 33.7 m/s. Using sin(x) = opp/hyp and cos(x) = adj/hyp, we find a horizontal component of 33.7 cos(44.1°) ≈ 24.201 m/s and a vertical component of 33.7 sin(44.1°) ≈ 23.452 m/s.
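The decomposition can be checked numerically. A minimal Python sketch (variable names are mine; note that Python's trig functions take radians, hence the conversion):

```python
import math

v0 = 33.7                   # initial speed, m/s (the hypotenuse)
theta = math.radians(44.1)  # launch angle above the horizontal

vx = v0 * math.cos(theta)   # horizontal (adjacent) component
vy = v0 * math.sin(theta)   # vertical (opposite) component

print(f"vx = {vx:.3f} m/s, vy = {vy:.3f} m/s")
```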
We must now find how much time it takes for the ball to reach its vertical peak, i.e. for its vertical velocity to drop to 0 m/s. Using the formula a = v/t with the values a = 9.8 m/s² (acceleration of gravity) and v = 23.452 m/s (the vertical component from above), we find that t = 23.452/9.8 ≈ 2.393 s.
We must now find how high the ball rises. Using the formula d = ½at² with the values a = 9.8 m/s² (gravity) and t = 2.393 s (from above), we find that d ≈ 28.060 m. We now need the time it takes to fall back down to the player, who catches it 0.894 m above the launch level. This means the ball falls 28.060 − 0.894 = 27.166 m, which takes t = √(2 × 27.166/9.8) ≈ 2.355 s (using the same formula from earlier in this paragraph, solved for t). Adding these times together, we find that the ball takes 4.748 s to rise and fall.
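The rise-and-fall timing above can be sketched the same way, reusing the vertical component computed earlier and assuming g = 9.8 m/s²:

```python
import math

g = 9.8          # m/s^2, acceleration of gravity
vy = 23.452      # m/s, vertical component from above
h_catch = 0.894  # m, catch height above the launch level

t_up = vy / g                                   # time to the apex (from a = v/t)
h_peak = 0.5 * g * t_up**2                      # rise height (from d = 0.5*a*t^2)
t_down = math.sqrt(2 * (h_peak - h_catch) / g)  # fall time down to the catch height
t_total = t_up + t_down

print(f"t_up = {t_up:.3f} s, peak = {h_peak:.3f} m, total = {t_total:.3f} s")
```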
We must now find how far the ball travels horizontally in this time. Using the horizontal component from above (24.201 m/s) and the formula v = d/t, we solve for d and find a distance of 24.201 × 4.748 ≈ 114.906 m. The player starts 122 m from home plate. Assuming he runs along the same line the ball was hit, he must cover 122 − 114.906 = 7.094 m in the 4.748 s the ball takes to rise and fall. Using v = d/t, we get 7.094/4.748 ≈ 1.49 m/s (as I said in the short answer).
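Putting all the steps together, here is an end-to-end sketch. One caveat: carrying full precision throughout gives about 1.496 m/s, while the hand calculation lands on 1.49 m/s because its intermediate values were rounded; the two agree to within that rounding.

```python
import math

v0, theta_deg = 33.7, 44.1  # initial speed (m/s) and launch angle (degrees)
g = 9.8                     # m/s^2, acceleration of gravity
h_catch = 0.894             # m, catch height above the launch level
d_fielder = 1.22e2          # m, fielder's initial distance from home plate

theta = math.radians(theta_deg)
vx = v0 * math.cos(theta)   # horizontal component
vy = v0 * math.sin(theta)   # vertical component

t_up = vy / g                                   # time to the apex
h_peak = 0.5 * g * t_up**2                      # rise height
t_down = math.sqrt(2 * (h_peak - h_catch) / g)  # fall time to the catch height
t = t_up + t_down                               # total flight time

x_ball = vx * t             # horizontal distance the ball covers
run = d_fielder - x_ball    # distance the fielder must cover
speed = run / t             # his required average speed

print(f"average speed = {speed:.3f} m/s")
```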