Answer:
1.375 seconds
Step-by-step explanation:
One mile is 5280 feet. A pitch thrown at 30 miles per hour covers 0.5 miles each minute, or one mile every 120 seconds (unfortunately for the pitcher, at that speed I don't think the ball would make it to home plate on the fly). That means it travels 1/120 of a mile each second, which is 5280/120 = 44 feet per second. At 44 feet per second, the ball covers the 60.5 feet to home plate in 60.5/44 = 1.375 seconds.
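Here is a minimal Python sketch of the same unit-conversion arithmetic, assuming the 30 mph pitch speed and the 60.5 ft distance to home plate given in the problem:

```python
# Sketch of the conversion, assuming a 30 mph pitch and the
# 60.5 ft distance from the pitcher's mound to home plate.
FEET_PER_MILE = 5280
SECONDS_PER_HOUR = 3600

speed_mph = 30
distance_ft = 60.5

speed_fps = speed_mph * FEET_PER_MILE / SECONDS_PER_HOUR  # 44.0 ft/s
time_s = distance_ft / speed_fps                          # 1.375 s

print(f"{speed_fps} ft/s -> {time_s} s to reach home plate")
```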
The answer is C; that's all I can give you.
True. EH is the diameter of D.
You have to use the intercept theorem, also known as Thales's theorem, to solve this problem. See the sketch below for how the proportion is typically set up.
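The original figure isn't reproduced here, so the following is only a hedged illustration of how the intercept theorem is usually applied: parallel lines cutting two transversals produce proportional segments, so an unknown length can be solved from the ratio. The segment names and values below are hypothetical placeholders, not taken from the actual problem:

```python
# Hypothetical illustration of the intercept (Thales's) theorem:
# if two transversals through a common point are cut by parallel
# lines, the corresponding segments are proportional: a / b = c / x.
# The numbers below are placeholders, not from the original figure.
a, b, c = 4.0, 6.0, 10.0   # known segment lengths (assumed)
x = b * c / a              # unknown segment solved from a / b = c / x
print(x)                   # 15.0
```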