Question:
If it takes 40 minutes to travel 20 miles, what is the average speed in miles per hour?

Explanation:
Forty minutes is \(\frac{40}{60}\), or \(\frac{2}{3}\), of an hour. Because average speed is distance divided by time, divide 20 miles by \(\frac{2}{3}\) hour to obtain 30 mph. It's worth noting that 60 mph is a mile a minute; here, covering 20 miles in 40 minutes is half a mile a minute, so the average speed was 30 mph.
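Written out in display form, using only the values given above, the calculation is:
\[
\text{average speed} = \frac{\text{distance}}{\text{time}} = \frac{20\ \text{miles}}{\frac{2}{3}\ \text{hour}} = 20 \times \frac{3}{2} = 30\ \text{mph}.
\]
Dividing by a fraction is the same as multiplying by its reciprocal, which is why \(20 \div \frac{2}{3}\) becomes \(20 \times \frac{3}{2}\).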