Δt = (r2 - r1) / r2 * t1

where r1 is your old speed, r2 is your new speed, and t1 is your original travel time.
That's right. The time you save by driving faster is proportional to the ratio of the difference in speed to the new speed.
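One way to see where that comes from: the distance d is the same either way. The old trip takes t1 = d / r1, so d = r1 * t1, and the new trip takes

t2 = d / r2 = (r1 / r2) * t1

Subtracting,

Δt = t1 - t2 = (1 - r1/r2) * t1 = (r2 - r1) / r2 * t1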
No matter how far you have to go, it seems, you will only ever save that fraction of your time. For example, if you are driving 30 mph and think it might save time to go 40 mph, you are right, but maybe not as right as you think: since (40 - 30)/40 = 1/4, you will cut your driving time by a quarter.
Suppose the trip is 20 miles. Since d = rt,

t = d / r

At 30 mph: t = 20 miles / 30 mph = 2/3 h = 40 minutes
At 40 mph: t = 20 miles / 40 mph = 1/2 h = 30 minutes

That's 10 minutes saved, exactly a quarter of the original 40.
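If you'd rather let a computer do the arithmetic, here's a minimal Python sketch of the formula (the function name time_saved is my own, not anything from the post):

def time_saved(old_mph, new_mph, trip_hours):
    # Minutes saved by driving new_mph instead of old_mph
    # on a trip that takes trip_hours at the old speed.
    return (new_mph - old_mph) / new_mph * trip_hours * 60

# The 20-mile example: 20 miles at 30 mph is a 2/3-hour trip.
print(time_saved(30, 40, 20 / 30))   # 10.0 minutes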
I had a friend in college whose drive home took 4 hours. He figured the drive was long enough that going 75 mph instead of 70 mph was worth the risk. He thought he was saving a lot of time. We now know he was only saving 1/15 of his time, or about 4 minutes per hour. Over 4 hours, that comes to 16 minutes. He would get home at 3:44 instead of 4:00. He could catch the end of Chip 'n Dale: Rescue Rangers.
If he'd pushed it up to 80 mph, he would have saved 1/8 of his time, or 30 minutes. He could have watched the whole episode.
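The same formula checks the road-trip numbers; here's a quick self-contained loop, with the speeds and trip length taken straight from the story:

for new_mph in (75, 80):
    minutes = (new_mph - 70) / new_mph * 4 * 60
    print(f"70 -> {new_mph} mph saves about {minutes:.0f} minutes")
# 70 -> 75 mph saves about 16 minutes
# 70 -> 80 mph saves about 30 minutes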
So here's the moral: If you're going to drive faster, drive faster.
1 comment:
Unless, of course, you get caught by the cops; then, in spite of gaining time by going faster, the time you spend with the cops getting the ticket will nullify your gain. Let's call it an unknown variable.