Will Speed Limits Be A Thing Of The Past Once The Driverless Car Era Arrives

Dr. Lance Eliot, AI Insider


I feel the need, the need for speed.

But, then again, some say that speed kills.

We have speed limits on our roadways, by and large (Autobahn exceptions noted).

There are advocacy groups that say our existing speed limits are too high. They argue that we need to bring down the speed limits to slower and more “reasonable” speeds.

On the other side of this coin, there are advocacy groups that say we need to raise the speed limits. Indeed, some of those advocating faster speeds are even prone to saying that we should not have any maximum speed limit at all. In a kind of capitalistic viewpoint, they argue that driving speed should be a “free market,” allowing anyone to drive as fast as they want. To them, having restrictions on our driving speed is akin to restricting our freedom of speech.

Nowadays, there are those that argue that the gas consumption curves of years ago, which were used to set our maximum speed limits, are no longer applicable to the fuel efficiencies of modern-day gas-powered cars.

Furthermore, we are seeing more cars becoming electrically powered and the question is raised as to whether the gas fuel charts make any sense when our cars are powered instead by electricity.

Here are some factors commonly used to argue for setting speed limits:

  • Can reduce roadway casualties due to car accidents and collisions that happen when going fast
  • Can lessen the physical impact to our roadways that makes the roads spotty and full of potholes
  • Can improve our air quality by reducing the amount of exhaust pollutants that occur when going fast
  • Can make our roadways safer for non-car traffic such as bicyclists and pedestrians
  • And can presumably be more fuel efficient (had to list it)

Each of those factors is readily open to discussion and debate.

Like most states in the United States, California has a so-called “basic speed law” that refers to the notion that you are supposed to drive as a “reasonable man” would drive (the word “man” here is meant to indicate mankind, as in both men and women). You are not supposed to drive faster than is safe for whatever the current roadway conditions are.

AI Self-Driving Cars Need to be Savvy About Speed Limits

What does this have to do with AI self-driving driverless autonomous cars?

At the Cybernetic AI Self-Driving Car Institute, we are developing AI systems for self-driving cars and this includes being savvy about speed limits.

Let’s consider some of the ramifications about speed limits and self-driving cars.

Those that believe in a Utopian world of all self-driving cars contend that once we have all self-driving cars, we can then do something about our speed limits. I claim that won’t happen for a very long time, since we currently have 200+ million conventional cars in the US and those aren’t going away overnight, but let’s entertain the idea anyway.

I say this because suppose we take the human out of the equation for why we set speed limits.

If you assume that the human won’t be driving any cars on our roadways, we no longer need to be concerned about a human drunk driver. If we also assume that the AI will be able to drive the car as well as or even better than a proper human driver, we will presumably have many fewer car accidents. For those that falsely believe we will end up with zero accidents, I point out that there will still be accidents involving cars that hit debris in the roadway, or that blow out a tire and veer into another lane, or that hit a pedestrian that steps onto the roadway unexpectedly, and so on. We aren’t going to have zero fatalities, and I’d ask that people stop claiming that it will happen.

So, given the aforementioned caveats, in theory we could increase the speed limits since we no longer have the human frailties of driving.

What about the other factors that we had covered earlier?

It was suggested that higher speeds harm the roadways in terms of physical impacts to the roadway. There’s nothing magical about self-driving cars that will overcome this. Now, as mentioned, cars are getting better in terms of their design, and a better-designed car, conventional or self-driving, will generally extract less of a toll on the physical roadways.

In terms of improving air quality, a self-driving car doesn’t really impact this aspect per se; it’s how the car is designed that will make a difference, conventional or self-driving. The same generally goes for the topic of fuel efficiency. The point being that even if a self-driving car is a better driver than a human, going at high speeds will still produce the same amount of pollutants and be as fuel inefficient as if a human driver were driving, and so we would need to decide how important those factors are (having taken the human driver frailties off the table).

This brings us to the factor that our roadways would be safer at lower speeds due to the greater chance of avoiding non-car traffic accidents, such as with bicyclists and pedestrians. The question then is whether an AI self-driving car, if allowed to go at speeds faster than our existing speed limits, would be able to avoid these non-car traffic accidents at a better rate than a human driver would.

You might say that the AI system would hopefully be less likely than humans to get into these non-car accidents. This is based on the assumption of human frailties, namely that humans are apt to be sleepy when driving and fail to see a pedestrian, or be drunk, or otherwise not be as “flawless” as a machine would be. But, we need to be careful in assuming that the AI and the sensors of the self-driving car are of necessity more astute than a human driver and their human senses and human mind. It’s a science fiction type of assumption that the AI and the sensors will work flawlessly, all of the time, and in every way.

In fact, it is unlikely that the AI and the sensors will be perfect machines.

Does it make sense to increase the speed limit in a world of self-driving cars?

We are going to have a mixture of human driven cars and self-driving cars for many decades, and so if we do increase the speed limit it would imply that human drivers could go faster too. In that case, we’re back to the frailties of human drivers and what happens when they are allowed to drive faster. Some say that we could devote lanes to self-driving cars, and separate the human drivers from the self-driving cars. Imagine if you are driving on the freeway, and the HOV lane is reserved for self-driving cars. And, suppose further that those self-driving cars are allowed to go at a speed of 100 mph, or maybe at whatever speed they can go.

This could be a dangerous situation. Human drivers might veer into this special lane and cause a wreck. And when self-driving cars try to get into the lane, it will be hard for them to reach the prevailing speeds of that lane while merging from the conventional-speed lanes.

I realize that an all self-driving car situation would allow for the self-driving cars to coordinate their efforts. Using V2V (vehicle-to-vehicle communications), the self-driving cars could let each other know when one of them wants to get into a lane or exit from a lane. The other nearby self-driving cars would then be aware and could make room for that maneuver. In that manner, it could be that we would not have any speed limit at all. The AI’s of the self-driving cars would just agree to whatever speed seems to make sense for the prevailing situation.
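As a rough illustration only, here is a minimal sketch in Python of how such a V2V lane-change handshake might be modeled. The message names and fields (LaneChangeRequest, the 10 mph tolerance, and so on) are my own assumptions for the sake of the example and are not drawn from any actual V2V standard such as DSRC or C-V2X.

```python
from dataclasses import dataclass

@dataclass
class LaneChangeRequest:
    """Hypothetical V2V broadcast: a car announces it wants to enter a lane."""
    sender_id: str
    target_lane: int
    desired_speed_mph: float

@dataclass
class NearbyCar:
    car_id: str
    lane: int
    speed_mph: float

def coordinate_lane_change(request: LaneChangeRequest, nearby: list[NearbyCar]) -> bool:
    """Each nearby self-driving car in the target lane decides whether it can
    make room; the requester merges only if every affected car acknowledges."""
    affected = [car for car in nearby if car.lane == request.target_lane]
    acks = []
    for car in affected:
        # Simplistic rule: acknowledge if the merging car can roughly match
        # this car's speed (within 10 mph), otherwise decline.
        can_accommodate = abs(car.speed_mph - request.desired_speed_mph) <= 10.0
        acks.append(can_accommodate)
    return all(acks)  # True means the requester is clear to change lanes

# Example: a car wants to merge into the 100 mph self-driving lane at 95 mph.
request = LaneChangeRequest(sender_id="AV-42", target_lane=1, desired_speed_mph=95.0)
traffic = [NearbyCar("AV-7", lane=1, speed_mph=100.0),
           NearbyCar("AV-9", lane=2, speed_mph=65.0)]
print(coordinate_lane_change(request, traffic))  # True in this toy scenario
```

The point of the sketch is simply that coordination replaces a fixed posted limit: the negotiated speed becomes whatever the surrounding cars can mutually accommodate.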

This brings up too that the AI of the self-driving car can be a kind of speed limiter. The AI is going to decide what speed to go. Should it be programmed to never go faster than a set speed limit?

You could contend that AI self-driving cars will always be law abiding. They will always obey the speed limit. Assuming that the AI is indeed programmed for this, and that it works flawlessly in doing so, we are back to our earlier question about whether the speed limit should always apply or not, such as the case of the wife driving her husband to the hospital. If you were in a self-driving car, and your husband was bleeding and dying, and the AI refused to go faster than the posted speed limit, would you be upset that the AI won’t do your bidding and go faster?

If the AI is really sharp, it might be able to dialogue with the occupants and figure out that an emergency is taking place. Perhaps, in that case, the AI would have been pre-programmed that it can go as fast as it can achieve. This kind of notion though is a bit far-fetched and we’d likely need to still have some means of validating that the AI is doing the right thing, maybe an after-the-fact kind of verification.
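To make the idea concrete, here is a minimal sketch of what such a built-in speed limiter might look like, assuming a hypothetical planner that clamps the commanded speed to the posted limit and records any emergency override so it can be verified after the fact. None of the names here come from an actual self-driving stack.

```python
import time

class SpeedGovernor:
    """Hypothetical speed limiter: the AI never commands a speed above the
    posted limit unless an emergency override is active, and every override
    is logged for after-the-fact verification."""

    def __init__(self):
        self.override_log = []

    def commanded_speed(self, desired_mph: float, posted_limit_mph: float,
                        emergency: bool = False, reason: str = "") -> float:
        if emergency:
            # Allow exceeding the posted limit, but keep an audit trail.
            self.override_log.append({
                "time": time.time(),
                "desired_mph": desired_mph,
                "posted_limit_mph": posted_limit_mph,
                "reason": reason,
            })
            return desired_mph
        # Normal case: obey the posted limit.
        return min(desired_mph, posted_limit_mph)

governor = SpeedGovernor()
print(governor.commanded_speed(80, 65))  # 65: clamped to the posted limit
print(governor.commanded_speed(80, 65, emergency=True,
                               reason="passenger medical emergency"))  # 80: logged override
```

Whether such an override should exist at all, and who gets to audit the log, is exactly the policy question raised above.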

Perhaps we should have a governmental regulation that indicates how self-driving cars and their AI are to behave with regard to speed limits. Some would argue to keep the government out of such things, and instead maybe there should be an industry-based association that sets standards for this kind of thing. Others might say that the marketplace should decide, and leave it to each automaker to provide whatever they believe the market most wants.

Speed limits can presumably become digitally-based in the future. We could still have posted speed limit signs, but they would be a digital display rather than a painted sign. This would allow for the speed limit to be changed at any time, based on the time of day, the prevailing traffic conditions, and so on. An AI self-driving car could “read” the digital sign by using an on-board camera sensor. Or, the digital sign could emit a signal that transmits the posted speed limit.
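Purely as a sketch of how those two sources might be combined, the AI could prefer a transmitted speed limit when one is received and fall back to whatever the camera-based sign reader detected. The function and parameter names below are made up for illustration.

```python
from typing import Optional

def resolve_speed_limit(camera_reading_mph: Optional[float],
                        broadcast_reading_mph: Optional[float],
                        default_mph: float = 25.0) -> float:
    """Hypothetical fusion of two readings of a digital speed limit sign:
    a value transmitted by the sign itself and a value read by the on-board
    camera. Prefer the broadcast, fall back to the camera, then a default."""
    if broadcast_reading_mph is not None:
        return broadcast_reading_mph
    if camera_reading_mph is not None:
        return camera_reading_mph
    # Neither source available: fall back to a conservative default.
    return default_mph

print(resolve_speed_limit(camera_reading_mph=55.0, broadcast_reading_mph=None))  # 55.0
print(resolve_speed_limit(camera_reading_mph=55.0, broadcast_reading_mph=50.0))  # 50.0
```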

If we did have all self-driving cars, we could do away with the speed limit signs altogether. In theory, the speed limit could be a virtual speed limit, and the use of a V2I (vehicle-to-infrastructure) system would communicate the speed limit to the self-driving cars. As self-driving cars drive down a particular street, the V2I lets them know that since it is a road next to a school and since it is morning time when the kids are arriving, the speed limit is 5 mph. Once the kids are in the classrooms, the V2I communicates to any passing self-driving cars that the speed limit has been raised to 15 mph. And so on.
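Here is a minimal sketch of how a V2I roadside unit might serve such a virtual, time-varying speed limit for a school zone. The schedule, the limits, and the overall shape of the message are assumptions for illustration, not any real V2I protocol.

```python
from datetime import time as clock_time

# Hypothetical schedule for one school-zone road segment: (start, end, limit in mph).
SCHOOL_ZONE_SCHEDULE = [
    (clock_time(7, 30), clock_time(8, 30), 5),    # kids arriving
    (clock_time(14, 30), clock_time(15, 30), 5),  # kids leaving
]
DEFAULT_SCHOOL_ZONE_LIMIT = 15  # once the kids are in the classrooms

def virtual_speed_limit(now: clock_time) -> int:
    """What a V2I roadside unit might broadcast to passing self-driving cars."""
    for start, end, limit in SCHOOL_ZONE_SCHEDULE:
        if start <= now <= end:
            return limit
    return DEFAULT_SCHOOL_ZONE_LIMIT

print(virtual_speed_limit(clock_time(8, 0)))   # 5 mph during morning drop-off
print(virtual_speed_limit(clock_time(11, 0)))  # 15 mph during class time
```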

We might allow for AI self-driving cars to exceed the speed limit when there’s an emergency, and maybe even when there isn’t an emergency.

Now that you’ve had a chance to think about the nature of speed limits, it hopefully has raised your overall awareness that there is no easy answer as to whether, with AI self-driving cars, we should have higher speed limits, or possibly even no speed limits at all. I would also urge you to realize that the day when we have all self-driving cars and no human driven cars is a long, long, long way in the future. As such, whatever we want to do about speed limits, it will need to be undertaken in a world consisting of both human drivers and AI self-driving cars.

Maybe we’d have AI that when stopped by a police officer and asked whether it saw the speed limit sign, the AI would respond by saying yes, but it doesn’t believe everything it reads.

For free podcast of this story, visit: http://ai-selfdriving-cars.libsyn.com/website

The podcasts are also available on Spotify, iTunes, iHeartRadio, etc.

For more info about AI self-driving cars, see: www.ai-selfdriving-cars.guru

To follow Lance Eliot on Twitter: https://twitter.com/@LanceEliot

For his Forbes.com blog, see: https://forbes.com/sites/lanceeliot/

For his Medium blog, see: https://medium.com/@lance.eliot

For Dr. Eliot’s books, see: https://www.amazon.com/author/lanceeliot

Copyright © 2019 Dr. Lance B. Eliot

