Disengagement Reporting for AI Self-Driving Cars is Woefully Insufficient

Dr. Lance B. Eliot, AI Insider

Each year, California releases a report listing the various “disengagements” of AI self-driving cars that are formally authorized to operate in California; the auto makers and tech firms doing so are required to dutifully file the prescribed reports with the DMV (Department of Motor Vehicles).

The excitement and anxiousness to see the latest reporting is somewhat reminiscent of a classic movie. Let’s see if you can guess which one, here’s my clue: “The new phonebook is here! The new phonebook is here!”

Yes, you might recall that comedian Steve Martin made that famous exclamation in his now-classic movie The Jerk. He was referring to seeing his name in-print (well, his movie character’s name), and being excited to know that he had finally made it to the big time.

Likewise, when the California Department of Motor Vehicles (DMV) releases its eagerly awaited collection of so-called “disengagement” reports, there are similar exclamations.

A disengagement is considered an instance in which the autonomous mode of the self-driving car was disengaged by a human back-up driver, doing so presumably because the human back-up driver judged that the self-driving car was unable to handle a particular driving moment. The human back-up driver would typically right away take over the controls, hopefully alleviate the driving situation, and assuming things seemed okay to further proceed with the self-driving, the human back-up driver would re-engage the AI system to drive the self-driving car.

One of the initial and more robust collections covered the time period of December 2015 to November 2016 (I’ll refer to this as the 2016 listing), and 11 companies filed reports at that time: BMW, Bosch, GM’s Cruise Automation, Delphi Automotive Systems, Ford, Google — Waymo, Honda, Nissan North America, Mercedes-Benz, Tesla Motors, and Volkswagen Group of America.

More recently, for the 2018 listing, as released in early 2019, the data encompassed 48 companies, though only 28 filed reports since they were the only ones actively underway on public roadways (if a firm kept to only private roads, it could skip doing the reporting). There were 496 vehicles included, which amounted to slightly over 2 million miles of road driving and a total of nearly 144,000 disengagements.

You can see the official DMV 2018 disengagement reports here: https://www.dmv.ca.gov/portal/dmv/detail/vr/autonomous/disengagement_report_2018

The reason these are worthy of rapt attention is that they help reveal the latest status of AI self-driving cars, albeit in a somewhat limited way, and some would argue a meaningless way (more on this in a moment).

The numbers provided are self-reported by the companies, and presumably forthright (else they’d be violating the DMV regulations), though there isn’t any independent third-party verification per se of their reported claims.

I am not suggesting that any of the reports are false. Actually, the bigger issue is that the DMV opted to allow for flexibility in reporting, and so it is not readily feasible to compare the numbers reported by the companies.

It’s unfortunate that a regulator (the California DMV) on the one hand has insisted on annual reporting, which I argue is handy, but at the same time did not provide clear-cut standards for doing the reporting. It is like a football game involving teams that pretty much get to make up the rules, and therefore you cannot compare them on the basis of touchdowns, since some of the teams are using 6 points for a touchdown while others are using, say, 4 points for a touchdown.

The California DMV needs to tighten up the reporting requirements.

That being said, California’s DMV should be applauded for even requiring such reporting, while other states such as Michigan, Florida, Arizona, and Nevada either don’t yet require such reporting or are considering doing so. I don’t want to seemingly besmirch California for failing to offer clearer reporting requirements and somehow lose sight of the fact that most of the other states don’t require any reporting at all.

To me, those states deserve an even harsher lashing.

California has done the right thing, forcing the companies that want to test their self-driving cars on the public roadways to disclose how well their efforts are coming along and how much of such testing they are doing.

I realize that some might argue that this is an over-intrusive requirement on these companies, and some might say that it discourages those companies from testing. Given that some states aren’t requiring the reporting, companies that want to keep secret their testing are sliding over to those states and avoiding having to publish their status. Or, in California, they stay off the public roads and do their testing only on private lands (a somewhat clever or some say sneaky way around the rules). I won’t focus further on the public policy implications here, and just note that it is a factor to keep in mind about how self-driving cars are evolving and what the states are doing about it.

What do the numbers show?

In the 2016 set, Google’s Waymo reported that their self-driving cars logged about 636,000 miles in California on public roadways during the reporting time period. That was a staggeringly high number in comparison to the other ten companies, which combined were a mere fraction of that number of miles. Of the total miles logged and reported, Waymo accounted for about 97% and the other ten about 3%. GM’s Cruise Automation came in at the #2 spot for miles logged, reporting about 9,850 miles during the reporting time period. Some companies, such as Honda and Volkswagen, reported zero miles and indicated that they had not done testing on California’s public roadways during the reporting time period (at least as far as they interpret what the DMV regulation encompasses).

In the 2018 time period, Waymo logged 1.2 million miles, once again the leader of the pack. GM Cruise logged about 448,000 miles. Some like to make use of a metric that takes the number of miles and indicates how many disengagements there were per 1,000 miles driven. Waymo was 0.09 disengagements per 1,000 driven miles, while GM Cruise was 0.19 disengagements per 1,000 driven miles.
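The per-1,000-mile metric is a simple rate calculation. As a minimal sketch, here is how those 2018 rates fall out; note that the disengagement counts used below are illustrative, back-derived from the miles and rates stated above, not taken verbatim from the filings:

```python
def disengagements_per_1k_miles(disengagements: int, miles: float) -> float:
    """Rate of disengagements per 1,000 miles driven."""
    return disengagements / miles * 1000

# Illustrative counts consistent with the 2018 figures cited in the text:
# Waymo: ~1.2 million miles; GM Cruise: ~448,000 miles.
waymo_rate = disengagements_per_1k_miles(114, 1_200_000)   # roughly 0.09
cruise_rate = disengagements_per_1k_miles(86, 448_000)     # roughly 0.19
print(f"Waymo: {waymo_rate:.2f}, GM Cruise: {cruise_rate:.2f}")
```

The inverse framing (miles per disengagement) is also common in press coverage; both convey the same underlying ratio.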

This brings us to one of the reporting metrics problems.

The number of miles driven is a quite imperfect measure: a mile of open highway does not involve the same driving complexity as a mile of inner-city, street-clogged, bumper-busting traffic.

Not all miles were created equal, some aptly say.

If I have my self-driving car use open highways where there is little human-like navigation required, this is a far cry from the human-like capability needed in more onerous traffic conditions. For example, much of Waymo’s mileage is apparently in the Mountain View area, along suburban streets. I would contend that these kinds of miles are a lot “easier” than say downtown San Francisco driving.

I’ll also point out something else that I’ve observed while in the Mountain View area. I’ve seen the Waymo cars quite often. I have also noticed that other human-driven cars are tending to give a lot of latitude to the Waymo cars nowadays.

I mention this aspect because the next metric I am going to discuss is the disengagements counts.

Waymo reported that they had a relatively modest number of disengagements, which seems pretty darned good when you also consider that their self-driving car mileage was so high. It is a good sign, but at the same time, I wonder how much of this is due to the fact that human drivers are changing their behavior to allow the self-driving car to drive without having to deal with true traffic conditions (i.e., humans not driving as wildly as they normally do). And these cars are typically driving the same roads, over and over. This is vastly different from having to navigate unfamiliar territory and figure out the idiosyncrasies of roads you’ve not been on before.

Let’s also give some scrutiny to the definition of the disengagements metric, since any looseness means that the auto makers and tech firms can potentially sway their numbers.

According to the California DMV, a disengagement involves the test driver taking over immediate manual control of the vehicle during a testing activity on a public road in California, and doing so because the test driver believed the vehicle had a technological failure or because they thought the self-driving car was not operating in a safe manner. This might seem like an airtight definition.

It is actually full of tons of loopholes. Let’s start with taking over immediate manual control. For some companies, their viewpoint is that if the test driver waits say more than a few seconds then this is not considered an “immediate” circumstance and so is not counted as a disengagement. Is this a fair or unfair interpretation? Again, it should not be open to interpretation and a clearer standard should exist.
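To see how that interpretive leeway can sway the totals, consider a hypothetical log of takeover events with the seconds elapsed before the test driver intervened; merely changing the cutoff for what counts as “immediate” changes the reported count. All figures here are illustrative, not drawn from any actual filing:

```python
# Hypothetical takeover log: seconds between the triggering event
# and the test driver assuming manual control.
takeover_delays = [0.4, 0.9, 1.2, 2.5, 3.1, 4.8, 6.0]

def count_disengagements(delays, immediate_cutoff_s):
    """Count only takeovers at or under the cutoff as 'immediate'
    disengagements, mimicking a loose reading of the DMV definition."""
    return sum(1 for d in delays if d <= immediate_cutoff_s)

for cutoff in (1.0, 3.0, 10.0):
    print(f"cutoff {cutoff}s -> {count_disengagements(takeover_delays, cutoff)} disengagements")
```

With a 1-second cutoff the same log yields 2 reportable disengagements; with no practical cutoff it yields all 7. Same driving, very different headline numbers.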

I can also imagine that there might be pressure placed on the human test drivers to avoid doing a disengagement.

In essence, if you are a self-driving car company and you know that ultimately the whole world will be examining your number of disengagements, you would probably want to seek the minimal number that you can.

This is not to suggest that anyone is telling test drivers to allow themselves to be put in jeopardy or jeopardize others on the road. It is simply another element to consider that the test drivers will each vary as to why and when they think a disengagement is warranted.

Thus, again, the disengagement metric is not a reliably standardized metric.

I have been especially irked by some of the national and worldwide reporting about the California DMV disengagement reports.

One bold headline was that numbers don’t lie and that the reports presumably prove that self-driving cars are nearly ready to hit the roads unattended. This claim that numbers don’t lie is a sadly simplistic suggestion, and I think that all readers should recall the famous line that Mark Twain attributed to British Prime Minister Benjamin Disraeli: “There are three kinds of lies: lies, damned lies, and statistics.”

Another irksome headline was that the disengagement reports show that there were many thousands of “problems” (disengagements) with the self-driving cars. This is a total of the number of disengagements reported, but it is disingenuous to suggest that somehow the count implies that self-driving cars had “problems” per se.

Yes, a disengagement might have occurred because a self-driving car had a system failure, but it also could be that the test driver felt uncomfortable entrusting the self-driving car in a dicey traffic condition and so decided to take over manual control.

Please be cautious in interpreting the disengagement reporting.

I will end on a more macroscopic note about self-driving cars, namely that we need to establish the equivalent of a Turing Test for self-driving cars. In AI, we all know that the Turing Test is a handy means to try to ascertain whether a system appears to embody human intelligence, doing so by seeing whether humans can distinguish the system’s interactions from those of a human.

Though imperfect as a test, it nonetheless is a means to assess AI-based systems.

A Level 5 self-driving car is considered the topmost tier of self-driving cars, implying that the automation can drive the car in whatever manner a human can, yet there isn’t any specific testing protocol for certifying this. Do we just let a self-driving car drive for a thousand miles and, if it does not need any disengagements, safely conclude it is a true Level 5? Or do we let it drive for 100,000 miles? Or millions of miles?

As I’ve mentioned herein, mileage alone is not sufficient, and the disengagement metric is imperfect too. We need a more comprehensive, standardized test that could be applied to self-driving cars.
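As a rough illustration of why raw mileage is such a weak yardstick on its own, the statistical “rule of three” gives a back-of-the-envelope estimate of how many failure-free miles would be needed merely to bound a failure rate with roughly 95% confidence. This is a statistical sketch, not any official testing protocol:

```python
def miles_needed_rule_of_three(max_rate_per_mile: float) -> float:
    """Failure-free miles required to claim, with ~95% confidence,
    that the true failure rate is below max_rate_per_mile.
    Rule of three: n is approximately 3 / p."""
    return 3.0 / max_rate_per_mile

# To support a claimed rate below 1 failure per 100,000 miles,
# you would need roughly 300,000 consecutive failure-free miles.
print(f"{miles_needed_rule_of_three(1e-5):,.0f} miles")
```

And that only bounds the rate for the kinds of miles actually driven; easy suburban miles say little about downtown San Francisco driving, which is exactly the complexity point made above.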

For free podcast of this story, visit: http://ai-selfdriving-cars.libsyn.com/website

The podcasts are also available on Spotify, iTunes, iHeartRadio, etc.

More info about AI self-driving cars, see: www.ai-selfdriving-cars.guru

To follow Lance Eliot on Twitter: @LanceEliot

Copyright © 2019 Dr. Lance B. Eliot.
