On Monday, Waymo — the subsidiary of Google parent company Alphabet that’s developing a full-stack driverless vehicle platform — announced that its cars have driven a combined 20 million autonomous miles to date, up from 10 million miles in October 2018. The metric signifies Waymo’s logistical and technological superiority, implied CEO John Krafcik, who equated the miles driven to 1,400 years of driving experience for an average American.
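Krafcik's equivalence is simple arithmetic: dividing the fleet's total by what an average American drives in a year yields a figure in the neighborhood of his claim. A back-of-the-envelope check, assuming the roughly 13,500 miles per year that U.S. government travel surveys commonly cite (the per-driver figure is our assumption, not Waymo's):

```python
# Back-of-the-envelope check of the "1,400 years of driving" claim.
TOTAL_AUTONOMOUS_MILES = 20_000_000      # Waymo's reported total
AVG_ANNUAL_MILES_PER_DRIVER = 13_500     # assumed average for a U.S. driver

years_of_experience = TOTAL_AUTONOMOUS_MILES / AVG_ANNUAL_MILES_PER_DRIVER
print(f"{years_of_experience:,.0f} years")
```

With that assumed annual figure, the result lands near 1,480 years, broadly consistent with the rounder number Krafcik cited.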
But some experts assert that measuring driverless systems’ progress by miles is a flawed approach.
This week in a conversation with VentureBeat at the 2020 Consumer Electronics Show (CES), Dmitry Polishchuk, the head of Russian tech giant Yandex’s autonomous car project, said that miles aren’t very meaningful without context to accompany them. “It’s tough to directly compare miles driven,” he said. “Obviously, the more miles [you] have, the better, but we believe that the environments that you’re in have a huge impact.”
Yandex isn’t without a horse in the race — its fleet of over 100 autonomous cars in Innopolis and Skolkovo, Russia; Las Vegas; and Tel Aviv has driven 1.75 million miles as of January, up from 1.5 million miles and 1 million miles last December and October, respectively. But policymakers as well as competitors in the global autonomous car segment, valued at nearly $41.25 billion, have expressed similar sentiments.
Noah Zych, head of system safety at Uber’s Advanced Technologies Group, told Wired in an interview that mileage critically omits details like situations encountered, obstacles, and accidents. “You need to know … ‘What was the objective of the testing in [any given area]?’” he said. “Was it to collect data? Was it to prove that the system was able to handle those scenarios? Or was it to just run a number up?”
And at a conference organized by Nvidia in Washington two years ago, Derek Kan, undersecretary for policy at the U.S. Department of Transportation, stressed the need for objective and agreed-upon measures of driverless system performance. Separately, David Friedman, former acting administrator of the National Highway Traffic Safety Administration (NHTSA) and vice president at Consumer Reports, recently urged Congress to direct the NHTSA to implement privacy protections, minimum performance standards, and accessibility rules for self-driving cars, trucks, SUVs, and crossovers.
Disengagements — or deactivations of cars’ autonomous modes when failures occur or when drivers are forced to take over — have been adopted by agencies including California’s Department of Motor Vehicles as an alternative to miles driven. (By law, companies actively testing self-driving cars on public roads in the state are required to publish disengagement reports.) But Polishchuk argues that this, too, is an imperfect metric.
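The California reports are typically distilled into a single headline rate: miles driven per disengagement. A minimal sketch of that calculation (the fleet names and figures below are invented for illustration, not taken from any DMV filing):

```python
# Illustrative disengagement-rate calculation in the style of
# California DMV report summaries: miles per disengagement.
# All fleet data here is made up for demonstration purposes.
fleets = {
    "fleet_a": {"miles": 1_450_000, "disengagements": 110},
    "fleet_b": {"miles": 30_000, "disengagements": 3},
}

for name, data in fleets.items():
    rate = data["miles"] / data["disengagements"]
    print(f"{name}: {rate:,.0f} miles per disengagement")
```

Note that the smaller fleet posts a comparable-looking rate despite a fraction of the driving — the headline number says nothing about where those miles were driven or why each disengagement occurred.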
“We have kind of been waiting for some sort of industry standard,” he said, noting that Yandex hasn’t yet released a disengagement report. “Self-driving companies aren’t following the exact same protocols for things. [For example, there might be a] disengagement because there’s something blocking the right lane or a car in the right lane, and [the safety driver realizes] as a human that [this object or car] isn’t going to move.”
For its part, whenever Yandex deploys new code into production, the company conducts real-world tests to ensure that system performance (and by extension, safety) isn’t degraded. It takes 10 cars — five equipped with the codebase from half a year ago and five with the latest code — and runs them for a day on the same route, so that they encounter identical obstacles and weather conditions. It even switches up the safety drivers behind the wheel to prevent bias from influencing the results.
“We look back at the numbers and check the correlation … using hundreds of different parameters,” said Polishchuk. “The absolute number of disengagements doesn’t matter.”
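Yandex hasn’t published the details of that analysis, but the comparison Polishchuk describes — two matched fleets on the same route, judged by rates rather than raw counts — can be sketched in a few lines. Everything below (the per-car figures, the 10% regression threshold) is an illustrative assumption, not Yandex’s actual methodology:

```python
# Sketch of a fleet-level regression check in the spirit of the test
# Polishchuk describes: five cars on six-month-old code and five on
# the latest build drive the same route for a day, and the per-mile
# disengagement rates (not absolute counts) are compared.

def disengagement_rate(disengagements: list, miles: list) -> float:
    """Fleet-wide disengagements per 1,000 miles."""
    return 1000 * sum(disengagements) / sum(miles)

old_fleet = {"disengagements": [2, 1, 3, 2, 2], "miles": [180.0] * 5}
new_fleet = {"disengagements": [1, 2, 1, 1, 2], "miles": [180.0] * 5}

old_rate = disengagement_rate(old_fleet["disengagements"], old_fleet["miles"])
new_rate = disengagement_rate(new_fleet["disengagements"], new_fleet["miles"])

# Flag a regression if the new build's rate worsens by more than 10%.
regressed = new_rate > old_rate * 1.10
print(f"old: {old_rate:.2f}/1k mi, new: {new_rate:.2f}/1k mi, regressed: {regressed}")
```

Comparing rates under matched conditions is what makes the check meaningful — which is exactly Polishchuk’s point that the absolute number of disengagements, on its own, doesn’t matter.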
Unfortunately for companies like Yandex, less regulatory guidance — not more — seems the likelier near-future path, at least in the U.S. At CES on Wednesday, Transportation Secretary Elaine Chao announced Automated Vehicles 4.0 (AV 4.0), new guidelines for self-driving cars that seek to promote “voluntary consensus standards” among autonomous vehicle developers. The policy requests but doesn’t mandate regular assessments of self-driving vehicle safety, and it permits those assessments to be completed by automakers themselves rather than by a standards body.
Advocacy groups including the Advocates for Highway and Auto Safety criticized the policy for its vagueness. “Without strong leadership and regulations … [autonomous vehicle] manufacturers can and will continue to introduce extremely complex supercomputers-on-wheels onto public roads … with meager government oversight,” said president Cathy Chase in a statement. “Voluntary guidelines are completely unenforceable, will not result in adequate performance standards, and fall well short of the safeguards that are necessary to protect the public.”
Indeed, regulation could go a long way to convincing a skeptical public.
Two studies — one published by the Brookings Institution and another by the Advocates for Highway and Auto Safety (AHAS) — found that a majority of Americans aren’t convinced of driverless cars’ safety. More than 60% of respondents to the Brookings poll said that they weren’t inclined to ride in self-driving cars, and almost 70% of those surveyed by the AHAS expressed concerns about sharing the road with them. Elsewhere, a study conducted by think tank HNTB found that 59% of people expect self-driving cars will be “no safer” than cars driven by humans.
In the U.S., legislation remains stalled at the federal level, unfortunately. More than a year ago, the House unanimously passed the SELF DRIVE Act, which would create a regulatory framework for autonomous vehicles. But it has yet to be taken up by the Senate, which in 2018 tabled a separate bill, the AV START Act, that made its way through committee in November 2017.
Polishchuk predicts that legislation will only emerge when some “reasonable amount” of self-driving cars hit public roads. Optimistic projections peg the number at 10 million by 2030. “When this happens, we would have statistics, and basically, statistics will push regulators,” he said.
For AI coverage, send news tips to Khari Johnson and Kyle Wiggers and AI editor Seth Colaner — and be sure to subscribe to the AI Weekly newsletter and bookmark our AI Channel.
Thanks for reading,
AI Staff Writer