Gridlock Guy: Upset about Waymos? Georgia Tech professor says give them time.

Most advances that people see as “coming soon” have already arrived. And autonomous vehicles — or driverless cars — certainly fit that bill.
Self-driving Teslas have traversed Atlanta roads for years, but Waymo’s advent came last spring. The fundamental difference, of course, is that Teslas still have someone at the helm of a personal vehicle. Waymos are robo-taxis that operate with no on-site, human supervision.
When Waymo vehicles began blowing past stopped school buses and freezing in intersections, they quickly became infamous.
Given the outcry that ensued, but knowing the overall goal of autonomous vehicles is to actually make the roads safer, I recently posed a question in this space: Should Waymos be held to a higher standard than humans?
My 11Alive co-worker Liza Lucas has done extensive reporting on Waymo and suggested I reach out to Srinivas Peeta, a Georgia Tech professor in the School of Civil and Environmental Engineering. I asked him if the public outrage fits the robot-cars’ crimes.
“When (people) see Waymos and other autonomous vehicles and such failing to reach those levels, there’s a natural sense of unease that they would experience,” Peeta said.
He has studied connected and intelligent traffic technology for 37 years. He’s seen traffic data collection go from static points to flowing oceans, from analog to digital. He has seen truckloads of innovation and big changes in how people commute and plan their itineraries.
The introduction of driverless robo-taxis shocks the system in ways other traffic changes just do not. Peeta believes people are mainly afraid because robots are expected to be perfect, and they are hard to keep accountable.
To the first notion, fair enough: autonomous vehicle companies themselves insist their software ultimately navigates roads more safely than human drivers do, so high expectations come with the territory.
As for accountability, Peeta said owners of personal vehicles are liable for what their cars do. And robo-taxi companies are on the hook for what their public-facing vehicles do, though, he added, the punishments vary by jurisdiction.
Another crucial but simple way motorists hold one another accountable is through beckoning and warning. But robots simply do not understand those human cues — such as a pedestrian giving a hand signal or a person yelling and waving to stop a wrong-way driver — as well as people do.
Peeta explained that unfamiliar stimuli cause AVs to simply stop.
“Sometimes these vehicles freeze because they are not sure what to do. And this is where it sometimes tends to be glaring because, as humans, we tend to have context. We tend to have situational awareness, so we’re able to come up with decisions that seem so natural to us,” he said.
To cure that, AVs simply need reps, Peeta said. They also behave best when they are around other AVs because the situations are more predictable.
The bar for driverless cars should be high, Peeta said. But people need to be patient to let the technology achieve that status; the wait is worth the results.
“(AVs) tend to have a lot more safe-driving benchmarks for the same amount of mileage or vehicle miles driven. They tend to be a lot more safer statistically,” he said.
Doug Turnbull covers the traffic/transportation beat for WXIA-TV (11Alive). His reports appear on the 11Alive Morning News 6-9 a.m. and on 11Alive.com. Email Doug at dturnbull@11alive.com.

