- AV approaches intersection where some obstacle blocks cross-street visibility.
- AV slowly enters intersection to get better visibility.
- AV detects cross traffic and stops quickly.
- Trailing human driver following too closely rear-ends AV.
Back when Google was testing in Mountain View, this was the most common accident. One particular intersection had a tree in the median strip, at a height that blocked the vehicle-top LIDAR, forcing a slow, careful entry to the intersection. At least two Google AVs were rear-ended there.
As the article says, there are no recorded incidents where a Waymo AV entered an intersection with legit cross-traffic and was hit. There's one incident where a human driver ran a red light, which is clearly not the fault of the AV.
One partial solution would be to have AVs flash their brake lights rapidly when they're in this may-brake-suddenly situation. That would warn humans to back off. AVs know when they're being tailgated.
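The "flash when tailgated" idea amounts to a simple gating rule: warn only when the AV is in a may-brake-suddenly state AND the trailing car's time gap is unsafe. A minimal sketch of that logic, with purely illustrative function names and thresholds (nothing here reflects any real AV stack):

```python
def should_flash_brake_lights(rear_gap_m: float,
                              own_speed_mps: float,
                              may_brake_suddenly: bool,
                              min_gap_seconds: float = 2.0) -> bool:
    """Return True when a trailing driver should be warned.

    Flash only when the AV may brake hard (e.g. creeping into a
    blind intersection) AND the trailing vehicle's time gap is
    below a safe following distance. All thresholds are made up
    for illustration.
    """
    if not may_brake_suddenly:
        return False
    if own_speed_mps <= 0:
        # Nearly stopped: a time gap is undefined, so fall back to
        # a raw distance check while in the may-brake state.
        return rear_gap_m < 5.0
    time_gap_s = rear_gap_m / own_speed_mps
    return time_gap_s < min_gap_seconds
```

For example, a car 8 m behind an AV moving at 10 m/s is a 0.8 s gap, well under the two-second rule, so the lights would flash; the same car 30 m back (a 3 s gap) would not trigger it.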
I've said this on here many times before, but one of the reasons I love riding in Waymos is that, in my experience, they obey traffic laws to the letter. If there's a stop sign, they actually stop.
I'd love to know the specifics of these rear-end collisions, because I'd bet they happen where humans would do a California roll at a stop sign, or the same rolling behavior when turning right on red.
But I've used, and been behind, other assisted-driving vehicles. (One thing I find exhausting with adaptive cruise is that people constantly pass, usually on the right, and then cut in if you have the following distance set safely.)
And as another commenter said about drivers' mental models: I wonder if part of it is that Waymos brake sooner, and perhaps more completely.
Example: someone ahead turning left. Most human drivers will slow and go around if it's safe. Does a Waymo brake far away, and abruptly?
Again, I can't comment on Waymo, but my own vehicle will also sometimes brake late and hard in auto mode because of limited sensor range. (I try not to let this happen and override to brake sooner, of course, gradually slowing instead of flying up, detecting a stop or slowdown, and then braking aggressively.)
I’d submit aggressive drivers aren’t used to that.
Waymos do make odd, non-human maneuvers. A fairly frequent one is jerkily inching forward as it pulls over (under 5 mph), though at that point there's usually no one moving nearby either.
A Waymo has veered into the oncoming lane a few times, though traffic was stopped at that intersection. I've really never witnessed a risky maneuver that would endanger people or property, or even warrant a ticket.
BUT... motorists and pedestrians HATE Waymo with a passion. I was literally spat on while riding a scooter-share, and it may get worse in those comfy white Jaguars.
No overt incidents yet, but several passers-by have vocally expressed their ire, and there's no shortage of folks who deliberately block our path precisely because I have no control. I suspect violence will arise as the pressure rises.
Still, it is and always has been completely and totally insane to test these heavy fast robots on public streets. I never in a million years would have thought we would do that.
The obvious thing to do is:
A) Start with light slow robots, like a golf cart covered in foam, and work your way up to Knight Industries Two Thousand. Going directly for KITT is a fetish.
B) Test it in one of those artificial cities with, you know, people who have at least signed waivers to be human test dummies for your two-ton high-speed killer robots?
E.g.:
> Mcity is a 32-acre simulated urban and suburban environment that includes a network of roads with intersections, traffic signs and signals, streetlights, building facades, sidewalks and construction obstacles. It is designed to support rigorous, repeatable testing of new technologies before they are tried out on public streets and highways.
https://news.umich.edu/u-m-opens-mcity-test-environment-for-...
Comparing self-driving cars with human drivers is stupid; a fairer comparison is trains or trams, which are also less prone to accidents, and where being in a hurry can't make the ride go any faster.
It's possible to confuse other drivers without technically making a mistake. The article doesn't say whether Waymos are rear-ended more often than human drivers, but it's plausible that this could be the case. Human drivers have a mental model of what other drivers will or won't do, and that model might not work well for autonomous vehicles. It's very possible that Waymos are faster to put on the brakes in unclear situations than a human driver would be, and/or that they categorize different situations as dangerous than a human driver would. I'm sure human drivers will eventually adapt to the behavior of these robots sharing the road with them, but that might take a bit of time and experience.