codeflo
> It’s also possible that Waymo's erratic braking contributed to a few of those rear-end crashes

It's possible to confuse other drivers without technically making a mistake. The article doesn't say whether Waymos are rear-ended more often than human drivers, but it's plausible. Human drivers have a mental model of what other drivers will or won't do, and that model might not work well for autonomous vehicles. It's very possible that Waymos are quicker to brake in unclear situations than a human driver would be, and/or that they categorize different situations as dangerous than a human driver would. I'm sure human drivers will eventually adapt to the behavior of these robots sharing the road with them, but that will take some time and experience.

Animats
The classic rear-ending Waymo collision, from DMV reports, looks like this:

- AV approaches intersection where some obstacle blocks cross-street visibility.

- AV slowly enters intersection to get better visibility.

- AV detects cross traffic and stops quickly.

- Trailing human driver following too closely rear-ends AV.

Back when Google was testing in Mountain View, this was the most common accident. One particular intersection had a tree in the median strip, at a height that blocked the vehicle-top LIDAR, forcing a slow, careful entry to the intersection. At least two Google AVs were rear-ended there.

As the article says, there are no recorded incidents where a Waymo AV entered an intersection with legit cross-traffic and was hit. There's one where a human driver ran a red light, clearly not the fault of the AV.

One partial solution would be to have AVs flash their brake lights rapidly when they're in this may-brake-suddenly situation. That would warn humans to back off. AVs know when they're being tailgated.
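
A minimal sketch of what that signaling logic might look like (the thresholds, names, and the "visibility blocked" flag are all my own assumptions for illustration; Waymo's actual software is not public):

    # Hypothetical: flash the brake lights rapidly when the AV is creeping
    # into a blind intersection (so it may brake suddenly) AND a follower
    # is close behind. All thresholds are made-up illustrative values.
    TAILGATE_GAP_S = 1.0    # following gap, in seconds, treated as tailgating
    CREEP_SPEED_MPS = 2.0   # "inching in for visibility" speed ceiling

    def should_flash_brake_lights(own_speed_mps: float,
                                  rear_gap_s: float,
                                  visibility_blocked: bool) -> bool:
        creeping_blind = visibility_blocked and own_speed_mps < CREEP_SPEED_MPS
        tailgated = rear_gap_s < TAILGATE_GAP_S
        return creeping_blind and tailgated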

xnx
Same source information as "Human drivers are to blame for most serious Waymo collisions": https://news.ycombinator.com/item?id=41516934
harmmonica
Since Waymo is very heavily California-biased at this point, a possible explanation for this, and one I think is responsible for a lot of rear-end collisions all across the world, is the "California roll." For those unfamiliar: in California (and, again, almost everywhere) it's super typical for drivers to approach a stop sign and not actually come to a complete stop. Instead, the driver slows down and then "rolls" through the stop. Many drivers here in CA expect the car in front of them to roll, so when the lead car actually comes to a complete stop, their estimate of when it will clear the stop sign/line is wrong.

I've said this on here many times before, but one of the reasons I love riding in Waymos is because they (in my experience) obey all traffic laws to the letter of the law. So if there's a stop sign, they all stop.

Would love to know the specifics of these rear-end collisions, because I'd bet they're either California rolls at stop signs or the same rolling behavior on right turns at red lights.

yial
I haven’t driven behind a Waymo, so this is just curiosity.

But I have used, and been behind, other assisted-driving vehicles. (One thing I find exhausting with adaptive cruise is that people pass, usually on the right, and then constantly cut in if you have the following distance set safely.)

As another commenter said about drivers' mental models, I wonder if part of it is that they brake sooner, and perhaps more completely.

Example: someone ahead is turning left. Most human drivers will slow and go around if it's safe. Does a Waymo brake far back, and abruptly?

Again, I can’t comment on Waymo, but I know my own vehicle will sometimes brake late and harder when set to auto, because of sensor distance. (I try not to let this happen and override and brake sooner, of course, gradually slowing rather than flying up, detecting a stop or slowdown, and then braking aggressively.)
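
A rough back-of-the-envelope sketch of why detection distance matters so much here (the constant-deceleration model and all the numbers are my own illustrative assumptions, not specs from Waymo or any particular cruise system):

    # Toy model: to stop within the distance d at which an obstacle is
    # first detected, you need a constant deceleration of v^2 / (2d).
    def required_decel(speed_mps: float, detect_dist_m: float) -> float:
        return speed_mps ** 2 / (2 * detect_dist_m)

    speed = 25.0  # ~56 mph, in m/s
    for d in (120.0, 80.0, 40.0):  # earlier vs. later detection
        print(f"detected at {d:5.0f} m -> {required_decel(speed, d):.1f} m/s^2")

    # Prints roughly 2.6, 3.9, and 7.8 m/s^2. Comfortable braking is about
    # 2-3 m/s^2, and ~5+ m/s^2 feels like an emergency stop. Halving the
    # detection distance doubles the required deceleration, which is the
    # "brake late and harder" effect described above.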

cut3
Waymos are notorious for signaling for half a second and then cutting you off to merge. I've almost hit a couple that cut me off only to immediately brake in front of me.
nkingsy
Strange headline for a very positive article about Waymo safety
paulddraper
Makes you wonder about the decision to allow human drivers.
Fezzik
As a tangent: if you have not ridden in a Waymo, you owe it to yourself to take a ride. I was in LA last weekend and took a 60-minute ride from my cousin’s house to a concert, and, next to flying, or maybe even ahead of it, it is the most futuristic thing I have ever experienced. And it feels far safer than any human driver I have been in a car with, including myself.
hindsightbias
I’ve observed Waymos pulling over in neighborhood streets when aggressively tailgated. They then continue on.

I’d submit aggressive drivers aren’t used to that.

AStonesThrow
I've logged 762 Waymo miles and over 2,500 ride minutes, as a passenger, so here goes:

Waymos indeed make odd, non-human maneuvers. A quite frequent one is inching forward jerkily (<5 mph) as it pulls over, at which point there's usually no one else moving nearby anyway.

Waymo has veered into oncoming lanes a few times, though traffic was stopped at those intersections. I've really never witnessed any risky maneuver that would endanger property or people, or even warrant a ticket.

BUT... motorists and pedestrians HATE Waymo with a passion. I was literally spit upon when riding a scooter-share and it may get worse in those comfy white Jaguars.

No overt incidents yet, but several passers-by have vocally expressed their ire, and there is no shortage of folks who deliberately block our path just because I have no control. But I suspect violence will arise as the pressure rises.

Kalanos
Probably stopping at yellow lights and for pedestrians at crosswalks. That's a good way to get rear-ended in Boston.
wrp
This sounds eerily like the difference between men and women drivers, as in the traditional explanations for why male drivers are more often judged to be at fault.
carapace
I'm a fan of self-driving vehicles (can we please call them auto-autos?) because they are, as this article points out, already safer than human drivers.

Still, it is and always has been completely and totally insane to test these heavy fast robots on public streets. I never in a million years would have thought we would do that.

The obvious thing to do is:

A) Start with light slow robots, like a golf cart covered in foam, and work your way up to Knight Industries Two Thousand. Going directly for KITT is a fetish.

B) Test it in one of those artificial cities with, you know, people who have at least signed waivers to be human test dummies for your two-ton high-speed killer robots?

E.g.:

> Mcity is a 32-acre simulated urban and suburban environment that includes a network of roads with intersections, traffic signs and signals, streetlights, building facades, sidewalks and construction obstacles. It is designed to support rigorous, repeatable testing of new technologies before they are tried out on public streets and highways.

https://news.umich.edu/u-m-opens-mcity-test-environment-for-...

jaharios
Many accidents happen because humans are in a hurry, something a slow self-driving car has no concept of.

Comparing self-driving cars with human drivers is stupid; a fairer comparison is trains or trams, which are also less prone to accidents, and where being in a hurry can't make the ride go any faster.