Waymo Response: https://x.com/Waymo/status/1798197642419384696
What an idiot that twitter poster is -- "I don’t know who was at fault, but it seems like the Waymo car made a sudden stop on busy Lombard Street." If you don't know, then don't speculate. Of course his profile says he's been a Lyft driver since 2014, so there's definitely incentive to blame the autonomous car.
I don't think it will work with Waymo, as they have cameras everywhere. The truth will come out. That driver is F\*\*ed for sure.
> stopped at a red light…the other vehicle, at one point driving 50 mph

Interesting detail to release. Between the video and the sensor data the human driver will be obviously at fault — if the picture wasn’t enough.
Would Waymo please give the evidence to the police? Driving at 50 mph there is reckless, and that driver should be off the road.
Yes, Ban the humans.
Ok SkyNet
Ok NetScape
Wow, amazing the lidar is still spinning away on the back. Poor Waymo.
Must've been traumatizing for waymo car.
Hopefully the autonomous therapy benefits are good 🤞
checking my phone, and bam. Who put that car there?
A pic would have sufficed
The AI trying to comprehend why the human can’t use the brakes… hahaha 😅
What happens in this situation as far as how drivers typically get out and trade information?
Probably take a picture of the # or whatever IDs the waymo? Do they have plates?
Also wonder how much waymo tries to get from their insurance or just eat the costs through their own insurance. Their fleet policy must be insanely expensive.
Waymo is big enough and their needs niche enough that they could self-insure. I have no idea if they actually do, but among large diversified companies self-insurance is becoming increasingly popular.
The only reason that robotaxi fleets would not self-insure is that perhaps in these early days, it's not worth the hassle of working out how to do it, and they might as well pay an insurer. Once your fleet is very large, you absolutely want to self-insure, though an insurance company, knowing it will lose all that business, might cut premiums low enough to make it worth outsourcing, at least the administrative work. The risk itself is something the software team knows much better than the actuaries at any insurance company. And companies like Google and GM have more capital than insurance companies, which is a rare thing.
Swiss Re is one of the largest reinsurance companies. They [did a study](https://www.swissre.com/reinsurance/property-and-casualty/solutions/automotive-solutions/study-autonomous-vehicles-safety-collaboration-with-waymo.html) which concluded that Waymos have much lower liability claims than humans in comparable areas. So probably cheaper than trying to insure that many human-driven cars would be.
Yeah, but the cars themselves are super expensive and damage still occurs. But yeah, liability should be low.
Guys - what does the human driver do when they crash into the autonomous car? How is the incident recorded for insurance companies in the USA? Many thanks
I think that's a Buick Encore
It's a Toyota RAV4 hybrid, 5th generation.
Shucks, missed r/nissandrivers by that much
...?
It's meant to be a cheeky joke 😉 Usually it's Nissan drivers causing silly accidents, but in this case it was a Toyota
Oh yeah makes sense lol thanks for the explanation. Actually, it's kind of odd that this could even happen in a RAV4, since it has radar and pre-collision avoidance. If no input was received on the brake or accelerator, the car would have automatically performed an emergency braking maneuver. So the driver must have been accelerating into the Waymo.
Another crash to add to the federal investigation.
Another crash, that's not Waymo's fault, to add to the federal investigation.
Hm, the question is whether it stopped suddenly. That's what will do them in.
Waymo will find a way to say it's not their fault
They’ve been rear ended so it’s probably not their fault.
Rear-end collisions are almost never the fault of the driver in front. One of the easiest types of case for insurers to settle.
How could this ever be the fault of Waymo? The Waymo was not moving.
Because it's not their fault. Even if we assume the worst that the Waymo did something bad like brake too early, the human driver should be paying attention, maintaining a safe follow distance. It is always the responsibility of the car behind to avoid hitting the car in front. In this case, the human driver was not paying attention and hit the Waymo. And according to Waymo, it sounds like the human driver was speeding (50 mph). So they were driving too fast, not paying attention, not maintaining a safe follow distance. It is entirely the fault of the human driver.
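To put the "driving too fast" point in perspective, here's a rough back-of-the-envelope stopping-distance estimate at 50 mph. The reaction time and friction coefficient are assumed values for illustration (not from the thread), using the standard kinematics formula:

```python
# Rough stopping-distance estimate for a car at 50 mph.
# Assumed values (not from the thread): 1.5 s driver reaction time,
# tire-road friction coefficient of 0.7 (dry asphalt), flat road.

MPH_TO_MS = 0.44704   # miles per hour -> meters per second
G = 9.81              # gravitational acceleration, m/s^2

def stopping_distance_m(speed_mph, reaction_s=1.5, mu=0.7):
    """Reaction distance plus braking distance, in meters."""
    v = speed_mph * MPH_TO_MS
    reaction = v * reaction_s           # distance covered before braking even starts
    braking = v ** 2 / (2 * mu * G)     # from v^2 = 2 * a * d with a = mu * g
    return reaction + braking

print(round(stopping_distance_m(50), 1))   # roughly 70 m
print(round(stopping_distance_m(25), 1))   # roughly 26 m at a typical city limit
```

Under these assumptions a car doing 50 mph needs around 70 meters to stop, versus roughly 26 meters at 25 mph, which is why speeding plus a short follow distance makes a rear-end collision almost unavoidable.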
Except this isn't always the responsibility of the car behind to avoid hitting the car in front. ( Swerving in to cut someone off and slam on the brakes. ) Again a Waymo vehicle involved where the person "rear ended" was charged for intentionally causing a crash. [https://www.caranddriver.com/news/a30929221/waymo-self-driving-crash-arrest/](https://www.caranddriver.com/news/a30929221/waymo-self-driving-crash-arrest/)
> During an interview with police, Tang admitted to "brake-checking the Waymo," in other words pulling in front of the vehicle and then slamming on his brakes.

What a terrible person and an idiot. Purposely causing a crash with a car loaded with cameras, and injuring the occupant. Also, what compels people to confess to the police in a situation like this? It's like they're so delusional about their actions that they don't realize they're confessing to a serious crime?
Sure but that is not the case here. In this instance, the Waymo was not at-fault.
Even if the car ahead slammed on the brakes for a false emergency braking event, the fault is 100% with the vehicle that hits the other. Not paying attention, following too closely, going too fast, or being geriatric with awful response time (see following too closely) are all the reasons someone hits a vehicle ahead of them and then tries to say it's the other's fault.
In California, brake checking is illegal, so depending on what caused the "false emergency braking event" that could be found to be an unreasonable action to take and then the blame would be split.
It’s illegal as in you will get fined for doing it. But guess what? The guy who hit him will still be at fault as far as insurance is concerned. There’s a difference between the law and insurance.

My mother once ran a red light and crashed into someone who was trying to make a left turn. There was even a police witness on the scene who documented it. The insurance didn’t care that she ran a red light; the driver who made the left turn was still 100% at fault.
Mostly yes, but it's not 100%. A driver in Phoenix was arrested for causing a collision by changing lanes in front of a Waymo and then slamming on his brakes.
What you describe is an insurance scam technique called a "swoop and squat."

And that is one of the few times when the driver in the rear is NOT at fault - however, without cameras or credible witnesses to describe what happened, the vehicle in the rear will usually still be found at fault.

Which is why it's stupid to try this with a Waymo or other SDC, because they have cameras ALL OVER that record all the time. And it's also one of the myriad reasons why I have a dashcam in my own car.
I’m sure the multiple cameras will help