Google Self Driving Car Crashes Into Bus, Google Says They Are ‘Partially Responsible’


Back to the drawing board.

When many of us first heard about Google's self-driving cars, we didn't really think they could possibly work, and we were sure there would be a few crashes before they were widely accepted anywhere.

Surprisingly, although Google's self-driving cars have been involved in a few crashes before, one had never actually caused a crash until this recent incident, when a vehicle pulled out in front of a bus in California. Nobody was injured, as the bus was only travelling at 15mph and the car at 2mph, but it does raise concerns about just how safe self-driving cars can be.

In fairness though, the human in the car did say he assumed the bus would slow down to let the car out, which is why he didn't override the car's self-driving mechanism when he saw it coming. To be honest, it sounds like this could have happened to anyone driving a car if they weren't being careful, so maybe it isn't too much of a concern at this point.

Google released the following statement:


We clearly bear some responsibility, because if our car hadn’t moved, there wouldn’t have been a collision.

That said, our test driver believed the bus was going to slow or stop to allow us to merge into the traffic, and that there would be sufficient space to do that.

The car’s movements were made more complex by the presence of sandbags on the road.

From now on, our cars will more deeply understand that buses (and other large vehicles) are less likely to yield to us than other types of vehicles, and we hope to handle situations like this more gracefully in the future.

It sounds like they've basically got it under control. This is the first accident caused by the cars themselves, and they've clocked up well over a million miles since being introduced in various American states, which is probably a safer record than a random sampling of human drivers covering the same distance.

This is probably just a minor setback, and it sounds like they're already sorting it out. I'm not really that worried, although I am a bit worried about that robot that managed to kill a worker at a Volkswagen factory last year.

