The autonomous Bolts that GM's self-driving start-up has running around San Francisco were involved in 22 accidents during 2017, none of which were the software's fault, legally speaking.

Cruise Automation has been using a fleet of self-driving Chevrolet Bolts to log autonomous miles in an urban environment since GM purchased the company for more than $1 billion in 2016. When you're trying to disrupt personal transportation as we know it and develop a new technology standard, there are bound to be a few incidents.

But this hybrid model of humans and algorithms sharing the road is more complex than simply apportioning blame based on the law, isn't it? None of the 22 incidents involving GM Cruise's fleet were serious, but a majority of them were caused by a fundamental difference in the way autonomous and human drivers react.



In June, an autonomous Bolt traveling at 7 mph on Van Ness Avenue "decelerated" in response to a bus pulling away from the curb ahead, which led to a white minivan running into the back of it.

On September 18, while the AV was accelerating away from a light, a vehicle in the right lane weaved within its own lane without crossing into the AV's lane. The software responded by abruptly decelerating, and a 1984 BMW 633 CSi that was also accelerating toward the intersection rear-ended it.

On October 12, a Bolt AV was startled by a pedestrian on the sidewalk who was approaching the crosswalk while browsing their smartphone. As the car crossed into the intersection, the algorithm decided to decelerate immediately in case the person jumped into oncoming traffic; as a result, the Toyota Corolla following it hit it from behind.

Six days later, a scooter merging from a right turn lane in front of an autonomous Bolt caused the car to stop in the middle of an intersection, resulting in a collision with an oncoming Subaru Impreza, which had already begun turning left in anticipation of the AV clearing the intersection.

Then, on December 7, while crawling along in heavy traffic, a Bolt AV decided to merge left after identifying a gap in traffic between a minivan and a sedan. However, halfway through the move, the minivan slowed slightly, causing the AV to abort the lane change and return to the center lane, which resulted in a collision with a motorcycle that was lane splitting between the left and center lanes. The motorcyclist was deemed at fault for attempting to pass under unsafe conditions.



You may have noticed a common theme emerging: silly humans keep crashing into these poor autonomous vehicles, which are only out here trying to virtue signal us into a newer, better, safer, traffic-free future where we'll all be free to mainline media and commerce during our commutes, generating a trail of highly collectible digital data 24 hours a day, 7 days a week, no matter where we go. But I digress.

According to an analysis of autonomous vehicle accident reports published by a group of researchers from San Jose State University, an autonomous car is twice as likely to get rear-ended as a car operated by a human being. It's a side effect of the autonomous driving model, which amounts to: don't move unless it's painfully safe, stop immediately if you're worried, and, above all, don't hurt anyone.
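That stop-first bias is easy to caricature in code. The sketch below is purely illustrative, not Cruise's actual planner: the risk-score idea, the `plan_accel` function, and every threshold in it are hypothetical, but they capture the asymmetry the researchers describe, where braking is cheap and forward progress demands near-certainty.

```python
# Illustrative sketch of a "stop-first" planning bias.
# NOT Cruise's planner; every name and threshold here is hypothetical.

HARD_BRAKE = -4.0    # m/s^2, assumed hard-braking command
GENTLE_ACCEL = 1.0   # m/s^2, assumed gentle acceleration

def plan_accel(risk_score: float,
               painfully_safe: float = 0.05,
               worried: float = 0.30) -> float:
    """Map a perceived-risk score in [0, 1] to a longitudinal command.

    The asymmetry is the point: almost any ambiguity triggers an
    abrupt stop, while moving forward requires near-certainty. A
    trailing human driver rarely expects the car ahead to brake this
    readily, hence the rear-end collisions.
    """
    if risk_score >= worried:         # anything ambiguous: brake hard
        return HARD_BRAKE
    if risk_score <= painfully_safe:  # creep forward only when nearly certain
        return GENTLE_ACCEL
    return 0.0                        # otherwise coast and wait

# A pedestrian glancing up from a phone near a crosswalk might score 0.4,
# producing a hard stop where most humans would just cover the brake.
print(plan_accel(0.4))  # -4.0
```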

The study analyzed autonomous vehicle accident data reported between September 2014 and March 2017, and the results will either reinforce your conviction that autonomous cars are stupid or fuel your evangelical belief that people are incapable of operating motor vehicles.

Rear-end collisions turned out to be the most common accident type for humans and algorithms alike; even so, 62 percent of autonomous car accidents were rear-end fender benders, double the probability a human driver has of being rear-ended. Ostensibly, this is a side effect of autonomous behavior, where abrupt deceleration in the face of seemingly unpredictable behavior is the norm.

In 22 out of the 26 reported accidents, the software was found to be not at fault; however, across all of the accidents, the AV managed to detect and avoid an imminent collision only 3 times. It's rather difficult to avoid a crash caused by your own abrupt deceleration, but that's not what the headlines say; they say impending doom will befall us if we don't get humans out from behind the wheel.
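For scale, here is what those raw counts work out to, using only the figures quoted above:

```python
# Shares implied by the study's raw counts (taken from the text above).
reported = 26
not_at_fault = 22
avoided = 3

print(f"not at fault:  {not_at_fault / reported:.0%}")  # 85%
print(f"crash avoided: {avoided / reported:.0%}")       # 12%
```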



94 percent of all automobile accidents are caused by human drivers. That statistic shouldn't really shock you, because humans are currently the only entities operating motor vehicles. Who else could be at fault?

But the study's findings become more interesting when looking at accident frequency per mile driven. The mean mileage for cars driven by humans before encountering an accident is 500,000 miles; compare that with 42,017 miles for self-driving cars. Clearly, not all of us are as bad at this driving stuff as some would lead you to believe.
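Turn those means into rates and the gap is stark. This back-of-envelope uses only the two figures from the study:

```python
# Accidents per million miles, from the mean-miles-between-accidents
# figures quoted above.
human_miles_per_accident = 500_000
av_miles_per_accident = 42_017

human_rate = 1e6 / human_miles_per_accident  # 2.0 per million miles
av_rate = 1e6 / av_miles_per_accident        # ~23.8 per million miles

print(f"human: {human_rate:.1f} accidents per million miles")
print(f"AV:    {av_rate:.1f} accidents per million miles")
print(f"ratio: {av_rate / human_rate:.1f}x")  # ~11.9x more frequent
```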

It's already been taken as absolute fact that self-driving cars will save lives, but the truth is we simply don't have the data yet. The study claims a fleet of 100 vehicles would need to be driven accident-free, 24/7, for 12.5 years in order to accurately estimate acceptable fatality rates.
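To see why that benchmark is so demanding, here's a rough estimate of the mileage it implies. The 25 mph average speed is my assumption for illustration, not a figure from the study:

```python
# Mileage implied by "100 vehicles, driven 24/7, for 12.5 years".
fleet_size = 100
years = 12.5
avg_speed_mph = 25  # assumed urban average; not from the study

hours = years * 365 * 24                          # 109,500 hours per vehicle
total_miles = fleet_size * hours * avg_speed_mph
print(f"~{total_miles / 1e6:.0f} million accident-free miles")  # ~274 million
```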

The problem is, the algorithms aren't exactly getting better the more miles they drive: the researchers also concluded that the number of accidents observed had a significantly high correlation with the number of autonomous miles traveled, with no plateau in sight. That leaves GM's plan to launch fleets of fully autonomous robo-taxis in dense urban environments by 2019 looking borderline disingenuous, and Mary Barra's vision of a crash-free future sounding like a cash grab.

But it's not about saving lives; it's about increasing consumption, positioned under the guise of convenience. So please stop asking those of us who value the greatest asset to our personal mobility to simply give it up just so you can shop for shoes and share dank memes on your morning commute.