Autonomous Vehicles Decide Fate of Riders

By: Neal Kisor, News Writer

Source: kitguru.net

Self-driving cars are causing a stir in the world of transportation. They are projected to save tens of thousands of lives each year, create smoother-flowing traffic, cut down on commute times, and give the elderly a chance to move freely. However, they also pose a scary question: “What about crashes?”

Undoubtedly, accidents will occur involving self-driving cars. Opponents of autonomous vehicles have pointed to the numerous fender-benders and accidents in which self-driving vehicles have been involved. However, almost every one of those accidents occurred due to negligence or error by human drivers. Often, the self-driving vehicles obey laws that ordinary drivers forget about. For example, Google’s self-driving vehicle caused an injury when it was rear-ended after stopping somewhat suddenly at an intersection. There was heavy traffic at the time, and the car was programmed never to block intersections, which is the law. However, the driver behind the car expected it to continue through the intersection, causing a crash. Another accident occurred when the Google car encountered a congested area caused by storm drain damage. The car attempted to merge into a lane using the “zipper technique”, allowing one car through and then attempting to merge. However, a bus did not slow down or wait for the car to merge, and a crash occurred; no injuries were reported.

Now, self-driving cars have moved into the next step of development, and it has raised a bit of controversy. The cars now determine how a crash will occur, essentially deciding who gets hurt. For example, if a self-driving car is about to be t-boned by a semi-truck at an intersection, the car will swerve reactively. Specifically, the car will swerve toward “the smaller thing” that it can “see”. So, between another car and a small tree on the side of the road, the self-driving vehicle will go for the small tree. This has raised concern that the cars could potentially target people on the sides of the road, or prepare for a crash when a child unexpectedly crosses the road and therefore target the child. Another concern is the supposed sacrifice initiative the cars now have. If a family occupies a self-driving vehicle and is placed in a crash situation, then the family’s fate is the car’s responsibility.
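Manufacturers have not published how this behavior is actually implemented, so the following is only a minimal sketch of the “swerve for the smaller thing” heuristic described above. The Obstacle class, the size estimates, and the choose_swerve_target function are illustrative assumptions, not any carmaker’s real code.

```python
# Hypothetical sketch: when a collision is unavoidable, steer toward the
# smallest detected obstacle. All names and numbers here are assumptions.
from dataclasses import dataclass

@dataclass
class Obstacle:
    label: str
    estimated_size_m2: float  # rough frontal area the sensors "see"

def choose_swerve_target(obstacles: list[Obstacle]) -> Obstacle:
    """Pick the smallest detected obstacle as the swerve target."""
    return min(obstacles, key=lambda o: o.estimated_size_m2)

detected = [Obstacle("oncoming car", 3.2), Obstacle("small tree", 0.6)]
print(choose_swerve_target(detected).label)  # -> "small tree"
```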

The cars will always go for the option that would kill the fewest people. So in a crash scenario between a self-driving car and a school bus, the self-driving car would sacrifice itself. This minimization of casualties is for the “greater good”.
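Again purely as an illustration of the “fewest casualties” rule, and not any manufacturer’s actual logic, a toy version might simply compare estimated casualty counts for each available maneuver. The scenario names and numbers below are made up.

```python
# Hypothetical sketch: choose the maneuver with the lowest estimated casualty
# count. The options and figures are illustrative assumptions only.
def minimize_casualties(options: dict[str, int]) -> str:
    """Return the maneuver whose estimated casualty count is lowest."""
    return min(options, key=options.get)

options = {
    "collide with school bus": 20,                    # people at risk on the bus
    "swerve off road (sacrifice occupants)": 2,       # people in the car itself
}
print(minimize_casualties(options))  # -> "swerve off road (sacrifice occupants)"
```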

In cases such as these, even though self-driving cars are expected to prevent thousands of crashes and casualties on the road, the crashes that do occur will be placed under a spotlight. No one can eliminate car crashes entirely, and no one will be able to force human drivers off the road. Companies will be judged by how their cars handle crashes; they do not want their vehicles presented as rolling death traps, after all. There are only a few ways to realistically test these crash responses, and hopefully the tests won’t prove fatal for riders, drivers, and pedestrians.