Autonomous vehicles have been talked about extensively during the past few years. Early on, fully autonomous vehicles that could drive themselves without any input from a driver were promised by 2020. It’s 2021, and we’re not there yet. Testing centers have been established in a number of cities, and some small-scale deployments have begun.
Houston has a pilot autonomous delivery service that has partnered with Kroger and CVS to deliver groceries or prescriptions in certain ZIP codes. Either an autonomous Toyota Prius or a custom-built driverless vehicle drives to the customer’s residence, where the customer retrieves their groceries or medications. The vehicles are provided by Nuro, a robotics company that aims to improve everyday life through robotics and artificial intelligence.
In December 2020, Nuro became the first company to receive a permit from the California Department of Motor Vehicles to deploy autonomous vehicles on public streets. The vehicles are allowed to operate commercially in two counties, and once Nuro establishes a partnership there, autonomous product delivery will begin.
The vehicles, called R2, have been tested in three different states with no drivers, occupants or chase cars. They are small and designed to carry packages, not passengers. A number of states have made it legal for delivery robots to share streets and sidewalks with people, including Pennsylvania, Virginia, Idaho, Florida and Wisconsin, as well as Washington, D.C.
Tesla calls its driver assistance system Autopilot and released a beta version of its Full Self-Driving, or FSD, technology last October. However, the company states that neither is an autonomous system and that drivers must remain fully attentive with their hands on the wheel, prepared to take over full control at any moment. While there are plans for the technology to progress over time to full autonomy, those capabilities do not yet exist, and the names are confusing at best.
This is all well and good, but one large question remains: how does insurance work with an autonomous vehicle? The issue, of course, is the assignment of liability in the event of an accident. Once vehicles truly are autonomous and capable of navigating along the roads without a driver paying attention or even being present in some cases, what happens if the vehicle is involved in an accident?
Assigning liability in self-driving car crashes
Recently, Waymo conducted an interesting study using its autonomous vehicles. It took 72 fatal crash scenarios and recreated the events with its autonomous vehicles substituted, in separate simulations, for the vehicle that initiated the crash and for the vehicle that had to respond to the initiator’s driving behavior.
The company ran 91 simulations, and in most instances, the vehicles didn’t crash. The responding vehicle avoided crashes in 32 simulations, and in 20 of those, it didn’t even need a significant evasive maneuver to avoid an accident.
One scenario also recreated the original situation: an intoxicated driver ran a red light at 107 mph and struck another vehicle that was traveling at 35 mph, killing that vehicle’s driver. When a Waymo vehicle was substituted for the initiator, it neither exceeded the speed limit nor ran the red light. When substituted for the responder, the Waymo vehicle recognized the speeding red-light runner in time to get out of the way and avoid the fatality that occurred in real life. The study shows that autonomous vehicles can successfully respond to the bad behavior of other drivers.
Determining fault will be the first course of action. Who is at fault if an autonomous vehicle malfunctions, and how is that verified? Some people are sure to blame the vehicle when they were actually driving, in order to avoid being charged with the accident. We’re well aware of phantom vehicles that supposedly force drivers off the road when in reality the driver wasn’t paying attention but doesn’t want the accident held against them, increasing their premium. It’s easy to see how people who aren’t paying attention while in control of their autonomous vehicle may blame an accident on the vehicle instead of accepting responsibility.
It’s going to be important for the vehicles to reliably record who, or what, is in control at any given time, and for the details of an accident to be recorded: not just the speed of the vehicle, but whether the autonomous technology was in control and whether it was functioning correctly.
Many states have passed legislation or executive orders related to the testing and development of autonomous vehicles. These measures address licensing, registration, insurance, traffic regulation, and owner and operator responsibilities for owning and operating autonomous vehicles. Once vehicles are fully autonomous, will liability even still be an issue? Will it become like buying a lawnmower, toaster oven or another piece of equipment, where, if it malfunctions, the manufacturer is responsible for any injuries or damages? With enough autonomous vehicles on the road, it could become an option.
There will be plenty of growing pains in the meantime. Until the technology matures, we will have to sort out liability amid evolving systems and varying levels of public understanding of how they work and how to use them. There is far more at stake in understanding how an autonomous or semi-autonomous vehicle works than in programming the television. Only time will tell.