By Eric B. Krauss on October 11, 2017
The American public cannot get enough of robots. From science fiction to scientific fact, robots are part of our daily news and entertainment. This past year, at least two television series set in the near future and centered on robots (“Westworld” on HBO and “Humans” on AMC) have captured viewers’ imaginations, and last week the Blade Runner sequel, Blade Runner 2049, opened at theaters nationwide. Indeed, the relationship between people and robots has been a film subject at least since Fritz Lang’s silent film Metropolis in 1927. That continuing popularity reflects our love-hate relationship with automation.
We devour reports of robotics gone awry, whether with dismay or with glee. Siri and Alexa can turn off the lights, turn up the heat, and start our washing machines. But what happens when things go wrong? What legal rights and remedies do we have? So far, not many beyond existing product liability law. H.R. 3388, known as the SELF DRIVE Act, which passed the House in September 2017, and the Senate’s pending autonomous vehicle bill, the AV START Act (S. 1885), both prioritize the development of driverless vehicles while leaving many safety issues unaddressed.
The recent National Transportation Safety Board (NTSB) report on the May 2016 Tesla Model S accident concluded that both human error and the design of Tesla’s Autopilot system contributed to the deadly crash.[1] The NTSB found that the driver’s pattern of use of the Autopilot system indicated “an overreliance on the automation and a lack of understanding of system limitations,” and that “the system was not designed to, and did not, identify the truck crossing the car’s path or recognize the impending crash.” For the final six minutes before the crash, the driver did not have his hands on the steering wheel, contrary to Tesla’s instruction to keep hands on the wheel at all times. Such directions, however, are easily disregarded, much like warnings not to use smartphones or text while driving.
The NTSB further found that the design of the Autopilot system permitted disengagement from driving and use in a manner inconsistent with Tesla’s warnings on the instrument panel and instructions in the owner’s manual. “If automated vehicle control systems do not automatically restrict their own operation to those conditions for which they were designed and are appropriate, the risk of driver misuse remains,” it said.
The NTSB determined that the probable cause of the accident was the truck driver’s failure to yield the right of way to the car, combined with the car driver’s inattention due to overreliance on vehicle automation, which resulted in the car driver’s lack of reaction to the truck. “Contributing to the car driver’s overreliance on vehicle automation was its operational design, which permitted his prolonged disengagement from the driving task and his use of the automation in ways inconsistent with guidance and warnings from the manufacturer,” it concluded.
In its report, the NTSB issued new safety recommendations to the Department of Transportation, the National Highway Traffic Safety Administration (NHTSA), the manufacturers of vehicles equipped with Level 2 automated vehicle control systems (Audi, BMW, Infiniti, Mercedes-Benz, Tesla, Volkswagen, and Volvo), and the Alliance of Automobile Manufacturers and Global Automakers. Among the recommendations: incorporate safeguards that limit the use of automated vehicle control systems to the conditions for which they were designed, and develop applications to more effectively sense the driver’s level of engagement while those systems are in use.
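As a rough illustration of what those two recommendations might look like in software terms, consider the minimal sketch below. Every name in it (the OddLimits and VehicleState classes, the hands-off threshold, the speed and road-type conditions) is invented for this example; it does not reflect Tesla’s or any other manufacturer’s actual implementation.

```python
# Hypothetical sketch: gate an automated driving mode on (1) the operational
# design domain (ODD) the system was designed for and (2) sensed driver
# engagement. All classes, fields, and thresholds are illustrative only.

from dataclasses import dataclass


@dataclass
class OddLimits:
    """Conditions the automation was designed for (the ODD)."""
    divided_highway_required: bool = True
    max_speed_mph: float = 65.0


@dataclass
class VehicleState:
    on_divided_highway: bool
    speed_mph: float
    seconds_hands_off_wheel: float


def autopilot_may_engage(state: VehicleState, odd: OddLimits,
                         hands_off_limit_s: float = 15.0) -> bool:
    """Allow automation only within the ODD and with an engaged driver.

    Mirrors the NTSB's two recommendations: restrict operation to
    designed-for conditions, and sense the driver's level of engagement.
    """
    within_odd = ((state.on_divided_highway
                   or not odd.divided_highway_required)
                  and state.speed_mph <= odd.max_speed_mph)
    driver_engaged = state.seconds_hands_off_wheel < hands_off_limit_s
    return within_odd and driver_engaged


# A driver with hands off the wheel for six minutes (360 s), as in the
# Williston crash, fails the engagement check regardless of road conditions.
print(autopilot_may_engage(
    VehicleState(on_divided_highway=True, speed_mph=60.0,
                 seconds_hands_off_wheel=360.0),
    OddLimits()))  # -> False
```

The point of the sketch is simply that both checks are conjunctive: satisfying the design conditions alone is not enough if the system cannot also confirm the driver remains engaged.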
The Tesla Autopilot accident illustrates the complexities that arise from human interaction with automated systems. Are we so comfortable with robotics that we are content to rely on it to safeguard life and limb without a second thought? Tesla’s new Model 3, which has a current backlog of 450,000 orders, is advertised as having “the hardware needed for full self-driving capability at a safety level substantially greater than that of a human driver.”
The fatal accident and government investigations have not deterred Tesla from offering “full self-driving capability” and advertising it as a selling point. The law inevitably will have to evolve to address the legal and liability issues arising from the relationship between humans and robotics.[2]
[1] “Collision Between a Car Operating With Automated Vehicle Control Systems and a Tractor-Semitrailer Truck Near Williston, Florida, May 7, 2016,” Accident Report NTSB/HAR-17/02, PB2017-102600, Adopted September 12, 2017, https://www.ntsb.gov/investigations/AccidentReports/Reports/HAR1702.pdf
[2] Another issue is the security of autonomous vehicle systems; e.g., security researchers were able to hack into a standard Jeep Cherokee over the Internet and control its steering, brakes, accelerator, and other functions. See https://youtu.be/MK0SrxBC1xs.