
Officials Address Self-Driving Car Liability in Executive Orders


Late last year we reported on an appeal from an Under Secretary at the United States Department of Transportation for better testing of self-driving cars. We may be a decade or more away from a future in which all cars drive themselves, but we are likely only a few years away from having millions of cars on the road that operate in a fully autonomous mode at least some of the time. These cars and trucks will, of course, be sharing the road with millions of other vehicles piloted by real, live humans. Once all cars are fully autonomous, communicating with one another and perhaps even with the road itself, traffic accidents could become a thing of the past. Until then, however, the mix of human-driven autos and auto-driven cars is going to cause its fair share of crashes. How will state legislatures, Congress, government agencies, and the courts address issues of self-driving car liability?

Driverless Cars Are Not Perfect at Crash Avoidance

Fatal accidents have already happened with cars operating in an autonomous mode. In 2016, a Tesla on Autopilot collided with a tractor-trailer, underriding the trailer and killing the Tesla's driver. In March 2018, a Tesla on Autopilot struck a median barrier and was then hit by two other vehicles; the driver later died of his injuries at the hospital. That same month, a pedestrian was struck and killed by an autonomous vehicle (AV) while walking her bicycle across the street. There have been several other crashes to date, including one as recently as February 2019 in New Jersey. While these accidents have so far been few, their number may grow as more AV cars take to the road. Who is liable in an accident involving an AV car?

One good starting place in any accident is to look at the facts: Who had the right-of-way at the intersection? How fast were the cars going relative to traffic conditions? Did the vehicle signal the lane change before making the maneuver? The rules of the road still apply in any car accident, whether it's AV v. human driver, AV v. AV, or plain old carbon-based life forms getting into a wreck.

A fully autonomous vehicle may not need a human operator involved in any part of the driving process other than inputting a destination, and future cars may even be designed without a means for a human driver to take over driving functions. But for now, there’s still a steering wheel and a brake, and states that allow AV cars on the road also require a human behind the wheel paying attention and in charge of the car. These laws generally place the ultimate responsibility on that human driver, even if the crash occurs in full AV mode. Of course, the driver in such an instance may strongly protest that a glitch in the system was responsible for the wreck, turning a car accident personal injury case against the driver into a products liability or product defect case against the manufacturer.

States Are Responding with Laws on Driverless Car Liability

Fewer than a dozen states have yet to adopt statutes or executive orders regarding autonomous vehicles, according to sources such as the Brookings Institution and the National Conference of State Legislatures. Texas HB 119, introduced in January, would require accident report forms to include a way to indicate whether an automated motor vehicle was at fault or otherwise involved in the accident, paving the way for accident investigators to make an early call on liability in a crash. Laws like this one wrestle with legal issues that have an almost metaphysical feel, such as defining who (or what) is the driver. The driver could be the AV system itself, the human sitting behind the wheel, or, in the case of an autonomous fleet of semi-trucks, a remote operator hundreds of miles away. All of these ideas are floating around in different states. Could uniform federal rules be far behind?
