Thought Leadership
By Erin O'Leary, Esq.

Self-Driving Cars: Pedal to the Metal—or Time to Tap the Brakes?


The House Energy and Commerce Committee (“E&C”) projects that self-driving cars will reduce traffic fatalities by 90%, saving 30,000 lives per year.  This is perhaps not surprising given that 94% of motor vehicle accidents are attributable to human error, according to the National Highway Traffic Safety Administration (NHTSA). 

On September 6, 2017, the House passed H.R. 3388, known as the SELF DRIVE Act, with widespread bipartisan support, evincing Congressional intent to promote the development and deployment of self-driving – or “autonomous” – vehicles as soon as possible.  The bill allows entities developing self-driving cars and/or automation systems to test up to 100,000 such vehicles each year.

It also makes clear that the federal government intends to preempt state safety regulations with respect to self-driving cars, preventing a patchwork of potentially inconsistent rules from state to state.

It also allows exemptions from federal crashworthiness standards in increasing numbers, up to 100,000 vehicles per year, for entities developing and selling self-driving cars or automation systems.  Such an exemption is essential to continued testing and development, considering that such vehicles may lack critical components found in conventional cars, such as brake pedals and steering wheels.

It provides that within 24 months following enactment, the Secretary of Transportation must promulgate a final rule requiring the submission of safety assessment certificates regarding how safety is being addressed by each entity developing a highly automated vehicle or an automated driving system.  The Secretary must also make available to the public and to the appropriate House and Senate Committees a “rulemaking and safety priority plan, as necessary to accommodate the development and deployment of highly automated vehicles and to ensure the safety and security of highly automated vehicles and motor vehicles that will share the roads with highly automated vehicles ….”  In the meantime, however, such entities may continue to develop self-driving vehicles and automation systems, provided they submit safety assessment letters to NHTSA.

Although Congress appears eager to encourage the testing and deployment of these autonomous vehicles and automation systems, these technologies will have an undeniable but as-yet-unknown legal impact on a number of industries.  Consider the following examples from the civil and criminal fields:

Liability for Motor Vehicle Torts

If autonomous vehicles do indeed achieve their intended purpose – improving highway safety and reducing congestion – then we would expect to see a concomitant drop in motor vehicle accidents.  The bill provides that “compliance with a motor vehicle safety standard prescribed in this chapter does not exempt a person from liability at common law.”  Nor are common law claims preempted under the bill.  But with the element of human error removed – at least in theory – what is the role of the human “driver”?  And if a human “driver” is indeed found negligent for a collision involving a self-driving vehicle, would the third-party liability insurer have a right of subrogation against the manufacturer?  Will there be a proportionate increase in products liability claims against manufacturers and automation developers for motor vehicle collisions in light of the theoretical elimination of human error?

Products liability law in the automobile context has typically centered on the manufacturer of the motor vehicle itself, as well as any relevant component manufacturers.  We would certainly expect to see automation systems developers included among products liability defendants, but how far will liability extend for both products liability and other traditional common-law torts?  For example, will coders and other technical support personnel face products liability suits for design defects in their software?  Will these entities face negligence claims for failing to timely fix software bugs?  A suit against a programmer claiming millions of dollars for wrongful death certainly seems within the realm of possibility, if not probability.

Cybersecurity

Entities that sell or “introduce … into commerce” self-driving cars must have a cybersecurity plan in place for detecting and responding to “cyber attacks, unauthorized intrusions, and false and spurious messages or vehicle control commands.”

When the software in question is the “driver” of an automobile, a security breach may be a matter of life or death.  Imagine a scenario in which a third party has illegally accessed a self-driving car’s automation system and taken control with the intention of doing harm.  If loss of life, injury, or property damage does indeed result, as the third party intended, what insulation from criminal prosecution does the human “driver” of such a vehicle have, if any?  Would the “driver” likewise be shielded from allegations of gross negligence or recklessness?

In short, while Congress and many industries appear eager to fast-track self-driving cars to market, the impact of such vehicles remains largely unknown.  Development of jurisprudence involving these vehicles is likely to remain a work in progress for the foreseeable future.