LOS ANGELES — By rolling out self-driving technology to consumers more aggressively than its competitors, Tesla Motors secured a spot at the forefront of a coming industry.

But that strategy could expose the company to a risk it has sought to avoid: liability in crashes.

Tesla activated its autopilot mode, which automates steering, braking and lane changes, in 2015. The company asserts the technology doesn’t shift blame for accidents from the driver to the automaker.

But Google, Zoox and other firms seeking to develop autonomous driving software say it’s dangerous to expect people in the driver’s seat to exercise any responsibility. Drivers get lulled into acting like passengers after a few minutes of the car doing most of the work, the companies say, so relying on them to suddenly brake when their cars fail to spot a hazard isn’t a safe bet.

Such a concern could undermine Tesla, whose autopilot feature is central to a fatal-accident investigation recently launched by federal regulators.

The National Highway Traffic Safety Administration is considering the role played by autopilot technology in a Florida collision between a Tesla Model S and a big rig. Tesla said autopilot sensors failed to detect the white truck as it turned in front of the Model S against a bright May sky; the crash killed 40-year-old Joshua Brown.


Were the victim’s family to sue Tesla over an accident caused – or not avoided – by autopilot, one of several arguments they might make is that Tesla acted negligently by not doing what a reasonable manufacturer would do, said Stephen Nichols, an attorney in the Los Angeles office of law firm Polsinelli. The fact that others have developed similar technology but have chosen not to release it, or have branded it in ways that don’t suggest automation, could leave Tesla vulnerable.

“You could say, ‘Tesla, you’re not doing what these other companies are doing, so you’re being unreasonable,'” Nichols said.

Cases about defective product design typically hinge on whether a company sufficiently vetted its wares – in this situation, programming code that interacts with a number of components throughout the car.

If the accident happened because the software was inadequate (because it couldn’t spot the white vehicle on a light backdrop) and proper testing would have found the flaw, Tesla could be on the hook, said Jon Tisdale, a general partner in Gilbert, Kelly, Crowley & Jennett’s Los Angeles office.

The competitive landscape bolsters his contention.

“There’s going to be the argument made that they are rushing to market to corner it before other manufacturers release the product, and that Tesla cut the testing short – ‘they didn’t do it right,'” said Tisdale, who mostly defends product liability cases.


Elon Musk, Tesla’s billionaire chief executive, has said that autopilot mode is a voluntary feature, that drivers are warned of the risks and that testing it with the public makes it safer than testing it solely within the company would. And he has made clear since its release that drivers don’t abdicate responsibility.

“The onus is on the pilot to make sure the autopilot is doing the right thing,” he said in a televised interview in 2013. “We’re not yet at the stage where you can go to sleep and wake up at your destination.”

Consumer advocates say firms that insist on consumer culpability when a machine is in charge don’t understand what’s happening on the roads.

“On the one hand, they’re saying, ‘Trust us, we can drive better than you would.’ But on the other hand, they are saying, ‘If something goes wrong, don’t ask us to stand behind our product,'” said Rosemary Shahan, president of the Consumers for Auto Reliability and Safety lobbying group. “But if it’s controlled by an algorithm, why should you be liable?”

Google, which aims to produce a car with no way for a human to take control, is one of the few companies to take a different stance from the outset. The company says it would be responsible for accidents caused by its software.

Zoox, a Silicon Valley startup, declined to comment about how it views the liability question. But the company also isn’t planning to release technology that would require human intervention.

Shahan said holding companies accountable through lawsuits and regulation might stifle innovation, but it’s a worthwhile trade-off to get them to take more precautions.

“It’s hard enough to not nod off when you are in control, let alone when you’re in autopilot,” she said. “We shouldn’t trade one set of human error for another.”

Brown’s family has said through attorneys that they hope lessons from his crash “will trigger further innovation which enhances the safety of everyone on the roadways.” A decision on whether to file a lawsuit isn’t likely until a federal inquiry is completed, and the family’s focus remains on mourning, they said.

