Autonomous Car Accidents and Policy Implications – Special Lawsuit Edition

Welcome to Autonomous Car Accidents and Policy Implications.  Today we look less at a collision and more at the fallout of the decisions that likely caused them.  Pausing for a moment between the best-known Tesla collision of 2016 and the collisions of 2017, we examine an attempted class-action lawsuit which did a bit of financial damage but barely scratched Tesla’s publicity machine.

When we left off, there had been a Tesla fatality which was ruled driver error.  Investigations had been completed, and Tesla had rolled out an update that required drivers to keep their hands on the wheel more consistently; if they ignored the warnings, Autopilot would disengage and refuse to re-engage until the car had come to a complete stop.  So far, so good.  The problem for Tesla was that while regulators were effectively unable to act (lacking the authority to command) and investigators always returned ‘driver at fault’ (since all the warnings were in the Tesla manual), the customers were beginning to get restive.
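To make that enforcement concrete, here is a minimal sketch of the lockout behaviour described above, written as a toy state machine.  The class, the state names, and the 30-second threshold are all my assumptions for illustration; this is not Tesla’s actual implementation.

```python
from enum import Enum, auto


class ApState(Enum):
    ENGAGED = auto()
    WARNING = auto()
    LOCKED_OUT = auto()  # Autopilot refused until the car fully stops


class AutopilotInterlock:
    """Toy model of the post-update hands-on-wheel enforcement."""

    MAX_HANDS_OFF_SECONDS = 30  # assumed threshold, not Tesla's real value

    def __init__(self) -> None:
        self.state = ApState.ENGAGED
        self.hands_off_timer = 0.0

    def tick(self, dt: float, hands_on_wheel: bool) -> ApState:
        """Advance the timer by dt seconds and update the warning state."""
        if self.state == ApState.LOCKED_OUT:
            return self.state
        if hands_on_wheel:
            self.hands_off_timer = 0.0
            self.state = ApState.ENGAGED
        else:
            self.hands_off_timer += dt
            if self.hands_off_timer > self.MAX_HANDS_OFF_SECONDS:
                # Driver ignored the nags: disable Autopilot for this drive.
                self.state = ApState.LOCKED_OUT
            else:
                self.state = ApState.WARNING  # visual/audible nag
        return self.state

    def try_reengage(self, vehicle_speed_mph: float) -> bool:
        # The update's key change: after a lockout, Autopilot only becomes
        # available again once the car has come to a complete stop.
        if self.state == ApState.LOCKED_OUT and vehicle_speed_mph == 0:
            self.state = ApState.ENGAGED
            self.hands_off_timer = 0.0
            return True
        return False
```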

During the interim between the problems being identified and new software and hardware being introduced to fix them, Tesla had been forced to reduce Autopilot’s maximum speed to 45 mph in January of 2017, raising it incrementally back to 80 mph by March.  These tweaks seemed to be the last straw for Tesla owners, who were starting to feel less like owners of cars and more like unpaid experimental testers – which is pretty much what they were.  Tesla runs its software division like any Agile software company: it tests new software on the users’ machines, sees how it works, and tweaks based on feedback.  The price of being an early adopter is that the tweaks are usually more extensive, and the system often runs in a degraded state while safety-critical updates are being tested.  More on that idea when we discuss Uber’s AV and its ‘free’ rides.
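As an illustration of that Agile, fleet-wide approach, here is a minimal sketch of a staged over-the-air rollout.  Everything here (the stage fractions, the intermediate speed caps, the VIN-hashing scheme) is a hypothetical construction echoing the 45-to-80 mph ramp, not Tesla’s real deployment pipeline.

```python
import hashlib
from dataclasses import dataclass


@dataclass
class RolloutStage:
    fraction: float        # share of the fleet that gets the raised cap
    max_ap_speed_mph: int  # Autopilot speed cap being trialled


BASELINE_CAP_MPH = 45  # the January 2017 cap everyone starts from

# Illustrative schedule echoing the 45-to-80 mph ramp described above.
STAGES = [
    RolloutStage(fraction=0.05, max_ap_speed_mph=55),  # cautious canary
    RolloutStage(fraction=0.25, max_ap_speed_mph=65),
    RolloutStage(fraction=1.00, max_ap_speed_mph=80),  # full fleet
]


def fleet_bucket(vin: str) -> float:
    """Map a VIN to a stable pseudo-random value in [0, 1)."""
    digest = hashlib.sha256(vin.encode()).digest()
    return int.from_bytes(digest[:4], "big") / 2**32


def speed_cap_for(vin: str, stage: RolloutStage) -> int:
    """Vehicles inside the stage's fraction get the raised cap; the rest
    stay on the baseline until telemetry justifies widening the rollout."""
    if fleet_bucket(vin) < stage.fraction:
        return stage.max_ap_speed_mph
    return BASELINE_CAP_MPH
```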

The lawsuit was an attempt to form a class action alleging that owners and lessees of Tesla cars had paid a premium price for a car with AV capabilities, and that the capabilities in question were either nonfunctional or dangerous.  It should be noted that the advertisement they used as an example was for HW2, meaning that their complaint covered cars manufactured after the one involved in the fatality explored in my previous post.  The lawsuit focused on the $5,000 cost of unlocking the Autopilot software when ordered with the car, or $6,000 after delivery.

The gist was that the plaintiffs were accusing Tesla of deceptive advertising.  Tesla, they claimed, had promised that version 2 of Autopilot (AP2) would be ready by the end of 2016 while knowing that it couldn’t be.  Further allegations were that the software Tesla pushed was dangerous and far less capable than version 1, a gap Tesla allegedly covered for by having customers test-drive version 1 Autopilot cars and then telling them that AP2 would be much better than the experience they had just had.

Let’s go allegation by allegation now.

First allegation: AP2 at the time of rollout was dangerous.  The plaintiffs cited several tech journalists who owned one or more AP2 Teslas that swerved unexpectedly, merged across yellow lines, braked hard at bridges after mistaking them for obstacles, and didn’t brake at all at red lights.  It’s unclear whether these examples were cherry-picked or indicative of the full experience (let’s face it, an article showing how bad a Tesla is will get more views than one showing a good experience).  The allegation that AP2’s software had difficulties was supported in the lawsuit by the fact that only two of the eight cameras were being used by the software at the time, and that AP2 had been forced to start from scratch because Tesla had lost the rights to AP1.  Quoting from the lawsuit, which in turn quoted a contemporary article:

Mobileye, the Israeli company that supplied the original camera and software for Autopilot, cited safety concerns when it pulled out of its partnership with Tesla. The company’s chief technology officer told Reuters that Tesla was “pushing the envelope in terms of safety … [Autopilot] is not designed to cover all possible crash situations in a safe manner.” Tesla says the collaboration ended for commercial reasons.

The complaint also cited numerous incidents and issues that the plaintiffs themselves had experienced, including a lack of collision warnings, veering into the wrong lane even at low speeds, and braking with no warning or apparent cause.

Second allegation: the marketing materials and practices were intentionally deceptive.  This is a harder one to prove, since it relied on the plaintiffs’ own interpretation of their experiences.  The lawsuit, then, would have to demonstrate that any reasonable person would have believed as the plaintiffs did:

  1. That the AP2 update would come as one rollout in December 2016, with full functionality in a single package.
  2. That AP2 was built on AP1’s software rather than being a replacement written from scratch.
  3. That when activated, AP2 would be safe to use in its intended environment.
  4. That (in the case of one plaintiff) this functionality would amount to a fully functional AV (Level 3 or above).

Third allegation: Tesla violated the Motor Vehicle Safety Act.  The concept was that an AV system with demonstrated and consistent faults constitutes a safety defect under the Act.  Tesla was never charged under the Act, and the author of this blog believes this is because, at the time of the lawsuit, there was no regulatory body empowered to oversee the software portion of AVs.  Defects had to impact a car being driven by a human, or else they were outside the remit of the regulators.  That’s why all the ‘driver error’ findings from investigations have effectively cleared automakers of liability.

Tesla, as might be expected, denied all claims at the time.  Its defense was, in brief, that it had not promised the full package by December 2016, only to start rolling out updates at that time.  It claimed that the safety concerns were sensationalized, and that several of the functionalities described as missing had in fact been provided.  It said that the systems were a beta test, and that drivers should not ‘abdicate responsibility’ to the Autopilot.  This year Tesla settled for $5.4M, covering all owners of AP2 cars, with payouts ranging from $20 to $280 based on how long each owner had waited for the delayed functions; the judge and attorneys said they were confident that by September 2017, all of the main functions had been implemented.  Tesla continues to claim that there was no wrongdoing in its practices, and the settlement effectively ends any private investigations into the matter.
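For a sense of how such a payout scale might work, here is a back-of-the-envelope sketch.  The $20 and $280 bounds come from the settlement as reported above; the linear proration and the nine-month window (December 2016 to September 2017) are my assumptions, not the actual claims schedule.

```python
def settlement_payout(months_waited: float,
                      min_payout: float = 20.0,
                      max_payout: float = 280.0,
                      max_wait_months: float = 9.0) -> float:
    """Hypothetical linear proration between the reported $20 and $280
    bounds; the real claims schedule was set by the settlement terms.
    max_wait_months assumes the Dec 2016 - Sep 2017 window in the post."""
    months = max(0.0, min(months_waited, max_wait_months))
    share = months / max_wait_months
    return round(min_payout + share * (max_payout - min_payout), 2)


# e.g. an owner who waited the full window: settlement_payout(9) -> 280.0
# a later buyer who waited three months: settlement_payout(3) -> 106.67
```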

Here’s my take: this wasn’t about deceptive advertising; it was about the boundaries of live testing of software functions.  As mentioned earlier in this post, it’s common practice to release software incrementally and let live user experience contribute to testing its effectiveness.  Machine learning algorithms require enormous data sets to function well, so there’s an added incentive to roll out Autopilot patches and functions even if they’re not quite ready, because doing so feeds the algorithm what it needs to work well in the future.  What’s happening here, however, is that this practice isn’t being used in a web app or a program on a personal computer, but in a safety-critical system.  If software on your computer has a bug or something isn’t working quite right, you send a complaint or a log to the developer, and they use that information in their next patch to fix it.
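That feedback loop is easy to sketch in code.  Below is a toy model of how fleet disengagement reports could become training data; the class names, fields, and the 10,000-example threshold are all hypothetical, chosen only to illustrate why a wider rollout accelerates the next model.

```python
from dataclasses import dataclass, field


@dataclass
class DisengagementEvent:
    """One fleet report: the driver overrode Autopilot, and the sensor
    snapshot around that moment becomes a labelled training example."""
    sensor_snapshot: bytes  # camera/radar frames around the event
    driver_action: str      # e.g. "braked_hard", "steered_left"


@dataclass
class FleetLearner:
    dataset: list = field(default_factory=list)

    def ingest(self, event: DisengagementEvent) -> None:
        # Every correction a driver makes is, in effect, free labelling
        # work: "here is a situation the current model got wrong."
        self.dataset.append((event.sensor_snapshot, event.driver_action))

    def ready_to_retrain(self, threshold: int = 10_000) -> bool:
        # The wider the rollout, the faster this threshold is reached,
        # which is the incentive to ship early described above.
        return len(self.dataset) >= threshold
```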

This lawsuit was an attempt to say ‘not here’: that live testing to fuel incremental and algorithmic software growth won’t work, because cars are too dangerous.  The lawsuit was a cry to the authorities to do something, arguing that the average consumer wasn’t technically aware enough to understand what Tesla was saying, and that buying software to be beta-tested by early adopters was not the right model for this industry.  Consumers who buy alphas and betas of games on Steam expect issues and are there to provide feedback on them, but there is no such expectation in the car industry.  That was what the lawsuit was really about, and it failed.

This is, long term, probably a good thing.  It’s also very much a short-term bad thing.  That’s technology, though.  The problem is that it’s expensive to exhaustively test software this complex, especially software heavily based on current ML algorithms.  Tesla hasn’t got the deep pockets of Google (which has been testing its AVs since before Tesla Autopilot version 1 was released, and won’t be starting its AV taxi service until 2019) or the established consumer base of a big automaker (I’ll be covering Cadillac’s new AV in a future post).  That’s why Tesla and Uber have to use the classic Agile incremental approach to software updates.  Both rely on automotive innovation to function and can afford neither to look stale by letting another company (GM for Tesla, Google for Uber) introduce its product first, nor to sink millions into comprehensive testing and data collection that could take years and still leave them with fewer functions.

The message from this settlement is clear but impermanent: you’re buying your car from a Silicon Valley-style innovation engine, so expect it to act like one.  All new functions are beta, and all users are testers.  You’re getting it early, and that prestige comes with a price: it’s your data that will be used to build the next version and the one after that, with older functions becoming more stable and newer functions becoming possible.  Eyes forward and hands on the wheel, because if something happens, it’s only a Level 2 AV and you’d better be ready to take over.

Speaking of expectations and knowledge, join me next time for a collision that strained the argument of ‘unsophisticated and ignorant’ driver error further than ever.
