Tesla 'Self-Driving' NDA Hopes To Hide The Reality Of An Unfinished Product

from the I'm-sorry-Dave-I-can't-do-that dept

There isn’t a day that goes by where Tesla hasn’t found itself in the news for all the wrong reasons. Like last week, when Texas police sued Tesla because one of the company’s vehicles going 70 miles per hour in self-driving mode failed to function properly, injuring five officers.

If you hadn’t been paying attention, Teslas in self-driving mode crashing into emergency vehicles is kind of a thing that happens more than it should. In this latest episode of “let’s test unfinished products on public streets,” the systems of a Tesla in “self-driving” mode completely failed to detect not only the five officers, but also their dog, according to the lawsuit filed against Tesla:

“The Tesla was completely unable to detect the existence of at least four vehicles, six people and a German Shepherd fully stopped in the lane of traffic,” reads the suit. “The Tahoes were declared a total loss. The police officers and the civilian were taken to the hospital, and Canine Officer Kodiak had to visit the vet.”

Of course for Musk fans, a persecution complex is required for club membership, resulting in the belief that this is all one elaborate plot to ruin their good time. That belief structure extends to Musk himself, who can’t fathom that public criticism and media scrutiny in the wake of repeated self-driving scandals is his own fault. It also extends to the NDAs the company apparently forces Tesla owners to sign if they want to be included in the Early Access Program (EAP), a community of Tesla fans the company selects to beta test its unfinished “self-driving” system (technically a “Level 2” driver-assistance system) on public city streets.

The NDA frames the press and transparency as enemies, and urges participants not to share any content online that could make the company look bad, even if it’s, you know, true:

“This NDA, the language of which Motherboard confirmed with multiple beta testers, specifically prohibits EAP members from speaking to the media or giving test rides to the media. It also says: “Do remember that there are a lot of people that want Tesla to fail; Don’t let them mischaracterize your feedback and media posts.” It also encourages EAP members to “share on social media responsibly and selectively…consider sharing fewer videos, and only the ones that you think are interesting or worthy of being shared.”

Here’s the thing: you don’t need to worry about this kind of stuff if you’re fielding a quality, finished product. And contrary to what Musk fans think, people worried about letting fanboys test 5,000-pound automated robots that clearly don’t work very well are coming from a valid place of concern. Clips like this one, for example, which show the Tesla system failing to perform basic navigational functions in self-driving mode, aren’t part of some elaborate conspiracy to make Tesla self-driving look bad and dangerous. There’s plenty of evidence now clearly showing that Tesla self-driving, at least in its current incarnation, often is bad and dangerous:

Ever since the 2018 Uber fatality in Arizona (which revealed the company had few if any meaningful safety protocols in place) it’s been clear that current “self-driving” technology is extremely undercooked. It’s also become increasingly clear that widely testing it on public streets (where other human beings have not consented to being used as Guinea pigs) is not a great idea. Especially if you’re going to replace trained testers with criticism-averse fanboys you’ve carefully selected in the hopes they’ll showcase only the most positive aspects of your products.

We’ve been so bedazzled by purported innovation we’ve buried common sense deep in the back yard. Wanting products to work, and executives to behave ethically, is not some grand conspiracy. It’s a reasonable reaction to the reckless public testing of an unfinished, over-marketed product on public streets.

Companies: tesla


Comments on “Tesla 'Self-Driving' NDA Hopes To Hide The Reality Of An Unfinished Product”

This comment has been deemed insightful by the community.
That Anonymous Coward (profile) says:

Something something the hold my beer race again is unable to learn from its history.
Someone wanna mail Elon a copy of ‘Unsafe at Any Speed’?

NDA might keep drivers from talking, up until they find out that they are being personally held responsible for harming/killing people.
Pretty sure auto insurance companies aren’t listing shitty AI as an authorized driver, which will leave the fanbois in the very very poor house.

If only we had some sort of agency charged with protecting the public on the roads who could do something about this.
But obviously this isn’t as important as the liberal conspiracy to silence conservatives online, so it’s not important enough to pursue.
I mean, they already killed 700K people who didn’t need to die, how many can Teslas manage to kill before they get it fixed?

Ehud Gavron (profile) says:

Re: Re: Re:

"Thanks for admitting negligence!" <– perhaps there was a "to" missing there.

In the race car, the first thing we do after starting is turn off TCS, ABS, and other "helpful computer bits" that mess with track driving.

In the street cars we turn off TCS when four-wheeling or going through deep areas of water or loose sand.

Negligence (failure to use reasonable care) is not "letting the AI drive." It would be failing to consider whether the AI should drive or not.

Sorry, your weird assertion that using FSD is negligent… fails.

E

Sabroni says:

Re: Re: Re: Negligence is not "letting the AI drive"

An ABS system is not a self driving system. A mechanical system that distributes power or manages braking does not decide when to brake, or when to turn. The AI "drives" these cars, it doesn’t shuffle power from one wheel to another.
You don’t know what you’re talking about.

This comment has been deemed insightful by the community.
That Anonymous Coward (profile) says:

Re: Re: Re:

You turned control of the vehicle over to a "beta" AI, that there are a few known issues with… after you signed an NDA that you wouldn’t/couldn’t tell anyone outside of Tesla if the AI decided to plow through a kindergarten playground???

That’s a nice Tesla you have… pity it has to sit in your backyard since no insurance company will cover it.
And even if they improve the AI, it’s going to be a while before any of them will want to touch them.

It ran down 6 people, vehicles, and a dog…
In the other clip the AI decided to pursue humans trying to legally cross the street.

Given the number of assholes doing the sleep thing while the car drives on the freeway, you can’t convince me everyone trying out the full self-driving will actually be paying full attention. Even knowing that other drivers have been screwed over it, they’ll assume they’ll be okay (because it won’t happen to them).

Ehud Gavron (profile) says:

Your right to talk

Tesla made a deal: we’ll give you FSD to beta… you don’t talk about it publicly.

It’s a fair deal.

FSD isn’t there yet. We all know this. What’s the big deal when the beta testers want to say "Hi this beta-test isn’t working right?"
None. Don’t sign it, or if you do, live by the contract you signed.

Hitting 5 cops — definite no-no. But was it FSD or a DUI in progress? The cops hate releasing any exculpatory information, ever, so we won’t find out until it hits a courthouse…

E

Anonymous Coward says:

It seems the self-driving software does not understand police flashing lights, warning signs, or police stop signs; it simply drives into police cars parked on a road. There’s no way the self-driving software is ready for thousands of new drivers to use it on public roads. It just looks for other cars moving, or maybe pedestrians crossing; it has no understanding of a police checkpoint or emergency vehicles parked in the middle of a road. The flashing lights maybe make it harder to determine whether that’s a car or just a traffic light on the road.

Anonymous Coward says:

Re: Re: Re: Re:

What does that even mean? "Not running into objects" is the single most important aspect of safe driving. Even without any knowledge of traffic laws or conventions, you could at least survive most situations by just obeying that rule.

That anyone would program vehicles to outright ignore objects in their path is batshit insane.

Jeremy Lyman (profile) says:

Re: Re: Re: Re:

When cars "ignore" stopped objects, it’s generally because the forward facing radar tells the system that there’s nothing there. Radar is great for judging how fast objects are moving or differences in velocity, but when an object completely stops it blends into the background. Systems that rely too much on radar, trusting it over conflicting sensor inputs, are prone to these types of crash. They were designed to follow highway traffic at speed where, generally speaking, there aren’t parked cars in the road.
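Jeremy’s point about radar and stopped objects is easy to see in a toy model. The sketch below is purely illustrative (none of these names, thresholds, or structures come from Tesla, Ford, or any real driver-assist stack): a fusion policy that filters out "stationary" radar returns as clutter, and trusts radar over a conflicting camera detection, quietly loses a fully stopped car.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class RadarReturn:
    range_m: float        # distance to the strongest reflection ahead
    rel_speed_mps: float  # closing speed relative to our own car

@dataclass
class CameraDetection:
    range_m: float
    label: str            # e.g. "vehicle", "pedestrian"

def obstacle_ahead(radar: Optional[RadarReturn],
                   camera: Optional[CameraDetection],
                   ego_speed_mps: float) -> bool:
    """Naive radar-first fusion: the flawed policy, shown on purpose."""
    if radar is not None:
        # A fully stopped car closes at exactly our own speed, so its ground
        # speed is ~0 and it looks like roadside clutter (signs, bridges).
        target_ground_speed = ego_speed_mps - radar.rel_speed_mps
        if abs(target_ground_speed) < 0.5:
            radar = None  # clutter filter throws the return away
    if radar is not None:
        return True
    # Because radar is trusted over the camera, a conflicting camera
    # detection of a stopped vehicle never triggers braking here.
    return False

# Parked police car 60 m ahead while travelling ~31 m/s (about 70 mph):
print(obstacle_ahead(RadarReturn(60.0, 31.0),
                     CameraDetection(60.0, "vehicle"), 31.0))
# -> False: the stationary return was filtered and the camera was ignored
```

The point of the toy isn’t that any shipping system is literally this crude; it’s that once a stationary target is classed as background, no amount of camera evidence matters unless the fusion policy is designed to brake on either sensor.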

Ed (profile) says:

Re: Re: Re:2 Re:

Oddly enough, though, my Ford Escape with cameras and radar can and does see non-moving objects in the roadway just fine and slows/stops accordingly. It isn’t advertised or referred to as Full Self Driving but simply an "assist", which it is. Yet in thousands of miles of using it to drive across the country several times, it hasn’t hit a stopped vehicle or a pedestrian, or even come close to doing so. Funny that.

Jeremy Lyman (profile) says:

Re: Re: Re:3 Re:

Right, the system works reasonably well and doesn’t always ignore all immobile objects. That’s why they’re generally beneficial. But it has the same kinds of warnings Tesla uses:

WARNING: You are responsible for controlling your vehicle at all times. The system is designed to be an aid and does not relieve you of your responsibility to drive with due care and attention. Failure to follow this instruction could result in the loss of control of your vehicle, personal injury or death.
WARNING: The system only warns of vehicles detected by the radar sensor. In some cases there may be no warning or a delayed warning. Apply the brakes when necessary. Failure to follow this instruction could result in personal injury or death.
WARNING: The system may not detect stationary or slow moving vehicles below 10 km/h.

Ford Manual

WARNING: Traffic-Aware Cruise Control is designed for your driving comfort and convenience and is not a collision warning or avoidance system. It is your responsibility to stay alert, drive safely, and be in control of the vehicle at all times. Never depend on Traffic-Aware Cruise Control to adequately slow down Model 3. Always watch the road in front of you at all times. Failure to do so can result in serious injury or death.

Tesla Manual

Anonymous Coward says:

The AI should be programmed to stop, or to slowly drive past any parked cars while avoiding any pedestrians: do not drive into cars parked in the way, wherever they may be. It could also sound a horn to warn any people in front of it when it’s not sure how to proceed, the way reversing trucks have audio alarms to warn people of possible collisions.
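For what it’s worth, the conservative fallback the commenter describes is easy to sketch. This is a hypothetical toy, not any vendor’s planner or API: when stopped vehicles or people are ahead and the system isn’t confident about the scene, it stops and sounds the horn instead of driving through; only when it is confident and no pedestrians are present does it creep past with clearance.

```python
CRAWL_SPEED_MPS = 2.0    # walking pace for creeping past parked cars
MIN_CLEARANCE_M = 1.5    # lateral gap to keep from stopped vehicles

def plan_fallback(obstacles: list, scene_confidence: float) -> dict:
    """Toy conservative planner for a scene with stopped obstacles ahead."""
    stopped_ahead = any(o["speed_mps"] < 0.5 and o["in_lane"] for o in obstacles)
    people_nearby = any(o["label"] == "pedestrian" for o in obstacles)

    if not stopped_ahead and not people_nearby:
        return {"action": "proceed", "horn": False}
    if people_nearby or scene_confidence < 0.8:
        # Humans present, or the system is unsure what it is looking at:
        # stop and warn rather than thread past at speed.
        return {"action": "stop", "horn": scene_confidence < 0.8}
    # Confident and no pedestrians: creep past with lateral clearance.
    return {"action": "crawl_past", "speed_mps": CRAWL_SPEED_MPS,
            "clearance_m": MIN_CLEARANCE_M, "horn": False}

# Parked police cars plus officers standing in the lane, low confidence:
print(plan_fallback(
    [{"label": "vehicle", "speed_mps": 0.0, "in_lane": True},
     {"label": "pedestrian", "speed_mps": 0.0, "in_lane": True}],
    scene_confidence=0.4))
# -> {'action': 'stop', 'horn': True}
```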

Anonymous Coward says:

Someone Has To

So yes, I own a Tesla, 2 in fact. However, someone has to lead and pioneer this field. Tesla takes all the heat for self driving because they are so far out in front; the spotlight is on them. Not because they have a bad product. If it were GM or Mercedes leading this FSD, it would be them under the microscope. It’s a double-edged sword.

That One Guy (profile) says:

Re: ... cause five car pileups?

Tesla takes all the heat for self driving because they are so far out in front, spotlight is on them.

Well that and their cars seem to have a distressing tendency to plow into other vehicles more often than they should (that number being ‘never’), but I’m sure you’re right and it’s just their super-duper advanced tech that’s causing the criticism.

Anonymous Coward says:

Re: Re: ... cause five car pileups?

Well that and their cars seem to have a distressing tendency to plow into other vehicles more often that they should

That is down to the human monitoring not taking action when the autopilot is not doing its job. Failure to detect stationary objects is a well known problem with the software, and a reason human monitoring is required.

This comment has been flagged by the community.

Anonymous Coward says:

Someone Has To

So yes, I own a Tesla, 2 in fact. However, someone has to lead and pioneer this field. Tesla takes all the heat for self driving because they are so far out in front; the spotlight is on them. Not because they have a bad product. If it were GM or Mercedes leading this FSD, it would be them under the microscope. It’s a double-edged sword.

Jeremy Lyman (profile) says:

Re: Re:

Yes there’s valid criticism to be found, though it’s generally a good idea to understand the systems you’re criticizing. This article is written with such a chip on its shoulder that it doesn’t bother to distinguish between the limited access "FSD Beta" it wants to blame vs the standard "Autopilot" which was active during this crash. You also didn’t seem to notice that they’re also suing the bar which served alcohol to the driver, so he was drunk and not driving responsibly. But drunk drivers hitting things is so common it barely qualifies as news any more.

Rocky says:

Re: Re: Re:

Agreed. Karl’s coverage of the incident is far from fair, it reads more like a hit-piece that conflates things to reach a conclusion that’s not supported by the facts.

There is no doubt that Tesla has some problems with its self-driving which can result in fatalities, but most of the cases so far have shown us that some drivers are shockingly negligent and/or stupid. Sadly, you can’t cure stupid with even the most advanced AI – even if it happens to be a beta-version.

And when Karl wrote: "Of course for Musk fans, a persecution complex is required for club membership", I read that as "I don’t like people who don’t agree with my views on Musk, therefore I’ll denigrate them".

It’s one thing to have a strong opinion about something, but when someone goes down the rabbit hole of conflating things to come up with ways to attack what they dislike, they’ve strayed into asshole territory, and I have no problem at all calling them out for that behavior.

So Karl, I think you are a bit of an asshole when you write about Musk and his associated companies, as evidenced by your conflation of a limited beta-program and its NDA, plus not mentioning the real circumstances and context of the crash compared with every other Tesla car/driver not enrolled in that program.

On the whole, I think it’s good to highlight the risks and pitfalls of self-driving cars and the manufacturers’ responsibility to market the features correctly, and you could easily have written a better piece with all the facts included, but you chose not to.

If you want to paint something in indignant faux black and white, don’t be surprised if people call you out on it. Do better.

Anonymous Coward says:

Re: Re: Re:2 A Dance As old As 2005

Behold: the majestic fall migration of the Tesla fanbois has begun, as they work their way from one tech website to another defending their progenitor from perceived insults. Their displays of rhetoric ratchet up as they attempt to outdo each other in feats of logical fallacy while they vie to impress an extremely limited pool of available mates.

Lostinlodos (profile) says:

Oh, wait…what?


For starters, the title of the article is largely ignored in favor of a rant about crashes.
This type of NDA is par for the course in beta anything: do not publicly disclose problems during the beta period.
It’s why Windows beta testers were long locked inside MSDN discussions closed to the public. It’s why Apple dev beta (pre-public) discussion is behind the ADS program.
I’ve tested software, and hardware devices. Some of them were dangerous on error. This is nothing unique.

Crashes?
This reeks of not-my-fault-ism!

How is Tesla at fault for the driver not retaking control?

Be it an OS or a car: the point of a beta test is to gauge reaction to real world situations, make manual corrections, and report flaws (“bugs”).

So regardless of whether it’s alt-esc when a random /rm * starts for no reason, or buses, cars, SUVs, and people in the road… ?
Why did the driver not do something?
Unless self-drive or Level 2 or whatever disengaged the steering, MIB-style: the driver is responsible. To some degree.

The damn manual/guide/agreement says the driver is responsible for controlling the vehicle!

Beta testing is self-explanatory to anyone who reads the agreement they sign! It’s not finished!!!!!


The problem here, on Tesla’s end, is failing to better vet the idiots who agree to beta testing. If the driver was doing the job they agreed to, actively testing an assistive technology, they wouldn’t be plowing into things while sleeping, reading, screwing, etc.
