Netflix Files Anti-SLAPP Motion To Dismiss Lawsuit Claiming One Of Its Series Caused A Teen To Commit Suicide

from the even-an-algorithm-is-protected-speech dept

Because Netflix is big, it draws lawsuits. It has been sued for defamation, for copyright infringement, and, oddly, over its use of a private prison’s logo in a fictional TV show. It has also been sued for supposedly contributing to a teen’s suicide with its series “13 Reasons Why,” which deals with a lot of disturbing subject matter teens face daily, like bullying, sexual assault, and, most relevant here, suicide. The final episode of the first season contained a suicide scene, one Netflix removed two years after the show debuted.

The teen’s death is undeniably a tragedy, but the attempt to blame Netflix for it is severely misguided. The lawsuit filed by the teen’s survivors alleges that Netflix had a duty to warn viewers about the content (content warnings were added to the show a year after its release) and that it failed to do so, making it indirectly liable for the death.

Netflix is now trying to get this lawsuit dismissed using California’s anti-SLAPP law because, as it argues persuasively, this is all about protected speech, no matter how the plaintiffs try to portray it as a consumer protection issue. (h/t Reason)

Netflix’s anti-SLAPP motion [PDF] points out this isn’t the first time teen suicide has been depicted in pop culture, nor is it the first time people have tried to sue creators over the content of their creations. None of those lawsuits have been successful.

13 Reasons Why is not the first work to tell a story about teen suicide. The subject has been explored in countless literary works, motion pictures, songs, and TV shows—everything from Romeo and Juliet to Dead Poets Society. And this is not the first lawsuit that has claimed that the media’s depiction of suicide and violence is to blame, and should be held legally liable, for real-life suicides and other tragic events. Courts, however, have repeatedly rejected such suits.

Since this lawsuit directly implicates creative expression, Netflix says California’s anti-SLAPP law applies.

Without question, 13 Reasons Why is protected expression under the First Amendment. And the FAC itself, which is replete with citations to articles about 13 Reasons Why and its subject matter, makes plain that 13 Reasons Why’s speech is in “connection with a public issue.” Plaintiffs try to sidestep the anti-SLAPP statute and the First Amendment by insisting that their claims are not based on the content of 13 Reasons Why, but on an alleged “failure to warn” or breach of a purported duty to protect “vulnerable populations” from the content. But without the allegations that the show’s content is “dangerous,” Plaintiffs’ theories fall apart. Not only do those theories strike at the free expression embodied in the show, they target Netflix’s conduct “in furtherance” of the distribution of the show, and therefore bring this lawsuit squarely within the ambit of the anti-SLAPP statute.

Not only is the content of the series protected expression, but so is the algorithm that may have recommended the show to the teen. The plaintiffs’ “failure to protect” theory argues that Netflix’s algorithm is itself reckless and dangerous because it prompts users to select titles that may contain disturbing subject matter. But whether a human or an algorithm makes the recommendation is beside the point: either way, it is a form of editorial control protected by the First Amendment.

The recommendations system, and the display of suggested titles, is speech. It evinces “[a]n intent to convey a particularized message,” Spence v. Washington, 418 U.S. 405, 410–11 (1974)—namely, a message about what shows and movies a viewer might choose from to watch. The recommendations fall within the well-recognized right to exercise “editorial control and judgment.” Miami Herald Pub. Co. v. Tornillo, 418 U.S. 241, 258 (1974). Plaintiffs allege that the recommendations here are different because they are dictated by an algorithm. But the fact that the recommendations “may be produced algorithmically” makes no difference to the analysis.

[…]

The suggestion that a viewer watch a particular show falls within the broad scope of conduct in furtherance of the right to free speech. The subject of the recommendation—13 Reasons Why—is itself protected speech, and the recommendations facilitate Netflix’s protected distribution and dissemination of that speech.

The plaintiffs’ claims can’t be separated from the undeniable fact that the allegations are all about creative expression, which is exactly why the state’s anti-SLAPP law should apply.

Finally, like Plaintiffs’ failure-to-warn theory, Plaintiffs’ recommendation theory cannot be disentangled from Plaintiffs’ allegation that the underlying content is “dangerous”: the entire premise of the claim is that Netflix had a duty to identify which viewers are “vulnerable,” and ensure that the algorithm does not recommend 13 Reasons Why (or other content Plaintiffs deem unsuitable) to such viewers.

If, as the plaintiffs argue, creators and producers of disturbing content can be held liable for the actions of viewers, the end result would be self-censorship and a dearth of options for creators to distribute their creations. Suicide has been the subject matter of countless creative works for hundreds of years. A lack of content warnings ahead of viewing doesn’t make a viewer’s death Netflix’s fault. Viewers have control over what they watch and, given the vast amount of information available online about movies and TV series, there’s little reason to believe viewers have to go into any show “blind.”

As tragic as this situation is, the blame (if there is any to assign) doesn’t lie with Netflix. Netflix is no more responsible for this death than the creators of the series or the writer of the novel that inspired it.

Companies: netflix


Comments on “Netflix Files Anti-SLAPP Motion To Dismiss Lawsuit Claiming One Of Its Series Caused A Teen To Commit Suicide”

17 Comments


This comment has been deemed insightful by the community.
That Anonymous Coward (profile) says:

Something something we bear no fault in not noticing our child was suicidal & this never would have happened if this tv program hadn’t infected our perfectly fine & stable child with these wild thoughts.

We all know I am not nice people, but am I the only one wondering if the child who committed suicide watched the final episode? (and was it the version with the suicide shown or not).

Something something parental controls on Netflix?
Something something parents looking at what the kids are viewing?
Something something corporations aren’t good selections to raise your kids.

One would think a good lawyer would have explained that there wasn’t a case here & referred them to counseling, but hey, let’s throw shit at the wall and see if anything sticks, we might get a good payday… and if we don’t, we still got the retainer.

This comment has been deemed insightful by the community.
nasch (profile) says:

Re: Re:

Congratulations on raising a kid with so little self-worth; it’s all on you.

You sound like someone who has never dealt with a family member suffering from depression. I hope that you never have to, and that you develop some empathy for those who do at some point, hopefully soon.

This comment has been deemed insightful by the community.
That Anonymous Coward (profile) says:

Re: Re:

I really am the last one to have sympathy for parents showing up to blame everyone else for their kids actions, but wtf dude?

Yes, parents need to pay more attention, but kids hide all sorts of things because, as a society, y’all suck at dealing with the hard stuff. There isn’t a warning light on kids’ heads that flashes when they are having suicidal thoughts.

Suicide isn’t a self-worth thing; I’ve seen highly successful teens who seemed to have it all kill themselves.

Mental health is a train wreck…
We tell kids to suck it up, call them names when they are upset & then are shocked, just shocked, when they don’t reach out for help.

Parents are more concerned with making sure their kids have a better life than they had, but the only yardstick applied is how much is spent on them, not whether the kid is actually happy.

We’ve wasted how much taxpayer money on hearings to see if FB is indeed the devil & how to deal with it…
How much time have they spent on dealing with the lack of mental health treatment available to citizens?

nasch (profile) says:

Speech

The recommendations system, and the display of suggested titles, is speech.

This is a little off topic, but it got me thinking about speech protections. Recommendations such as those by Netflix, YouTube, and Facebook are speech by the platform itself. As such, they are protected by the 1st Amendment, but not by Section 230 – correct? Have there not been a bunch of lawsuits specifically targeting the recommendations, knowing that they cannot be quickly dismissed via 230?

Anonymous Coward says:

Deeply wrongheaded mentality

I know this is the motivated reasoning of ambulance-chasers after deep pockets, but they literally stated that they think it is /the duty/ of a company to analyze everything about a user and speculate about whether they qualify as having frail mental health. Even by the standards of post-9/11 hysterical Orwellianism and helicopter parenting, that is way out there. Especially since anybody with even a rudimentary understanding of statistics and of current ML accuracy could tell you how preposterous the levels of false positives and false negatives would be. Even if they had patient medical histories instead of watch histories!
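To put rough numbers on that statistics point, here is a minimal back-of-the-envelope sketch of the base-rate problem. Every figure in it (the viewer population, the share of viewers who are acutely vulnerable, the classifier’s accuracy) is a made-up assumption for illustration, not a Netflix figure or a number from any study.

# Back-of-the-envelope sketch of the base-rate problem described above.
# All numbers are hypothetical assumptions, chosen only for illustration.
subscribers = 200_000_000   # assumed viewer population
base_rate = 0.01            # assume 1% of viewers are acutely "vulnerable"
sensitivity = 0.95          # assume the model catches 95% of true cases
specificity = 0.95          # assume it correctly clears 95% of non-cases

at_risk = subscribers * base_rate
not_at_risk = subscribers - at_risk

true_positives = at_risk * sensitivity
false_positives = not_at_risk * (1 - specificity)
false_negatives = at_risk * (1 - sensitivity)
precision = true_positives / (true_positives + false_positives)

print(f"viewers flagged as vulnerable: {true_positives + false_positives:,.0f}")
print(f"  correctly flagged:           {true_positives:,.0f} (precision {precision:.0%})")
print(f"  wrongly flagged:             {false_positives:,.0f}")
print(f"at-risk viewers missed:        {false_negatives:,.0f}")

With those assumed numbers, roughly 11.8 million viewers get flagged, only about 16% of whom are actually at risk, while about 100,000 at-risk viewers slip through anyway.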

sumgai (profile) says:

Re: Deeply wrongheaded mentality

Yeah, the issue here is the veracity of the applicant. How does any website determine the true age of a given viewer? Answer: they don’t. They depend on self-honesty, and that’s it. (OTOH, those sites dealing with money do take steps to make sure they won’t get scammed. It still happens, but at least they try to prevent it. For them, responses to applicants are almost never "instant".)

BTW, "helicopter parent" has been replaced by "bungee parent". Gotta keep up with the way things move faster and faster on the ‘web, eh? 😉

PaulT (profile) says:

Re: Re:

You’re right, only as stupidly as possible. Commercials can’t get you to buy a car if you’re not in the market for a car. They might get you to favour a particular brand of vehicle or choose a particular special offer when you are in the market, but not if you aren’t. If you live in an inner city apartment with no parking, you don’t have a licence and walk or use public transport every day, you’re not going to suddenly go out and buy a car. Similarly, it doesn’t matter how many times I see a commercial for tampons – I’m never going to pick them up as an impulse purchase for myself.

Same with shows about suicide. Showing them to someone who doesn’t have any underlying issue won’t drive them to suicide; you can only potentially trigger people who already have those tendencies, and even that is dependent on various factors outside of the show itself.

"Welcome to clown world."

Population: you

xebikr (profile) says:

I agree that Netflix cannot be held legally responsible for this teen’s suicide. However, that does not mean that they bear no responsibility at all. The studies about ‘suicide contagion’ are extensive and clear (see https://www.ncbi.nlm.nih.gov/pmc/articles/PMC1124845/ for example.) There are guidelines for reporting and media portrayals of suicide to minimize the potential harm, but Netflix did not follow these. Honestly, I was angry way back when "13 Reasons Why" first came out because I knew this would be the result. So, no, not legally responsible, but there is still blood on their hands.

Scary Devil Monastery (profile) says:

Well, this is a blast from the past...

For any casual student of literary history, I’d suggest perusing the hue and cry resulting from that musty old book "Die Leiden des jungen Werthers".

There’s a nice wiki entry on the English version – "The Sorrows of Young Werther".

Suffice to say that impressionable young people committing copycat suicide isn’t a new thing. And today, just as it was then, it usually turns out that the catalyst was only the final spark propelling an already depressed or damaged person into self-destruction.

Scary Devil Monastery (profile) says:

Re: Re:

"…but the lawsuits only start when the latter brings in $$$."

We’re playing Jeopardy now? I’ll take "What is it that makes US tort a textbook example of ‘piss-poor law’?" for 20k, then.

Legal redress should be the last resort, not the first. It should be considered a potentially expensive disincentive to committing malfeasance vis-à-vis another, not a great way to make out like a bandit as long as you can make a judge believe – truthfully or not – that you were wronged.
