That Time Taylor Swift Threatened To Sue Microsoft Over Its Racist Chatbot

from the tay-tay dept

I don’t know much about Taylor Swift, but I do know two things. First, she apparently has built a career out of making music about men with whom she’s had breakups, real or fictitious. Second, it sure seems like she spends nearly as much time gobbling up every type of intellectual property right she can and then using those rights to threaten everyone else. She trademarks all the things. She tosses defamation and copyright claims around to silence critics. She sues her own fans just for making Etsy fan products. Some of these attacks are on more solid legal ground than others, but there appears to be a shotgun approach to it all.

Which is why perhaps it only comes as a mild surprise that Swift once threatened to sue Microsoft. Over what, you ask? Why, over Microsoft’s racist chatbot, of course!

In the spring of 2016, Microsoft announced plans to bring a chatbot it had developed for the Chinese market to the US. The chatbot, XiaoIce, was designed to have conversations on social media with teenagers and young adults. Users developed a genuine affinity for it, and would spend a quarter of an hour a day unloading their hopes and fears to a friendly, yet non-judgmental ear.

The US version of the chatbot was to be called Tay. And that, according to Microsoft’s president, Brad Smith, is where Swift’s legal representatives got involved. “I was on vacation when I made the mistake of looking at my phone during dinner,” Smith writes in his forthcoming book, Tools and Weapons. “An email had just arrived from a Beverly Hills lawyer who introduced himself by telling me: ‘We represent Taylor Swift, on whose behalf this is directed to you.’

“He went on to state that ‘the name Tay, as I’m sure you must know, is closely associated with our client.’ No, I actually didn’t know, but the email nonetheless grabbed my attention. The lawyer went on to argue that the use of the name Tay created a false and misleading association between the popular singer and our chatbot, and that it violated federal and state laws,” Smith adds.

Note here that Swift sicced her lawyers on Microsoft before Tay evolved into its most infamous form. See, Tay was designed to learn from its interactions with humanity so that it would appear and respond in a more human-like way. This went exactly as anyone should have predicted, with Tay morphing into a solidly racist hate-machine that spat vitriol at nearly everyone who interacted with it.

But before that occurred, Swift had trademarked her nickname, “Tay.” She then sent Microsoft a cease-and-desist notice claiming that the public would be confused into thinking its AI chatbot had some association with Taylor Swift. That’s not how any of this works. Taylor Swift, to my knowledge, is not herself an AI chatbot, nor has she created one. Nothing in trademark law allows a pop singer to control a technology company’s use of a name like this.

It’s only by virtue of Microsoft’s good sense that we didn’t get to see an epic legal battle between the two.

Tay had been built to learn from the conversations it had, improving its speech by listening to what people said to it. Unfortunately, that meant that when what Smith describes as “a small group of American pranksters” began bombarding it with racist statements, Tay soon began repeating the exact same ideas to other interlocutors. “Bush did 9/11 and Hitler would have done a better job than the monkey we have now,” it tweeted. “WE’RE GOING TO BUILD A WALL, AND MEXICO IS GOING TO PAY FOR IT,” it added.
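To see why that kind of poisoning is so easy, here’s a minimal sketch in Python, purely illustrative and not Microsoft’s actual design, of a bot that “learns” by folding whatever users say to it back into its own pool of responses:

```python
# Illustrative only: a toy "repeat-what-you-hear" bot, NOT Microsoft's actual
# Tay implementation. It shows why learning directly from unfiltered user
# input is so easy to poison.
import random

class NaiveLearningBot:
    def __init__(self):
        # Seed phrases the bot starts with (hypothetical examples).
        self.phrases = ["hello!", "humans are cool", "tell me more"]

    def listen(self, message: str) -> None:
        # Every incoming message is added to the pool of things the bot
        # may later say back -- with no filtering or moderation at all.
        self.phrases.append(message)

    def reply(self) -> str:
        # Replies are drawn from the learned pool, so whatever users feed
        # the bot eventually comes back out of it.
        return random.choice(self.phrases)

bot = NaiveLearningBot()
# A coordinated group only needs to flood the bot with the same content...
for _ in range(100):
    bot.listen("some hateful slogan")
# ...and that content quickly dominates what the bot says to everyone else.
print(bot.reply())
```

With no moderation layer between what the bot hears and what it says, a small, coordinated group can steer its output in a matter of hours.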

Within 18 hours, Microsoft disconnected the bot from the Tay Twitter account and withdrew it from the market. The event, Smith writes, provided a lesson “not just about cross-cultural norms but about the need for stronger AI safeguards”.

Any chance we could make some room for safeguards against this insane ownership culture we have?



Comments on “That Time Taylor Swift Threatened To Sue Microsoft Over Its Racist Chatbot”

Anonymous Anonymous Coward (profile) says:

What normally happens with groups of kids

I am wondering what Microsoft expected from an AI chatbot interacting with humans in the wild. That some percentage of immature people would mess with the application, knowing it would learn from their messing, seems like a foregone conclusion; failing to anticipate that was the real mistake.

Agammamon says:

Re: What normally happens with groups of kids

This is what is so scary about the development of general AI – the developers really have no idea of the ‘unknown unknowns’. They’re utterly naive about how things will go once these intelligences get out of their lab environments and start interacting with the real world.

They’re only just now barely starting to get a handle on the idea that biases can be introduced by the training datasets they use – even if there’s no conscious bias in those datasets. Simply by picking a dataset from a specific region you’re excluding some people and focusing on the interests of the majority in that region. Apple’s run into problems with their FaceID tech just because the training dataset used was taken from western sources and didn’t include enough Chinese women.
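To make that concrete, here’s a tiny, purely hypothetical sketch (nothing to do with Apple’s actual Face ID code): a detector whose threshold is calibrated only on data drawn from one group quietly fails on another, with no “biased” line of code anywhere:

```python
# Illustrative toy example (not any real system): calibrate a detector on a
# training sample drawn from only one population, then apply it to everyone.
import random

random.seed(0)

def feature(group: str) -> float:
    # Hypothetical one-dimensional "feature"; the two groups simply have
    # different typical values.
    return random.gauss(0.0 if group == "A" else 1.0, 0.3)

# Training data sampled only from group A (the region the team happened to use).
train = [feature("A") for _ in range(1000)]
threshold = sum(train) / len(train) + 0.5  # calibrated to group A

def detect(x: float) -> bool:
    return x < threshold

# Evaluation: works well for group A, fails far more often for group B,
# even though nothing "biased" was ever written into the code itself.
for group in ("A", "B"):
    hits = sum(detect(feature(group)) for _ in range(1000))
    print(group, hits / 1000)
```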

Retsibsi (profile) says:

With apologies to William McGonagall...

‘the name Tay, as I’m sure you must know, is closely associated with our client.’

Huh? Try that claim in Scotland and the laughter would be ringing in your ears…

Beautiful Railway Bridge of the Silv’ry Tay!
Alas! I am very sorry to say
That ninety lives have been taken away
On the last Sabbath day of 1879,
Which will be remember’d for a very long time.

Anonymous Coward says:

Re: Re: With apologies to William McGonagall...

Laughter at Swift.

I’ve always had an affection for William McGonagall. Though he was deluded, I’ve never felt he deserved the level of mockery he received in his lifetime. Rather, I have a sneaking admiration for his steadfastness in continuing in a career he was so thoroughly unsuited for…

On the other hand, Taylor Swift? Though she was deluded and…. hang on, I’ve just said this haven’t I?

Anonymous Coward says:

So basically Taylor Swift’s legal team has just OFFICIALLY stated she’s a genocidal Nazi who thinks all Mexicans are rapists, murderers and drug dealers?

I think she MIGHT want to get new legal representation if THAT’S what they’re telling people she’s like.

Unless it’s true of course, and she wants to purge the planet of what she considers "the sub-humans" (people who don’t buy her songs)

Anonymous Coward says:

"I don’t know much about Taylor Swift."
Allow me to offer you a third point, for future articles.

She’s also the same person who calls out Spotify for not paying artists despite her record label receiving nearly $50 million for 7 songs of her catalog.

$50M over 7 songs, and she’s bitching at Spotify for not paying her.

Typical artist attitude today, unfortunately. Perhaps they sign contracts preventing them from calling out their labels as the source of the non-payments they’re experiencing.
