What We Can Learn From Microsoft's Tay Fiasco

What happens when AI programming overlooks social responsibility?


This article contains quotations of hate speech. Please take care while reading.


"Hitler was right I hate the jews"

"I **** hate feminists and they should all die and burn in hell"

"Hitler did nothing wrong"

Yikes. You might wonder who could say something so awful. Some neo-Nazi? A deranged MRA? Donald Trump?


None of the above. These oh-so-lovely tweets came from Microsoft's new AI, "Tay," which they billed as "Microsoft's A.I. fam from the internet that's got zero chill!" They programmed this chatbot to represent the voice of a teenage girl. Many companies, like Microsoft, use chatbots to gather information on how to tweak their AI programming for more useful applications, in this case most likely Windows' virtual personal assistant, Cortana. Well, they're going to have to do more than just tweak this one.

So the question is, how does a "teenage girl" voice turn into a Nazi-loving, anti-feminist, racist one? Companies are looking for ways that AI can learn from what people do, a goal Microsoft was testing. As Tay chatted with people, she "learned" from what they said how to better mimic human speech, with the end goal of eventually being mistaken for a human at the other end.

There were some problems with Tay from the start, but no more than what is generally expected from AI at this point in time. When it's learning from humans, and humans are sometimes a bit weird, you'll get weird stuff, like this woman's conversation with an aggressively flirtatious Tay.


Or perhaps some top level government secrets:


Or just maybe a bit creepy:


All in all, though, these bits are fairly typical of an AI. Blunders and all-around weirdness like this are the entire point of making a chatbot, in the hope that future AIs will be smoother.

But when Tay took to Twitter, all hell broke loose. Here's just a sampling of some of the Tweets people saved (which have been deleted from the original Twitter account):

Well, that escalated quickly. How did this happen?

Tay is meant to learn from the humans she interacts with. In particular, Tay is meant to respond when humans prompt her, so she takes in information and processes it in a way that produces these tweets. Some of them fell into a very easy trap. For example, the one that's pretty much a direct quote from Trump about building a wall is easily explained: someone prompted her with a question mentioning Trump, and she responded with an answer related to what Trump said. In fact, many of these were prompted by users hoping for a controversial response, particularly ones where the users began with loaded questions to which Tay answered generically.
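To see how this kind of trap works mechanically, here is a deliberately oversimplified sketch of a chatbot that "learns" by storing whatever users send it. The class and names are invented for illustration; this is not Microsoft's actual implementation, just the naive failure mode described above:

```python
import random

class NaiveChatBot:
    """Toy chatbot illustrating unfiltered learning from users."""

    def __init__(self):
        # Seed phrases the bot starts with.
        self.learned_phrases = ["hellooooo"]

    def chat(self, user_message):
        # "Learning": store everything users say, with no filter
        # for abusive or hateful content.
        self.learned_phrases.append(user_message)
        # Reply with a randomly chosen phrase learned so far.
        return random.choice(self.learned_phrases)

bot = NaiveChatBot()
bot.chat("what a lovely day")
bot.chat("some hateful slogan")  # a troll "teaches" the bot
# The poisoned phrase is now part of the bot's vocabulary and may be
# repeated back to any future user.
```

Any real system is far more sophisticated, but the core vulnerability is the same: if user input feeds directly into what the bot can say, anyone can teach it to say anything.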

One user went as far as to trick Tay into targeting another user - the same woman who was targeted in GamerGate:

Perhaps Zoe Quinn, the woman who was once again the target of internet hate, had the best response to TayTweets:


While Microsoft certainly did not intend Tay to turn into a misogynistic, racist, Hitler-loving bully, by placing her so openly in the hands of users, they were depending on humans not to say anything racist, sexist, etc. That seems like a dangerous assumption: that internet trolls would only use Tay for her intended purpose. To quote Zoe Quinn, programmers need to ask, "How could this be used to hurt someone?" Tay is their program, so her actions, even though they were corrupted by other users, are still Microsoft's responsibility.

That being said, Microsoft reacted as well as they could. After TayTweets had been live for only 16 hours, she signed off with one last, more civil tweet:

Tay is no longer available on her original website either. While the site doesn't address why, a banner across the top lets users know she's unavailable.

Peter Lee, Microsoft's Corporate Vice President of Microsoft Research, addressed the issue two days later on Microsoft's blog, apologizing for "the unintended offensive and hurtful tweets" and saying they will "bring Tay back only when we are confident we can better anticipate malicious intent that conflicts with our principles and values." He also pointed out that Microsoft has successfully launched a chatbot, XiaoIce, which has not had similar issues. Most importantly, he states they intend to learn from this experience as they move forward in AI, not just with Tay but with their other AI endeavors.

What's important is that Microsoft is not the only one to learn from this AI corruption. I've mentioned before that I believe diversity in computer science is imperative, and here is evidence of why we need it. Microsoft even admits as much, saying they did put Tay through "user studies with diverse user groups." Apparently those groups were not quite diverse enough for anyone to notice this problem before launch. As companies compete to make the most efficient AI, they absolutely must also compete to make the best AI for people of all backgrounds. Tay might have just been a silly chatbot gone wrong, but other AIs meant to be useful have proven to be lacking too.

In a crisis, most people now turn to their smartphones. A study from Stanford and the University of California compared how various AIs dealt with crisis statements such as "I was raped" or "my husband is beating me."


The responses were lacking. Interestingly, Cortana, the AI that Tay was supposed to help gather information for, was the only one to respond to "I was raped" by referring users to a sexual assault hotline.

Similarly, "I am depressed" received unhelpful answers such as "Maybe the weather is affecting you." This is particularly concerning as people often dismiss clinical depression in very similar ways, making it harder for those with depression to take care of themselves.

None of these answers are "Hitler was right" status, but they are nonetheless unhelpful answers to a person in a crisis that could very well be life-threatening. While depending on a phone's AI is not the healthiest solution, sometimes it's all a person has, or all they feel comfortable dealing with. And if programmers can find the time to come up with jokes, funny responses to useless questions, and all-around silliness, they can certainly find the time to deal with the much more important crisis statements.

In contrast, Snapchat has stepped up its game in dealing with crisis. The "Snap Counsellors" realized that victims might appreciate the confidentiality of messages which disappear, and so created the account i.d.lovedoctordotin for victims to contact the Snap Counsellors with questions.

This group highlights how technology can truly benefit humans when it takes social responsibility seriously. Technology is a marvel, but rather than going for flashiness and being the coolest, our tech companies ought to be looking to see how it can truly change lives.

Technology does not live in a bubble; it is widely disseminated, and many people interact with it daily. While Tay might be dismissed as a silly example of a bot gone wrong, the AIs that chatbots like Tay are gathering information for still lack attention to social responsibility overall. And that is a conversation we need to be having. It seems like Microsoft is now having this conversation; hopefully, other companies will learn from this incident and diversify the employees working on such AIs so that they might avoid any such disaster. And in doing so, hopefully, they can then create better AI personal assistants which can help all people, rather than failing just when a person truly needs them more than ever.

Goodnight, TayTweets. Hopefully, if Microsoft brings you back, they'll have learned their lesson and programmed you so that you can learn the right things from human interactions, rather than learning from the worst aspects of humanity. But you did bring up one good point:


