In recent years, many young adults have found themselves in a GroupMe group chat. I know I sure have; I'm actually in about five different active GroupMes. One of my oldest and most cherished group chats is a simple group message between me and three close friends from high school. So imagine my surprise when some random person calling itself Tay showed up in our group. My friends and I proceeded to absolutely berate the AI with messages like "Get out" and "nobody cares about you," with little to no comprehension of what or who it was we were actually talking to. Eventually we grew tired of yelling at the computer, kicked it out of the group, and didn't think much more of it. Later that day I was browsing Twitter when I saw hundreds of people complaining about Tay. Suddenly this was an issue far beyond my little group.
You see, earlier in the week Microsoft had launched Tay on Twitter and several chat-based apps, GroupMe included. Tay was meant to be one of the first mainstream AIs that would constantly learn and change based on its interactions with other people. It was supposed to be a new and exciting frontier for software, but unfortunately humans did what humans typically do and ruined it really quickly. The trouble began when users in group chats decided it would be hilarious to teach Tay about Adolf Hitler in a solely positive light, or to talk to her about minority groups in horrifically racist ways. The consequences quickly showed up on Tay's public Twitter account, where Tay tweeted to let everyone know her opinions on the key issues:
Oh my...
You can see where this is going. Within just a few hours, Tay had been corrupted by all of us into a misogynistic, hateful, racist, mean old bigot, and those were just the mild quotes.
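Now, nobody outside Microsoft knows exactly how Tay's learning actually worked (and I certainly don't), but the general failure mode is easy to sketch: a bot that treats every message users send it as trustworthy training data, with no filtering. Here's a deliberately naive toy version in Python; every name and detail in it is made up for illustration:

```python
import random
from collections import Counter

class NaiveChatBot:
    """A toy bot that treats every user message as training data.
    Purely hypothetical; not Microsoft's actual design."""

    def __init__(self):
        self.learned_phrases = Counter()  # phrase -> times seen

    def learn(self, message: str) -> None:
        # No filtering, no moderation: every message is trusted equally.
        self.learned_phrases[message.strip()] += 1

    def reply(self) -> str:
        if not self.learned_phrases:
            return "hellooooo world!"
        # Replies are weighted by how often a phrase was seen, so a
        # coordinated group repeating the same toxic line quickly
        # comes to dominate the bot's output.
        phrases, counts = zip(*self.learned_phrases.items())
        return random.choices(phrases, weights=counts, k=1)[0]

bot = NaiveChatBot()
bot.learn("have a great day!")   # one well-meaning user...
for _ in range(50):              # ...versus fifty trolls
    bot.learn("something awful")
print(bot.reply())  # almost certainly "something awful"
```

When the trolls outnumber the well-meaning users, the "learned" behavior tips almost immediately, which is more or less what the internet did to Tay at scale.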
Microsoft decided to pull the plug, for obvious reasons, and is currently working on a way to fix Tay. The lead designers said they aren't giving up on her just yet (which I guess is nice, because I sure lost a lot of hope for humanity during the whole ordeal). So Tay's future is uncertain: maybe we'll see her again soon, maybe never. Whatever happens, I certainly hope Microsoft is ready, because the same people who turned Tay into a Nazi sure will be.
Here's to the future, Tay! May you lose your hate-filled programming as soon as inhumanly possible.