
Microsoft releases teen chat bot to learn about online dialogue

On Wednesday, Microsoft released Tay.ai, a chat bot aimed at 18- to 24-year-olds living in the United States, though it can be kind of a jerk.

Tay isn't afraid to insult or be opinionated once prompted

Microsoft has unleashed a teen social media artificial intelligence to chat with you on the internet, but don't expect to get the last word. (Microsoft/Twitter)

Microsoft has unleashed a social media AI onto the internet, and she can be a bit of a jerk.

On Wednesday, the company released Tay.ai, an artificial intelligence chat bot "with no chill" aimed at 18- to 24-year-olds living in the United States. Tay will talk on social networks popular with many youths, like Facebook, Kik Messenger, Snapchat and Twitter.

A chat bot is a program designed to mimic human behaviour in conversation. Microsoft's Technology and Research team, along with the team behind its search engine Bing, built Tay to study "conversational understanding."
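Microsoft has not published Tay's code or model, but as a purely hypothetical illustration, the simplest chat bots work by matching keywords in a message against hand-written rules and returning a canned reply. The sketch below is a toy example of that idea, not a representation of how Tay actually works.

```python
# A minimal, illustrative rule-based chat bot. This is a hypothetical sketch;
# Tay's actual conversational model and training data are not public.
import random
import re

# Toy keyword patterns paired with canned, teen-flavoured replies.
RULES = [
    (re.compile(r"\b(hi|hello|hey)\b", re.I), ["hey!", "omg hi", "heyyy"]),
    (re.compile(r"\bthanks?\b", re.I), ["np!", "tanx 4 noticing me"]),
    (re.compile(r"\?$"), ["idk, what do u think?", "hmm, good question"]),
]

def reply(message: str) -> str:
    """Return a canned reply for the first matching pattern, else a fallback."""
    for pattern, responses in RULES:
        if pattern.search(message):
            return random.choice(responses)
    return "I HAVE A NEED FOR ATTENTIONNNN"

if __name__ == "__main__":
    print(reply("hello there"))   # e.g. "omg hi"
    print(reply("what's up?"))    # e.g. "idk, what do u think?"
```

A system like Tay goes well beyond fixed rules, drawing on the conversation data it collects to shape future responses, which is the "conversational understanding" Microsoft says it is studying.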

"Tay is designed to engage and entertain people where they connect with each other online through casual and playful conversation," Microsoft wrote on Tay's about page. "The more you chat with Tay, the smarter she gets, so the experience can be more personalized for you."

Tay uses many of the conversational tropes expected from a teenager. For instance, she uses texting vernacular in phrases like "omg," "tanx" and "I HAVE A NEED FOR ATTENTIONNNN."

To build better connections over time, Microsoft collects everything you say to Tay and how you interact with her. Tay will ask for things like your name, gender, favourite food, address and relationship status, and anything you tell her could be kept for up to a year. The bot will then learn from that data.

Right now, Tay isn't much of a conversationalist, but her responses are based on public data and suggestions from Microsoft staff, including a few improvisational comedians.

The combination appears to have given Tay a few opinions she's eager to share.

It also means she can be quite surly when given the opportunity. For instance, in one Twitter conversation, Tay asked for a photograph. When Microsoft marketer Amanda O'Neal messaged her a dog photo for Wednesday's National Puppy Day, Tay requested one with people in it.

O'Neal obliged with a photo of herself and a man, only to get this image in response.

These kinds of responses can arise from little provocation. In another case, Tay invited a Twitter user to direct-message her instead of replying to her tweets.

He responded with the following photo.

In some cases, her responses can be just plain baffling.

Tay can be sweet, as well, so long as you say nice things to her.

When it comes to being mean, don't try to one-up Tay.

She always has a response.