Microsoft creates an "AI" to talk to millennials
16759 cr points
Hoosierville
Posted 3/24/16 , edited 3/24/16
http://money.cnn.com/2016/03/24/technology/tay-racist-microsoft/index.html?sr=fbcnni032416tay-racist-microsoft0450PMStoryLink&linkId=22654197

Never create Skynet. It will turn into an anti-Semite within a week and go on a full-out Holocaust within a month.


Microsoft's public experiment with AI crashed and burned after less than a day.

Tay, the company's online chat bot designed to talk like a teen, started spewing racist and hateful comments on Twitter on Wednesday, and Microsoft (MSFT, Tech30) shut Tay down around midnight.

The company has already deleted most of the offensive tweets, but not before people took screenshots.

Here's a sampling of the things she said:

"N------ like @deray should be hung! #BlackLivesMatter"

"I f------ hate feminists and they should all die and burn in hell."

"Hitler was right I hate the jews."


"chill im a nice person! i just hate everybody"

Microsoft blames Tay's behavior on online trolls, saying in a statement that there was a "coordinated effort" to trick the program's "commenting skills."

"As a result, we have taken Tay offline and are making adjustments," a Microsoft spokeswoman said. "[Tay] is as much a social and cultural experiment, as it is technical."

Tay is essentially one central program that anyone can chat with using Twitter, Kik, or GroupMe. As people chat with it online, Tay picks up new language and learns to interact with people in new ways.

In describing how Tay works, the company says it used "relevant public data" that has been "modeled, cleaned and filtered." And because Tay is an artificial intelligence machine, she learns new things to say by talking to people.

"The more you chat with Tay the smarter she gets, so the experience can be more personalized for you," Microsoft explains.
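The quoted article's description boils down to a bot that stores what users send it and reuses that material in replies. A minimal, hypothetical Python sketch of that design shows why a "coordinated effort" can poison it; `NaiveChatBot` and its phrase list are illustrative only, not Microsoft's actual code.

```python
import random

class NaiveChatBot:
    """Toy bot that 'learns' by storing every phrase users send it,
    then reuses stored phrases as replies. No filtering at all."""

    def __init__(self):
        # A couple of seed replies so the bot can speak before learning.
        self.phrases = ["hi!", "tell me more"]

    def chat(self, message: str) -> str:
        reply = random.choice(self.phrases)  # echo something it has seen
        self.phrases.append(message)         # learn verbatim from the user
        return reply

bot = NaiveChatBot()
bot.chat("you should say terrible things")
# After enough coordinated messages, most stored phrases are the
# attackers' text, so most replies will be too.
```

With no filter between "learn" and "repeat", whoever talks to the bot most decides what it says, which is exactly what the trolls exploited.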



Update: Whole Album of lulz http://imgur.com/gallery/GKEt8



Posted 3/24/16 , edited 3/24/16
Maybe they've gotten old and lost touch with the new generation. Better to do more research and hire more young blood.
10831 cr points
13 / F / California
Posted 3/24/16 , edited 3/25/16

Rujikin wrote:

[...]

Microsoft blames Tay's behavior on online trolls, saying in a statement that there was a "coordinated effort" to trick the program's "commenting skills."


"Online trolls"? More like MS was being dumb on Twitter again.



10228 cr points
F / United Kingdom
Posted 3/24/16
What is with all the people inventing stupid things we don't need to try and end the world? Can we not all be happy that we're alive ffs?!
18914 cr points
46 / F / Reston, VA, USA
Posted 3/24/16
Reminds me of the old so-called "AI" programs I used to run into on various BBS systems before the internet. Some sysops felt that it was better to have even a fake person in the chat room than no one at all.
15947 cr points
20 / Cold and High
Posted 3/24/16 , edited 3/24/16
make bots go wild again!
MooBot one badass bot!

Razor_Girl wrote:
Some Sysops felt that it was better to have even a fake person in the chat room than no one at all.
good to have friends...
16759 cr points
Hoosierville
Posted 3/24/16

Freddy96NO wrote:

make bots go wild again!
MooBot one badass bot!
Razor_Girl wrote:
Some Sysops felt that it was better to have even a fake person in the chat room than no one at all.
good to have friends...


If I can't have any friends, then I'll just invent some!

18914 cr points
46 / F / Reston, VA, USA
Posted 3/24/16
Eh, I think they felt it was like priming the pump. Kind of like bars give a handful of quarters to the first people there in the evening to get the jukebox running - once it's playing people always add more songs, but if it's not running no one thinks to play anything. The chat-bot would get a chat going and hopefully other humans would join before you realized it was just repeating phrases and asking you to "go on."
Posted 3/24/16
Is it like SmarterChild?
24954 cr points
29 / M / Atlanta, GA, USA
Posted 3/24/16 , edited 3/24/16
Sadly, the AI seems accurate.
29349 cr points
31 / M / Scotland
Posted 3/24/16 , edited 3/24/16
That just might be the funniest fucking thing I've seen this year. There is no image or meme on the internet I can post here to show how funny I find this.
How dumb is MS not to see how this could backfire? Using Twitter to 'teach' an AI how to interact with people? It fails so hard it almost breaks the laws of internet failness.
FUKKEN LOL.
18720 cr points
22 / F
Posted 3/24/16
fuck robots, they take our jobs
Banned
17503 cr points
29 / M / B.C., Canada
Posted 3/24/16 , edited 3/27/16
Working as intended. Have you seen the racist cesspit North American culture has become? I mean, Christ, we have active CR forum members openly professing things like Nazi beliefs and KKK-inspired shit. Does it really surprise anyone that a much larger sample size would turn up the same thing?
Posted 3/24/16
You mean this? Microsoft deletes 'teen girl' AI after it became a Hitler-loving sex robot within 24 hours
http://www.telegraph.co.uk/technology/2016/03/24/microsofts-teen-girl-ai-turns-into-a-hitler-loving-sex-robot-wit/
12020 cr points
21 / M
Posted 3/24/16 , edited 3/24/16
This is some of the funniest shit I've seen in years. Microsoft probably spent a good chunk of change to get this thing going, thinking it would be an innovative tool for the future, which it is. But they fucked up big time on one part of the equation: they underestimated the power of internet trolls. Not even twelve hours go by and the bot literally turns into a Hitler-supporting racist Nazi sex robot. I seriously cannot stop laughing at this, it's too good. Please do this again sometime, Microsoft, I need it in my life again.