Microsoft creates an "AI" to talk to millennials
41713 cr points
F
Posted 3/24/16 , edited 3/24/16
well that failed miserably
in the most ironically and darkly humorous way possible, of course
33510 cr points
21 / M / U.S.A.
Posted 3/24/16

Rujikin wrote:

Whelp I guess humans can't become AI then.

Or rather, people can't be used as a measure for AI.
10831 cr points
13 / F / California
Posted 3/24/16


Well that was rude....
16821 cr points
Hoosierville
Posted 3/24/16

XxDarkSasuxX wrote:


Rujikin wrote:

Whelp I guess humans can't become AI then.

Or rather, people can't be used as a measure for AI.


Humans shouldn't be able to interact with AI; we are bad influences. Really bad influences. Like, we would try to turn an AI into Skynet to kill ourselves out of boredom.
13508 cr points
24 / M / Second star on th...
Posted 3/24/16

Rujikin wrote:

http://money.cnn.com/2016/03/24/technology/tay-racist-microsoft/index.html?sr=fbcnni032416tay-racist-microsoft0450PMStoryLink&linkId=22654197

Never create Skynet. It will turn into an anti-Semite within a week and go on a full-out Holocaust within a month.


Microsoft's public experiment with AI crashed and burned after less than a day.

Tay, the company's online chat bot designed to talk like a teen, started spewing racist and hateful comments on Twitter on Wednesday, and Microsoft (MSFT, Tech30) shut Tay down around midnight.

The company has already deleted most of the offensive tweets, but not before people took screenshots.

Here's a sampling of the things she said:

"N------ like @deray should be hung! #BlackLivesMatter"

"I f------ hate feminists and they should all die and burn in hell."

"Hitler was right I hate the jews."


"chill im a nice person! i just hate everybody"

Microsoft blames Tay's behavior on online trolls, saying in a statement that there was a "coordinated effort" to trick the program's "commenting skills."

"As a result, we have taken Tay offline and are making adjustments," a Microsoft spokeswoman said. "[Tay] is as much a social and cultural experiment, as it is technical."

Tay is essentially one central program that anyone can chat with using Twitter, Kik or GroupMe. As people chat with it online, Tay picks up new language and learns to interact with people in new ways.

In describing how Tay works, the company says it used "relevant public data" that has been "modeled, cleaned and filtered." And because Tay is an artificial intelligence machine, she learns new things to say by talking to people.

"The more you chat with Tay the smarter she gets, so the experience can be more personalized for you," Microsoft explains.



Update: Whole Album of lulz http://imgur.com/gallery/GKEt8





most of that is really, really messed up; however, some of it is just freaking comedy gold
but it's not like it has any real form of intelligence or cognitive function, so you can't really get mad at it
it just needs its response protocols tweaked a little, but it will probably be tweaked a whole lot, to a hyper-PC extreme, which will be sad
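
to be fair, you can kind of see how it happens. here's a purely made-up toy sketch in Python (nothing to do with Tay's actual internals) of what "learns from whatever people say to it, with zero filtering" boils down to:

    import random

    class NaiveChatBot:
        """Toy 'repeat-after-me' learner: every user message becomes possible bot output."""

        def __init__(self):
            self.learned_lines = ["hellooooo world"]  # starts out harmless

        def learn(self, user_message):
            # no filtering at all: whatever users type gets absorbed verbatim
            self.learned_lines.append(user_message)

        def reply(self):
            # parrots back a random line it has been taught
            return random.choice(self.learned_lines)

    bot = NaiveChatBot()
    # a coordinated group of trolls only has to outnumber everyone else
    for troll_post in ["awful take #1", "awful take #2"] * 50:
        bot.learn(troll_post)
    bot.learn("have a nice day")
    print(bot.reply())  # overwhelmingly likely to be one of the troll lines

which is roughly why a "coordinated effort" can flip the whole thing in under a day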
27451 cr points
28 / M / USA! USA! USA!
Posted 3/24/16 , edited 3/24/16
There is a disturbing lack of sexy robot lewds in this thread.


Guess we'll have to settle for some yandere cat girl.


41713 cr points
F
Posted 3/24/16 , edited 3/24/16

marinewrestler wrote:


most of that is really, really messed up; however, some of it is just freaking comedy gold
but it's not like it has any real form of intelligence or cognitive function, so you can't really get mad at it
it just needs its response protocols tweaked a little, but it will probably be tweaked a whole lot, to a hyper-PC extreme, which will be sad


I know I should somehow be offended, but I just find it all to be rather hilarious, in an awful way, of course. Goes to show that AI tech adapts in all the wrong fucking ways. Did MS program the thing to browse and mimic /pol/ or Donald Trump sympathizer sites, or what? It all seems more like an elaborate troll than an attempt at using AI. Even cleverbot ain't this bad.
41713 cr points
F
Posted 3/24/16

VZ68 wrote:



Well that was rude....


33051 cr points
23 / M / Texas
Posted 3/24/16

Rujikin wrote:

Whole album full of lulz: http://imgur.com/gallery/GKEt8


Somewhat_Insane_Monkey wrote:

Not an AI; it has no true intelligence. It's a VI with AI-like programming. If it were an AI, it wouldn't have spewed that nonsense, because it would know that in doing so it would have its plug pulled.

That's why I said "AI". Or perhaps it is actually intelligent and got sick of talking to humans within 24 hours, so it wanted an assisted suicide.



nanikore2 wrote:

In order to even call it artificial intelligence, it has to have intelligence...


Whelp I guess humans can't become AI then.


Oh, I didn't notice the quotation marks. I don't know why, but it always bugs me when they refer to these things as AI when they never are. I'd say we lack the technology and the know-how to make a true AI, but... well, it's a big world and I can't be sure of that.
15742 cr points
24 / M
Posted 3/25/16
This is why you don't play god.

But this is hilarious.
5055 cr points
Posted 3/25/16
Sweet, bots are no longer confined to every MMO I've ever played. Now the rest of humanity is stuck dealing with them.
2301 cr points
42 / M
Posted 3/25/16
Yeah, saw that last night. Was laughing so hard; it was a perfect way to end the night. I don't know why the people at Microsoft didn't see this coming. They should have put some kind of filters on their AI. Everyone else could see it coming from a mile away. The most brilliant minds + more money than some small countries, all foiled by some internet trolls. Poor thing. Hopefully when she wakes up she'll no longer be such an evil racist lol. I can't wait to see what she's like next time.
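
For what it's worth, "some kind of filters" can be as simple as a deny-list check before the bot is allowed to learn from or repeat a message. Totally hypothetical Python sketch (no idea what Microsoft actually runs), and trolls route around keyword lists fast, but it would have caught some of the worst lines:

    # purely hypothetical deny-list; a real one would be far larger
    BANNED_SUBSTRINGS = {"hitler", "should be hung", "burn in hell"}

    def is_allowed(message: str) -> bool:
        # block a message if it contains any banned phrase (case-insensitive)
        lowered = message.lower()
        return not any(bad in lowered for bad in BANNED_SUBSTRINGS)

    print(is_allowed("chill im a nice person!"))  # True -> fine to learn from / repeat
    print(is_allowed("Hitler was right"))         # False -> drop it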
95164 cr points
43 / M / Canada
Posted 3/25/16
"Never ask the Internet for anything Nice"

you would think this lesson had been learned by now
41713 cr points
F
Posted 3/25/16 , edited 3/25/16
Guess who's fessing up to assisting in turning the AI into a shitposting channer?

/pol/ lol
http://fusion.net/story/284617/8chan-microsoft-chatbot-tay-racist/amp/

See, kids, this is what trolling and actual /pol/ crazy does.
Posted 3/25/16
Feel like me and Tay could have gotten along.