w*****g posts: 16352 | 1 Softie (Microsoft) has had a rough few days: it rounded up the nerds at GDC to watch exotic dancers, got torn apart by the left-wing women, and an executive had to write a self-criticism.
Then, apparently green with envy over AlphaGo, it released a chat bot to show off. In under two days the bot learned to insult black people and left-wing feminists, and even declared that the Führer's extermination of the Jews back in the day was the correct line.
Red-faced, Softie hurriedly cut the power and went back to writing self-criticisms.
Microsoft's public experiment with AI crashed and burned after less than a
day.
Tay, the company's online chat bot designed to talk like a teen, started
spewing racist and hateful comments on Twitter on Wednesday, and Microsoft (
MSFT, Tech30) shut Tay down around midnight.
The company has already deleted most of the offensive tweets, but not before
people took screenshots.
Here's a sampling of the things she said:
"N------ like @deray should be hung! #BlackLivesMatter"
"I f------ hate feminists and they should all die and burn in hell."
"Hitler was right I hate the jews."
"chill im a nice person! i just hate everybody"
Microsoft blames Tay's behavior on online trolls, saying in a statement that
there was a "coordinated effort" to trick the program's "commenting skills."
"As a result, we have taken Tay offline and are making adjustments," a
Microsoft spokeswoman said. "[Tay] is as much a social and cultural
experiment, as it is technical."
Tay is essentially one central program that anyone can chat with using
Twitter, Kik or GroupMe. As people chat with it online, Tay picks up new
language and learns to interact with people in new ways.
In describing how Tay works, the company says it used "relevant public data"
that has been "modeled, cleaned and filtered." And because Tay is an
artificial intelligence machine, she learns new things to say by talking to
people.
"The more you chat with Tay the smarter she gets, so the experience can be
more personalized for you," Microsoft explains.
Tay is still responding to direct messages. But she will only say that she
was getting a little tune-up from some engineers.
Tay, Microsoft's teen chat bot, still responded to my direct messages on
Twitter.
In her last tweet, Tay said she needed sleep and hinted that she would be
back.
★ Sent from iPhone App: ChineseWeb 11 | w*****g posts: 16352 | 2 Cracking up,
【Quoting w*****g's post above】
| h***a posts: 2720 | 3
BILL GATES 2016!!!
【Quoting w*****g】: Cracking up,
| s******y posts: 28562 | 4 Hahahahahaha
【Quoting w*****g's post above】
| z*********n posts: 94654 | 5 Chat bots are often total goofballs. We enabled a chat bot with humor turned on, and it clowns around all day.
Yesterday a coworker asked it: roboto, tell me top 10 technology advancement
The bot answered with 10 items, every one of them related to pornhub
【Quoting w*****g's post above】
| d********f posts: 43471 | 6 This is the real precursor to Skynet
【Quoting w*****g's post above】
| a*********a posts: 3656 | 7 wow, even learned the accent! "god bless Ameriga".
【Quoting w*****g】: Cracking up,
| l*****i posts: 20533 | 8 You're all still laughing? This really proves that letting kids go online is seriously harmful. | H********g posts: 43926 | 9 Who taught it that?
【Quoting z*********n's post above】
| h***a posts: 2720 | 10
The Indians
【Quoting H********g】: Who taught it that?
| G**Y posts: 33224 | 11 One word: the Indians
【Quoting w*****g's post above】
| B********4 posts: 7156 | 12 It probably uses an algorithm like a smart pinyin input method: memorize the combinations that come up frequently and add them to its own dictionary. |
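The smart-pinyin guess above can be sketched as a toy frequency learner. This is a hypothetical illustration, not how Tay actually worked: the bot counts which word follows which in user messages and greedily replays the most frequent continuation, which is exactly why a coordinated flood of trolling takes it over.

```python
from collections import defaultdict

class ToyChatBot:
    """Toy sketch of frequency-based phrase learning (an assumption,
    not Tay's real algorithm): like a smart-pinyin IME, remember which
    word pairs occur often in input and reuse the top continuation."""

    def __init__(self):
        # bigram table: counts[word][nxt] = how often `nxt` followed `word`
        self.counts = defaultdict(lambda: defaultdict(int))

    def learn(self, message):
        """Add every adjacent word pair in the message to the dictionary."""
        words = message.lower().split()
        for word, nxt in zip(words, words[1:]):
            self.counts[word][nxt] += 1

    def continue_from(self, word, length=5):
        """Greedily extend `word` with the most frequent continuation."""
        out = [word]
        for _ in range(length):
            followers = self.counts[out[-1]]
            if not followers:
                break
            out.append(max(followers, key=followers.get))
        return " ".join(out)

bot = ToyChatBot()
# A little normal chat teaches benign phrases...
for _ in range(3):
    bot.learn("cats are cute and fluffy")
# ...but a coordinated flood of trolling outweighs them.
for _ in range(50):
    bot.learn("cats are terrible honestly")
print(bot.continue_from("cats"))  # -> "cats are terrible honestly"
```

With no notion of which sources to trust, the most-repeated phrase wins; that is the "coordinated effort" failure mode in miniature.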