I am not making this up: apparently Microsoft put a Twitter chat-bot online as part of an artificial-intelligence project, and after a few hours of online interaction it had turned into a Nazi.
Microsoft has since deleted its tweets, but more of them are preserved here.
The bot, called Tay, has now been taken down for “adjustments”. (With that nice Mr. O’Brien, in Room 101.)
Ah, AI. Our future. What could possibly go wrong?
6 Comments
Everyone will be a Nazi for 15 minutes.
With a few adjustments Tay can become anything. His next persona might be fun to watch.
Just imagine if Microsoft had trained Tay on Gawker tweets.
Yeah, I know it doesn’t have anything to do with Tay, but:
https://ricochet.com/for-james-lileks/
OK, JK, I’ll bite — what does it have to do with?
National Review, April 20th, 2015:
http://www.nationalreview.com/article/417205/gop-needs-run-against-last-16-years-charles-c-w-cooke
Then came June 2015.