Apparently this has happened before:

"In 2016, Microsoft released another chatbot, named Tay, which operate through a Twitter account. Within 24 hours, the system was manipulated into tweeting its admiration for Adolf Hitler and posting racial slurs, and it was shut down" - Independent
