Here we go again: another chatbot is trained on a big pile o' online utterances, and — surprise, surprise — having soaked up Internet bile, it begins repeating it.
Haven't developers learned anything from Microsoft's Tay? This happens basically every time someone tries digesting online talk through the four stomachs of a neural network.