In March, Discord announced that it had integrated OpenAI’s technology into its bot named Clyde, turning it into an AI-powered chatbot. As with every other chatbot launched in the last few months, users have been trying to trick Clyde into saying things it’s not supposed to say, a process colloquially known as “jailbreaking.” […]