
However, the AI-powered Clyde chatbot has the potential to run into the same issues and limitations as other products built on language models from ChatGPT’s parent company, OpenAI. The biggest example has been Microsoft’s Bing Chat, which launched its AI chatbot in February. Users dealt with the depressed and unhinged responses it generated in its early preview days, which led Microsoft to place limits on the number of turns per conversation. Those limits are now being lifted, apparently after some work on the brand’s part.

Discord is aware of the potential for its ChatGPT-based chatbot to go down a similar road and recommends that users not rely on Clyde for “advice, Discord support, or safety issues.” Users will also receive a warning when opting in to use the AI-updated Clyde for the first time.

Additionally, the company is aware that, given the tech-savvy demographic of its users, some may attempt to jailbreak the language models and force Clyde to say harmful things or break the terms of service.
