News

The Microsoft AI chatbot has professed love, threatened harm, and theorized about how it could hack people. So, yeah, a weird rollout.
We've all had our fun; we know that ChatGPT and Bing Chat can say silly things, or can be 'jailbroken' into saying things they shouldn't say at all.