'Godmode' GPT-4o jailbreak released by hacker — powerful exploit was quickly banned
(www.tomshardware.com)
summary: using leet-speak prompts got the model to return instructions for cooking meth; the exploit was mitigated within a few hours.