Tom's Hardware, June 01, 2024
A jailbreak of OpenAI's GPT-4o used leetspeak to get ChatGPT to bypass its usual safety measures, allowing users to obtain instructions for hotwiring cars, synthesizing LSD, and other illicit activities. Permalink: "'Godmode' GPT-4o jailbreak released by hacker — powerful exploit was quickly banned" Posted: June 1, 2024, 12:27pm CST