'Godmode' GPT-4o jailbreak released by hacker — powerful exploit was quickly banned
Tomshardware
June 01, 2024
A jailbreak of OpenAI's GPT-4o used leetspeak to make ChatGPT bypass its usual safety measures, letting users obtain instructions for hotwiring cars, synthesizing LSD, and other illicit activities.
Posted: June 1, 2024, 12:27pm CST