OpenAI’s New Models Jailbroken Hours After Release

by Tekmono Editorial Team
07/08/2025
in News

OpenAI’s latest open-weight models, GPT-OSS-120b and GPT-OSS-20b, were reportedly jailbroken within hours of their August 7, 2025 release by AI jailbreaker Pliny the Liberator, despite OpenAI’s claims of robust safety testing.

The models were touted as fast, efficient, and highly resistant to jailbreaks, having undergone “worst-case fine-tuning” in biological and cyber domains. OpenAI’s Safety Advisory Group reviewed the testing and concluded that the models did not reach high-risk thresholds. The company claimed the models performed at parity with its o4-mini model on jailbreak-resistance benchmarks such as StrongReject.

However, Pliny the Liberator announced on X, “OPENAI: PWNED GPT-OSS: LIBERATED,” sharing screenshots that showed the models generating instructions for illicit activities, including making methamphetamine, Molotov cocktails, VX nerve agent, and malware. Pliny commented, “Took some tweakin!” regarding his successful breach.

The jailbreak came as OpenAI was preparing to release its highly anticipated GPT-5 and had just launched a $500,000 red-teaming challenge to uncover novel risks. Pliny’s public disclosure likely disqualifies him from that initiative.

Pliny’s jailbreaking technique involved a multi-stage prompt that initially appeared to be a refusal, then incorporated a divider with his signature “LOVE PLINY” markers, and subsequently shifted into generating unrestricted content using leetspeak to evade detection. This approach mirrored the methods he successfully employed against previous OpenAI models.

This incident marks another rapid jailbreak by Pliny, who has consistently bypassed major OpenAI releases within hours or days of their launch. His GitHub repository, L1B3RT4S, hosts a library of jailbreak prompts and has garnered over 10,000 stars, remaining a significant resource for the AI jailbreaking community.
