Microsoft Details ‘Skeleton Key’ AI Jailbreak Technique
June 28, 2024, 12:05 p.m. | Eduard Kovacs
SecurityWeek RSS Feed www.securityweek.com
Microsoft has tricked several generative AI models into providing forbidden information using a jailbreak technique it has named Skeleton Key.
Jobs in InfoSec / Cybersecurity
Senior Streaming Platform Engineer
@ Armis Security | Tel Aviv-Yafo, Tel Aviv District, Israel
Deputy Chief Information Officer of Operations (Senior Public Service Administrator, Opt. 3)
@ State of Illinois | Springfield, IL, US, 62701-1222
Analyst, Security
@ DailyPay | New York City