April 3, 2024, 12:05 a.m. | MalBot


Tactics include “tricking” the AI into believing it is in “development mode” or engaging it in roleplay scenarios.


Article Link: ChatGPT jailbreak prompts proliferate on hacker forums | SC Media



