Encryption Based Covert Channel for Large Language Models
April 16, 2024, 9:18 a.m. | IACR News, www.iacr.org
ePrint Report: Encryption Based Covert Channel for Large Language Models
Yongge Wang
Transformer neural networks have gained significant traction since their introduction, becoming pivotal across diverse domains. Particularly in large language models like Claude and ChatGPT, the transformer architecture has demonstrated remarkable efficacy. This paper provides a concise overview of transformer neural networks and delves into their security considerations, focusing on covert channel attacks and their implications for the safety of large language models. We present a covert channel utilizing …
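The truncated abstract does not reveal the paper's construction, but the basic idea of a covert channel in LLM output can be sketched in a few lines. The toy Python below is an illustration of the general steganographic principle, not Wang's encryption-based scheme; all names (embed, extract, the simulated candidate lists) are hypothetical and not from the paper. It hides one secret bit per generation step by choosing between the two most likely next tokens.

# Illustrative toy only: a generic token-selection covert channel.
# This is NOT the construction from the ePrint report; it shows the
# general idea of hiding bits in an LLM's sampling choices.

def embed(bits, candidate_lists):
    """Emit one token per step; its index among the top-2 candidates
    encodes one secret bit (0 -> top choice, 1 -> runner-up)."""
    return [candidates[int(bit)]
            for bit, candidates in zip(bits, candidate_lists)]

def extract(tokens, candidate_lists):
    """Recover the bits by checking which candidate was emitted."""
    return "".join(str(candidates.index(tok))
                   for tok, candidates in zip(tokens, candidate_lists))

if __name__ == "__main__":
    # Simulated per-step top-2 candidates; in a real attack these would
    # be the model's highest-probability next tokens, recomputable by
    # the receiver running the same model deterministically.
    steps = [["the", "a"], ["cat", "dog"], ["sat", "slept"], ["down", "there"]]
    secret = "1011"                      # hypothetical payload
    covert = embed(secret, steps)
    print(covert)                        # ['a', 'cat', 'slept', 'there']
    assert extract(covert, steps) == secret

A receiver who can reproduce the same per-step candidate ranking (e.g., by running the same model deterministically) can decode the payload; a standard refinement is to encrypt the payload bits before embedding them, so that the token choices look like random coin flips to an outside observer.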