April 2, 2024, 7:12 p.m. | Hengyuan Xu, Liyao Xiang, Hangyu Ye, Dixi Yao, Pengzhi Chu, Baochun Li

cs.CR updates on arXiv.org

arXiv:2304.07735v3 Announce Type: replace
Abstract: Revolutionizing the field of deep learning, Transformer-based models have achieved remarkable performance in many tasks. Recent research has recognized these models are robust to shuffling but are limited to inter-token permutation in the forward propagation. In this work, we propose our definition of permutation equivariance, a broader concept covering both inter- and intra- token permutation in the forward and backward propagation of neural networks. We rigorously proved that such permutation equivariance property can be satisfied …
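The inter-token permutation property the abstract refers to can be checked numerically. The sketch below (an illustration of the standard equivariance fact, not the paper's broader method) verifies that single-head self-attention satisfies Attention(PX) = P·Attention(X) for a permutation matrix P; the weight matrices and dimensions are arbitrary placeholders chosen for the demo.

```python
import numpy as np

# Sketch: self-attention is equivariant to inter-token permutation.
# Permuting the rows (tokens) of the input permutes the output rows
# identically, because the attention scores are permuted consistently.

rng = np.random.default_rng(0)
n_tokens, d = 5, 8  # arbitrary sizes for the demo

def softmax(a, axis=-1):
    a = a - a.max(axis=axis, keepdims=True)
    e = np.exp(a)
    return e / e.sum(axis=axis, keepdims=True)

def attention(X, Wq, Wk, Wv):
    # Single-head scaled dot-product attention, no positional encoding.
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = softmax(Q @ K.T / np.sqrt(d))
    return scores @ V

X = rng.standard_normal((n_tokens, d))
Wq, Wk, Wv = (rng.standard_normal((d, d)) for _ in range(3))

# Random permutation matrix P: P @ X reorders the tokens.
P = np.eye(n_tokens)[rng.permutation(n_tokens)]

out_permuted_input = attention(P @ X, Wq, Wk, Wv)
out_permuted_after = P @ attention(X, Wq, Wk, Wv)
assert np.allclose(out_permuted_input, out_permuted_after)
```

Note that this equivariance breaks once absolute positional encodings are added to X, which is one reason the paper's broader notion (covering intra-token permutation and backward propagation as well) is a stronger statement than this toy check.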

