April 18, 2022, 1:20 a.m. | Xavier Gitiaux, Aditya Khant, Ebrahim Beyrami, Chandan Reddy, Jayant Gupchup, Ross Cutler

cs.CR updates on arXiv.org arxiv.org

Noise suppression models running in production environments are commonly trained on publicly available datasets. However, this approach leads to regressions due to the lack of training and testing on representative customer data. Moreover, for privacy reasons, developers cannot listen to customer content. This "ears-off" situation motivates augmenting existing datasets in a privacy-preserving manner. In this paper, we present AURA, a solution that makes existing noise suppression test sets more challenging and diverse while remaining sample efficient. AURA is "ears-off" because it …

