Oct. 9, 2023, 5:49 p.m. | Jim Nash

Biometric Update www.biometricupdate.com


An annual U.S. Homeland Security study of bias in commercial biometric software, released in August, suggests results similar to those found in related work the department has conducted since 2018.

AI has a harder time matching face images when the subject is female, has a darker skin tone or wears glasses, according to the report.

In fact, 57 percent of models delivered lower mated similarity scores for subjects with darker skin tones. Eyewear confused the algorithms on 96 percent of models.
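A mated similarity score is the score a matcher produces when comparing two images known to be of the same person; a fair algorithm should distribute those scores similarly across demographic groups. Below is a minimal sketch of how such scores might be tallied per group. The matcher, image pairs and group labels are hypothetical placeholders for illustration, not the DHS test harness.

from statistics import median

def mated_scores(matcher, pairs):
    # Score image pairs known to show the same person ("mated" pairs).
    return [matcher(img_a, img_b) for img_a, img_b in pairs]

def median_score_by_group(matcher, pairs_by_group):
    # Median mated score per demographic group; a consistently lower
    # median for one group is the kind of skew the report describes.
    return {group: median(mated_scores(matcher, pairs))
            for group, pairs in pairs_by_group.items()}

# Toy demonstration: the "images" here are just precomputed scores.
toy_matcher = lambda a, b: a
groups = {
    "lighter_skin": [(0.92, None), (0.95, None), (0.91, None)],
    "darker_skin":  [(0.84, None), (0.88, None), (0.83, None)],
}
print(median_score_by_group(toy_matcher, groups))
# {'lighter_skin': 0.92, 'darker_skin': 0.84}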

However, image-acquisition products and matching …

