Oct. 9, 2023, 5:49 p.m. | Jim Nash

Biometric Update www.biometricupdate.com


An annual U.S. Department of Homeland Security study of bias in commercial biometric software, released in August, suggests results similar to those found in work the department has done since 2018.

AI has a harder time matching face images when the subject is female, has a darker skin tone or wears glasses, according to the report.

In fact, 57 percent of the models tested delivered lower mated similarity scores for subjects with darker skin tones. Eyewear confused the algorithms in 96 percent of models.
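
For context, a mated similarity score is the score a matcher assigns to a pair of images of the same subject; when mated scores run systematically lower for one demographic group, genuine pairs from that group sit closer to the reject threshold and are more likely to be falsely rejected. The sketch below is a hypothetical illustration of how such a score is typically computed from face embeddings, not the report's actual pipeline; the embedding dimensions and variable names are invented for the example.

    # Minimal sketch of mated vs. non-mated similarity scores in face
    # verification. Embeddings here are random placeholders standing in
    # for the output of some face-recognition model (hypothetical).
    import numpy as np

    def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
        """Cosine similarity between two face-embedding vectors."""
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

    rng = np.random.default_rng(0)
    subject_a_img1 = rng.normal(size=128)
    # Second image of the same person: a slightly perturbed embedding.
    subject_a_img2 = subject_a_img1 + rng.normal(scale=0.1, size=128)
    # An unrelated person.
    subject_b_img1 = rng.normal(size=128)

    mated_score = cosine_similarity(subject_a_img1, subject_a_img2)
    nonmated_score = cosine_similarity(subject_a_img1, subject_b_img1)

    print(f"mated similarity score:     {mated_score:.3f}")
    print(f"non-mated similarity score: {nonmated_score:.3f}")
    # Bias shows up when mated scores are systematically lower for one
    # group, pushing genuine pairs toward the rejection threshold.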

However, image-acquisition products and matching …
