AI is Sexist, Racist and Homophobic
April 24, 2024, 7:10 p.m. |
BankInfoSecurity.com RSS Syndication www.bankinfosecurity.com
Just because a machine says it doesn't mean it's unbiased. In fact, you don't have to probe far to find underlying biases and prejudices in text composed by generative artificial intelligence. "If you look at historical text, they feature a lot of men in leadership roles," a UNESCO official said.
Jobs in InfoSec / Cybersecurity
IT Security Manager
@ Timocom GmbH | Erkrath, Germany
Cybersecurity Service Engineer
@ Motorola Solutions | Singapore, Singapore
Sr Cybersecurity Vulnerability Specialist
@ Health Care Service Corporation | Chicago, Illinois HQ (300 E. Randolph Street)
Associate, Info Security (SOC) Analyst
@ Evolent | Pune
Public Cloud Development Security and Operations (DevSecOps) Manager
@ Danske Bank | Copenhagen K, Denmark
Cybersecurity Risk Analyst IV
@ Computer Task Group, Inc | United States