In NYC, companies will have to prove their AI hiring software isn't sexist or racist
AI-infused hiring programs have drawn scrutiny, most notably over whether they end up exhibiting biases based on the data they’re trained on.
Isn't the whole point of AI decision making to provide plausible deniability for these sorts of things?
Yes, but if you train an AI on racist/sexist data, it will naturally do the same.
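That point is easy to show with a toy example. The sketch below is entirely hypothetical (the groups, counts, and threshold are made up, not from the article): a model trained to mimic biased historical hiring decisions reproduces the same disparity, which an impact-ratio check, the kind of metric a bias audit computes, immediately flags.

```python
# Toy illustration: biased training labels produce a biased model.
# All numbers are invented for demonstration purposes only.
from collections import defaultdict

# Hypothetical historical decisions: (group, hired). Group A was favored.
history = [("A", 1)] * 80 + [("A", 0)] * 20 + [("B", 1)] * 40 + [("B", 0)] * 60

def selection_rates(records):
    """Fraction of candidates hired, per group."""
    hired, total = defaultdict(int), defaultdict(int)
    for group, decision in records:
        total[group] += 1
        hired[group] += decision
    return {g: hired[g] / total[g] for g in total}

rates = selection_rates(history)
# Impact ratio: lowest group's selection rate over the highest group's.
# Under the common "four-fifths" rule of thumb, a ratio below 0.8
# is treated as evidence of adverse impact.
impact_ratio = min(rates.values()) / max(rates.values())
print(rates, round(impact_ratio, 2))  # {'A': 0.8, 'B': 0.4} 0.5
```

A model that simply learns to imitate these labels inherits the 0.5 impact ratio, well under the 0.8 rule of thumb, which is exactly why auditing the outputs matters more than the vendor's intent.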
Depends on how the law is applied...
Kinda like when a self-driving car kills someone: who is liable, the driver, the manufacturer, or the seller?
I guess another option is that you pay insurance and the insurer takes on the liability.