In every reported case where police mistakenly arrested someone using facial recognition, that person has been Black
Facial recognition software has consistently shown higher error rates on Black faces, yet police departments are still using it.
Was this AI trained on an unbalanced data set? (Mostly non-Black faces?) Or has it mainly been used to identify photos of Black people? I have so many questions: some technical, some about media sensationalism.
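Those two possibilities (a biased model vs. biased use of the model) are in principle separable if you have deployment data. Below is a minimal sketch, using entirely hypothetical match records, of how one might compare false-match rates across demographic groups; the record format, group labels, and numbers are all invented for illustration, not taken from any real system.

```python
from collections import defaultdict

# Hypothetical match records: (group, system_claimed_match, truly_same_person)
records = [
    ("A", True, False),   # false match
    ("A", True, True),
    ("A", True, False),   # false match
    ("B", True, True),
    ("B", False, False),
]

def per_group_false_match_rate(records):
    """False-match rate per group: wrongly claimed matches / all claimed matches."""
    claimed = defaultdict(int)
    wrong = defaultdict(int)
    for group, claimed_match, truly_same in records:
        if claimed_match:
            claimed[group] += 1
            if not truly_same:
                wrong[group] += 1
    return {g: wrong[g] / claimed[g] for g in claimed}

print(per_group_false_match_rate(records))
# e.g. {'A': 0.67, 'B': 0.0}
# A large gap in false-match rates points at the model itself;
# roughly equal rates combined with skewed search volumes would point
# at how the tool is being used rather than how it was trained.
```

Either pattern could produce the headline outcome, which is why the training-data question and the usage question need to be answered separately.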