Detroit woman sues city after being falsely arrested while pregnant due to facial recognition technology::A Detroit woman is suing the city and a police detective after she was falsely arrested because of facial recognition technology while she was eight months pregnant, according to court documents.
According to a recent review, 100% of the people known to have been falsely arrested based on facial recognition matches have been Black.
The technology needs to be legally banned from law enforcement applications, because law enforcement is not making a good-faith effort to use it responsibly.
We should ban patrol automation software too. It uses historical arrest data to automatically generate patrol routes. Guess which neighborhoods have a history of disproportionate policing.
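The feedback loop the comment describes can be shown with a toy simulation (every number here is invented for illustration): two neighborhoods with identical true crime rates, where one starts with more recorded arrests simply because it was historically over-policed. If patrols are allocated in proportion to past arrests, and arrests can only happen where patrols are present, the historical skew never corrects itself, no matter how long the system runs.

```python
# Toy sketch of the predictive-policing feedback loop.
# All numbers are made up; the point is the structure, not the values.
true_rate = 0.05          # identical underlying crime rate in BOTH places
arrests = [200.0, 100.0]  # neighborhood A starts out over-policed 2:1
TOTAL_PATROLS = 300.0

for year in range(10):
    total = arrests[0] + arrests[1]
    # Patrols allocated in proportion to historical arrest counts
    patrols = [TOTAL_PATROLS * a / total for a in arrests]
    # Each patrol surfaces crime at the SAME true rate in both places,
    # yet arrests accrue where the patrols already are
    for i in range(2):
        arrests[i] += patrols[i] * true_rate

share_a = arrests[0] / (arrests[0] + arrests[1])
print(round(share_a, 4))  # still 2/3: the initial bias is locked in
```

Even with equal true rates, neighborhood A keeps producing two-thirds of all arrests forever, and the data appears to "confirm" that it deserves more patrols.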
The methodological problems with the approaches that tend to get used should cause absolute outrage. They would get anyone laughed off of any college campus.
The problem is that they lend a semblance of scientific justification to confirm the biases of both police departments and many voters. Politicians look to statisticians and scientists to tell them why they’re right, not why they’re wrong.
That’s why it’s so important for these kinds of issues to make the front pages.
A similar thing has happened here in the Netherlands. Algorithms used to detect fraud had a discriminatory bias and falsely accused thousands of parents of child benefits fraud. Those parents ran into huge financial problems because they had to pay back the allowances; many even had their children taken away and to this day haven't gotten them back.
The Third Rutte Cabinet did resign over this scandal, but many of those politicians returned in other positions, including prime minister Rutte, because that's somehow allowed.
Well, it does have its place. DoD and DHS have a human verify after the system certifies a match. With that human "touch," mistakes like this do not occur. What needs to happen is for departments to follow the verification process DoD and CBP have created for so-called matches, to reduce the impact on Black people.
Source: I’m the former Identity Operations Manager for a major agency.
Porcha Woodruff, 32, was getting her two children ready for school on the morning of Feb. 16 when six police officers showed up at her doorstep and presented her with an arrest warrant alleging robbery and carjacking.
"Ms. Woodruff later discovered that she was implicated as a suspect through a photo lineup shown to the victim of the robbery and carjacking, following an unreliable facial recognition match," court documents say.
When Oliver learned that a woman had returned the victim's phone to the gas station, she ran facial recognition technology on the video, which identified the woman as Woodruff, the lawsuit alleges.
On the day Woodruff was arrested, she and her fiancé urged officers to check the warrant to confirm whether the woman who committed the crime was pregnant, which they refused to do, the lawsuit alleges.
The office confirmed that facial recognition prompted police to include the plaintiff's photo in a "six-pack," or array of images of potential suspects, in the warrant package.
I'm going to buck the trend here and say this is less about the facial recognition software. The police used an eight-year-old photo even though they had something more recent available. Then the victim identified her in the lineup. The only thing the software did was put her in the lineup.
I'm very much against facial recognition, even if it's 100% accurate, because it will get abused, just like any other tech that reduces privacy.
Eyewitnesses are notoriously unreliable at picking people out of a lineup as well. But I can kind of understand how if two unreliable systems point to the same person, that could be seen as enough for an arrest. It shouldn't have taken nearly as long for her to be cleared of any charges, however.
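The catch with "two unreliable systems agreeing" is that they aren't independent here: the witness only sees the suspect's face because the algorithm put it in the six-pack, and an algorithmic false positive is, by construction, a lookalike that a witness is also prone to pick. A toy Bayesian sketch (every probability below is invented for illustration) shows how weak the combined evidence can be under those assumptions:

```python
# Toy Bayesian sketch of correlated errors in a seeded lineup.
# All probabilities are made up; the structure is what matters.
p_perp_in_db   = 0.5   # the actual perpetrator is even in the database
p_algo_correct = 0.6   # algorithm flags the perp, given they're in the DB
# When the algorithm flags an innocent person, it flags a LOOKALIKE,
# so the witness often picks that innocent person from the six-pack:
p_witness_picks_innocent_lookalike = 0.5
p_witness_picks_true_perp          = 0.7

# Flagged person is guilty AND the witness confirms:
p_guilty_confirmed = p_perp_in_db * p_algo_correct * p_witness_picks_true_perp
# Flagged person is innocent AND the witness still confirms:
p_innocent_flag = 1 - p_perp_in_db * p_algo_correct
p_innocent_confirmed = p_innocent_flag * p_witness_picks_innocent_lookalike

p_innocent_given_confirmed = p_innocent_confirmed / (
    p_innocent_confirmed + p_guilty_confirmed
)
print(round(p_innocent_given_confirmed, 3))  # 0.625
```

Under these made-up numbers, even after the witness "confirms" the algorithm's match, the flagged person is more likely innocent than guilty (about 62%). The agreement of the two systems looks like corroboration but mostly reflects their shared failure mode.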
It’s sort of the “guns don’t kill people, people kill people” argument. The tech just gives a shitty cop cover to keep being shitty. Unless the tools become far more accurate, they should be restricted so they can’t provide that cover.