I'm not directly impacted by this, for multiple reasons (different government/country, childless, hard to confuse with underage). Even so, this sounds like a blatant Bad Idea® to me.
"The data suggests that for those between 25 and 35, 15 out of 1,000 females vs 7 out of 1,000 males might be incorrectly classified as under-25 (and would have the option of verifying using another method)," the filing states. "The range of difference by skin tone is between 8 out of 1,000 vs 28 out of 1,000."
Let me rephrase this: if you've got the "wrong" skin colour, there's a 28‰ = 1 in 35 chance that you're assumed to be wrongfully trying to access entertainment above your assumed age range. And that's according to the data from the filing, i.e. from the very people proposing this system, so there's a good chance they're lowballing the numbers to make it look better (or rather, less bad) than it is. That's fucking awful when you're dealing with people; but those fuckers from the ESRB don't care, right? "You're a cash cow, not an actual human being."
And even if the numbers are accurate (yeah, sure, let's all be a bunch of gullible morons), one detail not being mentioned here is that if you're black and a woman, you're especially screwed - because you're in both cohorts that increase the likelihood of false positives. I'm betting that, for black women, the false positive rate will be something between 50‰ = 1 in 20 and 100‰ = 1 in 10.
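Quick back-of-envelope in Python, using the filing's own per-1,000 figures. Big caveat: the independence assumption is mine, not theirs - the filing says nothing about how the two error sources interact, and any real-world interaction would presumably push the number higher, not lower.

    # Naive combination of the two false positive rates quoted from the filing,
    # ASSUMING (my assumption) the gender and skin tone error sources are independent.
    p_female = 15 / 1000        # women aged 25-35 wrongly flagged as under 25
    p_darker_skin = 28 / 1000   # darker skin tones wrongly flagged as under 25
    p_either = 1 - (1 - p_female) * (1 - p_darker_skin)
    print(f"{p_either * 1000:.1f} per 1000, roughly 1 in {round(1 / p_either)}")
    # prints: 42.6 per 1000, roughly 1 in 23

So even the naive no-interaction maths already lands around 43‰ = 1 in 23; my 1-in-20 to 1-in-10 bet just assumes the real interaction makes it worse.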
There's a legal principle around the world called "presumption of innocence"; in other words, unless it's proven that you're doing something wrong, you should be treated as if you're doing something lawful. This legal principle applies in the USA too, right? Guess what: it won't apply to the people incorrectly flagged as "I assume that you're underage", who'll need to go out of their way, through stupid bureaucracy and delays, to show that they have rightful access to the piece of entertainment in question.
And let me guess: once the system pulls stupid shit after stupid shit, the ones responsible for it will find a thousand excuses not to take responsibility, such as "it's the system, not me!" (treating a tool as if it were an agent).
The ESRB dismissed concerns about the "fairness" of the system, however, saying that "the difference in rejection rates between gender and skin tone is very small."
Their very own data shows the opposite.
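Run their own numbers (quick Python sketch; the per-1,000 figures are straight from the quote above, and I'm reading the 28-per-1,000 end of the skin tone range as the darker tones, which is how the article frames it):

    # False positive rates per 1,000, as quoted from the ESRB filing above
    female, male = 15 / 1000, 7 / 1000
    darker, lighter = 28 / 1000, 8 / 1000
    print(f"gender gap:    {female / male:.1f}x ({female * 1000:.0f} vs {male * 1000:.0f} per 1,000)")
    print(f"skin tone gap: {darker / lighter:.1f}x ({darker * 1000:.0f} vs {lighter * 1000:.0f} per 1,000)")
    # prints:
    # gender gap:    2.1x (15 vs 7 per 1,000)
    # skin tone gap: 3.5x (28 vs 8 per 1,000)

One group getting flagged twice to three-and-a-half times as often as another is not my idea of a "very small" difference.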
[from the ESRB report] "While bias exists, as is inherent in any automated system, this is not material."
The name for this shitty argument is "red herring" - it distracts you from what matters. Whether the bias is "ackshyually, not material" is irrelevant; what matters is that the bias is there in the first place.
the ESRB presented its facial recognition plan as "an additional, optional verification method"
Slippery slope can be either a fallacy... or the acknowledgement that we humans tend to act like a bunch of boiled frogs.
There's a reasonable risk that any "optional" system or requirement becomes "obligatory". Especially when legislation gets involved.
Ah, something that the article doesn't mention: the risk of false negatives. Grab a picture of your dad/mum, wave it back and forth a bit to pretend it's an actual person, and the dumb bot says "okay".
Here's my question. See all those businesses that I mentioned at the top? How do they plan to profit from the potential implementation of this shitty idea?
All your percentages look like they're missing a decimal point before the last digit, but since your "1 in..." figures are correct, I assume that's just a typo or a glitch of some kind.
As octoperson said, I've used permille (‰), not percent (%) - 28‰ is 28 per 1,000, i.e. 2.8%. The article was already listing ratios per 1,000, so it was easier. (Plus old habits die hard - I used ‰ quite a bit in chemistry.)
I hadn't even considered that it might be that. I don't think I've seen the permille symbol on the web before (as opposed to in scientific papers, on blackboards, or similar places).