University vending machine error reveals use of secret facial recognition | A malfunctioning vending machine at a Canadian university has inadvertently revealed that a number of them have been usin...
Snack dispenser at University of Waterloo shows facial recognition message on screen despite no prior indication
This seems like an overreaction by people who don't understand the technology or the associated risks. Focus on the implementation, not the tech. There is no indication that the vending machine is inappropriately storing or transmitting personally identifiable information, or that it's making decisions based on biased data.
Likely for general marketing feedback, so it's not targeting individuals the way Facebook, Google, etc. do. If the vending machine is GDPR compliant then it's not storing individuals' PII on the machine (that would be physically insecure) or transmitting PII without consent. And anyway, the marketing team wouldn't care about individuals; they're looking for aggregate trends. I think we should have stricter anti-marketing laws, but this is not a dangerous anti-privacy vector. Online marketing is far, far worse, so if we're concerned with privacy, let's implement laws and policies that protect privacy instead of these BS distractions that don't actually affect people's privacy.
This is a pretty "generous" take. I ask you then: if the company isn't communicating any of the scans/recordings anywhere, what is the purpose of the technology being installed in the first place?
This type of analysis is cheap nowadays. You could easily fit a model to extract demographics from an image on a Jetson Nano (basically a Raspberry Pi with a GPU). Models have gotten more efficient while hardware has also gotten cheaper.
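For a rough sense of what that involves, here's a minimal sketch, assuming OpenCV's DNN module and the widely shared pretrained Caffe age/gender nets (the file names are placeholders for whatever model actually gets deployed, and nothing here is confirmed about this vendor). A Jetson Nano handles this kind of per-frame inference comfortably:

```python
# Rough sketch: per-frame demographic inference on an edge device.
# Assumes OpenCV (cv2) and locally downloaded pretrained Caffe models
# (the common age/gender nets); the model file paths are placeholders.
import cv2

GENDER_LABELS = ["Male", "Female"]
AGE_BUCKETS = ["(0-2)", "(4-6)", "(8-12)", "(15-20)",
               "(25-32)", "(38-43)", "(48-53)", "(60-100)"]

face_det = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
gender_net = cv2.dnn.readNet("gender_net.caffemodel", "gender_deploy.prototxt")
age_net = cv2.dnn.readNet("age_net.caffemodel", "age_deploy.prototxt")

def demographics(frame):
    """Return coarse (gender, age_bucket) guesses for faces in one frame."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    results = []
    for (x, y, w, h) in face_det.detectMultiScale(gray, 1.3, 5):
        face = frame[y:y + h, x:x + w]
        blob = cv2.dnn.blobFromImage(face, 1.0, (227, 227),
                                     (78.426, 87.769, 114.896), swapRB=False)
        gender_net.setInput(blob)
        gender = GENDER_LABELS[gender_net.forward()[0].argmax()]
        age_net.setInput(blob)
        age = AGE_BUCKETS[age_net.forward()[0].argmax()]
        results.append((gender, age))
    return results
```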
MSRP is $100. Even assuming you can cut that to $50 in bulk, $50 per unit is something that manufacturers are going to take seriously as an added cost. They're not going to pay it without an intent to use it.
And that's before software costs. Even leveraging open source it's still going to take investment to tailor it to your deployment.
Marketing is often targeted, especially online (which is a huge privacy issue). I would guess they are using the data from these vending machines to measure the success of their marketing campaigns.
The FAA failed to regulate Boeing. I'm pro-regulation and pro laws that protect people's privacy. And if this company and the individuals within it break the law, they should receive appropriate punishments, with fines tied to international revenue.
My point is that the laws should relate to privacy independent of the technology. The "ban face recognition" narrative misses the point and doesn't address the threats. Facial recognition technology can be used in ways that don't threaten individuals' privacy, and non-facial-recognition technologies can be a threat to individual privacy.
It's cynical to assume this company is violating privacy with no evidence. But it's fair to say there need to be greater punishments and stricter regulations.
That's not true. They're likely using a model that identifies some demographic attribute and associating that with a purchase. It's 2024; this can all be done on the machine. The machine doesn't need to store the individual's data, etc. If the vending machine is storing enough data to identify individuals, then it wouldn't be GDPR compliant.
Consent is a requirement for GDPR compliance. They are likely taking an image from the camera, extracting semantic attributes from the image, and then discarding the image. The individual probably stands at the machine making the purchase for longer than the image is held in memory while the attributes are extracted.
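To make that concrete, here's a hypothetical sketch of that capture-extract-discard flow, where infer_attributes stands in for some on-device classifier; none of this is confirmed about this particular machine, it just shows that the pipeline doesn't require storing anyone's image:

```python
# Hypothetical capture -> extract -> discard loop. Only coarse aggregate
# counters survive each frame; no image, embedding, or per-person record
# is written to disk or sent anywhere.
from collections import Counter
import cv2

def infer_attributes(frame):
    # Placeholder for an on-device demographic classifier (e.g. the age/gender
    # nets sketched earlier in the thread); would return coarse labels such as
    # [("Male", "(25-32)")], never an identity or a face template.
    return []

aggregate = Counter()
cam = cv2.VideoCapture(0)
try:
    while True:
        ok, frame = cam.read()
        if not ok:
            break
        for label in infer_attributes(frame):
            aggregate[label] += 1
        # `frame` is overwritten on the next read(); the image only exists in
        # memory for the fraction of a second the attributes are extracted.
finally:
    cam.release()
    print(dict(aggregate))  # only the demographic tallies remain
```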
And it most definitely isn't. GDPR requires explicit consent for collecting OR processing personal information. As per the European Commission, just taking the picture and extracting some metrics off of it already counts as processing personal information:
There is no indication that the vending machine is inappropriately storing or transmitting personally identifiable information, or that it's making decisions based on biased data.
And until the machine malfunctioned, there was no indication that the vending machine was collecting any data at all. Businesses can say whatever they want in the court of public opinion, but until these same claims are made in a court of law they should be considered lies to placate the public.
Furthermore, why even collect such data if it's not meant to be utilized? They already know what the most popular products are (since they know what they restock the most) so for what reason do they need to collect demographics?
Arguing that I have no concept of digital privacy because I choose to share my name and face is an ignorant statement and demonstrates how little you understand the concept of online privacy. For context, I work in tech in Canada; I deal with GDPR and other compliance regimes. I understand the technology, the risks, and the attack vectors. These vending machines are not a serious threat to individuals' privacy. Facebook, Google, and Amazon are serious threats. Focus your energy on the actual risks instead of making uninformed comments.
Did 2yo Marisol also make an informed choice to share her identity and location on the fediverse?
This vending machine is taking biometrics off of everyone who walks past it and you don't think that's the least bit concerning?
GDPR doesn't apply in Canada unless you are trying to operate business in Europe.
Compliance only matters if you can't afford a fine. If you can make more money violating regulations than the cost of the fine, it's just a business expense.
You pretend to care about consent and privacy and then mention my daughter by name here. You'll notice I share photos and details about my daughter from accounts on servers I control. There is an implicit agreement in the fediverse to respect people's privacy. I obviously don't rely on that implicit agreement, because some people do unethical things, as demonstrated in your post. I protect my daughter from legitimate online privacy and security threats; I don't play privacy and security theatre.
This vending machine is taking biometrics off of everyone who walks past
You have no evidence of this and there is no mention of this in the article. This also doesn't make any sense from an implementation perspective.
GDPR doesn’t apply in Canada unless you are trying to operate business in Europe.
You're correct that GDPR doesn't apply in Canada; it's just that GDPR is usually the strictest regime, so it's common for companies to treat GDPR compliance as a minimum baseline.
Compliance only matters if you can’t afford a fine.
GDPR fines can be tied to global revenue.
When your beliefs don't align with the facts, consider changing your beliefs instead of doubling down on your opinions, making things up, and doing unethical things. Please do better.
The Canadian Human Rights Act protects Canadians from discrimination based on race, national or ethnic origin, colour, religion, age, sex, sexual orientation, gender identity or expression, marital status, family status, genetic characteristics, disability etc.
Lol yeah, if the easily checked facts don't align with their beliefs, people prone to groupthink double down on those beliefs. Denying reality is easier than changing beliefs. It's the same reasoning skill set that Trump supporters use 😅