A set of smart vending machines at the University of Waterloo is expected to be removed from campus after students raised privacy concerns about their software.
The machines have M&M artwork on them and sell chocolate and other candy. They are located throughout campus, including in the Modern Languages building and Hagey Hall.
Earlier this month, a student noticed an error message on one of the machines in the Modern Languages building. It appeared to indicate there was a problem with a facial recognition application.
"We wouldn't have known if it weren't for the application error. There's no warning here," said River Stanley, a fourth-year student, who investigated the machines for an article in the university publication, mathNEWS.
I looked up the brand (Invenda). Their PDF includes "using AI", "measuring foot traffic", and gathering "gender/age/etc", i.e. facial recognition to estimate a person's age and gender.
And as for "stored locally", this is straight from their website:
The machine comes with a “brain” – Invenda OS – and is connected to the Invenda Cloud, which allows you to manage it remotely and gather valuable environmental, consumer and transactional data. The device can be branded according to your requirements to further enhance your brand presence.
The marketing is also so fricken backwards that it reads like satire:
For a consumer, there’s no greater comfort than shopping pressure-free. Invenda Wallet allows consumers to browse, select and pay for products leisurely and privately 🤦♂️
What really bothers me is the "measuring foot traffic". I already refuse to use vending machines because of the pricing and unhealthiness, but you're telling me I need to make GDPR takedown requests just for walking to class?
Also this is data that any reasonable company could get in like half an hour of searching and asking.
There is data on how many meals are sold per day at the cafeteria, how many students are enrolled, how many students live on campus...
Unless the vending machine is in the last corner of the third floor of a half-empty building, all this information can be puzzled together to get a good estimate of how many people pass the machine on a day-to-day basis.
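The back-of-envelope reasoning above can be sketched in a few lines. Every number here is a made-up placeholder, not a real Waterloo figure; the point is only that publicly available campus statistics get you a foot-traffic estimate without any camera:

```python
# Rough foot-traffic estimate from publicly available campus data.
# ALL figures below are hypothetical placeholders, not real statistics.

enrolled_students = 40_000    # total enrollment (assumed)
on_campus_residents = 5_000   # students living in residence (assumed)
commuter_attendance = 0.5     # fraction of commuters on campus a given day (assumed)
hallway_share = 0.10          # fraction of on-campus people passing this hallway (assumed)

# Residents are on campus daily; commuters show up some fraction of days.
daily_on_campus = on_campus_residents + commuter_attendance * (
    enrolled_students - on_campus_residents
)

estimated_passersby = daily_on_campus * hallway_share
print(f"~{estimated_passersby:.0f} people pass the machine per day")
```

With these placeholder inputs the sketch prints roughly 2250 passersby a day; swap in real cafeteria and enrollment numbers and the estimate is about as good as anything a camera's "foot traffic" counter would report.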
People panic about face scans while massive, ongoing privacy breaches exist around online services and electronic devices. The amount of personal data people pour into smartphones is enormous compared to using that vending machine. We need more GDPR.
I keep telling my zoomer son he needs to read 1984. Not to live his life in fear of it, but to help his awareness of it, and provide an example of what that sort of societal control can look like. It's probably the one thing I nag him about. 5 years later he still hasn't read it. lol
I haven't read it in decades, but I still feel it's hard to miss certain parallels with modern reality when you have.
A good book to pair with 1984 is Brave New World. They both tackle forms of control, but from two different approaches. In Brave New World there's no need for thought police: every person is designed and crafted from conception to adulthood to never have a criminal thought.
That's another good one! Thanks for reminding me of it! Kind of ironically I read most of that book while hiding from my job (that's a story) in the bathroom for short periods of time in my early twenties.
That plus Helen Nissenbaum. When you read 1984 and then start thinking about the concept of future contexts changing use of private data, you get real nervous.
There was a quaint old time, shortly after Google was founded, where people mused about privacy over the internet. It was forgotten about as the profits started rolling in and pretty much all other companies started following along. That was the time when we started transitioning into a period of massive data surveillance. Glad to see that the conversation is starting to pick up again in some areas, though it's definitely being actively suppressed in many others.
Meh, they obviously mean that the biometric data is stored and processed locally, not the data that results from that processing.
I mean, that's still kinda creepy, but you're making it seem like they didn't openly admit to it in the original sentence.