Controversial firm, which acts as a search engine for faces, wins appeal against a watchdog.
A company which enables its clients to search a database of billions of images scraped from the internet for matches to a particular face has won an appeal against the UK's privacy watchdog.
Last year, Clearview AI was fined more than £7.5m by the Information Commissioner's Office (ICO) for unlawfully storing facial images.
Privacy International (who helped bring the original case I believe) responded to this on Mastodon:
"The first 33 pages of the judgment explain with great detail and clarity why Clearview falls squarely within the bounds of GDPR. Clearview's activities are entirely "related to the monitoring of behaviour" of UK data subjects.
In essence, what Clearview does is large-scale processing of a highly intrusive nature. That, the Tribunal agreed.
BUT in the last 2 pages the Tribunal tells us that because Clearview only sells to foreign governments, it doesn't fall under UK GDPR jurisdiction.
So Clearview would have been subject to GDPR if it sold its services to UK police or government authorities or commercial entities, but because it doesn't, it can do whatever the hell it wants with UK people's data - this is at best puzzling, at worst nonsensical."
But as long as the data acquisition and storage happen on UK territory, isn't it still illegal? Isn't it like saying I robbed a bank, but since I wired the funds into a Swiss safe, I'm good?
Only if you're doing so in an official governmental capacity for your country.
The article basically says that they won the appeal because they only provide services to governments and law enforcement (having previously withdrawn their services to businesses after losing a lawsuit in the USA).
So Clearview would have been subject to GDPR if it sold its services to UK police or government authorities or commercial entities, but because it doesn't, it can do whatever the hell it wants with UK people's data - this is at best puzzling, at worst nonsensical.
While it's extremely frustrating at the level of an individual law, the article has a quote which makes perfect sense:
it is not for one government to seek to bind or control the activities of another sovereign state
If that weren't a concept in law, any country could pass any law and expect it to apply internationally.
Wouldn't a UK court only concern itself with the activities of a company operating in the UK? If this company doesn't operate in the UK, I'm surprised it got far enough to need overturning.
The internet has made everything really weird in terms of jurisdictions. You can have photos of UK citizens taken in the UK and stored on a UK server, and if a company from somewhere else scrapes the data without permission and moves it out of the UK, that doesn't obviously mean that it's now fine to use for whatever.
Now of course the law has to have some jurisdictional limits, but it's not surprising that there has been some disagreement about where they are.
but because it doesn’t, it can do whatever the hell it wants with UK people’s data - this is at best puzzling, at worst nonsensical
Let's not forget one teeny tiny fact here: the people whose data Clearview can do whatever the hell it wants with put it up online ALL BY THEMSELVES! Clearview scrapes the internet to find its material.
I've refused to have my picture taken since 2000 under any circumstances - be it at work, in group photos in clubs, etc. The reason being, those photos invariably get uploaded somewhere, usually with a caption that says "From left to right: ..."
I've been called paranoid and batshit crazy since 2000. But guess what: Clearview doesn't have my photo. Who's having the last laugh now, eh?
Clearview is a hateful turd of an outfit. It should be shut down for obscene immorality and its CEO can burn in hell. But let's not forget that it exploits people's carelessness. People's data fuels the corporate surveillance economy and this has been public knowledge for more than a couple of decades. It should come as no surprise that somebody some day would attempt to match people's faces with people's names using the data people themselves provided.
People should be aware of the dangers of group photos, and my point is that they should have known this was coming a long time ago and should limit their exposure.
Whilst I agree it's wise to take precautions, it seems weird to me that we think it's OK to put the onus on us to curtail a normal activity like sharing a pic of you and your mates messing about, rather than on these companies not to harvest those pics to create a sellable database of us that allows governments to circumvent the need for warrants.
Edit: and with the number of self-styled internet pranksters and influencers randomly shooting images and video of whatever they want, unless you leave the house wearing a balaclava you don't really have any choice about your face being part of their dataset.
it seems weird to me that we think it's OK to put the onus on us to curtail a normal activity like sharing a pic of you and your mates messing about, rather than on these companies not to harvest those pics
I totally agree with you. In a sane and functional society, corporate surveillance would be illegal. But we don't live in a sane society do we? The powers that be don't do much of anything to curb the gross privacy violations, and they don't because most of them are on big tech's payroll and do big tech's bidding.
With that in mind, how does a concerned individual live in such a society? Carefully. If you value your privacy and you want to limit the amount of data you share with Big Data, everything you do is basically hamstrung by the thought of what harm it will bring to your privacy.
Do you really think I like living my life denying everybody the right to take a photo with me in it? Of course I would like to be in the photos from the company's outings. Of course I would like to show my face on that Teams meeting call. But I just don't want to show my face to Big Data, so I don't. I wish those awful companies couldn't legally misuse my data, but nobody is reining them in.
Data being public (and privacy in general) shouldn't be 'all or none'. The problem is people joining the dots between individual bits of data to build a profile, not necessarily the individual bits of data.
If you go out in public, someone might see you and recognise you, and that isn't considered a privacy violation by most people. They might even take a photo or video which captures you in the background, and that, in isolation, isn't considered a problem either (no expectation of privacy in a public place). But if someone sets out to do similar things at mass scale (e.g. by scraping, or networking cameras, or whatever) and piece together a profile of all the places you go in public, then that is a terrible privacy violation.
Now you could similarly say that people who want privacy should never leave home, and otherwise people are careless and get what they deserve if someone tracks their every move in public spaces. But that is not a sustainable option for the majority of the world's population.
So ultimately, the problem is the gathering and collating of publicly available personally identifiable information (including photos) in ways people would not expect and don't consent to, not the existence of such photos in the first place.