TO EVERYONE SAYING THAT THIS IS NOT A CONCERN: Every country has different laws (in other words, not everyone is American), and regardless of whether an admin is liable for such content sitting on their servers without their knowledge, don't you think it's still an issue? Are you not bothered by the fact that somebody could be sharing illegal images from your server without you ever knowing? Is that okay with you? Or are you only saying this because you're NOT an admin? Several admins have already responded in the comments and suggested ways to solve the problem, because they are genuinely as concerned about it as I am. Thank you to all the hard-working admins. I appreciate and love you all.
ORIGINAL POST
You can upload images to a Lemmy instance without anyone knowing that the image is there if the admins are not regularly checking their pictrs database.
To do this, you create a post on any Lemmy instance, upload an image, and never click the "Create" button. The post is never created but the image is uploaded. Because the post isn't created, nobody knows that the image is uploaded.
You can also go to any post, upload a picture in a comment, copy the URL, and never post the comment. You can also upload an image as your avatar or banner and just close the tab. The image will still reside on the server.
You can (possibly) do the same with community icons and banners.
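Under the hood, attaching an image is just a single HTTP upload to the instance's pict-rs endpoint, completely separate from whether a post ever gets created. Here is a rough sketch of that flow; the path, form field, and response shape are what lemmy-ui appears to use, so treat them as assumptions and check against your version (the instance URL and auth cookie are placeholders):

```python
# Rough sketch of the orphaned-upload flow. Endpoint path, multipart field
# name, and response shape are assumptions based on what lemmy-ui does;
# the instance URL and auth token are placeholders.
import requests

INSTANCE = "https://example.com"             # hypothetical instance
AUTH = {"jwt": "<your login token>"}         # auth handling varies by Lemmy version

with open("cat.jpg", "rb") as f:
    resp = requests.post(
        f"{INSTANCE}/pictrs/image",          # pict-rs upload route proxied by Lemmy
        files={"images[]": f},
        cookies=AUTH,
    )

filename = resp.json()["files"][0]["file"]
# The image is now served from this URL even though no post or comment
# references it -- nothing ties it back to a submission.
print(f"{INSTANCE}/pictrs/image/{filename}")
```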
Why does this matter?
Because anyone can upload illegal images without the admin knowing, and the admin will be liable for it. With everything that has been going on lately, I wanted to remind all of you about this. Don't think that disabling image caching is enough. Bad actors can secretly stash illegal images on your Lemmy instance if you aren't checking!
These bad actors can then share these links around and you would never know! They can even report it to the FBI, and if you haven't taken it down within a certain period (because you did not know), say goodbye to your instance and see you in court.
Only your backend admins who have access to the database (or object storage or whatever) can check this, meaning non-backend admins and moderators WILL NOT BE ABLE TO MONITOR THESE, and regular users WILL NOT BE ABLE TO REPORT THESE.
Aren't these images deleted if they aren't used for the post/comment/banner/avatar/icon?
NOPE! The image actually stays uploaded! Lemmy doesn't check whether the images are used! Try it out yourself. Just make sure to copy the link, either by copying the link text or by clicking the image and then choosing "copy image link".
How come this hasn't been addressed before?
I don't know. I am fairly certain that this has been brought up before. Nobody paid attention but I'm bringing it up again after all the shit that happened in the past week. I can't even find it on the GitHub issue tracker.
I'm an instance administrator, what the fuck do I do?
Check your pictrs images (good luck) or nuke it. Disable pictrs, restrict sign ups, or watch your database like a hawk. You can also delete your instance.
The logical fix would be to delete them automatically when unused for longer than, let's say, 24 hours. That should be in the Lemmy code, and we should not depend on third-party utilities to do it.
Or, just tighten up the API such that uploaded pictures have a relatively short TTL unless they become attached to a post or otherwise linked somewhere.
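To illustrate the shape of that idea (this is not Lemmy's actual code, which is Rust; it is only a sketch of the logic, and every name in it is made up):

```python
# Sketch of a TTL-based cleanup: uploads are assumed to be recorded with a
# timestamp and a nullable reference to whatever ended up using them.
# All names here are hypothetical.
from datetime import datetime, timedelta, timezone

UPLOAD_TTL = timedelta(hours=24)

def expired_uploads(uploads):
    """Yield uploads past the TTL that never got attached to a post,
    comment, avatar, banner, or icon."""
    cutoff = datetime.now(timezone.utc) - UPLOAD_TTL
    for upload in uploads:
        if upload.attached_to is None and upload.uploaded_at < cutoff:
            yield upload

# A scheduled job would then purge each expired upload from pict-rs.
```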
A script is a fine stopgap measure, but we should try to treat the cause wherever possible, instead of simply addressing the symptom.
What's the practical difference? In both cases you're culling images based on whether they're orphaned or not.
If you're suggesting that the implementation be based on setting individual timers instead of simply validating the whole database at regular intervals, consider whether or not the complexity of such a system is actually worth the tradeoff.
"Complexity comshmexity", you might say. "Surely it's not a big deal!". Well... what about an image that used to belong to a valid post that later got deleted? Guess you have to take that edge case into account and add a deletion trigger there as well! But what if there were other comments/posts on the same instance hotlinking the same image? Guess you have to scan the whole DB every time before running the deletion trigger to be safe! Wait... wasn't the whole purpose of setting this up with individual jobs to avoid doing a scripted DB scan?
FYI to all admins: with the next release of pict-rs, it should be much easier to detect orphaned images, as the pict-rs database will be moved to postgresql. I am planning to build a hashtable of "in-use" images by iterating through all posts and comments by lemm.ee users (+ avatars and banners of course), and then I will iterate through all images in the pict-rs database, and if they are not in the "in-use" hash table, I will purge them.
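Roughly, the purge pass I have in mind looks like the sketch below. Table and column names, the connection string, and the pict-rs purge endpoint are assumptions to double-check against your Lemmy/pict-rs versions, and the alias-listing helper is just a placeholder for however you enumerate your store:

```python
# Sketch of an "in-use set, then purge" pass. Table/column names, the DSN,
# and the pict-rs internal purge endpoint are assumptions -- verify them
# against your Lemmy and pict-rs versions before running anything like this.
import re
import psycopg2
import requests

IMG_RE = re.compile(r"/pictrs/image/([A-Za-z0-9.\-]+)")

def list_all_pictrs_aliases():
    """Placeholder: yield every alias/filename in your pict-rs store
    (filesystem listing, object storage listing, or the pict-rs DB
    once it moves to postgres)."""
    yield from []

conn = psycopg2.connect("dbname=lemmy user=lemmy")   # hypothetical DSN
cur = conn.cursor()

in_use = set()
# Collect every pictrs filename referenced by posts, comments, avatars,
# banners, and community icons.
for query in (
    "SELECT url, body FROM post",
    "SELECT content, NULL FROM comment",
    "SELECT avatar, banner FROM person",
    "SELECT icon, banner FROM community",
):
    cur.execute(query)
    for row in cur:
        for field in row:
            if field:
                in_use.update(IMG_RE.findall(field))

# Anything in pict-rs that is not referenced anywhere gets purged.
for alias in list_all_pictrs_aliases():
    if alias not in in_use:
        requests.post(
            "http://pictrs:8080/internal/purge",      # pict-rs internal API; check your version
            params={"alias": alias},
            headers={"X-Api-Token": "<pictrs api key>"},
        )
```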
Of course, Lemmy can be improved to handle this case better as well!
You find an issue, you report it through the right channel, you make it known. Good. This is how software development works, with an active community reporting issues.
I'm not on GitHub. Nor are a lot of people here. I'm wording it this way so the issue gets the attention it deserves. Anyway, everybody already knows about this but nobody understood the consequences. Same reason why there's no option to disable image caching. These issues should have been addressed the moment image uploading was made available in Lemmy. It was just overlooked because of how tiny the platform was then.
It's funny because last month Mastodon CSAM was a hot topic in the Fediverse and people were being defensive about it. Look where we are now. Has Mastodon addressed the CSAM issue? Did they follow the recommendations made by that paper? I don't think so, or there wouldn't still be an open GitHub issue about it. Will Lemmy be like Mastodon, or will it address the concerns of its users?
If you post it here, your information will be lost in two days. People forget and move on to the next hot topic. Relevant stakeholders might very well completely miss this post, because they are not on Lemmy 24/7.
The way to make it more visible is to go where the planning is done, i.e. GitHub for Lemmy. Open an issue there, explain the problem, and describe possible solutions. Then come back to Lemmy, link the issue, and ask people to react to it (i.e. show that it is relevant for them).
This is the best way to obtain what you ask. Social media platforms are too broad and fuzzy for tracking real issues.
This is also why you see a lot of work being done on the SQL performance of the Lemmy backend: most past issues on GitHub concerned that.
This is my suggestion. If you really care about this being implemented, open a ticket on GitHub and follow the discussion there. If you see there is not enough traction, ask fellow lemmings for help.
Suggestions for the GitHub issue are:
be very specific
be polite
suggest solutions
If your solution is good, great; if not, people are more willing to think about a problem when it gives them a chance to show a stranger on the internet that they are wrong.
How would they address your concerns? The chances that one of the devs follows you are nonexistent, I would wager. Instead of using the proper channels to inform them, you did the exact opposite and posted it someplace they are almost guaranteed not to see it.
I’m an instance administrator, what the fuck do I do?
There's one more option. The awesome @db0@lemmy.dbzer0.com has made this tool to detect and automatically remove CSAM content from a pict-rs object storage.
This is a nice tool, but orphaned images still need to be purged. I mentioned in the other thread that bad actors can upload spam to fill up object storage space.
That is also very true. I think better tooling for that might come with the next pict-rs version, which will move the storage to a database (right now it's in an internal key-value store). Hopefully that will make it easier to identify orphaned images.
Sorry, I haven't run this myself yet, nor do I have any experience with this kind of issue. But may I ask why you were concerned with running it inside of a container? Seems rather unnecessary to me.
Not sure how you're trying to run it in a container, but the answer would depend on a bunch of different factors. Nvidia has a utility you can install that assists in exposing the GPU to the container, documentation found here.
If you're using docker compose to run it as a service, there's a doc page for that too. Note that it uses the previous page I mentioned as prerequisite.
There's another way to get it working from within Kubernetes that comes up every now and then on Stack Overflow.
If it's Intel or AMD, no idea if this still applies.
That's part of the problem with having an illegal series of bits: of course people are going to use it as a weapon.
I don't think those images should be made fully legal, but maybe we should calm the fuck down about two notches. We should keep in mind that the real crime is creating the pictures. Being effectively legal bombed by them is kind of ridiculous. As is having to keep the detection tools secret.
If you're on a grand jury for CSAM, maybe you should actually see the evidence (with limited censorship) before you indict someone.
Maybe I'm wrong, but I don't think seeing a small number of pictures is going to scar you for life. I've seen goatse. I've seen people decapitated. It's not pleasant, and I avoid those things, but it's not scarring.
The Station Nightclub Fire is scarring. I've recommended that video to people because it's scarring in a way that can save lives. Seeing that stuff every day would absolutely be scarring.
I don't want that kind of stuff to become common, but I am disturbed that people are afraid of unused images hiding on their Lemmy server.
Regardless of the debate over whether admins should be legally liable for not deleting child abuse images they don't know about:
Maybe I’m wrong, but I don’t think seeing a small number of pictures is going to scar you for life. I’ve seen goatse. I’ve seen people decapitated. It’s not pleasant, and I avoid those things, but it’s not scarring.
You shouldn't use your own experiences to make this generalisation, given that people working at agencies prosecuting pederasts often have to receive therapy or even leave the job after continued exposure.
I am disturbed that people are afraid of unused images hiding on their Lemmy server.
Don't you think it's logical for someone to be worried about being vulnerable to being accused of what likely is, in many legal systems, a crime?
I'm an instance administrator, what the fuck do I do?
Check your pictrs images (good luck) or nuke it. Disable pictrs, restrict sign ups, or watch your database like a hawk. You can also delete your instance.
How? I have checked, and there doesn't seem to be any way to see the photos on my server.
I actually shut down pictrs entirely on my instance. Running pictrs in its current state is criminally negligent imo.
It seems like self-hosting your own Lemmy instance with registrations, communities, and pretty much everything else turned off is still very safe to do. I still want to self-host my own instance some time when I have more time. Though I'd rather wait for things to be more stable first; there are bugs I'd like to see ironed out before doing that. One example: I still find it annoying that upvoting a comment in a thread deletes whatever comment you're currently typing.
You mean Gmail drafts? I know of at least one case where criminals used this: they shared the Gmail account password and messaged each other only via the drafts function, so technically no mail was ever sent.
Wasn't Facebook also found to store images that were uploaded but not posted? This is just a resource leak. I can't believe no one has mentioned this phrase yet. I'm more concerned about DoS attacks that fill up the instance's storage with unused images. I think the issue of illegal content is being blown out of proportion. As long as it's removed promptly (I believe the standard is 1 hour) once the mods/admins learn about it, there should be no liability. Otherwise every site that allows users to post media would be dead by now.
I'm a pentester and security consultant. From my point of view, this vulnerability has more impact than just a resource leak or DoS. We all know how often CSAM or other illegal material is uploaded to communities here as actual posts (where hundreds of viewers run into it and report it). Now imagine it being uploaded and spread like this, where only the admin can catch it, and only if they go out of their way to check for it.
I wouldn't call this a high-risk issue, for sure, but it's a significant security risk regardless.
Because anyone can upload illegal images without the admin knowing and the admin will be liable for it.
The admin/company isn't liable until it is reported to them and they don't do anything about it... That's how all social media sites work; Google isn't immediately liable if you upload illegal material to GDrive and share it anonymously.
Just set up a cron job that runs the rm command every day or whatever to clean out the files, and run a SQL query at the same time to clean up any draft posts in the database. There's no logic to this method, it just clears out the files and records related to draft posts, but it's fast and effective.
There's a small chance it might fuck somebody up if they were writing a post at that exact moment, but you can schedule the cron job for when your instance is quietest.
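Something like this, roughly (the data directory is a placeholder, and as said above there's no logic to it: anything older than the cutoff gets removed whether or not it's actually in use):

```python
# Blunt daily cleanup along the lines described above: delete any pict-rs
# file older than 24 hours. The path is hypothetical, and this does NOT
# check whether an image is referenced by a post, so legitimately used
# uploads older than the cutoff will be removed too. Run it from cron.
import os
import time
from pathlib import Path

PICTRS_FILES = Path("/var/lib/pictrs/files")   # placeholder data directory
CUTOFF = time.time() - 24 * 60 * 60

if PICTRS_FILES.exists():
    for path in PICTRS_FILES.rglob("*"):
        if path.is_file() and path.stat().st_mtime < CUTOFF:
            os.remove(path)
```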
Other than filling up storage, what is the actual issue? If the image is orphaned, then surely nobody can actually access the content? Sure, you could be blindly hosting things, but if nobody can get the content back out, then the abuse is surely minimal, apart from, say, a complex targeted cyber-and-physical campaign or simply filling up storage...
The issue is that you can share the image link with other people. People CAN get the content back out, and admins or moderators WILL NOT KNOW about it.
So if someone uploads an illegal image in the comments, copies the link, and does not post the comment, then they have a link to an illegal image hosted on someone's Lemmy instance. They can share this image with other people or report it to the FBI. Admins won't know about this UNLESS they look at their pictrs database. Nobody else can see it, so nobody can report it.
Because pictrs and most other components of Lemmy were designed for a much smaller use case by a very small development team. It was built primarily by people volunteering their time and expertise, and most of the contributors have other things to do on a full-time basis. If you really want to see a change like this implemented NOW, then code it in yourself, file a new issue directly on their page with potential solutions, or donate to the people working on it.
Your post is good for the most part, but my patience is limited for the kind of entitled attitude you show under that heading specifically. Thanks for hearing me out.
OP is flagging a legitimate issue that can actually put instance owners at risk. Raising the issue that instance owners can unwittingly host illegal content and be liable for it - how is that entitled?
Totally understand that the Lemmy devs are a small team, but use of the software is exploding now, and not being able to keep up is a problem of scale. Gatekeeping others from raising issues does not help it get better; in fact it discourages issue reports and promotes a head-in-the-sand culture.
I understand, and raising the issue and discussing it is fine. With all due respect to OP, I take it personally when the discussion is framed with the implication that the developers should not have released a project with some bugs, or that they should have put more effort here or there. I've contributed to Lemmy through coding, translation, and small donations, but I'm not here for people to push blame onto devs. This is why the question "Why hasn't anything been done?", while I recognize it is on some people's minds, gets on my nerves. It bothers me like a clickbait/ragebait title does for many.
I would rather the discussion focus on where efforts are made or will be made to mitigate and fix the problem.
Fair point. That question itself is what bothers me, even if it is a valid one people have on their minds. The answer to it should highlight more clearly what has been done, and if OP doesn't know, then IMO it would be best not to include that question/answer.
I have no problem with OP's post or with bringing this issue up and discussing it. Including that question with an incomplete answer bothers me like a clickbait headline for an article does, or how Tucker Carlson's show asks questions. It serves little purpose but to put the people working on fixes in a bad light, acting like they haven't been working on anything.
Entitled attitude? I'm just bringing it up again. It was brought up some time ago but wasn't given attention so I'm bringing it up again after the recent CSAM attacks.
I didn't demand anything in the post. I brought up the issue, explained why it's important, and what admins could do about it.
I don't know how to code but that doesn't mean I'm not allowed to bring this issue to light...
I have no issue with your post itself or with discussing this issue; it is important to highlight things like this. Thank you for bringing it up, and sorry if I sound mad at you for doing that.
I will point out, the specific thing that bothers me is that the heading
How come this hasn't been addressed before?
contains an incomplete answer that ignores work currently in progress by the devs to address it. I don't blame you for not knowing the answer, but for including and answering that question when you don't know the answer. To me it's reminiscent of Tucker Carlson-style questioning, where some issue is brought up, questions are asked, but the answer is sparsely researched and the viewer is expected to come to some conclusion about who to blame. This specifically is what gets on my nerves.
If you can include where work to rectify the issue has been discussed and is in progress (GitHub issues, discussion throughout Lemmy, and other things), I'll edit my first reply to note that my concern is assuaged.
E: Here are some of the relevant issues and discussion: