What is the best FOSS file sharing protocol/app? (preferably CLI)
Hello fellow lemmy users,
I was wondering what's the best file-sharing protocol/app/website. TBH send.vis.ee currently seems the best to me, but I still wanted your opinions. Here's what I found:
From what I'm hearing, magic-wormhole makes the most sense, since it seems to be the most open standard for sharing files, but it still feels incomplete, or maybe the lack of information on the topic just makes me feel weird about it.
croc seems to have a lot of CVEs, while magic-wormhole passed SUSE's audit. WebTorrent seems to fill a weird niche, and implementations of it like file.pizza aren't really that well built (you can't even send multiple files there).
I'd prefer CLI, but GUIs too, so I can recommend it to somebody else. I'd like a FOSS protocol, since then other apps can be built on it as well. Earlier I used SHAREit, which was so bad that the government literally pulled it over Chinese concerns.
Currently using LocalSend, but Warp (magic-wormhole)/Warpinator also looks good.
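For anyone who hasn't tried it, the magic-wormhole CLI flow is just two commands (a sketch; assumes the `magic-wormhole` package is installed, and the code shown is an example of the format, not a real one):

```
# Sender: prints a one-time human-readable code like "7-crossover-clockwork"
wormhole send ./photos.zip

# Receiver (on any machine): enter the code the sender reads out;
# the transfer is end-to-end encrypted and the code is single-use
wormhole receive 7-crossover-clockwork
```

The short code only needs to survive one exchange, which is why it can be so human-friendly.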
I'm lazy and really don't need anything more than direct web hosting, no encryption (beyond https), no auth, not even a web app.
An nginx instance uses try_files on a folder, either returning the file you asked for or a 404 page.
Drop a file in the folder, and domain.tld/folder/file.ext returns the file. Adding '/download/' to the start of the path adds the Content-Disposition 'attachment' header so the file downloads instead of displaying inline (images/video/HTML/etc.)
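A minimal sketch of that setup (the root path, server name, and 404 page are assumptions; the '/download/' prefix is from the description above):

```nginx
server {
    listen 443 ssl;
    server_name domain.tld;
    root /srv/share;  # assumed folder where files get dropped

    # Serve the requested file, or a 404 page if it doesn't exist
    location / {
        try_files $uri /404.html;
    }

    # Same files, but force a download instead of inline display
    location /download/ {
        alias /srv/share/;
        add_header Content-Disposition "attachment";
        try_files $uri =404;
    }
}
```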
Not used for anything sensitive ofc, but handy for simple file sharing to friends/family. (or just stupid backgrounds for the warehouse computer 🤷)
It really depends on your use case. For example, I use Nextcloud, running on my own server, as a replacement for all things "cloud". In my use case, I wanted to have a system where pictures/videos/files which I took on my phone were auto-magically synced to a server. My main requirements were:
Server is under my control.
Android client compatibility.
Automatic syncing of files in folders I select when an internet connection exists.
Two factor authentication via YubiKey.
Encryption "in flight".
Open Source.
I now have Nextcloud running in a container on my home server, with a public IP and domain. This gives me all the advantages of having my pictures, videos and important files, from my phone and computer, backed up to "the cloud" without having them on someone else's computer. The downside is that I have to sort out security, updates and backups on my own. I'm fine with that trade-off, though not everyone would be.
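For anyone curious what the container route looks like, a minimal sketch with the official Docker image (the port mapping and volume name are assumptions, and a real deployment would add a proper database and TLS in front):

```
docker run -d --name nextcloud \
  -p 8080:80 \
  -v nextcloud-data:/var/www/html \
  nextcloud
```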
As a bonus, I can provide "cloud" functions to my family as well. And sharing files out to extended family is as easy as setting a file to "shared" and sending a link. Technically, that exposes the file to the public internet, but I only do this for files which I don't consider "sensitive" and the link contains a long, random string to obfuscate it. So long as I take it down before search engines have a chance to pick up on it, the risk is minimal.
No search engine is going to find a long obfuscated URL. I don't think NC publishes a site tree for a crawler to use.
In fact, unless you post your domain somewhere online or its registration is available somewhere, it's unlikely anyone will ever visit your server without a direct link provided by you or someone else who knows it.
You might still get discovered by IP crawlers, but even then they aren't going to trial-and-error their way to shared files, for the same reason they can't brute-force any sane SSH password.
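To put rough numbers on that: a URL-safe token built from 32 random bytes has about 2^256 possible values, which no crawler is going to enumerate. A sketch of generating a comparable token from /dev/urandom (the length is an illustrative choice, not what any particular app uses):

```shell
# 32 random bytes, base64-encoded, then made URL-safe
# ('+' -> '-', '/' -> '_', padding stripped) -> a 43-character token
token=$(head -c 32 /dev/urandom | base64 | tr '+/' '-_' | tr -d '=')
echo "$token"
```

Guessing such a path is strictly harder than guessing an SSH password, since there's no username half to narrow things down.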
Nah I have some services running on unpublished domains and I get hit by brute force attempts at SSH logins all the time. It might not be sane but botnet gonna botnet.
> In fact, unless you post your domain somewhere online or its registration is available somewhere, it’s unlikely anyone will ever visit your server without a direct link provided by you or someone else who knows it.
If you use HTTPS with a publicly-trusted certificate (such as via Let's Encrypt), the host names in the certificate will be published in certificate transparency logs. So at least the "main" domain will be known, as well as any subdomains you don't hide by using wildcards.
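You can check this yourself: CT logs are publicly searchable, for example via crt.sh (`example.com` here stands in for your own domain):

```
# List certificates that have been logged for a domain, as JSON
curl -s "https://crt.sh/?q=example.com&output=json"
```

Every Let's Encrypt issuance and renewal shows up there, including for subdomains, unless you request a wildcard.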
I'm not sure whether anyone uses those as a list of sites to automatically visit, but I certainly would not count on nobody doing so.
That just gives them the domain name, though, so URLs with long randomly-generated paths should still be safe.