has been met with poor reviews due to its price and performance. As a result, the FPS game’s player count is dwindling quickly, now dipping below 100 Steam users.
It was a horrid mess of a lazy release, and it also stole content from modders.
I’m sure they’ll use this to say people don’t want to buy old games rather than admit they did a shit job and wanted to make bank for it.
The Dark Forces remaster looks to be amazing though. I’ll be picking that up at some stage, but I can’t justify a $43 price tag for a 30-year-old title, so I’ll wait for a sale.
Lol, it just reminded me of that. I don't care either way, good for you for standing up for shit. I just think anything I say will never have any impact on shit one way or the other. To me it's like having a conversation on the street and then saying, don't repeat what I said it's trademarked, or something. If you're posting art or actual creative content then fine, you have every reason to say so, but a comment in a discussion online... I'm not trying to copyright my shit takes on everyday speech. If you think for one second anyone cares or will care what we talk about here and now then go ahead, it doesn't affect me one way or another, but I don't see the need. That link will not stop anyone from using your words for bot training or whatever.
To me it’s like having a conversation on the street and then saying, don’t repeat what I said it’s trademarked, or something
People don't record your conversation on the street and sell that audio recording to a company to use to build/program their AI models, without compensating you.
I'm not the OP, but I don't feel like it would affect the process of harvesting your data or put some burden on the company doing it, since they have big bucks. But at the same time I'm not against it, for it can lead to many humorous examples of AI putting this license after its replies after learning on your content. It would be the platinum tier absurdity and I'm all for it.
I’m not the OP, but I don’t feel like it would affect the process of harvesting your data or put some burden on the company doing it, since they have big bucks.
Maybe. For me it's a combination of it being very easy to add the license, hoping fellow coders who create the models will honor a Creative Commons license, and figuring that at some point in the future Congress will get around to passing laws about who owns content, how it's labeled, and how others can scrape such data. There are already arguments going on between big corporations about paying to use the content to build the models, so I'm assuming that lobbying is being done right now in that category.
Though honestly I might just get bored some day and talk to my lawyer friend about what I would need to do to test this all out. Boredom is something you have at times when you're retired.
But at the same time I’m not against it, for it can lead to many humorous examples of AI putting this license after its replies after learning on your content. It would be the platinum tier absurdity and I’m all for it.
lol! I never heard of this, that's really funny actually.
Now that you mention it, in theory, we could all "black box" input into the models by having wacky stuff in our comments.
It's superstitious clutter. Most websites require you to license the content you post to them without those restrictions, and AI training may not even involve copyright in the first place, meaning the license is moot. It just makes you look silly.
"Lemmy" isn't a website. I'm not even viewing this from a Lemmy instance, I'm on an mbin server. Do you understand how the Fediverse works? Your posts are being copied and transmitted to everyone regardless of what restrictions you claim you're putting on them, if you don't want them used that way then don't post in the first place.
And if you're finding this argument about your spam to be entertaining there's a word for that. I likely shouldn't be feeding that but this thread is already thoroughly derailed.
Allow me to play devil's advocate here, but what you are saying about the Fediverse seems to be completely compliant with that license. The content can be freely redistributed provided it is done in a noncommercial way and with attribution (which is the case, right? We see the comment author).
Also, the argument "X is going to be done regardless" applies to all licenses (thinking about open source licenses). There is nothing that physically stops you from taking open source code and violating its license, but if you get caught doing so, you are liable.
Maybe today there is nothing that would make anybody accountable for grabbing public data, training AI on it, and reselling it, but if regulations change in the future, it will be hard(er?) for those companies to claim that certain content was distributed freely etc., in cases where the author explicitly and unequivocally stated the terms.
What's this 'Freddyverse' that you speak of? Is it like Costco?
Your posts are being copied and transmitted to everyone regardless of what restrictions you claim you’re putting on them, if you don’t want them used that way then don’t post in the first place.
I'll be sure to petition the Lemmy web client people to remove the link button from their editor.
I just think it’s silly that people think it actually works.
Besides, if AI really is powerful enough to make a splash in the world, wouldn’t you WANT it to contain your data? That would make it more favorable to your viewpoints.
I’m quite familiar. It legally works: if you can prove that your data actually made it into the training set, you might be able to successfully sue them. That’s extremely unlikely though. If you can’t litigate a law, then it essentially doesn’t exist.
Besides, a researcher scraping websites isn’t going to take the time to filter out random pieces of data based on a link contained in the body. If you can show me a research paper or blog post or something where a process is described to sanitize the input data based on license, that would be pretty damn interesting. Maybe it’ll exist in the future?
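To be fair, the filtering itself would be trivial if anyone cared to do it; here's a purely hypothetical sketch (the pattern and helper are made up for illustration, not taken from any real pipeline) of what "sanitize the input data based on license" could even mean:

```python
import re

# Purely hypothetical: drop any scraped comment whose body contains a
# Creative Commons license link. No known training pipeline is documented
# to do this; the regex and helper are made up for illustration only.
CC_LICENSE_RE = re.compile(r"https?://creativecommons\.org/licenses/\S+")

def drop_licensed_comments(comments: list[str]) -> list[str]:
    """Return only the comments that do not carry a CC license link."""
    return [c for c in comments if not CC_LICENSE_RE.search(c)]

# Example: the second comment would be excluded from the training corpus.
corpus = drop_licensed_comments([
    "Just a normal comment.",
    "My hot take. https://creativecommons.org/licenses/by-nc-sa/4.0/",
])
```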
Besides, the best way to opt-out of AI training is to enable site-wide flags, which mark the content therein as off limits. That would have the benefit of not only protecting you, but everyone else on the site. Lobbying your lemmy instance to enable that will get a lot more mileage than anything else you could do, because it’s an industry sanctioned way to accomplish what you want.
I’m quite familiar. It legally works: if you can prove that your data actually made it into the training set, you might be able to successfully sue them. That’s extremely unlikely though. If you can’t litigate a law, then it essentially doesn’t exist.
And what makes you think that can't be done? You make it sound like because (you believe) it's so hard to do, you shouldn't even bother trying; that seems really defeatist.
And like I said multiple times now, it's a simple, quick copy and paste, a 'low-hanging fruit' way of licensing/protecting a comment. If it works, great, it works.
Besides, the best way to opt-out of AI training is to enable site-wide flags, which mark the content therein as off limits.
I have no control over the Lemmy servers, I only have control over my own comments that I post.
Also, the two options are not mutually exclusive.
because it’s an industry sanctioned way to accomplish what you want.
Again, both you and I know the history of the robots.txt file and how often and how well it's honored, especially these days with the new frontier of AI modeling.
It would be best to do both, just to make sure you have coverage, so that if the robots.txt is not honored, at least the comment itself is still licensed.
Is there some Lemmy rule somewhere that I don't know about that says I can't attach a Creative Commons license to my comments?
It’s pretty much just a flag in the robots.txt
Because everyone knows that's always honored and obeyed, right?
Also, it's a proprietary flag created by Google and only used by Google (per the article you linked).
So if you want to actually make a difference, lobby your Lemmy instance to add this flag.
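If the admins went for it, the whole thing is a few lines in the site's robots.txt; assuming the flag in question is Google's Google-Extended token (with OpenAI's GPTBot thrown in as another common example), it would look roughly like this:

```
# Hypothetical robots.txt entries for a Lemmy instance opting out of AI
# training crawlers. Google-Extended is Google's AI-training opt-out token;
# GPTBot is OpenAI's crawler. Whether crawlers honor it is a separate question.
User-agent: Google-Extended
Disallow: /

User-agent: GPTBot
Disallow: /
```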
Or do both.
Because users are the final owners of their own content, their own comments. Not Lemmy, not anyone else. They have the first responsibility of protecting their rights.
The equivalent of this but for nerds and it's just as effective
Hell, they can't even be bothered to self-host so they can at least pretend to have some kind of ownership over what they share on Lemmy, and they admit to not having any plan to actually check whether their data is used by AI companies. That's how ridiculous this is.
What is this link in your posts? I’m reading the site but I don’t understand what it is really.
I'm licensing my comments with a Creative Commons license, so that if anyone wants to use them to train their AI models/bots, they at the very least have to give attribution.
I'm hoping it's a way of deterring bot activity on my comments. It's something that I saw someone else doing, so I decided to emulate it, since it's just a simple copy and paste, and if it works, it's worth the moment it takes to paste.
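The paste itself is just a one-line markdown footer; assuming a license along the lines of CC BY-NC-SA 4.0 (the specific Creative Commons flavor is up to whoever pastes it), it looks something like this:

```
[CC BY-NC-SA 4.0](https://creativecommons.org/licenses/by-nc-sa/4.0/)
```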
Plus it's really interesting that it's gotten a lot of positive and negative feedback. Some people really get bent out of shape seeing it there, and others just have a natural curiosity about it. So it's kind of interesting to see that as well, just by using it.
Honestly, I wasn't going to worry about that; I'm just doing a quick copy and paste and moving on. If it works, it works.
I'm making the assumption that any AI model developer who sees the license notation would honor the Creative Commons license. We software developers usually care about those things, especially the open-source-style protective ones.
Otherwise I will just wait the few years until Congress creates new disclosure legislation. Companies are already starting to get pissed off at each other about who's paying whom, who's using what content to program their AI models, and who those other people using their content are. I'm pretty sure lobbying efforts are ongoing right now, and legislation will come out soon enough.
After that legislation exists, I can go back to all my comments and sue the companies, once those AI-model-building companies have to disclose their data sources. I'm retired, I have time on my hands.
You're just like boomers on Facebook copy pasting a comment on their wall to say that Meta can't monetize their data.
If AI is trained on Lemmy content it will just scrape the site, convert it to raw text, chew through the data, and use it to spit out answers to stupid questions; your link will change fuck all about that, and even you are admitting that you don't intend to do anything about it.
The only way to make sure AI isn't trained on what you're writing is to have a journal that you share with no one.