Reddit's filing with the SEC makes clear that training AI with user posts is a core part of Reddit's new business model.
Reddit said in the filing that its users’ posts are “a valuable source of conversation data and knowledge” that has been and will continue to be an important mechanism for training AI and large language models. The filing also states that the company believes “we are in the early stages of monetizing our user base,” and goes on to say that it will continue to sell users’ content to companies that want to train LLMs, and that it will begin “increased use of artificial intelligence in our advertising solutions.”
The long-awaited S-1 filing reveals much of what Reddit users knew and feared: that many of the changes the company has made over the last year in the leadup to an IPO are focused on exerting control over the site, sanitizing parts of the platform, and monetizing user data.
Posting here because of the privacy implications of all this, but I wonder if at some point there should be an "Enshittification" community :-)
Reddit has long had an issue with confidently presenting false statements as fact. Sometimes I would come across a question that I was well educated on, and the top voted responses were all very clearly wrong, but sounded correct to someone who didn't know better. This made me question all the other posts that I had believed without knowing enough to tell otherwise.
LLMs also have the same issue of confidently stating falsehoods that sound true. Training on Reddit will only make this worse.
There's also the issue of Reddit comment sorting being dominated almost entirely by time. In something like 90% of posts, the top comment is one of the first five. All you have to do is comment early, and yours will likely end up on top.
I noticed from the beginning that Lemmy's default comment sorting improves visibility of a variety of comments including newer ones. Gee, I wonder who could have helped make it that way ;)
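The difference being described is a time-decay rank: instead of sorting purely by raw score (which lets early comments snowball), the score is discounted as the comment ages. A minimal sketch of that idea in Python, with illustrative constants that are assumptions on my part, not Lemmy's exact formula:

```python
from math import log

def hot_rank(score: int, age_hours: float, gravity: float = 1.8) -> float:
    """Time-decayed rank: a comment's score is discounted as it ages,
    giving newer comments a window of visibility near the top.
    The +3 offset and the gravity exponent are illustrative values,
    not Lemmy's actual tuning."""
    return log(max(1, score + 3)) / (age_hours + 2) ** gravity

# Under this decay, a 1-hour-old comment with 10 votes outranks
# a 24-hour-old comment with 100 votes, so "first poster wins"
# no longer holds.
```

The log on the score also compresses runaway vote counts, so a comment with 1,000 upvotes doesn't bury everything else forever.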
Over the years I ended up getting a Reddit habit of replying to one of the top comments so that it could attain some visibility. I still do sometimes but less often on Lemmy.
Some of the better subreddits tried to mix it up and change how this affected upvotes. There was Muxing, etc.
But then Spez came back and didn't give af about anything except money.
Hiding vote counts is a good idea imo, since having the score visible can influence people's judgement of a comment and cause them to vote based on the existing score rather than the comment itself.
I strongly agree with this comment. To show my appreciation, you have my upvote. Had I only agreed a little bit, I might have not voted at all. If that comment had made me angry, I might have downvoted.
Actually calling these things votes instead of likes makes a lot of sense. I might not like a comment, but I might want it to be higher. I might not hate another comment, but I might want it to be lower because of other reasons.
Wow. You're extremely on point. No logical counterarguments, just several downvotes in a field I'm very familiar with. Downvotes determine the validity of a comment, not its content.
Not really, I never paid much mind to it. I’m curious about the whole industry I guess, or anything you’d like to share or set the record straight about.
Oh, there's lots I could set the record straight about, and lots I could talk about, but without a specific question that would just leave me writing an open-ended essay, and I'm not up for that right now.
The problem is that SEO has made it impossible to find accurate information easily, since even "old, trustworthy brands" can't be trusted online. [This is an excellent article that explains the problem thoroughly, and brings receipts](https://housefresh.com/david-vs-digital-goliaths/).
This is a great example of why it's so important to emphasize teaching critical thinking in school right now. Misinformation and disinformation are just going to continue to grow.
Literally why I bookmarked it. I'm an online teacher, so I'm going to advocate for adding that article to a grade 10 course that's used by thousands of students each year.
I'm a student teacher right now in elementary! I try to get my kids to think critically whenever I can. I hear kids talk about insane shit they saw/heard on tiktok (I got into an argument with a student who thought Slenderman was 100% real because of something they saw on tiktok) and I try to really get them to think and actually justify why they believe things.
A recommendation about teaching controversial topics: you need to build connection first.
I mean, that's true of all teaching, but when you start to question the (prejudiced) things they're hearing from trusted adults at home, you really need to have a strong relationship with the students.
Being an anti-racist pro-SOGI educator in conservative communities is hard.
I wish you success in your career! Teachers have such an opportunity to make a huge impact on the world.
That's a really good article, and it does a good job of highlighting the issues with modern day search results.
I've been guilty of using "best x" pages before, but if the site with the "best of" page doesn't link to specific reviews, I usually look up individual reviews of the good-sounding items on other sites.
@Fubarberry yes, I saw this a lot too: highly upvoted, confidently incorrect comments, while the real answer, or a debunking with links to factual sources, sat further down with fewer upvotes.
I am a lawyer, and I would get downvoted for posts explaining the law, with citations to the actual applicable statute, if people didn't like the statute. Using Reddit upvotes as a measure of correctness is fundamentally a dumb idea.
> I would come across a question that I was well educated on, and the top voted responses were all very clearly wrong, but sounded correct to someone who didn't know better.
The same can be said of https://news.ycombinator.com/. I wonder how much of this is due to sock puppets and bots.