The wrongful death lawsuit against several social media companies for allegedly contributing to the radicalization of a gunman who killed 10 people at a grocery store in Buffalo, New York, will be allowed to proceed.
I really don’t like cases like this, nor do I like how much the legal system seems to be pushing “guilty by proxy” rulings for a lot of school shooting cases.
It just feels very dangerous, and destined to end badly, to set a precedent where, when someone commits an atrocity, essentially every person and thing they interacted with can be held accountable with nearly the same weight as if they had committed the crime themselves.
Obviously some basic civil responsibility is needed. If someone says “I am going to blow up XYZ school, here is how,” and you hear that, yeah, that’s on you to report it. But it feels like we’re quickly slipping toward a point where you have to report a vast number of people to the police en masse if they say anything even vaguely questionable, simply to avoid the potential fallout of being associated with someone committing a crime.
It makes me really worried. I really think the internet has made it easy to be able to ‘justifiably’ accuse almost anyone or any business of a crime if a person with enough power / the state needs them put away for a time.
This appears to be more the angle of the person being fed an endless stream of hate on social media and thus becoming radicalised.
What causes them to be fed an endless stream of hate? Algorithms. Who provides those algorithms? Social media companies. Why do they do this? To maintain engagement with their sites so they can make money via advertising.
And so here we are, with sites that see you viewed 65 percent of a stream showing an angry mob and conclude you would like to see more angry mobs in your feed. Is it any wonder that shit like this happens?
The algorithm is also known to intentionally show you content that's likely to provoke you into fights online.
Which just makes all the sanctimonious screeds about avoiding echo chambers a bunch of horse shit, because that's not how social behavior works outside the net. Offline, if you go out of your way to keep arguing with people who wildly disagree with you, you're not avoiding echo chambers, you're building a class-action restraining-order case against yourself.
I’ve long held this hunch that when people’s beliefs are challenged, they tend to ‘dig in’ and wind up more resolute. (I think it’s actual science and I learned that in a sociology class many years ago but it’s been so long I can’t say with confidence if that’s the case.)
Assuming my hunch is right (or at least right enough), I think that side of social media - driving up engagement by increasing discord also winds up radicalizing people as a side effect of chasing profits.
It’s one of the things I appreciate about Lemmy. Not everyone here seems to just be looking for a fight all the time.
It depends on how their beliefs are challenged. Calling them morons won’t work. You have to gently question them about their ideas and not seem to be judging them.
Oh, yeah, absolutely.
Another commenter on this post suggested my belief on it was from an Oatmeal comic. That prompted me to search it out, and seeing it spelled out again sort of opened up the memory for me.
The class was a sociology class about 20 years ago, and the professor was talking about cognitive dissonance as it relates to folks choosing whether or not they wanted to adopt the beliefs of another group. I don’t think he got into how to actually challenge beliefs in a constructive way, since he was discussing how seemingly small rifts can turn into big disagreements between social groups, but subsequent life experience and a lot of good articles about folks working with radicals to reform their beliefs confirm exactly what you commented.
Nah. I picked that up about 20 years ago, but the comic is a great one.
I haven’t read The Oatmeal in a while. I guess I know what I’ll be doing later tonight!
Absolutely. There's a huge difference between hate speech existing and funneling a firehose of it at someone to keep them engaged. It's not clear how this will shake out, but I doubt it will be the end of free speech. If it exists and you actively seek it out, that's something else.
I think the design of media products around maximally addictive individually targeted algorithms in combination with content the platform does not control and isn't responsible for is dangerous. Such an algorithm will find the people most susceptible to everything from racist conspiracy theories to eating disorder content and show them more of that. Attempts to moderate away the worst examples of it just result in people making variations that don't technically violate the rules.
With that said, laws made and legal precedents set in response to tragedies are often ill-considered, and I don't like this case. I especially don't like that it includes Reddit, which was not using that type of individualized algorithm to my knowledge.
This is the real shit right here. The problem is that social media companies' data show that negativity and hate keep people on their sites longer, which means they view more advertisements than they would with positive content.
It is human nature to engage with disagreeable topics more than agreeable ones, and social media companies are exploiting that for profit.
We need to regulate algorithms and force them to be open source, so that anybody can audit them. They will try to hide behind "AI" and "trade secret" excuses, but lawmakers have to see above that bullshit.
Unfortunately, US lawmakers are both stupid and corrupt, so it's unlikely that we'll see proper change, and more likely that we'll see shit like "banning all social media from foreign adversaries" when the US-based social media companies are largely the cause of all these problems. I'm sure the US intelligence agencies don't want them to change either, since those companies provide large swaths of personal data to them.
While this is true for Facebook and YouTube - last time I checked, reddit doesn't personalise feeds in that way. It was my impression that if two people subscribe to the same subreddits, they will see the exact same posts, based on time and upvotes.
Then again, I only ever used third party apps and old.reddit.com, so that might have changed since then.
Mate, I never got the same homepage twice on my old reddit account. I dunno how you can claim that two people with identical subs would see the same page. That's just patently not true and hasn't been for years.
Quite simple, aniki. The feeds were ordered by hot, new, or top.
New was ORDER BY date DESC. Top was ORDER BY upvotes DESC. And hot was a slightly more complicated order that used a mixture of upvotes and time.
You can easily verify this by opening 2 different browsers in incognito mode and going to the old reddit frontpage - I get the same results in either. Again - I can't account for the new reddit site because I never used it for more than a few minutes, but that's definitely how the old one worked, and it still seems to.
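For what it's worth, the old "hot" ordering was public after Reddit open-sourced its codebase. Here's a rough sketch from memory of that formula (treat the exact constants as approximate, not current production code):

```python
from datetime import datetime, timezone
from math import log10

# Epoch used in Reddit's old open-sourced ranking code.
EPOCH = datetime(2005, 12, 8, 7, 46, 43, tzinfo=timezone.utc)

def hot(ups: int, downs: int, posted: datetime) -> float:
    """Classic 'hot' rank: log-scaled net votes plus a time bonus."""
    score = ups - downs
    order = log10(max(abs(score), 1))
    sign = 1 if score > 0 else -1 if score < 0 else 0
    seconds = (posted - EPOCH).total_seconds()
    # Roughly every 12.5 hours of age outweighs one order of magnitude of votes.
    return round(sign * order + seconds / 45000, 7)
```

The key point for this thread: the inputs are only votes and timestamps, so two users looking at the same subreddit at the same time get the same ordering.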
It's probably not true anymore, but at the time this guy was being radicalized, you're right, it wasn't algorithmically catered to them. At least not in the sense that it was intentionally exposing them to a specific type of content.
I suppose you can think of the way reddit works (or used to work) as being content agnostic. The algorithm is not aware of the sorts of things it's suggesting to you, it's just showing you things based on subreddit popularity and user voting, regardless of what it is.
In the case of YouTube and Facebook, their algorithms are taking into account the actual content and funneling you towards similar content algorithmically, in a way that is unique to you. Which means at some point their algorithm is acknowledging "this content has problematic elements, let's suggest more problematic content"
(Again, modern reddit, at least on the app, is likely engaging in this now to some degree)
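The distinction is easy to see in toy code (hypothetical posts and tags, purely to illustrate the two ranking styles, not any platform's real implementation):

```python
from collections import Counter

posts = [
    {"id": 1, "tags": {"cooking"}, "votes": 900},
    {"id": 2, "tags": {"politics", "outrage"}, "votes": 500},
    {"id": 3, "tags": {"outrage"}, "votes": 300},
]

def content_agnostic(posts):
    """Old-Reddit style: everyone sees the same popularity ordering."""
    return [p["id"] for p in sorted(posts, key=lambda p: -p["votes"])]

def content_aware(posts, history):
    """YouTube/Facebook style: rank by overlap with what *this* user engaged with."""
    seen = Counter(tag for p in history for tag in p["tags"])
    return [p["id"] for p in
            sorted(posts, key=lambda p: -sum(seen[t] for t in p["tags"]))]

# Everyone gets [1, 2, 3] from the agnostic ranking; a user whose history
# is post 3 gets an outrage-first feed [2, 3, 1] from the same pool of posts.
```

The second function is the one that "acknowledges" content: each click on an outrage post makes the next outrage post rank higher, for that user only.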
Attempts to moderate away the worst examples of it just result in people making variations that don't technically violate the rules.
The problem then becomes if the clearly defined rules aren't enough, then the people that run these sites need to start making individual judgment calls based on...well, their gut, really. And that creates a lot of issues if the site in question could be held accountable for making a poor call or overlooking something.
The threat of legal repercussions hanging over them is going to make them default to the most strict actions, and that's kind of a problem if there isn't a clear definition of what things need to be actioned against.
Bullshit. There's no slippery slope here. You act like these social media companies just stumbled onto algorithms. They didn't, they designed these intentionally to drive engagement up.
Demanding that they change their algorithms to stop intentionally driving negativity and extremism isn't dystopian at all, and it's very frustrating that you think it is. If you choose to do nothing about this issue I promise you we'll be living in a fascist nation within 10 years, and it won't be an accident.
There's nothing ambiguous about this. Give me a break. We're demanding that social media companies stop deliberately driving negativity and extremism to get clicks. This has fuck all to do with free speech. What they're doing isn't "free speech", it's mass manipulation, and it's very deliberate. And it isn't disclosed to users at any point, which also makes it fraudulent.
It's incredibly ironic that you're accusing people of an effort to control expression when that's literally what social media has been doing since the beginning. They're the ones trying to turn the world into a dystopia, not the other way around.
In her decision, the judge said that the plaintiffs may proceed with their lawsuit, which claims social media companies — like Meta, Alphabet, Reddit and 4chan — “profit from the racist, antisemitic, and violent material displayed on their platforms to maximize user engagement.”
I don't think you understand the issue. I'm very disappointed to see that this is the top comment. This wasn't an accident. These social media companies deliberately feed people the most upsetting and extreme material they can. They're intentionally radicalizing people to make money from engagement.
They're absolutely responsible for what they've done, and it isn't "by proxy", it's extremely direct and deliberate. It's long past time that courts held them liable. What they're doing is criminal.
The algorithms themselves. This decision opens the algorithms up to discovery and now we get to see exactly how various topics are weighted. These companies will sink or swim by their algorithms.
I do. I just very much understand the extent that the justice system will take decisions like this and utilize them to accuse any person or business (including you!) of a crime that they can then “prove” they were at fault for.
I think the distinction here is between people and businesses. Is it the fault of people on social media for the acts of others? No. Is it the fault of social media for cultivating an environment that radicalizes people into committing mass shootings? Yes. The blame here is on the social media companies for not doing more to stop the spread of this kind of content. Because yes, even though that won't stop this kind of content from existing, making it harder to access and find will at least reduce the number of people who go down this path.
I agree, but I want to clarify. It's not about making this material harder to access. It's about not deliberately serving that material to people who weren't looking it up in the first place in order to get more clicks.
There's a huge difference between a user looking up extreme content on purpose and social media serving extreme content to unsuspecting people because the company knows it will upset them.
Is it the fault of social media for cultivating an environment that radicalizes people into committing mass shootings? Yes.
Really? Then add videogames and heavy metal to the list. And why not most organized religions? Same argument, zero sense. There's way more at play than Person watches X content = person is now radicalized, unless we're talking about someone with severe cognitive deficit.
And since this is the US... perhaps add easy access to guns? Nah, that's totally unrelated.
"Person watches X creative and clearly fictional content" is not analogous in any way to "person watches X video essay crafted to look like a documentary, but actually just full of lies and propaganda"
Sure, and I get that for like, healthcare. But ‘systemic solutions’ as they pertain to “what constitutes a crime” lead to police states really quickly imo
The article is about lawsuits. Where are you getting this idea that anyone suggested criminalizing people? Stop putting words in other people's mouths. The most that's been suggested in this thread is regulating social media algorithms, not locking people up.
Drop the melodrama and paranoia. It's getting difficult to take you seriously when you keep making shit up about other people's positions.
Do you not think if someone encouraged a murderer they should be held accountable? It's not everyone they interacted with, there has to be reasonable suspicion they contributed.
Depends on what you mean by "encouraged". That is going to need a very precise definition in these cases.
And the point isn't that people shouldn't be held accountable, it's that there are a lot of gray areas here, we need to be careful how we navigate them. Irresponsible rulings or poorly implemented laws can destabilize everything that makes the internet worthwhile.
I didn’t say that at all, and I think you know I didn’t unless you really didn’t actually read my comment.
I am not talking about encouraging someone to murder. I specifically said that in overt cases there is some common-sense civil responsibility. I am talking about the potential for the police to break down your door because you Facebook messaged a guy you’re friends with about your favorite local gun store, and that guy also happens to listen to death metal and take antidepressants and the state has deemed him a risk factor level 3.
I must have misunderstood you then, but this still seems like a pretty clear case where the platforms themselves, not even individual people, did the encouraging. I don't think there's any new precedent being set here.
Rulings often start at the corporation / large major entity level and work their way down to the individual. Think piracy laws. At first, only giant, clear bootlegging operations were really prosecuted for that, and then people torrenting content for profit, and then people torrenting large amounts of content for free - and now we currently exist in an environment where you can torrent a movie or whatever and probably be fine, but also if the criminal justice system wants to they can (and have) easily hit anyone who does with a charge for tens of thousands of dollars or years of jail time.
Will it happen to the vast majority of people who torrent media casually? No. But we currently exist in an environment where if you get unlucky enough or someone wants to punish you for it enough, you can essentially have this massive sentence handed down to you almost “at random”.
Literally no one suggested that end users should be arrested for jokes on the internet. Fuck off with your attempts at trying to distract from the real issue.
This wasn't just a content issue. Reddit actively banned people for reporting violent content too much. They literally engaged with and protected these communities, even as people yelled that they were going to get someone hurt.
Also worth remembering, this opens up avenues for lawsuits on other types of "harm".
We have states that have outlawed abortion. What do those sites do when those states argue social media should be "held accountable" for all the women who are provided information on abortion access through YouTube, Facebook, reddit, etc?
I dunno about social media companies but I quite agree that the party who got the gunman the gun should share the punishment for the crime.
Firearms should be titled and insured, and the owner should have an imposed duty to secure them. The owner ought to face criminal penalties if a firearm titled to them is used by someone else to commit a crime: either they handed a killer a loaded gun, or they inadequately secured a firearm which was then stolen and used to commit a crime. Either way, they failed their responsibility to society as a firearm owner and must face consequences for it.
This guy seems to have bought the gun legally at a gun store, after filling out the forms and passing the background check. You may be thinking of the guy in Maine whose parents bought him a gun when he was obviously dangerous. They were just convicted of involuntary manslaughter for that, iirc.
Well you were talking about charging the gun owner if someone else commits a crime with their gun. That's unrelated to this case where the shooter was the gun owner.
The lawsuit here is about radicalization but if we're pursuing companies who do that, I'd start with Fox News.
If you lend your brother, who you know is on antidepressants, a long extension cord he tells you is for his back patio - and he hangs himself with it, are you ready to be accused of being culpable for your brother’s death?
Oh, it turns out an extension cord has a side use that isn't related to its primary purpose. What's the analogous innocuous use of a semiautomatic handgun?
Self defense? You don’t have to be a 2A diehard to understand that it’s still a legal object. What’s the “innocuous use” of a VPN? Or a torrenting client? Should we imprison everyone who ever sends a link about one of these to someone who seems interested in their use?
You're deliberately ignoring the point that the primary use of a semiautomatic pistol is killing people, whether self-defense or mass murder.
Should you be culpable for giving your brother an extension cord if he lies that it is for the porch? Not really.
Should you be culpable for giving your brother a gun if he lies that he needs it for self defense? IDK the answer, but it's absolutely not equivalent.
It is a higher level of responsibility, you know lives are in danger if you give them a tool for killing. I don't think it's unreasonable if there is a higher standard for loaning it out or leaving it unsecured.
“Sorry bro. I’d love to go target shooting with you, but you started taking Vyvanse 6 months ago and I’m worried that if you blow your brains out the state will throw me in prison for 15 years.”
Besides, you’re ignoring the point. This article isn’t about a gun, it’s about basically “this person saw content we didn’t make on our website.” You think that won’t be extended to general content sent from one person to another? That if you send some pro-Palestine articles to your buddy, and then a year or two later your buddy gets busted at an anti-Zionist rally, now you’re a felon because you enabled that? Boy, that would be an easy way for some hypothetical future administration to control speech!!
You might live in a very nice bubble, but not everyone will.
So you need a strawman argument transitioning from loaning a weapon unsupervised to someone we know is depressed. Now it is just target shooting with them, so distancing the loan aspect and adding a presumption of using the item together.
This is a side discussion. You are the one who decided to write strawman arguments relating guns to extension cords, so I thought it was reasonable to respond to that. It seems like you're upset that your argument doesn't make sense under closer inspection and you want to pull the ejection lever to escape. Okay, it's done.
The article is about a civil lawsuit, nobody is going to jail. Nobody is going to be able to take a precedent and sue me, an individual, over sharing articles to friends and family, because the algorithm is a key part of the argument.
Yeah man. Even if you loan it to them you shouldn’t be charged.
Lmfao okay yeah sure man. No one is this year. See you in 10. I know it’s easy to want to retreat to kind of a naive “this would never happen to ME!” worldview, and yeah. It probably won’t. But you have to consider all the innocent people it unjustly will happen to in coming years.
Also, not what a strawman is. You’re not really good at this.
Also you still can’t respond to anything not related to guns. All those VPN and torrenting points went right over your head huh? Convenient. When you get busted for talking about how to store “several TB of photos” to some guy that turns out to be hoarding CP I hope the “assisted in preserving pedophilic content” charge rests easy on you
You're really deluded into thinking you're correct and that your strawmen are good arguments. "If we do anything at all about this, then extension cords will be illegal," really wet sobbing.
"If this civil lawsuit is allowed to proceed then we are already under 1984's Big Brother police state, they are coming for you," wild. Your imagination is a very frightening place. You feel threatened by so many things. Must be hard.
Why would I participate in your side quests? You like writing strawmen, have fun with it on your own.
“Why would I participate in a conversation about the very real slippery slope of vague, easily exploited criminal rulings? That way I would have to think about it.”
Knowingly manipulating people into suicide is a crime and people have already been found guilty of doing it.
So the answer is obvious. If you knowingly encourage a vulnerable person to commit suicide, and your intent can be proved, you can and should be held accountable for manslaughter.
That's what social media companies are doing. They aren't loaning you extremist ideas to help you. That's a terrible analogy. They're intentionally serving extreme content to drive you into more and more upsetting spaces, while pretending that there aren't any consequences for doing so.
Did he also use the cord as improvised ammunition to shoot up the local elementary school, to warrant it being considered a firearm?
I'm more confused about where I got such a lengthy extension cord from! Am I an event manager? Do I have generators I'm running cable from? Do I get to meet famous people on the job? Do I specialize in fairground festivals?
And ironically the gun manufacturers or politicians who support lax gun laws are not included in these “nets”. A radicalized individual with a butcher knife can’t possibly do as much damage as one with a gun.