YouTube and Reddit are sued for allegedly enabling the racist mass shooting in Buffalo that left 10 dead
The complementary lawsuits claim that the massacre in 2022 was made possible by tech giants, a local gun shop, and the gunman’s parents.
interesting… it’s pretty unlikely the sites will be found liable, but it sure does shine a spotlight on how each is a magnet for alt-right crazies. I wonder if that will have any effect on their moderation?
They're also "magnets" for progressive, liberal, conservative, and all other crazies and normal people. That's mostly because everyone uses them: the most popular video-sharing site and one of the most popular social media sites, respectively.
Right, but since YouTube and Facebook are two of the most popular sites in the world, they aren't really just magnets for alt-right crazies; they appeal to almost everybody.
Ok so isn’t the issue at hand whether the sites are to blame?
let’s break this down so I can answer you in what I think is an honest way:
1 - Are the sites legally responsible for the content they host, generally speaking and/or in this specific context of radicalization and results like these?
2 - Do these sites bear any social/moral responsibility to moderate their more extreme content in good faith, to try to prevent this sort of result?
3 - Is there an overlap of 1 and 2?
1 - this is for a court to decide. I’m not familiar enough with the specifics of case law, or with the suits being brought, to know exactly what is being alleged. I can’t opine on this other than to say that, from what I do know, it’s unlikely a court would hold these sites legally responsible.
2 - I fully believe that, yes, massive, general-use public sites like these have a social and moral responsibility to keep their platforms safe. How, and what that means in practice, is a matter for much debate, and I’m sure people here will do just that.
3 - is there overlap? again, legally, I’m not sure, but there might be, and in the near future there might be much more. should there be more? that’s another subject for debate.
Ok so isn’t the issue at hand whether the sites are to blame?
I didn't say they were. Facebook and YouTube didn't commit the shooting, and there isn't anything particularly special about them that would disproportionately attract alt-right crazies. They're not hate sites.
Trump supporters like to bitch that Facebook has been censoring their opinions, especially during 2020 and 2021. They felt the same way about Twitter, until Elon turned it into a hellhole.