“We don’t think that censorship (including through demonetizing publications) makes the problem go away.”
More than 200 Substack authors asked the platform to explain why it’s “platforming and monetizing Nazis,” and now they have an answer straight from co-founder Hamish McKenzie:
I just want to make it clear that we don’t like Nazis either—we wish no-one held those views. But some people do hold those and other extreme views. Given that, we don’t think that censorship (including through demonetizing publications) makes the problem go away—in fact, it makes it worse.
While McKenzie offers no evidence to back these ideas, this tracks with the company’s previous stance on taking a hands-off approach to moderation. In April, Substack CEO Chris Best appeared on the Decoder podcast and refused to answer moderation questions. “We’re not going to get into specific ‘would you or won’t you’ content moderation questions” over the issue of overt racism being published on the platform, Best said. McKenzie followed up later with a similar statement to the one today, saying “we don’t like or condone bigotry in any form.”
It all depends on the nature and goals of the platform.
It's one thing to create a platform for positivity and a brave new world (in the good sense). It's good we have those, and this should probably be the approach of mainstream media.
It's another to create a truly free-speech platform. You can't claim free speech and then ideologically ban someone, even if that someone is bad. And such venues should exist, because among nine terrible ideas (like Nazism) lies one that is underappreciated, misunderstood, and wrongly considered bad. Feminism was considered bad. LGBT people were considered bad, and so on. And once you start banning some ideas, you have carte blanche to ban everyone you don't like, including people who actually promote healthy, positive ideas and values but are misunderstood.
Leaving only the first option means starting a circlejerk where no good new idea has a chance to flourish.
As for Nazis, homophobes, and other people with terrible ideas: you really can't overcome them by sweeping them under the rug. We need to develop patience and sharpen our rhetoric to counter them, and to quickly isolate the grain of truth from which all their misconceptions grow, so the whole structure shatters. That's literally the only way to combat an ideology: by exposing how deeply wrong and flawed it is, and providing arguments.
Will there be people following such ideologies out of spite and emotion, beyond all reason? Sure. But by confronting them on equal footing, we can show everyone else how stupid their arguments are, even on a dedicated free-speech platform.
Because Nazism is not bad simply because we decided it is. It is a faulty ideology meant to distract people from the real sources of their struggles while expending millions of lives in the process.
It strips the masses of a critical understanding of economic processes, fooling them into believing their problems are caused by some other nation rather than by their own elites, a fact that is well researched and obvious to almost every other modern individual.
To keep people resolute amid the deep economic crisis that applying such an ideology directly causes (through rampant expropriation, paranoid protectionism, economic mismanagement, and the removal of active economic participants), a Nazi government always needs to wage war; this way it can blame its failures on its enemies. And war inevitably comes with millions of deaths, deaths directly attributable to the ideology because it cannot run without them.
Finally, it's an ideology based on hatred of a group defined by an immutable property: it ignores the differences all of us have, attributes a certain trait to an entire nation (something we know isn't true), and then exterminates people based on that association known to be wrong.
No matter how you look at it, from any rational angle, Nazism is just plain stupid, and in its stupidity it produces extreme and unnecessary suffering.
With all that being said, again, I don't think platforms like Substack should become mainstream, and we should moderate general-topic places to exclude Nazis and other harmful actors. But we sure as hell need such platforms to exist.
Because the only thing worse than Nazis allowed to influence us is the tyranny of subjective good.
Recently, when discussing defederated instances, I noticed something interesting: people cheered the defederation of instances of Nazis and... pedophiles.
An average person would see no issue here. Right, one more terrible group banned! Take those perverts down! But there's a catch that I discovered quite a while ago, and it's a rabbit hole like no other.
First, a pedophile is not a child molester. We wrongly equate the two all the time, and those words have become synonymous.
Second, pedophilia is an immutable trait; unlike being a Nazi, it's not something one can decide to stop being.
Third, many of the pedophile instances, including those mass-banned, actually host anti-contact pedophiles, i.e. those specifically committed to never touching or interacting with a child in any inappropriate way; said instances also generally prohibit any form of child sexual imagery, or allow only fictional drawings. And early research suggests this actually helps them. We brought up some reading and scientific articles on the matter throughout the discussion. Conclusion: there are uncertainties, but it seems to work in protecting kids and reducing suicide rates.
And when you see something like that, you clearly understand there are a lot of things in the world people still heavily misunderstand while feeling certain about positions they haven't spent five minutes researching, and that people are already on the slippery slope, banning groups they haven't taken the time and effort to comprehend. And there's a lot more of this than just pedophiles; it's simply a very vivid example, one that will probably make most readers uncomfortable and illustrate the concept best.
Also, I'm fully aware that most people will likely choose to downvote this, comment nothing, and end up thinking I support child molesters (hell no; if you support child molestation, go get mental help ASAP, fucking kids is very bad).
sorry what exactly about banning nazis causes one to ban non-offender pedophile support groups. like what is the actual causal link you're suggesting? if you just mean "I noticed random people endorse this thing I have no opinion on, and also this similar sounding thing I think is bad," that's not super compelling
I'm saying that banning Nazis comes from public opinion and perception (which, to my knowledge, is correct), and that banning pedophiles comes from the public opinion of the very same people (which, as far as I know, is wrong). Both groups (the third being instances full of bots and spam) are heavily banned across the Fediverse, so it's not "some people's opinion" but rather, essentially, policy for the majority of instances.
My point is that the organized banning of groups that shouldn't be banned, and hatred toward groups that shouldn't be hated, hasn't stopped; and without venues for free speech, we may never realize that, and will keep hating and banning those we actually need to support to make this world a better place.
by causal link, I mean how does banning nazis cause support groups for non-offending pedophiles to get banned. like how does that actually happen. please be as specific as you can be
It's not that banning Nazis directly causes banning non-offending pedophiles; banning people considered dangerous causes both, with Nazis just setting the precedent (because they are obviously bad, and there's little disagreement). The Fediverse is just one example where banning Nazis doesn't stop there. Other groups are banned too, sometimes without much consideration, and this happens on many different platforms: Tumblr, Discord, Facebook, and even daddy Elon's Xitter, to name a few.
This is part of my argument for why we need spaces with completely free speech. We cannot expect instance admins, or even platform owners, to be completely objective in their judgments of right and wrong, and we can't trust them to be unaffected by societal stereotypes.
Moreover, even in such an ideal scenario where they are fully objective, their userbase might think differently, forcing admins to take measures against various marginalized groups.
At that point, it seems to me the only way out of this conundrum is having some platforms (not mainstream ones, mind you) that allow everything: platforms from which positive but initially rejected ideas can spread.
the site you are imagining, the supposed free speech site? it converges to gab. this dynamic is basic and I can't take you seriously if you don't get this.
nazis are encouraged to be equal voices on a platform
they use the platform's reach to radicalize fence sitters
other users, realizing their digital roommates are Nazis, are alarmed and leave
now it's a nazi site
what exactly do you think substack will consist of in two years if they don't do a 180? the entire reason we're having this conversation right now is that a bunch of substack writers said they would rather leave than hang out with nazis
I'm talking specifically about instances with strong rules, either prohibiting any child imagery or allowing only drawings (which describes just about any anti-contact place). Both types are heavily defederated from, and barely anyone distinguishes between them and literal child porn instances (which should be not just defederated but seized by the authorities, with their admins brought to justice).
I've updated the third bullet point in accordance with your comment, thank you.
Not banning a group of people that is intolerant and promotes killing the people they don't like is totally a good idea. That worked so well before...
If anyone can't tell, my first paragraph is sarcastic.
I am aware that people outside of Europe and Rwanda are absolutist on free speech, but I think that is because they have not experienced firsthand what it's like when hate speech becomes unbridled. It's a classical liberal value to promote free speech at all costs, believing that good ideas will filter out from a stream of bad ones because humans are inherently rational. Well, for many in Germany and Rwanda before, it made sense to kill "others" because those at the top said so. I am not going to call old-school liberals naive, because of course they did not foresee free speech morphing into hate speech and then turning unspeakably evil action into reality centuries later.
As a side note, the US actually considered electronically jamming a Rwandan government-run radio station that propagated the dehumanisation of Tutsis, but opted not to on the principle of freedom of speech. That radio station contributed to fomenting the hate that led to the Rwandan genocide.
So, no: the intolerant have no place in society. Giving the intolerant a platform will eventually stamp out others, and ultimately free speech and liberty themselves. Banning Nazis on a platform is a no-brainer. After all they have done, why on earth would the intolerant be tolerated?
I see your point, and this is exactly why I say this shouldn't happen on mainstream media; it should always be confined to platforms one seeks out specifically for controversy.
We should not allow nor tolerate Nazism or other things like that on mainstream Lemmy instances, for example.
However, we have to set aside some place for everyone to have a voice. In many places, my calling for communism, for example, would be met with an instant permaban, with people saying I advocate gulags and bloody wars (I do not). And I always wish to have a place to voice my ideas, because I think they're right, regardless of the sentiment that may push mainstream platforms in one direction or the other. But that means you'll end up needing a platform that allows everyone, and such a platform should exist as well.
Those are just two systems, and both are necessary for us to prosper.