“We don’t think that censorship (including through demonetizing publications) makes the problem go away.”
More than 200 Substack authors asked the platform to explain why it’s “platforming and monetizing Nazis,” and now they have an answer straight from co-founder Hamish McKenzie:
I just want to make it clear that we don’t like Nazis either—we wish no-one held those views. But some people do hold those and other extreme views. Given that, we don’t think that censorship (including through demonetizing publications) makes the problem go away—in fact, it makes it worse.
While McKenzie offers no evidence to back these ideas, this tracks with the company’s previous stance of taking a hands-off approach to moderation. In April, Substack CEO Chris Best appeared on the Decoder podcast and refused to answer moderation questions. “We’re not going to get into specific ‘would you or won’t you’ content moderation questions” over the issue of overt racism being published on the platform, Best said. McKenzie later followed up with a statement similar to the one today, saying “we don’t like or condone bigotry in any form.”
Freedom of speech doesn't mean you're obligated to provide a platform so shitty people can use it to share shitty ideals. It simply means the government won't arrest you for what you say.
Websites can do whatever they want, including deciding that they don't want to be a platform for hate speech. If people are seeking a place for this kind of conversation, and they want it badly enough, they can run their own website.
Imagine if you invited a friend of a friend over, and they were sharing nasty ideals at your Christmas party. And they brought their friends. Are you just going to sit there and let them turn your dinner into a political rally? No, you're going to kick them out. It's your dinner, like it is your website. If you don't kick them out, then at some level, you're aligning with them.
Yea... Meta took the same "free peaches" approach and the entire fucking globe is now dealing with various versions of white nationalism. So like, can we actually give censorship of hate a fucking try for once? I'm willing to go down that road.
To be clear — what McKenzie is saying here is that Substack will continue to pay Nazis to write Nazi essays. Not just that they will host Nazi essays (at Substack's cost), but they will pay for them.
They are, in effect, hiring Nazis to compose Nazi essays.
“we don’t like or condone bigotry in any form.” I mean, they are literally condoning bigotry.
"His response similarly doesn’t engage other questions from the Substackers Against Nazis authors, like why these policies allow it to moderate spam and newsletters from sex workers but not Nazis."
This would be silly even if they didn’t moderate at all, but they do. They don’t allow sex workers to use their service. And we aren’t talking about “Nazis” as a code word for the far right. The complaint letter cited literal Nazis with swastika logos.
Plus, how grand are his delusions of grandeur if he thinks his fucking glorified email blast manager is the one true hope for free speech? Let the Nazis self-host an open source solution (like Ghost).
All joking aside, silencing Nazis and deplatforming them is LITERALLY fighting against them. How is allowing them to make money and market themselves on your platform doing anything to stem the tide of Nazism? Obviously they're playing culture war games and saying they're not.
Okay fine, I'm never clicking on a substack link again.
And after, say, a grace period of about six months to move elsewhere, I'm going to assume anyone still associating with the service is at best a Nazi sympathiser.
Go ahead, be a Nazi bar, I'm sure their money is worth it
Facebook just shrugs off the rampant white supremacist content on its platform with great success, you can literally put up a profile photo with an "It's OK to be white" frame, or "white power" supplied by Facebook. I guess Substack thinks that if it works for Facebook it should be fine for them.
Incidentally Reddit banned me for posting pictures of Nazis on r/beholdthemasterrace, a subreddit for mocking white supremacy, when some Nazis went and complained to Reddit admins I was doing it. Reddit also sides with Nazis, they're just quieter about it.
On one hand, Substack is within its rights, and as a journalistic organization, they are in the right.
The issue is: once you serve a Nazi in your bar, you become a Nazi bar. This is no longer a marginalized viewpoint you can ignore. It's actively recruiting, and frightening. Inaction is enabling. Substack is going to become shitty, and fast. They will lose high-engagement users, first when the ones who protested pile out for another platform, and then quickly when the quality dips.
Also, their cavalier attitude will change when Stripe steps in.
In fact, stamping out dissent and controlling people is incredibly effective. Ask any dictator.
Control is effective and necessary when it comes to people actively trying to damage society. No, I’m not supporting dictatorship or authoritarianism, just pointing out that control is effective.
Being a sect of destructive assholes doesn’t mean you should get a platform.
McKenzie needs to read that Reddit story about the bartender who kicked out a guy with the Third Reich eagle insignia on his shirt despite him quietly minding his own business. I really don't want Substack to "suddenly become a Nazi bar." I'm just a reader, but if I ever start a newsletter I may reconsider my platform. I am on the basic free plan for all the Substack channels I read. I've thought about upgrading my subscription to some, but now I will hesitate.
Free speech POV aside, Substack is running a business as a publisher of content. They sell advertising space. You know what devalues your advertising space? Unsafe, hateful content. Advertisers care about "brand safety" in terms of what their ads appear next to. You can't run a good advertising sales business if the advertisers don't have guarantees on brand safety.
Gen Z needs to understand the historical lesson that the Blues Brothers taught those before them. Illinois Nazis exist, and some days they demonstrate, as is their right to freedom of speech - but this is as much an opportunity to humiliate them and openly critique the mindset as anything else. Dark little underground communities flourish behind closed doors.
TIL that Substack is apparently a bunch of crypto-fascists who expect people to believe they don't support Nazis, they just give them money and a place at their table to talk about it.
This tracks with my previous attempts at reporting that Sinfest guy. Posts hundreds of comics that blatantly break multiple official substack content guidelines and I get the effective equivalent of a promise for "action" combined with a dismissive eye roll. They completely ignored my follow-up email detailing the complete lack of action and the dozen or so new content guideline violations.
I just want to make it clear that we don’t like Nazis either—we wish no-one held those views. But some people do hold those and other extreme views. Given that, we don’t think that censorship (including through demonetizing publications) makes the problem go away—in fact, it makes it worse.
Are Musky and Hamish McKenzie friends? Because that sounds like the same bullshit he would say. Also, hasn't deplatforming actually been shown to work?
For anyone who remembers the interview the CEO did with the Verge back when they launched Notes, this isn't surprising at all.
You can see a transcript here. The relevant section can be found by searching for "all brown people are animals", or more specifically just "animals", and reading on from there.
I'm not sure if the video footage of the interview is still available, but it's even worse, because you can see that the CEO is completely lost when talking about the idea of moderating anything and basically shuts down because they have nothing to say, all while the interviewer politely berates them about how they're obviously failing a litmus test.
Do note that above the point where "animals" occurs there is some post-hoc context provided by the interviewer (perhaps why the video is no longer easily available?), where they point out that the question they asked and the response they got weren't exactly as extreme as they first appeared. But they also point out that it's still very notable despite the slightly mitigating correction, and I'd agree entirely, especially if you watch(ed) the video and clocked the CEO's demeanor and lack of any intelligent thought on the issue.
Ugh, so disgusting. The only reason I use Substack is to follow Gregory Warner after NPR cancelled Rough Translation. I really hope he moves somewhere else.
Submitted for good faith discussion: Substack shouldn’t decide what we read. The reason it caught my attention is that it's co-signed by Edward Snowden and Richard Dawkins, who evidently both have blogs there I never knew about.
I'm not sure how many of the people who decide to comment on these stories actually read up on them first, but I did, including actually reading the linked Atlantic article. I would personally feel very uncomfortable about voluntarily sharing a space with someone who unironically writes a post called "Vaccines Are Jew Witchcraftery". However, the Atlantic article also notes:
Experts on extremist communication, such as Whitney Phillips, the University of Oregon journalism professor, caution that simply banning hate groups from a platform—even if sometimes necessary from a business standpoint—can end up redounding to the extremists’ benefit by making them seem like victims of an overweening censorship regime. “It feeds into this narrative of liberal censorship of conservatives,” Phillips told me, “even if the views in question are really extreme.”
Structurally this is where a comment would usually have a conclusion to reinforce a position, but I don't personally know what I support doing here.
Ehhh, it's one of those things where I agree with the principle, but the principle fails. It's the so-called tolerance paradox (which isn't actually a paradox at all, but that's tangential).
On principle, no company should be in the business of deciding what is and isn't acceptable "speech". That's simply not something we really want happening.
But then there are Nazis and other outright insane bigots. Even so, we still don't really want companies making that call, because they'll decide on the side of profit, period. If enough of the Nazi types get enough power and money behind them, every single fucking company out there that isn't owned by a single person, or by a very small group of people who share the same ideals, is going to decide that the Nazi bullshit is the only acceptable speech.
This is something that has to come from the bottom up and be decided on a legal level first. We absolutely can ban Nazi-type bullshit if we want to. There's plenty of room for it to be treated as the incitement to violence that it is. There need to be very specific, very limited definitions governing what is and isn't part of that.
And the limitations have to be impossible to expand without starting all the way over with the kind of stringency it takes to amend the constitution.
That takes it out of the hands of corporations, and makes it very difficult to game. But it has to come from us, as a people first.
(transcribed from a series of tweets) - iamragesparkle
I was at a shitty crustpunk bar once getting an after-work beer. One of those shitholes where the bartenders clearly hate you. So the bartender and I were ignoring one another when someone sits next to me and he immediately says, "no. get out."
And the dude next to me says, "hey i'm not doing anything, i'm a paying customer." and the bartender reaches under the counter for a bat or something and says, "out. now." and the dude leaves, kind of yelling. And he was dressed in a punk uniform, I noticed
Anyway, I asked what that was about and the bartender was like, "you didn't see his vest but it was all nazi shit. Iron crosses and stuff. You get to recognize them."
And i was like, ohok and he continues.
"you have to nip it in the bud immediately. These guys come in and it's always a nice, polite one. And you serve them because you don't want to cause a scene. And then they become a regular and after awhile they bring a friend. And that dude is cool too.
And then THEY bring friends and the friends bring friends and they stop being cool and then you realize, oh shit, this is a Nazi bar now. And it's too late because they're entrenched and if you try to kick them out, they cause a PROBLEM. So you have to shut them down.
And i was like, 'oh damn.' and he said "yeah, you have to ignore their reasonable arguments because their end goal is to be terrible, awful people."
And then he went back to ignoring me. But I haven't forgotten that at all.
In a 2020 letter from Substack leaders, including Best and McKenzie, the company wrote, “We just disagree with those who would seek to tightly constrain the bounds of acceptable discourse.”
The Atlantic also pointed out an episode of McKenzie’s podcast with a guest, Richard Hanania, who has published racist views under a pseudonym.
McKenzie does, however, cite another Substack author who describes its approach to extremism as one that is “working the best.” What it’s being compared to, or by what measure, is left up to the reader’s interpretation.
Monetization of such content is questionable for sure, but I agree with what he says about the propagation of such extreme views. Simply being unaware of such things won't make them go away. People should know who these people are and why they are the way they are, so we can deal with them better. There's a lot we could do better but can't, because of limited awareness and our own negative attitude toward dealing with them.
I can ALMOST see his point... If you push them underground, you push them to find a space where nobody will challenge them, and they can grow stronger in that echo chamber.
Allowing them to be exposed to the light of day and fresh air makes their evil apparent to all and searchable.
And besides, "Punch a Nazi Day" just isn't the same without Nazis. :)
Honestly? Unless I'm missing something, this sounds fine.
The internet I grew up on had Nazis, racists, Art Bell, UFO people, software pirates, and pornographers. The ACLU defended KKK rallies. Some of the people who were allowed a platform, that "everyone hated" and a lot of people wanted to censor, were people like Noam Chomsky who I liked hearing from.
I think there's a difference between "moderation" meaning "we're going to prevent Nazis from ruining our platform for people who don't want to hear from them" -- which, to me, sounds fine and in fact necessary in the current political climate -- and "moderation" meaning "if you hold the wrong sort of views you're not allowed to express them on my platform." The Nazi bar analogy, and defederating with toxic Lemmy instances, refers to the first situation. If I understand Substack's platform properly, it's the second: Only the people who want to follow the Nazis can see the Nazis. No? Am I wrong in that?
I'm fully in agreement with McKenzie that not allowing "wrong" views to be expressed and legitimately debated makes it harder to combat them, not easier. They're not gonna just evaporate because "everyone agrees they're bad" except the people who don't.
I realize this is probably a pretty unpopular view.
OK? But I'm going to think Substack is a hardened Nazi supporter when I all of a sudden don't see Antifa openly talking about their plans for disposing of their Nazi opposition on the platform, which would be appropriate discussion in said situation. I'm also guessing that their coffers are now open to any and all well-known terrorist organizations. Maybe we shouldn't have given corporations any power at all; they have proven time and time again to have absolutely no morals.
And I'm going right back to sleep, so if anyone wants to argue about free speech, I'll give my opinion on that now. I draw the line at helping sick individuals try to organize the genocide of most of the people on this planet. I'm fine with a mentally ill person (Nazi) yelling their propaganda from their soapbox in the town square, but letting and even helping the Nazis openly spread their well-documented genocidal hate is too far for me.
Edit: I'm a little confused about the fast downvotes?
Maybe mentioning that Antifa (you know, the opposite of Nazis) should be equally represented if your platform supports Nazis is considered a bad thing here on Lemmy, but that doesn't make much sense.
Maybe it's just the corporates paving the way for Facebook's infiltration and organized downfall of all their competition?
Maybe it's just the Nazis.
Then again it's probably just me having asd and speaking directly without a filter.
Don't worry though the Nazis have plans for people like me.
I actually prefer this type of hands-off approach. I find it offensive that people would refuse to let me see things because they deem them too "bad" for me to deal with. I find it insulting that anyone would stop me reading how to make meth or reading Mein Kampf. I'm 40 years old and pretty fucking difficult to offend, and to think I'm going to be driven to commit crime just by reading is offensive.
I don't need protecting from speech/information. I'm perfectly capable and confident in my own views to deal with bullshit of all types.
If you're incapable of dealing with it - then don't fucking read it.
Fact is the more you clamp down on stuff like this the more you drive people into the shadows. 4chan and the darkweb become havens of 'victimhood' where they can spout their bullshit and create terrorists. When you prohibit information/speech you give it power.
In high school it was common for everyone to hunt for the Anarchist's/Jolly Roger Cookbook. I imagine there are kids now who see it as a challenge to get hold of it and of terrorist manuals - not because they want to blow shit up, but because it's taboo!
Same with drugs - don't pick and eat that mushroom. Don't burn that plant. Anyone with 0.1% of curiosity will ask "why?" and do it because they want to know why it's prohibited.
Porn is another example. The more you lock it down the more people will thirst for it.
Open it all up to the bright light of day. Show it up for all its naked stupidity.
It all depends on the nature and goals of the platform.
It's one thing to create a platform for positivity and a brave new world (in a good sense). It's good we have those, and this should probably be the approach of mainstream media.
It's another to create a truly free-speech platform. You can't claim free speech and then ideologically ban someone, even if that someone is bad. And you should have such venues - for among nine terrible ideas (like Nazism) lies one that is underappreciated and misunderstood, and wrongly considered to be bad. Feminism was considered to be bad. LGBT people were considered to be bad, etc. And if you start banning some ideas, it gives you carte blanche to ban everyone you don't like, including people who actually promote healthy and positive ideas and values but are misunderstood.
Only leaving the first option means starting a circlejerk where no good new idea has a chance to flourish.
As for Nazis, homophobes, and other people with terrible ideas - you really can't overcome them by just sweeping them under the rug. We need to develop patience and advance our rhetoric to counter them, and to quickly isolate the grain of truth from which all their misconceptions grow, so that the whole structure shatters. That's literally the only way to combat an ideology - by exposing how deeply wrong and flawed it is, and by providing arguments.
Will there be people who follow such ideologies out of spite and emotion, with no reason behind it? Sure. But by standing up against them on an equal footing, we can show the rest how stupid their arguments are, even on a dedicated free-speech platform.
Because Nazism is not bad simply because we decided it to be so. It is a faulty ideology meant to distract people from the real sources of their struggle while expending millions of lives in the process.
It removes critical understanding of economic processes by the masses, fooling them into believing their problems are caused by some nation and not by their own elites - something that is well researched and obvious to almost any modern observer.
In order to keep people resolute amid the deep economic crisis directly caused by applying such an ideology (rampant expropriation, paranoid protectionism, economic mismanagement, and the removal of active economic participants), a Nazi government always needs to wage war - that way it can blame its failures on its enemies. And war inevitably comes with millions of deaths, deaths directly attributable to the ideology because it can't run without them.
Finally, it's an ideology based on hatred of a group defined by an immutable property - it ignores the differences all of us have, tries to attribute a certain trait to an entire nation (something we know isn't true), and then exterminates people based on that demonstrably false association.
No matter how you look at it, by any rational measure, Nazism is just plain stupid, and in its stupidity it produces extreme and unnecessary suffering.
With all that being said, again, I don't think we should make platforms like Substack mainstream and we should moderate general-topic places to exclude Nazis and other harmful actors. But we sure as hell need them to be present.
Because the only thing worse than Nazis allowed to influence us is the tyranny of subjective good.
Good for them. I'm all for allowing people to make their own choices about what kind of content they want to see, instead of a corporation/government deciding for them.
I can't think of a single thing we've successfully gotten rid of by banning it. I can, however, think of several examples where banning has had the opposite effect.