More than 200 Substack authors asked the platform to explain why it’s “platforming and monetizing Nazis,” and now they have an answer straight from co-founder Hamish McKenzie:

I just want to make it clear that we don’t like Nazis either—we wish no-one held those views. But some people do hold those and other extreme views. Given that, we don’t think that censorship (including through demonetizing publications) makes the problem go away—in fact, it makes it worse.

While McKenzie offers no evidence to back these ideas, this tracks with the company’s long-standing hands-off approach to moderation. In April, Substack CEO Chris Best appeared on the Decoder podcast and refused to answer moderation questions. “We’re not going to get into specific ‘would you or won’t you’ content moderation questions,” Best said when pressed about overt racism being published on the platform. McKenzie later followed up with a statement much like today’s, saying “we don’t like or condone bigotry in any form.”

  • jjjalljs@ttrpg.network · 11 months ago

    Fine.

    > It’s weird to say that you can’t have a newsletter that has a literal swastika on it, because people will be able to read it but unable to realize that what it’s saying is dangerous violence. Apparently we have to have someone “in charge” of making sure only the good stuff is allowed to be published, and keeping away the bad stuff, so people won’t be influenced by bad stuff. This is a weird viewpoint.

    How is “can’t” happening here? It’s not the government. Are you arguing against private entities having editorial freedom? Should private entities not be in charge of their own publications and platforms? And if they do choose to publish nazi stuff, shouldn’t the rest of us be free to say “Fuck off, nazi scum”?

    As I said to someone else, there is presumably a line that’s too much to cross. Is it “live stream of grinding up live babies and puppies and snorting them”? If there is no line, I don’t even know where to begin. That’s a whole other conversation. If there is a line, I think nazi content should be on the far side of it. Don’t you? And if the line is something like “whatever’s technically legal,” well, that’s just punting responsibility to a slower, less responsive ruleset run by the government.

    Personally, I do think that there’s a place for organized opposition to slick internet propaganda which pulls people down the right-wing rabbit hole, because that’s a huge problem right now.

    Platforms taking some responsibility for what they allow would go a long way without requiring a heavy-handed government solution. Substack could just say “nah, we’re not letting nazis post stuff.”

    But if a platform is making a lot of money with nazi content, they’re probably going to be reluctant to deal with it. So if you still don’t want heavy government involvement (which can be a reasonable position, probably), you fall back to individuals saying “Fuck you. I’m not going to use your service while you serve nazis.”

    But then you have several related problems. Something that’s a hugely dominant player in a market is hard to avoid. YouTube doesn’t have a lot of major competitors, for example, and is pretty ubiquitous. AWS is basically impossible to avoid. And on top of that, many people are apathetic or too busy trying to survive to spend a lot of time curating things.

    > I agree with the spirit of it in addition to the letter, though

    What even is the spirit of it?

    • mo_ztt ✅@lemmy.world · 11 months ago

      > Are you arguing against private entities having editorial freedom? Should private entities not be in charge of their own publications and platforms?

      Yes, absolutely. Lemmy.world should be able to ban Nazis if they want to, as should Substack. Personally, I think it would be better in some cases if people didn’t. That said, there’s so much overlap between Nazis and general-toxic-behavior users that I wouldn’t really fault them for banning Nazis outright even if they theoretically supported the Nazis’ right to free speech.

      Notably though, I think Substack should also be free to not ban Nazis, and no one should give them shit for it. In particular, people definitely shouldn’t be talking about trying to get Substack’s Stripe account cancelled, or pressuring its advertisers, as I’ve seen other posters here advocate (although I think the thing about advertisers is just a result of pure confusion on the poster’s part about how Substack even makes income).

      In this particular case, I think allowing the Nazis to speak is the “right answer,” so I definitely don’t advocate for interfering in anything Substack wants to do with their private servers. But no, I also don’t think anyone who doesn’t want to host Nazis should have to, and it’s a pretty good and reasonable question.

      > As I said to someone else, there is presumably a line that’s too much to cross. Is it “live stream of grinding up live babies and puppies and snorting them”? If there is no line, I don’t even know where to begin.

      Let me say it this way: If what you’re doing or saying would be illegal, even if you weren’t a Nazi, it should be illegal. It shouldn’t suddenly become illegal to say if you’re wearing a Nazi uniform. Threatening violence? Illegal. Threatening violence as part of your Nazi political platform? Illegal. Wearing a Nazi uniform, saying that white people are superior and the holocaust didn’t happen? Legal as long as you’re not doing some other illegal thing, even though historically that’s adjacent to clearly-illegal behavior.

      I realize there can be a good faith difference of opinion on that, but you asked me what I thought; that’s what I think. If it’s illegal to wear a Nazi uniform, or platforms kick you off for wearing one, then it can be illegal to wear a BLM shirt, and platforms can kick you off for saying #blacklivesmatter. Neither is acceptable. To me.

      Probably the closest I can come to agreeing with you is on something like Patriot Front. Technically, is it legal to gather up and march around cities in threatening fashion, with the implication that you’ll attack anyone who tries to stop you? Sure. Is it dangerous? Fuck yes. Should it be legal? Um… maybe. I don’t know. Am I happy that people attacked them and chased them out of Philadelphia, even though attacking them was interfering with their free speech? Yes. I put that in a much more dangerous category than someone hosting a web site that says the holocaust didn’t happen.

      > Platforms taking some responsibility for what they allow would go a long way without requiring a heavy-handed government solution. Substack could just say “nah, we’re not letting nazis post stuff.”

      Would it go a long way, though?

      YouTube, Facebook, and Twitter have been trying to take responsibility for antivax stuff and election denialism for years now, banning it in some cases and trying to limit its reach with simple blacklisting. Has that approach worked?

      Nazi stuff is unpopular because it’s abhorrent, and people can see that when they read it. I genuinely don’t think that allowing Nazi speech on Substack is a step towards wider acceptance of Nazism. I don’t think there are all these people who would have become Nazis but are prevented by not being able to read it on Substack. I do think allowing Nazi stuff on Substack would be a step towards exposing the wider community to the actual reality of Nazism, and exposing the Nazis to a community which can openly disagree with them instead of quarantining them in a place where they can only talk to each other.

      I do think responsibility by the platforms is an important thing. I talked about that in terms of combating organized disinformation, which is usually a lot more sophisticated and a lot more subtle than Nazi newsletters. I just don’t think banning the content is a good answer. Also, I suspect that the same people who want the Nazis off Substack also want lots of other non-Nazi content to be “forbidden” in the same way that, e.g., Dave Chappelle or Joe Rogan should be “forbidden” from their chosen platforms. Maybe I’m wrong about that, but that’s part of why I make a big deal about the Nazi content.

      • jjjalljs@ttrpg.network · 11 months ago

        Thank you for the detailed response.

        > Notably though, I think Substack should also be free to not ban Nazis, and no one should give them shit for it.

        Substack can host nazis, given the legal framework in the US. But why shouldn’t I speak up about their platforming of evil? Substack can do what they want, and I can tell them to fuck off. I can tell people who do business with them that I don’t approve, and that I’m not going to do business with them while they’re engaged with this nazi-loving platform. That’s just regular old freedom of speech and association.

        Their speech is not more important than mine. There is no obligation for me to sit in silence when someone else is saying horrible things.

        It feels like you’re arguing for free speech for the platform, but restricted speech for the audience. The platform is free to pick who can post there, but you don’t want the audience to speak back.

        > Let me say it this way: If what you’re doing or saying would be illegal, even if you weren’t a Nazi, it should be illegal. […] I realize there can be a good faith difference of opinion on that, but you asked me what I thought; that’s what I think. If it’s illegal to wear a Nazi uniform, or platforms kick you off for wearing one, then it can be illegal to wear a BLM shirt, and platforms can kick you off for saying #blacklivesmatter. Neither is acceptable. To me.

        You’re conflating laws and government with private action. The bulk of this conversation is about what private organizations can do to moderate their platforms. Legality is only tangentially related. (Also, it doesn’t necessarily follow that banning nazi uniforms would ban BLM t-shirts. Germany has some heavy bans on nazi imagery and, to my knowledge, has not slid enthusiastically down that slope.)

        A web forum I used to frequent banned pro-Trump and pro-ICE posts. The world didn’t end. They didn’t ban BLM. It helps that it was a forum run by people, not by an inscrutable god-machine or a malicious genie.

        I’m also not sure I understood your answer to my question. Is there a line other than “technically legal” that you don’t want crossed? Is the law actually a good arbiter?

        > YouTube, Facebook, and Twitter have been trying to take responsibility for antivax stuff and election denialism for years now, banning it in some cases and trying to limit its reach with simple blacklisting. Has that approach worked?

        I don’t think they’ve actually been trying very hard. They make a lot of money by not doing much. Google is also internally incompetent (see: their many, many canceled projects), Facebook is evil (see: that time they deliberately tried to make users sad just to see if they could), and Twitter has always had a child’s understanding of free speech.

        > I do think responsibility by the platforms is an important thing. I talked about that in terms of combating organized disinformation, which is usually a lot more sophisticated and a lot more subtle than Nazi newsletters. I just don’t think banning the content is a good answer. Also, I suspect that the same people who want the Nazis off Substack also want lots of other non-Nazi content to be “forbidden” in the same way that, e.g., Dave Chappelle or Joe Rogan should be “forbidden” from their chosen platforms. Maybe I’m wrong about that, but that’s part of why I make a big deal about the Nazi content.

        A related problem here is probably the consolidation of platforms. Twitter and Facebook are so big that banning someone from them is a bigger deal than it probably should be. But people are free to move to a more permissive platform if their content is getting them kicked out of popular places. We’re not talking about a nationwide content ban backed by government force.

        I’m not sure what to do about coordinated disinformation. Platforms banning or refusing to host some of it is probably one part of the remedy, though.