I respect your right to gather support for a cause. I also respectfully disagree with the proposal of censorship on Substack as the best way forward.
In my opinion, squashing speech we disagree with does not make it go away, but instead creates the conditions for that speech to grow more righteous. As a parent, I see this play out between my kids and myself when I attempt to exert control -- it just becomes a righteous battle of my ego vs their ego.
Also reminds me of how the U.S. has handled terrorist groups in other countries. Instead of discovering the underlying needs and providing support there (i.e. a need for food, water, shelter, work, and safety) just demonize the terrorists as "bad" and enforce more and more restrictions, leading to more and more terrorist activity. It is a self-fulfilling prophecy -- we took away their "XYZ" and look at how bad they are being now! See, they were bad all along!
Do any of these restrictions change people or just lead them to gather somewhere else, potentially interpreting censorship as a new head on the hydra they call their enemy to justify pushing harder on their cause and having more fuel for recruitment?
This post by Elle Griffin resonated with me as an alternative view on the matter: https://www.elysian.press/p/substack-writers-for-community-moderation?r=2agnvt&utm_campaign=post&utm_medium=web
"Other social media platforms have actively given reach to an enormous amount of divisive content, and moderation has amounted to private companies deciding who to deplatform based on their own agenda. Facebook has struggled with hate speech and misinformation no matter what it has tried with its moderation policies, and Twitter’s moderators have actively suppressed stories that might sway an upcoming election, among other discrepancies.
There can be no doubt that there is a lot of hateful content on the internet. But Substack has come up with the best solution yet: Giving writers and readers the freedom of speech without surfacing that speech to the masses. In your Substack Inbox, you only receive the newsletters you subscribe to. Whether you’re a reader or a writer, it is unlikely you’ll receive hateful content at all if you don’t follow it."
I share my opinion openly, I won't sign in agreement but I will also not stand in your way. Do what you feel you must do.
I like that distinction, Holly. Who would you trust to determine what speech should be monetizable? Would it be okay with you if Substack charged users for this service (to vet the speech)?
What part of the TOS are you referring to?
https://substack.com/content
"Substack cannot be used to publish content or fund initiatives that incite violence based on protected classes. Offending behavior includes credible threats of physical harm to people based on their race, ethnicity, national origin, religion, sex, gender identity, sexual orientation, age, disability or medical condition."
Since I haven't actually read anything produced by the Nazis you wish to have expelled (not my usual reading regimen), I'll just ask: are there actual incitements to violence on their Substacks?
I suspect it's actionable if they have, and in those specific instances, Substack should be encouraged to take the appropriate measures.
If they haven't, or if they've merely come close to it, as we've seen recently with the Congressional hearings concerning student protests and demonstrations calling for the rape, killing, and genocide of a certain religious demographic, then it's probably not clearly actionable.
As reprehensible and vile as some of these antisemitic screeds can be, content that violates our consciences doesn't necessarily qualify as a TOS violation. I think this is why there's pushback: none of us likes these misanthropes, but we recognize that we must protect certain freedoms and expressions no matter how vile, because once we assign hall monitors, our own voices can be silenced solely at the discretion of some nameless, faceless malcontent who hates dissent.
Hi Matthew, I appreciate your position, and personally, I tend to agree. When platforms moderate content based on political views, it can be dangerous and a slippery slope.
The issue is that Substack is not only allowing content creators who engage in hate speech, but also profiting from their content. This has been going on for years, and Substack's refusal to take action has made many people wary of being part of this platform. Given that many other platforms have drawn the line at hate speech and violence, it would be nice if Substack would address the contradictions between its Terms of Service and reality.
Specifically, at https://substack.com/content they state:
"Substack cannot be used to publish content or fund initiatives that incite violence based on protected classes. Offending behavior includes credible threats of physical harm to people based on their race, ethnicity, national origin, religion, sex, gender identity, sexual orientation, age, disability or medical condition."
What this letter is saying is that some of these Nazi commentators are crossing that line and are not being held accountable. And many of us would like Substack's leadership to explain why this is happening.
Thanks for the thoughtful reply, Jackie. I appreciate the grey area here. In your experience, are these groups actively threatening via hate speech on Substack or is their belief system the threat by implication?
I will admit I don't spend much time reading right-wing propaganda, but from what I have seen (and yes, I have read some of it in the past), it can be both. Even if no violence is overtly expressed, when someone writes blatantly racist, antisemitic, anti-immigrant, transphobic, or homophobic content, it contributes to a culture where violence against those populations is acceptable. We have seen instances, for example, where our former president has made statements against a person or group and someone has then taken those statements as encouragement to take action. We can pretend that doesn't happen, or say "but they didn't actually call for violence," but we all know what these people are doing and why, and what they expect the outcomes to be. It's just like Moms for Liberty campaigning against having trans flags in a school. They aren't REALLY concerned about flags. They are fanning the flames of anti-trans sentiment that often turns violent. And they are okay with that.
Hi Matthew, I respect your take on free speech, and in theory you should be right. But what happens when a certain discourse reaches critical mass and bleeds into mainstream thought? A healthy immune system can keep cancer cells under control. But we all know what happens when those cells get out of control. As someone who grew up in a country that was fervently communist post-WW2, and fervently Nazi pre- and during WW2, I know a thing or two about how the wrong ideology can wreak havoc on the healthy immune system of a society. It doesn't take much to tip the balance.
Hi Claudia,
First to you, I appreciate that you see and feel something from direct experience. I'm so sorry to hear that you have been near to this kind of oppression and fear-based power grabbing. Which country did you grow up in?
I’ve lived in the U.S. and Canada. The friend I’ve known the longest (since Kindergarten) is from Germany. We became friends while being unable to speak the same language.
I too am concerned about things that reach critical mass and bleed into mainstream thought.
And to all in this thread…
I took some time to think about this issue some more and hope I can be granted some space to continue the conversation, with a shift in where I've been speaking from.
First, I want to acknowledge that many people have been personally harmed, or carry a lineage of trauma, due to the atrocities of Nazi-ideological violence. I am not a history buff by any means, but I know that around six million people were murdered in the Nazi genocide. That is traumatic for everyone who witnessed the horrors, everyone who lost family or others they knew, and likely everyone around the world who heard about the genocide from afar. And that trauma is still with us today, unfortunately.
Thank you for sharing the links. I'm not happy that Substack promoted someone who is antisemitic. I scanned one of Richard Hanania's articles and its comments section. It seemed to argue that Jews are only valuable as donors if they can be swayed to a conservative leaning to balance something out. That's a good example of an unhealthy, antisocial mindset (to say the least) in my opinion: blaming a group for your suffering based on ethnicity, religion, or culture and thereby dehumanizing them, which paves the way for violence.
I get it. We have examples of blame for suffering in so many contexts sadly. But the Nazi one has a global collective trauma connected to it, so it’s all the more frightening.
The question of why Hanania was invited onto Substack's podcast is a fair one.
And at the end of this message, I share what some Holocaust survivors felt about defending the free speech of Nazis.
But first, while I know this may read as beating a dead horse, I will take that risk to ask you all about "what censorship is and is not" because it seems like censorship is being treated like a sweater nobody wants to wear.
If I try to define censorship as neutrally as I can, it seems to be "a tool you use when you want to deny a group's public expression (as determined on a case-by-case basis) in order to seek alignment in a society."
Is that a fair definition? Anyone have a better one?
So far, it appears we've split into camps: defending free speech versus not tolerating Nazi-inspired violence. At the same time, it seems we agree that both things matter. I don't hear anyone saying Nazis don't inspire violence; they do.
I know this group is not saying Substack must censor people like Hanania, but where else would this end up going if you got what you wanted? It continues to sound like censorship is the goal even if you are not explicitly asking for it. Am I getting that wrong?
Is there something other than censorship that would satisfy you? Whatever the Substack leaders say, I hope it illuminates the choice to solicit and feature an antisemitic writer (which says a lot about how corporations are often rewarded for sociopathic behavior).
If Substack takes a free speech stance and leaves it at that (which I think is the case so far), I assume that you will not be satisfied. And that’s okay—it’s completely your right to feel and express however you do about this issue. And your right to try to create change in a direction you deem better!
What I am troubled by is that the conversation so far seems to be framed around a single solution:
- Problem: Nazis having a megaphone and monetizing their message via Substack;
- Vision: a world where we don’t have to feel threatened by Nazis;
- Solution: compel Substack to de-monetize/censor them (I would say demonetizing is a form of censorship).
Let me be clear—I agree with that vision!
I would define the problem differently because I think it goes way beyond Substack’s influence. And I would also want to explore possible solutions rather than just one. Did you explore other possible solutions? What were they?
Back to those Holocaust survivors. David Goldberger, the Jewish ACLU lawyer who defended the Nazis' right to peaceful demonstration in the 1970s, was joined by some Holocaust survivors who agreed with his defense.
Goldberger writes, “There were times when, during speeches I gave about the Skokie case, Holocaust survivors courageously stood up to say that I was right to have represented the Nazis. Several years later, another survivor sent me a letter saying the same thing. These survivors said that they did not want the Nazis driven underground by speech-repressive laws or court injunctions. They explained that they wanted to be able to see their enemies in plain sight so they would know who they were.” – The Skokie Case: How I Came To Represent The Free Speech Rights Of Nazis - https://www.aclu.org/issues/free-speech/skokie-case-how-i-came-represent-free-speech-rights-nazis
Did they agree with Goldberger's stance because they saw censorship as a core tool the Nazis used to suppress voices?
I attempted to glean some info using ChatGPT which fed this back to me: “Nazis sought to control cultural and artistic expression to align with their ideology. Artists, writers, and musicians whose work did not conform to Nazi ideals faced censorship, persecution, and exclusion. In Nazi-controlled Germany, strict censorship laws were enacted, allowing authorities to prevent the publication or distribution of materials deemed harmful to the regime. This preventive censorship targeted not only political content but also any form of expression that challenged Nazi ideology.”
That happened, right?
I would love a chance to share my view of the deeper problem. But I’ve already taken up a lot of space here, so I’ll wrap it up.
I would also love to explore alternative solutions based on what has worked elsewhere to address threats of Nazi influence. Is anyone else open to that?
I'm sure you think what you are doing here is noble; however, was it needed on this Substack? Couldn't it have been shared on personal publications, and not on one that claims:
"Fictionistas is a space for fiction writers to get to know each other. A place we can announce virtual meetups and discuss tips and tricks that worked for us. As we grow, we welcome ideas from all fiction writers on Substack on how we can best serve each other as a community."
Should it be corrected to include:
"As we grow, we welcome ideas from all fiction writers (with approved views on monetized speech) on Substack on how we can best serve each other as a community
I enjoy the idea of being exposed to more fiction here on Substack but I can't sign on to censoring Substacks even if I disagree with everything they have to say. We are all adults here.
In closing, I quote a proverb from Jay-Z: "If you don't like my lyrics, you can press fast forward."
Again, we are not calling for censorship. If you read the letter and my comments here, we are calling on Substack to be accountable and to hold all of the publications on this platform to its terms of service. The problem is that a lot of people are oversimplifying this call as "we want censorship."
And why did I post this on Fictionistas? First off, I did share it on Story Cauldron, one of my personal Substacks, as well. But I shared it here for a few reasons.
First, we already had a conversation about allowing a couple of our members who feel passionately about this issue to write a guest post. They declined to do so, but we (those of us running Fictionistas) had agreed to broach this issue with our subscribers.
Second, it is my contention that Substack really does have an issue with hate speech and it is causing real waves on the platform, with some writers already leaving and others not wanting to sign up. That hurts all of us.
Finally, I want Fictionistas and my personal Substacks to be considered "safe spaces" where everyone feels safe and valued. That means that I will call out hate speech wherever I see it, and will unapologetically support any organized calls to deplatform anyone who engages in it.
While I believe in free speech, I don't believe that means anything goes; there should be limits to how far it extends. As I see it, just as free speech should not extend to people who make unsafe statements (the proverbial "Fire!" in a movie theater) or to people calling for violence, it also should not extend to those who make hateful or bigoted comments about anyone else based on ethnicity, religion, country of origin, sexual identity, or gender identity as a means to silence, harm, or hurt people.
I appreciate you taking the time to explain your reasoning, so let me lay out the logic behind my questions. The issues for most of us who oppose this measure, and for me specifically, are these:
1. I don't want other people telling me what I can and cannot read, or what I can and cannot buy with the money I earned through my labor.
2. We as a society have gotten way too comfortable imposing our views on other people.
3. What does holding Substack accountable look like? You say it isn't about censorship, but aren't you implying that Substack should not host this type of content?
4. Content moderation and moderation teams become a game of bad incentives. The people employed to moderate content need to keep narrowing what is acceptable, or they risk showing no value to the company and losing their jobs. If they solve the problem, they are unemployed. When you have a team of hammers, everything eventually looks like a nail.
Additionally, once people know how to get something removed, others will use the same mechanism to remove content they don't agree with, regardless of whether it is true.
5. Removing their monetization or removing their Substacks only proves the point that the "elite" or "globalists" or <insert term> don't want them to share this information, thus giving it the illusion of being true. For example: you aren't going to convert Ben Shapiro listeners to Islam by booting him off the internet.
I respect your opinion (along with that of anyone else who signed on to this) and hope this doesn't come off as malicious. I want to encourage thinking long term rather than doing what feels right in this knee-jerk moment. Throughout history, it has never been "the good guys" censoring information.
What you say is entirely fair and I don't see malicious intent in your comments. I appreciate you taking the time to make them.
To me, holding Substack accountable starts with enforcing their terms of service, and perhaps with not featuring writers who have a history of hate speech on the platform's podcast or otherwise promoting their Substacks. But again, I think it should be generally accepted, regardless of political position, that hate speech frequently leads to violence against others, and if Substack refuses to ban it, they should at the very least not be profiting from it.
I should also note that back in the day I worked for an even larger platform on the Terms of Service team, and I know first-hand that these questions are not easy ones to answer. But having said that, Substack can't just put their head in the sand and pretend it doesn't matter or it's too hard so they won't address it. And if nothing else, they need to either revise their TOS or enforce it across the board and not selectively ignore Substacks that have large followings and theoretically bring in a lot of money.
I believe there's a middle ground here between 1) silencing hate groups and 2) doing nothing.
There are non-oppressive tools that can be employed to get at the core issue of people gathering in support of hate towards others.
We are a creative bunch here. We all have the ability to think outside the box and create something new on a daily basis.
It is reasonable to say to ourselves, yes, "some action" can be taken. If you're feeling that, then bless you for being alive with purpose and meaning.
If we feel called to alleviate suffering, how can we go about that in a way that does not provoke others or perpetuate suffering for a group deemed as unworthy?
Here are some examples of ways to reduce the harm of hate groups that do not involve censorship:
1) Education and Awareness: In response to the ACLU defending the right of Nazis to demonstrate peacefully in the late 1970s, the residents of Skokie, Illinois, created the Illinois Holocaust Museum. The ACLU website says the museum “honors the lives lost in the Holocaust. So, out of the pain and anger generated by the Skokie case arose the perfect answer to the Nazis — a monument to ensure that the damage done by the Holocaust will never be forgotten.” – The Skokie Case: How I Came To Represent The Free Speech Rights Of Nazis - https://www.aclu.org/issues/free-speech/skokie-case-how-i-came-represent-free-speech-rights-nazis
Now some answers from ChatGPT, because I am trying to find approaches that create conditions in which men seeking a tribe, a place to belong, and a sense of power have better options available to them. Just as with terrorist groups -- I believe solutions ought to address their needs as human beings making terrible choices, instead of treating them like scum and pushing them both away and further into their righteousness (i.e., not an actual solution).
Please feel free to elaborate or discount anything that does not feel true below…
2) Community Engagement: Encouraging open dialogue within communities to address concerns, fears, and misconceptions can foster understanding and unity. Building relationships between diverse communities can create a sense of solidarity and make it harder for hate groups to gain traction.
3) Counseling and Exit Programs: Offering counseling and support programs for individuals involved in extremist groups can provide an off-ramp for those seeking to leave such ideologies behind. Programs that facilitate rehabilitation and reintegration into society for individuals who have disengaged from hate groups can be effective in preventing recidivism. https://www.lifeafterhate.org/exitusa-client/
4) Promotion of Human Rights: Emphasizing the importance of human rights and equal treatment for all can be a powerful counter-narrative against supremacist ideologies. Supporting organizations and initiatives that work to protect human rights and combat discrimination reinforces a commitment to the principles of equality and justice.
In other words, build a society around the troll that creates the conditions for trolling to become obsolete. This requires human-to-human respect.
And does that happen overnight? No, of course not. Healing never does.
What do you all make of these alternative solutions?
finally a group that was (loosely, at this point) affiliated with the marketing-heavy push of elle and her bullshit that is actually saying "hey fuck dem nazis" and not just hiding behind a very weak free speech argument on a private platform. good job Jack and Jeff