Epistemic Responsibility
Our Ability to Break the Spell
I am getting bored with how long it takes me to write “perfect” pieces, so I am going to publish another of my conversations with Claude. Sorry if they are a bit long — but they are really good, in terms of the types of things I am trying to get at.
I have been watching this video about the early years after Hitler rose to power. It is interesting how the Nazis introduced their ideas gradually into German society. As I understand it, one of Hitler’s first actions was a boycott of Jewish businesses — but people did not like this, so he backed off after just one day. For the next few years, he introduced programs to get people on his side, by doing large public works programs that helped with unemployment, and by introducing “Deutsche Volk” type programs to bring people together.
As I understand it, he held his very strong antisemitic beliefs all along — but he was very strategic. After a year or so, he began remilitarizing, flouting the 100,000-troop limit laid out by the Treaty of Versailles, and building up his military to 500,000. He got away with it.
Even here, it is very interesting — he said it was all in the name of peace, and that the military would never be used for aggression. This was clear, specific, deliberate lying.
Around this time, he had Goebbels send out a cadre of spies to listen in on people’s conversations in public, to get a sense of the mood of the people. To their surprise, they found that people were getting very restless because of privations brought about by monies being channeled into military buildup.
Then, in 1935, he introduced the Nuremberg Laws, stripping Jews of their citizenship, forbidding Jews and Gentiles from marrying, and so on. At this point, the Jews were explicitly cast as the scapegoat for Germany’s economic issues.
It is interesting how the Germans were willing to believe this story — and how the story seems to have grown more persuasive over time. I am wondering why, and how this happened.
This is a thoughtful analysis of the Nazi rise to power and their gradual implementation of antisemitic policies. You’ve highlighted several key strategic elements that Hitler and the Nazi party used:
Strategic patience and timing — backing off from immediate harsh measures when they sensed public resistance
Building public support through:
Economic programs addressing unemployment
Cultural programs promoting German unity
Infrastructure projects like the Autobahn
Public works that gave people a sense of progress
Calculated deception about military intentions
Active monitoring of public sentiment through Goebbels’ intelligence gathering
Regarding your question about why Germans became increasingly receptive to antisemitic propaganda, several factors contributed:
First, the Nazi regime gained credibility through their apparent economic successes. When people saw their lives improving (even if partly through unsustainable military spending), they became more willing to accept the regime’s other messages.
Second, the Nazis were sophisticated in their use of propaganda, gradually introducing antisemitic ideas while controlling all forms of media and public discourse. They particularly targeted youth through education and programs like the Hitler Youth.
Third, they exploited existing antisemitic prejudices that had deep historical roots in Germany and Europe. Rather than creating these beliefs from scratch, they amplified and weaponized existing biases.
Fourth, the regime created a climate of fear where dissent became increasingly dangerous, leading many to either actively support or passively accept Nazi policies even if they personally disagreed.
What’s particularly striking about your description is how the regime balanced coercion with persuasion — they didn’t rely solely on force, but rather built a narrative that many Germans found compelling, even while they were systematically stripping away civil rights and preparing for war. This combination of apparent economic success, sophisticated propaganda, exploitation of existing prejudices, and gradual escalation of antisemitic policies proved devastatingly effective.
Yes — I wonder about some historical comparisons we might find, where people were persuaded to take up such extreme, “othering” behaviors.
Throughout history, there have been several common patterns in how groups are demonized and made into societal scapegoats:
One striking parallel is the treatment of the Tutsi minority in Rwanda in the lead-up to the 1994 genocide. Like the Nazi propaganda against Jews, Hutu extremists used radio broadcasts and newspapers to gradually portray Tutsis as “cockroaches” and inherently untrustworthy. They blamed Tutsis for economic problems and painted them as a threat to Hutu survival. Similar to Nazi Germany, this built on pre-existing social tensions but dramatically escalated them.
In Cambodia under the Khmer Rouge, urban dwellers and educated professionals were branded as “new people” who were corrupting society. Like the Nazi targeting of Jews as supposedly controlling banks and media, the Khmer Rouge portrayed city dwellers and intellectuals as exploiters of rural peasants. This demonization helped justify their eventual expulsion from cities and mass killings.
The Soviet Union’s “dekulakization” campaign against supposedly wealthy peasants (“kulaks”) shows similar patterns. Like Jews in Nazi Germany, kulaks were blamed for economic hardships and portrayed as parasites exploiting honest workers. This demonization made it easier for people to accept their deportation and death.
Common threads across these cases include:
Portraying the targeted group as both powerful and dangerous yet also inferior
Blaming them for society’s problems, especially economic ones
Using controlled media to repeat dehumanizing messages
Building on existing prejudices but dramatically intensifying them
Creating an “us vs them” mentality where the targeted group is seen as an existential threat
Making violence against them seem not just acceptable but necessary for society’s survival
The process often happens gradually, with rhetoric becoming increasingly extreme over time as people become desensitized to earlier forms of discrimination. What’s particularly chilling is how these campaigns often succeed in getting ordinary people to accept or even participate in actions they would have found unthinkable just years earlier.
Yes. In recent years, the phrase “moving the Overton Window” has been used to refer to this sense of changing what is acceptable — where things that were once too shocking to be allowed become normalized, making room for the next level of shock to emerge.
Yes, the Overton Window concept really captures this gradual shifting of what’s considered acceptable public discourse and behavior. The term comes from Joseph Overton, who developed it to describe how ideas move from “unthinkable” to “radical” to “acceptable” to “sensible” to “popular” and finally to “policy.”
What’s particularly insightful about your connection here is how totalitarian regimes actively and deliberately manipulate this window. In Nazi Germany, for instance, we can trace a clear progression:
First, casual antisemitic rhetoric became more acceptable in public discourse
Then, economic discrimination against Jews became policy
Next, legal discrimination through the Nuremberg Laws
Then, physical segregation in ghettos
Finally, mass deportation and genocide
Each step made the next one seem less extreme by comparison. The window of what was “thinkable” kept shifting.
This pattern isn’t just historical. Modern scholars and journalists have noted similar tactics being used by various groups today — starting with seemingly mild positions or “just asking questions,” then gradually introducing more extreme ideas once earlier positions have been normalized.
What makes this especially concerning is how social media and modern communications can accelerate this process. Ideas that might have taken years to normalize in the past can now spread and become “acceptable” much more quickly through coordinated online campaigns.
In the extremes you have listed — and in the Chinese Cultural Revolution as well — people behaved in ways that are very shocking, and it is upsetting to consider that we are capable of them. It is upsetting to think how “normal” people — as a populace — can go that far.
Yes. This is one of the most disturbing aspects of studying these events: recognizing that the perpetrators weren’t all psychopaths or naturally violent people, but ordinary humans who gradually accepted and participated in horrific acts.
The Chinese Cultural Revolution is a particularly powerful example. Neighbors denounced neighbors. Students turned against their teachers. Children accused their own parents. The breakdown of normal human bonds and empathy happened on a massive scale.
Hannah Arendt’s concept of “the banality of evil,” which she developed while observing Adolf Eichmann’s trial, speaks to this. She noted how many perpetrators of atrocities weren’t obviously monstrous people, but rather ordinary bureaucrats and citizens who had gradually lost or suspended their moral compass through a combination of factors:
Social pressure to conform
Fear of becoming a target themselves
The gradual normalization of cruelty
Breaking down complex moral choices into seemingly routine administrative tasks
Dehumanization of victims making it easier to rationalize actions
Belief in authority and “just following orders”
Christopher Browning’s book “Ordinary Men” examines how a unit of middle-aged German police reservists — not hardcore Nazis — became mass murderers during the Holocaust. Many were initially hesitant or sickened by killing, but over time they adapted and rationalized their actions.
What makes this particularly unsettling is the implication that these capacities for both good and evil exist within most humans, depending on circumstances and social pressures. It raises uncomfortable questions about what any of us might be capable of under certain conditions — which is precisely why studying these events and understanding how they unfold is so crucial for preventing their recurrence.
The Stanley Milgram experiments point to this as well. I heard that he first wanted to do the experiments in Germany, to understand how Germans could have done such a thing — but he could not get permission, so he did them in the U.S.
Yes, that’s exactly right about Milgram’s initial motivation. He wanted to understand if there was something specific to German culture that made people so willing to follow authority to the point of harming others. What he found in New Haven, Connecticut was deeply unsettling — about 65% of participants were willing to deliver what they believed were potentially lethal electric shocks to another person simply because an authority figure in a lab coat told them to continue.
The experiment revealed how readily people will defer to perceived authority and surrender their moral judgment. Participants would often express distress and reluctance, but would continue when the experimenter calmly insisted “the experiment must continue.”
What’s particularly striking is that these were ordinary Americans — teachers, engineers, office workers — not people predisposed to cruelty. Many were deeply troubled by what they were doing but felt unable to refuse the experimenter’s commands.
The recent replication of Milgram’s experiment by the BBC found similar compliance rates, suggesting this tendency isn’t limited to any particular time or culture. This aligns with what you noted earlier about “normal” people’s capacity for harmful actions under certain conditions.
The most chilling aspect may be that participants weren’t acting out of fear or under threat — just from a calm authority figure saying “please continue.” This suggests how vulnerable we are to authority even in relatively mild circumstances, let alone under the intense social and political pressures present in Nazi Germany or during the Cultural Revolution.
This reminds me of another study where everyone was a “plant” except one person, who was pressured not to believe his own eyes.
Yes, that’s the Asch Conformity Experiments from the 1950s! Solomon Asch showed how social pressure could make people deny even their own direct sensory evidence. The specific task involved matching line lengths, where participants had to say which of three lines matched a reference line. The answer was usually obvious, but when all the confederates (the “plants” you mentioned) unanimously gave the wrong answer, many participants went along with the group rather than trust their own eyes.
Participants conformed to the incorrect group consensus on about a third of the critical trials, and roughly three quarters conformed at least once. What’s fascinating is that in post-experiment interviews, participants often knew the group was wrong but went along anyway. Some even began to doubt their own perception, wondering if maybe they were the ones who were mistaken.
This connects powerfully to what we were discussing about Nazi Germany and other totalitarian situations. When everyone around you seems to accept something — even something that should obviously be wrong — there’s enormous pressure to conform. Just as Asch’s participants denied the evidence of their own eyes, people living under totalitarian regimes often suppress their own moral intuitions when those conflict with the group consensus.
The really interesting part is that Asch found that having just one other person disagree with the majority dramatically reduced conformity. This suggests how crucial dissenting voices are in maintaining independent thought and resistance to harmful group dynamics. It’s probably one reason why totalitarian regimes work so hard to silence any opposition — they understand that even a few visible dissenters could break the spell of forced consensus.
“Asch found that having just one other person disagree with the majority dramatically reduced conformity.” — This is very heartening. I think this is why I feel it is significant to help people to think more clearly, more deeply. I have a sense that a potent “speech act” that calls forth in people greater awareness — greater critical thought — can, as it were, “wake us from a spell.”
Your observation about the power of “waking people from a spell” through critical thought and clear speech is profound. It connects to what philosophers sometimes call “moments of recognition” — when someone’s perspective suddenly shifts and they see things in a new light.
Hannah Arendt wrote about how totalitarianism relies on making people stop thinking deeply — replacing genuine thought with slogans and ideology. So in this light, helping people think more critically isn’t just an intellectual exercise — it’s a form of resistance against dehumanization and mass manipulation.
The power of even one dissenting voice that you’re pointing to reminds me of the story of the White Rose resistance movement in Nazi Germany. A small group of students, led by Sophie Scholl and her brother Hans, began distributing leaflets that simply called on Germans to think critically about what was happening around them. While they were ultimately executed for this, their act of speaking truth — of being that “one dissenting voice” — had ripple effects that continued long after their deaths.
Your phrase “speech act that calls forth greater awareness” is particularly apt. It suggests that certain ways of speaking can break through the fog of propaganda and conformity not by telling people what to think, but by awakening their own capacity for critical thought and moral judgment.
This connects back to the Asch experiments — sometimes all people need is permission to trust their own judgment, to know they’re not alone in their doubts or moral qualms. One clear voice can break the illusion of unanimous consensus.
This is interesting — especially in light of the Asch experiments — that it was not that they could not see reality, but that they felt too uncomfortable to speak up (or to trust their own perceptions). This is heartening. Even that one person could help “break the spell.” When we appeal to people’s ability to reason — to make decisions — we do epistemic good, by encouraging a space of meaning-making. When we overgeneralize (such as by appealing to conspiracy thinking or condemning “the media,” for example), we do the opposite — we do epistemic harm, encouraging blanket assumptions, and turning people’s critical faculties off even more.
Your contrast between epistemic good and epistemic harm is really insightful. It highlights how the quality of our public discourse isn’t just about what positions we take, but about how we engage with thinking itself.
When we encourage people to reason things through — to look at evidence, to question assumptions, to consider nuance — we’re not just helping them reach better conclusions. We’re actually strengthening their capacity for independent thought. Like exercising a muscle, this makes it easier for them to resist future pressures to conform to harmful group thinking.
Your point about overgeneralization and conspiracy thinking is particularly astute. When people say things like “you can’t trust any mainstream media” or “all politicians are corrupt,” they’re actually encouraging a kind of learned helplessness about truth-seeking. Rather than strengthening people’s critical faculties, these blanket dismissals teach people to stop trying to distinguish truth from falsehood.
This connects powerfully to what you said about “breaking the spell.” The Asch experiments suggest that most people can see reality clearly — they just need support to trust their own judgment. When we encourage careful reasoning rather than blanket assumptions, we’re providing that support. We’re saying “your capacity to think and judge matters.”
This makes me think differently about intellectual responsibility. Every time we engage in public discourse, we’re either strengthening or weakening society’s epistemic immune system — its collective capacity to resist manipulation and think clearly. That’s a profound responsibility.
Note about the photo: That is Sophie Scholl, one of the founders of the White Rose. I put it there because it is deeply inspiring and moving.