In yesterday's essay on 'Parallel Realities,' I explored how algorithmic sorting and information manipulation have created fragmented realities where we no longer share a common understanding of facts. Today, I'll dig into the flip side: why our minds cling to beliefs despite evidence of manipulation.
A saying dubiously attributed to Mark Twain puts it: "It's easier to fool people than to convince them they've been fooled." Whether or not he ever said it, the line captures a psychological reality - once we've committed to a belief, abandoning it can feel nearly impossible.
In my previous work, I've explored how our information landscape is systematically engineered through algorithmic division (‘Engineering Reality’), institutional narratives (‘Reading Between the Lies’), and the reflexive dismissal of pattern recognition (‘That Can't Be True’). But understanding these external systems is only half the equation. The other half lies within us – the psychological mechanisms that make us resistant to changing our minds even when confronted with overwhelming evidence.
Why Being Wrong Hurts
I was talking with a friend recently about historical events that don't add up. When I suggested he look at some evidence questioning the official 9/11 narrative, he shut down immediately - not because he's unintelligent or incurious, but because "he lost a friend that day." His emotional connection to the event has created a psychological fortress that no evidence can penetrate. Similarly, many who zealously defended COVID policies now acknowledge "mistakes were made" but insist "experts had good intentions." This isn't reckoning; it's rationalization.
An Atlantic article I just came across titled ‘Why the COVID Reckoning Is So One-Sided’ (non-paywalled version) perfectly illustrates this psychological resistance to changing beliefs. Jonathan Chait, the author, smugly criticizes conservatives while demonstrating the very cognitive blindness I'm describing - brushing off liberal "mistakes" as mere good-faith errors rather than the systematic failures that devastated lives. Nowhere does he acknowledge the weaponized censorship that crushed dissent.
This connects directly to what I discussed yesterday about parallel realities. My own experience illustrates this divide - when I spoke against mandates, many in my personal and professional circles couldn't defend their positions with science or logic. Rather than engage, they simply stopped communicating with me. Now we exist in the separate timelines I described yesterday. I'm not bitter - just genuinely confused by how easily human connections fractured when beliefs were challenged. I have forgiveness in my heart, but I won't forget how quickly people revealed their true priorities when social conformity conflicted with open inquiry.
These reactions reveal something profound about human psychology: admitting we've been manipulated isn't merely a matter of processing new information. It requires confronting the possibility that our fundamental understanding of reality – and perhaps our very identity – was built on falsehood.
The Cost of Admission
Consider the mRNA vaccines. For parents who rushed to get their children vaccinated, or doctors who enthusiastically promoted them to patients, acknowledging potential harms isn't simply a matter of updating their risk assessment. It would mean confronting the unbearable possibility that they may have harmed those they love most.
Healthcare workers were prioritized for vaccination, locking them into the narrative early. Once you've taken the shot and pushed it on patients, your identity - professional judgment, ethics, self-image as a healer - hinges on its safety. The cost of admitting error becomes psychologically prohibitive.
The cost becomes devastatingly personal. Several friends now take their children to cardiologists for issues that developed after vaccination. Only one has privately confided that he believes the shots caused his child's condition. For the others, acknowledging this possibility would mean confronting an unbearable guilt - that they may have harmed their child by following what they believed was responsible medical advice.
This explains why some of the most dedicated defenders of these interventions are often healthcare providers who administered them. As psychologist Leon Festinger and his colleagues demonstrated in their landmark 1956 book ‘When Prophecy Fails,’ when evidence contradicts a core belief, many people don't abandon the belief – they double down on it while dismissing the evidence.
Identity's Trap
The same psychological mechanisms operate in the transgender youth debate. Parents who've supported their child's medical transition face an insurmountable psychological barrier to reconsidering, regardless of emerging evidence about risks or regret rates.
As I observe friends navigating this terrain with their own children, I'm struck by the parallels to other forms of belief entrenchment: reconsidering risks admitting a potential catastrophe.
The more public their support, the higher the stakes. Once you've proudly announced your child's transition on social media, testified before school boards about the importance of gender-affirming care, or been celebrated as a "model supportive parent," the identity trap snaps shut. Changing your mind isn't merely adjusting to new information – it's a form of social and psychological suicide.
The social contagion aspect is striking. A friend recently shared that in their child's 9th grade class at an elite NYC private school, nearly 50% of the girls now identify as something other than female. In my day, many of those girls would've just painted their nails black and said they were goth.
Stepping back from the current narrative, what I believe we're witnessing is an ephemeral fad rather than a revelation about human nature that somehow remained hidden throughout history until now. This perspective isn't denying anyone's experience - it's simply placing it in the context of how teenagers have always navigated the turbulent waters of self-discovery.
What breaks my heart is watching these kids navigate adolescence - already difficult enough - with genuine pain and confusion. Their struggles deserve to be taken seriously. But I worry that instead of helping them explore identity in ways that preserve future options, we've rushed to medicalize what may be normal developmental phases, often leading to irreversible interventions before they've fully developed their sense of self.
Of course gender dysphoria exists, and those experiencing it deserve not just compassion and dignity, but our unwavering support. My concern isn't about affirming identities - it's about the timing and permanence of medical decisions. We don't let children get tattoos, join the military, or make other life-altering choices precisely because we understand developmental psychology. Yet in this one area, thoughtful caution is labeled as hatred, making meaningful conversations nearly impossible.
When Authority Blinds
Beyond identity, our trust in authority deepens the bind - think Milgram's experiments at Yale in the 1960s that revealed humanity's disturbing tendency to obey authority even when doing so violates our own moral compass. Participants continued administering what they believed were painful electric shocks because a man in a lab coat assured them it was necessary.
Our modern parallel is striking: well-educated professionals suspended their judgment and ethical concerns because public health officials in positions of authority assured them that unprecedented measures were necessary. When experts recommended policies with no historical precedent or supporting evidence, many educated people reflexively complied – not from careful evaluation, but from deference to authority.
"Trust the Science™" became the modern equivalent of Milgram's "The experiment must continue" – a thought-terminating phrase designed to override individual judgment. This deference wasn't a sign of scientific understanding but its opposite – the substitution of authority for evidence.
The Status Shield
In "The Illusion of Expertise," I explored how our professional class often mistakes credentials for wisdom. This dynamic creates another barrier to changing beliefs: status protection.
For many educated professionals, their social standing depends on being seen as informed and rational. Admitting they were fundamentally wrong about important matters threatens not just their beliefs but their status. If you've built your identity around being "evidence-based" or "following the science," acknowledging you were misled challenges your core self-concept.
This explains the vehemence with which many defended increasingly incoherent COVID policies. Their fierce attachment wasn't to the policies themselves but to their self-image as rational followers of expert guidance. Changing their position wasn't merely a factual update – it meant losing face.
How Our Brains Fight Truth
Research in cognitive neuroscience suggests a compelling insight: our brains process challenges to core beliefs similarly to how they process threats. When presented with evidence contradicting deeply held views, people often experience a physiological stress response—not just intellectual disagreement. Our neural circuitry seems designed to protect our worldview almost as vigilantly as our physical safety.
This explains why presenting facts rarely changes minds on emotionally charged issues. When someone responds to contrary evidence with anger or dismissal, they're not being stubborn – they're experiencing a neurological threat response.
Our brains evolved to prioritize social acceptance over objective truth – a survival advantage in tribal settings where rejection could mean death. This creates a fundamental vulnerability: we're wired to conform to our social group's beliefs even when evidence suggests they're wrong.
So how do we overcome such primal wiring?
Breaking the Spell
If human psychology creates such powerful resistance to changing beliefs, how can we ever hope to break through? The first step is compassion – understanding that these mechanisms aren't signs of stupidity but of being human.
When someone refuses to acknowledge even overwhelming evidence that contradicts their beliefs, they're not necessarily being dishonest or irrational. They're protecting themselves from psychological harm that feels as real as physical danger.
Breaking through these barriers calls for:
Creating Safe Spaces for Doubt: People need environments where questioning doesn't mean immediate rejection. The more socially costly it is to express doubt, the more entrenched beliefs become.
Preserving Dignity: Change becomes possible when people can save face. This means focusing on systems rather than personal failings, allowing people to update beliefs without feeling like fools.
Building Trust Through Shared Values: Before challenging someone's beliefs, establish common ground. People are more receptive to difficult truths from those they perceive as sharing their fundamental values.
Patience With Process: Belief change typically happens gradually, not in dramatic conversions. A person might privately question long before publicly shifting position.
Leading With Questions, Not Assertions: The Socratic method remains powerful – questions that prompt reflection often succeed where direct challenges fail.
What’s at Stake
This isn't merely about winning political arguments or being proven right. The psychological mechanisms that prevent us from updating false beliefs create vulnerabilities that extend to every aspect of society.
A population unable to acknowledge manipulation becomes increasingly susceptible to it. When we can't admit we were wrong about Iraq's WMDs, we're vulnerable to the next war narrative. When we can't reconsider lockdown harms, we're primed for the next emergency response. When we can't question pharmaceutical industry influence, we're defenseless against the next profitable intervention.
As I explored in "Empty Gestures," this psychological resistance enables some of our darkest moments – periods when otherwise good people participate in persecution because acknowledging the truth would mean confronting their own complicity.
The Compartmentalized Mind
The most profound barrier to changing beliefs may be what I described yesterday as the fragmentation of mind - our ability to compartmentalize information so effectively that contradictions can coexist without creating the dissonance that might prompt reconsideration.
Real growth requires what I called "the undivided mind" – the capacity to hold complexity without retreating into simplistic narratives, to recognize patterns without succumbing to paranoia, to maintain principles without demonizing those who disagree.
This integration isn't just intellectual but emotional – learning to tolerate the discomfort of uncertainty and the pain of admitting error. It's a form of psychological maturity that our current information environment actively discourages.
The Courage to Reconsider
The question isn't whether you or I have been manipulated – we all have been, in various ways. I've certainly fallen for narratives that later proved false, and had to face the uncomfortable process of reconsidering deeply held positions. The difference lies not in our immunity to deception, but in our willingness to acknowledge it when evidence emerges. A sign of true intelligence isn't the credentials one holds or even the knowledge one possesses, but the willingness to reconsider viewpoints when new information comes to light.
Those who seem most resistant to changing their minds often have the most invested in the status quo – whether professionally, socially, or psychologically. Their resistance isn't evidence of inferior intelligence but of deeper investment in the systems that shaped their success.
Meanwhile, those with less to lose from system change – the working class, the marginalized, those who've witnessed systemic failure firsthand – often display a more grounded skepticism toward institutional narratives.
Understanding these psychological barriers doesn't mean abandoning the pursuit of truth. Instead, it means approaching it with greater compassion, recognizing that behind every fierce defense of a false narrative lies a very human fear of what changing one's mind might cost.
We've all been manipulated - it's universal. The difference lies in owning it. A society that can't grows ever more vulnerable. Truth requires not just better systems but self-awareness - a culture that treats reconsideration as an act of courage, not defeat.