On The Unfortunate Temptation To Avoid Falsification
On building software development habitats where being wrong is not a social risk but a professional obligation
We fail to seek falsification not because we are careless with truth,
but because truth demands behaviour that runs against the grain of being human.
We complain that LLMs are merely coherence engines, but is that so different from us? One of the foundational tenets of the scientific method is that we should seek falsification, and it is one of the most difficult aspects of the discipline for us software engineers because it asks us to think in ways our brains did not evolve to support.
Applying the scientific method does not merely ask us to observe, reason, or hypothesise. It asks us to behave against several deep psychological currents at once: currents shaped by evolution to preserve coherence, status, and momentum rather than epistemic hygiene.
To be clear, we are not broken thinkers. In software engineering, more often than not we are well-adapted humans operating in environments that reward speed, certainty, and narrative closure. Falsification asks us to suspend those rewards and tolerate discomfort in service of long-term clarity. It asks a lot of us, and in this entry we’ll explore why.
The Mind Seeks Sense, Not Disproof
Humans are meaning engines. Once a pattern clicks into place, the mind experiences relief. A hypothesis that “works” closes a circuit. To reopen it feels like damage.
This is confirmation bias, first systematised in psychology by Peter Wason and later popularised through Daniel Kahneman’s work on cognitive bias. Once we hold a hypothesis, we preferentially:
Notice evidence that fits it
Explain away anomalies as noise
Stop searching once a story feels “good enough”
Falsification demands something alien. It asks us to keep a wound open longer than feels comfortable.
Disconfirmation Is Psychologically Painful
Being wrong is not neutral.
Contradictory evidence triggers cognitive dissonance, a term coined by Leon Festinger. When belief and evidence collide:
The mind experiences tension
The fastest relief comes from reinterpretation, not revision
Beliefs are unconsciously defended to restore equilibrium
Falsification is therefore emotionally expensive. It requires us to tolerate unresolved discomfort longer than our nervous system prefers.
Hypotheses Become Identities
Once a belief becomes mine, attacking it feels like attacking me. This entanglement intensifies when:
The hypothesis is public
It is tied to expertise, authorship, or reputation
Others have invested trust in the conclusion
In organisational and software contexts, hypotheses quietly harden into positions. At that point, falsification threatens:
Status
Credibility
Narrative ownership
Experiments persist, but inquiry can degrade into theatre.
Groups Punish Disproof More Than Error
Socially, being confidently wrong often costs less than being visibly uncertain. Groups tend to reward:
Coherence over accuracy
Confidence over curiosity
Progress over pause
Falsification slows things down. It reopens decisions others may want closed. Teams respond unconsciously by designing experiments that are:
Vague
Underpowered
Interpretable either way
Such an experiment can succeed socially while it quietly fails epistemically.
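One concrete guard against the “underpowered” failure mode is to compute, before collecting any data, how many observations the experiment needs to detect the effect it claims to care about. A minimal sketch, assuming a hypothetical two-arm conversion experiment; every number here is an invented placeholder:

```python
# A minimal power-check sketch (illustrative numbers, hypothetical
# experiment): how many users per arm does a two-arm conversion test
# need before "no difference" actually means something?
from scipy.stats import norm

baseline = 0.10          # assumed current conversion rate
lift = 0.02              # smallest lift we would act on
alpha, power = 0.05, 0.80

p1, p2 = baseline, baseline + lift
z_alpha = norm.ppf(1 - alpha / 2)   # two-sided significance threshold
z_beta = norm.ppf(power)            # desired power

# Standard sample-size formula for comparing two proportions.
n_per_arm = ((z_alpha + z_beta) ** 2
             * (p1 * (1 - p1) + p2 * (1 - p2))) / (p2 - p1) ** 2

print(f"~{n_per_arm:.0f} users per arm needed.")  # roughly 3,800 here
# Running with far fewer means a null result is uninformative:
# the experiment was designed to be interpretable either way.
```

The value is less in the formula than in the commitment: the sample size becomes a falsification budget, agreed before the data can tempt anyone.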
We Reason in Stories, Not Statistics
We are storytellers before we are scientists. Human intuition is built around:
Causal narratives
Exemplars
Vivid anecdotes
Falsification often demands:
Abstract statistics
Counterfactual reasoning
Imagining worlds that did not occur
These modes are cognitively costly and emotionally thin. A single compelling success story can outweigh ten quiet disconfirmations.
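To see how badly narrative accounting miscounts, here is a toy Bayesian tally. The likelihoods are invented for illustration; the point is only the arithmetic of one vivid success against ten quiet disconfirmations:

```python
# Toy Bayesian tally (all likelihoods are invented for illustration):
# one vivid success versus ten quiet disconfirmations.
p_success_given_h, p_success_given_not_h = 0.8, 0.4   # success mildly favours H
p_disconf_given_h, p_disconf_given_not_h = 0.1, 0.3   # each disconfirmation counts against H

prior_odds = 1.0  # start agnostic between H and not-H
posterior_odds = (prior_odds
                  * (p_success_given_h / p_success_given_not_h)         # one success
                  * (p_disconf_given_h / p_disconf_given_not_h) ** 10)  # ten disconfirmations
print(f"Posterior odds for the hypothesis: {posterior_odds:.6f}")
# ~0.000034: roughly 30,000 to 1 against, yet the success story is what we remember.
```

The arithmetic declares the hypothesis nearly dead; the anecdote keeps it alive.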
Success Feels Informative Even When It Isn’t
Success bias is among the most subtle traps. When something works, we infer:
The hypothesis was correct
The mechanism is understood
The model is validated
But success is ambiguous. Many hypotheses predict success. Few predict specific failure modes. Falsification asks a more valuable question:
Under what conditions should this break?
That question almost never arises spontaneously; it must be cultivated.
Falsification Violates Our Sense of Progress
Modern work cultures worship momentum. Falsification looks like:
Backtracking
Slowness
Indecision
Epistemically, it is progress in disguise. It collapses possibility space. It removes paths we no longer need to explore. Psychologically, pruning feels like loss, not gain.
Popper Was Right and Unnatural
Karl Popper argued that science advances through bold conjectures and ruthless attempts at refutation. The catch is simple. Popper described an ideal method, not a natural instinct.
Falsification requires:
Ego discipline
Emotional regulation
Structural support
Cultural permission to be wrong
Absent these, even the most sincere scientists revert to confirmation.
When making scientific decisions as software engineers, we do not avoid falsification because we dislike truth. We avoid it because falsification asks us to:
Feel discomfort without rushing to relief
Separate identity from belief
Choose long-term clarity over short-term coherence
Trade social smoothness for epistemic integrity
That is not a thinking problem. It is a human problem.
Which is why the environments we design—experiments, teams, platforms, incentives—matter as much as individual virtue.
Some practices to consider
Design experiments that specify how failure should look (see the sketch after this list)
Separate hypothesis authorship from evaluation
Reward epistemic losses, not just delivery wins
Treat ambiguity as a crucial phase, not a flaw
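As one way to practise the first two items, a hypothesis record can be made to declare its own failure conditions before any data exists, and the verdict can come from code that never learns who wrote the claim. A hypothetical sketch; every name and threshold below is invented:

```python
# Hypothetical sketch: a hypothesis that must state its own
# falsification criteria up front, evaluated by code that
# knows nothing about its author.
from dataclasses import dataclass

@dataclass(frozen=True)
class Hypothesis:
    claim: str
    metric: str
    success_threshold: float   # a result at or above this supports the claim
    failure_threshold: float   # a result at or below this refutes it
    # anything in between is declared inconclusive, not spun as success

def evaluate(h: Hypothesis, observed: float) -> str:
    """Verdict depends only on pre-registered thresholds, never on authorship."""
    if observed <= h.failure_threshold:
        return "refuted"
    if observed >= h.success_threshold:
        return "supported"
    return "inconclusive"

h = Hypothesis(
    claim="Caching layer cuts p95 latency",
    metric="p95_latency_reduction_pct",
    success_threshold=20.0,
    failure_threshold=5.0,
)
print(evaluate(h, observed=11.0))  # -> "inconclusive", and it must stay that way
```

The deliberate “inconclusive” verdict matters: it denies the team the option of reading an ambiguous result as a win.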
Some things to avoid
Treating success as validation
Confusing motion with learning
Performing experiments that cannot disappoint
Mistaking psychological safety for epistemic safety
Falsification is not just a refinement of thinking; it is a re-training of the self. It asks us to slow our reflex to explain, to sit longer with unease, and to treat the collapse of a belief not as failure but as evidence that reality has spoken: progress in an uncomfortable form.
The discomfort is the signal. Where certainty comes cheaply, knowledge usually does not. What feels like loss in the moment is often the quiet gain of having removed an entire class of errors from our future thinking.
This is why falsification cannot be left to personal heroics. It must be designed for. Cultures, teams, and systems must make it easier to be corrected than to be consistent, safer to disprove than to defend, and more honourable to retire a belief than to protect it.
Truth does not advance by confidence alone. It advances when we build habitats in which being wrong is not a social risk, but a professional obligation.
Non veritatem fugimus; dolorem eius vitamus.
We do not flee the truth. We flee its discomfort.


