The Words We Didn’t Have
The real debt is not in the code. It is in the growing cost of restoring understanding when the system surprises us.
Words and metaphors, especially those that describe the habitat that software teams work within, are my bag. So when I read an article and a paper, “Theory of Troubleshooting: The Developer’s Cognitive Experience of Overcoming Confusion” by Arty Starr and Margaret-Anne Storey, my mind was, suitably, blown.
And when that happens, a story inevitably follows…
The espresso machine at Le Bon Mot hissed, as it sometimes did, in something suspiciously close to Latin.
“Confusio non est defectus,” it whispered to no one in particular.
(Confusion is not a defect.)
Case looked up from her notebook.
“That thing’s getting philosophical again.”
The Librarian polished a glass with the air of someone who had long ago accepted that appliances occasionally developed opinions.
“Or,” she said, “it’s simply keeping up.”
The door opened with a soft chime.
A woman stepped in, carrying the peculiar weight of someone who had been thinking very carefully about something most people preferred not to name.
“Arty,” said the Librarian, as if greeting an old regular.
Case smiled. “Ah. You’re the one giving us words.”
The Arrival of a Language
Arty took a seat by the fire, where Sophie the French bulldog was already stationed like a small, attentive philosopher-queen.
“I’ve been talking to developers,” she said. “Asking them what it feels like when something… doesn’t make sense.”
Case raised an eyebrow. “You mean debugging?”
Arty shook her head gently.
“No. Not debugging. That’s what we call one activity. I mean the moment when your mental model breaks — and you don’t yet know why.”
Sophie tilted her head, which in this café was equivalent to requesting clarification.
Arty continued.
“They all described the same thing. Different words — puzzling, fuzzy, confusing — but the same experience.”
Case leaned forward.
“The moment when you realise you’re wrong… but you don’t know how.”
“Exactly.”
The Librarian set down her cup.
“And what did you call it?”
Arty paused.
“The confusion experience.”
The First Fracture
Case closed her eyes briefly.
“Yes,” she said. “That moment when the world stops agreeing with you.”
Arty nodded.
“It’s not just an emotion. It’s a cognitive shift. Attention changes. Effort increases. The brain starts trying to reconcile reality with expectation.”
“Prediction error,” murmured the Librarian.
“Or,” said Case, “the moment the map tears.”
Sophie barked once. Agreement. Arty smiled.
“And once it begins, everything else follows.”
The Work Beneath the Work
“Developers think they’re writing code,” Arty continued, “but much of their time is spent doing something else…”
Case finished the point: “… trying to understand what the system is actually doing.”
“Yes. Troubleshooting isn’t just fixing bugs. It’s rebuilding a mental model under uncertainty.”
The Librarian nodded.
“A kind of archaeology. But the ruins keep moving.”
“And,” Arty added, “it’s exhausting.”
The Cost of Not Knowing
Case laughed softly.
“Ah. Now we reach the real subject.”
Arty nodded. “When confusion persists, it consumes attention. Working memory. Focus. Developers described it as tiring in a different way.”
The Librarian raised a finger.
“Cognitive fatigue.”
Arty continued, “Yes. And it builds. Not all at once — but over time. Until people start missing things. Skimming. Becoming… blind.”
Case winced.
“The blindness effect.”
Sophie sneezed, which in this café often meant: Yes, obviously.
Arty continued.
“And here’s the part that matters. The system hasn’t just become harder to work on. It has become harder to understand.”
Case sat back.
“Which is far more dangerous.”
The Lie of Debt
The Librarian poured more coffee.
“And yet,” she said, “we still call this ‘technical debt.’”
Arty sighed.
“Yes. And that’s the problem.”
Case smiled thinly.
“Debt implies predictability. A schedule. A repayment plan.”
“But what developers described,” Arty said, “wasn’t controlled. It was loss of control. Systems going off the rails. Work becoming… ungovernable.”
The Librarian nodded.
“A habitat collapsing, not a ledger increasing.”
Case tapped her notebook.
“We’ve been using the wrong metaphor. Again.”
The Habitat Reveals Itself
From somewhere near the back shelves, a book shifted. Or perhaps listened more closely. Case spoke slowly now.
“If the real work is restoring understanding… then the real system is not the code.”
Arty watched her.
“It’s the environment in which understanding can be recovered.”
The Librarian smiled.
“Now you’re speaking the language of habitats.”
Sophie wagged her tail approvingly.
Poking and Seeing
Arty reached into her bag and pulled out a small sketch.
“Developers described how they actually navigate confusion.”
Case leaned in.
“Let me guess. They don’t follow a neat hypothesis loop.”
Arty laughed.
“No. They poke. They run things. They observe. They gather clues.”
“Poking and seeing,” said the Librarian.
“Yes. It’s how they learn. Not by deduction alone, but by interaction.”
Case nodded.
“A system that cannot be safely poked cannot be understood.”
“And therefore,” said the Librarian, “cannot be trusted.”
The Hidden Metric
The fire crackled. Arty’s voice softened.
“There’s something else. As troubleshooting becomes harder… it takes longer. And longer. And longer.”
“Yes,” said Case.
“And at some point,” Arty continued, “the system crosses a threshold. Not where it stops working, but where understanding can no longer be restored easily.”
The Librarian set down her glass.
“A sustainability boundary.”
Arty nodded.
“Troubleshooting time becomes a signal. A leading indicator. Of loss of control.”
Case smiled, but it was not a comfortable smile.
“So the system becomes dangerous before it becomes broken.”
The Djinn at the Door
At this, the door opened again. The Djinn slipped in, confident as ever.
“I hear you’re discussing efficiency,” it said.
“No,” said Case. “We’re discussing understanding.”
The Djinn waved a hand.
“I can generate solutions instantly.”
Arty studied it.
“And when those solutions fail… can you explain them?”
The Djinn hesitated, only slightly.
“I can attempt—”
“Troubleshoot them?” Case interrupted. “Rebuild the mental model? Restore coherence?”
Silence. Sophie stared at the Djinn with quiet, canine scepticism.
The Librarian spoke gently.
“A faster path into confusion is not progress.”
The Final Shape
The espresso machine hissed again.
“Claritatis habitat, non velocitatis,” it offered.
(The habitat of clarity, not of speed.)
Case laughed.
“It’s getting better.”
Arty gathered her notes.
“I didn’t set out to describe habitats,” she said. “I just wanted to give developers words.”
The Librarian nodded.
“And in doing so, you revealed the environment those words describe.”
Case stood.
“The real debt is not in the code,” she said.
“It’s in the growing cost of restoring understanding when the system surprises us.”
Sophie wagged her tail again.
“And a healthy system,” Case continued, “is one where confusion is neither denied nor normalised—”
“—but shortened,” said the Librarian.
“Instrumented,” said Arty.
“And gently brought back into clarity,” Case finished.
The fire settled. The shelves bent, just slightly, as if accommodating the weight of a new idea. And somewhere, quietly, the habitat adjusted itself around the truth it had just learned.
Some lessons from this story
Habitat Thinking asks you to consider five design obligations.
First, treat troubleshooting as normal work, not exception handling. The paper explicitly says developers “drop into troubleshooting mode” constantly during creation, not just during formal bug fixing. That means the habitat must support rapid reorientation, not assume smooth forward motion.
Second, measure restoration of understanding. Not just DORA-style flow, but time-to-clarity, explanation friction, and where developers repeatedly get stuck. That is strongly consonant with the idea of sensing cognitive debt before it becomes visible failure.
Third, design feedback loops for “poking and seeing.” The paper’s language here is unusually useful: developers learn by probing, running, observing, and gathering clues. Habitable systems make this cheap, quick, and safe. Uninhabitable ones make every experiment expensive or ambiguous.
Fourth, externalise intent, not just implementation. The paper notes that difficult bugs were often hard because intent was undocumented. That connects directly to the argument that the environment must remember for the team.
Fifth, judge AI and automation by whether they preserve diagnostic legibility. Faster generation that worsens troubleshooting is not an optimisation; it is deferred habitat damage.
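The second obligation, measuring restoration of understanding, can be made concrete. Below is a minimal sketch of what instrumenting "time-to-clarity" might look like. To be clear, every name here (`ConfusionEpisode`, `HabitatSignal`, the threshold value) is an illustrative assumption of mine, not anything proposed by the paper:

```python
from dataclasses import dataclass, field
from statistics import median

# Hypothetical sketch: these class and method names are illustrative
# assumptions, not part of the Theory of Troubleshooting paper.

@dataclass
class ConfusionEpisode:
    """One drop into troubleshooting mode: from 'this makes no sense' to clarity."""
    area: str            # the part of the system being probed
    started_at: float    # timestamps in seconds (e.g. from time.time())
    resolved_at: float

    @property
    def time_to_clarity(self) -> float:
        return self.resolved_at - self.started_at

@dataclass
class HabitatSignal:
    """Aggregates episodes; surfaces troubleshooting time as a leading indicator."""
    episodes: list = field(default_factory=list)

    def record(self, episode: ConfusionEpisode) -> None:
        self.episodes.append(episode)

    def median_time_to_clarity(self, area: str) -> float:
        durations = [e.time_to_clarity for e in self.episodes if e.area == area]
        return median(durations) if durations else 0.0

    def stuck_points(self, threshold_seconds: float) -> list:
        """Areas where understanding is repeatedly slow to restore."""
        areas = {e.area for e in self.episodes}
        return sorted(a for a in areas
                      if self.median_time_to_clarity(a) > threshold_seconds)

# Usage: three episodes; one area is clearly harder to re-understand.
signal = HabitatSignal()
signal.record(ConfusionEpisode("billing", 0.0, 300.0))    # 5 minutes
signal.record(ConfusionEpisode("billing", 0.0, 7200.0))   # 2 hours
signal.record(ConfusionEpisode("auth", 0.0, 120.0))       # 2 minutes
print(signal.stuck_points(threshold_seconds=600.0))  # → ['billing']
```

The point is not the code but the shift in what gets measured: not how fast work flows forward, but how quickly understanding can be restored when it breaks, and where in the habitat that restoration keeps taking longer.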
📚 Further Reading
On confusion, cognition, and the design of habitable systems
On Confusion, Cognitive Load, and Developer Experience
Theory of Troubleshooting: The Developer’s Cognitive Experience of Overcoming Confusion — Arty Starr & Margaret-Anne Storey
The foundational work behind the story. Establishes confusion as a first-class cognitive experience and reframes troubleshooting as the act of rebuilding mental models under uncertainty.
Idea Flow — Arty Starr
Explores how interruptions and friction degrade a developer’s ability to maintain continuity of thought—an essential companion to understanding confusion as a disruption of flow.
The Neuroscience of Developer Productivity Engineering — Hans Dockter
Frames cognitive fatigue as a core engineering concern. A bridge between intuitive developer frustration and measurable system design concerns.
Other Entries in the Enchiridion
The Fingerpost is Not the Road — On confusing indicators for understanding
On Cognitive Debt and the Care of Systems — On accumulated friction in thinking
Flow is a Property of the System — On environments that enable or destroy progress
On the Unfortunate Temptation to Borrow Certainty — On confusing confidence for truth