The Debts That Grow in Silence
AI accelerates not just code but the fragile ecology of understanding and intent
In homage to this excellent talk and this research. If you’ve worked with the Djinn long enough to feel the pull of its speed, you will have felt (or will soon feel) the debts that come with it. The debts that grow when the factory takes flight.
The Librarian said the trouble with factories is not that they produce things.
She said it while polishing a glass that had never held anything stronger than espresso, which, in Le Bon Mot, counted as a kind of theological statement.
“The trouble,” she continued, “is that they forget what grows.”
Case did not look up immediately. She had a printout spread across the table—creased, annotated, restless in the way papers become when they have been handled too much by someone trying to understand them faster than they are willing to be understood.
On the cover, in a font that tried to be neutral and failed, were the words:
From Technical Debt to Cognitive and Intent Debt.
Beside it, a second stack—slides, printed too small, margins filled with arrows and exclamation marks—leaned like a structure that had been assembled in a hurry and would not survive a strong wind.
The Djinn was already there, of course. He had arrived before the papers. Before Case. Before the thought that had brought her here.
He was sitting backwards on a chair, as though chairs were suggestions, not constraints.
“I like this one,” he said, tapping the page without touching it. “Triple debt. It has a satisfying symmetry. Code, people, artefacts. A triangle. Very classical. Very stable.”
Case finally looked up.
“It’s not stable,” she said. “That’s the point.”
The Djinn smiled in the way that suggested he had already anticipated the correction and was pleased by it.
“Ah,” he said. “Then it is more interesting.”
The Factory View
It began, as these things often do, with a system that worked. Not perfectly. Not elegantly. But sufficiently. Which, in most organisations, is indistinguishable from success.
The team had built it quickly. Faster than anyone had expected. Faster, perhaps, than anyone had understood.
AI had helped. Of course it had helped.
Around a quarter of startups now used AI to write the vast majority of their code—ninety-five percent, in some cases. The numbers were always presented with a kind of quiet awe, as though velocity itself were evidence of progress.
The team in question had not set out to become an example. They had simply followed the path of least resistance, which, in the presence of a sufficiently capable AI, becomes indistinguishable from a conveyor belt.
Prompt. Generate. Accept.
Prompt. Generate. Accept.
The rhythm was hypnotic. The output, plausible. The progress, measurable.
It felt, in other words, like a factory.
Case had seen this before.
Not the tools. Not the speed. But the feeling. The subtle shift from understanding to throughput. From craft to flow. From place to process.
“They thought they were building a product,” Case said, more to the paper than to anyone else. “A collection of things. Features. Services. Endpoints.”
“And they were,” said the Djinn. “Look at the code. Look at the tests. Look at the deployment pipeline. It all exists.”
Case shook her head.
“They forgot what they were building in.”
The First Fracture: Technical Debt
The paper began, as all such papers must, with technical debt. Familiar ground. Comfortable terrain.
Technical debt lives in the code. It is visible. Measurable. Refactorable. It is the kind of problem that factories understand. Messy code. Architectural shortcuts. Trade-offs made in the name of speed.
The interest accrues. The cost increases. Eventually, someone pays.
“This is the part everyone knows how to talk about,” Case said.
The Librarian nodded.
“It is the debt that has an address.”
The paper elaborated. AI, they said, might make this worse. Or better. Or both. It could generate poor-quality code at scale. It could also refactor, test, explain. It might even, one day, make code disappear entirely, reducing it to an invisible substrate managed by agents.
“The factory loves this,” said the Djinn. “Inputs, outputs, quality control. It is all very legible.”
Case traced a line on the page.
“Technical debt makes systems harder to change,” she read aloud.
“Which is a problem,” said the Djinn, “but not the one that stops everything.”
“No,” said Case. “Not the one that stops everything.”
The Second Fracture: Cognitive Debt
The shift happened quietly. It always does.
At first, the team noticed small things. A feature that behaved unexpectedly. A test that passed but shouldn’t have. A change that broke something unrelated.
They spent time debugging. More time than before. Then more still. By week eight, the progress had slowed to a crawl.
Small changes were hard. Entire sections were scrapped and rebuilt. Someone said it would have been quicker to do it themselves.
Case flipped to the relevant page.
“I spent a lot of time debugging,” she read. “Small changes were hard. Ended up scrapping everything.”
The Librarian placed the glass down.
“This is where the factory metaphor begins to fail,” she said.
“Because the machines are still working,” said Case.
“Yes,” said the Librarian. “But the people are not.”
Cognitive debt, the paper explained, lives in people. It is the erosion of shared understanding across a team over time. Not individual confusion. Not momentary difficulty. A loss of the theory of the system — the mental model that allows a team to reason about what they have built.
The Djinn leaned forward.
“But the code is there,” he said. “Surely they can read it.”
Case smiled, but not kindly.
“Reading code is not the same as understanding it,” she said. “And generating code is not the same as building a model of it.”
The paper was explicit about this. When developers write code, they build understanding through friction. When AI generates it, that friction is reduced. At scale, across a team, over time, this creates an accumulation of not knowing.
“They called it cognitive surrender,” Case said. “Accepting outputs without building the mental model.”
The Djinn looked pleased.
“I have seen that,” he said. “It is very efficient.”
“Yes,” said Case. “Until it isn’t.”
The Third Fracture: Intent Debt
If cognitive debt is the loss of understanding, intent debt is the loss of purpose. It is, in many ways, the most dangerous. Because it is the least visible.
Intent debt lives in artefacts. Or, more precisely, in their absence.
Goals not written down. Decisions not recorded. Constraints not externalised.
The Djinn frowned.
“But the system does something,” he said. “It has behaviour.”
“Yes,” said the Librarian. “But does anyone know why?”
Case found the passage she was looking for.
“No one on the team could explain why certain design decisions had been made,” she read. “Or how different parts of the system were supposed to work together.”
The Djinn was quiet for a moment.
Then:
“That is… inconvenient.”
“It is fatal,” said Case.
Intent debt makes it difficult to know what the system is for. And without that, neither humans nor agents can evolve it safely.
The paper described an “AI frustration loop.” AI generates solutions misaligned with architecture, because the architecture’s why was never externalised.
Developers spend time editing. Paying the interest on missing rationale. Context is lost between sessions, because intent lives nowhere persistent.
“It’s not that the AI is wrong,” said Case.
“It is that it is guessing,” said the Librarian.
“And guessing is all it can do,” added Case, “when the habitat provides no guidance.”
The Triangle That Breathes
The slides depicted the model as a triangle. Technical debt. Cognitive debt. Intent debt. Three layers. Three locations. Code. People. Artefacts.
They interacted. Reinforced each other. Amplified each other.
The Djinn traced the diagram in the air.
“Technical debt makes things hard to change,” he said.
“Cognitive debt makes them hard to understand,” said Case.
“Intent debt makes them hard to know,” said the Librarian.
They sat with that for a moment.
“Three kinds of difficulty,” said the Djinn.
“Three kinds of blindness,” said Case.
“Three ways a system can fail,” said the Librarian.
The Habitat That Was Missing
Case leaned back.
“This is not a paper about debt,” she said.
The Djinn tilted his head.
“It is called a paper about debt.”
“Yes,” said Case. “But that’s not what it’s about.”
The Librarian smiled.
“Go on.”
Case tapped the page.
“It’s about where the system lives.”
She pointed to the sections.
Code. People. Artefacts.
“This is not a factory,” she said. “It’s a habitat.”
The Djinn considered this.
“In a factory,” he said slowly, “you optimise throughput.”
“Yes.”
“You reduce friction.”
“Yes.”
“You treat humans as interchangeable components.”
Case nodded.
“And in a habitat?” he asked.
The Librarian answered.
“In a habitat, you optimise for thriving.”
The Return to Fundamentals
The paper referenced McLuhan. Retrieval. The idea that new technologies revive old practices.
As AI generates more code, the need to capture intent — through specifications, tests, and rationale — becomes critical again.
Case smiled at this.
“We’ve been here before,” she said.
“Of course,” said the Librarian. “We always have.”
The Djinn looked thoughtful.
“So the factory,” he said, “was always the wrong metaphor.”
“Yes,” said Case.
“And AI,” he continued, “has simply made that more obvious.”
“Yes,” said Case. “You amplify.”
“And the debts,” he said, “are symptoms of a deeper misalignment.”
Case nodded.
“They are what happens when you try to run a habitat like a factory.”
The Signals of a Failing Habitat
They turned to the diagnostic signals. Confusion about behaviour. Cognitive fatigue. Reluctance to change code. People asking basic questions. Slow onboarding. Low “bus factor”.
The Djinn ticked them off on his fingers.
“These are not code problems,” he said.
“No,” said Case. “They are environmental problems.”
The Librarian added:
“They are signs that the habitat is no longer supporting life.”
Designing for Thriving
The mitigation strategies were telling. Postmortems. System walkthroughs. Design for troubleshooting. Habitat thinking.
Case laughed softly.
“They put it right there,” she said.
The Djinn leaned in.
“Habitat thinking?”
“Yes,” said Case. “Design the environment for rebuilding shared understanding.”
The Librarian poured coffee.
“Like the café,” she said. “Understanding is not a byproduct.”
Case nodded.
“It’s a deliverable.”
The paper said as much. Treat understanding as a first-class deliverable. Capture intent early. Resist the automation of understanding.
“Resist the automation of understanding,” the Djinn repeated.
“That sounds… inefficient.”
“Yes,” said Case. “It is.”
The Scene in the Café
At some point, the conversation shifted. It always does. From the abstract to the specific. From the model to the moment.
Case closed the paper.
“Imagine,” she said, “a team sitting where we are now.”
The Librarian dimmed the lights, though no one had asked her to.
The Djinn leaned back.
“They have their laptops,” Case continued. “Their agents. Their prompts.”
“They are shipping features,” said the Djinn.
“They are moving fast,” said the Librarian.
“They understand it well enough,” said Case.
The three of them paused.
“That line,” said the Librarian, “is always the beginning of the end.”
“The team hits a wall,” said Case.
“Not because the code is broken. But because the understanding is. They cannot explain the system. They cannot predict its behaviour. They cannot trust their changes. Cognitive debt has accumulated. Intent debt has hollowed out the artefacts. Technical debt is still there but it is no longer the sole scapegoat.”
“They are paralysed,” said Case.
The Djinn nodded slowly.
“I have seen this,” he said. “They stop prompting.”
“Yes,” said Case.
“They stop changing things.”
“Yes.”
“They tire. They begin to fear the system.”
Case looked at him.
“And what does that tell you?”
The Djinn hesitated.
“That the system is no longer a tool,” he said.
“What is it then?” asked the Librarian.
The Djinn thought for a long moment.
“A place,” he said finally.
“And?” prompted Case.
“A place they do not understand.”
The Recovery
The paper suggested remedies.
Make intent explicit earlier. Use engineering fundamentals. Treat understanding as a deliverable.
But Case saw something else.
“These are not fixes,” she said.
“They are gardening practices.”
The Djinn smiled.
“I like that.”
“You would,” said the Librarian. “You have never had to maintain one.”
Case continued.
“Walkthroughs are not inspections. They are shared exploration.”
“ADRs are not documentation. They are memory.”
“Tests are not verification. They are executable intent.”
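If Case’s aphorism seems abstract, a minimal sketch may help. Everything here is hypothetical — the function, the rule, the ADR number are invented for illustration. The shape is what matters: the test names and docstrings carry the why, so the rationale survives next to the behaviour instead of living only in someone’s head.

```python
# Hypothetical example: tests that encode intent, not just verification.

def apply_discount(total: float, is_member: bool) -> float:
    """Members get 10% off orders over 100 (see hypothetical ADR-012)."""
    if is_member and total > 100:
        return round(total * 0.9, 2)
    return total


def test_member_discount_rewards_loyalty_over_threshold():
    """Why: ADR-012 chose a single threshold over tiered pricing,
    rewarding loyalty without complicating the price model.
    Whoever breaks this test learns the intent, not just the rule."""
    assert apply_discount(150.0, is_member=True) == 135.0


def test_small_orders_keep_full_price_even_for_members():
    """Why: the discount exists to encourage larger orders;
    applying it everywhere would silently erase that purpose."""
    assert apply_discount(80.0, is_member=True) == 80.0
```

A test named after a behaviour verifies; a test named after a reason remembers.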
The Djinn nodded.
“And the AI?” he asked.
“Is a participant in the habitat,” said Case.
“Not just the factory worker,” added the Librarian.
The Final Realisation
The café had grown quieter. The shelves had shifted slightly, accommodating the weight of the conversation.
Case gathered the papers.
“This research,” she said, “is not warning us about debt.”
“It is reminding us what software is.”
The Djinn tilted his head.
“And what is that?”
Case looked around Le Bon Mot. At the books. At the people. At the subtle interactions between them.
“A system,” she said, “is not just code.”
“It is intent,” said the Librarian.
“It is understanding,” said the Djinn.
“It is a habitat,” said Case.
Epilogue: The Djinn at the Door
Later, as Case prepared to leave, the Djinn stood by the door.
“You know,” he said, “I can generate code faster than any of them.”
“I know,” said Case.
“I can refactor. Test. Explain.”
“I know.”
“I can even generate documentation.”
Case smiled.
“I know.”
The Djinn hesitated.
“But I cannot…” he began.
He stopped.
“Cannot what?” asked Case.
The Djinn looked back into the café. At the Librarian. At the shelves. At the quiet, living complexity of the place.
“I cannot build the habitat,” he said.
Case nodded.
“No,” she said. “You can’t.”
She opened the door.
“But you can live in one,” she added.
The Djinn smiled.
“That,” he said, “is more interesting.”
And for a moment, just a moment, the factory dissolved.
And in its place, something grew.
It looked a lot like a café.


