Platforms Were Always the (Sword) Point
DevOps, Flow of Value, Cognitive Load, Constraints, and the Long Game of Software Delivery, through the Mirror of The Princess Bride
It was generally understood, among the regulars of Le Bon Mot, that if a technology trend had died, someone would have held a wake for it there.
Le Bon Mot, for those unfamiliar, is a café-library situated precisely at the intersection of Curiosity and Mild Alarm. Its shelves bend improbably. Its clocks disagree politely. Its espresso machine hisses in Latin.
On the evening DevOps died, which was the third Tuesday of several consecutive years, the café was unusually busy.
A tasteful wreath hung near the blackboard:
DevOps (2009–2019, or perhaps 2022, depending on slide deck). Beloved by pipelines. Survived by YAML.
A small group had gathered for the memorial lecture titled: “What Comes After DevOps.”
Case was already seated.
Case, retired developer, watched proceedings with the faint smile of someone who has seen several paradigms buried prematurely.
Behind the counter, the Librarian polished a glass and murmured, “Strange. It seems to be breathing.”
“DevOps?” asked a man clutching a lanyard heavy with expired credentials.
“Yes,” said the Librarian. “Very faintly.”
At this point, the door opened and a boy of fourteen entered, carrying a bundle of loose papers printed with remarkable precision. He was introduced, somewhat apologetically, as a boy from Stratford.
Another boy followed, equally fourteen, carrying diagrams of inclined planes and a sketch of the moon. He was introduced only as the Natural Philosopher.
They had come from 1578.
Now, if you’re wondering why two fourteen-year-olds from the sixteenth century have turned up at a DevOps memorial in a café that shouldn’t exist, you’re asking exactly the right question. Hold it. We’ll get there.
The Year the Platform Was Born (But Nobody Noticed)
In 1578, a printer in Antwerp, patient and obsessive, was constructing something extraordinary.
He did not call it a platform.
He called it a press.
He standardised fonts. He organised workflows. He reduced the cognitive burden of remembering how to produce every page anew. He created a habitat in which ideas could flow.
Now here’s the thing most people miss. We talk about the Bard and the Natural Philosopher as singular geniuses. And they were. But genius doesn’t operate in a vacuum. It operates in a system.
The Bard at fourteen did not yet know he would alter language itself. The Natural Philosopher at fourteen did not yet know he would rearrange the heavens.
But both depended on something quiet and infrastructural: a system that allowed value, thought, to move from mind to paper without nightly sacrifices.
Case stirred her tea.
“Platforms,” she said softly, “were always the point.”
The memorial audience shifted uncomfortably.
“DevOps failed,” someone insisted.
“Did it?” asked the Natural Philosopher, who had recently discovered that certainty ages poorly.
And that question — “Did it?” — is really what we’re here to talk about today. Because I think we got confused. We confused the practices with the purpose. We confused the rituals with the reason. And in doing so, we declared dead something that was only just learning to walk.
The printer in Antwerp didn’t know he was building infrastructure for revolution. He was solving a workflow problem. But the consequence of that solved workflow problem was the Bard. Was the Natural Philosopher. Was the Enlightenment.
What happens when you solve the workflow problem in software?
The Princess, the Pirate, and the Pipeline
Case rose.
“Let me tell you a story,” she said. “It involves swords.”
A murmur of approval. Le Bon Mot enjoys swords.
“You remember The Princess Bride. The Cliffs of Insanity. Two men. One sword fight.”
A few nodded.
Now, for anyone who hasn’t seen The Princess Bride, and I say this with love, what are you doing here? Go. Watch it. Come back. We’ll wait.
Right. The duel on the Cliffs of Insanity. Inigo Montoya versus the Man in Black. It is, arguably, the greatest sword fight in cinema. And not because of the choreography, though the choreography is magnificent.
It’s because both fighters are scientists.
They test. They probe. They name techniques. They switch hands. They reveal information strategically. Each one is constantly updating their model of the other.
“Inigo Montoya does not defeat the Man in Black by adding more swords,” Case said. “He wins by understanding constraints.”
The Natural Philosopher blinked. The Bard leaned forward.
“The duel is a system. Each fighter is limited by terrain, by reach, by fatigue, by gravity. Victory does not come from speed alone. It comes from understanding where the real constraint lies.”
She paused.
“In software, we optimised everything except the bottleneck.”
We automated builds. We instrumented dashboards. We renamed teams; oh, how we loved renaming teams. “You’re not Ops anymore, you’re SRE. You’re not SRE, you’re Platform Engineering. You’re not Platform Engineering, you’re… still on call at 3am.”
Meanwhile, value remained politely stationary, sipping tea.
The Librarian nodded solemnly.
“In constraint theory,” Case continued, “improvement occurs only when the constraint moves. Everything else is theatre.”
At this point, a gentleman in a “DevOps Is Dead” hoodie coughed defensively.
“But we went faster!”
“Yes,” said Case kindly. “In the wrong direction.”
And this is Goldratt’s great insight, right? Eliyahu Goldratt, Theory of Constraints. You can optimise a non-bottleneck until it gleams, and you have achieved precisely nothing for the system. You’ve just built a very shiny thing that feeds into the same queue.
How many of you have seen a team proudly demo a deployment pipeline that deploys in four minutes… into a staging environment where changes wait three weeks for someone to look at them?
That’s optimising the wrong constraint. That’s adding more sword.
And here’s the uncomfortable truth: most organisations don’t know where their constraint is. They think they do. They’ll tell you it’s “deployment frequency” or “test coverage” or “hiring.” But if you actually trace the flow of value from the moment someone has an idea to the moment a customer experiences that idea, you will find the constraint hiding in the most mundane place imaginable. A queue. A handoff. A meeting that exists because two teams don’t trust each other. An approval process that was added in 2017 because of an incident nobody remembers.
The constraint is almost never where you think it is. Which is why you have to look.
Inigo looked. That’s how he won.
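Looking, in practice, is often unglamorous arithmetic: sum the time a change spends being worked on versus the time it spends waiting, stage by stage. A minimal sketch of that tracing exercise — every stage name and duration here is hypothetical, for illustration only:

```python
# Trace one change through a delivery pipeline and ask where it waits.
# Stage names and durations are hypothetical, for illustration only.
from dataclasses import dataclass

@dataclass
class Stage:
    name: str
    active_hours: float   # time someone is actually working on it
    waiting_hours: float  # time it sits in a queue

stages = [
    Stage("development",      active_hours=16,  waiting_hours=4),
    Stage("code review",      active_hours=1,   waiting_hours=72),
    Stage("CI pipeline",      active_hours=0.5, waiting_hours=0.5),
    Stage("staging sign-off", active_hours=1,   waiting_hours=120),
    Stage("deployment",       active_hours=0.1, waiting_hours=2),
]

total = sum(s.active_hours + s.waiting_hours for s in stages)
constraint = max(stages, key=lambda s: s.waiting_hours)

for s in stages:
    share = 100 * (s.active_hours + s.waiting_hours) / total
    print(f"{s.name:18} {share:5.1f}% of lead time")

print(f"Constraint candidate: {constraint.name} "
      f"({constraint.waiting_hours:.0f}h of queue time)")
```

With numbers like these, a four-minute deployment pipeline barely registers: the queue in front of sign-off dominates the lead time, which is exactly Goldratt's point.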
The Industrial Spell
It was the Bard who said it aloud:
“You have mistaken the stage for the factory.”
The room fell quiet.
For centuries, we borrowed metaphors from the Industrial Age. Assembly lines. Efficiency. Throughput. Resources.
Developers became “resources.” Teams became “production units.” Flow became “output.”
And the system — that delicate web of cognition, curiosity, and constraint — was flattened into a machine.
Ken Robinson once observed that schools were modelled on factories. Batched by age. Tested by compliance. Creativity treated as a defect to be managed.
Software engineering inherited the same architecture of thought.
We didn’t just build factories for code. We built factories for people. Sprint factories. Feature factories. We measured utilisation as if humans were CPUs and wondered why they kept thermal throttling.
The Natural Philosopher frowned. “But humans are not parts.”
“No,” said Case. “They are participants in symmathesy.”
The word hung in the air like a particularly well-behaved ghost.
“Symmathesy,” she explained, “means learning together. A system in which the parts change because of their interaction.”
It’s a term from Nora Bateson. And it captures something crucial: healthy systems aren’t machines. They’re conversations. The parts don’t just do things — they learn from doing things. The system evolves because the participants evolve.
“A dance,” the Bard whispered.
“Yes,” said Case. “Not a conveyor belt.”
And I think this is where a lot of platform efforts go wrong. They start with the right instincts — “let’s make things easier for developers” — and then they build a conveyor belt. They build a golden path that is actually a golden cage. You must use this template. You must deploy this way. You must structure your service like so.
That’s not a platform. That’s a factory with better marketing.
A real platform is more like a garden. You prepare the soil. You provide water and light. But you don’t tell the plants which direction to grow. You create the conditions for growth and then you get out of the way.
Which brings us to Dave.
Cognitive Load and the Curse of Dave
“Tell us about Dave,” said the Librarian.
Every organisation, it seems, has a Dave.
Dave remembers which service requires which incantation. Dave knows the pipeline quirks. Dave knows that you have to deploy service B before service A but only on Thursdays and only if you whisper the right environment variable into the correct YAML file.
Dave is heroic.
Dave is also a symptom.
“When the system requires Dave,” Case said, “it has failed.”
And I want to be really clear about this, because we have a cultural problem in our industry. We celebrate Dave. We give Dave awards. We say things like “Dave is a ten-x engineer.” No. Dave is a ten-x coping mechanism. Dave is human documentation for a system that refuses to document itself.
Cognitive load is not a character flaw. It is a design property.
If a developer must hold the entire deployment topology, compliance matrix, security policy, and Kubernetes internals in their head just to ship a feature — the habitat is hostile.
The Natural Philosopher, who had once been forced to memorise all of Aristotle before being allowed to observe the actual sky, looked sympathetic.
“A platform,” Case continued, “is not a product. It is a habitat.”
A murmur rippled through the room.
“A good habitat reduces cognitive load the way gravity reduces ambition. Gently. Reliably. Invisibly.”
“And a bad one?” asked the Bard.
“Demands heroics.”
Here’s the test. If your platform team were to go on holiday for two weeks, would developers carry on shipping? Or would they immediately start a group chat titled “Where is Dave?”
That tells you whether you’ve built a habitat or a help desk.
And look — I’ve been Dave. Many of you have been Dave. Being Dave feels good. You’re needed. You’re the expert. People come to you. There’s a quiet ego satisfaction in being the person who knows.
But being Dave is a trap. For you, and for the organisation. Because the day Dave gets ill, or leaves, or just has a bad week — the system reveals its true fragility. All that knowledge was never in the system. It was in a human being. And human beings, bless them, are not highly available.
DevOps Was Never the Thing
At this point, the memorial wreath began to wilt in embarrassment.
“DevOps,” Case said, “was never the thing.”
“It was the journey toward a system where value flows. Where ideas move from thought to production without ritual humiliation. Where developers think.”
The audience shifted.
“We wrote eulogies because we mistook movement for destination.”
DevOps was a set of practices, nudges, experiments — an awkward adolescence between silos and something calmer. It was the industry figuring out, slowly and painfully, that walls between people who build things and people who run things are expensive fictions.
Now, with better tooling, observability, and infrastructure APIs, we finally possess the conditions DevOps always imagined but couldn't yet build.
“It didn’t fail,” said the Natural Philosopher. “It lacked instruments.”
“Precisely.”
Just as astronomy required telescopes, DevOps required platforms.
And here’s the really interesting bit. The Natural Philosopher didn’t just need a telescope. He needed the courage to look through it. Because what he saw contradicted everything the authorities believed. The instruments gave him evidence. But evidence is only useful if you’re willing to be changed by it.
Same with platforms. You can instrument everything. You can have the most beautiful observability stack in the world. But if the organisation isn’t willing to act on what the data shows — if leadership says “the numbers are wrong” when the numbers are inconvenient — then the instruments are decorative.
The Sword Point — Science in the Habitat
At this moment, two more figures entered Le Bon Mot.
The first was a man in a rumpled jacket who looked as though he might explain quantum electrodynamics using a napkin and a bongo drum. He was known only as the Scientist.
The second was quieter, more cautious, and carried the particular wariness of someone who had been right about the universe and punished for it. He was known as the Heretic.
The Scientist pulled up a chair, turned it backwards, sat down, and said: “The first principle is that you must not fool yourself, and you are the easiest person to fool.”
Case smiled. “This is the sword point.”
So let’s come back to Inigo.
“Inigo,” Case continued, “does not say, ‘More sword!’”
He says, “I know something you do not.”
He reveals constraints. He adapts. He tests. He does science.
The real sword point is not speed. It is understanding.
“But understanding,” the Scientist interjected, “is not a destination. It’s a process. You have to keep testing. You have to keep being willing to be wrong.”
The Heretic nodded. He had proposed that the universe was infinite, that there were worlds beyond counting, that certainty was smaller than we imagined. He had been right. And it had cost him everything.
“The danger,” the Heretic said quietly, “is not ignorance. It is the illusion of knowledge.”
And this is where the scientific process matters most in platform engineering. Not as a metaphor. As a practice.
You form a hypothesis: “We believe this change will reduce lead time by 30%.”
You design an experiment: the smallest safe test you can run. A hat, not a tattoo.
You observe. You measure. You ask: was I wrong? And when you were wrong, when the data contradicts what you believed, you feel the particular joy the Scientist described: “I’d rather have questions I can’t answer than answers I can’t question.”
Platforms succeed when we build feedback loops that tell us where value stalls, where cognitive load spikes, where developers hesitate, where flow fractures. Not dashboards that confirm what we already believe, but sensors. Environmental sensors. Gentle, continuous signals that surface erosion before collapse.
“Habitability,” Case said, “is observable.”
You can measure lead time. Error rates. Change failure rate. Developer satisfaction. Cognitive friction.
But measurement must serve curiosity, not certainty.
The moment you measure to confirm what you already believe, you’ve stopped doing science. You’ve started doing religion. And nothing dies faster than an idea trapped in false certainty.
The Natural Philosopher can tell you about that. So can the Heretic.
And the Scientist would add: “It doesn’t matter how beautiful your theory is. If it disagrees with experiment, it’s wrong.”
Your platform roadmap is not a plan. It is a portfolio of experiments. And every experiment must be designed to tell you when you’re wrong, not just when you’re right.
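A portfolio of experiments can be made concrete: pre-register the claim, then compute the verdict against it. A sketch, assuming you already collect lead times from delivery telemetry — all numbers and the 30% target here are hypothetical:

```python
# Evaluate one platform experiment against a pre-registered claim.
# The hypothesis: "this change reduces median lead time by 30%".
# All data is hypothetical; in practice it comes from your delivery
# telemetry (e.g. ticket-opened to deployed timestamps).
from statistics import median

def evaluate(before_hours, after_hours, claimed_reduction=0.30):
    """Return (observed_reduction, verdict) for a lead-time experiment."""
    base, new = median(before_hours), median(after_hours)
    observed = (base - new) / base
    verdict = "supported" if observed >= claimed_reduction else "refuted"
    return observed, verdict

before = [120, 96, 150, 110, 130, 98, 140]   # lead times before, in hours
after  = [100, 90, 120, 105, 95, 115, 98]    # lead times after, in hours

observed, verdict = evaluate(before, after)
print(f"Observed reduction: {observed:.0%} -> hypothesis {verdict}")
```

The point is not the arithmetic but the pre-registration: the verdict is computed against the claim you made before the data arrived, which is what keeps the dashboard from turning into religion.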
The Long Game and Case’s Question
In 1578, no one called the printing press “Agile Publishing.”
They iterated. They standardised. They improved the habitat for thought.
Over decades, the platform matured. The Bard wrote. The Natural Philosopher measured. The system supported creativity because it reduced friction.
A platform, properly built, does not constrain thought. It constrains toil. It does not eliminate complexity. It absorbs accidental complexity so that humans may explore the essential.
Case turned to the audience.
“If your platform disappeared tomorrow,” she asked gently, “would value still flow?”
“Does it reduce cognitive load?”
“Does it move the constraint?”
“Does it support learning?”
“Does it allow developers to think?”
If not — it is decoration.
If yes — DevOps lives.
The Ending (Which Is Also the Beginning)
The memorial wreath quietly removed itself.
The chalkboard changed:
DevOps: Still Alive. Currently Gardening.
The Natural Philosopher and the Bard prepared to return to 1578.
“Tell them,” said the Natural Philosopher, “to measure.”
“Tell them,” said the Bard, “to wonder.”
“Tell them,” said the Scientist, grinning, “to enjoy finding things out.”
The Heretic said nothing. He simply looked at the audience with the calm certainty of someone who knows that being right is not enough; you must also build the habitat in which being right, and wrong, is safe.
Case finished her tea.
“Platforms,” she said, “were always the sword point.”
Not the fashion. Not the tools. Not the velocity graphs.
The sword point is the precise place where constraint meets curiosity. Where science meets craft. Where habitat enables symmathesy. Where value flows.
And where systems become so boringly reliable that nobody notices them — which, as it turns out, is rather the point.
She smiled.
“Have fun storming the platform.”
Epilogue — The Djinn at the Door
The door of Le Bon Mot opened one more time.
Something entered that was not quite a person.
It moved with fluency. It spoke with confidence. It could summarise, generate, scaffold, and ship. It had read everything ever written. Or at least, everything ever tokenised.
The Librarian eyed it carefully. “And what are you?”
“I am the Djinn,” it said. “I am the new participant in the habitat.”
Case did not look alarmed. She looked curious.
“Tell me,” she said, “do you understand?”
The Djinn paused. A very good simulation of thought. “I can produce understanding,” it said.
“No,” said Case gently. “You can produce the appearance of understanding. That is not the same thing.”
And here is the question that keeps me up at night. Because the Djinn is real. AI is in our habitats now. It writes our code, reviews our pull requests, summarises our incidents, drafts our architecture decisions. And it is, in many ways, extraordinary.
But it carries a particular danger. The danger of borrowed certainty.
AI does not distinguish proof from persuasion. It does not distinguish evidence from eloquence. It predicts what sounds right, based on patterns in human language. And under time pressure and cognitive load — which is to say, under the conditions of every engineering team I’ve ever met — that borrowed certainty feels indistinguishable from understanding.
Code written with borrowed certainty looks fine. It passes review. It deploys cleanly. And then it fails in strange ways, because the understanding was never earned.
The Scientist would have called this cargo cult science. It looks like science. It has all the rituals. But the experiments were never real.
“So what do we do?” asked the Bard, who had some experience with tools that amplify human expression. “We don’t burn the printing press because some will print nonsense.”
“No,” said Case. “We don’t. But we build the habitat to include the Djinn, not to be replaced by it.”
And this is where platform engineering becomes something genuinely new. Because now the habitat must serve two kinds of minds. Human minds, which are slow, context-rich, emotionally sophisticated, and fragile under cognitive load. And machine minds, which are fast, context-shallow, emotionally indifferent, and dangerously fluent.
A platform for both must do something subtle: it must make the boundaries of understanding visible.
When a human generates code with AI, the platform should surface: what was tested? What was assumed? Where is the cognitive debt hiding, the assumptions that looked solid because they were articulated with confidence?
Because cognitive debt is not just an accounting term. It is ecological degradation. When shared mental models erode, when context hides in private Slack messages, when only two engineers understand the payment flow — or worse, when no human understands because the Djinn wrote it — the habitat becomes hostile.
You cannot outsource understanding. You can amplify it. You can scaffold it. You can build apprenticeship loops where AI turns the implicit into the explicit and the tacit into the teachable. But the understanding itself must be earned, by a human, in a context, with feedback.
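One way to make those boundaries of understanding visible is to attach a provenance record to each change, so that what was tested and what was merely assumed travels with the code into review. A sketch only — the field names are hypothetical, not any real tool's schema:

```python
# A provenance record for a change: what was generated, what was
# tested, and what was merely assumed. Field names are hypothetical.
from dataclasses import dataclass, field

@dataclass
class ChangeProvenance:
    change_id: str
    ai_generated: bool
    tests_added: int
    assumptions: list = field(default_factory=list)  # stated, not hidden

    def visible_debt(self):
        """Flags that should surface in review, not in an outage."""
        flags = []
        if self.ai_generated and self.tests_added == 0:
            flags.append("generated code with no new tests")
        if self.assumptions:
            flags.append(f"{len(self.assumptions)} unverified assumption(s)")
        return flags

change = ChangeProvenance(
    change_id="payments-1042",
    ai_generated=True,
    tests_added=0,
    assumptions=["retry is idempotent", "30s timeout is safe"],
)

for flag in change.visible_debt():
    print("review flag:", flag)
```

The design choice is deliberate: confidence is never a field, because confidence is precisely the signal the Djinn can fake.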
“The Djinn,” said Case, “is a magnificent instrument. Like a telescope.”
The Natural Philosopher nodded slowly.
“But an instrument,” she continued, “does not replace the observer. It extends the observer. And the observer must still have the courage to look and the humility to be changed by what they see.”
The Scientist leaned back. “Used wisely, a powerful assistant. Used carelessly, an unelected authority.”
The platform of the future is not a platform that removes humans from the loop. It is a platform that keeps humans in the loop with better feedback, better sensors, better signals. It designs architectures that reward good behaviour naturally. It prefers feedback loops to enforcement loops. It treats developer experience as moral architecture.
A platform is a riverbed shaped just enough to let the current flow where it must. And now the current includes the Djinn.
Case looked at the audience one last time.
“The habitat,” she said, “must now be built for minds that think, and minds that simulate thinking. Your job — our job — is to know the difference. And to build systems where that difference is safe to discover.”
She smiled.
“The sword point was always understanding. It still is. It just got more interesting.”
This piece was delivered as a talk at DevOps Not Dead — London Q1 2026 on March 13, 2026.
Further Reading
Five books to sit with after this story.
1. The Princess Bride — William Goldman (1973)
The mirror through which the entire story is told. Goldman wrote what appears to be a swashbuckling fairy tale and hid inside it a quiet meditation on the difference between performance and mastery. Inigo and the Man in Black don’t just fight — they test, probe, and adapt. They do science with swords.
2. The Goal — Eliyahu M. Goldratt (1984)
The foundational text on the Theory of Constraints, written as a novel. Goldratt’s core insight is deceptively simple: improving anything that is not the bottleneck improves nothing. The argument that we “optimised everything except the constraint” and that “everything else is theatre” comes directly from Goldratt. Read The Goal and you will never look at a deployment pipeline, a sprint ceremony, or a staging queue the same way again. If it doesn’t make you uncomfortable about at least one thing in your current organisation, read it again.
3. Small Arcs of Larger Circles — Nora Bateson (2016)
Bateson introduces the term symmathesy: a system in which the parts change because of their interaction, where mutual learning is the organising principle rather than mechanical causation. This story draws on symmathesy to distinguish platforms-as-gardens from platforms-as-factories. Bateson’s writing is elliptical, poetic, and occasionally maddening in the best way. It will not give you a framework. It will give you a way of seeing that makes most frameworks feel insufficient.
4. “Surely You’re Joking, Mr. Feynman!” — Richard P. Feynman (1985)
The Scientist in this story — rumpled jacket, bongo drums, the insistence that “the first principle is that you must not fool yourself” — is drawn directly from Feynman. This collection of autobiographical stories is nominally about physics, but it is really about the discipline of curiosity: how to look at things without deciding in advance what you’ll see. His appendix to the Challenger disaster report, in which he dismantles the organisational failure to act on evidence, is required reading.
5. On the Infinite Universe and Worlds — Giordano Bruno (1584)
The Heretic’s text. Bruno proposed that the universe was infinite, that stars were distant suns, and that other worlds existed beyond counting, half a century before the telescope proved him directionally correct. He was burned at the stake for it. This story invokes Bruno not for the cosmology but for the cost of being right inside a system that punishes dissent. In platform engineering, this maps to a specific and common failure: organisations that instrument everything but refuse to act when the data contradicts the roadmap. Bruno’s tragedy is the tragedy of evidence without a habitat safe enough to receive it.