On Designing the Habitat as Symmathesy
Designing your software engineering habitat so you can learn together
There is a smell to a software system that has stopped learning. It smells faintly of burnt coffee and quiet resentment. Of Jira tickets reopened for reasons no one remembers. Of Slack threads where someone asks, “Does anyone know how this works?” and three people type, “It’s complicated,” and no one elaborates.
You can taste it in the pauses during retrospectives. The way engineers glance at each other before speaking, as if the system itself might overhear.
We like to believe our systems are machines. Mechanical. Deterministic. Predictable, given enough dashboards. The factory metaphor still lingers in our vocabulary like cigarette smoke in an old pub: pipelines, throughput, velocity, efficiency. We measure outputs and call it progress.
But your software systems are not machines. They are ecosystems of mutual influence. Developers shape the code. The code shapes the developers. Organisational structure shapes architecture. Architecture shapes conversation. Incidents shape fear. Fear shapes design.
And all of it learns together.
This is where Jessica Kerr’s invocation of symmathesy lands like milk down your throat after a phaal. A symmathesy, literally ‘learning together’, reframes everything. Your software system is not a product. It is a learning organism: a network of humans, tools, code, incentives, history, and constraints, adapting in response to each other.
If you can accept that perspective, then the role of a platform, of a habitat, is not to enforce compliance or accelerate deployment.
It is to shape and support how the system learns.
In regulated environments such as banks, healthcare, and public infrastructure, this becomes sharper. There, the cost of misunderstanding is not merely technical debt. It is regulatory intervention, operational collapse, reputational damage. And so we add controls. More guardrails. More forms.
But guardrails without understanding do not produce safety. They produce fear. And fear is a terrible teacher.
The real question for the platform engineer is not:
“How do we make developers move faster?”
It is:
“What kind of learning dynamics are we creating?”
Do developers learn safely? Do they learn collectively? Do they learn from consequences quickly? Do they share a coherent mental model? Does the system adapt — or merely react?
In short, it’s feedback-loop tuning in the service of learning together.
And that, not velocity, not automation, not golden paths, is the primary responsibility of platform engineering.
Design your software engineering habitat so you can learn together
A software system is a symmathesy of symmathesies: a network of mutual learning among:
Code
Developers
Teams
Operators
Compliance
Governance
Users
Tooling
History
Incentives
Mental models do not live solely in individual heads. They emerge collectively. If the habitat fragments learning:
Mental models diverge.
Myths replace facts.
Fear replaces experimentation.
Cognitive debt accumulates.
If the habitat amplifies learning:
Mental models converge.
Causality becomes visible.
Constraints become trusted.
Adaptation becomes safe.
The goal is not control. The goal is coherent collective learning.
Some practices to consider
Make Structure Legible — Structure is shared pattern recognition. If developers see the same patterns, they learn the same system. Consider investing in surfacing and clarifying:
Service boundaries
Ownership
Dependency direction
Domain seams
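One way to make dependency direction legible is to encode it as an executable check rather than a diagram. Below is a minimal sketch, in Python, of such a check; the layer names and service graph are hypothetical, and in practice the edges would come from import analysis or a service catalogue.

```python
# Layers ordered from most stable (left) to most volatile (right).
LAYERS = ["domain", "application", "adapters"]

# Hypothetical service graph: service -> (layer, dependencies).
SERVICES = {
    "pricing-core": ("domain", []),
    "pricing-api": ("application", ["pricing-core"]),
    "pricing-ui": ("adapters", ["pricing-api"]),
    "rogue-domain": ("domain", ["pricing-ui"]),  # points the wrong way
}

def dependency_violations(services):
    """Return edges that point from a stable layer to a more volatile one."""
    rank = {layer: i for i, layer in enumerate(LAYERS)}
    violations = []
    for name, (layer, deps) in services.items():
        for dep in deps:
            dep_layer = services[dep][0]
            if rank[dep_layer] > rank[layer]:
                violations.append((name, dep))
    return violations

print(dependency_violations(SERVICES))  # [('rogue-domain', 'pricing-ui')]
```

Because the rule runs in CI rather than living in a wiki, every developer sees the same pattern and learns the same system.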
Surface Causality — Help the system observe itself. Without causal clarity, learning becomes superstition. Consider enabling:
Distributed tracing
Clear log narratives
Event lineage
Failure propagation visibility
Encode Collective Memory — Without memory, the software engineering system becomes amnesiac. Every constraint feels arbitrary. Consider maintaining:
Architectural Decision Records
Regulatory rationale
Trade-off documentation
Post-incident learnings
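Collective memory stores work best when decisions are data, so tooling can render, search, and link them. Here is a minimal sketch of an Architectural Decision Record as a structured value; the field names are an assumption, loosely following the common Nygard ADR layout.

```python
from dataclasses import dataclass

@dataclass
class Adr:
    """An Architectural Decision Record as data, not just a document."""
    number: int
    title: str
    status: str
    context: str
    decision: str
    consequences: str

    def render(self) -> str:
        return (
            f"ADR-{self.number:04d}: {self.title}\n"
            f"Status: {self.status}\n\n"
            f"Context: {self.context}\n"
            f"Decision: {self.decision}\n"
            f"Consequences: {self.consequences}\n"
        )

adr = Adr(
    number=7,
    title="Use event sourcing for the ledger",
    status="Accepted",
    context="Regulators require a full audit trail of balance changes.",
    decision="Persist ledger mutations as immutable events.",
    consequences="Replays are possible; storage grows; projections are needed.",
)
print(adr.render())
```

The point is not the format but the habit: when the rationale is captured where tools can surface it, constraints stop feeling arbitrary.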
Make Constraints Visible — Constraints are riverbanks. They make exploration safe. Invisible constraints create fear. Visible constraints create creativity. Consider surfacing:
Security boundaries
Regulatory invariants
Data integrity guarantees
SLOs
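Constraints become visible when they are expressed as named, executable checks whose failures explain themselves. The sketch below is a deliberately tiny illustration of that idea; the invariants and config fields are hypothetical, not a real policy engine.

```python
# Named invariants: each maps a name to a predicate over a deployment config.
INVARIANTS = {
    "data_residency": lambda cfg: cfg.get("region") in {"eu-west-1", "eu-central-1"},
    "encryption_at_rest": lambda cfg: cfg.get("encrypted", False),
    "availability_slo": lambda cfg: cfg.get("availability", 0) >= 0.999,
}

def check_constraints(cfg):
    """Return the names of violated invariants, so a failure names itself."""
    return [name for name, check in INVARIANTS.items() if not check(cfg)]

cfg = {"region": "us-east-1", "encrypted": True, "availability": 0.9995}
print(check_constraints(cfg))  # ['data_residency']
```

A developer who sees "data_residency" in the failure output learns the riverbank, not just the rejection.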
Shorten Feedback Loops Ruthlessly — Feedback topology defines evolutionary pressure. Slow feedback produces defensive learning. Fast feedback produces adaptive learning. Optimize for:
Fast CI
Clear test insights
Safe deployment paths
Progressive delivery
Easy rollback
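Progressive delivery with easy rollback reduces, at its core, to a fast automated decision about whether a canary is behaving. The following is a minimal sketch of such a gate; the threshold and metric shape are assumptions for illustration.

```python
def canary_decision(baseline_errors, canary_errors, requests, tolerance=0.005):
    """Promote the canary only if its error rate stays close to baseline."""
    baseline_rate = baseline_errors / requests
    canary_rate = canary_errors / requests
    if canary_rate > baseline_rate + tolerance:
        return "rollback"
    return "promote"

print(canary_decision(baseline_errors=10, canary_errors=80, requests=10_000))
# rollback
print(canary_decision(baseline_errors=10, canary_errors=12, requests=10_000))
# promote
```

The faster this loop runs, the more the consequence of a change arrives while the change is still fresh in the developer’s mind, which is exactly when learning happens.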
Encourage Participation — A symmathesy with passive participants decays. Consider supporting:
Pair and mob programming
Blameless retrospectives
Safe experimentation
Some things to avoid
Treating observability as merely tooling and not collective cognition
Treating compliance and governance as enforcement, not constraint encoding
Optimising for throughput over understanding
Allowing decision context to decay
Confusing documentation with shared mental models
A checklist I’ve found helpful
Ask regularly:
Can a developer simulate system behaviour mentally?
Can they predict failure propagation?
Do they know why constraints exist?
Is change safe and reversible?
Are feedback loops fast enough to shape behaviour?
Does knowledge propagate beyond individuals?
If the answer is “no” to several of these, your habitat is not enabling symmathesy.
It is potentially suppressing it.
A real-world example
In a regulated banking environment:
A platform encodes:
Policy-as-code compliance checks
Automated audit trails
Clear service ownership
Environment parity
Traceability from commit to production behaviour
And so developers:
See constraints before violation
Learn regulatory rationale through tooling
Receive fast feedback
Share post-incident learnings
The system adapts without panic. Compliance becomes collaboration. Learning becomes institutional, not tribal.
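The commit-to-production traceability mentioned above can be as simple as an append-only audit entry linking each deployment to its commit and the checks it passed. Here is a minimal sketch; the field names are hypothetical.

```python
from datetime import datetime, timezone

def deployment_record(commit_sha, environment, checks_passed):
    """An append-only audit entry tying production behaviour to a commit."""
    return {
        "commit": commit_sha,
        "environment": environment,
        "checks": sorted(checks_passed),
        "deployed_at": datetime.now(timezone.utc).isoformat(),
    }

record = deployment_record(
    "a1b2c3d", "production", {"policy-as-code", "integration-tests"}
)
print(record["commit"], record["checks"])
```

When an auditor, or a developer, asks "why is production behaving this way?", the answer is a lookup rather than an excavation.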
Closing Reflection
A crash is visible.
A security breach is visible.
A production outage is visible.
But the most dangerous failure is invisible:
When the system stops learning.
When developers no longer trust their mental models.
When compliance becomes adversarial.
When architecture becomes archaeology.
The habitat must do more than host code.
It must host learning.
Because in the end, the system that learns coherently survives.
The one that does not?
It becomes a museum of decisions no one remembers making.
Some Further Reading
Jessica Kerr — talks and writing on symmathesy and observability
Nora Bateson — Small Arcs of Larger Circles
Peter Senge — The Fifth Discipline