I'm a Lead Software Engineer in Nottingham.

Enthusiastic about engineering culture, and building high-quality Products that make a difference. Rebellious artist, climber, aesthete and urbanist.


Safety Over Alchemy: How the Engineering Organisation becomes Irrational

We might assume that all we need for the right principles to be adopted is that they are correct and logical. As we'll see, in a culture that rewards otherwise, smart people create suboptimal systems. Correct prescriptions and judgements are sidelined if that is what it takes to preserve the prevailing system of incentives.

Alchemy: The Limitation of Rational Optimisation

I picked up Rory Sutherland's new book 'Alchemy' [2] this week and am loving it. It's an exploration of the real 'why' behind consumer trends, and of the absurdities in our consumer and work cultures, such as:

  • Red Bull, despite being a huge brand, is regarded by consumers as outright 'disgusting' (in no uncertain terms), which is why it's so appealing
  • We don’t like delays, but not because of the delay itself; rather because of the anxiety caused by ambiguous, un-actionable information
  • The opposite of a 'good idea' is also a good idea

I cite Sutherland because his book clarified a conclusion I had been circling: when non-technical organisations attempt to become engineering or product-led, they often dismiss the second-order, psychological dimensions of change.

We might assume that all we need for the right principles to be adopted is that they are correct and logical. Yet, in cultures that reward safety and legibility, smart people routinely create suboptimal systems. Correct prescriptions are sidelined when they threaten prevailing incentives; if a culture does not believe in Alchemy, it designs systems that cannot accommodate emergence.

I want to show how orgs, in maturing their understanding of the nature of software delivery, can move beyond attempts to apply reductionist methods to it, and toward a model that embraces the trust, learning and emergence necessary to build novel and valuable software.

The Real Incentive: Safety Over Effectiveness

Dave Farley, Kent Beck, Martin Fowler and many in the canon have written about how to address the challenges of managing software delivery (notably in Accelerate, Tidy First?), often through establishing a culture around habits of excellence, feedback and accountability rather than heavy process and inspection.

High-performing engineering teams are enabled and autonomous enough to do this: adopting truly generative habits, devolved product decision-making based on learning, XP and Trunk Based Development. These, while less 'visible' in the sense the org has learned to value, are the ways to create a high-performing team. [3]

Organisations, instead, operate - although not explicitly - on established, locally-optimised incentives, power structures and risk containment. These are dynamics engineers - myself included - often underestimate or don't recognise, and find uncomfortably incompatible with our intuition of software in its pure sense.

In Alchemy, Sutherland points out the mechanism at work: it's hard to get fired for the failure of an initiative that was altogether sensible and not at all counter-intuitive in the current culture. And so many large organisations tend toward the mean: the defensible. High-performing software teams are outliers in these cultures.

For example, Trunk Based Development, TDD and XP/pair programming - which carry associations of being an affront to the organisation's evaluation of what risk is - become high emotional effort to promote in a culture incentivised around visibility and governance of predictable output. These practices do lead to better outcomes. [5]

Yet technical orgs miss the chance to invert: embracing emergence instead of making expensive attempts to mitigate it.

But I'm coming to an understanding - and about time! - that maybe the high cost and waste of optimising for visibility and inspection, and a lagging view of what is even valuable, is something the org would rather absorb. [6] [3]

Incentives for the Desert and the Forest - Illogic as safety

This idea is captured so eloquently by Kent Beck in an entertaining talk on the two universes of software as Desert and Forest. [7]

The same terms we might encounter in each world mean different things (think accountability, metrics, observability), and the incentives at work are vastly different, such that organisations with a Desert mindset - scarcity, fear of unpredictability - are incentivised to adopt behaviours that make sense in the Desert. [8]

So the system behaves accordingly: the cultural biases decide what risk is, not what is actually risky if we follow the logic to its second order.

The organisation, then, is logical in its own way. But not quite as logical as any individual within it. A kind of psycho-logic.

But by repeatedly patching with first-order, symptomatic solutions - meetings, gates, hand-offs, inspection - organisations lock in the symptoms, while the underlying culture of passivity that led to the system in the first place becomes normalised. [9]

What’s needed is a culture of ownership at the team level: feeling empowered to drive change we as engineers know is right, supported by strong leadership signals of trust and decentralised command from the organisation.

Knowing the Limits of our Influence

We take on our role implicitly knowing that, for the most part, we are unable to single-handedly bring about an optimal workflow if it violates the organisation's need for visibility, reporting and predictability.

Nevertheless, it causes me some unrest when I’m embedded in the inertia of an irrationally inefficient system in large tech.

It is psychologically costly for us as engineers to apply too much of ourselves to a system - to try to redeem it - without considering what is actually being optimised.

As engineers we're particularly vulnerable to this. We're beings driven by making the world coherent to us and applying rationality at work - impulses to which our environment can feel consistently resistant, and which we absorb. We forget this is actually a feature.

To re-appropriate Pascal, "systems have reasons of their own of which reason knows nothing."

So while we as engineers should apply critical systems-thinking at the local level, we should be careful to consider the cost to ourselves and why the environment favours what it does.

I don't think this is being passive; it is being realistic and self-sustaining - effective at what we can do within the system, without needing to redeem it.

Perhaps the suboptimal - the crumbs from the table, as it were - is entirely acceptable to the org all told, and doesn't preclude sufficient market success at all.

It is, after all, just fine by them.

Just.

But that’s enough.

Footnotes

First class activities:

The extent to which high-performing engineering habits are first-order activities in a company serves as a strong signal of how that org regards its Technology division, i.e. whether it's a primary driver of value or a cost centre: its members fungible, switchable without incurring any switching costs, and linearly scalable. This is, in my experience, a reliable signal of the experience you might have as an engineer there.


[1] https://www.youtube.com/watch?v=lhlS-Wds02M

[2] Rory Sutherland, 'Alchemy' (book, cited in text)

[3] There are many failure modes and beliefs stemming from this reductive worldview: long-lived teams aren't inherently essential, people are fungible cogs, switching costs don't exist, extra bodies make work faster, software can be accurately planned, quality can be handed off and inspected-in…

[5] https://dora.dev/capabilities/trunk-based-development/#how-to-implement-trunk-based-development

[6] This was covered in Illegible Perception, a response to Sean's brilliant 'Seeing like a Software Company', which describes this optimisation for visibility, even at the expense of executing on the supposedly critical function of delivering software.

[7] https://www.youtube.com/watch?v=W7XL_LZgvKI

[8] This behaviour is effectively incentivised in the way large orgs regard risk and respond to failures punitively, implicitly if not explicitly. It is a sufficiently commonplace effect that it is captured in the cuttingly-accurate Larman's Law.

[9] There is an idea in manufacturing - the Theory of Constraints (TOC), my favourite go-to for software analogies - that bottlenecks cannot be eliminated, only moved to the next slowest part of the system.
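A minimal sketch of that idea, purely illustrative (the stage names and throughput figures are invented): a pipeline's rate is capped by its slowest stage, and relieving that constraint only shifts the bottleneck to the next slowest one.

    # Illustrative sketch only: hypothetical stage names and throughput figures,
    # chosen to show the TOC point that a pipeline's throughput is capped by its
    # slowest stage, and that relieving that stage moves the bottleneck elsewhere.

    def bottleneck(stages: dict[str, float]) -> tuple[str, float]:
        """Return the stage with the lowest throughput (items/week) and its rate."""
        name = min(stages, key=stages.get)
        return name, stages[name]

    # Hypothetical delivery pipeline, throughput in work items per week.
    pipeline = {"analysis": 12, "development": 8, "code review": 5, "release": 10}

    stage, rate = bottleneck(pipeline)
    print(f"System throughput: {rate}/week, constrained by '{stage}'")
    # -> System throughput: 5/week, constrained by 'code review'

    # Double the capacity of the current constraint...
    pipeline["code review"] *= 2

    stage, rate = bottleneck(pipeline)
    print(f"System throughput: {rate}/week, constrained by '{stage}'")
    # -> System throughput: 8/week, constrained by 'development'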
