Last month’s roundup on lean books touted the recent reissue of Lean Logic: A Dictionary for the Future, a massive book of linked essays that ambitiously connects lean principles to a wealth of topics, from sustainability to systems dynamics to agriculture and much more. Written by the late thinker David Fleming and edited by Shaun Chamberlin, this elegant resource merits investigation by any serious lean thinker. The following collection of excerpts from the book gives a sense of its promise. More information on the book and its ideas can be found here.
Dirty Hands: The sign of being prepared to compromise high-flown principles in the interests of encountering a complex system, engaging with the detail and being capable of acknowledging awkward conclusions. A person willing to get his hands dirty is sufficiently secure not to need to make an icon of himself and his own high moral standing, or to join the intellectual’s desertion of local culture and narrative; he gets to grips with the story.
This requires you to make the nuanced judgments of fuzzy logic, to get down into the detail, and perhaps take a famously inconvenient route…
Lean Thinking: A frame of reference for enabling people to join together in a shared aim.
Two ways of making something happen can be compared. One of them—top-down management—is to tell people what to do: issue instructions, regulations, incentives, penalties, targets; exert managerial control; do the thinking for them; give orders, make sure they are carrying them out, check that they have done them right and, if they haven’t, tell them to do it again.
The other way is to set people up with the necessary resources, the skills and equipment, a common purpose, and the freedom to apply their judgment. This has advantages: it brings to life the imagination and tenacity of the people; it transforms the quality of decisions; it is flexible; it sets up conditions for alert feedback; it makes the needs of the system quickly apparent, responding to the local and real, rather than to a distant caricature.
Pull: Pull recognizes that the people best placed to deal with a complex task are those who are doing it—who are engaged with the practical detail. Once the intention (or common purpose) is defined, participants do not need to rely forever on instructions; they can respond to actual local circumstances, guided and pulled along by observation, rather than pushed along in response to rules or general principles, or a regulatory agency that claims a monopoly on decision making….
Pull means that people are allowed to switch on their brains—responding to a challenge on its own terms, and building on their local wisdom as to the needs of a particular time and place. In a pull system, the people involved apply their creative intelligence to pull answers out of the situation; they invent solutions, they discover ways forward which management does not have to work out for itself.
In this context, aims can be defined without any reliable knowledge of how they are to be achieved, or even whether they are achievable, for pull can enable the creative discovery of means which are at present unknown or out of sight and which, when they are invented or revealed, may surprise.
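Fleming’s contrast between push and pull has a direct analogue in software. A minimal sketch (all names here are illustrative, not from the book): instead of a central scheduler pushing assignments onto workers regardless of their capacity, each worker pulls the next task from a shared queue only when it is ready for one, so the pace of work is set by the people (here, threads) actually doing it.

```python
import queue
import threading

def worker(tasks, results, lock):
    """Each worker *pulls* a task when it has capacity, rather than
    waiting to be told what to do by a central dispatcher."""
    while True:
        try:
            task = tasks.get_nowait()  # the pull: take work only when ready
        except queue.Empty:
            return  # no work left; the worker stops itself
        outcome = task * 2  # stand-in for the real work
        with lock:
            results.append(outcome)
        tasks.task_done()

# A shared queue defines the common purpose: the work to be done.
tasks = queue.Queue()
for n in range(10):
    tasks.put(n)

results = []
lock = threading.Lock()

# Three workers self-organize around the queue; no one assigns tasks.
threads = [threading.Thread(target=worker, args=(tasks, results, lock))
           for _ in range(3)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(sorted(results))
```

Faster workers naturally take on more tasks and slower ones fewer, without any central decision about who does what—the software counterpart of Fleming’s point that pull "makes the needs of the system quickly apparent."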
Resilience: The ability of a system to cope with shock.
System-Scale Rule: The key rule governing systems-design: large-scale problems do not require large-scale solutions; they require small-scale solutions within a large-scale framework.
Systems Thinking: Guidelines for thinking about networks of interaction.
Lean Logic makes a distinction between two kinds of systems: the complex system and the modular system. It also recognizes two more kinds of systems which are special applications of these: the complicated system and the ecological system. [The book includes a chart summarizing the four.]
We cannot control a system—practically everything we ever do has unintended consequences—but we can look about us before rushing into the indignant campaign or the technical fixes. This is called manners.
Going beyond that—tracing through some, at least, of the often hard-to-understand causes and effects, and having some skill in thinking through the consequences—is called systems thinking. And even that can be inconclusive, in the sense that systems thinking, no matter how clear-sighted, may succeed in no more than improving a situation, falling far short of solving it. Or it may improve matters for some of the interests involved but leave things unaffected (or perhaps even worse) for others.
Technical Fix: The strategy of ignoring what a system is trying to tell you, and forcing it along in the same dismal direction even faster.