Meditations on Complex System Intervention

The primary task of an information designer is to make the complex simple. Whatever is essential to the user must be seen at a glance; and what the user cannot simply intuit in this fashion, we must make as quick to comprehend as possible. Much can be, and has been, said on this matter. If I have anything to offer in exchange, it is likely from atop the shoulders of those perennial giants.

As it goes. Our reading provides an exposition of the Trading Zone, a region where experts from a variety of backgrounds or disciplines may compete to gain a position on the same ground. Trading Zones tend to be complex almost by definition, for without complexity one can imagine no region where such a diverse array of specialists could come together and, well, “trade”. The common interest (common in the way a friend and I may share Amy Adams as a common interest), combined with a Babel-ian division of parlance, makes the practice of an information designer that much more relevant: a silent negotiator whose means of interplay is showing, not telling.

This much is known, but in particular I would like to focus on the final section of the work. Here the author gives injunctions as to how we should behave towards a complex system. I will let it speak for itself (abridged)!

  1. Given our current level of ignorance, one should only intervene when necessary, and then only to the extent required, in complex systems.
  2. Major shifts in technological systems should be evaluated before, rather than after, implementation of policies and initiatives designed to encourage them.
  3. […] It is critical to be aware of the particular boundaries within which one is working, and to be alert to the possibility of logical failure when one’s analysis goes beyond the boundaries [of one’s part of the network that one perceives as being relevant].
  4. [T]he actors and designers are also part of the system they are purporting to design, creating a reflexivity that makes the system highly unpredictable and to some extent, perhaps unstable as well.
  5. The conditions characterizing the anthropogenic earth require democratic, transparent, and accountable governance, and pluralistic decision-making processes.
  6. Continual learning at both the personal and institutional level must be built into the project and program management.
  7. Establish metrics that determine whether the system is indeed moving along an appropriate path to achieve the desired outcomes.
  8. Premature lock-in of system components should be avoided where possible.
  9. Whenever possible, engineered changes should be incremental and reversible, rather than fundamental and irreversible.
  10. [Have] resiliency, not just redundancy, in design.

Good stuff. Let us compare this to the work of Donella Meadows, with her nine places to intervene in a system.

Folks who do systems analysis have a great belief in “leverage points.” These are places within a complex system (a corporation, an economy, a living body, a city, an ecosystem) where a small shift in one thing can produce big changes in everything.


The systems analysis community has a lot of lore about leverage points. Those of us who were trained by the great Jay Forrester at MIT have all absorbed one of his favorite stories. “People know intuitively where leverage points are,” he says. “Time after time I’ve done an analysis of a company, and I’ve figured out a leverage point — in inventory policy, maybe, or in the relationship between sales force and productive force, or in personnel policy. Then I’ve gone to the company and discovered that there’s already a lot of attention to that point. Everyone is trying very hard to push it IN THE WRONG DIRECTION!”


The systems analysts I know have come up with no quick or easy formulas for finding leverage points. When we study a system, we usually learn where leverage points are. But a new system we’ve never encountered? Well, our counterintuitions aren’t that well developed. Give us a few months or years and we’ll figure it out. And we know from bitter experience that, because of counterintuitiveness, when we do discover the system’s leverage points, hardly anybody will believe us.

Very frustrating, especially for those of us who yearn not just to understand complex systems, but to make the world work better.

So one day I was sitting in a meeting about how to make the world work better — actually it was a meeting about how the new global trade regime, NAFTA and GATT and the World Trade Organization, is likely to make the world work worse. The more I listened, the more I began to simmer inside. “This is a HUGE NEW SYSTEM people are inventing!” I said to myself. “They haven’t the SLIGHTEST IDEA how this complex structure will behave,” myself said back to me. “It’s almost certainly an example of cranking the system in the wrong direction — it’s aimed at growth, growth at any price!! And the control measures these nice, liberal folks are talking about to combat it — small parameter adjustments, weak negative feedback loops — are PUNY!!!”

Suddenly, without quite knowing what was happening, I got up, marched to the flip chart, tossed over to a clean page, and wrote:


(in increasing order of effectiveness)

9. Constants, parameters, numbers (subsidies, taxes, standards).
8. Regulating negative feedback loops.
7. Driving positive feedback loops.
6. Material flows and nodes of material intersection.
5. Information flows.
4. The rules of the system (incentives, punishments, constraints).
3. The distribution of power over the rules of the system.
2. The goals of the system.
1. The mindset or paradigm out of which the system — its goals, power structure, rules, its culture — arises.

My, what chutzpah! What happened? We have gone from full-on interventionists to tepid stick-poking passivists in less than fifty years. The two works are complementary, to be sure; Donella’s work offers more of a typology, while the reading on Trading Zones gives an operational guide. Yet they frame complex systems completely differently. At every turn, the author of Trading Zones leaves an escape hatch, an out, and for every lock-in, a hedge. To understand this shift in attitude could be a study in itself. But it is not our study; our study is information design.

The above talk of complex systems is relevant: in both cases, the works are addressed to decision makers who deign to master the world of complex systems. Information design is for decision makers, yet it seems that with respect to complex systems, it is as much a choice of what not to do as of what to do. But what would these wily world conquerors say to that?

With the above in mind, I have two high-level questions.

Can we make design that tells us not what we know, but rather what we don’t know? Such humility lays the foundation for empathy, understanding that we understand not. For if we thought that we knew, we would not attempt to go further; why correct what is right? The brute’s way of challenging this conception may be to intimidate with the full complexity of the network. But intimidation is not the target feeling. The target feeling is insight into the conditions of others, the feeling of sonder (sic): the appreciation that others have experiences that may or may not be like your own (contra empathy, which is more specific).

We must rely on the intuition that others have needs worth considering as our own, for to place this responsibility in the hands of the intellect will either take too much work or encounter too much resistance. A masterful graphic designer works at this level; and to provide not just a Rosetta Stone of vocabularies, as Trading Zones implies, but also to create the space for non-zero-sum negotiations through a mindset of sonder, would do well to make both the author of our reading and Donella Meadows proud.

Can design tell us not when to act, but when not to act? Not to stop, but rather to not start. This move is a passive, receptive mode, one that encourages the capacity of a complex system to adjust itself. In our calls to action we can often make things worse, as was noted in the section on resource regimes with the Aral Sea. I do not believe it is simply a matter of adding more stakeholders to a project that would suddenly make everything better. The author tries to convince himself that there is no option but to overwhelm natural systems, and the people who value them on their own terms, by placing them under managerial jurisdiction (his appeals to democracy ring hollow when he scolds environmentalists as though they stood on equal footing with religious fundamentalists). At this rate, how will he practice the caution that he preaches towards the end of the chapter?

Once again we come to a matter of feeling. The angst to do something can be what drives the incessant growth, particularly when you expect others to do the same. We do not want to induce helplessness or panic in information design if it turns out the best option is to slow this angst down. But I admit I have less to say on how to go about this, for when one monitors information, it is in anticipation of changing one’s mind as to what one ought to do. Goals are laden into the work; if all the work said was “keep doing what you’re doing”, it would likely be deemed useless.

As I mentioned previously, much more can be said on these topics. In particular, in what might become the seed that will spread vines across my design ethos, I believe that for information design in complex systems to work, the designer must not just understand the people for whom the complexity is relevant (as each person samples complexity differently, complexity being the thing that escapes simultaneous comprehension by definition), but also how complexity may interact with itself.

Humans and their works are complexity qua complexity. But they derive, in essence, from Nature, which builds systems with redundancies and reflexivity that we seek so dearly to understand so that we may intervene. It is a fool’s endeavor. By factoring out the system in the way Meadows proposes, we may quite literally miss the forest for the trees. The interplay of redundant parts is what makes these systems resilient to change and yet also fragile to it. Small changes can be blunted, but large ones can be catastrophic. As the reading suggests, we should have only a light touch, or even no touch at all. These complex systems often have the capacity to regenerate with enough bounty for all. But to allow this to happen we must think, wait, and fast.

These are subtle dynamics, but the lessons are apparent in every breath we take. The managerialist decides to ignore such wisdom in a lust for control. But now it looks like the managerialist is leaning back toward the understanding that they lost; perhaps they need to be shepherded. It is here that the work of the Information Designer begins.
