The dense and tangled web of life—the interconnected nature of reality—reveals itself daily. Since September 11, think of how much you’ve learned about people, cultures and nations that previously you knew little about. We’ve been learning how the lives of those far away affect our own. We’re beginning to realize that in order to live peacefully together on this planet, we need to be in new relationships, especially with those far distant from us.
When my children were small, I had a slogan on my refrigerator that read: “If mama ain’t happy, ain’t nobody happy.” Perhaps that was my children’s first lesson in systems thinking. We adults learn this, too. If others are struggling, we experience the consequences of their struggle. If others don’t feel safe, we aren’t safe.
Great teachers have been trying to teach us this for thousands of years. Buddhism teaches that any one thing is here because of everything else. The great American naturalist John Muir said that if we tug on any one part of the web of life, we get the whole web. But we’ve been very slow to learn the lesson.
In fact, Western culture has spent decades drawing lines and boxes around interconnected phenomena. We divide the world into pieces, rather than explore its web-like nature. Think of all the boundaries that exist in maps, organizational charts and job descriptions. The neat lines define what goes on inside each box, and the natural messiness of interconnectedness disappears—at least on paper.
These boundaries have been a means to control people and events. Support for war is rallied by reinforcing the boxes of national or ethnic identity. At work, employees are told which box they occupy on an organizational chart. If they step outside the box by presenting an idea or criticism, they’re punished or ignored. Over time, people seek the protection of their box. They know what is and what is not their job.
I experienced this self-protective attitude shortly after the first anthrax incident in Palm Beach County, Florida. A friend of mine is a judge at the courthouse there. The judge’s secretary noticed that someone had been at her desk—papers were disturbed, things moved around. Given the possibility of an anthrax threat, the judge immediately called building security. The guard told her blandly that it was not his job to secure the secretary’s office: “My job is to secure the judge’s chambers, that’s all.” He knew his box and remained oblivious to this new world where danger knows no boundaries.
But I do not fault the security guard. Like so many of us, he had been given this message by his supervisors. Like so many of us, he had learned to do what he was told, and to use his job description as a way to avoid being blamed. Most organizations, because they manage by boxes, have created millions of withdrawn, dependent, frightened and cynical employees.
This is now a huge problem, because our safety and our future depend upon whether each of us can step outside our boxes and participate intelligently in a complex world of interconnections. Here are two truths about living and working within a complex system that I hope we can learn in time:
In a complex system, there is no such thing as simple cause and effect. There’s no one person to blame or to take the credit. Watch how, during a crisis or success, people immediately step up to either assign blame or to take all the credit. Why has crime decreased in the U.S. over the past few years? Police say it’s because there are more officers, judges say it’s due to tougher sentences, parents say it’s because of better parenting, and so on. But it was all of their contributions, interacting in inexplicable ways, that gave birth to success.
Focusing makes things fuzzier. The more we study a complex phenomenon, the more confused we are bound to become. Few of us like to feel confused or to be confronted by messiness. But interrelated phenomena are very messy. Our attempts at understanding (by reading reports, listening to different commentaries, thinking about issues) only drag us into further complexity. Instead of clarity, we experience more uncertainty.
I believe that our very survival depends upon our becoming better systems thinkers. Where do we intervene when we can’t determine a straightforward cause and effect relationship? What kinds of actions make sense when we’re confronted by increasing uncertainty? Here are a few principles I’ve learned:
Start something and see who notices. It’s only after we initiate something that we see the connecting threads in a system. That’s when someone we don’t even know suddenly appears, feeling either outraged or helpful. Before we started, we didn’t know there was any connection between us, but the response makes the connection clear. Now we need to develop a relationship with that person.
Whatever you initiate, expect unintended consequences. Because all of the interactions can’t be seen ahead of time, every effort to change a system creates unintended consequences. Unintended consequences often result when humans try to change the natural ecology of a place. Fertilizer is introduced to farm fields without noticing how rainwater connects fields to oceans. Over time, we’ve gotten bountiful crops but fewer fish. I know of one corporation that created a Museum of Unintended Consequences: it wanted to notice the impacts of any effort at organizational change. When we’re willing to look at all of the impacts, they teach a great deal about how a system operates.
Reflect often. If we take time to notice what has just happened, we learn how the system operates. Without reflection, we go blindly on our way, creating more unintended consequences and failing to achieve anything useful. It’s amazing to me how much we do, but how little time we spend reflecting on what we just did.
Seek out different interpretations. Run ideas by many different people. Everyone in a complex system has a slightly different interpretation. The more interpretations we gather, the easier it becomes to gain a sense of the whole.
Look for insights to emerge out of messiness. Puzzling and messy situations often provoke flight. Either we grab an easy answer or we take actions that have no rational basis. But confusion can create the conditions for insights to appear, often when we least expect them. We can trust these insights and use them as the basis for action.