The Wisdom To Choose Action or Inaction
Complicated versus complex systems
Butterfly Fractal, Source: Wikipedia
Western politics and progressive logic are generally dominated by the idea that, when encountering a problem or injustice, it is better to act than not. There is injustice in the world. We must act to prevent that injustice. Children born in underdeveloped economies are dying of starvation, or malaria, or diarrheal disease, or a thousand other preventable things. We must act on their behalf. The climate is changing. We must act. An infection is spreading across national and geographic boundaries. We must act to prevent death and disease. A perceived moral imperative exists to mitigate injustice and suffering, and to prevent changes that threaten the current homeostasis, the current order of things.
Modern Western society has come to believe the myth that it is possible to be all-knowing, that the power of Science, Scientism, or Engineering enables us to pierce the veil of time and determine how best to intervene to mitigate risk and prevent bad things from happening to individuals, populations, ecosystems, or the globe. And that these solutions can be implemented surgically and precisely, so that only the intended outcome occurs, if only we have the moral will to do the right and just thing.
In a recent lecture at the Brownstone Institute annual retreat, Dr. Bret Weinstein spoke on the difference between complicated and complex systems. As I listened to him develop and examine this logic train, my first reaction was that this was a rather esoteric and academic topic to present to a diverse general assembly of dissenters. But Bret is onto something fundamental. In his choice of examples to illustrate his points, he demonstrated that he understood how this seeming nuance was at the core of many of the most important philosophical conflicts in modern politics, governance, and Western society. Computers are complicated. Biology and ecosystems are complex. Computers are the product of engineers. Biology and ecosystems are the product of evolution.
Computers are complicated, but they can be understood. With sufficient understanding, their “behavior” can be predicted quite precisely. This is a common property of complicated systems. Although seemingly mysterious to the untutored, with sufficient data and knowledge, complicated systems can be understood with enough precision to be modified accurately and predictably. As a former computer science student, I have a solid fundamental understanding of computer hardware, software, and network process architecture. Computers are not mysterious to me. But to the untutored, computer hardware, software, and networking are a sort of magic.
Biology and ecosystems are complex, as are individual biological species, including humans. They can be studied, and predictions about their behavior as individuals and as systems can be made, but there is an underlying unpredictability intrinsic to complex systems. There is a fractal and chaotic nature to their structure and behavior, a self-assembling property that emerges from this complexity and that is very sensitive to small changes in the conditions within which they exist. Often referred to as the “butterfly effect,” this is a favorite topic of science fiction visionaries contemplating the risks of time travel: the traveler inadvertently steps on a butterfly while traveling into the past, and returns to a future that has been transformed by this small, seemingly inconsequential act. As a specialist in evolutionary biology, Dr. Weinstein is particularly attuned to the intrinsic unpredictability of complex systems.
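This sensitivity to initial conditions can be made concrete with a toy model. The logistic map, a standard one-line example from chaos theory (not drawn from the essay; the function name and parameters here are illustrative), shows how two trajectories that begin almost identically diverge until they bear no resemblance to each other, even though the rule generating them is simple, known, and perfectly deterministic:

```python
# Minimal sketch of the "butterfly effect" using the logistic map,
# x -> r * x * (1 - x). With r = 4.0 the map is fully chaotic.

def logistic_map(x0: float, r: float = 4.0, steps: int = 50) -> float:
    """Iterate x -> r * x * (1 - x) from x0 and return the final value."""
    x = x0
    for _ in range(steps):
        x = r * x * (1 - x)
    return x

a = logistic_map(0.2000000)
b = logistic_map(0.2000001)  # starting point perturbed by one part in two million
print(a, b, abs(a - b))      # after 50 steps the tiny perturbation has been amplified
```

A perturbation in the seventh decimal place of the starting value is enough: within a few dozen iterations the two trajectories are effectively unrelated. This is the mathematical face of the problem with confident long-range prediction in complex systems.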
Neither ecosystems, nor humankind, nor even individual immune systems are machines. They are not the product of human engineers. They are complex systems, not complicated ones. Their current state at any one time results from unpredictable interactions with a wide range of variable conditions. They are intrinsically chaotic, self-assembling, and unpredictable. No matter how much data are acquired, their properties cannot be fully comprehended. The overall structure of their behavior as systems can be partially predicted, but they are so complex that the impact of altering the conditions within which they exist cannot be reliably predicted. The best predictions that can be achieved require an iterative process in which a controlled, representative sample of the complex system is subjected to a change in conditions, and the impact of that intervention is then observed. Depending on structure and context, this process is referred to as “trial and error,” “experimentation,” or “evolution.” However, the information gleaned depends heavily on the nature of the sample, the starting conditions, the implementation of the intervention, and the overall context or environment.
Human behavior, human political ecosystems, and human innovation or adaptation to external constraints are complex. No matter how comprehensive the database cataloging their metadata, no matter how deep the historical record or how complete the sociological, philosophical, or psychological profiling, individual human biology, the complexity of the human mind, social interactions between individuals, and interfaces between humans and their environment yield chaotic outcomes that are extremely sensitive to context and conditions. Interventions in these systems, whether medical or political, always have unpredictable consequences.
And this is a fact, a force if you will, that seems to escape those proponents of social engineering who think it is possible to predict the short- and long-term consequences of “morally justified” actions. In the 1960s, a “war on poverty” and a “war on hunger” were launched by the most powerful Nation-State on the globe, for all of the “best” and most “morally justified” reasons. The United States had the resources and capability, and there was a broad consensus that it had a moral obligation to act to mitigate suffering. Both have had enormous, unpredicted, and devastating effects on a broad cross-section of humanity. In the intelligence community, this type of cascade of unintended consequences is known as “blowback.” An intervention may seem rational, reasonable, or predictable in the short term, but over the long term, the butterfly effect prevails. Behaviorally, humans are generally brilliant and able to adapt rapidly to their environment (perhaps more rapidly than any other large species), but society and the human condition are chaotic and unpredictable. Humanity has emergent properties that are exquisitely sensitive to even minor changes in environmental conditions. The best-laid plans of mice and men often go awry. Be careful what you wish for, because you might get it.
Humanity and digital computational systems are very different, and this difference underlies a fundamental modern problem. A new generation of oligarchs has emerged as a consequence of the profitability of the digital revolution, and these oligarchs and their technocratic servants lack understanding, or even awareness, of the difference between complicated systems and complex systems. Of course they see humanity and social engineering as a problem set that merely involves acquiring sufficient data and developing predictive algorithms. Just as the evolutionary biologist sees the world through the lens of his (or her) evolutionary biology metaphor, those whose fortunes are consequent to involvement in the birth of modern digital systems see the world from that perspective. But humans are not computers, and ecosystems are not the internet.
Hubris is the consequence of not recognizing one’s limitations. That includes not recognizing the bias intrinsic to the intellectual metaphors, language, experiences, and external variables that structure our thinking and worldview. The opposite of hubris is humility.
Some physicians recognize that the best medicine is often a tincture of time. Wisdom lies in knowing when not to act. To carefully observe, allow the passage of time to reveal aspects of underlying complexity, and then to act in a limited way on a small sample. Think globally, act locally and incrementally, and then observe the consequences of the action before generalizing and testing at a larger scale. Because patients are complex. And the consequences of intervening in a complex system are unpredictable.
For example, the Secretary-General of the United Nations has stated that “Agenda 2030” and the “Pact for the Future” include “the best plans,” and that these plans should be implemented globally as soon as possible. This is an example of hubris operating at a global scale. The only thing predictable about this level of intervention in human affairs is that the consequences will be unpredictable, and history indicates that catastrophic outcomes are much more likely than the naive, overly optimistic, untested outcomes predicted by starry-eyed social engineers.
To bring it back to Dr. Weinstein and his central metaphor, the wise, time-tested path is to allow complex systems to evolve in response to their environmental context and changing conditions, and to do so in a decentralized manner. Allow different “societies” (or social experiments) to constantly and autonomously seek to adapt to their local conditions, without external intervention by wealthy, resource-rich, or more highly developed third-party agents, nation-states, transnational organizations, or non-governmental organizations.
I suggest that one should be deeply wary of unilateral interventions in the internal affairs of societies or Nation-States based on external concepts of “morality.” At the margins, a helping hand for internally developed initiatives may be constructive, but should be implemented cautiously and on a limited incremental basis. Centrally planned and globally implemented unilateral “solutions” will inevitably lead to widespread global tragedy. The idea that the behavior of complex systems can be predictably engineered as if human society is the same as digital systems reflects ignorant, naive, and deeply dangerous hubris.
A wise leader knows when to act and when not to act, and practices humility in recognizing the difference.
“A leader leads by example, not by force”
“In immovability, be like a mountain”
“Victory comes from finding opportunities in problems”
Sun Tzu, The Art of War.
The idea that interpersonal communication between human beings should be censored and constrained to facilitate the implementation of global social engineering plans will compound and deepen the predictable tragedies and suffering, because censorship prevents human societies from adapting and from learning from the multitude of small experiments that give decentralized systems a key advantage as they encounter environmental changes. Censorship and thought control will destroy the unique human superpower of decentralized communication, which enables us, as individuals and as a species, to adapt rapidly to change and which will allow us to overcome the dark predictions of neo-Malthusianism. The “logic” of nudging, censorship, PsyWar, and thought and emotion control will prevent us from evolving and adapting to environmental and societal changes.
Instead, we should encourage decentralized diversity in thought and society, respect the unpredictability of the future, and have the wisdom to act cautiously and incrementally when appropriate, and at times not to act at all, but rather to practice humble, patient, watchful waiting, aware that the best medicine is often a tincture of time. For a superpower and global leader, this would be a much more mature position than the short-term interventionism and opportunism that has almost always characterized US foreign policy.
Because humans and humanity are complex, not complicated.
PsyWar: Enforcing the New World Order
The audio version, narrated by me, is available on Amazon.
PsyWar is also available in hardcover and Kindle editions from Amazon, Skyhorse Publishing, and other booksellers.