Complex Problems: What Does the Nature of the Problem Tell Us About Its Solution
1. Overview
In The 7 Timeless Steps to Guide You Through Complex Problem Solving, we discussed a generic approach that can be applied systematically to solving complex problems. Since not all problems are complex, and many gradations of complexity exist, it is a good idea to start by defining what complex problem-solving involves and which categories of problems are best tackled with that approach. For this reason, understanding complex problems takes the top spot on the list.
Practically all living organisms deal with complex problems, from single-celled amoebas to societies of Homo sapiens, and surprisingly, the solution-creation process can be very similar, at least on the conceptual level. This article will elaborate on this point further, articulating the terminology and ideas often associated with how complex adaptive systems solve complex problems. More specifically, we will address the questions covered in the agenda below.
2. Agenda
This article is part of a series on complex problem-solving. The list below will guide you through the different subtopics.
3. An Intuitive Definition of Complex Problems
We all intuitively grasp the characteristics of challenging problems, at least at a fundamental level. For instance, we can promptly recognize that fixing a faulty washing machine is relatively simple. First, we need basic technical skills to identify the faulty part. Next, we read the part's reference code on the back, order a spare, and finally replace it.
In simple problems, there is no uncertainty around the root cause or the solution.
On the other hand, deciding whether or not to accept a job offer is anything but simple. Firstly, you will never have sufficient information to make an optimal decision. Secondly, you cannot predict the consequences of such a decision. Finally, whichever choice you make will change your worldview, rendering any forecasts you have made of the future almost instantly obsolete.
So, what are complex problems? The following characteristics distinguish them.
Complex problems — an intuitive guide.
Non-triviality
Complex problems generally admit only non-trivial solutions. Formulating those solutions demands strong field expertise, solid analytical skills, and a high cognitive load.
Uncertainty
Solutions to complex problems cannot be guaranteed, as the behaviour of the system to which a solution is applied is never fully predictable.
Consensus
Diagnosing complex problems is especially challenging because consensus on facts, root causes, and solutions can be difficult to obtain, especially in large groups.
3.1 Challenges of Working With Complex Problems
Experts like Nassim Taleb, Gerd Gigerenzer, and Daniel Kahneman insist that solving complex problems is relatively easy once we understand which tools to apply. In their view, failures stem from applying engineering methods such as optimization instead of intuition, heuristics, biases, imitation, and many other techniques refined over millennia of evolution and accumulated wisdom.
4. Complex Problems in the Literature
Experts have extensively researched topics associated with intuition, cognitive psychology, risk management, organisational behaviour, and decision-making under uncertainty. This has left us with a rich body of knowledge popularized by best-selling authors such as Daniel Kahneman and Nassim Taleb, which will be reviewed next.
4.1 Fooled by Randomness (Taleb, 2001)
Fooled by Randomness is one of Taleb’s best-selling books, and its central story revolves around the hidden role of chance in our lives. In Taleb’s view, we grossly and routinely overestimate our capabilities to forecast future events (the turkey problem) and cope with that failure through mechanisms like the narrative fallacy and our ability to reconstruct past events based on new information.
Key takeaways from Fooled By Randomness
4.2 Thinking, Fast and Slow (Kahneman, 2011)
Thinking, Fast and Slow is a best-selling book by Daniel Kahneman popularizing his work in cognitive psychology about the mechanism and efficiency of human judgment and decision-making under conditions of uncertainty. His original idea revolves around modelling the human mind as two systems, which he refers to as System 1 and System 2.
Key takeaways from Thinking, Fast and Slow
4.3 Process Consultation (Schein, 1969)
Professor Edgar Schein is a leading authority in organizational behaviour, culture, and psychology. His short but insightful book Process Consultation: Its Role in Organizational Development dedicates a full chapter to group problem-solving and decision-making. Schein explores how leaders and their groups tackle complex problems in this chapter.
Key takeaways from Process Consultation
5. The Information Sufficiency Problem
5.1 How Much Data Is Enough?
During the Newtonian age, physicists believed that once the initial conditions of a physical system were precisely determined, its future evolution could be predicted with arbitrary precision. For example, the laws of dynamics allow us to calculate the entire future trajectory of a point mass from its initial position and velocity.
What happens when the system consists of innumerable particles, each with a different initial speed and position? For practical reasons, we substitute the individual particles with a unit of volume whose macro properties are calculated by averaging over its constituent particles. For example, instead of registering the speed and position of every molecule in a gas container, we substitute those numbers with temperature and pressure calculated on a coarse-grained subvolume. This coarse-graining allows us to explore the system’s physical properties without drowning in data.
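As a toy illustration of coarse-graining (the particle counts, distributions, and the kinetic-energy "temperature" proxy below are all invented for illustration), we can replace a million per-particle records with ten per-cell averages:

```python
import numpy as np

rng = np.random.default_rng(seed=0)

# Toy gas: one million particles in a unit-length box, each with a
# position and a speed drawn from a Rayleigh (Maxwell-like) distribution.
n_particles = 1_000_000
positions = rng.uniform(0.0, 1.0, size=n_particles)
speeds = rng.rayleigh(scale=1.0, size=n_particles)

# Coarse-grain: split the box into 10 subvolumes and replace the
# per-particle data with one macro property per cell -- here a
# "temperature" proxy proportional to the mean kinetic energy.
n_cells = 10
cell_index = (positions * n_cells).astype(int)
temperature = np.array([
    np.mean(speeds[cell_index == c] ** 2)  # mean v^2 per cell
    for c in range(n_cells)
])

# A million numbers collapse into ten macro-level observables.
print(temperature.shape)  # (10,)
```

Each cell's value hovers near the same theoretical mean, which is exactly the point: the macro description is vastly smaller yet still informative.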
5.2 The Rise of Statistical Mechanics and Probabilistic Models
The coarse-graining method and the impracticality of precise calculations on the molecular level gave rise to statistical mechanics, which Boltzmann and others pioneered. Under statistical mechanics, physical systems are governed by the laws of thermodynamics. The second law of thermodynamics is the most famous, dictating that the entropy (or disorder) of an isolated system can never decrease.
The practical advantages of using coarse-graining came at a cost, as a probabilistic model replaced the classic view of deterministic evolution. In this new paradigm, a physical system is predisposed to evolve into one of numerous states. We can only predict the probability that it will be in a given future state, but we can never be sure which one.
But all is not lost. Even with the probabilistic model, we can still enumerate a system’s possible future states and create contingency plans for each scenario. We might even be able to influence the outcome by applying pressure on known system levers. This assumption forms the basis of Strategic Choice Theory.
Strategic choice theory, in the realm of organizational theory, emphasizes the influence of leaders and decision-makers on an organization’s direction. It contrasts with earlier views that saw organizations solely responding to external forces.
Managing Probabilistic Systems
In probabilistic models, we assume that the system’s future states are well-defined and their probabilities are calculable. Given this information, adequate planning and optimization processes can be applied to maximize a specific utility function.
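A minimal sketch of this idea, with invented states, probabilities, and payoffs: when every future state and its probability are assumed known, choosing an action reduces to maximizing expected utility.

```python
# States, probabilities, and payoffs below are purely illustrative.
states = ["boom", "stagnation", "recession"]
probs = {"boom": 0.3, "stagnation": 0.5, "recession": 0.2}

# utility[action][state] = payoff of taking `action` if `state` occurs
utility = {
    "expand":   {"boom": 100, "stagnation": 20, "recession": -60},
    "hold":     {"boom": 40,  "stagnation": 25, "recession": 0},
    "downsize": {"boom": 10,  "stagnation": 15, "recession": 20},
}

def expected_utility(action: str) -> float:
    """Probability-weighted average payoff of an action."""
    return sum(probs[s] * utility[action][s] for s in states)

best = max(utility, key=expected_utility)
print(best, expected_utility(best))
```

This is the whole machinery of "adequate planning and optimization" in miniature; the article's later sections explain why it breaks down when the list of states is not actually knowable.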
5.3 Probabilistic Models Cannot Account for Innovation
Any physical, chemical, or biological system that shows innovation cannot, by definition, be analyzed using probability models, as the latter assume that all future states are fixed and knowable in advance, and that the probabilities of reaching those states are either constant or vary according to well-specified rules.
Therefore, probabilistic models are not good enough to predict the future behaviour of human systems. This also spells trouble for Strategic Choice Theory, which relies on simple causal relationships between leaders’ interventions and desired consequences to achieve progress or resolve conflicts.
If a system can produce novel behaviour, it is unpredictable and, therefore, hard to manage. Ecologies of living organisms can only be understood through complexity theory and managed using principles that account for this unpredictability.
Complex systems presenting complex problems will never offer sufficient information, and managers must make choices under uncertain conditions.
Even if we consider every atom (or elementary particle) in the universe, we still would not be able to predict the rich diversity of phenomena (including biodiversity on Earth) that we currently observe. Quantum mechanics and symmetry breaking ensure enough randomness is injected into the system to produce rich but unpredictable results.
The same applies when we try to understand the source of consciousness in our brains. Would it help to incorporate every neuron and synapse in a gigantic mathematical model? Even if this becomes practical someday, experts seem to believe that explaining how consciousness emerges from inanimate matter remains a distant goal.
In summary, there seems to be a hard limit on how much useful information, in principle and practice, can be gleaned by observing a complex system.
6. Problem Classification
6.1 Maximizing Utility Functions
Problems can present themselves in many different ways. However, we are interested in those characterized by a utility function.
A utility function is a concept primarily used in economics, decision theory, and game theory to represent an individual’s preferences over different outcomes or states of the world. It assigns a numerical value (a utility) to each possible outcome or combination of outcomes, reflecting the individual’s subjective satisfaction or preference associated with those outcomes.
Here are some key points about utility functions:
Using utility functions, people can compare complex options involving chance or risk and make decisions based on their preferences and risk tolerance.
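As an illustration of that last point (log-utility and the payoffs are standard textbook assumptions, not something prescribed here), a concave utility function lets a risk-averse decision-maker rank a safe option above a gamble with a higher expected monetary value:

```python
import math

# A concave utility function encodes risk aversion: each extra unit of
# wealth adds less satisfaction than the previous one.
def u(wealth: float) -> float:
    return math.log(wealth)

def expected_u(outcomes) -> float:
    """Expected utility of a list of (probability, wealth) outcomes."""
    return sum(p * u(w) for p, w in outcomes)

sure_thing = [(1.0, 50_000)]              # guaranteed 50k
gamble = [(0.5, 100_000), (0.5, 10_000)]  # coin flip, expected value 55k

# The gamble has the higher expected *monetary* value (55k vs 50k) but a
# lower expected *utility* for a log-utility agent, so it is rejected.
print(expected_u(sure_thing) > expected_u(gamble))  # True
```

Different utility curves encode different risk tolerances: a convex curve would make the same agent prefer the gamble.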
6.2 Ordered, Chaotic, Complex, and Random Systems
Imagine that you have the following problem. You are required to configure an air conditioning system for a data centre. The system is composed of two machines: a cooling engine and a computer connected to it. The computer has temperature and humidity sensors and various switches and dials that allow operators to set control parameters such as maximum temperature or humidity.
The engineer setting up the system must configure it to minimize power consumption while keeping the room at a given temperature and humidity level. The only issue is that the system does not have an operations guide, and the engineer has to figure out how to set it up using trial and error.
Four scenarios are possible: Ordered, Random, Complex, and Chaotic.
Ordered Systems
Random Systems
Chaotic Systems
Complex Systems
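One classic, minimal way to see the difference between ordered and chaotic behaviour is the logistic map. It is not the air-conditioning system above, but it shows how the same deterministic rule can produce either regime depending on a single "gain" parameter:

```python
# The logistic map x -> r*x*(1-x): with a low gain (r = 2.5) it settles
# into a predictable fixed point (ordered); with a high gain (r = 4.0)
# it stays deterministic yet wildly sensitive to initial conditions
# (chaotic). Parameter values are standard textbook choices.
def trajectory(r: float, x0: float, steps: int = 100) -> list[float]:
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

ordered = trajectory(2.5, 0.2)
print(round(ordered[-1], 6))  # 0.6 -- the fixed point 1 - 1/r

# Two chaotic runs starting a hair apart diverge completely:
a = trajectory(4.0, 0.200000)
b = trajectory(4.0, 0.200001)
print(abs(a[-1] - b[-1]))  # typically an order-one difference
```

For the engineer's trial-and-error setup, this is the crucial distinction: in the ordered regime, small input tweaks produce small, repeatable output changes; in the chaotic regime, they do not.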
7. Small Worlds, Optimization, and Unknown Unknowns
7.1 Leonard J. Savage’s “Small World”
Leonard Jimmie Savage (1917-1971) was an American statistician and economist who significantly contributed to statistics, decision theory, and econometrics.
“Savage’s Small World” refers to a thought experiment he proposed, often cited in discussions about subjective probability and decision theory.
Decision-Making in a Simple World
L. J. Savage’s “Small World”
The significance of Savage’s Small World lies in its implications for decision theory. It illustrates an idealized scenario where uncertainty is minimized, and individuals have perfect knowledge about the consequences of their actions. In reality, however, decision-makers often face uncertainty and incomplete information, prompting probabilistic reasoning and subjective judgment.
By contrasting Savage’s Small World with the complexities of real-world decision-making, Savage highlighted the importance of subjective probability for navigating uncertainty and making rational choices. Subjective probability allows individuals to express their beliefs and uncertainty in a formal framework, facilitating reasoned decision-making even when complete information is lacking.
7.2 Optimization Techniques
Optimization techniques can be effectively applied in a Small World scenario where all outcomes and probabilities can be precisely computed beforehand. This is because decision-makers have complete knowledge of the system, allowing them to accurately assess the consequences of their actions and choose the optimal course of action based on predetermined criteria.
In contrast, in the real world, uncertainty, complexity, and incomplete information often make it challenging to precisely compute outcomes and probabilities beforehand. As a result, optimization techniques may not be as effective, as they rely on accurate information to generate optimal solutions. Decision-makers must contend with uncertainty and imperfect knowledge, which can lead to suboptimal outcomes even when applying optimization techniques.
One example where optimization relies on known outcomes and their probabilities is in the context of inventory management.
In inventory management, a retailer determines the optimal inventory level for each product to minimize costs while ensuring that customer demand is met. In this case, the utility function represents the retailer’s objective, which typically involves minimizing inventory holding costs and stockouts.
Optimisation Process in Inventory Management
Here’s a rigorous breakdown of the optimization process:
By incorporating known outcomes (demand scenarios) and their probabilities into the utility function and using optimization techniques, retailers can manage their inventory effectively, minimizing costs while ensuring customer satisfaction and maintaining adequate product availability.
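As an illustrative sketch of this process (the demand scenarios, probabilities, and unit costs are invented), a newsvendor-style model picks the order quantity that minimizes expected cost over known demand outcomes:

```python
# Demand scenarios with known probabilities -- the "Small World" premise.
demand_scenarios = {80: 0.2, 100: 0.5, 120: 0.3}  # units -> probability
holding_cost = 1.0    # cost per unsold unit left in stock
stockout_cost = 4.0   # cost per unit of unmet demand

def expected_cost(q: int) -> float:
    """Probability-weighted holding + stockout cost of ordering q units."""
    cost = 0.0
    for demand, p in demand_scenarios.items():
        leftover = max(q - demand, 0)
        shortfall = max(demand - q, 0)
        cost += p * (holding_cost * leftover + stockout_cost * shortfall)
    return cost

# Exhaustive search over candidate order quantities.
best_q = min(range(60, 141), key=expected_cost)
print(best_q, expected_cost(best_q))  # 120 18.0
```

Because stockouts here cost four times as much as holding excess stock, the optimum covers the high-demand scenario; change the cost ratio and the optimal quantity shifts accordingly.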
7.3 Optimisation in Complex Worlds
Optimization techniques may encounter challenges in complex situations, particularly those governed by power laws (see discussion on Gaussian Distributions vs Power Laws: Your Ultimate Guide to Making Sense of Natural and Social Phenomena and their impact on our understanding of complex natural phenomena), due to several reasons:
Practical challenges of estimating model parameters in power laws versus Gaussians
Comparing the practical difficulties of estimating model parameters such as mean and variance in power laws versus Gaussians:
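A small simulation can make the contrast concrete (the distributions and parameters are illustrative): with a tail exponent below 2, a Pareto distribution has infinite variance, so estimates of its mean stay unstable across repeated experiments of the same size, while Gaussian estimates settle quickly.

```python
import numpy as np

rng = np.random.default_rng(seed=42)

def sample_means(draw, n_samples=200, n=10_000):
    """Repeat an experiment: draw n points, record the sample mean."""
    return np.array([draw(n).mean() for _ in range(n_samples)])

# Both distributions below have true mean 3, so the comparison is fair:
# a Gaussian, and a Pareto with tail exponent alpha = 1.5 (infinite
# variance, so the law of large numbers converges painfully slowly).
gauss_means = sample_means(lambda n: rng.normal(loc=3.0, scale=1.0, size=n))
pareto_means = sample_means(lambda n: rng.pareto(1.5, size=n) + 1.0)

# Spread of the estimated mean across repeated experiments:
print("gaussian spread:", gauss_means.std())
print("pareto   spread:", pareto_means.std())  # typically far larger
```

The Gaussian estimate tightens like 1/sqrt(n); the heavy-tailed estimate is periodically wrecked by a single extreme draw, which is precisely why power-law parameters are so hard to pin down in practice.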
7.4 Unknown Unknowns
The concept of “unknown unknowns” refers to phenomena or factors that we are not even aware we do not know, and which therefore cannot be anticipated or planned for.
In Savage’s “Small World,” which represents an idealized scenario where decision-makers have perfect knowledge of outcomes and their probabilities, the concept of “unknown unknowns” highlights the limitations of this idealization. In a Small World, nothing new ever happens, and there can be no “Unknown Unknowns”.
In contrast, complex adaptive systems constantly display emergent behaviour: patterns that could not have been anticipated. A leader managing such a system cannot list all possible outcomes, let alone assign each a probability.
8. Subjectivity and the Role of the Observer
In decision-making, a leader’s subjective experience contrasts with their role as an objective observer. For example, in systems thinking and cybernetics, the leader must diagnose problems based on data and evidence and formulate a logical and rational solution.
A leader working on complex problems in a social group is an integral part of the system. As we have seen in previous sections, the leader cannot gather sufficient information about the system, either in principle or in practice, and whatever data they do gather will be coloured by their subjective experience.
Systems Thinking in a Nutshell
Systems Thinking is a holistic approach to understanding complex systems by examining their interconnectedness, interdependencies, and dynamics. It views the leader’s role in an organization as crucial for effective strategy formulation and decision-making by emphasizing the following principles:
Systems Thinking addresses the paradox of the leader being part of the system being managed by acknowledging the leader’s dual role as both a participant within the system and an external observer guiding its direction. Several key principles help resolve this conflict:
9. Summary
Exploring problem-solving reveals that not all problems are created equal. Distinguishing between simple and complex problems highlights the underlying nature of the systems they belong to. Simple problems are typically found within ordered systems, whereas complex problems are inherent to complex systems. These systems extend beyond biological ecologies to encompass social groups and organizations, where intricate interactions and emergent behaviours define their complexity.
One defining characteristic of complex systems is that they are often governed by power laws, which renders traditional optimization techniques ineffective. Unlike in ordered systems, where linear solutions may suffice, complex systems defy such neat categorizations. Applying optimization strategies proves futile due to the non-linear, unpredictable dynamics these power laws produce.
Heuristics emerge as promising alternatives to optimization in navigating the labyrinth of complex systems. These intuitive, rule-of-thumb approaches allow for adaptive decision-making, acknowledging complex systems’ inherent uncertainty and non-linearity.
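As a sketch of that contrast (the options and the aspiration rule are invented), Herbert Simon's satisficing heuristic accepts the first option that is "good enough" instead of scoring every option to find the optimum:

```python
# Satisficing: stop at the first option clearing an aspiration level,
# rather than exhaustively evaluating and ranking all alternatives.
def satisfice(options, is_good_enough):
    for option in options:          # evaluated in the order encountered
        if is_good_enough(option):
            return option           # stop searching immediately
    return None                     # fall back if nothing qualifies

# Echoing the job-offer example from earlier in the article:
job_offers = [
    {"salary": 55_000, "commute_min": 70},
    {"salary": 60_000, "commute_min": 25},
    {"salary": 90_000, "commute_min": 90},  # never reached by the search
]

choice = satisfice(
    job_offers,
    lambda o: o["salary"] >= 58_000 and o["commute_min"] <= 45,
)
print(choice)  # {'salary': 60000, 'commute_min': 25}
```

The heuristic deliberately ignores information (the third offer is never inspected), which is exactly what makes it robust when the full option space and its probabilities cannot be known.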
10. References
- Thinking, Fast and Slow — by Daniel Kahneman, 2011
- Fooled by Randomness — by Nassim Nicholas Taleb, 2001
- Process Consultation — by Edgar Schein, 1969
- The Quark and the Jaguar — by Murray Gell-Mann, 1994