One of the reasons I started FourWeekMBA back in 2015 was that I could not make sense of the current business landscape by simply leveraging what I had learned in business school.
I needed a different mental framework, so I started documenting my journey. In my research, a concept came up that slightly changed my view of the real world: Bounded Rationality!
I'll show you, in a bit, why most of the conclusions psychologists draw from "biases in decision-making" might be entirely off in real-world scenarios, and what alternatives to use!
Let me explain...
In a 1996 paper entitled "Reasoning the Fast and Frugal Way: Models of Bounded Rationality," psychologists Gerd Gigerenzer and Daniel G. Goldstein highlighted:
Humans and animals make inferences about the world under limited time and knowledge. In contrast, many models of rational inference treat the mind as a Laplacean Demon, equipped with unlimited time, knowledge, and computational might.
This is a very important concept to start with. Modern psychologists and theorists of mind manufacture experiments in the lab, and those experiments are tied to specific scenarios that are hardly replicable in the real world.
Why is that? It all starts with a very narrow theory of mind.
A narrow definition of rationality
Experiments are manufactured and often based on assumptions about how our minds work.
For instance, if a psychologist defines rationality as the ability to optimize during a decision-making process (just like a machine would do), this requires the mind to gather all the possible information to come to a logical decision.
However, in the real world, decisions are made with incomplete information, a high degree of uncertainty, and little to no understanding of what's coming next.
Therefore, when the psychologist laments the human brain's inability to grasp statistics or logic, she misses the point: in the real world, what matters is survival.
If surviving means losing some efficiency or avoiding optimization to prevent massive failure, then our mind works exactly as it should.
Risk vs. Uncertainty
Another shortcoming of the conventional or prevailing school of thought is its lack of understanding of the domain in which the human mind operates.
That's a key point in understanding the difference between risk and uncertainty.
Risk is computable
Risk is a concept that analysts love. Why? It's something that can be modeled. Thus, it is confined to scenarios with definite rules, like games. You often see in business books how game theory helped businesspeople succeed.
But that is a story crafted in hindsight. Game theory or your skills as a chess player might help you impress others in normal circumstances (assuming those exist), but they won't help you much in the real world unless you have an alternative toolbox made of heuristics.
Uncertainty is not computable
When financial analysts evaluate risks, they fall into the trap of thinking we can understand the real world by modeling it.
Modern approaches to entrepreneurship try to bring this same logic to the business world, with dire consequences.
When there is high variability of outcomes, it's impossible to model the risk. If anything, you need a simple set of rules of thumb to avoid the worst-case scenario, because if that materializes, no risk model will help.
Indeed, the consequences of an uncertain scenario might be so dire that you never get to see its outcome, because survival itself is at stake.
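To make the distinction concrete, here is a minimal Python sketch of my own (the numbers, the 10% threshold, and the rule name are illustrative assumptions, not taken from any of the sources above): a dice bet with known odds can be evaluated with an expected value, while an uncertain venture with an unknown outcome distribution is better handled with a crude survival-first rule of thumb.

```python
# Risk: probabilities are known, so the expected value is computable.
# Example: a fair die pays $6.00 on a six and $0 otherwise; the ticket costs $1.50.
p_win = 1 / 6
payoff = 6.00
ticket = 1.50
expected_value = p_win * payoff - ticket  # = -0.50: a computably bad bet

# Uncertainty: the outcome distribution is unknown, so instead of modeling it
# we apply a survival-first rule of thumb ("never risk ruin").
def survival_rule(stake: float, bankroll: float, max_fraction: float = 0.1) -> bool:
    """Accept a venture only if losing the entire stake cannot wipe us out."""
    return stake <= max_fraction * bankroll

print(f"Expected value of the dice bet: {expected_value:.2f}")      # -0.50
print("Take the uncertain venture?", survival_rule(2_000, 10_000))  # False
```

The point of the sketch is not the specific threshold (10% is arbitrary) but the asymmetry: the first decision can be optimized because the odds are given; the second can only be bounded, because no model of the odds exists.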
"Unmodeling the real world"
When psychological experiments are performed in a lab, oftentimes, the psychologist starts with a preconceived idea of the human mind, and she works her way back to prove it with an experiment.
When that happens, experiments are "manufactured" (in many cases unconsciously) to produce a certain result (in short, biases are more a domain applicable to psychologists than to laypeople dealing with real-world uncertainty).
This has come up recently with what is called the Replication Crisis, described on Wikipedia as follows:
The replication crisis (or replicability crisis or reproducibility crisis) is, as of 2019, an ongoing methodological crisis in which it has been found that many scientific studies are difficult or impossible to replicate or reproduce. The replication crisis affects the social sciences and medicine most severely.
Part of this trend is the use of improper statistical tools for real-world analyses and the fact that research sometimes becomes an attention-driven activity.
As pointed out by Noah Smith in Bloomberg's "Why 'Statistical Significance' Is Often Insignificant:"
In psychology, in medicine, and in some fields of economics, large and systematic searches are discovering that many findings in the literature are spurious. John Ioannidis, professor of medicine and health research at Stanford University, goes so far as to say that “most published research findings are false,” including those in economics. The tendency research journals have of publishing anything with p-values lower than 5 percent -- the arbitrary value referred to as “statistical significance” -- is widely suspected as a culprit.
To be sure, the point is not simply that those experiments aren't valid.
Worse than that, in some instances they carry, from the beginning, assumptions about the subjects' psyche that are themselves biased.
In short, the biases we all talk about nowadays, especially in the business world, might easily be explained with a theory of mind that goes beyond the conventional definition of rationality.
This definition starts by thinking of our mind as an easily tricked machine that, due to its survival mechanisms, isn't well-adapted anymore to modern times. Thus, it can easily fall prey to dozens, if not hundreds, of biases that affect our daily lives.
That is, we see everywhere today, in business publications, massive lists of cognitive biases that supposedly make us more "aware."
Heuristics: quick and dirty? Not really!
As highlighted in "Heuristic Decision Making:"
The goal of making judgments more accurately by ignoring information is new. It goes beyond the classical assumption that a heuristic trades off some accuracy for less effort.
The central perspective through which heuristics have been studied, and communicated to a mass business audience, is that a heuristic is, by definition, quick and dirty. In short, our error-prone mind generates biases because we use heuristics that sacrifice accuracy for speed, the product of a lazy mechanism of the mind.
According to this view, the mind might ignore important information in an efficiency-driven way, almost like it was optimizing for computing power.
But the mind might have learned that ignoring useless information is a more effective survival mechanism in that specific context. Focusing on one key cue, therefore, can be more reliable than weighing all the available information, and this completely changes the paradigm.
Whereas in the old view a lazy mind avoids information simply because it cannot process it (sacrificing accuracy for speed, almost like a computer), in the new paradigm heuristics and rules of thumb become central: a necessary filtering mechanism of a mind that learns how to ignore useless and irrelevant information.
In short, the outcome of the action matters, not the process or the motivation that drives the process.
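A concrete example of such a filtering heuristic is "Take The Best," described in the Gigerenzer and Goldstein paper cited above: to infer which of two options scores higher on some criterion, walk through the cues in order of validity and decide on the first cue that discriminates, ignoring all the rest. Here is a minimal Python sketch; the cue names and city data are made-up placeholders for illustration only.

```python
# Take The Best (Gigerenzer & Goldstein, 1996): compare two options cue by cue,
# in decreasing order of cue validity, and decide on the first discriminating cue.
# The cues and the city data below are illustrative placeholders, not real figures.

CUES = ["has_major_airport", "is_state_capital", "has_university"]  # most valid first

CITIES = {
    "Alpha": {"has_major_airport": 1, "is_state_capital": 0, "has_university": 1},
    "Beta":  {"has_major_airport": 1, "is_state_capital": 1, "has_university": 0},
}

def take_the_best(a, b):
    """Infer which city is larger using one-reason decision making."""
    for cue in CUES:
        va, vb = CITIES[a][cue], CITIES[b][cue]
        if va != vb:               # the first discriminating cue decides...
            return a if va > vb else b
    return None                    # ...otherwise no cue discriminates: guess

print(take_the_best("Alpha", "Beta"))  # 'Beta', decided by 'is_state_capital' alone
```

In the paper's simulations, this one-reason strategy performed on par with, and sometimes better than, information-hungry strategies such as multiple regression, which is exactly the sense in which ignoring information can make judgments more accurate, not just faster.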
Fast, frugal, yet accurate
Another key concept to internalize to deeply understand this alternative view of bounded rationality is ecological rationality. Ecological rationality seeks strategies better suited for a specific environment and context.
The key point here is that there is no best strategy or optimization strategy because that would not be possible in a large, uncertain world.
Therefore, the rules of thumb we might be able to use for each circumstance will help us take advantage of the structure of our environment.
Thus, in this decision-making process, we create a small world that is highly adapted to context and circumstance, the opposite of what classic theories of rationality assume, namely that our mind works in a vacuum, in a sort of context-free reality.
The two sides of bounded rationality
Bounded rationality is a concept attributed to Herbert Simon, an economist and political scientist interested in decision-making and how we make decisions in the real world.
In fact, he believed that humans follow what he called satisficing rather than optimizing (which was the mainstream view in the past decades).
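As a rough illustration of the difference (my own sketch, not Simon's formalization), an optimizer scores every option and picks the maximum, while a satisficer searches sequentially and stops at the first option that clears an aspiration level:

```python
def optimize(options):
    """Classical rationality: evaluate every option, then pick the best one."""
    return max(options)

def satisfice(options, aspiration):
    """Simon-style satisficing: stop at the first 'good enough' option."""
    for value in options:
        if value >= aspiration:
            return value
    return None  # nothing met the aspiration level; lower it and search again

offers = [62, 70, 85, 90, 95]   # e.g., job offers scored on some scale (made up)
print(optimize(offers))          # 95, but only after examining every offer
print(satisfice(offers, 80))     # 85, found without exhausting the search
```

The satisficer gives up the guarantee of finding the maximum in exchange for a search that ends as soon as something good enough appears, which is the only workable strategy when the full set of options, or the time to examine them, isn't available.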
Based on what we have said so far, let's look again at the concept of bounded rationality. According to the definition given by its father, Herbert Simon, bounded rationality has two main sides:
ecological
and cognitive
It's ecological because "the mind is adapted for real-world environments." Therefore, on the one hand, the mind makes decisions based on the structure of the environment, and on the other hand, there is the decision-maker's computational capability (cognitive side).
As Gerd Gigerenzer and Wolfgang Gaissmaier highlighted in "Heuristic Decision Making," modern psychologists have focused their attention on the latter (the cognitive side).
More precisely, the focus on the cognitive side has produced the misunderstanding that, because the human mind has a limited ability to process information, it generates a set of irreparable biases.
Part of this misunderstanding might be due to the fact that those presumably simple heuristics that the mind uses to solve real-world problems are not sophisticated enough to look attractive to the norms of classical rationality.
Bounded Rationality: Real-World Case Studies
Inventory Management at Zara:
Situation: Zara, a leading fashion retailer, faced the challenge of keeping up with rapidly changing fashion trends.
Bounded Rationality in Action: Instead of trying to predict fashion trends months in advance, Zara adopted a quick-response system. They produced in smaller batches and relied on frequent feedback from store managers about what was selling.
Result: Reduced unsold inventory risks and maximized sales of popular items. This agility became a significant competitive advantage for Zara.
Southwest Airlines' Flight Routes:
Situation: While most airlines operated on hub-and-spoke models, Southwest Airlines needed a different strategy to stand out.
Bounded Rationality in Action: Recognizing the complexities of predicting optimal routes, Southwest opted for point-to-point routes with quick turnarounds.
Result: Increased aircraft utilization, fewer delays, and reduced dependency on any single airport.
IKEA's Flat-Pack Furniture:
Situation: IKEA wanted to reduce shipping and storage costs.
Bounded Rationality in Action: Instead of predicting the best sizes for shipping, IKEA decided to let customers assemble products, enabling flat packaging.
Result: Significant reduction in shipping and storage costs, and the self-assembly concept became part of IKEA's unique selling proposition.
Toyota's Lean Manufacturing:
Situation: Toyota wanted to reduce waste in its manufacturing process.
Bounded Rationality in Action: Recognizing the challenges of predicting every inefficiency, Toyota implemented the "Kaizen" approach—continuous improvement based on feedback from the production floor.
Result: A more efficient production system that could quickly adapt to changes and reduce costs.
Amazon's Marketplace:
Situation: Amazon wanted to expand its product range without holding vast amounts of inventory.
Bounded Rationality in Action: Instead of trying to predict the best products to stock, Amazon opened its platform to third-party sellers.
Result: Expanded product range without the associated inventory risks, and additional revenue from seller fees.
Netflix's Algorithmic Recommendations:
Situation: With a vast library of content, Netflix needed a way to guide users to shows and movies they'd enjoy.
Bounded Rationality in Action: Recognizing the impossibility of manually curating recommendations for every user, Netflix invested in machine learning algorithms that adapt based on user behavior.
Result: Increased viewer engagement and retention as users received tailored content suggestions.
Recap: In This Issue
Limited Time and Knowledge: Humans and animals make inferences about the world under limited time and knowledge, in contrast to traditional models of rational inference that assume unlimited resources.
Narrow Definition of Rationality: Traditional definitions of rationality assume optimization and gathering all possible information for logical decision-making, which may not align with real-world decision-making that involves incomplete information and uncertainty.
Risk vs. Uncertainty: Traditional models focus on computable risks with definite rules, while uncertainty, characterized by high variability and unpredictable outcomes, is harder to model and requires heuristics to navigate effectively.
"Unmodeling the Real World": Psychological experiments conducted in controlled lab settings may be biased and fail to capture the complexities of real-world decision-making, leading to a replication crisis in scientific research.
Bounded Rationality: Bounded rationality, proposed by Herbert Simon, suggests that humans satisfice rather than optimize, taking into account both the ecological aspects of decision-making (adapting to the environment) and the cognitive limitations of decision-makers.
Heuristics and Biases: Heuristics, often dismissed as quick-and-dirty shortcuts, play a crucial role in decision-making by filtering out irrelevant information. Biases arise from the use of heuristics, but not in the sense in which behavioral psychology has classified them; rather, heuristics form an adaptive toolbox.
Ecological Rationality: Ecological rationality focuses on strategies that are better suited for specific environments and contexts rather than seeking an overall best strategy. It acknowledges the adaptability of decision-making to different circumstances.
Two Sides of Bounded Rationality: Bounded rationality comprises ecological and cognitive aspects. The ecological side emphasizes decision-making based on the structure of the environment, while the cognitive side considers the computational capabilities of decision-makers.
Ciao!
With ♥️ Gennaro, FourWeekMBA