Decision Making In a Professional Environment: Techniques and Pitfalls
1. Overview
Every day, we make hundreds, if not thousands, of decisions. Some of these are inconsequential and mundane, while others can be life-changing.
Most of these decisions are made intuitively, swiftly, and without forethought: taking a stroll in the park while conversing with a colleague, preparing lunch, or scrolling through social media.
Other decisions can be transformative and irreversible. These are usually deliberate, logical, rational, and effort-intensive exercises.
Research in social and cognitive psychology has extensively studied our decision-making processes and has come up with a few exciting hypotheses that have been confirmed repeatedly by experiments.
The following statements summarise the significant results of this research:
- Individually, human beings are not as rational as we like to think. Our decision-making processes are often illogical, inconsistent, and flawed.
- Shared assumptions drive us as a group and form our organisational culture. This cognitive construct helps us make sense of our environment and cope with any challenges it throws our way. The changing nature of our environment constantly forces us to transform our culture to address new challenges.
Given our ill-equipped mental capabilities for producing logical and rational answers to complex questions, and the ever-present pressure from our environment to adapt and transform, the question becomes: how do we collectively arrive at the best decisions for the survival of our group?
This question does not have an obvious and straightforward answer; it will require some unpacking, precisely what we will do throughout this article.
First, we will explain why decision-making is usually tricky. Next, we will look at some techniques that will allow us to circumvent those challenges.
2. What is Decision-Making?
2.1 Judgement and Choices
Decision-making usually involves two types of activities:
- Judgement can be defined as determining whether a statement is true or false, or estimating the likelihood of an event or the size of a faraway object.
- Choice consists of selecting between two or more options, each carrying some degree of risk and uncertainty. An example that IT professionals might be familiar with is the (sometimes) difficult choice between two alternative technologies.
2.2 Measuring Decision-Making Processes
What do superior decision-making processes offer a group? They are both effective and efficient in that they:
- Produce good outcomes for the business
- Deliver consistent quality over time, with minimal variation in outcomes
- Involve the relevant stakeholders for maximum commitment and engagement
- Reach consensus as quickly as possible
2.3 Intuition and Rational (or Fast and Slow) Systems
We can distribute all decision-making activities in our minds between two logical systems: the fast and slow systems.
The fast system is:
- Always active: constantly scanning our surroundings for threats or signs of abnormality, which in this context means events that our current view of the world cannot explain.
- Ready to jump to conclusions, offering quick answers. Its current worldview, short-term predictions, and threat evaluation are generally accurate.
- Constantly providing the slow system with data collected from the environment, together with its assessment, for evaluation and validation. The slow system either endorses this assessment or rejects it. In the latter case, the slow system will try to offer a new solution.
- Partly acquired through evolution and partly through experience. Some of the knowledge available to the fast system is innate; the rest comes from acquired skills and expert intuition, which can be defined as the consistent application of proven methods to known problems.
On the other hand, the slow system:
- Is lazy: it engages only when the fast system generates an assessment that does not agree with the currently held view of the world, identifies a threat, or cannot find a ready solution to the problem it was presented with.
- Requires more energy to perform intensive operations. Running in lazy mode makes it more efficient.
- Helps you control your behaviour, like staying polite in a hostile social setting.
We can safely argue that groups operate similarly using their collective mental capacities.
The equivalent of the fast system for a group is the usage of the shared assumptions they hold as part of their organisational culture to solve immediate problems and explain the events around them. When faced with novel situations that existing beliefs cannot explain, the group might alter its assumptions (and consequently its culture) to find novel solutions.
Like the fast and slow systems in individuals, groups lazily approach problems in their environment. They start reexamining their current beliefs only when there is enough inconsistency in the data incoming from their environment (sometimes amounting to existential threats).
3. Why Decision-Making Is Difficult
The difficulty of decision-making can be attributed to risk, uncertainty, and the limitations of our decision-making organs.
A superior decision-making process must cater to any combination of these three factors.
3.1 Heuristics and Biases
Cognitive biases are systematic errors of judgement we perform every day when our minds try to answer a difficult question for which an exact answer is not readily available.
These errors are systematic because they result from consistently applying heuristics to complex problems instead of rigorous, logical, and rational thinking.
The term cognitive bias was first coined in a 1974 research paper by Amos Tversky and Daniel Kahneman. From the authors' original paper, we read the following:
People rely on a limited number of heuristic principles which reduce the complex tasks of assessing probabilities and predicting values to simpler judgemental operations… These judgements are all based on data of limited validity, which are processed according to heuristic rules.
— Amos Tversky, Daniel Kahneman
The below list shows a tiny sample of the cognitive biases that psychologists have documented:
- Thinking when tired induces us to take the standard, default solutions presented instead of thinking a problem through. When we are low on energy, we tend to fall back on our intuitive modes rather than using our energy-consuming rational ones. This phenomenon has been demonstrated by observing the discrepancies between judges’ decisions while hungry versus those made when they were not.
- Priming effects have a dramatic influence on the outcomes of our decisions. Priming challenges our long-held views as conscious, autonomous, and rational beings whose decisions are made on logical, evidence-based premises. Instead, our opinions are shaped by trivial stimuli from the world around us, as psychologist Kathleen Vohs demonstrated in one of her ground-breaking studies. Her findings showed that people’s behaviour was more selfish, individualist, cold, and business-like when primed with money cues (such as a screensaver with the image of a dollar).
- Cognitive ease defines a state of mind where we are comfortable and relaxed. During these episodes, our mood is usually happy. We are more inclined to believe what we hear or see, follow our intuition, be less creative, and be more prone to committing logical errors. Cognitive strain is the opposite state: we are vigilant, active, sad, creative, and ready to invest more thought in finding a solution. Experiments confirmed that a message repeatedly presented to participants induced feelings of familiarity and was more likely to be accepted as true (see Robert Zajonc’s 1968 study). Displaying a message in clear fonts raises your chances of having it accepted by your audience with less scrutiny.
- Our intuitive thinking mode is ready to accept conclusions solely based on the story’s coherence, regardless of the quality and quantity of the presented data. This bias is known as WYSIATI or What You See Is All There Is. WYSIATI induces overconfidence in our decisions and explains the halo effect where we accept someone’s words, for example, mainly due to their good looks.
- The hindsight bias is a familiar phenomenon that we frequently experience. It consists of readjusting our world views after the fact and convincing ourselves that “we knew it all along”. We are much less inclined to admit our ignorance of how the world works and accept the role of chance in it. Philip Rosenzweig’s book The Halo Effect describes the fallacies that business people are prone to believe when trying to figure out the rules of success of enterprises based on their history.
3.2 Conclusion
The above list of cognitive biases is a representative sample of the types of systematic errors we are prone to when making judgements; there are many more.
What interests us in this discussion are the conclusions we can draw from this analysis, and these are as follows:
- Our thinking organ is far from perfect and is not the rational, logical, fact-driven engine we thought it was. Therefore, our opinions and views of the world are sometimes seriously flawed.
- Our view of the world as individuals is subjective and abstract and does not correspond to an objective, physical image that everybody can experience if they invest enough effort.
- What drives and motivates our actions as members of a larger group are subconscious emotions, intuitions, environmental stimuli, and memories of past experiences unique to every individual in the group.
These three factors combined make decision-making in a group of professionals very challenging.
4. Guide to Making Better Decisions
4.1 Overview
Making better decisions as a group involves two secondary objectives: the first is raising the quality of the decisions made, while the second is ensuring commitment from the team.
In the coming sections, we will look at nine rules that will assist you in making better decisions. We will explain the rationale for adopting these rules and describe how these rules will address the challenges of decision-making presented earlier.
4.2 Rule 1: Avoid Strategic Mistakes
Good decision-making implies actively avoiding strategic mistakes by recognising precarious situations and taking appropriate action.
Strategic mistakes are those where recovery is impossible or prohibitively expensive.
Think of a game where you can double your money or forfeit everything if you lose. Regardless of the odds, participating in such a game is a strategic error, as you stand to lose everything sooner or later.
An example of such a gamble is the systematic sacrificing of quality for speed in software delivery. There is no upside to it as, sooner or later, technical debt will turn your code into unsustainable legacy software.
Game theory distinguishes between two types of games: finite and infinite. While the former is played for a limited number of turns and with a fixed number of players, the latter can be indefinitely repeated as beaten players quit and new players join.
The players’ objective in an infinite game is to keep playing as long as possible; you only lose if you are forced to quit.
Infinite games model real-world businesses more closely than finite ones. No organisation can remain at the top forever. Still, it can stay in business if it successfully responds to a changing environment and beats its competition for a very long period.
Organisations suffer strategic setbacks when their leaders do not distinguish between the two categories and play for the first position at the expense of staying in the game.
Also, modelling organisations as one single infinite game is an oversimplification. In most real-world situations, the evolution of organisations is best viewed as a series of tiny parallel subgames that can be finite or infinite.
If you attempt to win in one game while compromising your ability to play in the others, you would be committing an error of strategic magnitude.
Focusing on short-term goals, creating a toxic organisational culture, ignoring customers’ needs, and hiding from the competition are examples of conscious decisions that can lead to costly and irreparable errors.
4.3 Rule 2: Engage in Adversarial Collaboration
In his monumental work Thinking Fast and Slow, Daniel Kahneman recounts the story of an intellectual battle with Gary Klein that raged for years and culminated in a joint paper they published in 2009.
Both held wildly different views on intuition and expertise and decided to collaborate to settle things with scientific methods. The author called it adversarial collaboration.
The moral of the story is as follows: suppose you are deliberating a critical decision and are presented with two diametrically opposed views. One way of deciding which approach is best is to design an experiment that unequivocally provides you with an answer.
A widely used example in user experience research is A/B Testing. A more complex instance would be pitting two variants of Agile (or DevOps) against each other on three separate projects.
These experiments can be expensive to set up and risky to complete, so you must be selective. The success criteria must also be well articulated before the experiment begins.
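As a sketch of how such an experiment's outcome might be evaluated once the success criterion has been agreed, consider a classic two-proportion z-test over the conversion counts of an A/B test. The function name and all numbers below are hypothetical and purely illustrative:

```python
import math

def two_proportion_z_test(successes_a, n_a, successes_b, n_b):
    """Compare success rates of variants A and B; returns (z, two-sided p-value)."""
    p_a, p_b = successes_a / n_a, successes_b / n_b
    p_pool = (successes_a + successes_b) / (n_a + n_b)           # pooled rate under H0
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))  # standard error
    z = (p_a - p_b) / se
    # Two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical result: variant A converts 90/1000 visitors, variant B 120/1000
z, p = two_proportion_z_test(90, 1000, 120, 1000)
print(f"z = {z:.2f}, p = {p:.4f}")  # compare p against the pre-agreed threshold
```

The key discipline, as noted above, is that the threshold for declaring a winner is fixed before the data comes in, not after.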
4.4 Rule 3: Set Up a RACI Matrix
When decision-making procedures are highly unstructured, as they tend to be when power is dispersed, and widespread participation is invited, the particular decision taken on a particular occasion will depend almost entirely on the detailed context of the time. […] In this sense, the particular decision made comes to depend upon chance.
— Strategic Management and Organisational Dynamics – Ralph D. Stacey (4th edition)
The RACI matrix is an excellent tool for structuring your team to allow them to make the best decisions on various issues.
It consists of a table with tasks on one axis and resources on the other:
| | John | Sara | Beth |
|---|---|---|---|
| Task A | A | R | C |
| Task B | I | A | R |
| Task C | N/A | I | A/R |
There are four types of values that you would want to assign to every box in a RACI Matrix, and these are as follows:
- Accountable (A): represents ownership of a task and is the one to be held accountable for its outcome. Only one person can be made Accountable for a specific task.
- Responsible (R): refers to the person(s) responsible for carrying out the task. It is OK for the same person to be both Accountable and Responsible for a particular task.
- Consulted (C): defines the person(s) who needs to be consulted before or during the task.
- Informed (I): This group needs to be informed of the result but is not required to contribute to the work.
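Since a RACI matrix is just a nested mapping, its structural rules lend themselves to an automated check. The following is a minimal sketch (the names and tasks mirror the hypothetical table above) enforcing the invariant of exactly one Accountable per task:

```python
# Hypothetical RACI matrix mirroring the table above (N/A cells omitted).
raci = {
    "Task A": {"John": "A", "Sara": "R", "Beth": "C"},
    "Task B": {"John": "I", "Sara": "A", "Beth": "R"},
    "Task C": {"Sara": "I", "Beth": "A/R"},
}

def validate(matrix):
    """Check the structural rules: exactly one Accountable and at least
    one Responsible per task ('A/R' counts as both)."""
    problems = []
    for task, roles in matrix.items():
        codes = [c for cell in roles.values() for c in cell.split("/")]
        if codes.count("A") != 1:
            problems.append(f"{task}: needs exactly one Accountable")
        if codes.count("R") < 1:
            problems.append(f"{task}: needs at least one Responsible")
    return problems

assert validate(raci) == []  # the sample matrix is well-formed
```

Running such a check whenever the matrix changes keeps the "buck stops with one person" rule from silently eroding as teams grow.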
Using a RACI matrix provides quite a few advantages:
- First, it elucidates the roles and responsibilities of every team member.
- The second advantage is the sharing of decision-making powers and responsibilities. This division of labour can be critical as teams grow and leaders become overwhelmed.
- Thirdly, a RACI matrix ensures that people of expertise have a say in how things are done and that decision-making powers are not allocated based on hierarchy alone. The buck stops with a designated person for every task on the list.
- Fourth, a properly set up RACI matrix ensures that information flows from top to bottom and vice versa, and nobody is left in the dark. Controlled, bidirectional information flows are essential for a coordinated effort across the organisation, ensuring that decision-makers have access to all the information they need.
- Fifth, a RACI matrix prevents deadlocks by giving the Accountable person enough authority to decide what happens next. In such a setup, no other person can veto the decision and impede progress.
- Finally, the RACI matrix provides a framework for decision-making that can facilitate governance and improvement.
Like any other collaboration tool, its success or demise will depend on whether the participants use it in good faith. There needs to be a genuine desire to get things done in the best possible way.
4.5 Rule 4: Understand the Limits of Expert Judgement
Given everything we discussed so far on the nature of judgement and decision-making and all the systematic errors involved, the question now becomes: what are the limits of expert intuition, and where does it start to fail?
Herbert A. Simon was a political scientist who received the Nobel prize in Economic Sciences in 1978. He defined expert intuition as follows:
The situation has provided a cue; this cue has given the expert access to information stored in memory, and the information provides the answer. Intuition is nothing more and nothing less than recognition.
We already know that our brains tend to substitute difficult questions with easier ones (a process Kahneman calls substitution) when an exact solution to the former does not readily present itself in memory.
We also know that our minds do not question the quality and quantity of evidence when formulating a judgement (WYSIATI) but instead would try to come up with the most coherent story they can find.
And if that were not sufficient, our rational mind readily accepts senseless propositions from its intuitive counterpart when we are in a good mood.
When, as experts, we are tempted to propose a solution, perhaps we can ask the following questions before suggesting an answer:
- Is this situation similar to something we are already very familiar with, or is it novel?
- Are the conditions under which the problem was presented relatively unchanged? Does the same solution still hold?
If the answer to either of the above is negative, ask yourself the below before offering a solution:
- Do we have enough data or evidence to support our proposition?
- Is the data of good quality?
- Does the data we have answer the actual question, or are we quickly jumping to conclusions?
Expert judgement is vital for nurses, firefighters, artisans, and any profession where problems present themselves more or less the same way and solutions remain valid for a long time.
On the other hand, research indicates that expert intuition is no better than throwing a die for professionals such as artists, creators, investment portfolio managers, and CEOs.
4.6 Rule 5: Make Decisions by Consensus
Generating consensus over a contentious problem can be time-consuming and emotionally taxing, so why do we need it?
Decision-making by consensus offers a few advantages, which we will showcase through two techniques: one from The Toyota Way and the other the Six Thinking Hats.
4.7 Decision-Making at Toyota
4.7.1 The Toyota Way
In this section, we will discuss Toyota’s approach to decision-making. The Toyota Way is a set of 14 principles covering the operation and management of a successful manufacturing business.
The principles are so powerful and universal that we have heavily relied on them in articulating our central theme: Operational Excellence in Software Development.
How the giant car manufacturer perceives and solves common business management problems is unique and inspiring.
The following three subsections will discuss three ideas under Principle 13: Decision Making by Consensus.
4.7.2 Thorough Considerations in Decision Making
The following quote from Liker’s excellent book on the subject highlights Toyota’s problem-solving philosophy.
Toyota stands out as the preeminent analyst of strategy and tactics. Nothing is assumed. Everything is verified. The goal is to get it right.
— Jeffrey K. Liker – The Toyota Way
Toyota would rather a decision turn out inadequate while the decision-making process was followed precisely than have a good decision arrived at through an incorrect process.
4.7.3 Considering Many Alternatives
How much effort do you invest in planning before you go ahead with your implementation?
The answer to this question depends on a parameter we call the cost of change. If the cost of change is low, you can afford to experiment. On the other hand, if the cost of change is high, you had better get it right the first time.

There is, however, one upside to investing in design rather than experimentation: considering different alternatives, a step typically omitted when jumping straight into experiments.
If you start experimenting early on, you become invested in one particular approach, which happened to be first on your list.
Granted, this solution will improve with several iterations and fine-tunings. However, it will converge to what can be called a “local optimum”, which may be only the second- or even third-best solution.
For complex problems, considering multiple alternatives is, of course, time-consuming. In the long term, however, it will always yield higher-quality results on average.
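The local-optimum effect can be illustrated with a toy sketch (the quality landscape below is entirely hypothetical): greedily refining the first idea converges to the nearest peak, while evaluating several alternative starting points first lands on a better one:

```python
import math

def quality(x):
    """Hypothetical quality landscape: a modest peak near x=2
    and a higher one near x=8."""
    return 3 * math.exp(-(x - 2) ** 2) + 5 * math.exp(-(x - 8) ** 2)

def refine(x, step=0.1, iters=200):
    """Greedy hill-climbing: repeatedly keep the neighbour with the best quality."""
    for _ in range(iters):
        x = max((x - step, x, x + step), key=quality)
    return x

first_idea = refine(2.5)  # iterate on idea #1 only
alternatives = [refine(x0) for x0 in (0.0, 2.5, 5.0, 7.5, 10.0)]
best = max(alternatives, key=quality)

print(round(quality(first_idea), 2))  # 3.0: stuck on the local optimum
print(round(quality(best), 2))        # 5.0: a better alternative existed
```

No amount of extra iteration on the first idea escapes the lower peak; only surveying alternatives up front reveals the better region.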
4.7.4 Decision-Making by Consensus
Griffith is not an engineer, so I asked why an administrator would get this document. He seemed surprised that I would ask that and said that Toyota is always looking for broad input and he, too, will have an opinion on the vehicle.
— Jeffrey K. Liker – The Toyota Way
If you get everybody on board, you avoid having people root for their interests instead of the group’s.

Toyota’s Principle 13: Make Decisions Slowly By Consensus uses various decision-making techniques, but their preferred method is group consensus with management’s approval.
With this method, a group comes up with several options that have been fully explored, and these options are subsequently presented to management for approval.
Deliberations can be slow, and this is usually OK. The motto is to make decisions slowly but implement them quickly.
If a quick decision is needed and group consensus cannot be achieved, the management might step in and make the decision.
4.8 The Six Thinking Hats
The Six Thinking Hats is a method Dr Edward de Bono proposed in his 1985 book of the same name. It aims to facilitate group discussions and allow teams to make better decisions in shorter timeframes.
The method suggests using six coloured hats (either literally or metaphorically). Each hat represents a different thinking mode:
- BLUE: Meetings usually start with the blue hat, where members look at the big picture, Consider All Factors (CAF), and list the First Important Priorities (FIP).
- WHITE: Participants are encouraged to consider “Facts and Information” when wearing this hat.
- RED: The red hat requires participants to declare their “Feelings and Emotions”. In this round, participants express their gut feelings about the ideas discussed.
- BLACK: While wearing this hat, participants must voice negative thoughts, such as risks, concerns, and challenges.
- YELLOW: Under this hat, the group is asked to list positive impressions, such as the value, benefits, and any beneficial implications of adopting the ideas under discussion.
- GREEN: The Green Hat focuses on “New Ideas”, possibilities, and alternatives. Participants are encouraged to challenge the proposed method and explore alternative options.
The reader might have noticed that The Six Thinking Hats technique requires some discipline and commitment to succeed. On the other hand, you can use a subset of the six hats for different meetings (feedback, information gathering, idea evaluation).
The advantages of using The Six Thinking Hats are significant:
- It imposes a structure on meetings, and the most cited result of this structuring is a shortening of meeting times and increased productivity.
- Due to its design, The Six Thinking Hats technique requires participants to focus solely on the current thinking mode. This requirement engages their rational mental faculties and turns off the casual intuitive system.
- By allowing separate times for discussing the ideas’ neutral, negative, and positive aspects, The Six Thinking Hats reduces destructive interference from participants. People are encouraged to focus on a single dimension while wearing the same hat, and the leader discourages those who stray from derailing the meeting.
- The Six Thinking Hats technique encourages participants to provide input at each discussion stage, thus ensuring maximum collaboration and the elimination of egotistical emotions.
- Negative emotions and past experiences are more readily available for retrieval from memory. When feelings and thinking modes are focused and channelled by using one hat at a time, participants will exercise more effort to probe their memory for relevant information and opinions instead of defaulting to negative ones.
4.9 Rule 6: Heuristics vs Expert Intuition
Heuristics are simple methods that allow you to answer complex questions quickly. A weighted sum of a few parameters is a typical example of decision heuristics.
Daniel Kahneman advises building heuristics to aid (and not replace) human intuition. His advice consists of the following steps:
- Select at most six traits that, when combined, allow you to formulate an assessment with enough accuracy. For example, suppose you are looking to assess the qualities of a candidate for a software developer position. In that case, you might want to look at their technical skills, teamwork, reliability, and years of experience on the job.
- The traits should be independent of each other. If you already included technical skills, avoid similar characteristics such as expertise in front-end development as the overlap is evident.
- The selected traits should be effortlessly and reliably gauged with information collected from each candidate, typically via a questionnaire.
- Create a scale from 1 to 5, for example, for each question, with guidelines on what counts as very weak or very strong.
- Finally, sum up the points received on each question or trait to obtain the final score.
Kahneman recommends using such heuristics instead of making judgements solely based on first impressions. He also recommends not overruling the heuristic’s results in favour of hasty, superficial, intuitive decisions.
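The steps above can be sketched as a simple scoring function. The traits follow the software-developer example, while the candidates and their ratings are hypothetical:

```python
# Independent traits, each rated on a 1-to-5 scale per Kahneman's steps.
TRAITS = ("technical_skills", "teamwork", "reliability", "experience")

def score(candidate):
    """Sum the 1-5 ratings; raises if a rating is missing or out of range."""
    total = 0
    for trait in TRAITS:
        rating = candidate[trait]
        if not 1 <= rating <= 5:
            raise ValueError(f"{trait} must be rated 1-5, got {rating}")
        total += rating
    return total

# Hypothetical candidates assessed via a questionnaire
alice = {"technical_skills": 4, "teamwork": 5, "reliability": 3, "experience": 4}
bob = {"technical_skills": 5, "teamwork": 2, "reliability": 4, "experience": 3}

# Rank by heuristic score rather than first impressions
ranking = sorted([("alice", score(alice)), ("bob", score(bob))],
                 key=lambda pair: pair[1], reverse=True)
print(ranking)  # [('alice', 16), ('bob', 14)]
```

The point of the guard clause is Kahneman's insistence on reliable measurement: a rating that falls outside the agreed scale signals a broken questionnaire, not a borderline candidate.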
4.10 Rule 7: Stakeholder Involvement
One of the top requirements for outstanding leadership is consistently making the right decisions. Actively involving subordinates in the decision-making process keeps them engaged and motivated.
To illustrate the importance of stakeholder involvement, let’s look at a study completed by a group of researchers from Ohio and Florida State Universities.
They conducted an experiment where two groups were required to select a candidate for public office. The first group used an Expert System (ES) to assist in selecting candidates, while the other group did not. The results of the study were quite surprising.
While it made consistently higher-quality decisions, the group using the ES showed less confidence in and commitment to the results.
From the best-selling book The Seven Habits of Highly Effective People:
No involvement, no commitment.
— Stephen Covey
Another interesting statistic on the importance of open feedback loops in any decision-making process was completed by S. H. Park and J. Westphal from the University of Michigan.
The researchers found that a one-standard-deviation increase in the flattery a CEO receives can raise the chances of her being fired by 64%. They believe this phenomenon is explained by flattery, a lack of transparency, and broken feedback loops.
4.11 Rule 8: Set Up Self-organising Teams
The topic of social groups that can self-organize and demonstrate complex hierarchies has been dealt with in a separate article.
In this section, however, we will extend those ideas to decision-making.
Perhaps we must start by stating what self-organising teams are NOT:
- Structures for anarchic and erratic interactions where team members interact without any constraints.
- A setup where individual capabilities are irrelevant and where it is best if individuals do not actively participate.
- A social group where the higher classes are disempowered and the lower classes empowered.
Self-organising teams are structures where senior management can allocate resources according to its plans and objectives, instruct teams to operate according to its guidelines, inspire people with its vision and mission statements, set up an organisational culture, and demand that people abide by its rules.
However, senior management cannot prevent people from responding according to their capacities. This response will generate further actions, creating additional responses until a state of equilibrium is finally reached.
The hope is that this chain reaction produces the best decisions and outcomes.
The alternative is a series of decisions parachuted from above with little input, feedback, involvement, or commitment from the lower echelons.
We have argued how crucial these factors are in improving the chances of better decisions. This argument favours team setups with the ability to self-organise.
5. Final Words
We have emphasised the importance of topic fluency when approaching a specific problem, and our position remains the same for decision-making.
As you have presumably appreciated from reading this article, decision-making is not as simple as we think. Nor are we the rational, logical, sensible decision-making machines we believe we are.
Our views of the world and the reality we construct are shaped by our DNA (optimism, for example, is a hereditary trait), past experiences, education, and knowledge of our mind’s limitations.
Fortunately, eminent minds like Daniel Kahneman, Nassim Taleb, Edward de Bono, and many others have placed at our disposal an excellent toolset for overcoming some of these challenges. We need to research, recognise, and use these tools.
Decision-making is a complex group activity that can be analysed from several angles. Primarily, it is a process by which decisions are made.
Groups’ decision-making processes can be influenced by many factors, including organisational culture and process maturity. The quality of the decisions made directly impacts performance and survival.
Finally, collaboration in decision-making can lead to better compliance, more commitment, and heavier involvement of major stakeholders. This inclusive attitude legitimises management decisions and makes governing significantly smoother.
We believe that decision-making is not just about getting it right but also about empowerment, collaboration, growth, and business survival.
6. Further Reading
Below is a list of recommended books on the topic. We invite you to regularly check our page on influential books for new listings and reviews.
- Fooled by Randomness by Nassim N. Taleb
- The Black Swan by Nassim N. Taleb
- The Toyota Way by Jeffrey K. Liker
- Organisational Culture and Leadership by Edgar Schein
- Six Frames for Thinking About Information by Edward de Bono
- Thinking Fast and Slow by Daniel Kahneman
- Strategic Management and Organisational Dynamics by Ralph Stacey