by Marc Bond, Senior Associate
Success and value creation in the oil and gas industry have not been particularly strong, as evidenced by the industry's relative performance. This is widely recognized both in the global markets and within the industry itself. There are certainly many reasons that may explain why the oil and gas industry has not done well over the years; one area that has gained considerable traction in recent years is the concept of cognitive bias.
Cognitive biases are predictable, consistent, and repeatable mental errors in our thinking and processing of information that can, and often do, lead to illogical or irrational judgments or decisions.
Perhaps surprisingly, the notion of cognitive bias is relatively recent. It was first proposed by Amos Tversky and Daniel Kahneman in 1974 in an article in the journal Science (Tversky and Kahneman, 1974). Since then, there have been numerous publications and research studies on the various cognitive biases and how they affect our judgments and decisions.
The book that introduced the concept of cognitive biases and their influence on decisions to the general population was the seminal publication by Nobel Prize-winning psychologist Daniel Kahneman: Thinking, Fast and Slow (Kahneman, 2011). It is interesting to note that Dr. Kahneman won his Nobel Prize in 2002 not in Psychology, but in Economics. Why? Because traditional economic theory assumes that we are rational creatures when we make decisions or choices, and yet research and observation continually show that we are not.
There are many different cognitive biases (see Wikipedia), but a few play a significant role within the oil and gas industry. These biases can act individually or in combination, leading us to poor judgments and decisions.
HOW BIASES MAY BE REPRESENTED IN THE OIL & GAS INDUSTRY
For example, imagine an exploration team assessing a prospective area that is open for license bids. In analyzing the data, the team focuses on a very productive analogue to describe the play. A successful well has recently been drilled nearby; and although it is acknowledged to be in a different play, the team is very excited about the hydrocarbon potential of their new play.
Some existing, older wells suggest that the new play may not work, but the technical team feels that those wells were either old or poorly completed, and hence can be dismissed as valid data points. Given the uncertainty, the prospects and leads developed should carry a very wide range of resource potential. However, given the team's confidence in the seismic amplitudes, the range of gross rock volumes (GRVs) estimated is quite narrow.
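To see why an artificially narrow GRV range matters, consider a minimal Monte Carlo volumetric sketch. The simplified resource formula, the parameter names, and all numeric ranges below are hypothetical illustrations, not values from the scenario; the point is only that constraining one dominant input hides most of the true spread of outcomes.

```python
import random

random.seed(42)  # fixed seed so the sketch is reproducible

def simulate_resources(grv_low, grv_high, n=10_000):
    """Toy volumetric Monte Carlo: resources = GRV * N/G * phi * Sh * RF.
    All ranges are hypothetical and for illustration only."""
    results = []
    for _ in range(n):
        grv = random.uniform(grv_low, grv_high)  # gross rock volume (arbitrary units)
        ntg = random.uniform(0.5, 0.8)           # net-to-gross
        phi = random.uniform(0.15, 0.25)         # porosity
        sh = random.uniform(0.6, 0.8)            # hydrocarbon saturation
        rf = random.uniform(0.2, 0.4)            # recovery factor
        results.append(grv * ntg * phi * sh * rf)
    return results

def p10_p90_ratio(res):
    """Ratio of the 90th to the 10th percentile: a simple spread measure."""
    s = sorted(res)
    return s[int(0.9 * len(s))] / s[int(0.1 * len(s))]

# An anchored, overconfident GRV range versus a wider range that
# honors the acknowledged uncertainty (both ranges are made up).
narrow = simulate_resources(900, 1100)
wide = simulate_resources(300, 1700)

print(f"narrow-GRV P10/P90 spread: {p10_p90_ratio(narrow):.1f}")
print(f"wide-GRV   P10/P90 spread: {p10_p90_ratio(wide):.1f}")
```

Running this shows the overconfident case reporting a much tighter spread of resource outcomes than the honest one, which is exactly how a narrow input range can mask downside (and upside) potential from decision makers.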
The team is also optimistic about the play potential and presents the opportunity to management in very favorable terms. If the company were to bid on and be awarded the license by the government, the team would be quite excited; and of course, success is often rewarded. The company ends up bidding on the license with a commitment of several firm wells. Upon further data collection and analysis, a new team re-assesses the hydrocarbon potential, which is now believed to be limited; and yet a large commitment still remains to fulfil.
What happened to cause this result? Was the original team overconfident in their expectations? Did they assume that because they understood their commercial analogue, they understood the prospectivity? Were they too focused on the nearby success? Was the data that was dismissed actually highly relevant? Were other alternatives and models not considered that might have suggested the resource size could be small?
Although the above narrative may appear contrived, and your reaction to the scenario would probably be "I would never do that", each of the justifications and decisions made is possible, and all of them are rooted in forms of cognitive bias. You have likely recognized all or part of the scenario from your own experience. Further, these biases can work together in a complementary fashion, reinforcing the biased assessment and making one "blind" to other possibilities.
Cognitive biases and their negative impacts are not confined to the exploration phase. Numerous similar real-world scenarios have been observed in appraisal, development, production, and project planning.
STRATEGIES TO MANAGE
The bottom line is that these cognitive errors lead to poor decisions about what work to undertake, which issues to focus on, and whether to forge ahead or exit a project. This makes it important to identify them and lessen their impact. Unfortunately, awareness alone is not sufficient. These biases are inherent in our judgment and decision making and serve a purpose, helping us make rapid judgments based on intuition and experience. In everyday life, they generally work well. In complex and uncertain environments such as the oil and gas industry, however, they can lead us to poor choices.
Hence, it is important to understand first what the biases are, why they occur, and how they can influence our assessments. This will then help us identify when our own, or our colleagues', judgments, assessments, and decisions may be affected by these cognitive biases. We then need to learn mitigation strategies. Given that these cognitive biases are normal and serve a purpose, the goal cannot be to remove them, but rather to recognize them and apply mitigation strategies that lessen their impact.
As noted above, there has been a great deal of research on the biases, yet little has been published on practical mitigation strategies. Hence, to help our industry, my colleague Creties Jenkins and I have developed a course entitled Mitigating Bias, Blindness and Illusion in E&P Decision Making, in which we go into further detail on these vitally important mitigation strategies. We use petroleum industry case studies and real-world mitigation exercises to reinforce recognition of the biases. Finally, we show how to employ the mitigations to ensure that assessments and decisions are as unbiased as possible.
REFERENCES
Kahneman, Daniel, 2011, Thinking, Fast and Slow, Penguin Books, 499p.
Tversky, Amos and Kahneman, Daniel, 1974, Judgment Under Uncertainty: Heuristics and Biases, Science, vol. 185, no. 4157, pp. 1124-1131.
Wikipedia, List of Cognitive Biases, https://en.wikipedia.org/wiki/List_of_cognitive_biases