Posted on March 18, 2021 by Lisa Ward
by Marc Bond, Senior Associate
Today I would like to talk with Creties Jenkins, co-creator of our Mitigating Bias, Blindness and Illusion in E&P Decision Making course, to gain another perspective on biases and how they impact our interpretations and decisions. Creties is a Partner at Rose & Associates with 35 years of diverse industry experience. As a geological engineer, he complements my geoscience background.
Marc: Creties, welcome to my Understanding and Overcoming Bias blog. I appreciate you taking the time to give our readers some of your insights on our course.
Marc: I’d like to ask you what inspired you to put together the Mitigating Bias course.
Creties: First off, Marc, thank you for the opportunity to provide some commentary for the bias blog. My primary inspiration for the Mitigating Bias course was Pete Rose’s AAPG Distinguished Lecture called “Cognitive Bias, the Elephant in the Living Room of Science and Professionalism”, which can be viewed on YouTube. He made the point that our lack of objectivity, due to errors in thinking, contributes to underperforming projects and portfolios. He also noted that the biggest challenge is convincing technical and management professionals that they are subject to bias, and he concluded his talk by calling for a renewed commitment to the ‘rigor of the scientific method’. This is where our course picks up, providing some practical guidance.
Marc: In the course, we talk about Illusions. Can you give us some more insights?
Creties: We define an ‘Illusion’ as a misleading belief based on a false impression of reality. We focus on the Illusions of Potential, Knowledge, and Objectivity. Illusions are fueled by biases—we anchor on supporting data, we ignore disconfirming information, and we become overconfident in the expected result. My grandson, who’s a big superhero fan, was crushed when the Superman cape he ordered didn’t give him the ability to fly around the house. It never occurred to him that if this was real, friends and family members would already be using them. He was blinded by his own reality, which can happen to us as well.
Marc: Can you give an example?
Creties: All of us have seen Executive and Technical presentations touting the game-changing advantages of a given project, transaction or technology in our industry. We’ve come to expect that companies will overstate their knowledge of these opportunities, and the opportunities’ potential, in order to generate investor buzz. But more importantly, we see companies believing their own press, not thinking critically enough about their proposed investments, and lacking processes to rigorously assess them and apply the lessons learned to new projects. The “Shale Revolution” in North America is a good example of companies repeatedly overpromising and underdelivering.
Marc: Do you see a relationship between Illusions and Cognitive Biases?
Creties: I do think that cognitive biases fuel illusions. We focus on small bits of data and analogs (information bias) that favor our intent (anchoring bias), ignore conflicting information (confirmation bias), and convince ourselves that our strategic plan is correct (framing bias) and that fame and glory will follow (motivational bias). So we think opportunities are better than they are (Illusion of Potential), that we understand them more deeply than we do (Illusion of Knowledge), and that we’re being honest and impartial in our resulting decisions (Illusion of Objectivity). Without constant awareness of this state and application of the mitigation techniques we teach in our course, this sequence is all but certain to repeat itself. Just about every person reading this can recall at least one project in their company that followed this pattern with a disastrous result. And yet the cause and cure still receive scant attention.
Marc: What is one of your most surprising observations when teaching the course?
Creties: What’s most surprising to me is how few companies are interested in assembling case studies of their project failures and understanding the role that cognitive errors like ‘Illusions’ played. These case studies are really powerful because you have to admit that if a failure happened once in your company, it could happen again without some changes. I saw this first-hand at ARCO where the Illusion of Knowledge (mistaking familiarity for real understanding) led to a failed waterflood project because of unrecognized connected natural fractures. The inability to learn from this led a decade later to a billion-dollar failure of a miscible gas injection project for the same reason.
Marc: What is your biggest learning from teaching the course?
Creties: How prominent and impactful these cognitive errors are. We’ve presented this course nearly 100 times to everyone from field personnel to executives and nearly every attendee (based on course reviews) sees this problem within their company. Yet most companies are not addressing it or think it’s sufficient for personnel to simply have awareness. I did a half-day leadership version for one company and was told afterward that the attending geoscience managers favored a 2-day mitigation course for their reports, while the engineering managers favored a 1-day awareness course for their people. This led one of the geoscience managers to remark that geoscientists were interested in addressing the problem while the engineers were only interested in identifying it in others!
Marc: And could you leave us with a final message for our readers?
Creties: We provide our course attendees with an understanding of the different types of cognitive errors along with examples and steps to mitigate them in their daily work. But to create change, everyone in the organization needs to have a common vocabulary and processes (e.g., framing sessions, peer assists, performance lookbacks) that will expose and lessen the impact of cognitive errors. HR departments understand how these errors affect hiring, performance reviews, promotions, and employee interactions. We need the same recognition and desire for change on the technical side.
Check out more of Marc’s articles on bias and illusion on his LinkedIn profile.
Posted on February 24, 2021 by Lisa Ward
by Creties Jenkins, Partner
Quickly say the words in these three triangles.
If you didn’t pronounce the repeated words, you’re not alone. Nearly everyone fails to do so. But why? Our familiarity with these phrases causes us to predict the fourth word from the first three and ignore what’s in between. It demonstrates that events consistent with our expectations are processed easily while those that contradict them are ignored or distorted.
Our expectations have many diverse sources, including past experience, professional training, cultural norms, and organizational frameworks. These predispose us to pay particular attention to certain kinds of information and to organize and interpret it in certain ways. We are also influenced by the context in which information arises. Hearing footsteps behind you in the office hallway is very different from hearing them behind you in a dark alley!
These patterns of perception tell us subconsciously what to look for and how to interpret it. Without these patterns, it would be impossible to process the volume and complexity of data we receive every day. But we need to be aware of the downsides associated with these patterns:
01. Your view will be quick to form but resistant to change. Once you form an expectation for your project, this conditions your future perceptions.
02. New information will simply be assimilated into your existing view. Gradual, evolutionary change associated with new data often goes unnoticed, which is why a fresh set of eyes can reveal insights overlooked by someone working on the same project for many years.
03. Initial exposure to ambiguous information interferes with accurate perception. The greater the initial ambiguity, and/or the longer you’re exposed to it, the clearer the succeeding information must be before you’re willing to make or change an interpretation.
04. We tend to perceive what we expect to perceive. It takes more unambiguous information to recognize an unexpected outcome than an expected one.
05. Organizational pressures resist changing your view. Management values consistent interpretations, particularly those that promise added value to investors!
Fortunately, some techniques can help us overcome these downsides. We need to list our assumptions and chains of inference. We have to specify sources of uncertainty and quantify risk. Key problems should be examined periodically from the ground up. Alternative points of view should be encouraged and explored.
These techniques, as well as others, are discussed and practiced in our Mitigating Bias, Blindness, and Illusion in E&P Decision-Making course. Please consider joining us for a virtual or in-person offering.
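To make “specify sources of uncertainty and quantify risk” a bit more concrete, here is a minimal Monte Carlo sketch in the spirit of probabilistic prospect assessment. It replaces a single anchored “best estimate” with a distribution of outcomes. All input distributions and values below are illustrative assumptions, not figures from the course:

```python
import random

def simulate_prospect(n=100_000, seed=1):
    """Combine uncertain inputs into a resource distribution rather than
    anchoring on one deterministic estimate. Inputs are hypothetical."""
    rng = random.Random(seed)
    outcomes = []
    for _ in range(n):
        area = rng.lognormvariate(6.0, 0.5)     # productive area, acres (assumed)
        net_pay = rng.lognormvariate(3.0, 0.4)  # net pay, feet (assumed)
        rec_yield = rng.uniform(50, 150)        # recovery, bbl/acre-ft (assumed)
        outcomes.append(area * net_pay * rec_yield / 1e6)  # MMbbl
    outcomes.sort()
    # P90 = 90% chance of at least this volume; P10 = the upside case
    return (outcomes[int(0.10 * n)],
            outcomes[int(0.50 * n)],
            outcomes[int(0.90 * n)])

p90, p50, p10 = simulate_prospect()
print(f"P90 {p90:.2f}  P50 {p50:.2f}  P10 {p10:.2f} MMbbl")
```

Reporting the spread (P90/P50/P10) rather than a single number makes the sources of uncertainty explicit and harder to ignore when expectations start to harden.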
Reference excerpted for this blog: Heuer, Richard J., Jr., 1999, Psychology of Intelligence Analysis, Center for the Study of Intelligence.