Posted on January 6, 2021 by Lisa Ward

by Henry Pettingill and Gary Citron

“It is amazing what you can accomplish if you do not care who gets the credit.”
Harry S. Truman

BEGINNINGS

In 2000, a tipping point was reached: many companies wanted a consistent set of industry-derived best practices for folding amplitude characterization into chance-of-success determination (prospect ‘risking’). In response, Rose & Associates’ founder Pete Rose and his Senior Associate Mike Anderson turned to former Shell and Maxus geophysicist and executive Mike Forrest to weave seismic amplitude anomaly information consistently into the fabric of prospect chance assessment. With the input of others, they decided to form a consortium of companies to capture best practices in a process that quickly evolved to include user-friendly software and a database, which became SAAM (‘Seismic Amplitude Analysis Module’). They reached out to geologist Roger Holeywell, who was already commercializing other risk analysis software for R&A through a partnership with his employer, Marathon Oil Corporation, to serve as SAAM’s programmer.

The DHI Consortium officially began at Dallas Love Field on December 7, 2000, a date that served as the starting flag, with the inaugural meeting of the 13 founding companies following in January 2001. Shortly thereafter, Rocky Roden (Repsol’s Chief Geophysicist and representative during the first Consortium phase, and a thought leader in geophysics) ‘retired’ from Maxus and joined Mike and Roger as the third director of the DHI Consortium.


DHI Consortium group photo, May 2001 in Houston

WHAT IS A DHI?

In many basins with sandstone targets (especially those deposited during the Tertiary Period), the seismic signal associated with that target can be quite strong. Rock density and seismic velocity contrast noticeably between units, and that contrast is amplified by the presence of oil or gas in the pore system, making accumulations appear ‘anomalous’. The most common measure of the strength of the anomaly is the seismic amplitude (amount of signal deflection). These anomalies first became observable in the mid-1960s on relatively low-fold seismic lines and yielded significant quantifiable information as the fold increased.

While interpreting such seismic data, a geophysicist will measure an objective’s amplitude level in comparison to the ‘background’ level surrounding the objective amplitude. Significant amplitude strength above the background is referred to as an ‘anomaly’ or a ‘bright spot’. Mike Forrest is credited as one of the first explorers to recognize the exploration impact of seismic amplitude-bearing prospects when he was a Gulf of Mexico exploration project leader at Shell in the 1960s. The acronym DHI stands for Direct Hydrocarbon Indicator, suggesting that the seismic amplitude (hopefully) results from hydrocarbon charge.

THE CONSORTIUM FOR 20 YEARS

At the very first meeting in January 2001, the Consortium commissioned Roger to program an innovative interrogation process that facilitates a thorough, systematic, and consistent grading of the amplitude anomaly, based purely on observations (as opposed to interpretations). It begins with the geologic setting, a preliminary chance assessment solely from geology (as if no DHI were present), and many key attributes the seismic data are designed to extract. SAAM requires the exploration team to answer questions about AVO classification, seismic data quality, rock property analysis, amplitude character, analogs, and potential pitfalls (also referred to as false positives). In other words, SAAM institutionalizes a process through which explorers address the salient issues, forcing those who would otherwise treat them in a perfunctory fashion to digitally record key information they might later forget or lose. Employing a weighting system, SAAM registers the impact of each scored characteristic to determine a ‘DHI Index’, which in conjunction with the Initial Pg yields a ‘Final Pg’. This ‘Final Pg’ is now calibrated against 354 drilled wells.
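
To make the weighting idea concrete, here is a minimal sketch in Python of a generic weighted-scoring scheme of this kind. The attribute names, weights, and blending formula are all illustrative assumptions; SAAM’s actual questions, weights, and calibration are proprietary and differ.

```python
# A generic weighted-scoring sketch (illustrative only -- not SAAM's
# actual attributes, weights, or calibration).

def dhi_index(scores, weights):
    """Weighted average of attribute scores, each graded 0 (poor) to 1 (excellent)."""
    total_weight = sum(weights.values())
    return sum(scores[k] * weights[k] for k in scores) / total_weight

def final_pg(initial_pg, index, max_uplift=0.35):
    """Blend the geology-only chance (Initial Pg) with the DHI index."""
    if index >= 0.5:  # anomaly supports the prospect: lift Pg toward initial_pg + max_uplift
        return min(1.0, initial_pg + max_uplift * (index - 0.5) * 2)
    return initial_pg * index / 0.5  # anomaly detracts: scale Pg down

weights = {"avo_class": 3, "data_quality": 2, "rock_properties": 2,
           "amplitude_character": 2, "pitfalls": 3}
scores = {"avo_class": 0.8, "data_quality": 0.7, "rock_properties": 0.6,
          "amplitude_character": 0.9, "pitfalls": 0.5}

idx = dhi_index(scores, weights)
print(f"DHI Index: {idx:.2f}, Final Pg: {final_pg(0.30, idx):.2f}")
```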

Initially, each Consortium phase lasted 12 to 18 months. In 2016, Consortium phases were aligned with the calendar year to better sync with client budgeting. Consortium membership is paid through a phase fee, and each new company that joins must purchase a license to SAAM. Companies may license a version of SAAM without joining the Consortium, but that version does not include the powerful analog and calibration database, which at the end of 2020 contained 354 drilled prospects. The Final Pg can then be analyzed in several ways and critically compared to the success rates of similar DHIs, further calibrating the weighting system. SAAM’s burgeoning database is owned by Rose & Associates, with each member company having rights to internal usage.

In accepting AAPG’s highest honor in 2018 (the Sidney Powers award), Mike Forrest commented on the Consortium: “We expected it to last a year or two, and almost 20 years later it’s still going on!” Roger attributes the longevity to the breakthrough thought that meetings should be member-driven, designed around member company prospect presentations.


First European Chapter meeting, October 2007 in London

Can you think of any other venue where you can see drilled prospects, learn how they originated, how they were technically matured, the drilling outcome, and the lessons learned from the journey – all within two hours? Companies benefit from seeing the tools and techniques other companies use in the analysis. Although some companies are unable to share the exact location of certain prospects, all the key attributes are shown and a SAAM file is populated for each prospect in real time. Then the participants are asked for their opinions of prospect quality and, ultimately, whether they would drill it. The company then reveals the drilling results, and all inputs, outputs, and drilling results are added to the SAAM database.

As member participation in Consortium meetings grew, with more younger staff involved, the challenge became how to involve more of the people in the room and avoid domination of the discussion by senior members. Roger answered this question in 2017 by introducing individual wireless keypads (‘clickers’) that quickly and anonymously register, compile, and display the entire group’s answers to a variety of grading questions. This permitted the leaders to ask people to explain their diverse views, highlighting how differing perspectives can lift the discussion to a higher level.

SAAM’s architecture and workflow have evolved continually but have always been based on a collaborative framework. For every prospect shown in a meeting, the Consortium leadership gathered again on a subsequent weekend to review the SAAM file, ensure consistency, and discuss what could be improved in the software based on key observations from the prospect presentation.

Founding Consortium Chair Mike Forrest still consults with the Consortium, which since 2019 has been under the direction of R&A’s Henry S. Pettingill. Henry was a member of the Consortium with Repsol during its inaugural year and, as Director of Exploration at Noble Energy, oversaw that company’s participation from 2002. Roger and Rocky have guided and influenced the technical direction of the Consortium throughout its history. With companies opting in and out through the years, much of the stewardship of SAAM (updating functionality, testing, checking for consistency, and database trends) is left to Roger’s discerning eye. He takes advantage of the crew changes, always on the lookout for the fresh perspective provided by new Consortium member feedback.

The Consortium’s secret sauce is relationships. That starts with the leaders, who have known each other and worked together for over 20 years. Rocky and Henry both worked for Mike in the 1980s and 1990s, at Maxus and Shell respectively; Rocky and Henry then worked together within Repsol/YPF Maxus. All along the way, Roger interacted with Mike, Rocky, and Henry as authors of R&A’s software products.

But there is also a strong bond between many of the members, some of whom have interacted in and out of the Consortium for over 20 years. One of the highlights of Consortium meetings in Europe is the way each host company highlights the unique aspects and special history of its city and culture through a ‘networking dinner’ featuring the local cuisine. These dinners became an instant tradition, strengthening the network by building friendships amongst the industry’s elite DHI practitioners. There have even been occasional field trips to classic locations in Europe and South Africa.

Each year, the Consortium typically holds five meetings in Houston and three in Europe. Since March 2020, due to the COVID-19 pandemic, the meetings have been replaced by monthly webinars. This turned out to be a blessing in disguise, causing a surge in participation and expanding the reach of the Consortium. Whereas most people can attend a meeting only occasionally (and those in remote field offices virtually never), webinar attendance has far exceeded regular meeting attendance, with new participation from field offices in the Netherlands, Oman, Malaysia, Indonesia, and New Zealand.

THE CONSORTIUM TODAY

As we celebrated our 20th anniversary in this month’s webinar, we looked back on the 20 years and some of their accomplishments – too many to list here. In numbers, the SAAM database holds 354 drilled prospects from 30 basins, allowing calibration of assessments of undrilled prospects as well as providing valuable benchmarking data. Over 80 companies have participated, most of which have contributed drilled prospects to the database, and we have 36 member companies this year. But probably the most enduring accomplishment is the heightened prospecting skill and intuition of the participants. The result is the industry’s most comprehensive DHI prospect database, all evaluated using a consistent methodology and peer-reviewed by a roomful of advanced practitioners.


Consortium membership and SAAM database vs. time

Perhaps most remarkable is how the Consortium has evolved with time and technology. For instance, seismic imaging and other advances have allowed for things unthinkable just a few years ago, like imaging amplitudes beneath salt. Computer power and machine learning have allowed analyses like never before. And each year, the Consortium sets goals according to where we are in this evolution.
It all leads us to ask: what will the next 20 years have in store for us? Most of us agree that changes will continue to come in DHIs and associated exploration technologies, making the unimaginable not just imaginable, but even more fun.

Posted on December 9, 2020 by Lisa Ward

by Marc Bond

Following a groundswell of interest generated by a presentation at the 2008 AAPG Annual Convention by Glenn McMaster et al. entitled “Risk Police: Evil Naysayers or Exploration Best Practice?”, several of us (including myself, then at BG Group) thought it would be an excellent idea to organize a workshop to discuss best practices and challenges of exploration assurance. Glenn (then at bp) was great at speaking truth to power and embraced the idea. Hence, he and Gary Citron (at Rose & Associates) convened the first Risk Coordinators Workshop (RCW) on November 18-19, 2008, graciously hosted by bp in Houston. Twenty-eight industry leaders from 18 companies attended. Twelve presentations were given by the attendees, mostly focused on the state of assurance within the presenter’s company – a rare insight at the time. That openness fostered a sharing and collaborative environment, defusing our concern that this would be a “one-off” event. Rather, the enthusiasm and interest generated by the successful workshop encouraged us to continue.

I have particularly enjoyed attending and contributing to the Workshop through the years. The workshops continue, with the goal of sharing common experiences, issues, challenges, and suggested best practices. There is a nominal fee for attendees to cover expenses. The only obligation is to be open and share. We held our 19th workshop in November 2020. Given the pandemic, the RCW was held virtually for the first time and, measured by the commentary and feedback, was a great success.
When I joined Rose & Associates in 2014, I brought in the idea of increasing workshop frequency, and we now meet two to three times a year (in North America and England, and every other year in Asia). We also now include Breakout Sessions to explore relevant assurance themes and provide a Summary Report to capture the outcomes of each workshop.

In 2015, we established the Risk Coordinators network as a natural follow-up to the RCW. The network consists of subsurface assurance experts who are responsible for assuring their companies’ opportunities. It is an informal group that includes over 70 companies (ranging from super-majors to small companies) and over 160 people who are open and passionate about assurance, risk analysis, and prospect assessment. Along with the workshops, we have now been active for over 12 years.

We work with the network on other assurance-related items, such as delivering a periodic Assurance Survey (2015 and 2019). The Survey results are shared with the network to monitor the current state of assurance and provide them with learnings to help improve their own assurance process.

Doug Weaver (Partner, Rose & Associates) and I now manage the network. I would like to personally thank Gary for his support and coaching over the years. If you have any questions about the network or ideas for the next RCW, please contact us.

Stay safe and healthy.

Posted on November 10, 2020 by Lisa Ward

by Doug Weaver

Last time we discussed the need to quantify everything in exploration, using my college glacial mapping project as an example. Let’s move back to the world of oil and gas exploration.

The main takeaway from my first blog is that an engineer’s role in exploration is to quantify. Geoscientists make interpretations of data, and engineers then turn those interpretations into resource and economic assessments. The ultimate goal is to generate an inventory of opportunities that can be high-graded, allowing investment in those that are the most financially worthy. But how do we combine resources, chance of success, costs, and economics to do this? We employ the expected value equation.

(Pc x Vc)-(Pf x Vf) = Expected Value

It’s a very simple equation. Let me describe the terms. Pc is the chance of success, Vc is the value of success. Pf is the chance of failure, Vf is the value (or cost) of failure. When we subtract the two terms we generate an expected value. If the expected value is positive the project is an investment candidate, if it’s negative, we’re gambling. We could still invest in a project with a negative expected value, but likely we’re going to lose money, and we’ll certainly lose if we invest in enough of them.

So let’s assume you’ve just generated a prospect, and you can make some estimate of a few items to describe it. You’ve got a rough idea of a chance of geologic success, maybe from working on a specific trend. You have some notion of size, either from your own volumetric assessment or again trend data. The engineer assisting your team with project evaluations should provide the team with a few key items to help with prospect screening.

  • Threshold sizes – how big do prospects need to be to be commercial?
  • NPV/bbl (or Mcf) – what is the NPV/bbl for fields of various sizes? We’ll use this to transform barrels into dollars.
  • Dry Hole cost – what is the dry hole cost for an exploration failure in the trend? (Might want to get depth specific here)

Back to the equation. First, the success case. Notice that both P (chance) and V (value) in the success case have the subscript c, meaning commercial. What we’re looking for is the commercial chance and commercial value, not their geologic counterparts. If you have done a formal resource assessment, this conversion is easy: you just determine where the threshold volume intersects the resource distribution. In the example below, if the threshold is 40 mmbo, it intersects the resource distribution at the 75th percentile. If the prospect has a geologic chance of success of 30%, the commercial chance of success is simply 30% x 75%, or 22.5%. (For anyone not familiar with the convention, 40 mmbo means 40 million barrels of oil.)

The commercial volume would be determined by the resource that exists between the threshold volume and the maximum volume, or between 40 mmbo and 76 mmbo. There are better ways to determine this, but for now let’s just use an average value of 58 mmbo.
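
For readers who want to experiment, here is a minimal sketch in Python of both steps above, assuming a lognormal resource distribution. The P90/P10 inputs are hypothetical values chosen to roughly echo the numbers in the text.

```python
# Commercial chance and commercial volume from an assumed lognormal
# resource distribution (hypothetical inputs, exceedance convention).
import numpy as np
from scipy import stats

p90, p10 = 33.0, 76.0  # mmbo: volumes exceeded 90% and 10% of the time
mu = (np.log(p90) + np.log(p10)) / 2.0               # log-space median
sigma = (np.log(p10) - np.log(p90)) / (2 * 1.2816)   # 1.2816 = z-score of the 90th percentile
dist = stats.lognorm(s=sigma, scale=np.exp(mu))

threshold = 40.0               # mmbo, commercial threshold
p_exceed = dist.sf(threshold)  # chance the success-case volume clears the threshold
pg = 0.30                      # geologic chance of success
pc = pg * p_exceed             # commercial chance of success

# Commercial volume: mean of the success-case outcomes above the threshold
samples = dist.rvs(size=200_000, random_state=7)
vc = samples[samples > threshold].mean()

print(f"P(exceed {threshold:.0f} mmbo) = {p_exceed:.2f}")   # ~0.75
print(f"Pc = {pc:.3f}, commercial volume = {vc:.0f} mmbo")  # ~0.23 and ~59
```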

Now you may ask, especially for screening, what if I don’t have this resource distribution? What if I’ve just made a quick deterministic volume estimate multiplying prospect area times a guess at thickness times a likely recovery yield (Bbl/ac-ft)? Can I still estimate the expected value? Sure, just try to apply the process described above as best you can. If the threshold is 40 mmbo and you calculate a resource of 300 mmbo, adjustments to geologic chance and volume will be minimal when considering their commercial values. If you calculate a volume of 45 mmbo, I might not try to estimate commercial values, but you already know the prospect is likely challenged commercially.

Now that we have an estimate of volume and chance, we need to convert our volume to value. The simplest way to do this is with a metric called NPV/bbl. The engineer assisting your team has likely evaluated many fields of various sizes in his evaluation efforts. Your group has probably generated other prospects in the trend, evaluated joint venture opportunities, and maybe even had a few discoveries.

For each of these opportunities, the engineer has had to estimate the success-case value, or NPV (Net Present Value), for a given field volume, usually at the mean Expected Ultimate Resource (EUR). The NPV accounts for the time value of money at your company’s specific discount rate. A typical discount rate is 10%, resulting in what is referred to as an NPV10. The NPV calculation accounts for all production (therefore revenue) and all costs and expenses over the life of the field, including the costs of completing the discovery well and drilling and completing appraisal wells, and reduces them to a single value. When this value is divided by the volume associated with the evaluation, we generate the metric NPV/bbl, in dollars per barrel. Given that these types of evaluations have been generated for several opportunities within a play, we can get a pretty good idea of how NPV/bbl changes with field size.
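
As a quick illustration of the arithmetic, here is a minimal sketch: a hypothetical cash-flow stream discounted at 10% and divided by the mean EUR. The cash flows are invented for illustration and happen to land near the $2.00/bbl used in the worked example below.

```python
# NPV10 and NPV/bbl from a hypothetical field cash-flow stream.
annual_cash_flow = [-140, 70, 85, 70, 50, 35, 25, 15]  # $mm; year 0 is development capital
discount_rate = 0.10                                   # 10% -> "NPV10"

npv10 = sum(cf / (1 + discount_rate) ** yr
            for yr, cf in enumerate(annual_cash_flow))

mean_eur = 58.0  # mmbo, the mean field volume from the evaluation
print(f"NPV10 = ${npv10:.0f}mm, NPV/bbl = ${npv10 / mean_eur:.2f}/bbl")
```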

Note that for a given play in a given country, NPV/bbl often doesn’t change dramatically. If you’ve only got a few field evaluations at your disposal, the engineer should still be able to provide a usable NPV/bbl. Better yet, embrace the uncertainty and test your prospect over a range of values. Finally, to determine Vc, I simply multiply my mean EUR volume by my NPV/bbl.

The failure values are much easier to determine. Pf, the chance of failure, is simply 1 - Pc. For conventional exploration opportunities, Vf, the value (cost) of failure, is usually just the dry hole cost. Most explorationists working on a trend have a pretty good idea of that cost; if not, ask a drilling engineer. For the expected value equation, you should input an after-tax dry hole cost. Obviously, the tax rate will change from country to country; for the US, the after-tax dry hole cost is about 70% of the actual cost.

Now we have all the pieces we need to generate the expected value. Let’s start with the plot earlier in this discussion and do that.

We have:

A commercial success volume of 58 mmbo

A commercial success chance of 22.5%

A failure chance of 77.5%

Let’s also assume an NPV/bbl of $2.00 and a dry hole cost of $20mm.

A couple of preliminary calculations:

Value of success = 58mmbo x $2.00/bbl = $116mm

Cost of failure = $20mm x 0.7(tax) = $14mm

Here’s our equation:

(Pc x Vc)-(Pf x Vf) = Expected Value

Plugging in values:

(22.5% x $116mm)-(77.5% x $14mm) = Expected Value

$15,250,000 = Expected Value
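
For anyone who wants to script it, here is the same arithmetic in a few lines of Python; all inputs come straight from the example above.

```python
def expected_value(pc, vc, vf):
    """(Pc x Vc) - (Pf x Vf), with Pf = 1 - Pc; all values in $mm."""
    return pc * vc - (1 - pc) * vf

vc = 58 * 2.00   # $mm: 58 mmbo at $2.00/bbl of NPV
vf = 20 * 0.70   # $mm: $20mm dry hole cost, after tax
print(f"Expected Value = ${expected_value(0.225, vc, vf):.2f}mm")  # $15.25mm
```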

Is this good? Yes, we’ve generated a positive value. Remember, if it’s negative we could still pursue the project, but then we’re not investing, we’re gambling. The key is that we need to perform this analysis on all our projects, look at our available funds, and invest in the best. That’s portfolio analysis, the topic of a later discussion.

The point of this blog was simply to walk you through the process and to encourage prospect generators to apply it to their opportunities as early as practical, even if it’s a “back of the envelope” calculation. Beyond chance and volume, all you need is a few values from your engineer. You’ll be able to use this tool to judge whether the prospect you’re working on is likely to be pursued or not. It may also give some insight into what can be improved. For example, if you generate a low (or negative) expected value, are there areas for improvement in chance or volume? If not, maybe it’s time to move on to the next one.

Posted on September 30, 2020 by Lisa Ward

by Marc Bond, Senior Associate

COGNITIVE BIAS

Success and value creation in the oil and gas industry have not been particularly good, as evidenced by the sector’s relative performance. This is widely recognized both in the global markets and within the industry itself. There are certainly many reasons that may explain why the oil and gas industry has not done well over the years, and one explanation that has gained a lot of traction in recent years is the concept of cognitive bias.

Cognitive biases are predictable, consistent, and repeatable mental errors in our thinking and processing of information that can, and often do, lead to illogical or irrational judgments or decisions.

Surprisingly, the notion of cognitive bias has not been around very long. It was first proposed by Amos Tversky and Daniel Kahneman in a 1974 article in Science (Tversky and Kahneman, 1974). Since then, there have been numerous publications and research studies on the various cognitive biases and how they impact our judgments and decisions.

The book that introduced the general public to cognitive biases and their influence on our decisions was the seminal publication by Nobel Prize-winning psychologist Daniel Kahneman, Thinking, Fast and Slow (Kahneman, 2011). It is interesting to note that Dr. Kahneman won the Nobel Prize in 2002 not in Psychology but in Economics. Why? Because traditional economic theory assumes that we are rational creatures when we make decisions or choices, and yet research and observation continually show that we are not.

There are many different cognitive biases (see Wikipedia), but there are a few that play a significant role within the oil and gas industry. These biases can act individually or in combination, leading us to poor judgments and decisions.

HOW BIASES MAY BE REPRESENTED IN THE OIL & GAS INDUSTRY

For example, imagine an exploration team assessing a prospective area that is available for license bids. In the analysis of the data, the focus is on a very productive analog to describe the play. A successful well has recently been drilled nearby; although it is acknowledged to be in a different play, the team is very excited about the hydrocarbon potential of their new play.

Some existing, older wells suggest that the new play may not work, but the technical team feels that those wells were either too old or poorly completed, and hence dismisses them as valid data points. Given the uncertainty, the prospects and leads developed should have a very wide range of resource potential. However, given the team’s confidence in the seismic amplitudes, the range of GRVs estimated is quite narrow.

The team is also optimistic about the play potential and presents the opportunity to management in very favorable terms. If the company were to bid on and be awarded the license by the government, the team would be quite excited; and of course, success is often rewarded. The company ended up bidding on the license with a commitment of several firm wells. Upon further data collection and analysis, a new team re-assessed the hydrocarbon potential, which is now believed to be limited – and yet there is still a large commitment to fulfill.

What happened to cause this result? Was the original team overconfident in their expectations? Did they think that because they understood their commercial analog, they understood the prospective area? Were they too focused on the nearby successes? Was the data that was dismissed highly relevant? Were other alternatives and models not considered, which might have suggested that the resource size could be small?

Although the above narrative may appear contrived, and one’s reaction to the scenario would probably be “I would never do that”, each of the justifications and decisions made is possible, and all of them are rooted in forms of cognitive bias. You have likely recognized all or part of the scenario from your own experience. Further, these biases can work together in a complementary fashion, reinforcing the biased assessment and making one “blind” to other possibilities.

Cognitive biases and their negative impact do not present themselves only during the exploration phase. Numerous similar real-world scenarios are observed in appraisal, development, production, and project planning.

STRATEGIES TO MANAGE

The bottom line is that these cognitive errors lead to poor decisions about what work to undertake, which issues to focus on, and whether to forge ahead or exit a project. This makes it important to identify them and lessen their impact. Unfortunately, awareness alone is not sufficient. These biases are inherent in our judgments and decision-making and serve the purpose of helping us make rapid judgments based on intuition and experience. In our everyday lives, they generally work well. Unfortunately, in complex and uncertain environments such as the oil and gas industry, they can lead us to poor choices.

Hence, it is important first to understand what the biases are, why they occur, and how they can influence our assessments. This helps us identify when our own, or our colleagues’, judgments, assessments, and decisions may be affected by these cognitive biases. We then need to learn mitigation strategies. Given that these cognitive biases are normal and serve a purpose, the goal cannot be to remove them but rather to recognize them and then apply mitigation strategies to lessen their impact.

As noted above, there has been a lot of research on the biases, yet little has been published on actual, practical mitigation strategies. Hence, to help our industry, my colleague Creties Jenkins and I have developed a course entitled Mitigating Bias, Blindness, and Illusion in E&P Decision Making, where we go into further detail regarding these vitally important mitigation strategies. We use petroleum industry case studies and real-world mitigation exercises to reinforce recognition of the biases. Finally, we show how to employ the mitigations to ensure any assessments or decisions are as unbiased as possible.

REFERENCES

Kahneman, Daniel, 2011, Thinking, Fast and Slow, Penguin Books, 499p.

Tversky, Amos and Kahneman, Daniel, 1974, Judgment Under Uncertainty: Heuristics and Biases, Science, vol. 185, no. 4157, pp. 1124-1131.

Wikipedia, List of Cognitive Biases, https://en.wikipedia.org/wiki/List_of_cognitive_biases

Posted on September 2, 2020 by Lisa Ward

I’m Doug Weaver, and I’m a partner with Rose and Associates residing in Houston, Texas. I joined Rose a little over three years ago after retiring from a 39-year career with Chevron. I’ve spent well over half of my career in exploration as a petroleum engineer.

I’m often asked, “Why would an engineer be so interested in exploration?” There are many reasons, but let me pose one of my usual responses: “If you think it’s difficult to generate resource estimates with all the data you’d want – try doing it with none.”

I hope to continue this blog well into the future and get into some of the services engineers provide for exploration teams. But in this first session, let me convey an observation on a topic that will be pervasive in future notes – Engineers and Geoscientists approach problems differently.

As I was scheduling my final semester of undergrad, I met with my advisor to get his feedback on one last technical course. Though my major was geotechnical engineering, I was a bit surprised when he suggested an advanced course in geology. Since my advisor was one of the top geotechnical engineers in the world, I took his advice and enrolled in Geomorphology. The class consisted of about twenty geologists – and me. A good background for a future engineer in exploration!

All my engineering, math, and science classes had followed a very familiar cadence. Three hourly exams and a final. No reading, no reports, just understanding equations and concepts and solving problems with that knowledge on a test. Solve problems with math.

In the geomorphology class, we were posed with the problem of figuring out where a glacier had stopped and created a moraine. We collected data in the field. We then went back to the lab, plotting and interpreting this data. To my surprise, I was able to plot the exact location where the glacier had stopped. No formulas, just data collection and interpretation.

I’m fairly sure that Professor Hendron not only intended for me to learn about geomorphology but also to give me the experience of this alternate approach to solving a problem.

From what I’ve observed, this typifies the way most engineers and geologists solve problems (of course, I’m typecasting us all). Engineers start with a systematic workflow leading to a precise answer, while geoscientists use a more fluid, interpretive approach. Which leads us to the best answer? Both methods – when used together. The issues we face in exploration will certainly not allow the precise answer an engineer would normally want. In exploration, engineers need to embrace the uncertainty present in every aspect of their calculations. But, at some point, we need to quantify our analysis. We can’t make effective decisions if we can’t quantify and rank the investment options for our companies. And that becomes the primary role of the engineer in exploration – to quantify opportunities.

Back to our glacial moraine. Suppose I’m a Midwestern gravel company looking for mining opportunities. It’s great that I’ve identified my moraine and a potential quarry, but what does that imply from an investment perspective? How does this deposit compare to others I might exploit? What’s the quality of the sand and gravel within the deposit? Are others more accessible?

Switching hats from geologist to engineer, my task is now to answer these questions. I now understand that I will never know the exact size of the deposit, as it is uncertain. I’ll have to rely on samples collected to build a representation of the nature of the deposit, realizing the samples reflect a tiny portion of the total moraine. This data will inform me about the range of possible sizes of this deposit. I’ll want to investigate other deposits in the area to support the analysis of the samples I’ve collected in my own deposit and investigate how they were developed to get some idea of how to best evaluate the costs and timing of the extraction process. Finally, I somehow have to transform my moraine map and all these answers into a range of economic metrics, primarily Net Present Value, or if risk is present, Expected Value.

That’s where we’ll pick up next time, interrogating the Expected Value equation. Thanks for reading!

Posted on June 15, 2017 by Lisa Ward

Rose & Associates Success Plan

Suppose I said to you “Sue’s got a bug”. Quickly now…what do you think Sue has? If you’re a programmer, you probably think Sue has a computer virus. But if you’re a doctor, perhaps the flu comes to mind. And if you’re an entomologist, a ladybug may be your first thought. Did you consider all three as possible outcomes? Probably not. And what about some others? If you’re a spy, you might think Sue found a listening device. If you sell cars, you might think Sue bought a Volkswagen Beetle. The list goes on and on.

So why didn’t all of these come to your mind? First, I asked you to respond quickly, which reduced the time you spent thinking about it. Second, you based your response on your intuition, instinct, or experience. You responded reflexively. This is inherently how we make most decisions every day. Do you know how much fat and how many calories are in that Sausage McMuffin you ordered? Did you review the economic fundamentals before acting on a friend’s stock tip? Did you read the TripAdvisor reviews that mentioned bedbugs in the hotel you booked? The answer to all of these is probably “no”. We have neither the time nor the stamina to properly frame each decision in terms of uncertainty and risk.

The same is true in our working lives. However, the difference is that we’re paid to make good decisions in our jobs, and those decisions often involve millions of shareholder dollars. In these situations, we can’t afford to think reflexively. Instead, we need to think reflectively, which requires deliberate time and effort.

There are multiple tools to help us approach oil and gas decision analysis reflectively, including:

  • A staged approach, which focuses on determining what project stage you’re in, the key risks and uncertainties associated with that stage, and what data gathering and analyses you want to undertake to make a good decision about whether to move to the next stage.
  • Probabilistic thinking, which requires that we quantify the range of possible outcomes and assign a degree of confidence to any given outcome (a brief sketch follows this list). This is much better than providing a single deterministic value as the most likely case, because that value is rarely (if ever) the actual outcome.
  • An assurance process, which provides independent and consistent guidance in the assessment of opportunities. This commonly involves subject matter experts in peer assistance and/or peer reviews.
  • Asking the right questions, which means decision-makers need to probe (1) the work used to justify the recommendation, (2) whether the base case could be pessimistic or optimistic, and (3) whether credible alternatives were considered.
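
To give a flavor of probabilistic thinking in practice, here is a minimal sketch; the lognormal parameters are illustrative assumptions, not data from any real prospect.

```python
# Carry a distribution of outcomes instead of a single "most likely" value.
import numpy as np

rng = np.random.default_rng(1)
volumes = rng.lognormal(mean=np.log(50), sigma=0.5, size=100_000)  # prospect resource, mmbo

# Exceedance convention: P90 is the 10th percentile, P10 the 90th
p90, p50, p10 = np.percentile(volumes, [10, 50, 90])
print(f"P90 = {p90:.0f}, P50 = {p50:.0f}, P10 = {p10:.0f} mmbo")
print(f"Confidence of exceeding 80 mmbo: {(volumes > 80).mean():.0%}")
```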

This sounds straightforward enough, but companies struggle to implement and apply these processes to their decision-making consistently. New management teams want to reorganize the way things are done. Staff turnover erodes the memory of what worked and what didn’t. Teams have turf to defend and walls to build. All of these contribute to lapsing into reflexive thinking.

“So what”, you say. “Let’s be bold and use our gut to guide us”. Could this be a successful strategy? Occasionally it does work, which provides memorable wildcatter stories (consider Dad Joiner). But given that oil and gas companies are in the repeated trials business, you’ll eventually succumb to the law of averages. For example, if we look at shale plays in the U.S., only about 20% of these have been commercially successful. You might get lucky by drilling a series of early horizontal wells in a shale play, but it’s more likely that you’ll squander millions of dollars you didn’t need to spend to realize that the play doesn’t work. In this sense, we’re like Alaskan bush pilots. There are old bush pilots and bold bush pilots. There are no old and bold bush pilots. If you want longevity, you need discipline.

Recently, we’ve begun to understand more about how people make decisions with their gut. It turns out that these reflexive decisions are very likely to be affected by cognitive bias. These are errors in thinking whereby interpretations and judgments are drawn in an illogical fashion. Some definitions and examples of this cognitive bias in the oil and gas industry are listed below:

  • Anchoring: attaching an evaluation to a reference value. Example: focusing on one geological model or a favored seismic interpretation.
  • Availability: overestimating the likelihood of more memorable events. Example: the recent well drilled by an offset operator with a huge initial production rate.
  • Confirmation: interpreting data in a way that confirms our beliefs. Example: collecting data in the most prospective area and extending this interpretation elsewhere.
  • Framing: reacting to a particular choice depending on how it is presented. Example: only comparing your opportunity to successful analogs.
  • Information: having a distorted assessment of information and its significance. Example: equating missing or low-quality data with a low or high chance of success.
  • Overconfidence: overestimating the accuracy of one’s own interpretation or ability. Example: generating a narrow range of resource estimates.
  • Motivational: taking actions or decisions based on a desire for a particular outcome. Example: Overstating the chance of success or size of the prize to get a project funded.

So if you’re going to make decisions “with your gut”, at least realize the types of cognitive bias that could impact your decisions, and take some steps to lessen their impact on your exploration risk analysis, resource play evaluation, or production type curve generation.

With this in mind, we’ve come up with a new 2-day course at Rose and Associates called “Mitigating Bias, Blindness, and Illusion in E&P Decision-Making”. This course, in concert with our portfolio of courses, consulting, and software designed to help you think more reflectively about your project, is aimed at helping you make better decisions. Check out our offerings.

~ Creties Jenkins, P.E., P.G., Partner – Rose and Associates

Posted on February 9, 2017 

Improving Decision-Making With Limited Data

Professionals routinely face the challenge of making informed decisions with limited data sets. Our exploitation of unconventional resource plays has exacerbated the problem. We commonly refer to these resource plays as “statistical plays,” as large programs have provided repeatable year-over-year results. Decision-making that relies on limited data sets has been driven by competitive pressures and the desire to get to the right answer as soon as possible. Development decisions are often made without due consideration of how representative the data are. Similarly, we frequently test new technologies with limited samples, expecting that a simple arithmetic comparison of the average results can validate or refute their further application. This talk presents the theory and use of aggregation curves as a pragmatic, graphical approach to determining the uncertainty in the sampled mean relative to the desired average program outcome.
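
To give a feel for the underlying idea (a sketch of the concept, not Gouveia’s actual formulation), the Monte Carlo snippet below shows how the spread in a program’s average result narrows as the number of wells grows; the per-well EUR distribution is an illustrative assumption.

```python
# How uncertainty in the sampled mean shrinks with program size.
import numpy as np

rng = np.random.default_rng(42)
trials = 20_000  # simulated programs per program size

for n_wells in (5, 20, 100):
    # Each row is one program of n_wells; take each program's average EUR
    means = rng.lognormal(np.log(0.5), 1.0, (trials, n_wells)).mean(axis=1)
    p90, p10 = np.percentile(means, [10, 90])
    print(f"{n_wells:3d} wells: P90-P10 range of the program mean = "
          f"{p90:.2f}-{p10:.2f} Bcf/well")
```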

James Gouveia, a partner in Rose & Associates, is a professional engineer with a diverse technical, business, and operations background. He has worked in a variety of technical and managerial assignments in exploration, reservoir engineering, strategic and business process planning, and portfolio and risk management. Gouveia served as an assurance champion and asset manager for BP and previously as director of risk management at Amoco Energy Group of North America.

Source: SPE DL 2017 Short Bios and Abstracts

Posted on August 5, 2016

Watch the AAPG presentation “Why Bother? (With Play Based Exploration)”, co-authored by R&A Associate Jeff Brown and Ian Longley.

Posted on October 14, 2015 

It’s no secret that the oil and gas industry has been going through major shifts these past few months. Every day, big changes in industry trends become more and more likely. But just because there is room for shifts does not mean that you or your company must be subject to the unknown. Here at Rose & Associates, we offer several software products that make forecasting your business’s future easier than ever.

One of the best software options we have to offer is our Multi-Method Risk Analysis (MMRA) software, designed for prospect and zone evaluation – a critical consideration when plotting your next move in oil and gas exploration. Your chances of success increase when you use risk analysis of any sort, because you reduce the likelihood of encountering issues you might otherwise have been able to avoid. This software in particular weighs many factors to help plot and determine your next move. With an easy-to-use, familiar Excel-based interface, learning the software takes little time, allowing you to maximize your potential earning power.

Because no two business plans and no two companies are alike, the MMRA software was designed with flexibility in mind. Multiple options are available, including:

  • Volumetric Methods: This option includes area, average net pay, and recovery yield; a second variant uses gross rock volume, net/gross, and recovery yield. (A rough probabilistic sketch of the first variant follows this list.)
  • Area vs. Depth: This is also known as the Gross Rock Volume vs. Depth. This option uses workstation-generated top reservoir area-depth pair values and a definition of the primary “shaping” variable for irregularly shaped traps or fixed fluid contacts.
  • Scenario-based: This is based on interpretations to explore multiple working hypotheses.
  • There is also an option for Resource Forecasts.
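
Here is a rough probabilistic sketch of the first volumetric method (area x average net pay x recovery yield). The distributions are hypothetical placeholders; MMRA’s actual inputs and engine differ.

```python
# Monte Carlo volumetrics: resource = area x net pay x recovery yield.
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

area = rng.lognormal(np.log(2000), 0.4, n)       # productive area, acres
net_pay = rng.lognormal(np.log(50), 0.3, n)      # average net pay, ft
rec_yield = rng.lognormal(np.log(120), 0.35, n)  # recovery yield, bbl/ac-ft

resource = area * net_pay * rec_yield / 1e6      # prospect resource, mmbo
p90, p50, p10 = np.percentile(resource, [10, 50, 90])
print(f"P90/P50/P10 resource: {p90:.0f}/{p50:.0f}/{p10:.0f} mmbo")
```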

Implementing even just one of these options gives you not only a statistically better chance of success, but also peace of mind in knowing that your decisions are educated, well thought out, and made with success in mind. MMRA was built by geoscientists for geoscientists, so clients can feel secure in their estimations.

Posted on April 27, 2015 

The oil and gas industry remains the primary source of the world’s energy despite efforts to enhance the viability and acceptability of alternative sources. The growth potential for this industry is stable provided oil and gas risk analysis is deployed at strategic phases. The challenges faced by this sector span various aspects, including financial, strategic, operational, and regulatory compliance.

Risks Faced by the Oil and Gas Industry

Financial Risks
Price volatility has been a major concern for the sector, but the urgency of this issue has been heightened with increasing costs of extraction and the frequency of political events that affect oil prices. For the most part, the industry favors extraction locations where the political system is stable since a change in leadership may lead to different regulations that directly affect operations.

Strategic Risks
While competition from alternative energy sources and new technologies remains limited, the oil and gas industry has to contend with fluctuations in demand. Politics may also add to strategic challenges. Access to reserves, risk of nationalization, and a shift in the regulatory climate can be costly for the industry.

Operational Risks
Oil and gas experts are involved in frequent testing to ensure that estimates of accessible reserves approximate actual values, but geological risk also includes challenges with extraction, cost containment issues, and ensuring safe conditions as drilling has moved to less hospitable environments.

Compliance Issues
Regulatory compliance has exacerbated operational and financial challenges. As safety regulations and environmental guidelines are tightened, the oil and gas sector is pressured to add substantial investments to ensure compliance.

Quantitative Oil and Gas Risk Assessment

The significant risks faced by the oil and gas industry, coupled with the massive investments required to sustain operations, have driven the need for leading-edge methodologies to evaluate projects and measure risks. Mitigation strategies are most effective when oil and gas risk assessment involves an in-depth study of the risks, including their detailed identification and quantitative evaluation, to optimize investment returns.

DCF or Discounted Cash Flow, Sensitivity, and Scenario Analyses
The Discounted Cash Flow method compares the targeted rate of return, or hurdle rate, to the estimated net present value of the project’s cash flow. The DCF method is widely used in the industry: it provides a sound approach to accounting for the time value of financial investments and a clear baseline for critical decision-making. The method does come with a few inherent issues, including the assumptions that cash flow is static and that the discount rate sufficiently accounts for project risks, as well as an inadequate assessment of risk mitigation efforts.

The application of sensitivity analysis and scenario analysis methodologies may address these shortcomings of the DCF method. Evaluating for uncertainty generates a range of values for the project’s metrics although the output may not adequately describe the range of possible outcomes for the project.
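
As a minimal sketch of how scenario analysis layers onto a DCF, the snippet below values the same hypothetical project under three price scenarios instead of one static forecast; all figures are invented for illustration.

```python
# DCF with simple price scenarios (all inputs hypothetical).
def npv(rate, flows):
    """Discount a yearly cash-flow stream (year 0 first) at the given rate."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(flows))

production = [1.2, 1.5, 1.3, 1.0, 0.8, 0.6]  # mmbo produced in years 1-6
capex = 110                                  # $mm, spent in year 0
netback = 0.55                               # fraction of revenue kept after costs and taxes

for scenario, price in [("low", 40), ("base", 60), ("high", 80)]:  # $/bbl
    flows = [-capex] + [q * price * netback for q in production]
    print(f"{scenario:>4} price ${price}/bbl: NPV10 = ${npv(0.10, flows):.0f}mm")
```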

Quantitative Risk Analysis
Quantitative risk analysis takes each input and defines a set of characteristics that describe its probability distribution. These characteristics may include minimum and maximum values, expected values, standard deviations, and percentiles. Valuation models then combine the distributions to generate a relevant description of possible outcomes.
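
Here is a minimal sketch of that workflow, pushing two assumed input distributions through a deliberately simple valuation model; a real model would be far richer and would honor correlations between inputs.

```python
# Propagate input distributions through a toy valuation model.
import numpy as np

rng = np.random.default_rng(3)
n = 100_000

volume = rng.lognormal(np.log(50), 0.45, n)  # recoverable volume, mmbo
npv_per_bbl = rng.normal(2.0, 0.6, n)        # value metric, $/bbl

npv = volume * npv_per_bbl                   # success-case NPV, $mm
p90, p50, p10 = np.percentile(npv, [10, 50, 90])
print(f"Success-case NPV P90/P50/P10: ${p90:.0f} / ${p50:.0f} / ${p10:.0f}mm")
```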

Advantages of Risk Analysis

Quantitative oil and gas risk assessment provides for a broader and more in-depth accounting for uncertainties in project outcomes. The qualitative portion of the analysis identifies underlying factors that enhance risks. The ability to evaluate critical risk factors for oil and gas projects is crucial to optimizing outcomes and planning for effective and cost-efficient risk mitigation programs.

Substantial investments are required for oil and gas exploration and production projects. The attendant risks are considerable, especially given increased regulation and vulnerability to political developments, which are among the wide-ranging factors affecting this sector.

Regardless of the size of the project or the outfit, operators may benefit from experts who specialize in petroleum economics consulting. Professionals with the experience and skill set to audit project risks, generate risk assessment surveys, and present mitigation strategies based on possible outcomes will certainly provide industry-relevant parameters for managing oil and gas projects.