Posted on June 11, 2021 by Lisa Ward

by Gary Citron, Senior Associate

“It is amazing what you can accomplish if you do not care who gets the credit.” –Harry Truman

In any successful business, some individuals make significant contributions but remain out of the spotlight. As R&A’s risk analysis software company transitions to a new generation of products, we cannot move forward without recognizing the immense impact R&A Senior Associate Roger Holeywell had as our chief programmer during our MS Excel-based era (1997 to 2020).

HOW IT BEGAN

Looking for topical yet practical training around 1994, Marathon’s Roger Holeywell attended the AAPG prospect risk analysis class taught by Pete Rose, Ed Capen, and Bob Clapp. After completing this classic course, Roger had an epiphany. Converting the concepts and formulae to MS Excel over the next few weeks, he created what became Marathon’s first standardized prospect characterization software and convinced Marathon to distribute it widely. By 1996 Marathon had a standard, consistent package for its prospects.

CREATING A SOFTWARE BUSINESS

By 1995, risk analysis concepts had taken root in many larger companies, which built their own software packages. The Amerada-Hess Exploration VP phoned Pete to ask if anyone had built software to apply the concepts taught. Pete approached Mobil, Conoco, and Marathon with that request. In short order, Mobil and Conoco said no, but Roger, after checking with Marathon management, replied, “Sure, Marathon will license theirs.” Within a couple of weeks, Marathon received $10,000 from Hess, and Hess was a happy client with a new software tool. A couple of months later, Roger called Pete to inform him that Marathon no longer wanted to sell software directly but was willing to partner with Pete because of his extensive contacts. A 1997 contract between Marathon and Pete’s LLC, Telegraph Exploration, provided for a 50-50 revenue split, with Marathon retaining ownership of the code and Telegraph handling all business matters. To help manage the growth, Pete asked Roger to run the new software company, Lognormal Solutions (LSi), owned by the newly formed Rose & Associates (R&A) in 2000. Roger declined that offer, as he wanted to focus on progressing his career at Marathon.

However, in what became a win-win situation, Roger expressed interest in continuing to raise the profile of his progeny. By 2001 Roger received written approval from Marathon to work for LSi, further progressing the software, but on his own time (nights, weekends, and selected vacation days). By January 2005, Marathon executed an addendum to the 1997 agreement relinquishing all ownership rights of the software to LSi. Roger’s retirement from Marathon in 2015 allowed him to become a full-time Senior Associate and programmer at R&A.

In addition to the prospect software, Roger also coded the original versions of multiple-zone aggregation software. These products evolved into Multi-Zone Master (MZM) and Multi-Method Risk Analysis (MMRA). MMRA and Multi-Zone Master would be bundled with a versatile utility program Roger created (Toolbox) to gather, condition, and analyze data to fashion inputs for MMRA. Toolbox features myriad curve-fitting capabilities and calculates hydrocarbon fluid expansion and shrinkage attributes.

R&A’s software was particularly attractive to companies of mid-size market capitalization. To serve smaller companies that wanted a consistent characterization platform to demonstrate their savvy, Roger built the Essentials Suite in 2004: prospect software with two-zone aggregation capability, limited data plotting, and a portfolio aggregator. Its much lower price point and basic capabilities serve these smaller companies admirably.

How did all this work get done, essentially by one person who was also a full-time Marathon employee? The answer is PWP (Pajama Weekend Programming): at home over the weekends, Roger’s attire was strictly pajamas-only. The design and programming sessions were hardly a picnic, and Roger shared some of the major challenges they tackled. For example, how do you keep the software working optimally after Microsoft releases versions of Excel that conflict with the code and create pervasive security or performance issues? One of Roger’s most gratifying programming moments came in 2005, when he harnessed Microsoft’s VBA to provide an internal Monte Carlo simulator.

PLAY IT AGAIN, SAAM

Incredibly, the energy described above that Roger infused into R&A software constituted only about 50% of his moonlighting time. The remaining 50% Roger spent coding SAAM (Seismic Amplitude Analysis Module), the software product generated by R&A’s DHI Consortium. In mid-2000, when many of our clients wanted to see a consistent set of industry-derived best practices around amplitude characterization for chance of success determination (commonly part of ‘risking’), Pete Rose turned to Mike Forrest to weave seismic amplitude anomalies into the fabric of prospect chance characterization. We planned from the start to have Consortium best practices coded into software, so we reached out to Roger to serve as its programmer.

For SAAM, Roger programmed an innovative interrogation process that facilitates a systematic and uniform grading of the amplitude anomaly, beginning with the geologic setting, a preliminary chance assessment solely from geology (Pg), and the salient amplitude attributes the seismic survey is designed to extract. SAAM asks the exploration team to answer questions about AVO classification, seismic data quality, rock property analysis, analogs, and potential pitfalls. Thus, SAAM institutionalizes a thorough process they might otherwise avoid or forget. Through the Consortium-derived parameter weighting system, SAAM registers the impact of data quality and seismic observations, as well as any rock properties modeling, to determine a modifier to the initial Pg. This modification, or ‘Pg uplift’, is now calibrated by over 350 drilled wells. Success rates recorded in SAAM’s database can be critically compared to the forecast success rates to further calibrate the weighting system employed. The Consortium remains a vital industry gathering. Roger attributes its longevity to the breakthrough thought that meetings should be member-driven, designed around presentations about a prospect by a member company. During Consortium meetings Roger populates a SAAM file for each prospect in real time during the presentation, and then the members discuss whether they would drill the prospect, guided by the SAAM inputs and outputs. Finally, the company reveals the drilling results. All inputs, outputs, and results are added to the database. SAAM’s architecture and workflow were based on a collaborative framework from the very beginning.

THANK YOU, ROGER

It’s hard to fathom the magnitude of such varied software-related contributions built solely in his ‘spare time,’ so for all he has done, here’s a toast to Roger Holeywell, an unsung hero working behind the scenes creating value from risk analysis.

Posted on May 4, 2021 by Lisa Ward

Abridged from a presentation by David Cook and Mark Schneider.
View the original poster
Read the full paper

Rose & Associates introduced the Pwell concept at the AAPG 2017 convention in Houston and released the methodology in our Multi-Method Resource Assessment (MMRA) program in early 2018. This method is now available in RoseRA, our current prospect risk analysis software. The Pwell function in RoseRA provides insights into the balance between the chance of success and the potential resources when a well is drilled at a location downdip from the crest of the structure.

Pwell helps users address key issues related to:

Choosing the best downdip location, so that your discoveries have a higher probability of an Estimated Ultimate Recovery (EUR) exceeding the Minimum Commercial Field Size (MCFS)

Understanding the impact on the chance of geologic and commercial success

Ensuring that, in the event of a dry hole, there will be “no regrets” about potential up-dip volumes tempting a decision-maker to drill a sidetrack or new well up-dip.

The Pwell function in RoseRA requires that an Area distribution be modeled using either the Area-vs-Depth or Area x Net Pay rock volume estimating methods. Users can input the closing contour area at the proposed well location and run the simulation to calculate new metrics for the well location, including:

The Pwell chances of geologic and commercial success

Given a discovery, the range of resources up-dip and downdip from the well location.

Given a dry hole, the range of “attic” resources up-dip from the well location and the up-dip chance of commercial success.

In the following example, the team has proposed a downdip well location at 2,000 acres. The total prospect has a Pg = 50% and a geologic mean of 56 mmbo based on an area distribution with a P90 of 1,000 acres and a P10 of 6,000 acres. The commercial MCFS is 25 mmbo, which results in a probability of commercial success of 37.1% and a commercial mean of 70 mmbo. Input the downdip closing contour area of 2,000 acres into RoseRA and run the simulation to observe the results.
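
To make the underlying idea concrete, here is a minimal Monte Carlo sketch in Python. It assumes a lognormal area distribution fitted to the P90/P10 above and treats the Pwell geologic chance as the crestal Pg multiplied by the probability that the simulated closure area extends beyond the proposed well location. RoseRA’s simulation also models net pay, yield, and commerciality jointly, so this sketch will not reproduce the figures below exactly.

    import numpy as np

    # Illustrative inputs from the example above
    crestal_pg = 0.50                        # crestal geologic chance of success (Pg)
    p90_area, p10_area = 1_000.0, 6_000.0    # area distribution anchors, acres
    well_area = 2_000.0                      # closing contour area at the proposed well location, acres

    # Fit a lognormal area distribution to the P90/P10 pair (exceedance convention)
    z90 = 1.2816                             # standard normal score for a 90% chance of exceedance
    mu = 0.5 * (np.log(p90_area) + np.log(p10_area))
    sigma = (np.log(p10_area) - np.log(p90_area)) / (2.0 * z90)

    rng = np.random.default_rng(42)
    areas = rng.lognormal(mu, sigma, size=1_000_000)

    # Pwell geologic chance: crestal Pg times the chance the trap extends past the well location
    p_reaches_well = np.mean(areas > well_area)
    pwell_geologic = crestal_pg * p_reaches_well
    print(f"P(area > {well_area:,.0f} acres) = {p_reaches_well:.1%}")
    print(f"Pwell geologic chance of success ~ {pwell_geologic:.1%}")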

Figure 1: Input Area of Closing Contour at Proposed Well Location
Figure 2: Downdip Geologic and Commercial Chance of Success are reported.

The Pwell chance of success decreases the further downdip the well is drilled. In this example, the Pwell geologic chance of success at the 2,000-acre downdip location is 34.0%, much lower than the crestal Pg of 50%. The tradeoff is that the proportion of the commercial downdip resources is high (95.5%), making the Pwell commercial chance (32.5%) only slightly below the Pwell geologic chance. Given success, you are almost guaranteed a commercial accumulation by drilling at the 2,000-acre location.

There is still the possibility of resources being present up-dip if the well location is dry. What is the commercial chance of success for the potential up-dip resources given a dry hole?

Figure 3: Updip Geologic and Commercial Chance of Success are reported.

Pg for the up-dip resources remains the same as the original crestal Pg of 50%. About 41% of the up-dip resource distribution exceeds the MCFS, so the Pwell commercial chance is 20.6%. Would that Pwell commercial chance tempt management to drill an expensive sidetrack or a second well to determine whether commercial resources are present? Management would need to know the up-dip resource distribution to make an informed decision.
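
For reference, the arithmetic behind that figure is simply the crestal Pg multiplied by the proportion of the up-dip distribution exceeding the MCFS:

Up-dip Pc = 50% x 41.1% ≈ 20.6%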

Use Pwell to investigate the up-dip and downdip geologic and commercial resources as shown in the Pwell log probit below.

Figure 4: The Pwell Log Probit chart shows the Updip (black) and Downdip (blue) resources and Updip Commercial (green) resources.

Updip geologic resources (black) range from a P99 of 5 mmbo to a P01 of 66 mmbo with a mean of 25 mmbo. Downdip geologic resources (blue) range from a P99 of 18 mmbo to a P01 of 282 mmbo with a mean of 79 mmbo. The red shading indicates the overlap between the up-dip EUR distribution from P65 to P01 and the downdip EUR distribution from P99 to P45. Resources in excess of about 71 mmbo are not achievable in the up-dip EUR distribution. What is the consequence of this overlap region for decision-making?

With the commercial MCFS of 25 mmbo, the up-dip commercial resources (green) show the up-dip Pmcfs is 41.1% and the up-dip Pc is 20.6%. This commerciality possibility might tempt management to spend additional capital to test for an up-dip commercial accumulation. If this is the case, perhaps the well should be drilled further up-dip to minimize total drilling capital.

Another metric that has been used in the industry to gauge whether or not to spend additional capital is the “No Regrets” resource, which is calculated as the up-dip productive area above the drilled location multiplied by the up-dip mean net pay and the up-dip mean oil yield. Figure 5 shows the “No Regrets” resource is 38 mmbo. Notice that this method provides a deterministic value to aid in decision-making instead of the full up-dip distribution and additional insight that RoseRA provides.
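
As a purely illustrative version of that calculation (the 2,000-acre up-dip area comes from the example above, while the net pay and oil yield values are invented and chosen only to be consistent with the 38 mmbo result):

“No Regrets” resource = up-dip productive area x up-dip mean net pay x up-dip mean oil yield
= 2,000 acres x 50 ft x 380 bbl/ac-ft = 38 mmbo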

RoseRA provides the ability to model multiple zones, multiple wells, and multiple downdip Pwell areas, shown as vertically stacked zones in a single file. It simulates and reports all entities within the same simulation. Figure 5 shows the up-dip and downdip chances and EURs as increments from the crest to 2,500 acres. Determine which downdip location balances risk, volume, and value for your company.

Figure 5: Sensitivity Analysis of Multiple Well Locations Can Be Done within a Single Simulation

The RoseRA Pwell function reports clear tabular and graphical outputs so that the rationale for drilling the exploration well downdip can be discussed and communicated with team members and management. Pwell helps maximize the chance of making a commercial discovery while minimizing the need to spend additional exploration capital. Such scenarios might involve drilling a downdip appraisal well to confirm commerciality or, in the event of a dry hole, avoiding an up-dip sidetrack or additional well meant to determine whether there is a “left-behind attic” resource. A dry hole with the potential for up-dip commercial EUR creates a real conundrum. The Pwell analysis, with its up-dip and downdip distributions, can help clarify the decision.

This methodology can also help users select locations for appraisal wells. The only difference from the exploration example above is that the crestal exploration discovery results in a prospect Pg = 100%, and the Pwell analysis is performed using the updated resource distribution following the discovery well.

Contact Phil Conway and David Cook at Rose & Associates today to get a demo or more information.

The implementation of Pwell in RoseRA is built on the methodology described in the following reference:

Schneider, F. M., and Cook, D. M., Jr., 2017, Drilling a Downdip Location: Effect on Updip and Downdip Resource Estimates and Commercial Chance [Poster session], AAPG 2017 Annual Convention, April 2-5, Houston, TX, United States.

http://www.searchanddiscovery.com/documents/2017/42102schneider/ndx_schneider.pdf

Posted on April 26, 2021 by Lisa Ward

by Creties Jenkins, Partner at Rose & Associates

Take 30 seconds to memorize these 10 items from a shopping list, and then write down how many you can recall: milk, yogurt, croissants, bananas, muffins, coffee, ham, jelly, cheese, and eggs. Most people will be able to remember about 7 items. Now make a note to construct this list again from memory in 24 hours. How many do you think you’ll recall? I’ll be impressed if you can list 5 or more!

The problem, of course, is that this list is being held in your short-term memory (STM) and doesn’t get transferred to long-term memory (LTM). As you can see, STM has a severe capacity limitation. However, this can be overcome, in part, by informational grouping. So if you noticed that the list above consists of breakfast items, you can create this organizational structure in your LTM to help you more easily recall the items next time.

This interconnectedness of information is a cornerstone of memory. Think of memory as a massive, multi-dimensional web in which data is retrieved by tracing through the network. Retrievability is influenced by the number of storage locations as well as the number and strength of the pathways. The more frequently a path is followed, the stronger the path becomes.

This can be illustrated by comparing the abilities of chess masters and ordinary players. If you randomly place 20-25 chess pieces on a board for 5-10 seconds, each of these groups will only be able to recall the positions of about six pieces. However, if the positions are taken from an actual game, the masters will be able to reproduce nearly all of the positions whereas the ordinary players will still only be able to recall the positions of six pieces.

The masters are using their LTM to connect individual positions into recognizable patterns that ordinary players do not see. This ability to recall patterns that relate facts to each other and broader concepts (such as strategy) is critical for success in chess. But what about the business world?

Decision makers and subject matter experts often see themselves as the equivalent of chess masters. They believe the data and experiences contained in their LTM allow them to uniquely perceive patterns and draw inferences that give them a competitive advantage. What’s often forgotten is that, unlike chess, the permissible moves in the oil and gas business are constantly changing.

Once you start thinking about a given challenge, the same pathways that led you to a successful outcome in similar challenges will get activated and strengthened. This can create mental ruts that make it difficult to process different perspectives, including those that could lead to a better outcome. Our memories are seldom reassessed or reorganized in response to new information, making it difficult to modify these existing patterns.

To overcome this, you need a wider variety of patterns to reference and greater processing of new information to fully understand its impact. This means reaching out to a wider network of experts and devoting more time and effort to developing a deeper understanding, as well as implementing procedures that facilitate this, including framing sessions, peer reviews, and performance lookbacks.

These procedures, as well as others, are discussed and practiced in our Mitigating Bias, Blindness and Illusion in E&P Decision Making course. Please consider joining us for a virtual or in-person offering, either as an open enrollment or internal session at your company.

Reference excerpted for this blog: Heuer, Richard J., Jr., 1999, Psychology of Intelligence Analysis, Center for the Study of Intelligence.

Posted on January 21, 2021 by Lisa Ward

by The Deep Coal Consortium (Deep Coal Technologies Pty Ltd, Rose & Associates, and Cutlass Exploration)

The Deep Coal Consortium is pleased to announce the completion of the first systematic review of the volumetric potential of the Gidgealpa Group Coal Measures in the deep portions (>3,000 feet) of the Cooper Basin based upon detailed analysis of well data (n=1,300). The Gidgealpa Group was evaluated in ten assessment intervals based on regional well correlation.

The study provides both deterministic and probabilistic estimates of in-place and technically recoverable gas and liquids, and (for the first time) Prospective Resources. Results are presented in tabular form and as a set of maps for each assessment interval and for the Group.

Low (P90), Median (P50), and High (P10) map realizations were generated for each assessment interval and volumes were aggregated to estimate the Group resource potential. An example of one interval is shown below.

The innovative methods used to calculate the map-based uncertainty in volumetric potential are discussed in URTEC 198304-MS.

The Deep Coal Play contains world-class volumes of potentially commercial gas and condensate. This volumetric study provides a strategic spatial context for decision-making as efforts to commercialize this play accelerate in the months ahead.

In addition, the data can be used to more fully characterize specific portions of the play, such as company acreage. The Consortium can provide technical support for such customized analysis.

For details contact David Warner of DCT.

Download this article as a PDF.

Posted on January 6, 2021 by Lisa Ward

by Henry Pettingill and Gary Citron

“It is amazing what you can accomplish if you do not care who gets the credit.”
Harry S. Truman

BEGINNINGS

In 2000, a tipping point occurred when many companies wanted to see a consistent set of industry-derived best practices around amplitude characterization for chance of success determination (prospect ‘risking’). In response, Rose & Associates’ founder Pete Rose and his Senior Associate Mike Anderson turned to former Shell and Maxus geophysicist and executive Mike Forrest to consistently weave seismic amplitude anomaly information into the fabric of prospect chance assessment. With the input of others, they decided to form a consortium of companies to capture best practices in a process that quickly evolved to include user-friendly software and a database, which became SAAM (‘Seismic Amplitude Analysis Module’). They reached out to geologist Roger Holeywell, who was actively commercializing other risk analysis software for R&A through a partnership with his employer, Marathon Oil Corporation, to serve as SAAM programmer.

The DHI Consortium officially began at Dallas Love Field on December 7, 2000 (a date famously recorded as the starting flag), with the inaugural meeting of the 13 founding companies following in January 2001. Shortly thereafter Rocky Roden (Repsol’s Chief Geophysicist and representative during the first Consortium phase, and a thought leader in geophysics) ‘retired’ from Maxus and joined Mike and Roger as the third director of the DHI Consortium.


DHI Consortium group photo, May 2001 in Houston

WHAT IS A DHI?

In many basins with sandstone targets (especially those deposited during the Tertiary Period), the seismic signal associated with those targets can be quite strong. Rock density and seismic velocity contrast noticeably between units, and that contrast is amplified by the presence of oil or gas in the pore system, making accumulations appear ‘anomalous’. The most common measure of the strength of the anomaly is the seismic amplitude (the amount of signal deflection). These anomalies first became observable in the mid-1960s on relatively low-fold seismic lines and yielded significant quantifiable information as the fold increased.

While interpreting such seismic data, a geophysicist will measure an objective’s amplitude level in comparison to the ‘background’ level surrounding it. Significant amplitude strength above the background is referred to as an ‘anomaly’ or a ‘bright spot’. Mike Forrest is credited as one of the first explorers to recognize the exploration impact of seismic amplitude-bearing prospects when he was a Gulf of Mexico exploration project leader at Shell in the 1960s. The acronym DHI stands for Direct Hydrocarbon Indicator, suggesting that the seismic amplitude (hopefully) results from hydrocarbon charge.

THE CONSORTIUM FOR 20 YEARS

At the very first meeting in January 2001, the Consortium commissioned Roger to program an innovative interrogation process that facilitates a thorough, systematic, and consistent grading of the amplitude anomaly, based purely on observations (as opposed to interpretations). It begins with the geologic setting, a preliminary chance assessment solely from geology (as if no DHI were present), and many key attributes the seismic data are designed to extract. SAAM requires the exploration team to answer questions about AVO classification, seismic data quality, rock property analysis, amplitude character, analogs, and potential pitfalls (also referred to as false positives). In other words, SAAM institutionalizes a process through which explorers address salient issues, forcing those who would otherwise treat it in a perfunctory fashion to digitally record key information they may later forget or lose. Employing a weighting system, SAAM registers the impact of each scored characteristic to determine a ‘DHI Index’, which in conjunction with the initial Pg yields a ‘Final Pg’. This ‘Final Pg’ is now calibrated by over 350 drilled wells.
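
To make the workflow concrete, here is a purely hypothetical Python sketch of a weighted grading scheme of this general kind. The characteristic names, weights, scoring scale, and the way the index modifies the initial Pg are all invented for illustration; the actual SAAM questions, Consortium-derived weights, and calibration are proprietary and considerably more detailed.

    # Hypothetical weighted grading sketch (NOT the actual SAAM method)
    # Each characteristic is scored from -1 (pitfall indicated) to +1 (strongly supportive).
    weights = {                                    # invented weights
        "avo_response": 0.30,
        "data_quality": 0.20,
        "rock_property_support": 0.25,
        "amplitude_conformance_to_structure": 0.15,
        "analog_support": 0.10,
    }
    scores = {                                     # invented scores for a hypothetical prospect
        "avo_response": 0.8,
        "data_quality": 0.5,
        "rock_property_support": 0.6,
        "amplitude_conformance_to_structure": 0.9,
        "analog_support": 0.4,
    }

    dhi_index = sum(weights[k] * scores[k] for k in weights)   # ranges from -1 to +1

    initial_pg = 0.30
    # One simple way an index could uplift (or downgrade) an initial Pg,
    # bounded so the result stays between 0 and 1.
    if dhi_index >= 0:
        final_pg = initial_pg + dhi_index * (1.0 - initial_pg)
    else:
        final_pg = initial_pg * (1.0 + dhi_index)

    print(f"DHI Index = {dhi_index:.2f}, Final Pg = {final_pg:.2f}")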

Initially, each Consortium phase lasted 12 to 18 months. In 2016 Consortium phases were aligned with the calendar year to better sync with client budgeting. Consortium membership is paid through a phase fee, and each new company that joins must purchase a license to SAAM. Companies may license a version of SAAM without joining the Consortium, but that version does not include the powerful analog and calibration database, which at the end of 2020 contained 354 drilled prospects. The Final Pg can then be analyzed in several ways and critically compared to the success rates of similar DHIs to further calibrate the weighting system. SAAM’s burgeoning database is owned by Rose & Associates, with each member company having rights to internal usage.

In accepting AAPG’s highest honor in 2018 (the Sidney Powers award), Mike Forrest commented on the Consortium: “We expected it to last a year or two, and almost 20 years later it’s still going on!” Roger attributes the longevity to the breakthrough thought that meetings should be member-driven, designed around member company prospect presentations.


First European Chapter meeting, October 2007 in London

Can you think of any other venue to see drilled prospects, learn how they originated, how they were technically matured, the drilling outcome, and lessons learned from the journey – all within two hours? Companies benefit from seeing the tools and techniques other companies use in the analysis. Although some companies are unable to share the exact location of certain prospects, all the key attributes are shown and a SAAM file is populated for each prospect in real-time. Then the participants are asked for their opinions of prospect quality and ultimately whether they would drill it. The company then reveals the drilling results, and all inputs, outputs, and drilling results are added to the SAAM database.

As member participation in Consortium meetings grew, with more younger staff involved, the challenge became how to involve more of the people in the room and avoid domination of the discussion by senior members. Roger answered this question in 2017 by introducing individual wireless keypads (‘clickers’) that quickly register, compile, and anonymously display the entire group’s answers to a variety of grading questions. This permitted the leaders to ask people to explain their diverse views, highlighting how differing perspectives can take the discussion to a higher level.

SAAM’s architecture and workflow have constantly evolved but were always based on a collaborative framework. For every prospect shown in a meeting, during a subsequent weekend the Consortium Leadership gathered again to review the SAAM file, ensure consistency, and discuss what could be improved in the software, based on key observations from the prospect presentation.

Founding Consortium Chair Mike Forrest still consults with the Consortium, which since 2019 has been under the direction of R&A’s Henry S. Pettingill. Henry was a member of the Consortium with Repsol during its inaugural year and, as Director of Exploration at Noble Energy, oversaw that company’s participation from 2002. Roger and Rocky have guided and influenced the technical direction of the Consortium throughout its history. With companies opting in and out through the years, much of the stewardship of SAAM (updating functionality, testing, checking for consistency, and database trends) is left to Roger’s discerning eye. He takes advantage of the crew changes, always on the lookout for the fresh perspective provided by new Consortium member feedback.

The Consortium’s secret sauce is relationships. That starts with the leaders, who have known each other and worked together for over 20 years. Rocky and Henry both worked for Mike in the 1980s and 1990s at Maxus and Shell, respectively; Rocky and Henry then worked together within Repsol/YPF Maxus. All along the way, Roger interacted with Mike, Rocky, and Henry as authors of R&A’s software products.

But there is also a strong bond between many of the members, some of whom have interacted in and out of the Consortium for over 20 years. One of the highlights of Consortium meetings in Europe is the way each host company highlights the unique aspects and special history of their city and culture through a ‘networking dinner’ featuring the local cuisine. These became an instant tradition, strengthening the network by building friendships amongst the industry’s elite DHI practitioners. There have even been occasional field trips to classic locations in Europe and South Africa.

Each year, the Consortium typically holds five meetings in Houston and three in Europe. Since March 2020, due to the COVID-19 pandemic, the meetings have been replaced by monthly webinars. This turned out to be a blessing in disguise, as it caused a surge in participation and expanded the reach of the Consortium. Whereas most people can attend a meeting only occasionally (and those in remote field offices virtually never), webinar attendance far exceeded regular meeting attendance, with new participation from field offices in the Netherlands, Oman, Malaysia, Indonesia, and New Zealand.

THE CONSORTIUM TODAY

As we celebrated our 20th anniversary in this month’s webinar, we looked back on the 20 years and some of the Consortium’s accomplishments – too many to list here. In numbers: the SAAM database has 354 drilled prospects from 30 basins, allowing calibration of assessments of undrilled prospects as well as providing valuable benchmarking data. Over 80 companies have participated, most of whom have contributed drilled prospects to the database, and we have 36 member companies this year. But probably the most enduring accomplishment is the heightened prospecting skills and intuition of the participants. This has resulted in the industry’s most comprehensive DHI prospect database, all evaluated using a consistent methodology and peer-reviewed by a roomful of advanced practitioners.


Consortium membership and SAAM database vs. time

Perhaps most remarkable is how the Consortium has evolved with time and technology. For instance, seismic imaging and other advances have allowed for things unthinkable just a few years ago, like imaging amplitudes beneath salt. Computer power and machine learning have allowed analyses like never before. And each year, the Consortium sets goals according to where we are in this evolution.
It all leads us to ask: what will the next 20 years have in store for us? Most of us agree that changes will continue to come in DHIs and associated exploration technologies, and make the unimaginable not just imaginable, but even more fun.

Posted on December 9, 2020 by Lisa Ward

by Marc Bond

Following a groundswell of interest generated by a presentation at the 2008 AAPG Annual Convention by Glenn McMaster et al. entitled “Risk Police: Evil Naysayers or Exploration Best Practice?”, several of us (including myself, then at BG Group) thought it would be an excellent idea to organize a workshop to discuss the best practices and challenges of exploration assurance. Glenn (then at bp) was great at speaking truth to power and embraced the idea. Hence, he and Gary Citron (at Rose & Associates) convened the first Risk Coordinators Workshop (RCW) on November 18-19, 2008, graciously hosted by bp in Houston. Twenty-eight industry leaders from 18 companies attended. Twelve presentations were given by the attendees, mostly focused on the state of assurance within the presenter’s company, which was a rare insight at the time. That openness fostered a sharing and collaborative environment, defusing our concern that this would be a “one-off” event. Rather, the enthusiasm and interest generated by the successful workshop encouraged us to continue.

I have particularly enjoyed attending and contributing to the Workshop through the years. We continue to hold regular workshops, with the goal of sharing common experiences, issues, challenges, and suggested best practices. There is a nominal fee for attendees to cover expenses. The only obligation is to be open and share. We held our 19th workshop in November 2020. Given the pandemic, the RCW was held virtually for the first time and, measured by the commentary and feedback, it was a great success.
When I joined Rose & Associates in 2014, I brought in the idea of increasing workshop frequency, and we now meet two to three times a year (in North America and England, and every other year in Asia). We also now include Breakout Sessions to explore relevant assurance themes and provide a Summary Report to capture the outcomes of each workshop.

In 2015, we established the Risk Coordinators network as a natural follow-up to the RCW. The network consists of subsurface assurance experts who are responsible for assuring their companies’ opportunities. It is an informal group that includes over 70 companies (ranging from super-majors to small companies) and over 160 people who are very open and passionate about assurance, risk analysis, and prospect assessment. Along with the workshops, we have now been active for over 12 years.

We work with the network on other assurance-related items, such as delivering a periodic Assurance Survey (2015 and 2019). The Survey results are shared with the network to monitor the current state of assurance and provide them with learnings to help improve their own assurance process.

Doug Weaver (Partner, Rose & Associates) and I now manage the network. I would like to personally thank Gary for his support and coaching over the years. If you have any questions about the network or ideas for the next RCW, please contact us.

Stay safe and healthy.

Posted on November 10, 2020 by Lisa Ward

by Doug Weaver

Last time we discussed the need to quantify everything in exploration, using my college glacial mapping project as an example. Let’s move back to the world of oil and gas exploration.

The main takeaway from my first blog is that an engineer’s role in exploration is to quantify. Geoscientists make interpretations of data, and then engineers turn those interpretations into resource and economic assessments. The ultimate goal is to generate an inventory of opportunities that can be high-graded, allowing investment in those that are the most financially worthy. But how do we combine resources, chance of success, costs, and economics to do this? We employ the expected value equation.

(Pc x Vc)-(Pf x Vf) = Expected Value

It’s a very simple equation. Let me describe the terms. Pc is the chance of success and Vc is the value of success; Pf is the chance of failure and Vf is the value (or cost) of failure. When we subtract the two terms, we generate an expected value. If the expected value is positive, the project is an investment candidate; if it’s negative, we’re gambling. We could still invest in a project with a negative expected value, but we’re likely going to lose money, and we’ll certainly lose if we invest in enough of them.

So let’s assume you’ve just generated a prospect, and you can make some estimate of a few items to describe it. You’ve got a rough idea of a chance of geologic success, maybe from working on a specific trend. You have some notion of size, either from your own volumetric assessment or again trend data. The engineer assisting your team with project evaluations should provide the team with a few key items to help with prospect screening.

  • Threshold sizes – how big do prospects need to be to be commercial?
  • NPV/bbl (or Mcf) – what is the NPV/bbl for fields of various sizes? We’ll use this to transform barrels into dollars.
  • Dry hole cost – what is the dry hole cost for an exploration failure in the trend? (You might want to get depth-specific here.)

Back to the equation. First, the success case. Notice both P (chance) and V (value) in the success case have the subscript c, meaning commercial. What we’re looking for is the commercial chance and commercial value, not the geologic counterparts. If you have done a formal resource assessment, this conversion is easy: you just determine where the threshold volume intersects the resource distribution. In the example below, if the threshold is 40 mmbo, it intersects the resource distribution at the 75th percentile. If this project has a geologic chance of success of 30%, the commercial chance of success is simply 30% x 75%, or 22.5%. (For anyone not familiar with the convention, 40 mmbo means 40 million barrels of oil.)

The commercial volume would be determined by the resource that exists between the threshold volume and the maximum volume, or between 40 mmbo and 76 mmbo. There are better ways to determine this, but for now let’s just use an average value of 58 mmbo.
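
If you do have a probabilistic assessment, this conversion can be pulled directly from the simulated resource distribution. Here is a minimal Python sketch; the lognormal distribution below is an assumption, fitted only so that roughly 75% of outcomes exceed the 40 mmbo threshold, loosely mimicking the plot described above.

    import numpy as np

    geologic_chance = 0.30   # Pg from the example above
    threshold = 40.0         # commercial threshold, mmbo

    # Illustrative lognormal resource distribution (assumed for this sketch)
    rng = np.random.default_rng(7)
    resources = rng.lognormal(np.log(50.0), 0.33, size=1_000_000)   # mmbo

    p_exceeds = np.mean(resources > threshold)                # ~75%, the percentile read off the plot
    commercial_chance = geologic_chance * p_exceeds           # Pc ~ 22.5%
    mean_commercial_volume = resources[resources > threshold].mean()   # success-case volume, close to the crude 58 mmbo average

    print(f"P(resource > {threshold:.0f} mmbo) = {p_exceeds:.0%}")
    print(f"Commercial chance Pc = {commercial_chance:.1%}")
    print(f"Mean commercial volume = {mean_commercial_volume:.0f} mmbo")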

Now you may ask, especially for screening, what if I don’t have this resource distribution? What if I’ve just made a quick deterministic volume estimate multiplying prospect area times a guess at thickness times a likely recovery yield (Bbl/ac-ft)? Can I still estimate the expected value? Sure, just try to apply the process described above as best you can. If the threshold is 40 mmbo and you calculate a resource of 300 mmbo, adjustments to geologic chance and volume will be minimal when considering their commercial values. If you calculate a volume of 45 mmbo, I might not try to estimate commercial values, but you already know the prospect is likely challenged commercially.

Now that we have an estimate of volume and chance, we need to convert our volume to value. The simplest way to do this is with a metric called NPV/bbl. The engineer assisting your team has likely evaluated many fields of various sizes in his evaluation efforts. Your group has probably generated other prospects in the trend, evaluated joint venture opportunities, and maybe even had a few discoveries.

For each of these opportunities the engineer has had to estimate the success case value, or NPV (Net Present Value), for a given field volume, usually at the mean Estimated Ultimate Recovery (EUR). The NPV accounts for the time value of money at your company’s specific discount rate. A typical discount rate is 10%, resulting in what is referred to as an NPV10. The NPV calculation accounts for all production (and therefore revenue) and all costs and expenses over the life of the field, including the costs of completing the discovery well and drilling and completing appraisal wells, and reduces them to a single value. When this value is divided by the volume associated with the evaluation, we generate the NPV/bbl metric in dollars per barrel. Given that these types of evaluations have been generated for several opportunities within a play, we can get a pretty good idea of how NPV/bbl changes with field size.

Note that for a given play in a given country, NPV/bbl often doesn’t change dramatically. If you’ve only got a few field evaluations at your disposal, the engineer should still be able to provide a usable NPV/bbl. Better yet, embrace the uncertainty and test your prospect over a range of values. Finally, to determine Vc, I simply need to multiply my mean EUR volume by my NPV/bbl.
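
For readers who want to see the mechanics, here is a stripped-down Python sketch of how an NPV10 and NPV/bbl could be generated. The cash-flow profile and EUR are invented, chosen only to land near the $2.00/bbl used in the worked example below; real evaluations are built from full field development models.

    # Hypothetical field cash flow, $mm per year (year 1 is development capital, hence negative)
    cash_flows = [-180, 95, 80, 65, 50, 40, 30, 22, 15, 10]
    discount_rate = 0.10        # company discount rate, giving an "NPV10"
    eur_mmbo = 60.0             # mean field EUR for this evaluation, mmbo (invented)

    # Discount each year's cash flow back to present value and sum
    npv10 = sum(cf / (1 + discount_rate) ** (t + 1) for t, cf in enumerate(cash_flows))
    npv_per_bbl = npv10 * 1e6 / (eur_mmbo * 1e6)   # convert $mm to $ and mmbo to bbl

    print(f"NPV10 = ${npv10:.0f}mm, NPV/bbl = ${npv_per_bbl:.2f}/bbl")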

The failure values are much easier to determine. Pf, the chance of failure, is simply 1 - Pc. Simple as that. For conventional exploration opportunities, Vf, the value (cost) of failure, is usually just the dry hole cost. Most explorationists working on a trend have a pretty good idea of that cost; if not, ask a drilling engineer. For the expected value equation, you should input an after-tax dry hole cost. Obviously, the tax rate will change from country to country; for the US, the after-tax dry hole cost is about 70% of the actual cost.

Now we have all the pieces we need to generate the expected value. Let’s start with the plot earlier in this discussion and do that.

We have:

A commercial success volume of 58 mmbo

A commercial success chance of 22.5%

A failure chance of 77.5%

Let’s also assume an NPV/bbl of $2.00 and a dry hole cost of $20mm.

A couple of preliminary calculations:

Value of success = 58mmbo x $2.00/bbl = $116mm

Cost of failure = $20mm x 0.7(tax) = $14mm

Here’s our equation:

(Pc x Vc)-(Pf x Vf) = Expected Value

Plugging in values:

(22.5% x $116mm)-(77.5% x $14mm) = Expected Value

$15,250,000 = Expected Value
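
The same arithmetic in a few lines of Python, so you can plug in your own prospect numbers:

    def expected_value(pc, vc, pf, vf):
        """Expected value in $mm: (Pc x Vc) - (Pf x Vf)."""
        return pc * vc - pf * vf

    # Values from the worked example above
    pc = 0.225          # commercial chance of success
    vc = 58 * 2.00      # 58 mmbo x $2.00/bbl NPV = $116mm
    pf = 1 - pc         # 0.775
    vf = 20 * 0.7       # $20mm dry hole cost x 0.7 after-tax = $14mm

    print(f"Expected value = ${expected_value(pc, vc, pf, vf):.2f}mm")   # $15.25mm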

Is this good? Yes, we’ve generated a positive value. Remember, if it’s negative, we could still pursue the project, but now we’re not investing, we’re gambling. The key is that we need to perform this analysis on all our projects, look at our available funds, and invest in the best. That’s portfolio analysis, and the topic of a later discussion.

The point of this blog was simply to walk you through the process and encourage prospect generators to apply it to their opportunities as early as practical, even if it’s a “back of the envelope” calculation. Beyond chance and volume, all you need is a few values from your engineer. You’ll be able to use this tool to judge whether the prospect you’re working on is likely to be pursued or not. It may also give some insights as to what can be focused on to improve your prospect. For example, if you generate a low (or negative) expected value, are there areas for improvement in chance or volume? If not, maybe it’s time to move on to the next one.

Posted on September 30, 2020 by Lisa Ward

by Marc Bond, Senior Associate

COGNITIVE BIAS

Success and value creation in the oil and gas industry have not been particularly good, as evidenced by the industry’s relative performance. This is widely recognized both in the global markets and within the industry itself. There are certainly many reasons that may explain why the oil and gas industry has not done well over the years, and one area that has gained a lot of traction in recent years is the concept of cognitive bias.

Cognitive biases are predictable, consistent, and repeatable mental errors in our thinking and processing of information that can, and often do, lead to illogical or irrational judgments or decisions.

Surprisingly, the notion of cognitive bias has not been around for many years. It was first proposed by Amos Tversky and Daniel Kahneman in a 1974 article in the journal Science (Tversky and Kahneman, 1974). Since then, there have been numerous publications and research studies on the various cognitive biases and how they impact our judgments and decisions.

The book that introduced the concept of cognitive biases and their influence on decisions to the general population was the seminal publication by Nobel Prize-winning psychologist Daniel Kahneman, Thinking, Fast and Slow (Kahneman, 2011). It is interesting to note that Dr. Kahneman won the Nobel Prize in 2002 not in Psychology, but rather in Economics. Why? Because traditional economic theory assumes that we are rational creatures when we make decisions or choices, and yet research and observations continually show that we are not.

There are many different cognitive biases (see Wikipedia), but there are a few that play a significant role within the oil and gas industry. These biases can act individually or in combination, leading us to poor judgments and decisions.

HOW BIASES MAY BE REPRESENTED IN THE OIL & GAS INDUSTRY

For example, imagine an exploration team is assessing a prospective area that is available for license bids. In the analysis of the data, the focus is on a very productive analog to describe the play. Nearby there has been a successful well recently drilled; and although it is acknowledged to be in a different play, the team is very excited about the hydrocarbon potential of their new play.

Some existing, older wells suggest that the new play may not work, but it is felt by the technical team that those wells were either old or poorly completed, and hence could be dismissed as valid data points. Given the uncertainty, the prospects and leads developed should have a very wide range of resource potential. However, given the team’s confidence in the seismic amplitudes, the range of GRVs estimated is quite narrow.

The team is also optimistic about the play potential and presents the opportunity to management in very favorable terms. If the company were to bid on and be awarded the license by the government, the team would be quite excited; and of course, success is often rewarded. The company ended up bidding on the license with a commitment of several firm wells. Upon further data collection and analysis, a new team re-assessed the hydrocarbon potential, which is now believed to be limited; and yet there is still a large commitment to fulfill.

What happened to cause this result? Was the original team overconfident in their expectations? Did they think that because they understood their commercial analog, they understood the prospect? Were they too focused on the nearby successes? Was the data that was dismissed highly relevant? Were other alternatives and models not considered, which might have suggested that the resource size could be small?

Although the above narrative may appear to be contrived and one’s reaction to the scenario would probably be “I would never do that”, each of the justifications and decisions made are possibilities and all of them are rooted in forms of cognitive bias. You likely have recognized all or part of the scenario from your own experience. Further, these biases can work together in a complementary fashion, reinforcing the biased assessment, and making one “blind” to other possibilities.

Cognitive biases and their negative impact do not just present themselves during the exploration phase. There are numerous similar real-world scenarios observed in appraisal, development, production, and project planning projects.

STRATEGIES TO MANAGE

The bottom line is that these cognitive errors lead to poor decisions regarding what work to undertake, which issues to focus on, and whether to forge ahead or exit a project. This makes it important to identify them and lessen their impact. Unfortunately, awareness alone is not sufficient. These biases are inherent in our judgments and decision-making and serve the purpose of helping us make rapid judgments based on intuition and experience. In our everyday lives, they generally work well. Unfortunately, particularly in complex and uncertain environments such as the oil and gas industry, they can lead us to poor choices.

Hence, it is important to understand first what the biases are, why they occur, and how they can influence our assessments. This will then help us identify when our own, or our colleague’s judgments, assessments, and decisions may be affected by these cognitive biases. We then need to learn mitigation strategies. Given that these cognitive biases are normal and serve a purpose, the goal cannot be to remove them but rather to recognize the biases and then apply mitigation strategies to lessen their impact.

As noted above, there has been a lot of research on the biases, yet there is little published on actual, practical mitigation strategies. Hence, to help our industry, my colleague Creties Jenkins and I have developed a course entitled Mitigating Bias, Blindness, and Illusion in E&P Decision Making, where we go into further detail regarding these vitally important mitigation strategies. We use petroleum industry case studies and real-world mitigation exercises to reinforce the recognition of the biases. Finally, we show how to employ the mitigations to ensure any assessments or decisions are as unbiased as possible.

REFERENCES

Kahneman, Daniel, 2011, Thinking, Fast and Slow, Penguin Books, 499p.

Tversky, Amos and Kahneman, Daniel, 1974, Judgment Under Uncertainty: Heuristics and Biases, Science, vol. 185, no. 4157, pp. 1124-1131.

Wikipedia, List of Cognitive Biases, https://en.wikipedia.org/wiki/List_of_cognitive_biases

Posted on September 2, 2020 by Lisa Ward

I’m Doug Weaver, and I’m a partner with Rose and Associates residing in Houston, Texas. I joined Rose a little over three years ago after retiring from a 39-year career with Chevron. I’ve spent well over half of my career in exploration as a petroleum engineer.

I’m often asked, “Why would an engineer be so interested in exploration?” There are many reasons, but let me pose one of my usual responses: “If you think it’s difficult to generate resource estimates with all the data you’d want, try doing it with none.”

I hope to continue this blog well into the future and get into some of the services engineers provide for exploration teams. But in this first session, let me convey an observation on a topic that will be pervasive in future notes – Engineers and Geoscientists approach problems differently.

As I was scheduling my final semester of undergrad, I met with my advisor to get his feedback on one last technical course. Though my major was geotechnical engineering, I was a bit surprised when he suggested an advanced course in geology. Being that my advisor was one of the top geotechnical engineers in the world, I took his advice and enrolled in Geomorphology. The class consisted of about twenty geologists – and me. A good background for a future engineer in exploration!

All my engineering, math, and science classes had followed a very familiar cadence. Three hourly exams and a final. No reading, no reports, just understanding equations and concepts and solving problems with that knowledge on a test. Solve problems with math.

In the geomorphology class, we were posed with the problem of figuring out where a glacier had stopped and created a moraine. We collected data in the field. We then went back to the lab, plotting and interpreting this data. To my surprise, I was able to plot the exact location where the glacier had stopped. No formulas, just data collection and interpretation.

I’m fairly sure that Professor Hendron not only intended for me to learn about geomorphology but also to give me the experience of this alternate approach to solving a problem.

From what I’ve observed, this typifies the way most engineers and geologists solve problems (of course, I’m typecasting us all). Engineers start with a systematic workflow leading to a precise answer, while geoscientists use a more fluid, interpretive approach. Which leads us to the best answer? Both methods – when used together. The issues we face in exploration will certainly not allow the precise answer an engineer would normally want. In exploration, engineers need to embrace the uncertainty present in every aspect of their calculations. But, at some point, we need to quantify our analysis. We can’t make effective decisions if we can’t quantify and rank the investment options for our companies. And that becomes the primary role of the engineer in exploration – to quantify opportunities.

Back to our glacial moraine. Suppose I’m a Midwestern gravel company looking for mining opportunities. It’s great that I’ve identified my moraine and a potential quarry, but what does that imply from an investment perspective? How does this deposit compare to others I might exploit? What’s the quality of the sand and gravel within the deposit? Are others more accessible?

Switching hats from geologist to engineer, my task is now to answer these questions. I now understand that I will never know the exact size of the deposit, as it is uncertain. I’ll have to rely on samples collected to build a representation of the nature of the deposit, realizing the samples reflect a tiny portion of the total moraine. This data will inform me about the range of possible sizes of this deposit. I’ll want to investigate other deposits in the area to support the analysis of the samples I’ve collected in my own deposit and investigate how they were developed to get some idea of how to best evaluate the costs and timing of the extraction process. Finally, I somehow have to transform my moraine map and all these answers into a range of economic metrics, primarily Net Present Value, or if risk is present, Expected Value.

That’s where we’ll pick up next time, interrogating the Expected Value equation. Thanks for reading!

Posted on June 15, 2017 by Lisa Ward

Rose & Associates Success Plan

Suppose I said to you “Sue’s got a bug”. Quickly now…what do you think Sue has? If you’re a programmer, you probably think Sue has a computer virus. But if you’re a doctor, perhaps the flu comes to mind. And if you’re an entomologist, a ladybug may be your first thought. Did you consider all three as possible outcomes? Probably not. And what about some others? If you’re a spy, you might think Sue found a listening device. If you sell cars, you might think Sue bought a Volkswagen Beetle. The list goes on and on.

So why didn’t all of these come to your mind? Well first off, I asked you to respond quickly, which reduced the time you spent thinking about it. And second, you based your response on your intuition, instinct, or experience. You responded reflexively. This is inherently how we make most decisions every day. Do you know how much fat and calories are in that Sausage McMuffin you ordered? Did you review the economic fundamentals before acting on a friend’s stock tip? Did you read the TripAdvisor reviews that mentioned bedbugs in the hotel you booked? The answer to all of these is probably “no”. We neither have the time nor stamina to properly frame each decision in terms of uncertainty and risk.

The same is true in our working lives. However, the difference is that we’re paid to make good decisions in our jobs, and those decisions often involve millions of shareholder dollars. In these situations, we can’t afford to think reflexively. Instead, we need to think reflectively, which requires deliberate time and effort.

There are multiple tools to help us approach oil and gas decision analysis reflectively, including:

  • A staged approach, which focuses on determining what project stage you’re in, the key risks and uncertainties associated with that stage, and what data gathering and analyses you want to undertake to make a good decision about whether to move to the next stage.
  • Probabilistic thinking, which requires that we quantify the range of possible outcomes and assign a degree of confidence to any given outcome. This is much better than providing a single deterministic value as the most likely case because this is rarely (if ever) the actual outcome.
  • An assurance process, which provides independent and consistent guidance in the assessment of opportunities. This commonly involves subject matter experts engaged in peer assistance and/or peer reviews.
  • Asking the right questions, which means decision-makers need to probe (1) the work used to justify the recommendation, (2) whether the base case could be pessimistic/optimistic, and (3) whether credible alternatives were considered.

This sounds straightforward enough, but companies struggle to implement and apply these processes to their decision-making consistently. New management teams want to reorganize the way things are done. Staff turnover erodes the memory of what worked and what didn’t. Teams have turf to defend and walls to build. All of these contribute to lapsing into reflexive thinking.

“So what”, you say. “Let’s be bold and use our gut to guide us”. Could this be a successful strategy? Occasionally it does work, which provides memorable wildcatter stories (consider Dad Joiner). But given that oil and gas companies are in the repeated trials business, you’ll eventually succumb to the law of averages. For example, if we look at shale plays in the U.S., only about 20% of these have been commercially successful. You might get lucky by drilling a series of early horizontal wells in a shale play, but it’s more likely that you’ll squander millions of dollars you didn’t need to spend to realize that the play doesn’t work. In this sense, we’re like Alaskan bush pilots. There are old bush pilots and bold bush pilots. There are no old and bold bush pilots. If you want longevity, you need discipline.

Recently, we’ve begun to understand more about how people make decisions with their gut. It turns out that these reflexive decisions are very likely to be affected by cognitive bias. These are errors in thinking whereby interpretations and judgments are drawn in an illogical fashion. Some definitions and examples of cognitive biases in the oil and gas industry are listed below:

  • Anchoring: attaching an evaluation to a reference value. Example: focusing on one geological model or a favored seismic interpretation.
  • Availability: overestimating the likelihood of more memorable events. Example: the recent well drilled by an offset operator with a huge initial production rate.
  • Confirmation: interpreting data in a way that confirms our beliefs. Example: collecting data in the most prospective area and extending this interpretation elsewhere.
  • Framing: reacting to a particular choice depending on how it is presented. Example: only comparing your opportunity to successful analogs.
  • Information: having a distorted assessment of information and its significance. Example: equating missing or low-quality data with a low or high chance of success.
  • Overconfidence: overestimating the accuracy of one’s own interpretation or ability. Example: generating a narrow range of resource estimates.
  • Motivational: taking actions or decisions based on a desire for a particular outcome. Example: Overstating the chance of success or size of the prize to get a project funded.

So if you’re going to make decisions “with your gut”, at least realize the types of cognitive bias that could impact your decisions, and take some steps to lessen their impact on your exploration risk analysis, resource play evaluation, or production type curve generation.

With this in mind, we’ve come up with a new 2-day course at Rose and Associates called “Mitigating Bias, Blindness, and Illusion in E&P Decision-Making”. This course, in concert with our portfolio of courses, consulting, and software designed to help you think more reflectively about your project, is aimed at helping you make better decisions. Check out our offerings.

~ Creties Jenkins, P.E., P.G., Partner – Rose and Associates