December 7, 2020

Production Surveillance in Under Five Minutes

As a producer, it is important to be able to quickly narrow focus to the specific assets requiring additional attention and to prioritize efforts to achieve optimal performance. In this short blog, I will demonstrate a quick and powerful workflow I have often used as a Production Engineer. This workflow will help you home in on the assets requiring additional TLC. How is corporate production this week? How does it vary from the prior week? Which assets are contributing to any weekly variance? Are these assets underperforming compared to a theoretical capability?

HOW IS CORPORATE PRODUCTION THIS WEEK? HOW DOES IT VARY FROM THE PRIOR WEEK?

As a starting point I need to know how current production compares to the prior week's production. The chart below shows corporate total production for the week along with the variance from the prior week. Is that variance positive or negative? Is it expected or unexpected?

Figure 1: Corporate Total Weekly Production Summary with weekly variance

VERDAZO TIPS: Separate the different Y axes by clicking the icon with the two arrows pointing towards one another in the quick bar. Add chart data in tabular form by clicking the Data Viewer icon in the...
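For readers who want to see the mechanics of the week-over-week comparison, here is a minimal sketch, assuming daily well-level production in a pandas DataFrame with illustrative column names (well_id, date, oil_bbl); this is not the VERDAZO implementation, just the underlying arithmetic.

```python
# Minimal sketch: corporate weekly production and week-over-week variance.
# Column names ("well_id", "date", "oil_bbl") are illustrative assumptions.
import pandas as pd

def weekly_variance(daily: pd.DataFrame) -> pd.DataFrame:
    daily = daily.copy()
    daily["date"] = pd.to_datetime(daily["date"])
    # Roll daily well volumes up to a corporate weekly total.
    weekly = (
        daily.set_index("date")["oil_bbl"]
        .resample("W")
        .sum()
        .to_frame("corporate_oil_bbl")
    )
    # Variance from the prior week, in absolute and percentage terms.
    weekly["variance_bbl"] = weekly["corporate_oil_bbl"].diff()
    weekly["variance_pct"] = weekly["corporate_oil_bbl"].pct_change() * 100
    return weekly

# Example usage:
# print(weekly_variance(daily_production_df).tail(2))
```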

November 6, 2020

The tips every E&P company needs to maximize asset value at all lifecycle stages

VERDAZO 2020 Blog 1 – In the beginning, there was acquisition and play assessment. The following blog series illustrates how VERDAZO can be used for the nose-to-tail analytics journey aimed at maximizing value, optimizing costs and driving operational efficiencies. VERDAZO can provide the confidence to drive informed decisions and focus on areas that will have a material impact on the balance sheet. We begin with a brief discussion of acquisition and play assessment.

The Western Canadian Sedimentary Basin contains some of the most prolific and potentially profitable oil and gas resources in the world. The trick is to unlock these assets economically, safely and efficiently. This journey often begins with an acquisition target, an assessment of its potential and a highly informed, speedy determination of how to proceed. Determining the potential of an acquisition can be a daunting task given the wealth of available data, hence the need to leverage discovery analytics. Discovery analytics is a sequence of explorations, each predicated on the discovery and insight of the last.

Caution! The following is an illustration of an analytics workflow, not a fully vetted opportunity.

Need for Speed – Initial Assessment: In the competitive market of discounted acquisition targets, it...

April 20, 2020

Analytics Checklist – Surviving Low Commodity Prices

When commodity prices fall and capital programs for drilling and completions get cut, it's time to turn your attention to analyzing your data and making decisions that will minimize the negative consequences of these times. That's why Verdazo has given all of its clients unlimited licenses to help them dig into their data and find the opportunities that will help them the most. Today I'm going to offer a bit of a checklist and some suggestions about what you could (and should) be looking at.

Essentially what you'll be doing is a detailed asset review. A typical asset review looks at every well and assesses its performance from three perspectives: Production Performance, Financial Performance, and Performance to Plan. It answers questions like these on a monthly or quarterly basis (a minimal plan-variance sketch follows this excerpt):

- Are we on plan? What's the variance and trajectory?
- What wells are losing money? Why? Are there identifiable patterns?
- What are our top performing assets? What's our strategy to keep them that way?
- What are our bottom performing assets? What are we doing about it?
- What cost/downtime reduction strategies are we exploring? (e.g. change in chemical treatments, new workover strategy, equipment changes)
- What's our shut-in plan if...
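As one illustration of the "are we on plan?" question, here is a hedged sketch that compares actual monthly well production against plan and surfaces the largest negative variances; the column names (well_id, month, plan_boe, actual_boe) are assumptions for the example, not a Verdazo data model.

```python
# Sketch: flag the wells with the largest negative variance to plan.
# Assumes monthly, well-level data with hypothetical columns
# "well_id", "month", "plan_boe", "actual_boe".
import pandas as pd

def plan_variance(monthly: pd.DataFrame, top_n: int = 10) -> pd.DataFrame:
    out = monthly.copy()
    out["variance_boe"] = out["actual_boe"] - out["plan_boe"]
    out["variance_pct"] = 100 * out["variance_boe"] / out["plan_boe"]
    # Worst performers to plan first, so attention goes where it matters most.
    return out.sort_values("variance_boe").head(top_n)

# Example usage:
# review = plan_variance(monthly_df)
# print(review[["well_id", "month", "variance_boe", "variance_pct"]])
```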

April 1, 2020

Waste disposal: what comes up must go down

More than 2.9 billion barrels (465 million m³) of wastewater were disposed of in Western Canada in 2019. That's the equivalent of more than 42,000 dispatched truckloads per day. Waste disposal, and the associated trucking, is one of the largest operating expenses (from 5% to more than 50%) facing the oil and gas industry. Verdazo works with clients to analyse their operating costs and target achievable efficiencies.

In today's blog we've partnered with local startups Galatea Technologies and Labsite, who work with some of our clients. Galatea has waste disposal decision optimization software and brings waste disposal domain expertise to this blog. Labsite works with clients to optimize completion effectiveness and helped us answer some questions about frac fluid recycling challenges. Go to the end of this blog to learn more about these startups.

Our last blog, "The White Elephant in the Room", outlined the massive volumes of wastewater that are produced in Western Canada. In this blog we dig into how much wastewater we inject back into the subsurface and the challenges that operators face.

Waste Disposal

In 2019, North American oil and gas producers spent approximately $41 billion on oilfield waste transportation and disposal. $37 billion is spent...
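As a rough sanity check on the truckload figure, here is a back-of-the-envelope calculation assuming roughly 30 m³ of fluid per truckload; the per-load capacity is an illustrative assumption, not a quoted specification.

```python
# Back-of-the-envelope check of the "more than 42,000 truckloads per day" figure.
ANNUAL_DISPOSAL_M3 = 465_000_000   # ~2.9 billion barrels, from the blog
M3_PER_TRUCKLOAD = 30              # assumed effective capacity per load

daily_m3 = ANNUAL_DISPOSAL_M3 / 365
daily_loads = daily_m3 / M3_PER_TRUCKLOAD
print(f"{daily_m3:,.0f} m3/day -> roughly {daily_loads:,.0f} truckloads/day")
# -> roughly 42,000 truckloads/day, consistent with the figure above.
```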

March 2, 2020

The White Elephant in The Room

The title of this blog combines two concepts. A white elephant is a possession whose cost to dispose of is out of proportion to its usefulness. The elephant in the room is a metaphorical idiom for an important topic, problem, or risk that everyone knows about but no one wants to discuss because it makes at least some of them uncomfortable. With this explanation in place, let's ask a question.

Question: "What produces 1.1 trillion litres of water per year in order to extract a marketable commodity?" Hint: 5.4 barrels of water are produced for every $50 worth of this marketable commodity. Answer: Canada's Petroleum Industry. Some might say this is a water industry with a petroleum by-product.

For over 20 years in this industry I have been aware of the water produced in conjunction with oil and gas, of increases in water cut as wells mature, and of the significant percentage of operating costs that trucking and disposal represent. However, I didn't appreciate the magnitude of produced water that the industry contends with... until now.

Produced water is water that is co-produced with petroleum products from a reservoir at significant depth. Produced water is non-potable, saline, and often contains undesirable...
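For a sense of scale, a quick unit conversion of the headline volume (a minimal sketch; 1 barrel ≈ 158.987 litres):

```python
# Quick conversion of the headline produced-water volume into oilfield units.
LITRES_PER_BARREL = 158.987        # standard oilfield barrel
annual_litres = 1.1e12             # 1.1 trillion litres, from the blog

annual_bbl = annual_litres / LITRES_PER_BARREL
annual_m3 = annual_litres / 1000
print(f"~{annual_bbl / 1e9:.1f} billion bbl/year (~{annual_m3 / 1e6:,.0f} million m3/year)")
# -> ~6.9 billion bbl/year (~1,100 million m3/year)
```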

May 12, 2019

Augmenting Well Production Analysis with Subsurface Data

The Montney Formation, located in the Western Canadian Sedimentary Basin, is developed in a multi-zone stack throughout the fairway. Unfortunately, these refined target zones are not captured in public data. For analysis, it is important to differentiate Montney wells into multiple target zones because they vary significantly in reservoir properties, both vertically and laterally. Identifying the target zone based on sequence stratigraphy is a valuable process but can be time-consuming. In this blog we show a quick method to differentiate target zones using a depth-based approach that is helpful when you have limited time and resources. This workflow can be applied to any map-based data to derive a data set suitable for well production analysis.

For contour-based geologic data to be useful for well production analysis, we need a map-derived value for each individual well. To accomplish this, we started with a publicly available geologic map of the Montney Top in Meters Subsea (BC OGC, 2012).

Figure 1: Digitized and interpolated Montney Top Subsea TVD (True Vertical Depth) Structure Map (BC OGC, 2012).

The Montney Top Structure map contours were digitized (Figure 1) so that interpolated well values could be derived. Using point-sampling, the Montney Top depth was extracted at the intersection...
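To make the point-sampling idea concrete, here is a hedged sketch that interpolates digitized contour points onto well locations using scipy's griddata; this stands in for whatever gridding tool was actually used, and the column names are illustrative assumptions.

```python
# Sketch: interpolate a structure value (e.g. Montney Top subsea depth) from
# digitized contour vertices onto well locations, then attach it to each well.
import pandas as pd
from scipy.interpolate import griddata

def sample_structure_at_wells(contours: pd.DataFrame, wells: pd.DataFrame) -> pd.DataFrame:
    # contours: columns "x", "y", "z" (digitized contour vertices and their depth values)
    # wells:    columns "well_id", "x", "y" (a representative location per well)
    points = contours[["x", "y"]].to_numpy()
    values = contours["z"].to_numpy()
    targets = wells[["x", "y"]].to_numpy()
    wells = wells.copy()
    # Linear interpolation inside the contour coverage; NaN outside it.
    wells["montney_top_tvdss"] = griddata(points, values, targets, method="linear")
    return wells

# The per-well depth can then be joined to production data for target-zone analysis.
```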

July 22, 2018

Data before delivery: putting the cart before the horse?

I have now heard dozens of stories from friends and colleagues about failed BI and analytics initiatives. They range in cost from hundreds of thousands to tens of millions of dollars. It's a problem that I see several companies at risk of repeating... and the motivation for today's blog.

A common thread in most of these stories is trying to "fix up our data before we select or implement an analytics tool". How can anyone possibly understand the data needs, the use cases, and the possible data issues without providing a means to use the data, view the data and identify issues? It's like trying to anticipate what part of a car a mechanic should fix without test driving it first. Consider starting with the data you have and taking it for a test drive. See how the business wants to use it and evolve your data quality and architectural initiatives incrementally. You'll realize value along the way and better focus your efforts.

Why do they fail?

Lack of focus: Hyped-up terms like "big data", "data lakes" and "cloud" distract us from the pragmatic task of delivering information: getting reliable, current information into the hands of business users in a form...

May 7, 2018

Machine Learning: Is it really a Black Box?

Machine Learning isn't the "black box" that many perceive it to be. On complex data sets, the use of Machine Learning with a rigorous process and supporting visualizations can yield far more transparency than other methods.

What is a "Black Box"?

Machine learning models are sometimes characterized as Black Boxes due to their powerful ability to model complex relationships between inputs and outputs without being accompanied by a tidy, intuitive description of how exactly they do this. A "Black Box" is "a device, system or object which can be viewed in terms of inputs and outputs without any knowledge of its internal workings" (Source: Wikipedia).

Black Boxes (and Machine Learning models) exist everywhere

We tend to label things as "Black Boxes" when we don't trust them, more than when we don't understand them. Machine Learning models aren't unique in having an element of "mystery" in how they work; there are all sorts of things we trust all around us for which we don't fully understand the inner workings. GPS, search engines, car engines, step counters, even the curve-fitting algorithms in Excel are examples where we trust what's happening inside because we're able to see and, with experience,...
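The blog's argument is that a rigorous process and supporting visualizations open up the box. As one small, generic illustration of that idea (not the authors' specific method), here is a sketch using scikit-learn's permutation importance to show which inputs a fitted model actually relies on:

```python
# Illustrative sketch: permutation importance reveals which inputs drive a model.
import numpy as np
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

X, y = make_regression(n_samples=500, n_features=6, n_informative=3, noise=10.0, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_train, y_train)

# Shuffle each feature in turn and measure how much the held-out score drops:
# large drops mean the model genuinely relies on that input.
result = permutation_importance(model, X_test, y_test, n_repeats=20, random_state=0)
for i in np.argsort(result.importances_mean)[::-1]:
    print(f"feature {i}: importance {result.importances_mean[i]:.3f}")
```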

March 2, 2018

Machine Learning: Finding the signal or fitting the noise?

Before machine learning came along, a typical approach to building a predictive model was to develop a model that best fit the data. But will a model that best fits your data provide a good prediction? Not necessarily. Fortunately, there are machine learning practices that can help us estimate and optimize the predictive performance of models. But before we delve into that, let's illustrate the potential problem of "overfitting" your data.

Fitting the Trend vs. Overfitting the Data

For a given dataset, we could fit a simple model to the data (e.g. linear regression) and likely have a decent chance of representing the overall trend. We could alternatively apply a very complex model to the data (e.g. a high-degree polynomial) and likely "overfit" the data: rather than representing the trend, we'll fit the noise. If we apply the polynomial model to new data, we can expect it to make poor predictions given that it's not really modelling the general trend.

The example above illustrates the difference between modelling the trend (the red straight line) and overfitting the data (the blue line). The red line has a better chance of predicting values outside of the dataset presented. Due to the powerful...
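Here is a minimal, self-contained sketch of that comparison on synthetic data (not the blog's actual example): a straight line and a high-degree polynomial are both fit to noisy linear data, and held-out points show the overfit polynomial predicting worse.

```python
# Sketch: a straight line vs. a high-degree polynomial on noisy linear data.
# The polynomial fits the training noise and typically predicts held-out points worse.
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0, 10, 40)
y = 2.0 * x + 1.0 + rng.normal(scale=3.0, size=x.size)   # noisy linear trend

# Hold out every fourth point to test predictive performance.
test = np.arange(x.size) % 4 == 0
x_tr, y_tr, x_te, y_te = x[~test], y[~test], x[test], y[test]

def test_rmse(degree: int) -> float:
    coeffs = np.polyfit(x_tr, y_tr, degree)      # least-squares polynomial fit
    pred = np.polyval(coeffs, x_te)
    return float(np.sqrt(np.mean((pred - y_te) ** 2)))

print("linear (degree 1) test RMSE:", round(test_rmse(1), 2))
print("degree 15 polynomial test RMSE:", round(test_rmse(15), 2))
# The degree-15 fit generally shows the larger test RMSE: it fit the noise.
```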

January 15, 2018

Innovation or Dishwashing Robot?

The start of a new year is often accompanied by commitments to change and improvement, both business and personal. We're all striving to innovate: "to introduce something new; make changes in anything established". In business, this often involves technology. How you approach the introduction of technology could be the difference between realizing true innovation and ending up with a "Dishwashing Robot".

Image: Dishwashing robot (Source: Popular Science, 2010)

A friend introduced me to the expression "Dishwashing Robot" as a way of describing what happens when you apply new technology to an old way of doing things. It seems innovative on the surface, but it doesn't bring true, impactful change to an organization. "Innovation" (e.g. a dishwasher) occurs when you leverage the full potential of new technology to change a process and realize optimal benefits. So my challenge to you, and to myself, is to consider how we can change processes in our organizations to maximize the benefits of any new technologies we introduce.

A common "Dishwashing Robot" in O&G producing companies

A common focal point when introducing business intelligence solutions to an Oil and Gas producer is replacing the tedious task of assembling the Weekly Production Report – delivering it faster...
