AI is transforming the oil and gas industry: Insights from one of Canada’s largest producers

Canadian Natural Resources Limited (Canadian Natural) is one of Canada’s largest oil and natural gas producers. From a technology perspective, that means a continuous innovation mindset is critical to maintaining its leadership position – in everything from environmental stewardship and highly efficient production to driving digital transformation initiatives.

At the centre of Canadian Natural’s innovation mindset are teams collaborating to deliver on the company’s digital strategy, including individuals like JohnPaul Portelli, Technology & Innovation Lead. His role gives him a unique perspective: he is closely involved across many aspects of the business, determining the role of technology and its potential benefit to Canadian Natural’s shareholders, employees and customers.

I sat down with JohnPaul to gain insight into his thinking on topics ranging from digital transformation strategy to the benefits and challenges of artificial intelligence (AI) in the enterprise.

 

Brian Clendenin: JohnPaul, you are working on Canadian Natural’s digital strategy. What does that entail?

JohnPaul Portelli: “First, let me say that I am but one of many people who are working on the digital strategy for our company. I want to be clear about that. What we are after with our digital strategy at Canadian Natural is to improve productivity across all of our businesses.

At a practical level, this means evaluating whether we have the right data, the ability to generate new data, and the right tools to analyze that data. So while we are always evaluating new tools, we are also looking at how to better enable and empower our people to fully utilize them. This means training, new role development and constantly examining whether we have it right.

We have also developed processes and frameworks on how to tackle some of these digital problems. There are a lot of people and vendors out there who are overpromising on what AI and digital implementation can offer, but cost can get away from you if you are not disciplined. So, it’s as much a governance and cultural challenge as it is a technology challenge.”

 

Brian: What is the promise of AI and machine learning in the enterprise?

JohnPaul: “As it relates to our industry, the promise of AI is that it allows us to tackle problems we couldn’t before because they were too large and complex. Engineers and geoscientists – and I include geologists, geophysicists, etc. – have always used data and statistical methods to solve problems. What is different now is that some problems involved data so large or complex that it was impossible to get to the bottom of them through traditional methods, like using a spreadsheet. AI allows us to tackle some of these problems.

Another factor is that subject matter experts may have only looked at their plant or their specific area of responsibility, so you have people optimizing for their field. AI gives us some opportunity to optimize across a larger operation or collection of plants.

AI is going to be to the spreadsheet what the spreadsheet was to the calculator (or slide rule).”

 

Brian: When looking to AI as a possible area for innovation, what is important to assess at the beginning of the journey?

JohnPaul: “The approach we are taking here is to make sure that we are tackling actual problems. AI is very sexy right now and a lot of people want to throw these techniques at any problem out there. But with any problem, we first ask ourselves whether we can solve it through a continuous improvement methodology – simply removing inefficiencies in how we approach the work. If not, we determine whether a data science approach can work, and then move on to physical and chemical technologies if neither of those does.

What is important to assess at the beginning of the journey is whether the data you have is adequate. Is the data clean? … Is it available in digital form? … Do you have the right kind of data? And this is a big one: does the resolution and frequency of data capture reflect the variability of the process? We often see that people want to solve a problem with AI but only collect a data point once a day or once a week. In our business, most processes are not steady enough, and if they were, you wouldn’t need AI.
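To make the sampling-frequency point concrete, here is a minimal sketch – my illustration, not Canadian Natural’s tooling – of a Nyquist-style adequacy check: you need at least two samples per cycle of the fastest process fluctuation you care about.

```python
# Hypothetical sketch: is the sampling interval fine enough to capture
# process variability? By the Nyquist criterion, you need at least two
# samples per cycle of the fastest fluctuation of interest.

def sampling_is_adequate(sample_interval_s: float,
                         fastest_fluctuation_period_s: float) -> bool:
    """True if we sample at least twice per fluctuation cycle."""
    return sample_interval_s <= fastest_fluctuation_period_s / 2

# A pump whose condition can shift over ~1 hour, sampled once a day:
daily = sampling_is_adequate(24 * 3600, 3600)   # far too coarse
minutely = sampling_is_adequate(60, 3600)       # ample resolution
print(daily, minutely)
```

This is the arithmetic behind “a data point once a day or once a week” being inadequate for an unsteady process.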

The question then changes to: how much does it cost to collect more data? Is the technology available to collect it? At that point, what started as a data analysis problem can become a sensor problem. If the sensor is too expensive, it can kill any ROI you had envisaged. So now you need cheaper sensors, and it becomes a physical technology problem again.

One of the concepts I like is the sensors/barrel concept. It is very cost-effective to instrument a pump if you have 100,000 bbls/day of oil going through it. It is much more difficult to justify if you have 10 bbls/day going through it. This becomes a governance challenge. Do you have the discipline and the culture to think clear-headedly about what problems you are trying to solve and the easiest, most cost-effective way to solve them?”
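The sensors/barrel trade-off can be illustrated with some invented numbers – the figures below are assumptions for the sake of the example, not company data.

```python
# Hypothetical payback calculation for the "sensors per barrel" idea:
# the same sensor package pays back quickly on a high-throughput pump
# and essentially never on a marginal one. All figures are illustrative.

def payback_days(sensor_cost: float, throughput_bbl_day: float,
                 margin_per_bbl: float, avoided_loss_frac: float) -> float:
    """Days for avoided production loss to cover the instrumentation cost."""
    daily_benefit = throughput_bbl_day * margin_per_bbl * avoided_loss_frac
    return sensor_cost / daily_benefit

# $50k instrumentation, $10/bbl margin, monitoring avoids 0.1% lost production:
big = payback_days(50_000, 100_000, 10.0, 0.001)   # about 50 days
small = payback_days(50_000, 10, 10.0, 0.001)      # about 500,000 days
print(big, small)
```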

 

Brian: Let’s talk about governance and culture required for successful digital transformation execution.  Why do you say culture is the cornerstone to success?

JohnPaul: “There are a couple of reasons. Let’s start with a culture discussion. Culture is very important to success here.  Having a culture of entrepreneurship and stick-to-itiveness matters. Let me give you an example. Let’s say you are a technician in the field and I come to you with an algorithm that I claim can predict pump failure. My algorithm currently predicts a half-hour into the future and is right 80 per cent of the time. To the technician, this is useless because the algorithm is wrong one-fifth of the time and it takes six hours to change out a pump. If I set the expectation from the beginning that this can predict pump failure, the technicians will abandon it and any further improvements made to the algorithm won’t matter. I will have a bigger hurdle to overcome in convincing anyone.

Now, what if I use a partnership approach where I am transparent with the technician. I tell the technician, “This algorithm isn’t very good yet because I need better data. Let’s come up with a plan to improve data capture, from the sensors on the pump to maintenance work order data etc. Then, over the course of the year, we can see pump failure prediction improve from half an hour at 80 per cent accuracy to three days at 95 or 99 per cent accuracy.” A culture of humility is required to really understand how to approach the user to get the buy-in you need.

Governance is required to ensure that discipline is followed in really evaluating what the benefits of an implementation might be. We need to be clear-eyed about what the costs are, what the benefits are, and whether the outcomes are likely to be realized.”
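The actionability test in the pump example can be sketched in a few lines; the 90 per cent accuracy threshold below is an assumed figure for illustration, not a stated company policy.

```python
# Sketch of the actionability check in the pump-failure example: a
# prediction only helps if its lead time covers the response time and
# its accuracy clears some minimum bar (threshold assumed here).

def prediction_is_actionable(lead_time_h: float, response_time_h: float,
                             accuracy: float, min_accuracy: float = 0.9) -> bool:
    return lead_time_h >= response_time_h and accuracy >= min_accuracy

# 30-minute warning at 80% accuracy vs. a 6-hour pump change-out:
print(prediction_is_actionable(0.5, 6.0, 0.80))   # not actionable
# After better data capture: 3-day warning at 95% accuracy:
print(prediction_is_actionable(72.0, 6.0, 0.95))  # actionable
```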

 

Steam generators at a Canadian Natural thermal in situ facility

Brian: What are your thoughts on small scale vs big splash AI investments?

JohnPaul: “What you want to do is de-risk a technology at the cheapest cost and smallest scale possible. This is true of all technology deployments. I call it the cost of generating a unit of knowledge. You want this to be low. In the physical technology world, you don’t need a full-scale boiler to test at what temperature water boils. You can do that in a tea kettle. Knowledge at the lowest cost. But if you are testing a centrifuge, the G-forces increase tremendously at the periphery as you scale up. So, if you are handling an abrasive material, you have to design an experiment that takes into account the speeds and G-forces at full scale. The key is imagining all the things that can cause you to fail at commercial scale and trying to design tests for them at the smallest scale and lowest cost possible.

The same is true for AI. You want to fail fast and fail small. The way I look at it is to build a low-cost system where ideas and solutions can be tested, but designed in such a way that scale-up is possible. So maybe you test out the minimum number of sensors you need to predict pump failure, and you do it across a subset and variety of pumps. Then, when you have success, you put together a plan for deploying this across all the pumps in your system – or better yet, only the pumps that really need to be monitored because they generate the most revenue or have no redundancies.”

 

Brian: Is data fragmentation across numerous silos less of a challenge today with modern tools?

JohnPaul: “Yes and no. I think the tools today make data silos much less of a problem than in the past. There are application programming interfaces (APIs) out there that can pull in all kinds of different data. The challenge now is that there is much more data to deal with.

The other challenge in moving data across silos is that you are now moving between several different subject matter experts. This brings you back to the problem of culture and governance, and getting people to work together.”

 

Brian: What are the top three use cases for AI and machine learning?

JohnPaul: “For our industry, the first is predicting equipment failure or process upsets. This is likely the most doable. The physics of the processes are probably the best understood and the failure modes are few in number. Even if you can’t predict it perfectly, you can at least identify when anomalies are taking place that warrant further investigation. The challenge here is getting the prediction to work far enough into the future that something can be done about it. If you can predict the failure of a piece of equipment a half-hour in advance, but it takes you 24 hours to respond, that doesn’t help much. And if process redundancies don’t exist, that also limits your ability to respond.
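One common low-cost way to flag anomalies that warrant further investigation is a rolling z-score on a sensor channel; the sketch below is a generic illustration of the technique, not Canadian Natural’s method, and the data is invented.

```python
# Generic anomaly flagging via rolling z-score: flag any reading that
# deviates more than `threshold` standard deviations from the mean of
# the trailing window. Illustrative only; data below is made up.
from statistics import mean, stdev

def anomalies(readings, window=10, threshold=3.0):
    """Indices of readings far outside the trailing window's spread."""
    flagged = []
    for i in range(window, len(readings)):
        hist = readings[i - window:i]
        mu, sigma = mean(hist), stdev(hist)
        if sigma > 0 and abs(readings[i] - mu) / sigma > threshold:
            flagged.append(i)
    return flagged

# Steady pump pressure with one spike at index 11:
data = [100.0, 101, 99, 100, 100, 101, 99, 100, 101, 100, 100, 140, 100]
print(anomalies(data))  # → [11]
```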

Second, and this is more challenging, is to look at improving oil/gas production and recovery. Keeping equipment running helps quite a bit here, so equipment failure prediction helps. At the same time, we are trying to improve recovery and production from reservoirs by identifying characteristics of the geology and how we can interact with them to improve oil production and recovery. This is a lot more difficult because you don’t have a complete picture of what is below the surface and you can’t verify with your own eyes what you think is happening.

Third, we want to ensure that data and insights get into the hands of our frontline workers and technicians.  We can drastically reduce the time some of our technicians are on the road by only sending them out to where the most critical opportunities are. I think we can also give them more data at the source so they have the ability to make more informed decisions.  They already do a great job but sometimes the data we provide them isn’t adequate. We’re looking to rectify that.”

 

Brian: What is a project that you’re thinking about for the future that has the potential to drive digital transformation forward at Canadian Natural?

JohnPaul: “I look at all kinds of technologies for implementation, from physical and chemical ones to digital ones. On many of the physical and chemical technologies, we are operating at the frontiers of scientific discovery, so we are moving at the pace of science around the world.

The digital sphere is somewhat different. With AI, the aim is to use the data we have with the statistics we know, to drive insights from our operations. In that regard, there is a lot of opportunity. One of the projects that I have front-of-mind is to use our existing data to incorporate multiple plants in our operation and optimize across all of them, over several factors, such as oil recovery, greenhouse gas reductions, equipment failure, etc. I think as we continue to prove value in these areas, digital transformation will take on a life of its own.”
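As a toy illustration of what “optimizing across plants over several factors” can mean, one can grid-search a shared resource allocation against a weighted multi-objective score. The production and emissions curves below are invented for illustration, not drawn from Canadian Natural’s operations.

```python
# Toy sketch: split a shared steam budget between two hypothetical
# plants, trading oil recovery against GHG emissions via a weighted
# score. All curves and weights are illustrative assumptions.
from math import sqrt

STEAM_BUDGET = 100.0  # total steam available (arbitrary units)

def score(steam_a: float) -> float:
    steam_b = STEAM_BUDGET - steam_a
    # Diminishing-returns production curves, differing per plant:
    oil = 10 * sqrt(steam_a) + 14 * sqrt(steam_b)
    ghg = 0.5 * (steam_a + steam_b)   # emissions scale with steam used
    return oil - 0.2 * ghg            # weighted multi-objective score

# Coarse grid search over integer allocations to plant A:
best = max(range(0, 101), key=score)
print(best)  # best allocation to plant A, in steam units
```

Real versions of this replace the toy curves with models fitted to operating data and use proper optimizers, but the structure – one objective spanning several plants and several factors – is the point.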
