In the latest episode of our Energy Transition Talks, Maida Zahid sits down with CGI experts Mark van Engelen and Curtis Nybo to discuss the growing role of artificial intelligence (AI) in the oil and gas space. Specifically, they look at the evolution of, and need for, generative AI in the industry; the value of an iterative, domain-based approach to implementation; and cross-industry AI use cases to advance the energy transition.

The new frontier for AI in oil and gas: data, demographics and domain-based approaches

AI has supported the asset-heavy oil and gas industry for some time, especially in asset maintenance optimization and predictive maintenance. However, new areas of need are driving the evolving role and growing value of AI within organizations.

First, Mark mentions, is the need for generative AI to help unlock the vast amounts of data held by oil and gas companies (e.g., on the GIS side, on the land side, upstream, downstream, etc.). This rise of ‘data GPT,’ as he calls it, means gaining access to that data in a natural language format to pose questions like, ‘How many barrels did you produce last month?’ without clicking through several layers of reporting.

Second, as shifting demographics and changing workforces expose a knowledge gap between retiring experts and new professional entrants, generative AI is helping organizations bridge the gap and provide access to legacy knowledge in an efficient manner.

More crucial than the volume of data is its quality. When working on use cases with clients, Curtis says they begin with domains that have decent data quality or supporting data management processes, to maximize ROI and minimize time to completion.

As he explains, “we take a domain-based approach, where in parallel as you’re working on an AI project in the one domain, you can clean up the data of another domain next on your list,” so you’re not applying AI to the whole company at once; you’re starting with one area or team and expanding throughout the organization.

The ‘build vs. buy’ debate: determining the best AI strategy with the right partner

In such a fast-paced, product vendor-dense market, organizations are struggling to navigate whether to buy AI solutions or build their own.

According to Mark and Curtis, the biggest advantage an organization can have in mapping their AI strategy is a seasoned partner with experience testing new services or providers, who can advise and support the respective build or implementation (e.g., in terms of cost, performance, tradeoffs, etc.).

When it comes to moving from proof of concept (POC) stage to production-ready or enterprise scale-ready, Curtis says organizations must have an up-to-date data and AI strategy that accounts for recent responsible AI guardrails, as well as a structured team with the right skills and capabilities.

“Often in these AI projects, organizations go straight to data scientists, but they can only take it so far and that’s where you need the engineers, the platform architects, the developers who can take it from a POC to a production environment able to handle the required workloads.”

With a build strategy, an organization would be responsible for all those aspects (e.g., staffing and balancing data science resources against the engineering resources that will put the POC into production). A buy strategy, by contrast, can keep things very simple, with organizations selecting services that have all aspects packaged in (resources, compute, models). “With build, you get total control and can push the use case to where you want it to be and with buy it just makes it a little easier, but you lose a little bit of that control.”

The evolving AI landscape: use cases within and outside of the industry

For Curtis, a significant use case with clients is ‘democratizing your data,’ or using generative AI models to translate natural language questions into SQL queries, interact with your databases and produce responses in under 30 seconds. This removes barriers to that data within an organization.
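As a rough illustration of that pattern, a minimal text-to-SQL loop might look like the sketch below. It assumes an OpenAI-compatible chat API; the model name, example schema and prompts are placeholders for illustration, not a specific CGI implementation.

```python
# Minimal "data GPT" sketch: turn a natural-language question into SQL,
# run it against a database, and phrase the result as a plain answer.
# Model name, schema and prompts are illustrative placeholders.
import sqlite3
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

SCHEMA = """
CREATE TABLE production (
    well_id TEXT,
    month   TEXT,   -- e.g. '2024-05'
    barrels REAL
);
"""

def ask_llm(prompt: str) -> str:
    resp = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name
        messages=[{"role": "user", "content": prompt}],
        temperature=0,
    )
    return resp.choices[0].message.content.strip()

def answer(question: str, db: sqlite3.Connection) -> str:
    # 1. Ask the model to write a query for our schema.
    sql = ask_llm(
        f"Given this SQLite schema:\n{SCHEMA}\n"
        f"Return a single SQL query (no explanation, no markdown) answering: {question}"
    )
    # 2. Run the generated query (a real system would validate it first).
    rows = db.execute(sql).fetchall()
    # 3. Have the model phrase the raw rows as a natural-language answer.
    return ask_llm(f"Question: {question}\nQuery result rows: {rows}\nAnswer briefly.")

if __name__ == "__main__":
    db = sqlite3.connect(":memory:")
    db.executescript(SCHEMA)
    db.execute("INSERT INTO production VALUES ('W-1', '2024-05', 1250.0)")
    print(answer("How many barrels did well W-1 produce in May 2024?", db))
```

In production, the schema would come from the live database and the generated SQL would be validated (for example, restricted to read-only SELECT statements) before it is executed.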

For Mark, AI-driven legacy modernization is an exciting use case, as oil and gas companies often have many smaller custom-built legacy applications, which are expensive to update. He’s helped clients leverage AI to enable code conversion for software to advance through versions rapidly, improving cost efficiency and freeing up IT investment from technical debt.
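A minimal sketch of that kind of LLM-assisted code conversion is shown below; the model name, prompts and file names are assumptions for illustration, not the tooling Mark's team used.

```python
# Sketch: use a large language model to translate a legacy source file into a
# newer language, preserving behavior. Names and prompts are illustrative only.
from pathlib import Path
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def convert(source_path: str, source_lang: str, target_lang: str, out_ext: str) -> Path:
    code = Path(source_path).read_text()
    resp = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name
        messages=[{"role": "user", "content":
            f"Convert this {source_lang} program to idiomatic {target_lang}, "
            f"preserving its behavior. Return only the converted code.\n\n{code}"}],
        temperature=0,
    )
    out_path = Path(source_path).with_suffix(out_ext)
    out_path.write_text(resp.choices[0].message.content)
    return out_path

if __name__ == "__main__":
    # Hypothetical example: an old VB6 module converted to C#.
    print(convert("well_report.bas", "VB6", "C#", ".cs"))
```

The converted output still needs the same review and regression testing as hand-written code, which is why AI-driven test automation comes up later in the conversation as a companion use case.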

Within the field operations of an oil and gas company, Mark says generative AI is also enabling chat-like interactions with user manuals, giving teams rapid, natural-language access to entire asset knowledge bases.
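A bare-bones version of that kind of manual-aware assistant might look like the sketch below. TF-IDF retrieval stands in for a production embedding index, and the manual excerpts, model name and prompt are invented for illustration.

```python
# Sketch of chat over equipment manuals: retrieve the most relevant passages,
# then ask a model to answer using only those passages and cite them.
# Manual excerpts, model name and prompts are illustrative placeholders.
from openai import OpenAI
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# In practice these chunks would be parsed out of the real PDF manuals.
chunks = {
    "compressor-manual p.12": "A yellow status light indicates low oil pressure. Check the oil level before restart.",
    "compressor-manual p.47": "Monthly maintenance: inspect seals, filters and the drive coupling.",
    "turbine-manual p.3": "Start-up sequence for the gas turbine: purge, ignite, ramp to idle.",
}

def answer(question: str, top_k: int = 2) -> str:
    ids, texts = list(chunks), list(chunks.values())
    vec = TfidfVectorizer().fit(texts + [question])
    sims = cosine_similarity(vec.transform([question]), vec.transform(texts))[0]
    best = sorted(zip(sims, ids, texts), reverse=True)[:top_k]
    context = "\n".join(f"[{cid}] {txt}" for _, cid, txt in best)
    resp = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name
        messages=[{"role": "user", "content":
            f"Answer using only these excerpts and cite the source ids in brackets.\n"
            f"{context}\n\nQuestion: {question}"}],
        temperature=0,
    )
    return resp.choices[0].message.content

if __name__ == "__main__":
    print(answer("The compressor's status light is yellow. What should I check?"))
```

Returning the chunk identifiers alongside the answer is what enables the citation-style verification Curtis mentions later in the transcript.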

Outside the industry, Mark and Curtis see other use cases to draw inspiration from:

  • The financial sector’s use of AI for cash flow prediction gives oil and gas companies insight into which models are being used and what’s working well.
  • Asset-heavy industries (e.g., class-one railways, mining or manufacturing) offer lessons in how they run and optimize their assets.
  • Mining and forestry (whose reclamation projects are similar to those in oil and gas) use satellite imagery and AI with computer vision to help map and track project progress.
  • Agriculture’s use of predictive maintenance and machinery inspection can be similarly applied within oil and gas.

The road ahead: strategies for accelerating AI implementation and the energy transition

AI is already shaping energy transition activities, with predictive maintenance now being applied to wind farms and hydrogen assets. When it comes to ESG reporting, Mark sees generative AI playing a larger role in providing analysis and data-based guidance, for example, on how to run assets differently to produce fewer emissions.

For Curtis, optimization is key to energy goals. “Optimizing all operations from field services to quality control to predictive maintenance and material movement and ensuring every piece of those processes is performing with the least amount of waste” frees up cashflow and resources to be used elsewhere.

For organizations beginning or evaluating their AI journey, Mark and Curtis stress a few key criteria: data quality and management; the right teams and platforms to support an AI project from POC through to production; an iterative, domain-based approach; and a list of use cases to begin testing your strategy.

As Mark says, “AI is already here. Everybody should be on this journey, one way or another.”

Listen to other podcasts in this series to learn more about the energy transition

Read the transcript:

1. Introduction

Maida Zahid:

Hi, everybody. My name is Maida Zahid and I'm the Canadian marketing lead for energy utilities, based in Calgary, Canada. Welcome back to another episode of the Energy Transition Talks Podcast. Today we're going to be talking about something that's the talk of the town, artificial intelligence, and specifically AI in the oil and gas space. We'll cover the trends, how the industry is evolving, and how we can leverage AI for the energy transition. And to help me with the discussion, I'm joined by two of CGI's fantastic experts and my favorite people, Mark van Engelen and Curtis Nybo. So to get started, why don't you guys introduce yourselves? Mark, I'll start with you.

Mark van Engelen:

Thanks, Maida. Mark van Engelen, I'm the VP of emerging technologies for CGI in Western Canada. I have about 20 years of experience and background in the oil and gas industry, but love working at CGI because I also get to work with other industries and see the best of all worlds and bring them back to the energy sector. So that's a bit about me. Curtis?

Curtis Nybo:

My name is Curtis Nybo. I am also located in Western Canada with CGI, and my background is data science. I'm a director on the data science team. My data science background mostly consists of prediction and forecasting, but for the last two years I've spent most of my time in natural language processing, which includes generative AI, and for nearly the last year I've been doing research and working on projects related mostly to generative AI.

2. Changing demographics and evolving needs for AI in the oil and gas industry

Maida Zahid:

Well, welcome both of you. Thank you for joining us today, and I'm going to dive right in. So Mark, let's start with you. Can you tell us why there is a need for AI in the oil and gas space and how the industry has been utilizing it traditionally?

Mark van Engelen:

Great point. AI has actually been in the oil and gas industry for quite some time. People have been using it, especially in the asset maintenance, predictive maintenance space. It is an asset-heavy industry of course, so people are wanting to optimize that. I guess the traditional AI has been used for quite a while and there's been some really good use cases that people have been using. I do think now with generative AI, there's some other doors unlocked. A, it's becoming a more broad conversation, anywhere from the C-suite and on the business side, while that was maybe a challenge in the past. And we see it being addressed now in a couple of different areas. One is unlocking data. There's so much data in these oil and gas companies, be it on the GIS side, on their land side, upstream, downstream, et cetera.

They have so much data, so just to navigate that is a big challenge. You need people who know their data domains inside out, et cetera. So we definitely see a huge uptick in the kind of 'data GPT,' as I would call it, getting that access to that data in a natural language format, so you can just ask questions like, 'How many barrels did you produce last month?' without having to click through seven layers of reporting to get to that number. And then you can ask some follow-up questions, "Well, how much byproduct did we get?" Et cetera, et cetera, right? So we see a big uptick in that, having the data accessible at people's fingertips as being a big differentiator with generative AI.

The other main area that we see is in the changing workforce and the changing demographics. In particular, if you look into Canada where we have a whole raft of retirees coming up, we also have people coming out of university and schools with particular knowledge, which may not be fully aligned with what we need out in the field. So there's this growing knowledge gap, and it's harder for people to get that knowledge just because of how companies are operating now compared to how they used to operate. So changing workforces are another big driver, and we're looking at whether we can use generative AI to access that knowledge, again in a natural language way, et cetera.

Curtis Nybo:

And I think another big driver is the availability of compute services. So all your cloud computing services: back when oil and gas companies were collecting all that data prior to the mid-2000s, they had to deal with that data themselves. But now with the cheap and powerful compute that is available through cloud providers, that opens up doors for them to do complex machine learning as well as edge computing, so doing machine learning right on site. I think that was a big driver as well.

Mark van Engelen:

That's a good point, and the addition of quantum computing now that's becoming available makes it even more truly powerful and it's an option, right? In the past they couldn't afford such solutions, and they can now.

3. Differentiators, bottlenecks and opportunities for AI implementation within organizations

Maida Zahid:

So between junior, midsize, and giant companies, how do you see AI being implemented and adapted across the different sizes of companies out there?

Mark van Engelen:

I would say we hear different things. There are some midsize to large companies that are really doubling down on their differentiators. So if they see an area of strength, could be in the asset space, could be in the financial space, et cetera, they're like, "Hey, how can AI help us accelerate or emphasize this strength?" Others, like junior upstream companies, may just sit and wait and see until somebody solves this problem for them, because they don't have the bandwidth to do all the steps that are necessary to do an AI project properly, but they'll likely adopt something if somebody's going to onboard it into their platform or make it accessible for them at an affordable rate, without having to spend the time and the energy and the money upfront to develop this.

But that means, in my view, the difference is going to be larger, like the companies that are investing in this as a differentiator versus the companies that are waiting until there is something onboard, there's going to be a larger digital gap between the leaders that are embracing AI and the ones that are waiting and seeing. Because the speed of AI evolution is going so rapidly, it's going to be a big challenge for the ones that are going to wait and see, but I understand the reasoning perfectly fine as well.

Maida Zahid:

Curtis, I'll move on to you. What are some of the specific challenges or bottlenecks you've seen organizations face across the spectrum?

Curtis Nybo:

Having the knowledge of how the business operates and that knowledge disappearing with retirees, and how do you backfill that and how do you fit AI into that to fill some of those gaps? But if we go back to basic AI bottlenecks, the biggest one would be data. So we talked about how oil operators usually collect a ton of data, but what kind of quality is that data? And so from my experience, most of the time we spend in a project usually turns into a pretty big data cleansing exercise. So that's looking at: what's the data stored in? What's the format? Is the data stored in PDFs? Do we need additional tools to be able to parse that data out of those formats? And I think that's the biggest bottleneck that most organizations have, because you can't start any AI project until that foundation is there.

Mark van Engelen:

I was going to jump in. That's why, when we work on use cases with our clients, we tend to look at some of these domains where they have pretty decent data quality, or where they have good data management processes around it, so then we know the effort to get to that AI stage is less than if we have to start from scratch or do a lot of other data quality cleansing. That said, a lot of these AI projects take a couple of weeks or a couple of months to execute, so we actually see a really good return on investment.

But if you were to pick a use case where the data is missing or it's locked up in a couple of thousand PDFs and you need to do 10 additional steps to get to that data, then it's going to take longer, people are going to ask questions. It's not as good an ROI (return on investment) as the data domains that are under control. So we do that domain-based approach where, in parallel as you're working on an AI project in the one domain, you can clean up the data of another domain that you have next on your list, so you can continue to evolve your practice.

4. Optimizing for efficiency and keeping pace with AI strategies

Maida Zahid:

Okay. A little bit on the positive: we said one of the most significant impacts is increased efficiency and productivity, and we've seen AI-powered tools that can automate repetitive tasks. So what are your thoughts on the impact of gen AI on the workforce? We kind of touched on that already, but what are some of the positives that you've seen that they can implement?

Curtis Nybo:

Oh, I think it saves people a ton of time. Not everybody, but it's often looked at as a negative that it's going to take away jobs, that sort of thing, but really AI is just an additional tool to help people make more out of their time within their day and help speed up menial tasks in some cases, or provide information that they otherwise wouldn't have to help make decisions. That's the largest positive in my book.

Mark van Engelen:

There is a big knock-on effect, and some companies use that as their driving force as well, around employee experience, and we see the same with automation as well as AI. Organizations tend to look at it as: how can I free up my people's time to focus on value-add activities versus just data entry or spending hours trying to figure out the data, while they could use a data GPT type of solution to get to the same answer? So it's about saving time so they can spend it on more value-added activities. We see that as the main driving force, and the positive side effect of that is we see employee experience improving, because people get to work on things that are exciting, interesting and challenging versus repetitive and boring in some cases.

Curtis Nybo:

And it's exciting trying to stay up-to-date with all the new AI stuff.

Maida Zahid:

It's very cool for sure. And this is kind of part of their overall strategy, but one of the things that we've seen is this space moves so fast, so AI is developing so fast every day. How do you see companies keeping up with their strategy? How do they keep up-to-date with this fast moving pace?

Mark van Engelen:

That's a good question. We get a lot of questions from our clients because it's so fast paced and they have so many different kind of product vendors, et cetera reaching out to them saying, "Oh, I've got this AI solution. You should buy this for X dollars per user, per month," or, "You should buy that," and then another vendor comes along. So they're trying to figure out do I buy those ones and if I buy multiple, is there any overlap? They're just trying to navigate this market of the gen AI explosion that is happening, and they're also figuring out, "Well, do I actually build it in some cases? Because it's either very differentiating for me and a competitive advantage, or I'm not sure what this other company is going to do with my data. I'm not sure if I can trust that yet, so maybe I should build it myself, because then at least I know where the data is flowing, et cetera."

5. The “build vs. buy debate” and the value of the right partner to help you develop AI strategies

So a lot of our clients are struggling navigating that market at the moment, but there are some really good examples and some use cases. So leveraging a partner I think like CGI that knows what the pros and cons are of some of these tools, and we've built several of these on production and enterprise scale for clients, so we've already gone through a lot of those lessons and can share those with our clients because it takes a lot of effort just to stay up-to-date about what's working, what isn't working. And Curtis, you're knee-deep in that world, but what would you say?

Curtis Nybo:

I think that's the biggest thing is having a partner that's able to spend the time testing and trying all these new services and different providers and their services that they provide, and we spend a lot of time, and I spent a lot of time in my role working with different generative AI models, for example, in more of an even academic sense but within projects and trying to figure out which ones perform best in which situations? What are some of the issues we run into, the errors? What are they best at, because each model is slightly different than others? And a big part of that is what is the cost associated for each model? There's often open source models which sound good on paper, but now you're responsible for the compute power and managing the capacity for it, and if you need more compute, you're now responsible for scaling that up. Whereas if you leverage a service that provides everything under the hood and you just pay a flat fee, that's a different conversation and it all comes down to a little bit of that trade off between performance and cost.

Maida Zahid:

So you just kind of touched on the argument between do I build it myself or do I buy it. Are there any tools out there or a platform that you've seen companies utilize? How can we help companies or organizations go from the proof of concept stage to being more production-ready or enterprise scale-ready?

Mark van Engelen:

I think it starts off with a data and AI strategy. Some organizations have that, or it may be two years old, et cetera, but because of generative AI and all the effects that it's had over the last couple of months, people definitely need to revisit that, because it touches on a lot of the questions like buy versus build, the responsible use of AI, providing the right guardrails, et cetera. So I think that's where it starts for me, in that data and AI strategy, and making sure that's refreshed and that we know what AI tools we are going to be using in our company or not, so I think that's really important for our clients to figure out. And then I think the team is another important piece: are we structured properly and do we have the right skills and capabilities for this? Curtis, what would you say about the team structure?

Curtis Nybo:

Yeah, absolutely. Just on that, for the build versus buy aspect of it, building allows you to really get knee-deep into it. You can have total control over what models you're using, what applications you're using, but now you're responsible for maybe the infrastructure, and you're definitely responsible for the team that's supposed to be maintaining that. And I think that's what you're getting at, Mark: who should be on that team? How many people do you need on that team? What's the use case? And it can be really tricky. Often in these AI projects, organizations go straight to data scientists, which is good, I'm more on the data science side myself, but we can only take it so far, and that's when you need the engineers, the platform architects, the developers who can take the proof of concept that data scientists like myself would be able to produce and put it into a production environment that can handle the workloads that are going to be required of it.

And so with a build strategy, an organization would be responsible for all those aspects: for the staffing, and trying to find that balance between data science resources and, more importantly I would argue, the resources that will actually take the POC from a proof of concept into production. Whereas I think a buy strategy for some organizations is a pretty straightforward pick to solve a lot of these problems; it can keep things very simple. Often with vendors, when we provide those types of services, that is all packaged in: how many resources we need, what kind of compute you would need, what kind of models you need to be able to handle the workload. And so there's pros and cons to both. With build, you get total control and you can really push the use case to where you want it to be, and with buy it just makes it a little bit easier, but you lose a little bit of that control.

6. Compelling AI use cases within the oil and gas industry

Maida Zahid:

I really like how you summarized the build versus buy argument. We talked a lot about the use cases, but can you guys expand on some of the cool and shiny things you've been doing with some of the clients?

Curtis Nybo:

I think the coolest ones that we've been working on are in the generative AI space. That's kind of the new, shiny AI object that everyone's working with. Everybody thinks of generative AI as working with text, but it's also really good at working with computer code. And so one big use case that draws a lot of attention is what we call democratizing your data, where you're using these generative AI models to write SQL code or be able to interact with your databases. So one strategy we usually use is to use that large language model to write SQL code to be able to query your database with just a natural language question. So you can ask it a question, and based on that question and the schema of your database, it'll be able to write the SQL query and then run that SQL query on your database, and that removes the barriers to all that data within an organization.

So where before you might've had to go to somebody who knew SQL or was responsible for producing reports and you'd have to make a request, they'd have to add it to their workload, they'd have to find time to write the SQL query, get the results, send it to you, instead you could go to a chat bot-like interface, ask the question. It does all that in the backend and produces a response in less than 30 seconds in most cases. And I think that's one of the more powerful use cases that we've been developing for the clients, and it just goes to show the flexibility of generative AI in the larger models.

Mark van Engelen:

What I'm also excited about is AI-driven legacy modernization, because lots of oil and gas companies have lots of smaller custom-built applications hanging around. It's quite expensive to keep them up-to-date, or they just don't get to it because they have all these other priorities. So for a couple of clients now we've leveraged that same kind of code conversion, but for software code, to be able to take things from version two to version six really rapidly compared to just having a developer do all of that work, et cetera.

So it's becoming much more cost efficient to do some of that legacy modernization for the oil and gas companies, so that's been pretty exciting. There's a couple tools on the market, or we have built that for a couple of clients as well because a lot of the technical debt is holding the oil and gas companies down from innovating or doing other things. So if this generative AI can help speed up that legacy modernization, and we see that in several clients already, that's going to make a world of difference on where the IT investment is going. And again, that could create other types of job opportunities as people move away from old into the new.

Curtis Nybo:

And I think that also plays a big part in the changing workforce that we've talked about a couple of times already, where for a lot of that technical debt within oil and gas companies, there's only a certain number of people that can, say, write COBOL code or write legacy system code. And as they retire, that knowledge goes with them, and they're simply not teaching a lot of those skills anymore. And so we can leverage AI to fill that gap, where a generative AI model already knows those legacy programming languages and you're able to leverage a bot where, in lots of cases, you can plug in code of one kind of legacy language and have it returned in a new language with the same functionality.

Or another use case that we've spent quite a bit of time on is understanding code. So you take legacy code and you put it into a generative AI model and it doesn't convert it, but it goes through and it comments it and marks exactly what each line of code is doing, what each block of code is doing so that you have a better understanding of at least what the application does when the person with the knowledge of that application has left the organization or retired. That's a big use case.

Mark van Engelen:

We could talk about it all day, Maida, but just let me do one more. And I know, Curtis, you worked on a couple of projects like that. It's more, if you look at the field operations of an oil and gas company, getting access to that knowledge. Let's say you're looking at some machinery and the light's yellow and it says something on the display: what should I do? So being able to have a chat-like interaction with user manuals, with your whole knowledge base on your assets, is going to be a differentiator for people that, for example, are working at a brand new LNG site and need to know how this machine actually works because it's just going to go live now. So having access to that information rapidly and in natural language is a differentiator, and the same for some of those brownfield sites that have been there for a while; there's not a lot of knowledge left around them, so being able to access that information speeds up those processes as well. So that's another one, access to knowledge in operations, especially in the field for oil and gas, that is also going to make a big impact.

7. AI use cases outside of oil and gas with transferable learnings and applications

Maida Zahid:

And I've heard a lot of things about chatbots, and it feels like chatbots are kind of the gateway for AI into a lot of industries. I know we've done that in health before, we've done that in utilities. So looking outside of oil and gas at some other industries, what do you see that organizations can learn from other industries, let's say manufacturing or the financial sector? What have you seen?

Mark van Engelen:

We constantly look outside, so oil and gas also has cash flow. So looking at the financial sector and what they're doing on cash flow prediction and what models they're using and what's working well and bringing that, again, back into the oil and gas side of it. The other one is any other asset intense industry, be it a class one railway or a mining company or those types of things, manufacturing, we also continuously look at how do they run their assets? How do they optimize it? What can we leverage from there and bring that to oil and gas? So we constantly look into the other sectors and what we can bring, what works well there, and does that work well in oil and gas?

Curtis Nybo:

And there's a huge amount of crossover, because you look at mining or forestry, where the reclamation side of the projects that they all work on is pretty similar, where you can use satellite imagery and AI with computer vision to be able to map out and see progress on those types of projects. Field services are pretty similar between multiple industries. Same with predictive maintenance and machinery inspection; that aspect of it is cross-industry, anything from agriculture to mining to forestry to oil and gas. And so it's just a matter of keeping an eye on what everyone else is doing and how well it's working for them.

Maida Zahid:

The work is cut out for us, we just got to get started.

Mark van Engelen:

Exactly.

8. The role of AI in meeting emissions targets and advancing the energy transition

Maida Zahid:

That's what it sounds like. So we've talked a lot about the current, what's happening right now, and I kind of want to jump into the future, because as we all know, like most industries, the oil and gas industry has a lot of emissions targets to meet. There's deadlines coming up and we have to eventually look at the energy transition. So what are some of the key ways they can leverage AI to get them closer to meeting those targets?

Mark van Engelen:

I think a lot of it's about finding the right use cases. As that's important for an oil and gas company, they're also applying the same predictive maintenance on wind farms. They're applying it to any hydrogen assets that they have, et cetera. So anything that's part of that energy transition is also part of the use cases that those companies tend to look at. We see a lot of energy or focus on ESG reporting and I wouldn't say there's a lot of AI yet on that, but there's a lot of analysis on, "Okay, if we have all of these emissions, what can we do about it? Could we have one of the assets run differently so it produces less emissions?"

I can see AI playing a larger role in that, but to be honest, there's a lot of foundational work that still needs to be done for a lot of these oil and gas companies in that space, like getting good data management, good data quality, or even just good data. Identifying the correlation between if I change the rotation of my compressor, what impact does that have on the emission level? That needs to be captured somewhere as well and only once that's captured, then you can start to let AI loose on top of that to say, "Well, give me some ideas on what I can do to make this better." So I see that as coming up next. I still think people are working on some of these core areas, either foundational or some of our more primary AI use cases, I would call that.
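As a rough, made-up illustration of the foundational step Mark describes, quantifying how a controllable operating parameter relates to measured emissions before asking AI for recommendations, a toy calculation might look like this (the readings are synthetic):

```python
# Toy example: how strongly does compressor speed relate to daily CO2 output?
# The readings below are synthetic and purely illustrative.
import numpy as np

rpm = np.array([800, 900, 1000, 1100, 1200, 1300], dtype=float)
co2_tonnes_per_day = np.array([4.1, 4.6, 5.4, 6.1, 7.0, 7.8])

r = np.corrcoef(rpm, co2_tonnes_per_day)[0, 1]             # correlation strength
slope, intercept = np.polyfit(rpm, co2_tonnes_per_day, 1)  # simple linear fit

print(f"correlation: {r:.2f}")
print(f"roughly {slope * 100:.2f} t CO2/day per additional 100 RPM")
```

Only once relationships like this are captured in the data can a model be asked, as Mark puts it, to "give me some ideas on what I can do to make this better."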

Curtis Nybo:

Optimization, that's the biggest one to me. It's trying to allow organizations to be able to do the most with the least amount of effort, so optimizing all operations from field services to quality control to predictive maintenance and material movement, and ensuring that every piece of those processes is performing with the least amount of waste essentially, in my mind. And so after optimization is complete and you really hit a mark where it's performing as good as it almost could perform, that's when it frees up cashflow to invest into other projects. It frees up resources. You're not having to have as many people fix assets, for example, if you're able to do maintenance. And so I think to me, optimization is the fundamental area to tackle and make sure everything's running as good as it possibly could run.

9. Strategies for moving from AI proof of concept to production within organizations

Maida Zahid:

Okay. Optimization is the name of the game, but in addition, what are some of the other things or key strategies these organizations can deploy to accelerate their AI implementation?

Mark van Engelen:

What I've seen is that really getting from that proof of concept stage to enterprise level is the key bottleneck, so how do you overcome that? I think we touched on several of those things. If you think of AI in the four E's, so envision, explore, engineer, expand, a lot of people get stuck on explore because either their teams are not structured properly, or they're spending much more energy on the data quality than they thought they would. So again, there's a couple of these lessons that we talked about throughout here that I think people need to apply. It starts with the data and AI strategy, it starts with the foundations that you need to have in place before you get to AI, which is good data quality, good data management in the domain you're looking at. And then the other one is having your team built for success.

I've seen a couple of organizations where they've had a great, smart team with data scientists, data engineers, architects, et cetera, but after a couple of use cases, they end up owning that in the production as well. So then they can't really work on any other use cases because they're just maintaining the things that they've built so far, and that turns into a bottleneck and then organizations get stuck. So I think it's good to really think through that and think through that flow, as well as having enough use cases in your pipeline. So that starts with the envision side of it. There's lots of use cases across different industries, but having a list for what's relevant for your organization I think is key.

Maida Zahid:

Curtis, anything to add?

Curtis Nybo:

Yeah, I think that goes back to the buy versus build part of it. I agree with all of that. If you're building your own AI solutions, I think that's a really good point: be careful that you don't end up building one solution and then that's what the team spends most of its time maintaining, and then potentially having to hire a second team to be able to start doing a new project. A pretty often overlooked part of the pipeline of these projects is going from POC to production, and then what that looks like once it's in production as far as maintenance goes, and what that allows for capacity for additional projects down the line.

10. The future role and impact of AI within the oil and gas industry

Maida Zahid:

Let's say closer to 2030, how do you see AI transforming the industry?

Curtis Nybo:

To me it would be the access to information. So if we think of just generative AI, when we have huge amounts of technical manuals and that sort of thing for, say, a maintenance team that's responsible for maintaining equipment, they won't have to spend hours of their day trying to troubleshoot or find problems by going through tons of manuals. If they're unfamiliar with the hundreds of pieces of equipment they might work on, they'd be able to just ask a generative AI bot, which would do all the heavy lifting for them: find the information within those documents, return where that information came from, but also return the information itself and maybe a step-by-step plan of how to actually fix the issue, with some citations for verification purposes. I think that's going to be one of the bigger things, just pretty much enhanced search. As boring as it might sound, it's pretty powerful.

Mark van Engelen:

You can see AI popping up in different areas. So let's say you implement it in one team's area and they can do things faster, they can do things more efficiently, but that was part of a supply chain. They're handing stuff over to another team, so you're basically pushing a problem out to somewhere else. Now they need to implement AI to be able to handle the large volumes of data they're getting and to make sense of all of that, et cetera. So you'll see the need, and we see that even in legacy modernization: great, I can convert code more rapidly, but now I need to also test more rapidly, so I'm looking at AI-driven test automation, et cetera. So in the same way, you'll see the need for AI shifting; as it gets more dominant in certain domains, the domains surrounding it also need to start leveraging more AI just to keep up. That's one thing, and the other thing is I think lots of organizations are keeping an eye on what AI means for certain roles.

If I buy, maybe, a platform, how's that going to affect the productivity of my project managers, or how's this going to affect the productivity of my field staff, and what am I going to do with that productivity gain? Where am I going to put that, and how am I going to measure that they actually got that productivity gain? So having good benefits realization frameworks in place, and a good idea of, if I save somebody two hours a day, what they are going to do with those two hours, and giving them the right direction for it. So there's a lot going to happen in that space, where we're going to look at the different jobs and personas, what we're actually going to do with these productivity gains, and how we're going to measure the success of that. So that's going to be a quick follow-up question between now and 2030 as well.

11. Key takeaways

Maida Zahid:

Yeah. We only have six years to go, so we've talked about a lot of things. I think we've talked about many aspects, but if you both had to pick let's say one or two key takeaways, what would that be from our discussion today? And I'll start with you, Curtis, first.

Curtis Nybo:

For me, the biggest key takeaway would be what I've encountered the most and what I think is often overlooked, which is proper staffing of AI projects. So ensuring that, if you're building the solution, this mostly pertains to that, you need more than data scientists. You need the entire team, so you need the architects, you need developers, and you need to have that entire approach nailed down, where you can put the project through a POC phase and then put it into production using the other team, while the data science team maybe goes back and tries to tackle a new problem and then repeats the process, with that kind of maintenance of the first use case being taken care of by another team. So it's an iterative process, I guess a domain-based approach as Mark called it earlier, where you're not applying AI to the whole company at once, but you're starting with one area, with one team, and then expanding that throughout the organization, while making sure that you have the proper capacity to support that project through its entire life cycle so that it doesn't get stalled.

Mark van Engelen:

I would say AI is already here. Generative AI has some really good use cases that we know of and some really good benefits. We have some good playbooks on how to get started, and we talked about several of these topics today already, about what needs to be in place to make something successful in the AI space. So if organizations haven't started yet, I would definitely start with that envision phase: really get that list of use cases, get your AI strategy in order and pick your first domains so you can work on your data quality, data management, et cetera. And then the second part is making sure you can scale and are ready to get this into production mode with the right team, with the right platforms, et cetera, in place. We know this is making a huge impact with some of the clients we work with, so I think everybody should be on this journey, one way or another.

Maida Zahid:

Awesome. Well, thank you for your time today, Mark and Curtis, and thank you for a fantastic discussion.

Mark van Engelen:

Thank you.

Curtis Nybo:

Thank you.