SLB - Sizing Billion USD Well Construction Tenders Using Web Applications & Machine Learning Models

Names:

Valerian Guillot 
Pierre Sesboue
Myriam Amour
Weijia Du
Ana Escobar
Ahmed el Ganaini
Nadia Rahmadianty
Todd Hughes
Mohamed Yehya Rabie
Mohamed Khalil Nafi
Oscar Eduardo Ramirez Lamus

Country: France

Organization: SLB

SLB is a technology company that partners with customers to access energy. Our people, representing over 160 nationalities, are providing leading digital solutions and deploying innovative technologies to enable performance and sustainability for the global energy industry.

Awards Categories:

  • Best AI Acceleration Use Case
  • Best Moonshot Use Case
  • Best MLOps Use Case
  • Best ROI Story

 

Business Challenge:

SLB offers integrated well construction services to operators. One of the main services offered consists of the delivery of “lump sum turnkey” wells, where SLB delivers wells to its customers at a fixed cost. In this type of fully managed turnkey service, SLB carries an increased proportion of the well delivery risk. Through integrated projects, SLB has drilled more than 10,500 wells, totaling more than 20,000 km drilled (Fig. 1).

Figure 1. Location of SLB integrated projects (source: SLB, 2021).

These opportunities range in size from millions to billions of USD and are typically awarded following a tendering process in which multiple service providers bid for the project.

As SLB carries an increased proportion of the well delivery risk in a competitive environment, we must size our bid response correctly to ensure that the project is profitable while offering our customers competitive prices.

As most of the costs SLB bears are time-related (e.g., rig rental, personnel, equipment use), when a tender is issued we need to analyze historical data to predict the time it will take to deliver the wells in the tender and to understand the associated risks. We then build on this understanding to put together a strategy to outperform historical well construction performance and respond to the tender with a competitive offer.

Determining the cost of a well starts with the manual classification of operations, followed by the extraction of key performance indicators before eventually building an operational sequence and forecasting the risks associated with each well of an invitation to tender.

The main problems to solve to achieve this prediction are:

  1. The data are often stored in unstructured reports [daily drilling reports (DDRs)] (Fig. 2).
  2. The turnaround time to respond to a tender is extremely short.

Figure 2. Snapshot of a DDR, which describes the activities being performed, as well as nonproductive time events that may have occurred (sourced from Australia’s National Offshore Petroleum Information Management System).

This tedious and time-consuming process causes some critical challenges:

  1. Time: Manual data ingestion and classification are time-consuming, tying up resources that could be better utilized.
  2. Accuracy: Manual classification introduces human error and the unconscious bias of the well engineer.
  3. Scalability: The results are difficult to reproduce for future business opportunities.

 

Business Solution:

To address these issues and precisely estimate the drilling time of a well campaign, thereby maximizing the win rate of profitable bids, two natural language processing (NLP) services, two web applications, and several extract, transform, load (ETL) pipelines were built (Fig. 3).

Figure 3. Structure of the Dataiku projects used when responding to tenders for well construction projects. BI: Business Intelligence; PET: Project Evaluation Team; MDM: Master Data Management; DDR: Daily Drilling Report.

An NLP-based web application was first deployed in Dataiku to parse unstructured client data, making it possible to efficiently classify operational categories and extract depths. The NLP-based application consists of:

  1. A Dash web application hosted in Dataiku that enables the user to:
    1. Upload daily drilling reports in tabular format.
    2. Validate/correct activity codes.
  2. A data processing flow, with an automation scenario that triggers processes as users interact with the web application.
  3. Several NLP services hosted in the Dataiku API node to:
    1. Assign standard activity codes based on the description (English, Spanish) using a Bidirectional Encoder Representations from Transformers (BERT) NLP model.
    2. Extract depth information from descriptions using regular expression rules (Figs. 4 and 5).

Figure 4. Python services hosted in Dataiku that run the BERT NLP models.
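For illustration only, a minimal sketch of what such a classification-and-extraction service could look like is shown below. It is not SLB's production code: the model path, label names, and depth pattern are assumptions, and the real services run behind the Dataiku API node.

import re
from transformers import pipeline

# Assumption: a multilingual BERT checkpoint fine-tuned on labeled DDR activity descriptions.
activity_classifier = pipeline(
    "text-classification",
    model="path/to/fine-tuned-bert-activity-codes",  # hypothetical model path
)

# Assumption: depths appear as numbers followed by a metre/feet unit in the free text.
DEPTH_PATTERN = re.compile(r"(\d{1,5}(?:\.\d+)?)\s*(?:m|ft)\b", re.IGNORECASE)

def classify_activity(description: str) -> str:
    """Return the most likely standard activity code for one DDR line."""
    return activity_classifier(description)[0]["label"]

def extract_depths(description: str) -> list[float]:
    """Pull depth values out of a free-text DDR description."""
    return [float(value) for value in DEPTH_PATTERN.findall(description)]

print(classify_activity("Drilled 12 1/4 in section from 1500 m to 1850 m"))  # e.g. a drilling code
print(extract_depths("Drilled 12 1/4 in section from 1500 m to 1850 m"))     # [1500.0, 1850.0]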

Figure 5. Web application deployed in Dataiku that allows the user to QC the results of the automated classification.
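As a rough, hypothetical sketch (not the SLB application), a Dash skeleton for this kind of QC web application could look like the following; the CSV upload format and the single callback are simplifying assumptions.

import base64
import io

import pandas as pd
from dash import Dash, Input, Output, dash_table, dcc, html

app = Dash(__name__)
app.layout = html.Div([
    dcc.Upload(id="ddr-upload", children=html.Button("Upload DDR (CSV)")),
    # Editable table so the engineer can validate/correct the suggested activity codes
    dash_table.DataTable(id="ddr-table", editable=True),
])

@app.callback(
    Output("ddr-table", "data"),
    Output("ddr-table", "columns"),
    Input("ddr-upload", "contents"),
    prevent_initial_call=True,
)
def load_ddr(contents):
    """Decode the uploaded CSV and display it for QC of the predicted codes."""
    _, payload = contents.split(",", 1)  # dcc.Upload returns "data:<type>;base64,<payload>"
    df = pd.read_csv(io.StringIO(base64.b64decode(payload).decode("utf-8")))
    # In the real flow, the BERT service would add a suggested activity code column here.
    return df.to_dict("records"), [{"name": col, "id": col} for col in df.columns]

if __name__ == "__main__":
    app.run(debug=True)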

To further accelerate the digitalization of the Offset Well Analysis (OWA), another web application is being developed to compute well construction KPIs, predict the sequence of operations needed to drill the wells in a drilling campaign, and, finally, predict the time required. For every new tender, the well engineer can now easily quality-check the suggested key operations needed to drill a well, thanks to an alignment of the operations performed by an algorithm inspired by genetic sequence alignment (Fig. 6).

Figure 6. Screenshot of the web application in Dataiku used to predict the sequence of operations necessary to drill the well as per the scope of work.
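The alignment step can be illustrated with a minimal Needleman-Wunsch-style global alignment over operation codes, shown below as a sketch only; the scoring values and operation names are illustrative, not those used in the application.

def align_operations(seq_a, seq_b, match=2, mismatch=-1, gap=-2):
    """Globally align two operation-code sequences, padding with '-' gaps."""
    n, m = len(seq_a), len(seq_b)
    # Dynamic-programming score matrix, initialized with cumulative gap penalties
    score = [[0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        score[i][0] = i * gap
    for j in range(1, m + 1):
        score[0][j] = j * gap
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            diag = score[i - 1][j - 1] + (match if seq_a[i - 1] == seq_b[j - 1] else mismatch)
            score[i][j] = max(diag, score[i - 1][j] + gap, score[i][j - 1] + gap)
    # Trace back from the bottom-right corner to recover the alignment
    aligned_a, aligned_b = [], []
    i, j = n, m
    while i > 0 or j > 0:
        if i > 0 and j > 0 and score[i][j] == score[i - 1][j - 1] + (
            match if seq_a[i - 1] == seq_b[j - 1] else mismatch
        ):
            aligned_a.append(seq_a[i - 1])
            aligned_b.append(seq_b[j - 1])
            i -= 1
            j -= 1
        elif i > 0 and score[i][j] == score[i - 1][j] + gap:
            aligned_a.append(seq_a[i - 1])
            aligned_b.append("-")
            i -= 1
        else:
            aligned_a.append("-")
            aligned_b.append(seq_b[j - 1])
            j -= 1
    return aligned_a[::-1], aligned_b[::-1]

# Example: align a planned well's operations against a historical well's operations
historical = ["RIG_UP", "DRILL_SURFACE", "RUN_CASING", "CEMENT", "DRILL_INTERMEDIATE"]
planned = ["RIG_UP", "DRILL_SURFACE", "CEMENT", "DRILL_INTERMEDIATE", "LOG"]
print(align_operations(historical, planned))

Aligning a planned well against historical wells in this way lets the engineer see, operation by operation, where the plan matches past campaigns and where gaps or extra steps appear, which is what makes the quality check fast.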

Previously, these workflows were nonrepeatable and carried out by SLB engineers in commercial spreadsheets: they were tedious, time-consuming, and subject to human error and bias.

The applications provide SLB engineers with a data-driven way to predict the time and risks associated with well construction when responding to a tender. This is of critical importance, as 30% of SLB revenue is associated with integrated well construction projects.

SLB engineers now spend 25x less time classifying data and can instead dedicate this time to building technical scenarios that will outperform historical well construction performance.

This application is now part of the standard operating procedures for SLB when responding to tenders.

 

Value Generated:

This data-driven approach has been used to assess over USD 10 billion worth of well construction tenders.

The web application provides SLB engineers with tangible time savings. Manual effort to analyze legacy data has been reduced by a factor of 25: whereas manually classifying a well previously took approximately 8 hours, the same can now be achieved within 20 minutes.

Similarly, the time prediction application allows for a structured, auditable, and data-driven approach to predicting the time it will take to drill the wells, as defined in the tender's scope of work.

Billion-dollar decisions are now data-driven while keeping our well engineering experts in control of the outcome.

 

Value Brought by Dataiku:

A data science model that sits on a data scientist’s Python notebook yields no value to SLB operations and to our customers.

Dataiku has enabled SLB to:

  1. Go from a data science Proof of Concept (POC) to a product that has a direct business impact on SLB operations.
  2. As opposed to developing standalone applications, which are costly and slow to bring to market, deploying within a platform (Dataiku) enables the reuse of Dataiku’s:
    1. IT/security qualifications processes.
    2. User management and access control.
    3. Automation and monitoring capabilities.
    4. Service deployment capabilities.
    5. Auditability and data governance capabilities.
      This enables the software engineering team to spend more time implementing features for the end users and less time implementing infrastructure.
  3. Dataiku’s close integration with Git has enabled the SLB software engineering team to efficiently develop software applications to be deployed in Dataiku.
  4. Dataiku’s extensive library of APIs has enabled the SLB site reliability engineers to automate the continuous integration/continuous deployment (CI/CD) process, as sketched below.
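As an example of point 4, this kind of automation can be scripted with the public dataikuapi Python client; the host, API key, project key, and scenario name below are placeholders, and the real CI/CD pipeline is more involved.

import dataikuapi

# Placeholders (not real endpoints): design-node URL, personal API key, project key, scenario id.
client = dataikuapi.DSSClient("https://dss.example.com", "MY_API_KEY")
project = client.get_project("WELL_TENDER_SIZING")

# Trigger the project's validation scenario as part of the CI pipeline and wait for it
# to finish; run_and_wait raises an error if the scenario fails.
scenario = project.get_scenario("RUN_VALIDATION_TESTS")
scenario.run_and_wait()
print("Validation scenario passed; the project can be promoted to the automation node.")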

Furthermore, the web applications that enable the user to interact with the models are helping build trust and acceptance of artificial intelligence applications within SLB. Some of the application users have also become Dataiku users and are complementing the web applications with their own flows in Dataiku. 

Value Range: > USD 100M

Comments
PierreG
Dataiker

Amazing use case!
Congrats @Valerian and the whole team! 

XavierD
Dataiker

A concrete (and amazing) example of how AI and automation can yield massive value and competitive edge when applied to the most rewarding applications!

A big congrats again @Valerian and to the Montpellier Technology Center and the SLB team!
