Alex Jackson, Commodities Trading IT Delivery Manager
Heemanshu Jain, Principal Product Manager, Trading Analytics
Country: United Kingdom
Organization: bp Trading & Shipping
T&S is the energy and commodity trading arm of bp and one of the world’s leading energy marketing, operations, and trading organizations. We buy, sell, and move energy across the globe to provide integrated solutions to over 12,000 customers in 140 countries. With upwards of 300 ships on the water for bp at any given moment, T&S moves around 240 million tonnes of oil, gas, and refined products every year.
Best Acceleration Use Case
Best Moonshot Use Case
Trading & Shipping (T&S) is the energy and commodities trading division within bp. With over 12,000 customers worldwide and a business spanning crude oil, refined products, natural gas, power, LNG and biofuels, T&S helps to keep the planet’s energy moving. As a participant in both the physical and paper markets, we transport hydrocarbons by ship, rail, and pipeline, move electrons along transmission lines and trade financial instruments such as futures and options contracts across the energy, commodity, agriculture, and emissions complex.
Alongside our well-established discretionary trading activity driven by market fundamentals (such as supply, demand, weather, refinery availability, macro data and geopolitical events), there is a smaller but growing Quant Trading (QT) activity. Here, mathematical models consume price and fundamentals data to prescribe trades and to inform trader thinking on risk deployment. These models span commodity lines, regions and timeframes.
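To make the idea concrete, the sketch below shows the general shape of such a model: a function that consumes a price series and prescribes a directional position. This is a generic, hypothetical illustration of the technique, not one of bp's actual strategies; the function name, windows and logic are all invented for this example.

```python
# Hypothetical sketch of a price-driven trading signal (illustrative
# only; not an actual bp T&S strategy). A simple two-moving-average
# momentum rule maps a price history to a prescribed position.
import statistics


def momentum_signal(prices, short_window=5, long_window=20):
    """Return +1 (long), -1 (short) or 0 (flat) from two moving averages."""
    if len(prices) < long_window:
        return 0  # not enough history to form a view
    short_ma = statistics.mean(prices[-short_window:])
    long_ma = statistics.mean(prices[-long_window:])
    if short_ma > long_ma:
        return 1
    if short_ma < long_ma:
        return -1
    return 0


# A steadily rising price series puts the short average above the long
# average, so the rule prescribes a long position.
prices = [100 + 0.5 * i for i in range(30)]
print(momentum_signal(prices))  # → 1
```

In practice, models like this ingest price and fundamentals data from the platform's database connectors rather than an in-memory list, and their outputs feed trader dashboards and scheduled workflows.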
Within bp T&S, QT is being pursued by the commercial teams across the globe. To allow the activity to grow at a reasonable cost, the challenge was to provide:
A shared platform that would allow the Quant Trader, Quant Researcher and Quant Python Developer personas to collaborate on researching and productionizing new strategies;
A set of core software engineering capabilities, for example Python environment management, Git source control and fine-grained job orchestration;
Easy access to data stores, proprietary Python libraries and interactive dashboarding capabilities;
A robust platform infrastructure featuring scalable compute; and
‘Follow-the-sun’ enterprise IT support.
With considerable experience of running Dataiku instances for our fundamentals modelling in T&S, our central IT team was quickly able to spin up a standalone instance dedicated to QT. Platform users, comprising the three quant personas, immediately benefited from an extensive suite of:
In-house Python database connectors;
Proprietary code libraries;
Re-usable design patterns, shared learnings and best practices; and
Core platform functionality.
Now six months in, we have a growing number of signals live on the platform and an expanding set of bespoke Python toolboxes for researching and productionizing strategies.
Intuitive, modular Dataiku workflows, comprising both visual recipes and custom Python components, allow the three quant personas to collaborate seamlessly. In particular:
Quant Researchers can efficiently explore diverse data sets using a suite of database connectors, visualisation tools and open-source machine learning toolboxes;
Quant Python Developers can rapidly productionize new strategies using shared Python libraries and core platform functionality; and
Quant Traders benefit from transparent model workflows that free them up to spend more time on idea generation, studying markets and risk management.
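The shared-library pattern behind this collaboration can be sketched as a small, reusable function that each persona touches at a different stage: the researcher validates it interactively, the developer wires it into the production workflow, and the trader reads its output in a dashboard. The example below is purely illustrative; the function name, parameters and risk logic are invented and do not represent bp's code.

```python
# Hypothetical sketch of a reusable step from a shared Python library
# (names and logic invented for illustration; not actual bp T&S code):
# map a raw signal score to a risk-capped position for production use.


def size_position(score, max_position=10.0, scale=2.0):
    """Scale a raw signal score into a position, capped at +/- max_position."""
    position = score * scale
    return max(-max_position, min(max_position, position))


# The same function is tested interactively by a researcher and later
# called from a scheduled production workflow by a developer.
print(size_position(3.0))  # → 6.0
print(size_position(8.0))  # → 10.0 (capped at the risk limit)
```

Keeping such steps in a shared library, rather than duplicated across notebooks, is what lets a strategy move from research to production without being rewritten.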
A single QT Dataiku instance provides a global solution to a common set of business requirements, yielding greater speed-to-market and requiring fewer IT resources to support.
Model-driven trading is a growing source of revenue generation in T&S, naturally augmenting our traditional proprietary trading activities. Notably, this is a first example of deploying Dataiku to the front office trading teams in T&S and it is a precursor to making it available to more commercial teams wishing to use Python and machine learning for trading.
Value Brought by Dataiku:
Dataiku allows us to bring together our triumvirate of expert knowledge of traded markets, deep technical know-how and abundant creative thinking, allowing us to productionize new signals with ease. Knowledge is institutionalised in a way that standalone Jupyter notebooks running on local machines cannot match. Scheduled model runs eliminate ad-hoc manual tasks that can be time-consuming and carry operational risk.
The adoption of Dataiku by the QT teams brings a cohesive solution to a common set of business requirements that would otherwise be solved by a set of ad-hoc, siloed approaches. Dataiku provides:
A faster, lower cost route to market for the QT teams;
Frictionless data access;
Rich open-source machine learning toolboxes for signal generation;
A path to production that seamlessly scales as the commercial activity grows;
Easy sharing of custom Python code libraries between the QT teams; and
Visual workflows that foster superior collaboration between quant personas.
Dataiku’s integration with scalable compute is key to us working successfully with the highly granular datasets required by the next generation of QT strategies and a wave of new fundamentals models. Platform enhancements and learnings acquired from evolving the QT Dataiku instance also benefit our fundamentals modelling Dataiku instances, and vice versa, allowing us to develop our overall analytics capabilities at a greater pace and at a lower cost.