
Standard Chartered Bank: Unlocking Collective Intelligence in FP&A

On average, two people armed with the Digital MI team's applications in Dataiku do the work of about 70 people limited to spreadsheets: a roughly 30x increase in analyst productivity, achieved by replacing spreadsheet-based processes with governed self-service analytics.

Team of 2: doing the work of 70 people in spreadsheets

30x: more productive analysts

12: communities across the bank using Dataiku

Banks are sitting on a gold mine of diverse data that grows bigger every day. Collecting it has never been easier — the challenge is using it efficiently and changing behaviors to get real business results.

Craig Turrell, Head of Plan to Perform (P2P) Data Strategy & Delivery at Standard Chartered Bank, knows this better than anyone. In his financial planning and analysis (FP&A) division alone, there are more than 400 people producing over 5,000 pieces of information every month, 90% of which can be automated. This story uncovers how Craig and his team transformed FP&A’s ability to answer more questions with data, faster.

“My goal is to change the way the bank reports. We’ve taken very good accountants, given them Dataiku, and they’re producing brilliant stuff.”

Craig Turrell, Head of Plan to Perform (P2P) Data Strategy & Delivery, Standard Chartered Bank

The Challenge: From 10 to 400 Million Rows of Data

FP&A works on a jigsaw puzzle spanning the bank's major core financial statements and structural systems. The team needs to look five years back and five years forward to identify anomalies and trends, run balance sheet analytics, and conduct cost analysis, answering complex questions about how and where the bank makes a profit, how the bank behaves, and who should be hired and where they should be placed relative to cost profiles.

Of course, the systems to do all of this had been in place for many years, but analysis was limited to 10 million rows of data. That sounds like a lot, but in practice teams could provide only one or two levels of detail for the bank's 10 core products or core primary country markets and look at basic account structure over about three months. Even at those dimensions, they had to split the analysis into pieces — they were exceeding 10 million rows very quickly.

“It’s simple math — take two account structures x two currencies x 150 countries x 1,000 cost centers… we are responsible for managing problems with 500 million moving pieces, and at any given time, we could only see about 10% of it.” 
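For a rough sense of how those dimensions multiply out, consider the following back-of-the-envelope calculation (the account-line and monthly figures are illustrative assumptions, not numbers from the bank):

$$2 \times 2 \times 150 \times 1{,}000 = 600{,}000 \ \text{combinations}$$

$$600{,}000 \ \text{combinations} \times 70 \ \text{account lines} \times 12 \ \text{months} \approx 5 \times 10^{8} \ \text{moving pieces}$$

Against a 10-million-row ceiling, any single analysis can cover only a small slice of that space at a time.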

Ultimately, the system they had designed at the time was intended for the top of the house (i.e., it digitized the reports and data that the core CFOs looked at), but Craig realized that in reality this approach wasn't going to influence the behaviors of the bank. He needed to find a way to impact the day-to-day work of financial analysts, making them more efficient and effective.

When Craig looked at the problem, he realized it was primarily a question of volume that would require him to figure out how to get from 10 million to 400 million rows of data. But it was not a question of underlying infrastructure — in fact, they already had robust compute warehousing, but almost no one was using it. He needed to find a way to leverage that existing ecosystem.

“What we had was basically a data lake, and if you wanted to do anything with the data (change, use, model — anything), you had to take it out. That didn’t make sense. If we have generally big infrastructure, including terabyte-level compute warehousing that costs us over a million dollars and is being used at 3% capacity, and it’s pretty good, why don’t we find something that could leverage that ecosystem?” 
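The pattern Craig describes, pushing computation down to the warehouse rather than taking the data out, is easy to sketch. The snippet below is a minimal illustration under assumed names: the connection URL, table, and columns are hypothetical, and SQLAlchemy stands in for whatever connectivity layer the bank actually uses.

```python
# Minimal sketch of warehouse pushdown: the aggregation runs on the
# warehouse's own compute, and only a few hundred summary rows come back
# instead of hundreds of millions of raw ones. All names are hypothetical.
from sqlalchemy import create_engine, text

engine = create_engine("postgresql://user:pass@warehouse:5432/finance")

query = text("""
    SELECT country, account_structure, SUM(amount) AS total
    FROM gl_postings
    WHERE posting_date >= :start
    GROUP BY country, account_structure
""")

with engine.connect() as conn:
    rows = conn.execute(query, {"start": "2016-01-01"}).fetchall()

for country, account_structure, total in rows:
    print(country, account_structure, total)
```

The point is not the specific SQL but where it executes: the terabyte-level warehouse that was sitting at 3% utilization does the heavy lifting.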

The Solution: Collective Intelligence with Dataiku

In addition to finding a solution that leveraged their existing infrastructure investments, Craig didn’t want to have to go looking for another tool again in a few years when his team became mature enough to start doing machine learning on their data. That’s when they found Dataiku and, in Craig’s words, “Dataiku solved volume straightaway.”

“We got our hands on Dataiku and we said — what can this thing do? Within three to four weeks we managed to turn over a 4.5 billion row table in a single operation. It was like data origami. We could do process on process. So within a very short period of time we basically achieved our original goal, which was processing data better. But Dataiku made us realize we could do so much more than that. We’ve got an army of people copy and pasting data, and this thing can load up very big stuff. Dataiku allowed us to have different conversations about data.”

“Dataiku can do single operations on the amount of compute capacity you have somewhere in your ecosystem — doesn’t matter where it is. It’s amazing technology. Sometimes the data is small, sometimes it’s very big. Sometimes it’s a graphic problem. At some point you’ll want to go from just processing lots of data to adding value. You just need that flexibility, and Dataiku provides it.”

Craig Turrell, Head of Plan to Perform (P2P) Data Strategy & Delivery, Standard Chartered Bank

In the first nine months with Dataiku, the team churned out use cases from around FP&A. They had so much flexibility and freedom, however, that the next step was productionizing their system and approach, including ensuring there was discipline with data pipelines, SLAs, and more stringent DataOps processes. According to Craig, that’s the power of Dataiku and what makes it unique: it offers that unbounded freedom, but it also has the features in place to facilitate structure and process.

By 2021, one year into their Dataiku journey, Craig and his team are using Dataiku to run three major systems at the bank and to refresh daily Tableau dashboards for all the bank’s finances, a laborious task previously done in spreadsheets. They’ve developed a data marketplace that people across the organization can leverage, plugging in other pieces to get answers from data (for example, analysts trying to understand the cost of property can take the balance sheet from the data marketplace and plug in lease data).
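To make the marketplace example concrete, here is a toy sketch of "plugging lease data into the balance sheet" in pandas. The file, dataset, and column names are invented for illustration; the real marketplace presumably exposes governed datasets rather than loose files.

```python
# Toy illustration of composing marketplace datasets: join a governed
# balance-sheet extract with local lease data to study property costs.
# All file and column names are hypothetical.
import pandas as pd

balance_sheet = pd.read_parquet("marketplace/balance_sheet.parquet")
leases = pd.read_parquet("property_leases.parquet")

# One governed dataset plus one domain dataset yields a property-cost
# view without rebuilding the balance sheet from scratch.
property_costs = balance_sheet.merge(
    leases, on=["cost_centre", "period"], how="inner"
)

summary = (
    property_costs.groupby("country")[["lease_expense", "book_value"]]
    .sum()
    .sort_values("lease_expense", ascending=False)
)
print(summary.head(10))
```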

Craig is developing a unique brand of data democratization or self-service analytics. The idea is not that individuals can do and build whatever they want with any data, but rather that a center of excellence (CoE) owns the core structured intelligence of the bank. That is, enterprise-level data connected to a homogeneous pool with product owners for every dataset and defined governance, which connects out to the entire organization. From there, the team builds specific experiences on top of that data that can deliver answers through core apps, and the ultimate “self-serve” flexibility comes from how people around the organization use those apps to solve business problems day-to-day.

“You can’t self-service the answer when you have too much chaos. It’s like booking a flight where you have to book your luggage on one app, the seat on another, and so on. We had democratization, and with it, we had app chaos. Over three years, one division had built 500 different apps. If I asked ‘Tell me which one answers x business question,’ the answer was always — no clue. No one was really using them. Now with Dataiku, there is a core dataset, and it is secure. We’ve removed the people in the middle creating different stuff, and we now have a set of core apps and ultimate flexibility on answering questions and solving problems.”

The CoE is currently made up of 16 people, but it will be expanding to 30 and is expected to grow to hundreds in the next few years to support growing demand and continue driving efficiencies across the business.

Results & Next Steps

There are 12 communities across the bank leveraging Dataiku and building “digital bridges” to the CoE’s core structured intelligence. On average, two people armed with Dataiku are doing the work of about 70 people limited to spreadsheets. The goal in the coming years will be to continue to upskill people with Dataiku to increase efficiency across more areas of Standard Chartered Bank. 

In the months and years to come, Craig and team will move into more predictive analytics in the FP&A division with a focus on predicting within the mid/short term (three months to a year). The vision is around a “supermind,” or a smart group of independent agents working together to create a benchmark of intelligence (if, say, 10 machines independently make predictions, taking those predictions collectively will probably be close to reality).
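The statistical intuition behind the "supermind" is the classic wisdom-of-crowds effect, and a toy simulation shows it. All figures below are invented for illustration; these are not the bank's models.

```python
# Toy demonstration of aggregating independent forecasts: ten unbiased
# models each make a noisy prediction, and their average lands much
# closer to the true value than a typical single forecast does.
import numpy as np

rng = np.random.default_rng(42)
true_value = 100.0   # quantity being forecast (invented)
n_models = 10        # independent "machines"
noise_sd = 10.0      # each model's individual error spread

predictions = true_value + rng.normal(0.0, noise_sd, size=n_models)

print("individual errors:", np.round(np.abs(predictions - true_value), 1))
print("collective estimate:", round(float(predictions.mean()), 1))
# Averaging n independent, unbiased forecasts shrinks the expected error
# by roughly 1/sqrt(n), about a threefold reduction for n = 10.
```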

“With Dataiku we have deep learning and a broad enough range of machine learning possibilities, and we can aggregate everything. Of all the platforms to do collective intelligence on, Dataiku is the one — we’ve committed to it. It’s all about the wisdom of many people and many machines, and Dataiku will effectively be the manager.”

Mercedes-Benz: Democratizing Automated Forecasting

Mercedes-Benz enables people on the finance team to combine their expertise with state-of-the-art machine learning using Dataiku.

Go Further:

NXP: Building a Data Organization That Performs & Elevates the Individual

In this fireside chat-style webinar, Lance Lambert, Director of Enterprise Business Intelligence at NXP Semiconductors, and Kurt Muehmel, Chief Customer Officer at Dataiku, discuss NXP’s keys to success with their data initiatives.

Industry Analyst and Customer Recognition for Dataiku

Don't just take our word for it — see what industry analysts around the world say about Dataiku, the leading platform for Everyday AI.

Rabobank: Ethical Enterprise AI – A Guideline or Compass?

In this EGG talk, Martin Leijen shares how Rabobank determines and protects its privacy and ethical standards, as well as how financial institutions can effectively maintain a firm commitment to moral and ethical standards while at the same time encouraging a strong drive to optimize business opportunities and profitability.

Bankers’ Bank: Goodbye Data Silos, Hello Analytics Efficiency

Bankers’ Bank leverages Dataiku to increase efficiency and ensure data quality across an array of financial analytics, ultimately reducing the time to prepare analyses and deploy insights by 87%.
