M1 - Building Company-Wide Data Literacy Through a Slack-First Approach
Team members:
Nicklas Ankarstad, Sr. Director, Data Science & Analytics, with:
- Rahul Nair
- Holden Price
- Brenna Sullivan
- Sam Weeks
- Nicholas Hotz
- Brian Wonch
- Crystal Ancona
- Michael Herman
- Catherine Brennan
- Matt Soehren
Country: United States
Organization: M1 Finance
M1 is a financial services company that empowers you to invest, save, borrow, and spend on one intuitive platform. The Finance Super App® offers tools to customize your investment strategies and automate the big picture—commission-free.
Awards Categories:
- Best Moonshot Use Case
- Best MLOps Use Case
- Best Data Democratization Program
- Best Approach for Building Trust in AI
Business Challenge:
M1 has been a Dataiku customer for a few years, and the Data Science and Analytics team makes extensive use of alerting to monitor model executions. To be proactive with model monitoring, M1’s Data Science and Analytics team first set up a system to send alerts via Slack when a model or scenario didn't run properly. This streamlined MLOps by posting the information to a central MLOps Slack channel.
As M1 developed new products, the Data Science and Analytics team extended this same mindset to support more business-related activities. For example, as the team took over credit risk assessments, they used Dataiku to generate Slack messages that report and check key metrics. Instead of logging in to a BI tool each morning to see the number of applicants and loans originated the previous day, the information was posted to a company-wide Slack channel.
M1 internal staff liked these messages. In particular, the Personal Loans team appreciated the transparency and could more easily access and focus on the metrics that matter.
The Data Science and Analytics team applied the same concept with the launch of its next major product, M1 Savings: a similar Dataiku setup generates daily reports that post Savings funnel KPIs to Slack. Slack alerts have now become a standard across M1. Key metrics are sent via Slack every day, resulting in very high engagement and democratization of the data. People even ask about the messages before they get sent!
The previous challenges therefore revolved around data democratization and making insights accessible:
- Any organization has a lot of metrics and data. How do we rally people around them? This first requires defining the most important ones, as North Stars to keep track of.
- Moreover, employees are not keen on logging in to check metrics on a dashboard. It's a poor user experience and wastes significant time at scale, since most employees need to check in on data at least once a day.
- This effort results in gatekeeping of information, when data should instead be widely leveraged as a driver of day-to-day work.
Business Solution:
M1’s Data Science and Analytics team was already sharing error reports via Slack to flag job failures. The initial focus was therefore to check the alerts and ensure there were no errors in any ML pipelines.
We then started integrating variables into the Slack messages to further enrich the information shared. This began with simply passing which check failed into the Slack message, so the support team knew what the error was. We then templatized the Slack messages so the Data Science and Analytics team could reuse and scale them across use cases.
The Slack alerts make use of Dataiku’s built-in scenario functionality. Data is read in from SQL and stored in a table that Dataiku can access from Python.
The Slack alerts are sent via a scenario that first builds the dataset on a daily basis:
The scenario reads the data into memory using a Python script and stores the metrics as scenario variables:
```python
import dataiku
import pandas as pd
from dataiku.scenario import Scenario
from datetime import date, timedelta

# Set up the scenario
scenario = Scenario()

# Read recipe inputs
dataset = dataiku.Dataset("dataset")
df = dataset.get_dataframe()

# Get today's and yesterday's dates
today = date.today()
yesterday = today - timedelta(days=1)

# Filter the dataframe down to yesterday's data only
df = df[df['date'].dt.date == yesterday]

# Get the metrics (a single row remains after filtering)
metric_1 = df['metric_1'].iloc[0]
metric_2 = df['metric_2'].iloc[0]
metric_3 = df['metric_3'].iloc[0]

# Set the scenario variables
scenario.set_scenario_variables(
    metric_1=str(metric_1),
    metric_2=str(metric_2),
    metric_3=str(metric_3),
    yesterday=str(yesterday),
)
```
The scenario variables are then inserted into the Slack message:
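As a minimal sketch of this step (the metric names, values, and webhook URL below are illustrative, not from the source), the variables can be dropped into a message template and posted through a Slack incoming webhook:

```python
import json
import urllib.request


def build_slack_message(yesterday, metric_1, metric_2, metric_3):
    """Compose a plain-text Slack message from the scenario variables.

    The metric labels here (applicants, loans originated, conversion
    rate) are hypothetical examples, not M1's actual KPI names.
    """
    return (
        f":bar_chart: *Daily KPI report for {yesterday}*\n"
        f"• Applicants: {metric_1}\n"
        f"• Loans originated: {metric_2}\n"
        f"• Conversion rate: {metric_3}"
    )


def post_to_slack(webhook_url, text):
    """POST the message to a Slack incoming webhook."""
    payload = json.dumps({"text": text}).encode("utf-8")
    req = urllib.request.Request(
        webhook_url,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status


message = build_slack_message("2024-01-15", "1,204", "312", "25.9%")
# post_to_slack("https://hooks.slack.com/services/...", message)  # placeholder URL
```

In Dataiku itself this substitution can also happen directly in a scenario reporter's message template rather than in code; the script above just makes the mechanics explicit.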
The scenario is then scheduled to run at the desired cadence.
We have also added helpful context to the reports: from raw numbers, to percentage differences, to reporting against targets, so that employees have all valuable information at a glance. We have even started sharing screenshots, and we format most messages with emojis and other elements for better readability and engagement!
Day-to-day Change:
Currently, 40% of M1 employees use the enterprise dashboarding tool monthly. In Slack, we have four daily KPI alerts, and a few more on a weekly basis. The Slack alerts reach about 70% of our employee base.
We have noticed a significant deepening of engagement across all levels of the organization, including senior leadership: our CEO engages with the reporting daily. Our project teams also anchor their discussions around the next steps to take based on the metrics. The alerts serve a dual purpose, giving senior leadership a quick read on performance and supporting project teams in their day-to-day operations.
This has become a company-wide initiative, with leaders across all departments engaging with the metrics. This collective effort is not only improving our communication but also fostering a stronger sense of unity and belonging within our organization.
Business Area Enhanced: Internal Operations
Use Case Stage: In Production
Value Generated:
- Creates company-wide data interest and literacy
We've seen a significant shift in data democratization within our organization. More people now engage via Slack, far surpassing the number previously connected to our dashboarding tool: only a third of these engaged users were previously active dashboard users.
We've moved from a pull to a push approach for insights discovery, which has reduced the effort needed to check them.
- Build common understanding around key business metrics
Our approach to data and the way we discuss it has evolved, resulting in much more clarity and transparency around the key metrics. Teams are much more focused on driving improvements in business performance as a result.
For instance, when launching a new savings product, we have a clear understanding of which part of the funnel we should focus on, where our strengths and weaknesses lie. This has streamlined the efficiency of how we talk about business performance.
- Save time and resources
In the past, our project leads would spend their first hour of the day collecting metrics for each product. Now, metrics are broadcasted automatically in Slack channels. This also helps us triage issues faster. For example, when a metric declines, we can easily tag the right people and initiate a conversation much faster.
- Inform smarter decisions and faster issue resolution
We've also seen the benefits of this broadcasting approach in situations where changes have led to issues. For instance, when we rolled out a change that broke an important process, we were able to quickly identify that we had an issue, trace it back to the root cause, roll back the change, and ultimately fix it within 24 hours. This transparency allows for quicker identification and resolution of anything that negatively impacts our metrics, rather than that being the responsibility of a small group of people.
Value Brought by Dataiku:
Dataiku has been pivotal in quickly prototyping and operationalizing the solution, namely through:
- Integrations
We’ve used webhooks with Slack. It's not merely about sending a raw JSON string: we've been able to format the message with emojis and proper structure, as if a project manager had written it by hand. This has resulted in a real data product, not just a simple email.
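To illustrate the kind of formatted payload this involves (the field names follow Slack's Block Kit schema; the metric labels and values are invented for the example), a webhook message can carry structured blocks with emojis and mrkdwn formatting rather than a bare string:

```python
import json

# A Slack Block Kit payload: a header block plus a mrkdwn section
# make the alert read like a hand-written update rather than raw text.
payload = {
    "blocks": [
        {
            "type": "header",
            "text": {
                "type": "plain_text",
                "text": "Savings funnel - daily KPIs",
                "emoji": True,
            },
        },
        {
            "type": "section",
            "text": {
                "type": "mrkdwn",
                "text": (
                    ":white_check_mark: *Sign-ups:* 980 (+4% vs. target)\n"
                    ":warning: *Funded accounts:* 310 (-2% vs. target)"
                ),
            },
        },
    ]
}

# This serialized string is what gets POSTed to the webhook URL.
body = json.dumps(payload)
```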
- Data flow construction
We've utilized the Flow to gather the necessary data for Slack alerts. This has streamlined our processes and improved our efficiency.
- Automation
We've set up scenarios that pull the data every day, run it through SQL scripts, and output the results to Slack. This daily scenario enables leaders across M1 to stay updated and promptly respond to any changes or issues.
Value Type:
- Improve customer/employee satisfaction
- Save time
- Increase trust