Enabling advanced sales decomposition at McDonald’s | by Global Technology | McDonald’s Technical Blog | Nov, 2022

Global Technology rethinks its internal business reporting to optimize decision making.

By Subramanian Krishnan, Senior Manager, Global Technology Data & Analytics;
Akshay Sahni, Director, Global Technology Data & Analytics;
Brian Stroner, Data Engineer, Global Technology Data & Analytics

At McDonald’s, we strive to equip the business with a complete snapshot of sales drivers, so teams can make informed decisions about how to optimize spend and investment.

Over the past year, we revisited our legacy process for generating executive reports to make them faster, automated, and accurate, and we created a platform that transforms the way we interact with our data. Through advanced regression models, we can isolate the role of each individual sales driver, as well as dynamically forecast and plan for future sales scenarios. This is our first step toward centralized, standardized sales decomposition and forecasting, and the product will continue to mature and evolve over the course of our journey.

The challenge
The legacy process for executive reporting is fragmented: sales actuals and projections are grouped into categories that are not mutually exclusive (e.g., channel, pricing, media). This approach does not account for the interactions between those parts, which are needed to give the complete picture.

In addition to providing a holistic view, we also wanted to automate the data collection that enables reporting. Completing executive-level reports for all countries can require many hours of manual computation by analysts, and we wanted to ensure the tools we provide make the Business Insights team’s work more efficient and readily actionable.

Because McDonald’s operates on a global scale, with nearly 40,000 locations in over 100 countries, certain countries may require additional nuance. Due to local variations in menu items, promotional material, and pricing, countries prepared their local reports using different methodologies; hence the need for a dynamic, versatile tool that accounts for these differences.

The solution
We built an always-on insights platform with self-healing model governance that is accessible to analysts and business leaders, alike, via a lightweight, web-based user interface.

The platform was built around a core set of technology tenets.

The platform architecture comprises three key tiers:

  1. Data Collection and Transformation Tier
  2. Modeling Tier
  3. User Interface (UI) Tier

Under the hood
Data Collection and Transformation Tier
Our multistep engineering solution combines the power of data engineering, machine learning, and user experience to transform how our business understands overall sales performance. The starting point for effective and accurate insights is timely and reliable input data. Our data collection and transformation layer automatically sources periodic data from various data sources and applies built-in data validations and data-quality thresholds before the data is transformed and aggregated for modeling.
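The validation step described above can be sketched as a simple quality gate. This is a minimal, illustrative version; the column names, thresholds, and `validate_feed` helper are hypothetical, not McDonald’s actual pipeline:

```python
import csv
import io

# Hypothetical quality thresholds for one periodic data feed.
MIN_ROWS = 3
REQUIRED_COLUMNS = {"store_id", "week", "sales"}

def validate_feed(raw_csv: str) -> tuple[bool, list[str]]:
    """Return (passed, issues) for one incoming feed."""
    issues = []
    rows = list(csv.DictReader(io.StringIO(raw_csv)))
    if len(rows) < MIN_ROWS:
        issues.append(f"too few rows: {len(rows)} < {MIN_ROWS}")
    if rows and not REQUIRED_COLUMNS <= set(rows[0]):
        issues.append("missing required columns")
    # Reject feeds containing null or negative sales values.
    for r in rows:
        if not r.get("sales") or float(r["sales"]) < 0:
            issues.append("bad sales value")
            break
    return (not issues, issues)
```

Feeds that fail a gate like this would be archived and flagged rather than passed on to modeling.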

Modeling Tier
We feed internal and external data into a least-squares regression model to map several menu categories to their most relevant business drivers. We employ various tactics to ensure the model fits our variables accurately before finally exposing results to the interface.
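At its core, a least-squares fit of sales against candidate drivers looks like the following sketch. The driver names and numbers here are invented for illustration, not actual model inputs:

```python
import numpy as np

# Weekly observations: columns are [media_spend, price_index, is_holiday].
X = np.array([
    [1.0, 100.0, 0],
    [2.0,  98.0, 0],
    [1.5, 101.0, 1],
    [3.0,  97.0, 0],
    [2.5,  99.0, 1],
])
y = np.array([110.0, 125.0, 130.0, 140.0, 142.0])  # weekly sales

# Add an intercept column, then solve min ||X1 @ beta - y||^2.
X1 = np.column_stack([np.ones(len(X)), X])
beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
fitted = X1 @ beta  # model's reconstruction of sales from the drivers
```

Each element of `beta` estimates the marginal effect of one driver, which is what makes the decomposition possible.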

For a holistic view of what influences a sale, we take a multifaceted approach and consider:

  • Incremental sales driven by marketing interventions, accessibility for consumers, etc.
  • Baseline factors, such as the sales we would expect regardless of any outside factors, if the store were simply open for the day.
  • External factors, such as holidays, unemployment rates, COVID restrictions that may have been present during the time the sales data was collected.

This information provides a more complete story and accounts for any fluctuations that might otherwise go unexplained.
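The three categories above add up to an explained total for each period. A toy additive decomposition, with hypothetical driver names and coefficients, might look like this:

```python
# Split one week's predicted sales into a baseline plus one additive
# contribution per driver. All names and coefficients are illustrative.
BASELINE = 100.0  # expected sales from simply being open
COEFS = {"media": 8.0, "holiday": 12.0, "covid_restriction": -20.0}

def decompose(drivers: dict[str, float]) -> dict[str, float]:
    parts = {"baseline": BASELINE}
    for name, coef in COEFS.items():
        parts[name] = coef * drivers.get(name, 0.0)
    return parts

week = {"media": 2.0, "holiday": 1.0, "covid_restriction": 0.0}
parts = decompose(week)
total = sum(parts.values())  # 100 + 16 + 12 + 0 = 128
```

Because the pieces sum to the prediction, any gap between the total and actual sales is an explicit residual rather than an unexplained fluctuation.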

After ensuring the quantity and quality of the data, we use data from the past years to build our model. We archive the data that fails the quality checks, and a custom notification system baked into the tool alerts affected parties of a potential data flaw that may cause delays.

Having ingested and transformed both internal and external datapoints, we are ready to feed them into the model. This high-level modeling algorithm gathers the final insights that are exposed to the tool. The variables are filtered through millions of candidate models to determine how factors that may appear independent of each other actually impact one another, with some having more influence than others.

We create many combinations and only retain those that make sense and tell a meaningful story. We expose the values the model is most confident about through the final data displays.
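The "retain only models that make sense" filter can be sketched as enumerating driver subsets, fitting each, and keeping only candidates whose coefficients carry plausible signs and whose fit clears a threshold. The `select_models` helper and its sign conventions are assumptions for illustration:

```python
from itertools import combinations
import numpy as np

def r_squared(y, fitted):
    ss_res = np.sum((y - fitted) ** 2)
    ss_tot = np.sum((y - y.mean()) ** 2)
    return 1 - ss_res / ss_tot

def fit(X, y):
    # Ordinary least squares with an intercept column.
    X1 = np.column_stack([np.ones(len(X)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    return beta, X1 @ beta

def select_models(X, y, names, expected_sign, min_r2=0.5):
    """Keep driver subsets whose coefficients have plausible signs
    and whose fit is good enough to tell a meaningful story."""
    keep = []
    for k in range(1, len(names) + 1):
        for idx in combinations(range(len(names)), k):
            beta, fitted = fit(X[:, idx], y)
            signs_ok = all(
                np.sign(beta[j + 1]) == expected_sign[names[i]]
                for j, i in enumerate(idx)
            )
            if signs_ok and r_squared(y, fitted) >= min_r2:
                keep.append(([names[i] for i in idx], r_squared(y, fitted)))
    return keep
```

In a real pipeline the candidate count runs into the millions, so this enumeration would be distributed and pruned, but the accept/reject logic is the same idea.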

Within our model, we perform several insightful transformations on our variables for feature engineering to reflect customer response. Let’s go over a few of them:

Adstock Effect — the lingering effect marketing activities have on sales or brand health over time; it captures how advertising builds and decays in consumer markets.

Lag — measures the delay in a marketing activity’s impact, reflecting a structural delay in consumers responding to advertising.

Diminishing Returns — captures the nonlinear consumer response to marketing: the more a consumer sees a TV advertisement, the less incremental effect it has.
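The three transforms above can be sketched in a few lines. The decay, lag, and saturation parameters here are placeholders; in practice they would be estimated per market:

```python
import math

def adstock(spend, decay=0.5):
    """Carry a fraction of past advertising into each period."""
    out, carry = [], 0.0
    for s in spend:
        carry = s + decay * carry
        out.append(carry)
    return out

def lag(series, periods=1, fill=0.0):
    """Shift a series so its impact appears `periods` steps later."""
    if periods == 0:
        return list(series)
    return [fill] * periods + list(series[:-periods])

def diminishing_returns(x, alpha=0.001):
    """Concave saturation: each extra exposure adds less than the last."""
    return [1 - math.exp(-alpha * v) for v in x]
```

Composing these — for example, saturating an adstocked, lagged media series — is how raw spend becomes a feature that mirrors actual consumer response.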

Our models are trained to determine the effect of each driver on sales in isolation. This yields an anchor model off which we iterate for future measurements. We use common model-fitting methods to ensure the results of our model on actual data are consistent with the sample data it was built on. Then we tune the statistical outcomes with gradient descent, nudging the parameters until the model performs at its best.
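Gradient descent in this context just means repeatedly adjusting a coefficient in the direction that reduces the error between the model and actuals. A toy, single-coefficient sketch (data and learning rate invented for illustration):

```python
xs = [1.0, 2.0, 3.0, 4.0]
ys = [2.1, 3.9, 6.2, 7.8]  # roughly y = 2x

w = 0.0    # the coefficient being tuned
lr = 0.01  # learning rate

for _ in range(500):
    # Gradient of mean squared error with respect to w:
    # (2/N) * sum(x * (w*x - y))
    grad = sum(2 * x * (w * x - y) for x, y in zip(xs, ys)) / len(xs)
    w -= lr * grad
```

After enough iterations, `w` settles at the least-squares slope for the data, which is the "most optimal" point for this loss.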

UI Tier
After the data is rechecked for quality and validated against the proper model, it is ready for user-interface exposure. Models are loaded and exposed via a web-based application built on the React framework.

As new data becomes available, the tool periodically auto-refreshes with the latest results. At its core, it has three main capabilities: executive reporting, sales decomposition, and scenario forecasting. Through multiple views, the user can easily access key performance indicators in a market and determine each driver’s exclusive impact on sales.

The tool lets users interact with data in ways that surpass most standard data and analytics tools today. Insights are readily available upon loading the application, and users can explore trends and comparisons, build custom sales-forecasting scenarios, and download results to editable presentations with one click — empowering our analysts and business leaders to drive outcomes.
