Predictive sales forecasting is a proactive approach whereby retailers use data on past customer buying patterns to predict expected sales growth arising from changes in consumer behaviour and/or market trends. This helps retailers stay ahead of the curve, compete effectively and gain considerable market share, while ensuring better service to shoppers.
Take a conservative example of a modern-day retailer with 100 stores and about 5,000 products in every store: the opportunities for engaging with customers and growing number 500,000 of what can be called ‘need pockets’. It is important for the retailer to know the exact sales potential of each of these need pockets, and hence how much to buy centrally and stock de-centrally in stores. This is where the need for better forecasting arises.
Three generations of the science of forecasting
The conventional approach to sales forecasting was to study trends in past demand and sales, and to project those trends into the future. Techniques such as Holt-Winters, smoothing techniques and ARIMA are most commonly used. These techniques put tremendous faith in history repeating itself (a base trend with the cyclicity of seasonality overlaid) and are hence best suited to stable economies, customer buying patterns and product categories. While they were powerful 20-30 years ago, there are increasingly fewer situations with such stability in the real world. Some organizations still use these methods because of their sheer simplicity: a single series of sales over time is all that is needed to run them.
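As an illustration of this first generation, here is a minimal sketch of Holt's linear-trend (double exponential) smoothing, a close relative of the Holt-Winters method mentioned above. The sales figures and smoothing parameters are hypothetical, chosen only to show the mechanics:

```python
def holt_forecast(series, alpha=0.5, beta=0.3, horizon=3):
    """Forecast `horizon` steps ahead with Holt's linear-trend method."""
    level, trend = series[0], series[1] - series[0]
    for y in series[1:]:
        prev_level = level
        level = alpha * y + (1 - alpha) * (level + trend)        # smooth the level
        trend = beta * (level - prev_level) + (1 - beta) * trend  # smooth the trend
    return [level + (h + 1) * trend for h in range(horizon)]

# weekly unit sales with a steady upward trend (hypothetical data)
sales = [100, 104, 109, 113, 118, 122]
print(holt_forecast(sales, horizon=3))  # projects the recent trend forward
```

Note that the method sees nothing but the sales series itself, which is exactly why it is simple to run and why it breaks down when buying patterns shift.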
Over the last 10-15 years, newer approaches to forecasting have emerged as winners. They differ from the conventional approach in that the focus is on ‘explanatory’ factors, both within and outside the business, to provide ‘explainable’ forecasts. For example, it is quite understandable that a deep price discount on private-label products will drive more sales, whereas the same discount from a competitor will dampen them. Many other potent explanatory variables, such as marketing spend, weather, changing demographics, economic indicators (per-capita income or the unemployment rate), and even the strength of competition, are taken into consideration to build more robust and explainable forecasts. These models are moderately complex to build, using techniques such as ARIMAX, tree-based algorithms and mixed models, but are generally more accurate and provide the gift of answering ‘why’.
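The price-discount example above can be sketched in its simplest possible form: a one-variable regression of sales on discount depth. The data is invented for illustration; a real second-generation model (ARIMAX, a tree ensemble) would combine many more drivers, but the ‘explainable’ part works the same way:

```python
def fit_line(x, y):
    """Ordinary least squares for y = a + b*x (closed form)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    return my - b * mx, b  # intercept, slope

discount_pct = [0, 5, 10, 15, 20]          # own-label discount depth (%)
units_sold   = [200, 240, 285, 330, 370]   # observed weekly sales (hypothetical)

a, b = fit_line(discount_pct, units_sold)
print(f"each extra 1% discount adds ~{b:.1f} units")   # the answer to 'why'
print(f"forecast at a 12% discount: {a + b * 12:.0f} units")
```

The slope is the business insight: unlike a pure trend model, the forecast comes with a reason attached.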
The last 2-3 years have seen a tremendous buzz around AI-based forecasting, thanks to the strong evolution of newer types of algorithms such as neural networks and Bayesian belief networks.
A special class of neural network called the LSTM (long short-term memory) network has an innate ability to forecast accurately. It may not explain why, because it predicts through a complex network of neurons.
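To make the ‘memory’ idea concrete, here is a toy, scalar LSTM cell. Everything is deliberately simplified: one shared hand-picked weight per connection instead of learned weight matrices, and no output layer. A real model (in Keras or PyTorch, say) learns vector-valued weights from data, but the gating mechanics are the same:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def lstm_step(x, h, c, w=1.0, u=0.5, b=0.0):
    """One LSTM step with shared scalar weights for every gate (a toy)."""
    f = sigmoid(w * x + u * h + b)    # forget gate: how much old memory to keep
    i = sigmoid(w * x + u * h + b)    # input gate: how much new signal to admit
    o = sigmoid(w * x + u * h + b)    # output gate: how much memory to expose
    g = math.tanh(w * x + u * h + b)  # candidate memory content
    c = f * c + i * g                 # cell state: the long-term memory
    h = o * math.tanh(c)              # hidden state: the step's output
    return h, c

h = c = 0.0
for x in [0.1, 0.4, 0.3, 0.6]:        # a short, scaled demand sequence
    h, c = lstm_step(x, h, c)
print(h)  # this hidden state would feed a final layer to produce the forecast
```

The cell state `c` carries information across many steps, which is what lets LSTMs pick up long-range patterns that smoothing methods miss; the price, as the text notes, is that the gates offer no business-readable explanation.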
Increasingly, businesses are adopting LSTM-based forecasts in situations where the speed to act is paramount, such as stock market predictions or commodity buying decisions. They are extremely costly to run but can be uncannily accurate. They are generally “fool-proofed” through a layer of anomaly-detection engines, to prevent the AI from recommending wild forecasts.
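One simple way such a guard rail can work (a sketch, not how any particular engine is built) is a sanity check against recent history: any forecast more than k standard deviations from the recent mean gets flagged and clamped. Data and threshold are illustrative:

```python
def guard_forecast(history, forecast, k=3.0):
    """Return (forecast, flagged); clamp the value if it looks anomalous."""
    n = len(history)
    mean = sum(history) / n
    std = (sum((x - mean) ** 2 for x in history) / n) ** 0.5
    lo, hi = mean - k * std, mean + k * std
    if forecast < lo or forecast > hi:
        return min(max(forecast, lo), hi), True   # clamped and flagged
    return forecast, False

history = [100, 102, 98, 101, 99, 103]
print(guard_forecast(history, 104))   # plausible -> passes through
print(guard_forecast(history, 500))   # wild -> clamped and flagged for review
```

Flagged forecasts can then be routed to a human planner rather than straight into a buying decision.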
With three generations of forecasting, each with its own pros and cons, it is often difficult for retailers to zero in on one method. To determine the best way forward, it is in a brand’s interest to map its different use cases by five parameters: cost of inaccuracy, need for speed, volume and range of historical data, trust in algorithmic decisions, and cost of wrong decisions. This opportunity-cost analysis will help identify the best-fit forecasting methodology.
The Last Word
We are all aware of the troves of data retail operations generate on a daily basis. Surprisingly, many retailers still rely on manual, intuition-based processes without properly analysing this data. A repository of critical data is of little value unless it can be translated into insights that reveal what consumers want or anticipate market trends.
This paves the way for decision-makers to engage predictive analytics to derive the best value from all the data gathered, ensuring better sales outcomes and an enhanced customer experience. With forecasting having evolved over the years, it can be difficult to weigh the pros and cons of each of the three generations.
Below are a few guidelines on where to focus when aiming for more accurate forecasting for your business:
a. Map the different use cases by 5 parameters: The following should be considered when evaluating each use case:
• Cost of inaccuracy
• Need for speed
• Volume and range of historical data
• Trust in algorithmic decisions
• Cost of wrong decisions
b. Focus on harvesting rich internal data on pricing, promotions and workforce, along with external factors such as weather, competition and demographics; this will always help towards an accurate prediction of the business environment.
c. Keep in view the pros and cons of the different forecasting approaches, and align the right use case with the right approach. The final goal of efficient forecasting, creating the most profitable sales insights and strategies, then becomes an achievable target.
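The five-parameter mapping above can be operationalised as a simple weighted decision matrix. To be clear, the scores and weights below are invented placeholders, not figures from the article; each retailer would calibrate them from its own experience with each approach:

```python
PARAMS = ["cost_of_inaccuracy", "need_for_speed",
          "data_volume_and_range", "trust_in_algorithms",
          "cost_of_wrong_decisions"]

# score 1-5: how well each approach copes when this parameter matters
# (illustrative placeholders only)
APPROACHES = {
    "conventional (Holt-Winters/ARIMA)": [2, 4, 5, 5, 2],
    "explanatory (ARIMAX/tree-based)":   [4, 3, 3, 4, 4],
    "AI-based (LSTM)":                   [5, 5, 2, 2, 5],
}

def best_fit(weights):
    """Pick the approach with the highest weighted score for a use case."""
    scores = {name: sum(w * s for w, s in zip(weights, vals))
              for name, vals in APPROACHES.items()}
    return max(scores, key=scores.get), scores

# a hypothetical use case where speed and the cost of inaccuracy dominate
choice, scores = best_fit(weights=[5, 5, 1, 1, 3])
print(choice)
```

Re-running `best_fit` with different weights per use case gives the opportunity-cost comparison the article recommends, rather than a single house-wide choice of method.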
(The author of this article is Associate Director – Analytics, Tesco Business Services.)