“The most objective among us are those who make the most accurate predictions” – Nate Silver
In today’s hyper-competitive market, the ability to predict the future confers a significant competitive advantage. With predictive analytics, brands can go beyond understanding what has happened and why to predicting what will happen next, guiding effective decision-making and driving growth.
Predictive analytics transforms data into forward-looking insights by extracting patterns and relationships from the data with modern data science techniques. With advances in predictive techniques and the increased availability of predictive solutions, organizations can now tap into the power of predictive analytics without an army of data scientists.
In last week’s webinar, our Head of Operations and Customer Success, Aviva Egulsky-Ozeri, covered a wide range of topics surrounding predictive analytics, from why it is so difficult to make good predictions to examples of how predictive analytics is applied to specific use cases.
She called out the 5 Building Blocks of successful prediction and highlighted how various aspects of the Signals Analytics platform are designed specifically to deliver relevant, granular and highly accurate insights that anticipate the future.
Like all successful data and analytics deployments, the best predictive analytics implementations begin with a specific business need in mind, ensuring that the results will be applicable to the decision-making process. How the initial question is defined shapes the rest of the process: which data sources will be included, which algorithm should be applied, and how the analysis should be presented. Connecting multiple sources, including sales data, voice of the consumer, product listings, product reviews, patent filings and more, yields more accurate and enriched predictions.
The most important thing to keep in mind about predictions is that they are dynamic. Predictions change as reality and consumer sentiment shift, as we have all learned too well this year with COVID-19. Technologies that continuously ingest data sources and pick up on subtle changes will yield the most accurate and timely predictions.
A major ingredient manufacturer was looking to position itself as a trusted advisor to one of its largest accounts, a global food leader. The manufacturer wanted to make sure its customer had a good handle on changing consumer needs and, armed with data, the confidence to invest in a new dairy alternatives product line, which was perceived to be an emerging area worthy of pursuit. By analyzing sales performance, consumer discussions and product claims, the platform surfaced oat milk as a severely under-addressed need (as opposed to soy, which was already an established ingredient in the category with little room for new innovation). At the time, oat milk was a $4.3M market; by the end of 2019, it had $68M in attributed sales.
In a similar example, a forecast of the future potential of olive oil as an ingredient using sales data alone predicted a decrease of nearly $2M over the three-year period from 2016 to 2019. After adjusting the model to include other sources, the Signals Analytics platform projected an increase of $13.75M. This projection was based on positive indicators that compared olive oil to 371 other ingredients: olive oil ranked #4 in key influencer posts in 2016, it was above the median relative to sales in both consumer reviews and new product launches, and it sat in the top 15% of brand interest around ingredients. The actual result: sales of olive oil were up $54.5M between 2016 and 2019!
These use cases demonstrate the power of prediction and the need to connect multiple data sources on a continual basis to surface the kinds of insights that drive successful business decisions. For more examples of successful predictive analytics implementations, listen to the full replay of the webinar here.