Roughly speaking, there are two different approaches to optimizing online marketing strategies: trial and error, and predictive analytics. This post examines both approaches and provides a real-life example of a successful predictive-analytics project.
(Illustration: Alessandra Angelucci)
When it comes to adapting and optimizing online marketing efforts, most marketers apply a trial-and-error method. This approach usually consists of three key steps:
1. Measuring and reporting key performance indicators
2. Trying to implement changes that might improve those indicators
3. Evaluating the impact of those changes and going back to point 2
With this approach, people usually make changes based on their personal insights, know-how, and experience. It is easy and cheap to deploy, but it comes with some flaws as well.
Gut Feeling vs. Data-Driven Prediction...
First of all, it might not be the most effective way to optimize. You might spend hours tinkering with your campaigns, trying to find out which measures affect your performance and which don't. Moreover, this approach is inherently reactive: you can only act after a given event has taken place. That can be a problem in fast-changing environments.
You therefore might consider betting on the predictive analytics approach, which consists of the following key steps:
1. Measuring and reporting key performance indicators
2. Measuring the impact of a given range of modifications
3. Creating a model to predict the impact of a given modification
4. Using the model to find the optimal intervention
5. Evaluating the impact and going back to point 4
By applying this method, you can usually eliminate the risk of spending hours tinkering with the wrong measures and thus speed up your optimization process significantly. At the same time, the predictive analytics approach is proactive: prepared properly, you can act before an event critical to your campaign takes place.
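The loop above can be sketched schematically. Everything in the following snippet — function names, interventions, and predicted impacts — is a hypothetical placeholder standing in for real measurement and modelling infrastructure, not an actual system:

```python
# Schematic sketch of the predictive-analytics loop described above.
# All names and numbers are hypothetical placeholders.

def predict_impact(model, change):
    """Step 3: look up the predicted effect of a change. A real model
    would be fitted on measured historical interventions."""
    return model.get(change, 0)

def best_intervention(model, candidates):
    """Step 4: use the model to pick the intervention with the largest
    predicted impact."""
    return max(candidates, key=lambda c: predict_impact(model, c))

# Toy "model": predicted extra daily visitors per intervention.
model = {"raise_budget": 120, "new_ad_copy": 80, "do_nothing": 0}

print(best_intervention(model, list(model)))  # → raise_budget
```

After each cycle, the measured outcome of the chosen intervention would be fed back into the model (step 5), refining the predictions over time.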
… it’s all in the mix
Now what does this mean in terms of resources? Conducting proper measurements and creating a reliable scientific model is a time-consuming task, so deployment costs are usually higher than when you go for ye good olde gut feeling.
On the other hand, you will be able to automate the optimization process to a significant degree, which decreases your operational costs – in the long run, the predictive analytics approach usually outperforms the trial-and-error approach in terms of overall operational costs as well.
One real-life example
So how does this work in the wild? Let's look at one instance where we developed a predictive model for one of our customers, an e-business company whose revenue depends heavily on its online performance. Naturally, our customer invests a considerable share of its revenue in online marketing.
However, as revenue varies from month to month, our customer wants to adjust its budgets accordingly. But how can it adjust its marketing spend without knowing in advance how revenue will evolve in the coming months? Setting the optimal budget seems to be quite a challenge.
Instead of going for trial and error, i.e. setting an arbitrary budget and adapting it over time while tinkering with arbitrary measures, we built a model that predicts the number of daily visitors, assuming the online marketing budget stays the same over time.
The blue line shows the predicted number of visitors, the red line the actual daily visitor numbers. The effective 30-day model accuracy is 88.9% ± 2.0%.
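To give a feel for how such a visitor forecast might work, here is a minimal seasonal-naive sketch: predict each future day as the average of the same weekday slot in recent history. The data is synthetic, and the model we actually built is considerably more sophisticated (trend, holidays, and marketing spend all matter in practice):

```python
# Minimal seasonal-naive visitor forecast: average each weekday slot
# over recent weeks. History below is synthetic illustration data.
from statistics import mean

def seasonal_naive_forecast(history, horizon, season=7):
    """Forecast `horizon` future values by averaging, for each weekday
    slot, the observations falling in that slot of `history`."""
    slots = [[] for _ in range(season)]
    for i, value in enumerate(history):
        slots[i % season].append(value)
    slot_means = [mean(s) for s in slots]
    start = len(history)
    return [slot_means[(start + h) % season] for h in range(horizon)]

# Four weeks of synthetic daily visitors with a weekly pattern.
history = [1000, 1100, 1050, 1200, 1300, 900, 800] * 4
print(seasonal_naive_forecast(history, horizon=7))
# → [1000, 1100, 1050, 1200, 1300, 900, 800]
```

A baseline like this captures only the weekly seasonality; its value lies in providing a reference against which the effect of deliberate changes can later be measured.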
As the number of visitors usually correlates with the number of transactions, we now have an idea of what revenue will look like in the months ahead, provided we do not change the online marketing budget. But we do love to make changes, and so we implement a given set of changes to see how they affect performance.
Now, at the end of the month, we can measure the impact of those changes by comparing the actual figures with the forecast. The findings gained this way can then be integrated into our model. Once we have enough data and a properly refined model, we will be able to predict the impact of a given set of changes in advance. This allows us to optimize our customer's budget with statistically proven measures.
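Comparing realization with forecast boils down to an error metric. One common way to express an accuracy figure like the one quoted above is 100% minus the mean absolute percentage error (MAPE); the numbers below are illustrative, not the customer's actual data:

```python
# Forecast accuracy as 1 - MAPE (mean absolute percentage error).
# Actual and predicted values below are illustrative only.

def forecast_accuracy(actual, predicted):
    """Return accuracy as 1 minus the mean absolute percentage error."""
    errors = [abs(a - p) / a for a, p in zip(actual, predicted)]
    return 1.0 - sum(errors) / len(errors)

actual    = [1040, 1180, 990, 1250, 1320, 870, 790]   # realized visitors
predicted = [1000, 1100, 1050, 1200, 1300, 900, 800]  # forecast visitors

print(f"{forecast_accuracy(actual, predicted):.1%}")  # → 96.2%
```

Any systematic gap between actual and predicted values that coincides with a deliberate change is a candidate estimate of that change's impact, which can then be fed back into the model.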
Predictive Analytics – so what?
So is predictive analytics here to solve all of your problems? Not in every case. It often makes perfect sense to stick with the simpler trial-and-error approach, especially for intermittent issues. Predictive analytics is a powerful, fast, flexible, and highly scalable tool, but it requires time and know-how to set up and implement. Therefore you should consider all your options carefully – after all, that's what predictive analytics is really all about!
Talk to us if you would like to find out how to team up gut feeling and mathematical models. Our software engineering team is looking forward to assisting you.
Data scientist Gergely Kalmar worked at the Wigner Research Centre for Physics and at Morgan Stanley before joining forces with Webrepublic, Switzerland's first and so far only German-speaking Google Analytics Certified Partner. Learn more about our analytics services here.