Why we test and learn.
Mary, Mary, how does your garden grow? Best you say? Let’s Test!
If you have ever planted a garden, you understand the importance of adding fertilizer to produce bigger, fuller plants. But let’s assume we don’t know anything about gardening with fertilizer. We’ve had a garden every year with great results, but this year we want to get more tomatoes. What can we do? One idea is to incorporate fertilizer into the soil, so let’s test it.
First, let’s form a hypothesis around our situation. “If I add fertilizer to my garden, then I will increase the number of tomatoes produced.”
Now we can create a learning agenda that will help us judge our results as a success or a failure and learn from them. Keep it as simple as, “Did we get more tomatoes?”
Next, we run the experiment: half of the tomato plants get no fertilizer, and half get a standard amount. Now we observe, analyze and refine. We can look at the percentage difference in the total weight of the tomatoes and draw a conclusion. But let’s not stop there. Instead, let’s refine our test and run it again. This time, we can adjust the amount of fertilizer given to the test group and compare using the same process.
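The comparison described above can be sketched in a few lines of code. This is a minimal illustration with made-up yield numbers, not real garden data:

```python
# Hypothetical yields in pounds per plant (assumed numbers for illustration).
control = [4.1, 3.8, 4.5, 3.9, 4.2]    # no fertilizer
treatment = [5.0, 4.7, 5.3, 4.9, 5.1]  # standard fertilizer dose

total_control = sum(control)
total_treatment = sum(treatment)

# Percentage difference in total weight between the two groups.
pct_diff = (total_treatment - total_control) / total_control * 100
print(f"Control total: {total_control:.1f} lbs")
print(f"Treatment total: {total_treatment:.1f} lbs")
print(f"Percent difference: {pct_diff:+.1f}%")
```

On the next pass, you would swap in the new fertilizer amount for the test group and rerun the same comparison.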
But why are we talking about tomatoes?
This same process works when we’re trying to understand our audience and their behavior. Maybe we want to know which creative performs better, or which website feature best retains traffic and minimizes drop-off. For example, in The Download client newsletter, we test headlines to see which performs better. Half of our clients receive the email with one subject line; the other half get the email with another. This helps us identify which subject lines receive the most opens and teaches us what works best for the future. We also A/B test our social media ads to see which images and copy perform best. See the four versions we recently had in market:
And the list of things you can test in the marketing world goes on and on. This process is fundamental to optimization and performance. The only way we’ll know whether we’re doing the absolute best thing we can is to test it.
I tested and learned, now what?
All of the hard work is done, and now it’s time to evaluate. Look at the results and determine whether there is a statistically significant difference between the two groups. Whether or not the test concludes that a new element or feature should be implemented, it’s now time to refine, improve, optimize or roll out. We want to keep evaluating our tests to further improve performance and push the limits of optimization.
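For a split test like the newsletter headline example, one common way to check statistical significance is a two-proportion z-test on open rates. Below is a hedged sketch with hypothetical send and open counts (the numbers and the 0.05 threshold are assumptions, not our actual results):

```python
import math

# Hypothetical email A/B test counts (assumed for illustration).
opens_a, sent_a = 412, 5000   # subject line A: opens / emails sent
opens_b, sent_b = 480, 5000   # subject line B

p_a = opens_a / sent_a
p_b = opens_b / sent_b
p_pool = (opens_a + opens_b) / (sent_a + sent_b)  # pooled open rate

# Standard error of the difference under the pooled rate.
se = math.sqrt(p_pool * (1 - p_pool) * (1 / sent_a + 1 / sent_b))
z = (p_b - p_a) / se

# Two-sided p-value from the standard normal distribution.
p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))

print(f"Open rate A: {p_a:.1%}, open rate B: {p_b:.1%}")
print(f"z = {z:.2f}, p-value = {p_value:.4f}")
if p_value < 0.05:
    print("Statistically significant difference -> roll out the winner")
else:
    print("No significant difference -> refine and test again")
```

If the p-value clears your chosen threshold, you roll out the winner; if not, you refine the test and run it again, just as with the tomatoes.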
About the Author
As the Manager of Analytics at Bader Rutter, Jon Schotte and his team define client goals and deliver actionable insights around marketing performance. Jon started his career as an analyst and data engineer in the travel, tourism and higher education marketing industries before transitioning into e-commerce, where he focused primarily on data engineering and automation to deliver data in real-time dashboards. Jon’s dedicated team works to provide clients with the latest industry standards for data reporting and insight, focusing on telling the story behind the data.