Online marketers love testing: A/B testing, multivariate, holdouts, incrementality. Heck, before I was married, I’d A/B test my profile descriptions on dating sites. (For some reason, my best results came when I posted the blurriest photos … kidding.)
The beauty of online marketing is that we get reams of data, almost instantly, that enable us to quickly test iterations to improve our results. Nerdy as it may sound, it is exciting to launch an experiment and sit in front of the computer and watch the results show up on a dashboard, right? Who’s with me?
I firmly believe that the best online marketers are the folks who are religious about testing every decision possible. More often than not, however, I’ve observed marketers who only selectively implement a testing regimen.
This article describes some of the most common instances where marketers eschew testing — and why their reasons for doing so aren’t good enough.
Testing Fail #1: We’re At Maximum Efficiency
Like any team inside a company, marketers have to fight for budget, be it annually or quarterly. Once budget is established, the marketing team is often faced with the perverse “use it or lose it” choice, where reducing budget (even if for smart reasons) results in the marketing team getting “punished” with less budget in the next cycle.
In a true testing environment, “use it or lose it” should never be a factor in deciding the optimal budget. Marketing teams should communicate to senior leadership that part of their responsibility to the company is to test different budget levels to find the most efficient one, and that next year’s budget shouldn’t be cut just because not all of the prior year’s budget was spent.
At One Kings Lane (a former client), for example, VP of Marketing Jim Kingsbury routinely schedules an “efficiency month” where budgets are slashed from every channel and each channel team is directed to drive the same amount of revenue from significantly less budget. This exercise forces teams to think creatively about channels and tactics and drives tests that often improve efficiency.
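To make that concrete, here’s a minimal Python sketch of the kind of analysis a budget test (or an “efficiency month”) can feed: comparing the marginal return on each incremental slice of spend rather than the blended average. All spend and revenue figures below are hypothetical illustrations, not data from any client.

```python
# A minimal sketch: compare marginal return on each incremental slice of
# budget to see where additional spend stops paying for itself.
# All figures below are hypothetical.

# (monthly spend, attributed revenue) observed at different tested budget levels
budget_tests = [
    (50_000, 210_000),
    (75_000, 290_000),
    (100_000, 345_000),
    (125_000, 365_000),
]

prev_spend, prev_rev = 0, 0
for spend, revenue in budget_tests:
    marginal_roas = (revenue - prev_rev) / (spend - prev_spend)
    print(f"${spend:>7,}: total ROAS {revenue / spend:.2f}, "
          f"marginal ROAS {marginal_roas:.2f}")
    prev_spend, prev_rev = spend, revenue

# Total ROAS still looks healthy at $125k, but the marginal ROAS on the last
# $25k has dropped to 0.8 -- evidence that a smaller budget is more efficient,
# regardless of what "use it or lose it" pressure suggests.
```

The point of the exercise isn’t the specific numbers; it’s that only by testing multiple budget levels can you see where the curve flattens.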
Testing Fail #2: Our Agency/Team Can’t Be Beat
Humans are strongly affected by “confirmation bias,” which Wikipedia defines as “the tendency to search for, interpret, favor, and recall information in a way that confirms one’s beliefs or hypotheses.” In marketing organizations, this is most apparent when it comes to choices about staffing.
I’ve seen this manifest itself in two ways:
Switching budget from an agency to an in-house team without testing
Replacing an agency without any form of bake-off
For example, let’s say that a new VP of marketing joins a company with an existing agency relationship in place. The VP then hires some in-house team members who subsequently tell him that they can manage the marketing better than the agency.
Since the VP hired these team members (but inherited the agency), the VP subconsciously concludes that his team must be better than the agency, because the internal team is a reflection of him and the agency is not. The same situation occurs when the VP replaces the agency with an agency he has worked with in the past.
Marketing experts who truly believe that testing drives the best results eschew confirmation bias, or at least try to dampen its influence. So in the case where the in-house team claims that they can outperform the agency, a test-driven VP would give half the budget to the agency and half to the in-house team to see which one truly performs better.
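If the bake-off is run as a genuine split test, the winner should be declared from the data rather than gut feel. Below is a minimal, self-contained Python sketch of one way to do that: a two-proportion z-test on conversion rate. The cohort numbers are hypothetical, and the choice of conversion rate as the yardstick is an assumption; cost per acquisition or revenue per dollar would work just as well.

```python
# Minimal sketch of evaluating an agency vs. in-house bake-off.
# All numbers below are hypothetical; plug in your own cohort data.
from statistics import NormalDist

def two_proportion_z_test(conv_a, visits_a, conv_b, visits_b):
    """Return (z statistic, two-sided p-value) comparing two conversion rates."""
    p_a, p_b = conv_a / visits_a, conv_b / visits_b
    pooled = (conv_a + conv_b) / (visits_a + visits_b)
    se = (pooled * (1 - pooled) * (1 / visits_a + 1 / visits_b)) ** 0.5
    z = (p_a - p_b) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return z, p_value

# Hypothetical results after a month of split budget:
agency = {"conversions": 540, "visits": 24_000}     # agency-managed half
in_house = {"conversions": 610, "visits": 25_500}   # in-house-managed half

z, p = two_proportion_z_test(agency["conversions"], agency["visits"],
                             in_house["conversions"], in_house["visits"])
print(f"z = {z:.2f}, p = {p:.3f}")

# Only declare a winner if the difference is statistically meaningful;
# otherwise keep the split running longer, or call it a tie on performance
# and decide on other factors (cost, strategy, responsiveness).
```

The design choice here is deliberate: rather than asking “who do I trust more?”, the VP asks “does the data show a real difference?”, which is exactly the discipline that keeps confirmation bias in check.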
For more on the general decision of in-house versus agency, see my last Marketing Land post.
Testing Fail #3: We Don’t Have Time/Budget For New Channels/Technology
Marketing teams are under constant pressure to deliver better and better results. This often leads marketers to focus all of their attention on “what’s working now” with little regard to “what could work better in the future.”
I see this most frequently when it comes to new channels or technology. From a channel perspective, I believe the job of a marketer is to figure out where customers are online and then market to those customers.
Customer behavior online changes very rapidly, and that generally means that a marketer’s media mix should evolve with equal speed. And yet, marketing spend almost always lags behind customer behavior.
If you don’t believe me, just look at the amount of money that is still spent on print and TV versus the amount spent on mobile advertising, and then compare that to actual consumer usage.
The same hesitancy to try new channels applies to technology. Having to talk to the IT team about implementing a new pixel (much less an entire new software system), juxtaposed with comfy familiarity with an existing system, leads marketers to avoid new technology vendors like the plague. (In economics, this is referred to as “switching costs.”)
A marketer committed to testing would set aside both time and budget for new channels and technologies, always trying to find the next great thing that will move their business forward.
Don’t Be A HIPPO, Be A Tester
I once quipped that the biggest challenge facing online marketers today is “mo’ data, mo’ problems.” There is not enough time in the day to look at all the different data sources available to us.
As most marketers have discovered, the solution to this overabundance of information is to be selective about what you choose to analyze. There’s a huge difference, however, between selective testing and no testing at all.
A complete lack of testing leads to decisions by HIPPO, “the highest paid person’s opinion.” And a hippo who is also influenced by confirmation bias, budget pressure and switching costs is unlikely to make the most efficient decision for a business.
My friend Chris Goward from Wider Funnel wrote a book recently that has perhaps the most perfect title of all time: “You Should Test That.” At the end of the day, marketers who truly believe in the power of testing should live by this adage when asked for their opinion on any topic.
Should we replace our agency with an in-house team? You should test that. Should we change ad servers? You should test that. Do I need more budget? You should test that!
Don’t be a hippo, be a tester!