Imagine a page with 158 different links on it. This page is critical to your sales funnel, but with so many links, there are thousands of paths your visitors could take. Some links go to new pages. Some jump to other parts of the same page. Some open overlays. What’s an optimizer to do?
This is a daunting, if not impossible, task to instrument using traditional analytics tools like Google Analytics and Adobe Analytics. Click-tracking “heatmap” tools will tell you where visitors are clicking, but they have limited ability to segment traffic, and they don’t handle overlays or fly-out menus well.
The solution? Consider using a split testing tool.
By setting click goals during our split tests, we got a very clear picture of where visitors were clicking on this complex page. We could discern the language that activated visitors’ left mouse buttons and moved them closer to converting.
These insights shaped our choice of language to test and guided us to those pages most likely to influence conversions.
Testing Tools: They’re Not Just For Breakfast Anymore
Split tests are studies of what works on your website. By “works,” we mean generates more revenue, more leads, or more subscribers.
Properly set up, split tests are also good at answering the question, “Why?” Why did a certain treatment succeed or fail? What kinds of visitors made it through to the ultimate goal?
We often portray split testing as something different from studying analytics data. It really isn’t — it is a way of collecting additional data. This data is comparative and quantitative. It controls for the “history effect,” that is, things that change over time. In short, it is some of the most reliable data you can gather.
While you may see the primary job of split testing software as directing traffic to different versions of a page, don’t overlook the intricate and flexible tracking capabilities built into it.
When Analytics Let You Down
It’s somewhat unreasonable to ask a data analyst to anticipate all of the questions you might ask about your website and design data collection strategies for them. At the very least, it’s expensive.
Split testing tools provide a way to instrument paths and funnels with impressive accuracy, and their targeting is sophisticated. In fact, we have run split tests with no split at all, just to collect data on visitors.
Lighting Up Your Funnels
Developers make choices that confound optimizers. Often, we will encounter a multiple-step cart or registration process for which the URL never changes. In a traditional analytics setup, this makes the funnel invisible. However, your split-testing tool can see it.
In one example, we selected elements unique to each page in a signup process and tracked clicks on them. A click indicated that the visitor had reached that page and continued entering data.
The form field asking for visitors’ birth month can be targeted, indicating they made it to this step of the process. (Visual Website Optimizer)
This provided us with a rich view of performance throughout the conversion funnel.
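To make the mechanics concrete, here is a minimal sketch of that element-based funnel tracking. The fireGoal() helper and the selectors are hypothetical; in practice, your split testing tool supplies the conversion call, and most of this would be configured in its visual editor.

```js
// Hypothetical stand-in for your split testing tool's conversion call.
function fireGoal(name) { console.log('goal fired:', name); }

// Elements unique to each step of the signup flow (selectors invented
// for this example). A click on one tells us the visitor reached that
// step and kept entering data, even though the URL never changed.
var funnelSteps = [
  { goal: 'step-1-account', selector: 'input[name="email"]' },
  { goal: 'step-2-profile', selector: 'select[name="birth_month"]' },
  { goal: 'step-3-payment', selector: 'input[name="card_number"]' }
];

document.addEventListener('click', function (event) {
  funnelSteps.forEach(function (step) {
    if (event.target.matches(step.selector)) {
      fireGoal(step.goal); // most tools count each goal once per visitor
    }
  });
});
```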
There are 30 goals set up for this test. We are tracking page visits, clicks on page elements, and even a sign-in action. (Visual Website Optimizer)
Each treatment in the test can be evaluated separately based on how visitors interact with site elements.
Tracking Funnels For Different Experiences
When testing one or more treatments, splitting the traffic among different experiences, traditional analytics can fall down. With split testing tools, we can see how our changes affect the funnel for each experience. One experience may generate more conversions, while another generates more click-throughs (and thus more abandonments). Could we come up with an experience that gets the clicks without the abandonment?
These goals track a sales funnel for each of the experiences we are testing. (Visual Website Optimizer)
Here, you can see a separate goal for each step in this website funnel. Other goals include clicks on elements and navigation interactions.
Which Pages Influence Conversions
If you knew that when visitors came to one page on your site they were twice as likely to convert, wouldn’t you do everything in your power to get more visitors to that page? This is a common insight available using a Page Index or Page Value metric in your analytics database. But these metrics are blunt, failing to quantify the true impact of different treatments.
It was with a split test that we discovered just such a page for North Central University. Thanks to multi-goal tracking, we found that visitors to the financing page were far more likely to request more information — and then to convert.
Talk About Multi-Goal Testing
There are quite a few goal-setting options offered in most split testing packages. All include tracking form submissions, revenue generated, and visits to a “thank you” page.
As we’ve discussed, goals can be “fired” when a visitor clicks a link or another element on the page.
There are a number of ways to set a goal in most split-testing programs.
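Under the hood, a click goal amounts to very little code. Stripped to its essentials, it looks something like this sketch, where the selector and the fireGoal() helper are invented for illustration:

```js
// Hypothetical stand-in for your tool's conversion call.
function fireGoal(name) { console.log('goal fired:', name); }

// A click goal, reduced to its essentials: listen for clicks on the
// element and record the goal. The selector is illustrative.
var cta = document.querySelector('a.request-info');
if (cta) {
  cta.addEventListener('click', function () {
    fireGoal('clicked-request-info');
  });
}
```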
In many cases, it’s necessary to track a variable set on a page or an element that doesn’t have a well-defined “id” attribute. This requires some custom JavaScript, which the most common split testing tools on the market support.
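Here is a minimal sketch of both cases, with hypothetical fireGoal() and fireRevenueGoal() helpers and invented names standing in for your tool’s own calls:

```js
// Hypothetical stand-ins for your tool's conversion calls.
function fireGoal(name) { console.log('goal fired:', name); }
function fireRevenueGoal(name, amount) { console.log(name, amount); }

// 1. An element with no usable id: reach it with a CSS path instead.
var promo = document.querySelector('.sidebar .promo a');
if (promo) {
  promo.addEventListener('click', function () {
    fireGoal('clicked-sidebar-promo');
  });
}

// 2. A variable the page template sets, such as an order total.
if (typeof window.orderTotal === 'number') {
  fireRevenueGoal('purchase', window.orderTotal);
}
```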
Wildcard characters and regular expressions are powerful ways to target goals across many pages.
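For example, a single goal could cover every product detail page on a site. A wildcard such as example.com/products/* handles the simple case; a regular expression gives finer control. This sketch uses an invented domain and pattern, with the same hypothetical fireGoal() stand-in:

```js
// Hypothetical stand-in for your tool's conversion call.
function fireGoal(name) { console.log('goal fired:', name); }

// One goal matching a whole family of URLs: any product detail page.
var productPage = /^https?:\/\/www\.example\.com\/products\/[^\/]+$/;

if (productPage.test(window.location.href)) {
  fireGoal('visited-product-page');
}
```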
You aren’t limited in the number of goals you can track. It is typical for us to have dozens of goals for each test. Many of these are set just in case we might learn something.
Segmenting
Split testing tools allow you to target segments of traffic. You can present a test to new or returning visitors, make a test available only to mobile visitors, and target search ad traffic.
Many of our visitors don’t identify themselves until they’ve arrived at the site. Using goals can help us identify these visitors and do some post-test analysis.
For example, we are always curious about the use of site search. The magic question is, “Do searchers buy more or do buyers search more often?”
In the split testing tool, we set a goal to track clicks on the search element for every page on the site. This can be done from inside the WYSIWYG editor or by targeting the element’s CSS selector.
This setting is used to identify visitors who use site search.
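Hand-rolled, the same goal might look like this sketch, where the #site-search selector and the fireGoal() helper are both invented for illustration:

```js
// Hypothetical stand-in for your tool's conversion call.
function fireGoal(name) { console.log('goal fired:', name); }

// Flag any visitor who interacts with site search, on every page.
document.addEventListener('click', function (event) {
  if (event.target.closest('#site-search')) {
    fireGoal('used-site-search'); // tags this visitor as a searcher
  }
});
```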
Visitors identify themselves by the pages they visit, the elements they click, whether they log in, and in myriad other ways. We’ve made great use of segments to improve website performance.
For one B2B client, people visiting one product section reacted positively to a specific call-to-action, while people researching a different product responded negatively. This technique has allowed us to target experiences so accurately for another client that we have multiplied their lead flow many times over.
Post-Test Analysis
It’s in post-test analysis that insights are really mined from this data. We use goals to answer specific questions, but we also set goals in anticipation of future questions, or just to satisfy our curiosity.
Exporting this data into Excel allows us to pivot our way to new revelations about how site elements are working and to new test ideas.
The Magic Of Multiple Goals
In a split test, the main objective is to increase your conversion rate, whether that’s by increasing transactions or lead-form fills. For example, you might create a test that changes the headline copy in your split testing tool, then add a goal to track visits to a “thank you” page.
Soon, you’ll know which headline copy converts your visitors best. This is all good, but after you read this article, you’ll never set up just one goal again.
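Conceptually, that single-goal test boils down to something like the following sketch. Every name in it is illustrative, and the tool normally handles the bucketing, persistence, and reporting for you; recordConversion() stands in for its reporting call.

```js
// Hypothetical stand-in for the tool's conversion reporting.
function recordConversion(variant) { console.log('converted:', variant); }

// Assign each visitor to a bucket once, and keep them there.
var variant = localStorage.getItem('headlineVariant');
if (!variant) {
  variant = Math.random() < 0.5 ? 'control' : 'treatment';
  localStorage.setItem('headlineVariant', variant);
}

// Show the challenger headline to the treatment group.
if (variant === 'treatment') {
  var headline = document.querySelector('h1');
  if (headline) { headline.textContent = 'The headline copy under test'; }
}

// The single goal: a visit to the "thank you" page.
if (/\/thank-you/.test(window.location.pathname)) {
  recordConversion(variant);
}
```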
To get more out of your split tests, track more goals. Measuring only one goal tells you which treatment was best for all visitors combined, while disregarding every other way they engage with your site.
By tracking multiple goals, you’ll squeeze all the data you can out of the time you spend collecting test data. Plus, you’ll even be able to transform your split testing tool into both a click-tracking heatmap tool and a funnel visualization tool.