In the world of web analytics, not all metrics are created equal. Some can stand on their own, giving you valuable insight into business performance at a glance. We call these metrics Key Performance Indicators (KPIs), and we depend on them to guide our marketing efforts, especially as we head into the busy holiday season.
For retailers, popular KPIs include Conversion Rate, Average Order Value and Customer Lifetime Value.
KPIs make up only a tiny fraction of the metrics available to marketers. Anyone who has logged into Google Analytics can tell you that there are seemingly infinite reports, metrics and dimensions available by default. While many of these can support your KPIs, a few “usual suspects” always seem to confuse marketers and steer the ship in the wrong direction.
Bounce rate
Bounce rate is a fairly simple metric. It measures the percentage of website visitors who view a single page on your website before leaving, or “bouncing.” Marketers tend to obsess over their website’s bounce rate, constantly questioning if it aligns with bounce rates of similar stores in their industry.
But here’s the thing: There is no standard bounce rate. Differences in design, conversion paths, user experience and page type can wildly influence bounce rate on your website. For example, imagine a visitor searches for “how to clean my Lenovo PC” and lands on a support article on your site that answers that exact question. The user will be highly satisfied and will likely bounce after finishing the page. This will increase bounce rate, while also improving user satisfaction.
Pro tip: Use bounce rate to optimize conversions on individual landing pages. Drill into your landing page reports to see how data differs for devices, sources and user type. This will define what a poorly performing page is for your website and allow you to A/B test more effectively.
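To make the per-page approach concrete, here is a minimal sketch in Python. The page paths and session counts are invented for illustration; the point is that bounce rate is just single-page sessions divided by total sessions, computed per landing page rather than site-wide:

```python
# Bounce rate = single-page ("bounced") sessions / total sessions, per page.
# The pages and numbers below are hypothetical illustration data.
landing_pages = {
    "/support/clean-lenovo-pc": {"sessions": 1200, "bounces": 960},
    "/products/laptops":        {"sessions": 800,  "bounces": 240},
}

for page, stats in landing_pages.items():
    rate = stats["bounces"] / stats["sessions"]
    print(f"{page}: {rate:.0%} bounce rate")
```

A support article bouncing at 80 percent may be perfectly healthy, while a product page doing the same would be a red flag, which is exactly why a single site-wide number hides more than it reveals.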
Average session duration
Average session duration is a confusing metric, because it doesn’t measure the total time of a website visit. Instead, it measures the time between an initial page load and subsequent page loads or event triggers. Imagine the following sequence:

0:00 The visitor lands on your homepage
1:00 The visitor clicks through to product page #1
2:00 The visitor clicks through to product page #2
3:00 The visitor leaves the site
In this example, the total time spent on the site is three minutes. However, average session duration will only show as two minutes, since there is no “ending” timestamp when the user exits on product page #2. Therefore, only the first two minutes are logged.
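The arithmetic behind this can be sketched in a few lines of Python (the timestamps are hypothetical, matching the three-minute visit described above). Analytics can only measure from the first hit to the last hit it receives, so any time after the final pageview is invisible unless an event fires:

```python
# Hit timestamps, in minutes from session start (hypothetical example:
# three pageviews one minute apart, then the visitor exits at minute 3).
hits = [0, 1, 2]          # homepage, product page #1, product page #2
actual_exit = 3           # the visitor leaves at minute 3, but no hit is sent

# Analytics measures from the first hit to the LAST hit it saw.
measured_duration = max(hits) - min(hits)   # 2 minutes, not 3

# If an event (scroll, click, video play) fires just before exit,
# that final interaction is captured and the measurement improves.
hits_with_event = hits + [3]
improved_duration = max(hits_with_event) - min(hits_with_event)  # 3 minutes

print(measured_duration, improved_duration)
```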
This is where our old friend, bounce rate, comes to ruin the party. Unless custom events are set up in Analytics, a bounce will typically log a time on site of 0:00, no matter how long the user actually spends on your site during that single page session. This can greatly skew the average session duration metric, especially on sites with high bounce rate variance per page.
Pro tip: Track scroll rate, button clicks, video plays, PDF downloads and other on-page interactions with event tracking to get a more accurate understanding of session durations. This metric can also be used as a relative metric on a per-page basis when A/B testing, similar to bounce rate.
Direct traffic
Despite its name, the definition of direct traffic isn’t very direct at all. Common sense suggests it consists of people who type your URL directly into their browser. But many fail to realize that direct traffic also includes any visit Google Analytics can’t attribute to another source.
Here are a few common examples of traffic that can be logged as direct:
Link clicks in external email clients like Outlook and Mac Mail
Link clicks from external documents like PDFs or Microsoft Office files
Browser tracking errors that confuse analytics
Visits from HTTPS sites to non-secure HTTP pages, which strip referral source data
Misattributed search engine traffic (In fact, an experiment by Groupon showed that up to 60 percent of their direct traffic was actually from organic search!)
Because of this confusion, you may be over- or under-valuing different channels, especially email, organic and “direct” traffic.
Pro tip: Use UTM tracking whenever possible to add another level of precision to campaign tracking online. This can help reduce false positives and allow you to give credit where it’s due.
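As a rough sketch of what UTM tagging looks like in practice (the campaign values and example.com URL here are invented), the standard utm_ parameters are simply query-string fields appended to a landing page URL:

```python
from urllib.parse import urlencode

# Hypothetical campaign values; the utm_ parameters below are the
# standard ones Google Analytics recognizes for attribution.
base_url = "https://www.example.com/sale"
utm = {
    "utm_source": "newsletter",      # where the traffic comes from
    "utm_medium": "email",           # the channel
    "utm_campaign": "holiday_sale",  # the campaign name
    "utm_term": "laptops",           # optional: paid keyword
    "utm_content": "header_link",    # optional: which link or creative
}

tagged_url = f"{base_url}?{urlencode(utm)}"
print(tagged_url)
```

Clicks on a link tagged this way land in the Email channel with the campaign name attached, instead of vanishing into the direct-traffic bucket.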
Here’s an example from FeedOtter on how they measure different campaigns with UTM codes:
Site speed
Speed is an essential metric that can heavily influence conversion rates, SEO and user experience. And with Google’s firm focus on Mobile First, you can be assured that speed will continue to play a more important role as time goes on.
And here’s where things get strange. Despite Google’s emphasis on site speed, its own partner teams advise agencies and brands not to trust the site speed reports in Analytics. There are a few reasons why:
Analytics averages a small sample of pageviews (1 percent by default), so outliers can create a large gap between what the data tells you and what your speed actually is.
Timeouts and errors are logged as 0:00, when often the real problem is the opposite: pages that take too long to load.
You’ll see different speeds when using this report vs. Chrome DevTools, Search Console and third-party tools.
While these reports can identify some user experience issues, it is recommended that you use a more consistent reporting tool to identify problem pages and overall site performance.
Pro tip: The Google Partners team recommends two tools to measure site speed. Chrome DevTools is built right into your browser and gives a granular view of site performance and speed. WebPageTest.org is a third-party site that measures speed across different device types and browsers and gives you a free video showing average load times.
The takeaway
This list was just a start, but there are hundreds of other deceptive metrics lurking around every corner.
For retail marketers, it’s critical to keep your metrics in context. You should understand the different metrics that are important to your business, see where they fit into the big picture and know how each can be influenced.
Numbers don’t lie, but they don’t always tell the truth, either.