What decisions are we making based on our data if we ignore one of the important factors of the on-line experience – the speed, or otherwise, of our site?
What factors are influencing your visitors as they convert or abandon on your site? We segment visits that nearly convert, by the search terms or ad copy that brought them, by recency, by frequency etc. Yet we know how a shoddy on-line experience can dampen the hottest leads or most ardent shoppers.
This solution not only supports a business case for improvement but also increases the accuracy of the insights we can draw, because the Latency Data can be correlated with other important interaction-based and visit-based data.
In last week’s post we tabulated (in a Google Spreadsheet) the comparison of Custom Variables and Events. Today, we apply that knowledge and offer the configurable javascript code that makes it all happen.
Let’s first look at the sample reports so we can get a visual grasp on the end result (unless you’d prefer to jump to the Solution Design description).
My purpose today is not to prove or disprove that latency adversely impacts the user experience or affects revenue and conversions. It is to describe a near “drop-in” solution, to show how Custom Variables and Events can be combined to solve more complex problems, and to display sample reports.
The sample data was generated using test pages to simulate the various scenarios (visits of varying loading times, skipped Entrance or mid-visit pages, and different traffic Sources). The actual code was used, but the actual load times were overwritten with random values within each of the slow, medium and fast ranges. This simulation deliberately favoured faster visits to demonstrate what that looks like in the reports.
All but one of the Reports are from our Google Analytics solution. That solution was based on our original implementation in SiteCatalyst. Here is a custom SiteCatalyst report showing real, live data. Although the data is obfuscated to preserve anonymity, the percentages have been retained:
But here’s the kicker! Those numbers are from the US site. On the identical Canadian site, the fastest loading visits accounted for around 85% of traffic and a little less than 85% of Revenue!!! Do Canadians have a faster network infrastructure and more patience?
Let’s start with the juiciest data.
A Visit Average Load Time is the average of the load times of all the pages in a visit. It is reported throughout the visit in a Session Level Custom Variable (SLCV). As with all Custom Variables, the last value in the scope (Session) overwrites all previous values so only the last reported value is retained.
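The mechanics can be sketched with the legacy ga.js API the post’s code uses. The slot (#3) and session scope are from the post; the bucket boundaries, key name and helper functions below are illustrative assumptions, not the actual configuration:

```javascript
// Sketch only: bucket a visit's average page load time and report it
// in a Session Level Custom Variable (slot 3, scope 2 = session).
// Bucket boundaries and the key name are assumptions for illustration.
var _gaq = _gaq || [];

function bucketLoadTime(seconds) {
  // Map a load time to a labelled range (hypothetical ranges).
  if (seconds < 3) return '00.0 - 02.9 secs';
  if (seconds < 6) return '03.0 - 05.9 secs';
  if (seconds < 12) return '06.0 - 11.9 secs';
  return '12.0+ secs';
}

function reportVisitAverage(pageLoadTimes) {
  var total = 0;
  for (var i = 0; i < pageLoadTimes.length; i++) total += pageLoadTimes[i];
  var avg = total / pageLoadTimes.length;
  // The last value written in the session wins, so re-reporting on
  // every page leaves the end-of-visit average in the report.
  _gaq.push(['_setCustomVar', 3, 'Visit_Avg_PageLoad_Times',
             bucketLoadTime(avg), 2]);
}

reportVisitAverage([2.1, 4.8, 3.3]); // average ≈ 3.4 secs
```

Because each page of the visit re-reports the running average, the “last value wins” behaviour of Custom Variables works in our favour here rather than against us.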
Exhibit 1 shows a report of the Session Level Custom Variable (Slot/Key #3) showing the Average Load Times of all Pages in a Visit, as reported at the end of the visit.
The first Performance Graph shows the load time periods and the percentage of revenue that visits of each load time period contributed.
Values are similar in nature to any other custom Session attribute, such as “Purchaser”, “Member”, etc. Here, instead of “Member” or “Subscriber”, the session is described by the Average Load Time of all pages in the visit, as at the end of the visit.
The report shows 70% of total revenue is contributed by visits in the fastest range and 80% by the top two.
If the ranges show similar percentages for revenue as for visits, the report might suggest that Latency does not impact Revenue on this site.
Let’s see the distribution of Visits in Exhibit 2. The most lucrative visits accounted for only 30% of all visits. Clearly, the fastest loading visits contributed way more revenue than their “fair share”.
Superficially, at least, it follows that reducing latency to bring more visits up within the fastest ranges is likely to result in increased revenue.
The more the trend holds true across other relevant segments such as Geography and Traffic Sources the greater the likelihood that reduced Latency means increased revenue.
And here is where this solution really pays. Exhibit 3 shows Revenue attributed by Average Load Time and Medium. Medium (or Source or Keywords or Landing Pages, etc.) is not always the determining factor. Even when one is, how do you know it to be the main issue? Here, 70% of Revenue comes from the fastest visits, regardless of Medium, so where should this site owner be investing for increased revenue?
Conversely, if conversions are not what you’d expect in other segments (e.g. Branded Search or Paid Search, etc.) and optimization efforts are not bearing fruit, measuring Page Load Latency (PLL) may explain things.
There are 2 ways of tracking Load Times of Bounced Pages. Both are in the main profile and neither uses Events. Here’s one method for now.
Exhibit 4 shows what the report looks like when the slowest Landing Page load times account for the greatest percentage of bounced visits.
The same variable (Session Level, here using Slot 3) is used for both Bounced and Visit Average Load Times. On the first page of a visit, the value is submitted with the key “Bounce_PageLoad_Times”. If there are no other pages, that Key and Value remain.
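The shared-slot trick can be sketched as follows. The slot number and the “Bounce_PageLoad_Times” key are from the post; the second key, the value formatting and the function itself are hypothetical stand-ins:

```javascript
// Sketch of sharing one Session Level slot between bounce and
// visit-average reporting. Only bounced visits keep the bounce key,
// because any later page overwrites slot 3.
var _gaq = _gaq || [];

function reportPageLoad(loadSeconds, isFirstPageOfVisit) {
  if (isFirstPageOfVisit) {
    // Entrance page: write the bounce key. If the visitor leaves
    // now, this key/value pair survives as the session's value.
    _gaq.push(['_setCustomVar', 3, 'Bounce_PageLoad_Times',
               loadSeconds.toFixed(1) + ' secs', 2]);
  } else {
    // Any subsequent page overwrites slot 3 with the visit-average
    // key, so the bounce key disappears for multi-page visits.
    _gaq.push(['_setCustomVar', 3, 'Visit_Avg_PageLoad_Times',
               loadSeconds.toFixed(1) + ' secs', 2]);
  }
}

reportPageLoad(4.2, true);   // entrance page: bounce key written
reportPageLoad(3.1, false);  // second page: slot 3 overwritten
```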
Exhibits 5 & 6 show the Individual Page Load Times of the Pages on the site. Unlike Visit Average Load Times, these are Page Level Custom Variables and can therefore be broken down by Page, as shown in Exhibit 6.
Exhibit 6 identifies the actual pages that loaded within 42 to 44.9 seconds, the worst on the site, and which ones bounced.
And that’s the 2nd way to report Bounce Pages Load Times. Exhibit 5 shows the Bounce Rates by Time Range while Exhibit 7 shows the Bounce Rates of pages when they loaded within the 42 to 44.9 second time range.
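The page-level measurement can be sketched like this. The timing approach (a timestamp near the top of the page, another at load) is the standard pattern for this era of tracking code; the slot, key name and function are illustrative assumptions:

```javascript
// Sketch: report an individual page's load time in a Page Level
// Custom Variable (scope 3), so it can be broken down by Page.
// Slot 4 and the key name are hypothetical.
var _gaq = _gaq || [];

// startMs would be captured near the top of the page (e.g.
// new Date().getTime() in the <head>); endMs at window.onload.
function reportPageLoadTime(startMs, endMs) {
  var seconds = (endMs - startMs) / 1000;
  _gaq.push(['_setCustomVar', 4, 'Page_Load_Times',
             seconds.toFixed(1) + ' secs', 3]); // scope 3 = page level
}

reportPageLoadTime(1000, 5600); // a 4.6 second page load
```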
Exhibit 7 lists all actual Entrance Pages that were skipped and not recorded as Pages Views. It also shows the Sources that GA recorded for their visits. Comparing the Skipped Sources to the Skipped Entrance Pages above, we see in rows 5 & 6 that campaign data was lost. In row 5, the visit was attributed as a Direct visit while in row 6 the visitor returned via the same campaign as before (this time!).
The Event Label is used to capture the Referral Path that led to the visit begun by a skipped Entrance Page. Exhibit 8 shows those Referral Paths matched with the actual Medium of the visit. Rows 3 and 4 show missed Mediums: their referral paths indicate both were referred rather than from an email campaign or direct.
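The skipped-entrance Event might be sketched as below. Using the Label for the referral path is from the post; the Category and Action names and the referrer parsing are hypothetical:

```javascript
// Sketch: on the first *tracked* page of a visit whose real entrance
// page was skipped, fire an Event carrying the referral path in the
// Label. Category/Action names are hypothetical.
var _gaq = _gaq || [];

function trackSkippedEntrance(referrer) {
  var path = '(none)';
  if (referrer) {
    // Strip the scheme and host, keeping only the referral path.
    path = referrer.replace(/^https?:\/\/[^\/]+/, '') || '/';
  }
  _gaq.push(['_trackEvent', 'Skipped Pages', 'Entrance', path]);
}

trackSkippedEntrance('http://www.example.com/blog/post-1');
```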
Placing the code at the top of the page (hopefully it’s the Async snippet) will deal with skipped tracking. The remaining problem, which this solution identifies, is visitors not seeing your content or not experiencing your site as you expect.
Tracking Skipped Mid-Visit Pages is almost identical to Entrance pages except that the Event value records the number of skipped pages before the one that was recorded. The number should always record at least 1 so the metric to watch is the Average Value. Sorting descending on Average Value will bring pages to the top that more typically followed a series of skipped pages. This suggests either a problem with the GA tracking on those pages (but not with the initial snapshot), a slow section of the site or a section with which your visitors are very familiar.
The other numbers to watch are the differences between Total and Unique Events. The greater the gap, the more frequently such pages are being skipped.
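The mid-visit variant can be sketched as follows. The Event value carrying the skipped-page count (always at least 1) is from the post; the Category, Action and function shape are hypothetical:

```javascript
// Sketch: on a tracked page that followed one or more skipped pages,
// fire an Event whose value is the number of pages skipped.
// Category/Action names are hypothetical.
var _gaq = _gaq || [];

function trackSkippedMidVisit(currentPage, skippedCount) {
  if (skippedCount < 1) return; // nothing skipped, nothing to report
  _gaq.push(['_trackEvent', 'Skipped Pages', 'Mid-Visit',
             currentPage, skippedCount]);
}

trackSkippedMidVisit('/checkout/cart', 2); // 2 pages skipped before cart
```

Since every fired Event has a value of at least 1, an Average Value well above 1 flags pages that typically follow a run of skipped pages.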
The solution comprises two parts:
Tracking code implemented at the top of the page will deal with skipped tracking, but we are talking about visitors skipping pages, ads, products and other messages. The code for this implementation currently uses the “Legacy” code but will be converted to use async code. It will still track whether users have skipped pages.
The common features between the two parts are:
While they are related, the two solutions do not have to be used together.
Relating the general strengths and weaknesses of CVs and Events tabulated in my previous post to our specific problem:
Events:
Custom Variables have their own set of strengths, weaknesses and one interesting quirk:
The PLL and Skipped pages fit nicely within all these attributes and quirks:
The easiest configuration is:
We are making the code available upon request. Leave a comment requesting it, if you are happy with the following:
If users of the code are interested, we will gladly do a joint case study for submission to Google for posting on their blog.
Brian Katz – Analytics – VKI