
Are Your Google Analytics Site Speed Metrics Accurate?

Google Analytics Website Speed is a free real user monitoring (RUM) tool that exists in your GA dashboard under the “Behavior” menu. RUM measures site speed and captures other session information via a snippet of code included in a web page. GA4 does not offer site speed reports, but you can still use UA’s tool and compare the metrics with caution.

Last year we published a blog post called "How to Cut Through the Noise and Find Meaningful Data With Google Analytics Site Speed." In that article, we identified issues that can make the standard site speed output in your GA portal noisy, including the low default sampling rate and the use of mean averages (rather than medians), among others. These issues don't make the data useless, but they do add a few steps to the process of getting meaningful output.

Since then, our client services team has encountered another issue that has left some of our customers and prospects scratching their heads: the prevalence of failed data points and the effect they have on site speed averages over time. Let's explore by example.
[Screenshot: Google Analytics Site Speed comparison]
This screenshot shows a period-by-period comparison of site speed for an online retail company.  As you can see, the Average Page Load Time (sec.) is 9.5 percent worse — that is, slower — in the recent period, vs. the previous period.  Bad news!
But let’s take a peek at the data, which you can export into a CSV.  Here’s an excerpt from the file sorted by Average Page Load, low to high:
[Screenshot: Site Speed CSV excerpt]

347 of the 601 samples in the first date range reported zeros, meaning over half of the samples returned no data for Average Page Load Time. Errors like these are common in RUM data, caused by connection timeouts, failed requests, and other inconsistencies of testing on the live Internet. It goes without saying that having over half the samples fail will not produce an accurate reading. What's more, the failed samples are not flagged as errors, so they are averaged into the dashboard numbers with the same weight as the real samples.

In the next date range, it's a similar story: 301 of the 601 samples are zeros. For those keeping track at home, that's 648 failed samples out of 1,202 total.
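If you export the report to CSV, counting the failed samples takes only a few lines. Here is a minimal sketch in Python; the column header is an assumption, so check it against your actual export:

```python
import csv

LOAD_COL = "Avg. Page Load Time (sec)"  # assumed header; verify in your GA export

def count_zero_samples(rows, load_col=LOAD_COL):
    """Count rows whose page load time is zero, i.e. failed RUM samples."""
    zeros = total = 0
    for row in rows:
        try:
            value = float(row[load_col])
        except (KeyError, ValueError):
            continue  # skip footer/summary rows GA sometimes appends
        total += 1
        if value == 0:
            zeros += 1
    return zeros, total

# Typical usage with an exported file:
# with open("site_speed_export.csv", newline="") as f:
#     zeros, total = count_zero_samples(csv.DictReader(f))
```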

How do we fix this? Easy: delete the zeros and re-run the average calculation. It's not a perfect solution. Samples are supposed to be collected at consistent time intervals, and deleting half of them will throw off that aspect of the testing methodology. But the result will certainly be more accurate than an average dragged down by hundreds of errors.
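The cleanup step above can be sketched in a few lines of Python. The sample values here are hypothetical, chosen only to show how strongly zeros pull the mean down:

```python
# Hypothetical load-time samples (seconds); zeros are failed measurements.
load_times = [0.0, 0.0, 3.1, 0.0, 4.8, 2.9, 0.0, 5.2]

# The dashboard-style average counts the zeros as real data points.
raw_average = sum(load_times) / len(load_times)  # 2.0 sec -- misleadingly fast

# Drop the failed samples, then recompute.
valid = [t for t in load_times if t > 0]
clean_average = sum(valid) / len(valid)          # 4.0 sec
```

With half the samples failed, the raw mean reports a site twice as fast as the valid measurements actually show.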

Here are the averages for page load time with all of the zeros deleted:

[Screenshot: Site Speed averages with zeros removed]

These averages show that in the recent time period, May 7-31, the site actually had a faster average page load time, not slower. In fact, it's 23% better. Factor in the 9.5% decline shown in the GA dashboard, and that's a swing of nearly 33 points revealed just by cleaning up the data.

Not every site will experience the same error rate, or one this extreme. But we’ve observed over time that these findings aren’t far out of the norm.
So, what have we found?
  • In Google Analytics site speed data, errors occur frequently and inconsistently, so you can't accurately compare different time frames without some scrubbing
  • Those errors can, predictably, throw off the averages substantially
  • A little data analysis can go a long way: don't blindly trust your data, especially when the source is not purpose-built. (Many solutions focus exclusively on performance monitoring via RUM; for Google Analytics, it's a value-add feature.)
The conclusion here is the same as in our last post on cutting through the noise. Is this tool useless? No. RUM data is very interesting and potentially actionable, and GA is one of the only tools that lets you collect it so easily and for free. But you should take steps to make sure it's not misleading. That may include comparing the site speed data to the results of a free synthetic monitoring tool, and following general analytics best practices like checking for extreme outliers.
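As an illustration of that last point, one simple way to scrub RUM samples is to drop the zeros and then discard anything implausibly far above the median. The 10x threshold below is our own illustrative assumption, not a GA rule; tune it to your traffic:

```python
import statistics

def scrub(samples, outlier_factor=10):
    """Drop failed (zero) samples, then drop extreme outliers relative
    to the median. The 10x factor is illustrative, not a GA setting."""
    valid = [s for s in samples if s > 0]
    cutoff = outlier_factor * statistics.median(valid)
    return [s for s in valid if s <= cutoff]

samples = [0.0, 2.8, 3.1, 0.0, 3.4, 120.0]  # 120 sec looks like a timeout artifact
clean = scrub(samples)                       # keeps [2.8, 3.1, 3.4]
```

Using the median as the reference point matters here: unlike the mean, it isn't itself distorted by the outliers you're trying to remove.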


