Yottaa’s web monitoring service monitors tens of thousands of web sites and has accumulated a database of tens of millions of web performance samples, so we have a treasure trove of data to draw on. I set out to answer the following questions:
- Are there third-party files that come with a particularly poor de facto “service level agreement” (SLA)?
Looking across millions of samples, I examined the following metrics:
- Average size of the download;
- Average Time to Last Byte (i.e., how long it takes to download);
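The aggregation behind these metrics is straightforward. Here is a minimal sketch in Python, using hypothetical sample records (the field layout below is illustrative, not Yottaa's actual schema):

```python
# Aggregate per-file statistics from (file_name, size_bytes, ttlb_msec) samples.
# The sample values here are made up for illustration.
from statistics import mean

samples = [
    ("ga.js", 16000, 320),
    ("ga.js", 16000, 1450),
    ("plusone.js", 15000, 210),
]

def aggregate(samples, file_name):
    rows = [s for s in samples if s[0] == file_name]
    sizes = [s[1] for s in rows]
    ttlbs = [s[2] for s in rows]
    slow = [t for t in ttlbs if t > 1000]  # samples slower than 1 second
    return {
        "avg_size_bytes": mean(sizes),
        "avg_ttlb_msec": mean(ttlbs),
        "pct_over_1_sec": 100.0 * len(slow) / len(ttlbs),
    }

print(aggregate(samples, "ga.js"))
# → {'avg_size_bytes': 16000, 'avg_ttlb_msec': 885, 'pct_over_1_sec': 50.0}
```

The same three aggregates (average size, average Time to Last Byte, and percentage of slow samples) are what the table below reports per file.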
Here are the results:
| File Name | % of Web Sites | Size (Bytes) | Time to Last Byte (msec) | Samples with > 1 sec. Time to Last Byte (%) |
|---|---|---|---|---|
| ga.js (Google Analytics) | | | | |
| plusone.js (Google +1) | | | | |
| conversion.js (Google AdWords) | | | | |
| show_ads.js (Google AdSense) | | | | |
Speed of Google Analytics Around the Globe
Hosting: Should You Host on 3rd-party domain or Your Own?
Knowing the file’s size, the server’s IP address, and the download timeline, we can compute each server’s effective bandwidth. As it turns out, this varies widely – typically a couple hundred KB/sec, though (as illustrated in the histogram below) there’s a spike at the high end.
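The bandwidth estimate can be sketched as size divided by transfer time, where transfer time is Time to Last Byte minus Time to First Byte. A minimal sketch, assuming those two timestamps are available per sample (the function name and units here are my own):

```python
def bandwidth_kb_per_sec(size_bytes, ttfb_msec, ttlb_msec):
    """Estimate a server's bandwidth from one download sample.

    Transfer time is Time to Last Byte minus Time to First Byte,
    i.e. the portion of the timeline spent actually moving bytes.
    """
    transfer_msec = ttlb_msec - ttfb_msec
    if transfer_msec <= 0:
        raise ValueError("last byte must arrive after first byte")
    return (size_bytes / 1024.0) / (transfer_msec / 1000.0)

# Example: a 16 KB file with 120 ms TTFB and 220 ms TTLB
# transfers 16 KB in 100 ms, i.e. 160 KB/sec.
print(round(bandwidth_kb_per_sec(16 * 1024, 120, 220)))  # → 160
```

Subtracting TTFB matters: it removes DNS, connection, and server think time from the estimate, so the result reflects raw transfer speed rather than total latency.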
3 Tips on What to Do...
- Consider leveraging a content delivery network such as Akamai's or Yottaa's CDN.
There are several ways I can imagine improving on this analysis:
- Though I believe our sample set is fairly broad (tens of thousands of web sites, and millions of browser samples), repeating this for a larger set of sites might give an even more representative view of the web;
Thoughts / Suggestions?