
Top Javascripts Slowing Down the Web: The Web’s Dirty Dozen

Ari Weil on Jan 25, 2013 5:30:00 PM

These days, web pages make nearly universal use of Javascript to deliver a richer experience.  But have you ever wondered what the impact of those Javascripts on performance might be? (I have!)

Well, because Yottaa’s web monitoring service monitors tens of thousands of web sites and has accumulated a database of tens of millions of web performance samples,  we have a treasure trove of data.  So I set out to answer the following questions:


  • What Javascripts are most popular?
  • How long do these Javascripts take to download?
  • Are there ones that have a particularly poor “service level agreement” (SLA)?
  • Does it matter whether you host the Javascript on your servers, vs. a third-party server – in particular, a CDN?
  • How is the Javascript’s delivery impacted by its market share, size, and location?

The "Dirty Dozen" Javascripts

Looking at data across millions of samples, I focused on the following metrics:

  • % of web sites that feature the Javascript (as a proxy for market share);
  • Average size of the download;
  • Average Time to Last Byte (i.e., how long it takes to download);
  • Percent of samples with Time-to-Last-Byte of >1.0 seconds (given that most Javascripts take less than 500 msec to download, this metric helps identify samples with particularly poor download times, as a proxy for a good/bad SLA).
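The aggregation behind these metrics can be sketched as follows. The sample records and values here are illustrative, not Yottaa's actual dataset; the field layout (site, script name, size, Time-to-Last-Byte) is an assumption about what a monitoring sample contains.

```python
from collections import defaultdict

# Hypothetical raw samples: (site, script_name, size_bytes, ttlb_msec).
# Names and values are made up for illustration.
samples = [
    ("site-a.com", "ga.js", 14888, 253),
    ("site-a.com", "jquery.js", 65939, 1420),
    ("site-b.com", "ga.js", 14888, 198),
    ("site-b.com", "jquery.js", 65939, 860),
    ("site-c.com", "ga.js", 14888, 1105),
]

def script_metrics(samples, slow_msec=1000):
    """Aggregate the four per-script metrics listed above."""
    by_script = defaultdict(list)
    sites = set()
    for site, script, size, ttlb in samples:
        by_script[script].append((site, size, ttlb))
        sites.add(site)
    metrics = {}
    for script, rows in by_script.items():
        ttlbs = [ttlb for _, _, ttlb in rows]
        metrics[script] = {
            # Fraction of monitored sites that load this script.
            "site_share": len({site for site, _, _ in rows}) / len(sites),
            "avg_size": sum(size for _, size, _ in rows) / len(rows),
            "avg_ttlb": sum(ttlbs) / len(ttlbs),
            # Fraction of downloads exceeding the slow threshold.
            "pct_slow": sum(t > slow_msec for t in ttlbs) / len(ttlbs),
        }
    return metrics

m = script_metrics(samples)
```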

To identify the worst offenders impacting speed of the web, I developed a “figure of merit” that combined the Javascript’s popularity with the fraction of time its download took over 1 second.   After all, if a widget is popular but takes little time to download, it may not have a particularly adverse impact on performance; and if the widget takes a long time to download, but is not particularly popular, few sites and visitors are affected.  But a widget that has a high percentage of slow download times, AND is popular, has a higher likelihood of screwing up performance.

Here are the results:

File Name | % of Web Sites | Size (Bytes) | Time to Last Byte (msec) | Samples with >1 sec Time to Last Byte (%)
jquery.js | 19.7% | 65,939 | 1,039 | 40.9%
ga.js (Google Analytics) | 66.7% | 14,888 | 253 | 8.6%
plusone.js (Google +1) | 9.8% | 7,036 | 823 | 26.7%
swfobject.js (Flash) | 11.5% | 6,815 | 410 | 7.2%
all.js (Facebook) | 12.2% | 60,698 | 363 | 6.4%
jquery-ui.min.js | 3.8% | 30,200 | 495 | 18.2%
p.json (AddThis) | 5.0% | 236 | 506 | 7.1%
widgets.js (Twitter) | 9.5% | 23,061 | 231 | 3.5%
quant.js (QuantServe) | 7.1% | 2,320 | 220 | 2.5%
conversion.js (Google AdWords) | 4.3% | 2,467 | 197 | 3.0%
count.json (Twitter) | 5.1% | 497 | 311 | 1.7%
show_ads.js (Google AdSense) | 4.0% | 6,272 | 196 | 1.8%

So for example, the Google Analytics Javascript is used by nearly two thirds of web sites within our sample; is 15KB in size; and takes 0.25 seconds to download on average, with 8.6% of the samples taking longer than 1 second.  On the other hand, the Google +1 widget has half the download footprint (7KB), but a much wider spread in performance: nearly a second to download, with over a quarter of the samples taking longer than 1 second.
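The "figure of merit" described earlier can be sketched as a simple product of popularity and the slow-download fraction. The exact weighting used in the study isn't specified, so the plain product is an assumption; the input values below come from the table above.

```python
# (site share, fraction of samples with TTLB > 1 s) from the table above.
scripts = {
    "jquery.js": (0.197, 0.409),
    "ga.js": (0.667, 0.086),
    "plusone.js": (0.098, 0.267),
    "widgets.js": (0.095, 0.035),
}

def figure_of_merit(share, pct_slow):
    # Popular AND frequently-slow scripts score highest; a script that is
    # only one or the other scores low, matching the reasoning in the text.
    return share * pct_slow

ranked = sorted(scripts, key=lambda s: figure_of_merit(*scripts[s]), reverse=True)
```

With this weighting, jquery.js (popular and often slow) outranks ga.js (far more popular, but rarely slow).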


With these metrics captured, we can identify the “Dirty Dozen” – the top Javascripts affecting web performance:

[Chart: the Dirty Dozen Javascripts]

Javascript Performance versus Market Share

How does a Javascript’s popularity impact performance?  Here’s the empirical data for the entire population of Javascripts we looked at:

[Chart: market share vs. performance]
Here we see that for the less popular Javascripts, there’s a wide spread in performance.  On the other hand, for the more popular Javascripts, the spread in performance drops.  What I suspect is happening is that as a Javascript becomes more popular, a quick download time becomes increasingly impactful – and thus important – and (as we will see later) web developers turn to techniques like caching copies of the Javascript around the globe.
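One way to quantify the "spread narrows with popularity" observation is to bucket scripts by site share and compare the standard deviation of their mean download times within each bucket. The data points and the 10% cutoff below are illustrative assumptions.

```python
import statistics

# Made-up (site share, mean TTLB in msec) points per script.
scripts = [(0.02, 1800), (0.03, 150), (0.05, 900),
           (0.40, 260), (0.55, 310), (0.67, 253)]

def spread_by_bucket(scripts, cutoff=0.10):
    """Std dev of mean TTLBs below vs. above a popularity cutoff."""
    low = [t for share, t in scripts if share < cutoff]
    high = [t for share, t in scripts if share >= cutoff]
    return statistics.pstdev(low), statistics.pstdev(high)

low_spread, high_spread = spread_by_bucket(scripts)
```

On data shaped like the plot, the low-popularity bucket shows a much larger spread than the high-popularity one.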

Speed of Google Analytics Around the Globe

Since Google Analytics is the world’s most popular Javascript, I thought it’d be interesting to explore its performance around the world.  Here are the stats from 8 cities.  It’s interesting to note that even the web’s best-funded infrastructure has notable challenges: the Javascript takes one or two tenths of a second to download in San Francisco and New York (and >98% of the time it downloads within 1 second), whereas Hong Kong and Sydney take three to five times longer, with nearly 20% of the samples taking longer than 1 second.

[Chart: Google Analytics performance by city]

Hosting: Should You Host on 3rd-party domain or Your Own?

Should you host the Javascript on your servers?  Or on 3rd-party servers (e.g., the Javascript vendor’s servers, GitHub, CDNs)?  Well, the good news is that for a fairly large set of Javascripts, there were samples of it hosted both at the same domain as the rest of the page, AND samples of it hosted elsewhere.  In the plot below, each point represents a unique Javascript: the X value corresponds to the Time-to-Last-Byte when hosted on the same origin server as the rest of the page, whereas the Y value corresponds to the Time-to-Last-Byte when hosted at a 3rd party.  This data suggests that though there’s some variation, there’s not an obvious benefit to hosting it elsewhere, on average.  (I emphasize “on average” because it turns out there ARE approaches where 3rd-party hosting is beneficial – see next section.)
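The X/Y pairing behind that scatter plot can be sketched by splitting each script's samples by hosting location and comparing the two means. The script names and timings below are hypothetical.

```python
# Per script: TTLB samples (msec) split by where the file was hosted.
# Values are illustrative, not from the study's dataset.
paired = {
    "widget-a.js": {"same_origin": [410, 380, 455], "third_party": [430, 395, 470]},
    "widget-b.js": {"same_origin": [900, 1100],     "third_party": [620, 700]},
}

def mean(xs):
    return sum(xs) / len(xs)

# One (X, Y) point per script: X = same-origin mean, Y = third-party mean.
points = {
    name: (mean(rows["same_origin"]), mean(rows["third_party"]))
    for name, rows in paired.items()
}
```

A point below the diagonal (third-party faster) argues for external hosting for that script; points scattered around the diagonal match the "no obvious benefit on average" finding.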

[Chart: same-origin vs. 3rd-party hosting, Time-to-Last-Byte]

Javascript Download Bandwidth

Knowing the size of the file, server IP address and the download timeline, we can compute the individual server’s bandwidth.   And as it turns out, this varies widely – typically a couple hundred KB/sec, though (as illustrated in the histogram below) there’s a spike at the high end.
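The per-server bandwidth calculation is just bytes transferred over transfer time. The timeline fields below (time to first byte, time to last byte) are an assumption about what the monitoring samples expose; KB here means 1,000 bytes.

```python
def bandwidth_kb_per_sec(size_bytes, ttfb_msec, ttlb_msec):
    """Effective bandwidth: file size divided by the body-transfer window."""
    transfer_sec = (ttlb_msec - ttfb_msec) / 1000.0
    return (size_bytes / 1000.0) / transfer_sec

# e.g. a 65,939-byte jquery.js whose body arrived over a 250 msec window:
bw = bandwidth_kb_per_sec(65939, ttfb_msec=150, ttlb_msec=400)
```

This yields roughly 264 KB/sec for the example above, in the "couple hundred KB/sec" range the histogram shows as typical.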

[Chart: download bandwidth histogram]

It turns out that a small set of very fast IP addresses serve a large volume of the web’s Javascripts.  I thought it’d be interesting to see whose servers they are.  Not too surprisingly, Google and (CDN leader) Akamai dominate the list – delivering files an order of magnitude faster than the rest of the web.

[Chart: fastest serving IP addresses]

Impact of File Size on Javascript Download Time

The final part of the analysis was to see what impact the Javascript’s size has on download time, and the percentage of samples above a certain threshold (I picked 1 second).  As expected: Larger files take longer to receive.
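The size-vs-time relationship can be quantified with a correlation coefficient over the per-script averages. The sketch below uses the size and Time-to-Last-Byte columns from the table earlier; the study itself doesn't state which statistic it used, so Pearson correlation is an assumption.

```python
import math

# Size (bytes) and average TTLB (msec) pairs from the table earlier.
sizes = [236, 2320, 6272, 14888, 23061, 30200, 60698, 65939]
ttlbs = [506, 220, 196, 253, 231, 495, 363, 1039]

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

r = pearson(sizes, ttlbs)  # positive: larger files take longer
```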

[Chart: download time vs. file size]

3 Tips on What to Do...


Presumably, if you're using a Javascript, you actually need it.  So short of eliminating it, what can you do to optimize performance?  Here are 3 tips:

  1. Measure the page performance - for example, using Yottaa's free multivariate testing tool, Website Test - and see whether the Javascript impacts it materially by exploring the waterfall chart.
  2. The bigger the download, and the more there are of them, the longer it will take to receive the files and render the web page.  Consider minifying and concatenating the Javascripts (either manually, or via an automated web acceleration service such as Yottaa).
  3. Consider leveraging a content delivery network such as Akamai's or Yottaa's CDN.
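Tip 2's concatenation step can be sketched in a few lines. This is a minimal illustration with made-up file contents; real minification (whitespace stripping, identifier shortening) is better left to dedicated tools such as UglifyJS or Closure Compiler.

```python
def concatenate(sources):
    """Join several Javascript sources into one bundle (one request
    instead of many)."""
    # A newline plus a semicolon guard, so a file that ends without ';'
    # can't accidentally swallow the next file's first statement.
    return "\n;".join(sources)

bundle = concatenate(["var a=1", "var b=2;"])
```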


There are several ways I can imagine improving on this analysis:

  1. Though I believe our sample set is fairly broad (tens of thousands of web sites, and millions of browser samples), repeating this for a larger set of sites might give an even more representative view of the web;
  2. In this analysis, I’ve looked strictly at the Time-to-Last-Byte metric – versus how long it takes to execute in the browser, or whether the Javascript blocks any page assets from loading prior to execution.  Looking at this may give a more accurate measure of the impact on user experience.
  3. In the “figure of merit” to select the top offenders, I looked at the Javascript’s popularity across web sites, and the download SLA.  But some sites are much more popular than others, and Javascript performance problems on those sites would impact a larger population of users.  So in the future, it may be interesting to weigh the “figure of merit” by the % of users affected, rather than just the % of sites affected.

Thoughts / Suggestions?

I’d love to hear how you think about Javascript performance!  What metrics, experiments, test tools have you found useful?  Thoughts on how I can improve on the above study?



Ari Weil

VP of Products