New Yottaa Monitor Feature: Performance Benchmark
We’re excited to announce the launch of Yottaa Benchmark as a feature in our Yottaa Monitor service!
Yottaa Benchmark is the simplest way to compare the performance of your website or web application against competitors. Yottaa Benchmark includes data visualization that enables performance pros and neophytes alike to trace performance data from high-level trending down to milliseconds.
Armed with knowledge of how your pages compare to the competition, you can set goals and create a plan to improve performance, and to make sure you stay ahead!
This post covers:
- Major features and basic how-tos
- Suggested sites and pages to benchmark
- Resources that help you to get the most out of the tool
Contact us to get started.
Here are some of the features that have us so excited about Yottaa Benchmark:
When setting up a benchmark, you can choose up to 10 URLs (4 URLs for users with free Community plans). You can also select the geography that makes sense for you, whether it’s worldwide or a single continental region.
You may also use the Advanced Features tab to add login pages, or pages that exist behind a login screen. This means every web property, from complex SaaS applications to mainstream web applications to simple HTML-only pages, can be compared side by side on performance.
The samples used in a benchmark are taken from the pool of available samples in your Yottaa Monitor account. (For Community users, benchmark samples don’t count against the pool, but the number of sites is limited to 4.) Samples are collected once per day, per location, per URL. For example, if you benchmark 5 sites from North America (which includes 4 locations), you’ll collect 20 samples a day.
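The sample-volume arithmetic above can be sketched in a few lines. This is purely illustrative (not the Yottaa API); the location count per region is an assumption taken from the example in the text.

```python
# Hypothetical region -> monitoring-location counts, assumed for illustration.
LOCATIONS_PER_REGION = {"North America": 4}

def daily_samples(num_urls: int, region: str) -> int:
    """One sample per day, per location, per URL."""
    return num_urls * LOCATIONS_PER_REGION[region]

# 5 benchmarked URLs x 4 North American locations = 20 samples per day.
print(daily_samples(5, "North America"))  # 20
```

Benchmarking more URLs or a region with more locations scales the daily sample count multiplicatively.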
The Trending Graph
The centerpiece of benchmarking is the trending graph, where you can see all of the data from your benchmarks in one place.
On the graph, the default comparison is Time to Interact. This is the closest measurement of the moment the visitor feels the page is done – some scripts may still be loading in the background, but the visitor is able to interact with the page.
Beyond Time to Interact you may compare other metrics, such as content metrics and incremental stages of the page load process. Sites will rank differently for metrics such as Time to Start Render, number of assets, and Connection Time. Every page is unique, and comparing a number of metrics reveals where performance bottlenecks exist.
Another way to customize the trending chart is to expand and contract the time frame with the date range selector to compare long-term and short-term trends. (Alternatively, click and drag within the graph field to automatically zoom to that time segment for a detailed view.)
The red and green fields in the graph show the average and top/bottom 25% for whatever metric is displayed. These bands are based on percentiles gathered from a sample of 14,000 websites; they do not change depending on the data shown on the graph. They provide a frame of reference for the performance data shown. If all the sites in the benchmark are in the green, they are exceptionally fast sites; if they are all in the red, they are uniformly slow.
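To make the idea of fixed percentile bands concrete, here is a minimal sketch of how top/bottom-25% cutoffs could be derived from a reference set of load times. The values and the nearest-rank method are assumptions for illustration, not Yottaa's actual methodology.

```python
def percentile(values, pct):
    """Nearest-rank percentile of `values` (pct in 0..100)."""
    ordered = sorted(values)
    k = round(pct / 100 * (len(ordered) - 1))
    return ordered[max(0, min(len(ordered) - 1, k))]

# Made-up reference load times in milliseconds.
reference_ms = [900, 1200, 1500, 1800, 2100, 2600, 3200, 4000]

green_cutoff = percentile(reference_ms, 25)  # faster than this = "green" band
red_cutoff = percentile(reference_ms, 75)    # slower than this = "red" band
```

Because the cutoffs come from the fixed reference set rather than the sites in your benchmark, the bands stay put while your data moves against them.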
Metrics Ranking Comparison
The screen capture “film strip” in the metrics ranking comparison helps to replicate the average user’s experience, adding visual context to the trending data. Since the data is taken from real browser samples, the screenshots show exactly what a human visitor would have seen had he or she accessed the site at that moment with the same browser and location.
The slider adjusts the time interval between screenshots. The default setting is the most granular (250 ms), but if you’re viewing pages that span a wide range of performance, expanding the interval may be the only way to see all the images side by side in the same frame.
Detailed Metrics Comparison
This section shows a side-by-side view of metrics with overall averages, as opposed to trending over time. Whatever metrics have been viewed in the trending graph will automatically appear in this matrix; any additional metrics may be added through the green “edit” button. The figures given are an average of all samples collected.
The color codes of the tiles in the matrix correspond to the percentile number shown in each (i.e. “Top 37%” or “Bottom 9%”). These percentiles were collected from a study of 14,000 sites. The color codes offer a quick view of how sites compare not just to each other, but also relative to averages across the web.
Individual Data Samples
Clicking a tile in the Detailed Metrics Comparison provides a list of the samples that determine the average (above). This way you can quickly see how consistent the page is, sample by sample. Clicking the “Sample Time” leads to a detailed summary page on the metrics from that single sample, including a waterfall chart (below). You can use this information to dig even further into performance and identify bottlenecks asset-by-asset.
What Sites Should I Benchmark?
We’ve covered what you can do with the tool – now, how should you set it up?
First, think of your closest competitors. These are the clearest choices for a benchmark. You want to know whether your site is faster or slower than the sites you compete with on a day-to-day basis in order to set optimization goals.
But don’t stop there. Think outside your competitors, to other sites that feature content related to your business. These could be blogs, news sites, or big name sites like Amazon. You could even include sites that have nothing to do with your market, but happen to have a similar look and feel to yours. You want to collect information that will help you to improve your site, even if it’s not a direct reflection of whether you’re winning customers away from another site. Finding out what slows down websites in general helps to inform your performance policies.
For example, if you are from the flash sale site Rue La La, you might benchmark sites in and around your market:
- Retailfetish.com (an aggregator of flash sale sites)
- Wayfair (a leading eCommerce site)
- Amazon (the leading eCommerce site)
- Polyvore (style ideas and tips)
An entirely different approach is to benchmark multiple pages within your own website or web application. Many Yottaa Benchmark users create separate benchmarks for this purpose. This way, you can take a quick look at whole-site performance and instantly pinpoint areas for improvement. You might observe that some pages have been slowing slightly over a period of a week, while others have not. This would be cause to find out what common element across those pages is causing a problem.
You could benchmark:
- Ruelala.com/event (home page when logged in)
- Ruelala.com/event/child/73133 (a sample sale page)
- Ruelala.com/event/product/73133/1111944356/0/DEFAULT (a sample product page)
- Ruelala.com/common/faq (FAQs)
- Any other pages that draw significant traffic