
Introducing: Retail Madness and the Customer Experience Index

Update: since this introduction, we’ve published follow-up posts with the results.

At Yottaa we’re obsessed with performance and user experience. This year, in the spirit of March Madness, we’ve put together a tournament pitting some of the top U.S. retailers against one another to see who delivers the best user experience.

But the bracket is not all that’s new. We’re also debuting a new way to measure performance, an equation we’re calling the CXi, or Customer Experience Index.

Why develop a new index?

It’s a good question. There are already a number of useful measures of performance. Common ones include Time to First Byte (the elapsed time at which the first bit of the response arrives at the client device), Start Render (the moment the first pixels paint in the browser), and Visually Complete (the time at which all visuals have rendered). These are most useful if you are a performance pro or someone with a lot of spare time for analysis. But none of them tells the whole story: what the experience was actually like for the user.
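If you want to see the first two of these metrics for yourself, here’s a minimal sketch (not part of our CXi tooling) that reads them in the browser using the standard Performance APIs. “Visually complete” has no direct browser API, so the load event serves as a rough stand-in here:

```typescript
// Read Time to First Byte and first paint from the browser's built-in
// Performance APIs. The load event stands in for "visually complete",
// which has no direct browser equivalent.
const [nav] = performance.getEntriesByType(
  "navigation"
) as PerformanceNavigationTiming[];

// Time to First Byte: the first bit of the response arriving at the client.
const ttfb = nav.responseStart - nav.startTime;

// First paint: the moment the browser first puts pixels on the screen.
const firstPaint = performance
  .getEntriesByType("paint")
  .find((entry) => entry.name === "first-paint")?.startTime;

// Load event: a rough proxy for "all visuals rendered".
const loadTime = nav.loadEventEnd - nav.startTime;

console.log({ ttfb, firstPaint, loadTime });
```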

Think of it this way: JavaScript is often used to augment the user experience of web apps. It’s the vehicle for ratings, reviews, live chat and more. But JavaScript is also known to slow down performance, especially third-party scripts that fetch data from other domains. So if you’re trying to benchmark user experience, pure performance isn’t enough; you must account for things like JavaScript that bring both benefits and drawbacks. It’s like comparing an Aston Martin to a drag racer. The drag car might have an edge in speed off the line, but it’s been gutted to the frame for performance, leaving nothing but plastic bucket seats. The Aston, on the other hand, is meticulously designed for comfort and driving experience (of which speed is one factor). Which would you rather drive every day?
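To make the third-party point concrete, here’s a quick, illustrative sketch (not Yottaa’s instrumentation) that uses the browser’s Resource Timing API to tally how much time third-party scripts spend loading on a page:

```typescript
// Separate third-party scripts from first-party ones using the Resource
// Timing API, and sum their load durations. Illustrative only.
const resources = performance.getEntriesByType(
  "resource"
) as PerformanceResourceTiming[];

const thirdPartyScripts = resources.filter(
  (entry) =>
    entry.initiatorType === "script" &&
    new URL(entry.name).hostname !== location.hostname
);

const totalMs = thirdPartyScripts.reduce((sum, e) => sum + e.duration, 0);

console.log(
  `${thirdPartyScripts.length} third-party scripts took ${totalMs.toFixed(0)} ms to load`
);
```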

In short, our CXi attempts to give the Aston Martins of the world their due, and not overly praise the drag racers. Unlike other measures, which don’t factor in the weight or number of certain kinds of elements on a page, we include the complexity of the page in the assessment, as well as a weighted blend of the time-based metrics mentioned above. The CXi rewards those apps that are able to combine rich UX with fast page rendering, and apps that present content in an orchestrated fashion.
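We won’t unpack the exact CXi formula in this post, but to give a feel for the idea, here is a purely hypothetical sketch of such an index: a weighted blend of the time-based metrics, normalized by a page-complexity factor so that rich-but-fast pages score well. Every weight, name, and scale below is illustrative, not the real CXi.

```typescript
// Purely illustrative: the actual CXi formula is not published here.
// This shows the general shape of an index that blends time-based
// metrics with a page-complexity credit, so a rich page isn't
// penalized for carrying more elements than a bare-bones one.
interface PageSample {
  ttfbMs: number;             // time to first byte
  firstPaintMs: number;       // first pixels painted
  visuallyCompleteMs: number; // all visuals rendered
  elementCount: number;       // rough proxy for page complexity
}

// Hypothetical weights; a real index would calibrate these against user data.
function hypotheticalCxi(s: PageSample): number {
  const blendedMs =
    0.2 * s.ttfbMs + 0.5 * s.firstPaintMs + 0.3 * s.visuallyCompleteMs;

  // Credit richer pages that still render fast: heavyweight-but-fast
  // pages score well, stripped-down drag racers gain less.
  const complexityFactor = Math.log10(1 + s.elementCount);

  // Higher is better; the scale is arbitrary in this sketch.
  return (complexityFactor / blendedMs) * 10_000;
}
```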

What have you done for mobile lately?

The CXi doesn’t stop there. Mobile browsing has exploded in scale, and the lines have blurred between mobile and desktop experiences, native apps and the open web, and even in-store and digital. We know that the majority of online purchases involve a multi-device conversion path, and that many of those paths begin on mobile. (Google found that 90% of online tasks are completed using multiple devices sequentially, and that was back in 2012.) Moreover, mobile devices now account for 52% of all web traffic.

Given what we know about user behavior, it no longer makes sense to separate mobile performance from desktop performance when looking at the overall customer experience. So for this tournament, we’ve collected data for desktop, mobile (Wi-Fi), and mobile (3G) to paint the whole picture.

Let’s get ready for tipoff!

Our method for the tournament is to take the top 64 retailers (according to the Internet Retailer Top 500) and group them into “regions” by type of business. The 1-16 seeds correspond to each company’s place on the IR Top 500 list, which is based on annual revenue. (If a company had more than one website or brand, we chose the largest.) We will test the home page, a sample category page, and a sample product page for each website and run the data through our CXi calculation. In the coming weeks we will present our results for each region, focusing on the trends we see in the data.
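For a concrete picture of the scale involved, here’s a small, hypothetical sketch of the resulting test matrix: every name and helper below is illustrative, not our actual test harness.

```typescript
// Hypothetical helper that enumerates the test runs described above:
// 64 retailers x 3 page types x 3 connection profiles = 576 runs.
const pageTypes = ["home", "category", "product"] as const;
const profiles = ["desktop", "mobile-wifi", "mobile-3g"] as const;

interface TestRun {
  retailer: string;
  seed: number; // 1-16 within its region, from the IR Top 500 ranking
  pageType: (typeof pageTypes)[number];
  profile: (typeof profiles)[number];
}

function buildTestMatrix(
  retailers: { name: string; seed: number }[]
): TestRun[] {
  return retailers.flatMap(({ name, seed }) =>
    pageTypes.flatMap((pageType) =>
      profiles.map((profile) => ({ retailer: name, seed, pageType, profile }))
    )
  );
}
```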

Who do you think will take the top prize?

Let us know in the comments or tweet at us @yottaa with your predictions.

[Image: the full Retail Madness bracket]

Note: Since the most recent IR list is from 2014 and is based on FY 2013 data, you may notice some discrepancies in the relative rankings. We also had to skip some companies, such as Netflix (#7 overall), which doesn’t fit the consistent format of our testing method, and OfficeMax (#12), which has since been absorbed by Office Depot.
