App Performance Tournament Wrap Up: Inside Great Online Retail Experiences
If you’ve been following along with our Retail Madness tournament, you know we’ve been tracking the customer experience of major online retailers according to our CXi, or Customer Experience Index.
Customer Experience vs. Web Performance
Since the drama was taken out of the “finals”, we thought we’d do something different. The chart below shows CXi rankings against a “pure performance” measure, a blend of Time to Start Render and Time to Display (weighted equally) for desktop pages.
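The blend described above can be sketched in a few lines. This is a hypothetical illustration, not our actual scoring code; the function name and the sample timings are made up, and we assume both metrics are in seconds with equal weights as stated.

```python
# Hypothetical sketch of the "pure performance" blend: an equally
# weighted average of Time to Start Render and Time to Display.
# Sample values are illustrative only; lower scores mean faster pages.

def pure_performance_score(start_render_s, display_s):
    """Blend the two timings with equal (0.5/0.5) weight."""
    return 0.5 * start_render_s + 0.5 * display_s

# A page that starts rendering at 1.2s and finishes display at 3.0s:
print(pure_performance_score(1.2, 3.0))  # 2.1
```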
Measures like this one drive much of the performance conversation, whether it's the data found in Internet Retailer guides, Keynote performance benchmarks, or boardroom presentations from CIOs. They're easy to understand and easy to gather.
That means mobile performance, page complexity, and the many other factors lurking behind those headline figures are less often part of the CX discussion (if one is happening at all). That's part of why we created the CXi: to bring together more of the facets of user experience that ought to be tracked and optimized in any web business.
This chart shows where companies land when you consider both our CXi score and pure performance (the standard score).
- The top half of the chart contains the best performers according to typical performance measures.
- The top-left quadrant contains fast sites that scored low on the CXi, whether because their mobile experiences were not on par with desktop in performance or richness, or simply because their pages carried a lot of extraneous content.
- The bottom-right quadrant contains companies that scored highly on the CXi in spite of lower-than-average performance. That could be because strong scores for mobile and rich user experiences offset middling speed.
Note: this reflects relative scores, not actual scores, to ensure a legible distribution.
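The quadrant logic above can be made concrete with a small sketch. This is an assumption-laden illustration, not our actual analysis: the helper names are invented, we assume higher raw scores are better on both axes, and we convert raw scores to 0–1 ranks to mimic the "relative, not actual" plotting note.

```python
# Hypothetical illustration of the chart's quadrant logic.
# Performance sits on the vertical axis, CXi on the horizontal,
# so top-left = fast but low CXi, bottom-right = high CXi but slower.

def to_relative(scores):
    """Map raw scores to 0..1 ranks (1.0 = best) for a legible spread."""
    ordered = sorted(scores)
    n = len(scores)
    return [ordered.index(s) / (n - 1) for s in scores]

def quadrant(perf_rel, cxi_rel):
    """Label a site by its relative performance and CXi scores."""
    fast = perf_rel >= 0.5
    strong_cx = cxi_rel >= 0.5
    if fast and not strong_cx:
        return "top left: fast, but low CXi"
    if not fast and strong_cx:
        return "bottom right: high CXi despite slower pages"
    return "top right" if fast else "bottom left"

# Made-up example: a site that ranks fast but scores poorly on CXi.
print(quadrant(0.9, 0.2))  # top left: fast, but low CXi
```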
So what did we learn in this Customer Experience project?
- m.dot sites are far from dead. They may seem passé for a certain cohort of the tech industry, but many of the high-scoring retailers, including Walmart and Grainger, were able to dominate the customer experience competition using these more traditional solutions. (This may speak more to the general shortcomings of particular responsive implementations than the strength of the m.dot solution, however).
- Amazon can break the rules. They scored poorly on the CXi, which was based on correlative data comparing conversion rate and various content-complexity metrics, plus performance best practices. Yet their success is undisputed. There's a lot more behind every purchase than the handful of criteria we selected for this measurement, and Amazon proves that less tangible factors can be the difference maker.
- Technology-related companies fare worse than others; luxury is the worst of all. Predictably, luxury brands choose glitz over utility. Perhaps their customers are less sensitive to performance and usability, but consider: if they took some cues from mass retail and boosted conversions by even fractions of a percent, the payoff could be huge, as average tickets for those companies can hover around $500.
Most of all, we learned that there's a broad range of actual experiences, even among the most successful sites. There's certainly no replicable "formula" for a great online retail experience. What matters most is solid performance and feature delivery.