How a major ecommerce site learned to avoid assumptions & optimize for users
When building and optimizing websites, the zeal for faster load times and shiny new features sometimes causes developers and engineers to lose sight of the underlying goals – namely, increasing conversions and revenue. While a certain level of both performance and pizzazz is necessary for a successful online business, at Yottaa we often observe a point of diminishing returns. At some point, shaving milliseconds off page load time stops helping the bottom line, and attention is better spent optimizing other elements of page display and organization through methods like application sequencing.
We recently came across a great third-party case study that underscores these observations. It comes via a presentation by Etsy engineer Dan McKinley that covers successes and failures in developments on the Etsy site. It’s a helpful presentation for anyone trying to make sense of using A/B testing in development, but one part stuck out to us in particular: a study of what went wrong when the team attempted to improve conversions by optimizing the search function.
Summary: When Optimization Backfires
(Watch the presentation from 7:45 to hear the story from Dan himself)
Etsy’s search team wanted to improve engagement on the site by optimizing the primary search function. They decided to follow in the footsteps of Google, who famously (in web performance circles, at least) saw their engagement metrics go up when they made their search engine results pages load faster. Google also saw increases when they showed more results per page.
Taking Google’s success as a maxim, the team decided to implement infinite scrolling, a development technique much in vogue lately. In theory this would place more results in front of users faster and speed up the search function in general. The team developed the feature and achieved the intended behavior. Then they set about A/B testing its effect on crucial engagement metrics like clicks, items viewed, favorites, and purchases.
The result? Crickets. After fixing some bugs and browser compatibility issues, the team still couldn’t explain why the hard-won update hadn’t improved user engagement on the site.
At a loss for a path forward, the team went back to basics: they tried testing the very assumptions on which they’d based the project. They ran experiments on the control group (the one without the new infinite scrolling feature). In one experiment the team artificially delayed rendering of search results by 200 milliseconds. No changes in purchases were observed. In another test, they doubled the number of results shown per page – again, no measurable change.
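Etsy’s actual experiment framework isn’t public, but the kind of holdback test described above is easy to sketch: deterministically bucket users into a control group and a "delayed" group, and inject an artificial delay for the latter before serving results. The function and variable names below are hypothetical, purely for illustration.

```python
import hashlib
import time

DELAY_MS = 200  # artificial delay applied to the delayed bucket

def bucket(user_id: str, experiment: str = "search_delay") -> str:
    """Deterministically assign a user to 'control' or 'delayed' by
    hashing their id, so assignment is stable across visits."""
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    return "delayed" if int(digest, 16) % 2 else "control"

def render_search_results(user_id: str, results: list) -> list:
    """Serve search results, injecting an artificial delay for the
    test bucket; downstream metrics are then compared per bucket."""
    if bucket(user_id) == "delayed":
        time.sleep(DELAY_MS / 1000)  # simulate a 200 ms slower response
    return results
```

Hashing the user id (rather than picking randomly per request) keeps each user in the same bucket for the life of the experiment, which is what makes per-bucket purchase comparisons meaningful.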
Through tests like these the team discovered that what had worked for Google simply didn’t apply to their site. Etsy’s buyers weren’t sensitive to small changes in page speed, and they tended to use the search function more when they had fewer results to look at. With these results, the team scrapped infinite scrolling and landed on a simpler results-page presentation informed by the new data.
Conclusion – Optimize for Your Users
The moral of Dan’s presentation was that a rigorous test-and-iterate process should be applied throughout development – not bolted on at the end.
We think this story can support a few other conclusions, too:
Never blindly follow assumptions
Websites today are just too complex for broad platitudes like “faster is always better” to hold water. Sure, for a site that’s awfully slow, improving performance should be a top priority. For others, it might be a third, fourth, or lower-level priority. In the example of Etsy, a company with a deeply ingrained performance culture and a fast-loading site, speeding up search slightly did nothing to affect end purchases. What did? Fine-tuning things like how much content to show the user, and how that content is organized.
What’s on trend isn’t always good for business
Infinite scrolling and other trendy development tricks are often resource-intensive to create, and are only good insofar as they improve the bottom line. At worst they turn users away, as Etsy found: the control group with the simpler, paginated SERP was more reliant on the search tool to find their products than the group with infinite scrolling.
Test, test, test again
You can read 10,000 best-practices articles, but you’ll only know what works for your site if you examine user engagement metrics. Etsy uses home-grown metrics to determine success, but anyone can use Google Analytics to track a basic set of metrics like bounce rate, conversion rate, order value, and others. (Learn more about using GA metrics in our eBook, 10 Engagement Metrics Everyone Can Use.)
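Whatever analytics tool you use, the basic metrics named above reduce to simple arithmetic over session data. Here is a minimal sketch, assuming a hypothetical session record with a page count, a purchase flag, and an order value:

```python
def engagement_metrics(sessions):
    """Compute basic engagement metrics from a list of session dicts.
    Each session is assumed to look like:
      {"pages_viewed": int, "purchased": bool, "order_value": float}
    """
    total = len(sessions)
    # A "bounce" is a session that viewed at most one page.
    bounces = sum(1 for s in sessions if s["pages_viewed"] <= 1)
    orders = [s["order_value"] for s in sessions if s["purchased"]]
    return {
        "bounce_rate": bounces / total,
        "conversion_rate": len(orders) / total,
        "avg_order_value": sum(orders) / len(orders) if orders else 0.0,
    }

sessions = [
    {"pages_viewed": 1, "purchased": False, "order_value": 0.0},
    {"pages_viewed": 5, "purchased": True,  "order_value": 40.0},
    {"pages_viewed": 3, "purchased": True,  "order_value": 20.0},
    {"pages_viewed": 2, "purchased": False, "order_value": 0.0},
]
print(engagement_metrics(sessions))
# → {'bounce_rate': 0.25, 'conversion_rate': 0.5, 'avg_order_value': 30.0}
```

The point isn’t the arithmetic – it’s that these numbers, tracked per experiment bucket, are what tell you whether a change actually helped.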
We’ve built these ideas and many more into Yottaa. With Yottaa you can tune content presentation and load sequence with context-specific rules, all applied on the fly. Our built-in monitoring and testing lets you find out what works. Learn more.