What the Fastest Sites Did Differently: Insights from Yottaa’s Inaugural Site Speed Awards 

When we analyzed performance data from over 1,000 of North America’s most popular eCommerce sites during the 2025 holiday season as part of our inaugural Site Speed Awards, one pattern emerged immediately: popularity doesn’t necessarily equal digital excellence. 

How We Measured Performance

To identify the fastest sites of the holiday season, we didn’t just look at one-dimensional metrics like load time. Instead, we used Yottaa’s Health Score — a comprehensive measurement grounded in real shopper sessions across top retailers. Health Score is calculated from four weighted components, each derived from Core Web Vitals and other critical website performance metrics. These components are:  

  • Backend Performance — How quickly servers respond (measured via TTFB) 
  • Rendering — How fast content appears on screen (measured via LCP) 
  • Interactivity — How responsive the site is to user actions (measured via INP) 
  • Visual Stability — How stable the page layout remains during loading (measured via CLS) 

Each pillar contributes to an overall score out of 100, with higher scores indicating faster, more reliable experiences. Sites that maintain strong performance across all four pillars achieve the highest Health Scores, while those that neglect even one pillar see their overall scores suffer.  
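The weighted, pillar-based structure described above can be sketched in a few lines of code. To be clear, the normalization cutoffs and the equal weights below are illustrative assumptions for the sketch, not Yottaa’s actual Health Score formula:

```python
# Hypothetical sketch of a weighted composite health score. The good/poor
# cutoffs and equal 25% weights are assumptions, NOT Yottaa's real formula.

def pillar_score(value: float, good: float, poor: float) -> float:
    """Map a raw metric to 0-100: 100 at/below 'good', 0 at/above 'poor'."""
    if value <= good:
        return 100.0
    if value >= poor:
        return 0.0
    return 100.0 * (poor - value) / (poor - good)

def health_score(ttfb_ms: float, lcp_ms: float, inp_ms: float, cls: float) -> float:
    pillars = {
        "backend": pillar_score(ttfb_ms, good=200, poor=1800),      # TTFB
        "rendering": pillar_score(lcp_ms, good=2500, poor=6000),    # LCP
        "interactivity": pillar_score(inp_ms, good=100, poor=500),  # INP
        "visual_stability": pillar_score(cls, good=0.1, poor=0.5),  # CLS
    }
    weights = {name: 0.25 for name in pillars}  # illustrative equal weighting
    return sum(weights[name] * score for name, score in pillars.items())

print(health_score(150, 2000, 80, 0.05))   # a fast site scores 100.0
print(health_score(2000, 7000, 600, 0.6))  # a slow site scores 0.0
```

Because the score is a weighted sum, a single failing pillar drags the composite down no matter how strong the others are — which is exactly the “neglect even one pillar” effect described above.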

Why does this matter? Our data consistently shows that better-performing websites lead to higher conversion rates and stronger sales. 

To ensure fair, like-for-like comparisons within each industry, we further filtered results using the Tranco ranking, a widely used measure of site popularity and traffic footprint. For each industry category, only sites in the top 50% by Tranco rank were eligible for awards consideration. This ensured winners weren’t just fast in isolation, but fast at meaningful scale.  

Our rankings incorporate: 

  • Real-user performance data captured during the busiest shopping weeks, not synthetic tests 
  • Core Web Vitals signals that reflect the four pillars above 
  • Consistency over time across a 28-day period 
  • Stability during traffic spikes when campaigns launched and shopper volume surged 

This methodology brings together user-centric metrics, engineering reality, and operational resilience, giving retailers a meaningful benchmark for digital experience quality. 

Consider the following examples.  

Top 100 Revenue Brands with Poor Performance: 

  • Fabletics: Health Score of 17.80 
  • Chico’s: Health Score of 21.11 

Mid-Market Brands with Elite Performance: 

  • JetPens: Health Score of 99.77 
  • Sports Warehouse: Health Score of 99.35 

The Site Speed Awards recognized the retailers that delivered the fastest, most reliable experiences during holiday traffic. But beyond the rankings, the data reveals clear patterns about what separates performance leaders from underperformers — and the insights challenge conventional assumptions about site speed. 

The Performance Divide: Why Big Brands Struggled

Revenue scale doesn’t protect brands from performance debt. In fact, our data suggests that it often accelerates it. 

Larger organizations typically run more complex tech stacks. More teams ship scripts. More personalization and testing layers on top of already heavy pages. Without strong governance, that complexity compounds into slower experiences, especially under the sustained pressure of holiday traffic. 

The winners of the Site Speed Awards aren’t necessarily the biggest names in retail. They’re the brands that prioritized performance discipline and maintained control when third-party scripts fired simultaneously, campaigns launched back-to-back, and traffic spiked unpredictably. 

Scale is an advantage only when paired with governance. Without it, complexity quickly becomes the enemy of speed. 

INP Is Where Digital Experiences Live or Die 

The strongest signal in our data was the outsize impact of Interaction to Next Paint (INP) on overall Health Score. 

INP measures how quickly a page responds to user interactions. When INP is high, sites don’t just look slow — they feel broken. Buttons don’t respond. Filters lag. Navigation feels unresponsive. For shoppers accustomed to instant feedback, delays of even a few hundred milliseconds create friction that leads to cart abandonment and lost revenue. 

INP exposes the hidden cost of modern eCommerce stacks: personalization engines, A/B testing tools, review platforms, chat widgets, and analytics tools all competing for main-thread time. Every script added to the page increases the risk of blocking interactions. 

How INP Impacts Overall Performance

Because INP feeds directly into the Interactivity component of Health Score, poor INP performance can collapse overall scores even when other metrics look acceptable. Our data showed a clear relationship: 

Elite Performers (Health Score 95+): 

  • INP under 100ms 
  • Interactivity scores: 95-100% 
  • Overall Health Score maintained despite traffic spikes 

Stability Zone (Health Score 80-94): 

  • INP 100-200ms 
  • Interactivity scores: 85-95% 
  • Performance holds but margins narrow under heavy load 

Collapse Zone (Health Score below 80): 

  • INP 200ms+, often reaching 500-1,000ms 
  • Interactivity scores drop below 50% 
  • Overall Health Score collapses as Interactivity component fails 
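The three zones above can be expressed as a small classifier. The cutoffs mirror the ranges listed in the article; the function itself is only an illustrative sketch:

```python
# Classify a measured INP value into the three zones described above.
# Cutoffs follow the article's ranges: <100ms elite, 100-200ms stability,
# 200ms+ collapse. The helper itself is an illustrative assumption.

def inp_zone(inp_ms: float) -> str:
    if inp_ms < 100:
        return "elite"      # typically Health Score 95+
    if inp_ms < 200:
        return "stability"  # typically Health Score 80-94
    return "collapse"       # Health Score tends to fall below 80

print(inp_zone(80))   # elite
print(inp_zone(150))  # stability
print(inp_zone(600))  # collapse
```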

The Amplifier Effect

INP doesn’t just hurt performance on its own — it amplifies problems in other areas. Sites with poor LCP and poor INP create compounding frustration. The page is slow to appear and unresponsive when shoppers try to interact. 

Many retailers believe they’ve handled performance once LCP is acceptable. But our data showed sites with decent rendering scores that still failed because INP went unaddressed. The brands that scored highest maintained both fast rendering and responsive interactivity. 

INP is the hidden performance tax of growth and experimentation. Every new feature, every additional script, every personalization layer adds to that tax. Holiday traffic made it impossible to ignore. 

The 80 Threshold: Where Performance Discipline Becomes Critical

Performance doesn’t degrade gradually. It falls off a cliff. 

Across our dataset, we observed a clear threshold around a Health Score of 80, which corresponds with passing Core Web Vitals scores. Above that line, sites showed manageable variance and controlled performance degradation. Below it, performance decay accelerated rapidly, with multiple component failures compounding into slow experiences. 

Above the Threshold (Health Score 80+) 

  • Sites cluster tightly around stable performance ranges 
  • Functionality is maintained even under stress 

Below the Threshold (Health Score Below 80) 

  • Rapid, compounding decay 
  • Single failures cascade into multiple issues 
  • Performance problems feed more performance problems (timeouts trigger retries, heavier scripts load to compensate) 

The data suggests that once performance slips past 80, fixes become significantly more difficult. Poor performance creates a negative feedback loop that’s hard to escape during peak traffic. 

For brands that didn’t make the awards list, this threshold provides a clear target. Getting above 80 establishes the foundation for sustainable performance discipline before the next holiday season. This aligns with Google’s Core Web Vitals passing threshold, making it a meaningful benchmark for both user experience and search performance. 
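Since the 80 threshold aligns with passing Core Web Vitals, it is worth stating Google’s published “good” thresholds explicitly. The threshold values below are Google’s; the checker function itself is a sketch:

```python
# Google's published "good" thresholds for the three Core Web Vitals,
# evaluated at the 75th percentile of page loads. The helper is a sketch.

GOOD_LCP_MS = 2500   # Largest Contentful Paint: <= 2.5 seconds
GOOD_INP_MS = 200    # Interaction to Next Paint: <= 200 milliseconds
GOOD_CLS = 0.1       # Cumulative Layout Shift:  <= 0.1

def passes_core_web_vitals(lcp_ms: float, inp_ms: float, cls: float) -> bool:
    """True when all three p75 values fall in Google's 'good' range."""
    return lcp_ms <= GOOD_LCP_MS and inp_ms <= GOOD_INP_MS and cls <= GOOD_CLS

print(passes_core_web_vitals(2100, 150, 0.05))  # True: all three pass
print(passes_core_web_vitals(2100, 450, 0.05))  # False: INP fails
```

Note that passing is all-or-nothing: one failing vital is enough to miss the benchmark, which mirrors how a single weak pillar drags down Health Score.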

Mobile Is Where Execution Separates Winners from Everyone Else

Desktop performance has effectively been standardized across the industry. Most retailers have solved the basics of fast desktop experiences. Mobile remains the true execution battleground. 

Elite performers maintained device parity, delivering nearly identical experiences across mobile and desktop: 

  • JetPens: 97.71 desktop vs. 99.82 mobile (+2.11 points) 
  • Sports Warehouse: 99.67 desktop vs. 98.63 mobile (-1.04 points) 

Meanwhile, other brands displayed eye-popping discrepancies between their desktop and mobile experiences: 

  • Express: 50.22 desktop vs. 31.06 mobile (-19.16 points, a 38% drop) 
  • Meijer: 25.24 desktop vs. 16.11 mobile (-9.13 points, a 36% drop) 
  • Apple: 90.13 desktop vs. 60.37 mobile (-29.76 points, a 33% drop) 
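The point gaps and percentage drops above come from straightforward arithmetic, which can be sketched as a small helper (the function name and rounding choices are illustrative):

```python
# Compute the desktop-to-mobile gap in Health Score points and as a
# percentage drop relative to desktop. Illustrative helper, not a
# standard API.

def parity_gap(desktop: float, mobile: float) -> tuple[float, float]:
    """Return (point gap, percent change relative to desktop)."""
    gap = mobile - desktop
    pct = 100.0 * gap / desktop if desktop else 0.0
    return round(gap, 2), round(pct, 1)

print(parity_gap(50.22, 31.06))  # Express:  (-19.16, -38.2)
print(parity_gap(90.13, 60.37))  # Apple:    (-29.76, -33.0)
```

A simple rule of thumb from the roadmap later in this piece: a mobile score more than 5 points below desktop signals an execution gap worth investigating.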

Why Mobile Execution Is Harder

CPU constraints magnify INP issues. Mobile devices have less processing power to handle heavy JavaScript execution, making bottlenecks more severe. 

Mobile networks also expose third-party inefficiencies. Even with improved 5G coverage, variable network conditions make poorly optimized scripts more noticeable. 

Mobile experiences are also more interaction-heavy. Filters, swipes, carousels, and tap targets demand responsive execution. Delayed interactions feel broken on mobile in ways desktop users might tolerate. 

The pattern is clear: winners don’t just have fast sites — they have consistent performance across devices. Performance leadership today is a mobile execution problem, not a desktop optimization problem. 

Winners Governed All Four Performance Pillars

Across all categories and platforms, the winners shared a consistent trait: they maintained control across all four performance pillars simultaneously. Elite performers scored 95%+ on all four pillars. 

Mid-tier sites made tradeoffs, maintaining strong rendering but weak interactivity, or solid backend performance but poor visual stability. But once one pillar fails, the others follow. The compounding effect of multiple weak points creates experiences that feel fundamentally broken. 

The fastest brands look at the entire picture of performance — understanding how the pillars interact, how weakness in one area threatens the whole structure, and how consistent tuning and maintenance across all metrics creates resilience. 

Vertical Complexity Increases Risk, But Control Beats Complexity

Our analysis revealed distinct patterns across retail categories, and the widest performance variance appeared in the most experience-dense verticals. 

Apparel & Fashion: Rich Experiences Don’t Require Slow Performance

Winners: Von Maur (99.22) 

Mid-tier: Journeys (76.81) 

Collapsed: Express (35.34), Carhartt (35.77), Chico’s (21.11) 

Rich imagery, personalization tools, and dynamic merchandising make apparel sites inherently complex. The fastest apparel retailers controlled third-party execution and reduced early render delays without compromising visual storytelling. They proved you don’t have to choose between rich experiences and fast performance — but you must govern the complexity. 

Health & Beauty: Bi-Modal Distribution

Winners: Goop (89.24) 

Mid-tier: Swanson Vitamins (77.76), Liquid IV (76.88), Glossier (76.77) 

This category relies heavily on user-generated content (UGC), reviews, and recommendation engines — elements that slow load times when left unmanaged. The top performers ensured these features loaded progressively and didn’t block the initial shopper experience. This means review widgets load after core content. Recommendations render asynchronously. The experience feels rich, but the initial page stays fast. 

Sporting Goods & Outdoor: Most Consistent Performance

Winners: Sports Warehouse (99.35), EVO (97.10) 

Mid-tier: Traxxas (77.20), West Marine (76.17) 

Promotional intensity and broad product catalogs create pressure in this vertical, but performance remained relatively consistent compared to apparel and beauty. Top performers kept product listing pages and product detail pages lean, prioritized stable checkout performance, and avoided over-layering promotional scripts during campaign pushes. 

The Lesson: Governance Over Category

The gap between best and worst in experience-dense categories like apparel (nearly 80 Health Score points) proves that category complexity doesn’t determine outcomes. Control over complexity is the key. 

The Roadmap for Improvement

If your brand didn’t make this year’s awards list, the holiday season still provided valuable data. Peak traffic exposed weaknesses that remain invisible during normal patterns — and those insights become your advantage for 2026. 

The patterns are clear: 

Get above the 80 threshold. This is your first milestone. Below 80, performance decay accelerates and fixes become exponentially harder. Above 80, you establish the foundation for improvement. 

Focus on INP. If you optimize only one metric, make it Interaction to Next Paint. INP exposes the true cost of your tech stack and predicts how your site will feel to shoppers, not just how fast it renders. 

Achieve mobile device parity. Desktop performance is table stakes. The real differentiator is delivering consistent experiences across devices. If your mobile Health Score is more than 5 points below desktop, you have a critical execution gap. 

Govern all four pillars. Don’t make tradeoffs. Backend, interactivity, rendering, and visual stability all matter. Weakness in any single pillar creates vulnerability when traffic spikes. 

Implement governance, not just optimization. One-time fixes don’t survive holiday pressure. Sustainable performance requires continuous monitoring, automated controls, and disciplined management of third-party complexity. 

Performance Discipline Defines Digital Excellence

The Yottaa Site Speed Awards revealed a fundamental truth: performance discipline, not brand size or budget, determines digital experience quality. As eCommerce becomes more competitive and shopper expectations continue to rise, performance becomes foundational, not optional. 

Want to see how your site performed during the holiday season? Request a performance assessment to understand where you stand against industry benchmarks and what improvements would deliver the biggest impact. 
