(This is part 5 of a series. For Part 1, see How Elevar Prototyped a Headless E-Commerce Store with Gatsby; for Part 2, see How Elevar Used Storybook With Gatsby to Support a Modular Design Process; for Part 3, see Product Experience Management with Gatsby: Delivering A Rich Product Experience; for Part 4, see Implementing Dynamic E-commerce Features With StriVectin and Gatsby.)
Analytics, tracking events, and third-party JS are not designed for performance. Instead, they’re optimized for ease of installation and the broadest user agent compatibility. That all translates to an expensive performance hit for site visitors.
Site performance is often a case of “death by a thousand cuts.” A single script may not make a dramatic impact, but the multiple third-party scripts included on most commercial sites definitely will.
The good news is that even if a handful of third-party scripts is unavoidable, there are creative solutions to mitigate the impact and still get solid performance.
Squeezing Performance out of Third-party Scripts
We recently spoke with Elevar about their work on StriVectin’s ecommerce site, with a focus on site speed as the brand transitioned from Magento to Shopify. While there’s no single magical solution for every situation, Elevar took a methodical approach: they evaluated each third-party script through the lens of both its value to the business and its effect on front-end performance, then applied a series of tactics to balance the two.
…every merchant has 7 to 10 different services that are injected via JavaScript. We have to make sure those services work without dragging down performance for visitors.
— Thomas Slade, VP Engineering, Elevar
Given the amount of additional JavaScript involved in supporting the shopping experience, Elevar focused on perceived performance for users rather than the timing for a perfect fully-loaded page. To do this, they relied on Lighthouse and even set up automated hourly testing to raise alarms if anything ever created problems with performance.
Using GTM’s “dataLayer” to Aggregate Events
In order to pull off the best possible perceived performance for visitors, Elevar employed a variety of tactics to both understand which tools and scripts created the most problems and identify or create solutions to mitigate any negative impacts on performance. They managed the third-party scripts using Google Tag Manager through a combination of a custom Data Layer and an app they built and published called Elevar’s Google Tag Manager Suite.
The `dataLayer` object used by Google Tag Manager provides a powerful way to declare important values and information such as the current page, data about the visitor, or any other metadata you may have in the system that Tag Manager wouldn’t otherwise receive through the browser. When a `dataLayer` object is declared before the Google Tag Manager script loads, that data is automatically passed along to Tag Manager. You can also dynamically push data to the `dataLayer` object in response to specific events or actions by the visitor.
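For example, a minimal sketch of that pattern might look like the following; the event and property names are illustrative rather than StriVectin’s actual schema:

```javascript
// Declare the dataLayer before the Google Tag Manager snippet loads so GTM
// picks up these values automatically. Property names here are illustrative.
window.dataLayer = window.dataLayer || [];
window.dataLayer.push({
  pageType: "product",
  visitorType: "returning",
});

// Later, push data in response to a visitor action:
function onAddToCart(product) {
  window.dataLayer.push({
    event: "addToCart",
    productId: product.id,
    productPrice: product.price,
  });
}
```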
Server-side tracking is another alternative to Google Tag Manager and can help mitigate the performance cost of loading multiple vendor scripts in the browser. Segment is a popular option that supports both client-side and server-side analytics and event tracking.
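As a rough sketch of what moving an event to the server looks like, here is Segment’s Node library (`analytics-node`) tracking a purchase; the write key and payload are placeholders:

```javascript
// Server-side tracking with Segment's analytics-node library, so the browser
// doesn't need another vendor script. Write key and payload are placeholders.
const Analytics = require("analytics-node");
const analytics = new Analytics("YOUR_SEGMENT_WRITE_KEY");

analytics.track({
  userId: "user-123",
  event: "Order Completed",
  properties: {
    orderId: "order-456",
    total: 49.99,
  },
});
```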
Quantifying Performance On A Continual Basis
While Google Tag Manager served as the single source of truth for third-party scripts, Elevar needed good performance data around the impact of those scripts.
To ensure ongoing performance monitoring during development, they set up a cronjob to run a Lighthouse audit every hour for the course of the project so they could quickly identify when a change slowed the site down. This type of continual performance monitoring, whether on a timed basis or on a per-PR / per-commit basis, is a recommended Gatsby best practice.
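Elevar hasn’t published the exact setup, but a scheduled audit along those lines could be as simple as a script run hourly by cron using Lighthouse’s Node API; the URL and alert threshold below are placeholders:

```javascript
// audit.js: run hourly by cron (e.g. `0 * * * * node audit.js`).
// Uses Lighthouse's Node API; the URL and threshold are placeholders.
const lighthouse = require("lighthouse");
const chromeLauncher = require("chrome-launcher");

async function audit(url) {
  const chrome = await chromeLauncher.launch({ chromeFlags: ["--headless"] });
  const result = await lighthouse(url, {
    port: chrome.port,
    onlyCategories: ["performance"],
    output: "json",
  });
  await chrome.kill();

  const score = result.lhr.categories.performance.score * 100;
  console.log(`${new Date().toISOString()} ${url} performance: ${score}`);

  // Raise an alarm (Slack, email, etc.) if the score drops below a threshold.
  if (score < 80) {
    console.error("Performance regression detected!");
  }
}

audit("https://example.com");
```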
The performance foundation started with Gatsby and React, which enable quick page loads and navigation through built-in features like code splitting and prefetching. That foundation ensured that any variance in performance would be noticeable, and it set the bar for front-end performance.
One current limitation of Lighthouse to keep in mind is that it focuses on the initial payload from a first visit without a primed cache. Gatsby, however, really shines after that initial page load. Through React, it preloads pages and reduces the number of full-page renders so that navigating between pages feels incredibly fast. So while the Lighthouse score is important, it’s only a leading indicator and doesn’t fully represent how the perceived performance feels to a real visitor.
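A lot of that post-load speed comes from Gatsby’s Link component, which prefetches the resources for linked pages in the background so client-side navigation skips the full-page reload; the routes below are illustrative:

```javascript
// Gatsby's Link component prefetches linked pages in the background,
// so navigating between them avoids a full-page reload. Routes are illustrative.
import React from "react";
import { Link } from "gatsby";

const Nav = () => (
  <nav>
    <Link to="/collections/moisturizers/">Moisturizers</Link>
    <Link to="/collections/serums/">Serums</Link>
  </nav>
);

export default Nav;
```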
With all of that in place, Elevar measured the impact of each third-party script in isolation by comparing before and after Lighthouse scores. This helped them understand the performance cost of each script so they could factor that into how they approached minimizing its impact.
Lighthouse score before adding a third-party script.
Lighthouse score after adding a single third-party script.
Armed with their knowledge of each script’s impact on performance and StriVectin’s business priorities, they were able to make informed decisions about how to optimize the performance of each script. They used a few different tactics here:
- Help clients choose the best tools
- Maximize local cache with React’s global state
- Defer loading of non-critical elements
Depending on the performance impact and the role each script played, they were able to choose a solution that balanced business needs and performance, while keeping in mind that performance is a business need unto itself. Let’s dive a little deeper into each of these tactics.
Help Clients Choose the Best Tools
With any third-party script, businesses aren’t always using the full power of the service. Slade’s advice is that if a client is only using 5% of a tool, there may be opportunities to optimize how it’s loaded and applied in a way that improves performance.
In other cases, it may be necessary to work with clients to help them understand the tradeoffs so that you can explore alternative tools together. Slade’s experience is that there are four categories of optimization opportunities:
- First, over the years, some sites have accumulated scripts that are no longer necessary and can be removed entirely.
- Second, some teams have added tools that are dated to the point where there are better and more modern alternatives.
- Third, since a tool was added, it may have improved integration options, like API access. In this case, it is often possible to write slimmer code that is functionally equivalent to what the script is doing, increasing performance.
- Fourth, with Gatsby sites, you can eliminate some types of scripts by loading key data asynchronously during the build process. That’s precisely what Elevar did with Yotpo review data for StriVectin, as in the sketch below.
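Elevar’s actual Yotpo integration isn’t shown here, but the general Gatsby pattern is to fetch the data in `gatsby-node.js` at build time and expose it through GraphQL; the endpoint and fields below are purely illustrative:

```javascript
// gatsby-node.js: a sketch of pulling review data at build time instead of
// loading a vendor widget in the browser. The endpoint and response shape
// are illustrative, not the actual Yotpo integration.
const fetch = require("node-fetch");

exports.sourceNodes = async ({ actions, createNodeId, createContentDigest }) => {
  const { createNode } = actions;

  const response = await fetch("https://api.example-reviews.com/products/123/reviews");
  const { reviews = [] } = await response.json();

  reviews.forEach((review) => {
    createNode({
      ...review,
      id: createNodeId(`ProductReview-${review.id}`),
      parent: null,
      children: [],
      internal: {
        type: "ProductReview",
        contentDigest: createContentDigest(review),
      },
    });
  });
};
```

Components can then query the `ProductReview` nodes through GraphQL at build time, so no review script ever ships to the browser.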
…a lot of merchants who are on Shopify and Magento, they’re used to point and click third party plugins. And that is a huge challenge because the client doesn’t have much intent when they’re doing it. It’s just, “I want this. This is going to help.”
— Thomas Slade, VP Engineering, Elevar
Deferring Loading Non-critical Elements
They were also able to use Google Tag Manager to defer loading elements that weren’t critical for rendering the page. For example, with ZenDesk live chat, they set up Tag Manager to wait three seconds before loading and rendering the chat widget. That ensured any delay from loading the chat widget didn’t get in the way of the more critical scripts needed for the initial page render.
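Google Tag Manager can apply this kind of delay with a trigger configured in its UI; in plain JavaScript, the underlying pattern looks roughly like this (the widget URL is a placeholder, not ZenDesk’s actual snippet):

```javascript
// Wait a few seconds after the page load event before injecting a
// non-critical widget. The script URL is a placeholder.
window.addEventListener("load", function () {
  setTimeout(function () {
    var script = document.createElement("script");
    script.src = "https://example.com/chat-widget.js";
    script.async = true;
    document.body.appendChild(script);
  }, 3000);
});
```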
In other cases, they deferred loading scripts until a visitor triggered an event that required them, letting visitor behavior drive loading only when it was necessary. For example, if a script is only tied to adding items to the cart, a click on ‘Add to Cart’ can load the JavaScript and then execute it, rather than loading it on the initial page load and waiting around for an event that may never happen.
Different scripts in Google Tag Manager were tied to a 3-second delay to prevent scripts that weren’t related to rendering from interfering with more critical scripts.
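As a sketch of that click-to-load pattern, the script below is only injected the first time a visitor hits Add to Cart; the selector and script URL are placeholders, not StriVectin’s actual setup:

```javascript
// Load a cart-related vendor script only when it's actually needed.
// The selector and script URL are placeholders.
let cartScriptPromise = null;

function loadCartScript() {
  if (!cartScriptPromise) {
    cartScriptPromise = new Promise(function (resolve, reject) {
      const script = document.createElement("script");
      script.src = "https://example.com/cart-tracking.js";
      script.async = true;
      script.onload = resolve;
      script.onerror = reject;
      document.body.appendChild(script);
    });
  }
  return cartScriptPromise;
}

const addToCartButton = document.querySelector(".add-to-cart");
if (addToCartButton) {
  addToCartButton.addEventListener("click", async function () {
    await loadCartScript();
    // The vendor script is now available; fire its tracking call here.
  });
}
```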
When deferred loading wasn’t possible, they designed the pages so that any content delayed by a script appeared lower on the page, where it was less likely to be visible on first load. In one case, a personalization and A/B testing library included an advanced inline WYSIWYG tool, but the JavaScript file was about 1MB, and they had to avoid using any of it above the fold. Instead, they designed the components to behave as placeholders and fade in as the changes arrived, so the page didn’t jump around as the slower content rendered.
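A simplified React sketch of that placeholder approach: the slot reserves its height up front and only fades its content in once the third-party library has produced it, so the rest of the page never shifts. The prop names and height are illustrative:

```javascript
import React from "react";

// Reserve the slot's space up front, then fade the personalized content in
// once it's ready so nothing below it jumps. Prop names are illustrative.
const FadeInSlot = ({ ready, minHeight = 320, children }) => (
  <div
    style={{
      minHeight,
      opacity: ready ? 1 : 0,
      transition: "opacity 300ms ease-in",
    }}
  >
    {ready ? children : null}
  </div>
);

export default FadeInSlot;
```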
Don’t Let Third-Party Scripts Crash the Performance Party
Third-party scripts can be an Achilles heel to a site’s performance. Sometimes, even the slowest scripts are critical and can’t be avoided. But just because they can’t be left out doesn’t mean they can’t be adjusted to play nicely with performance metrics.
While a site that requires a handful of third-party scripts may never achieve perfect front-end performance, Elevar’s experience with StriVectin shows that there are creative ways to mitigate the performance impacts, make the most of a challenging set of tradeoffs, and keep sites snappy.