Is tracking your web data worth the drag on your site speed?
One of our clients recently came to us with some data showing that their site was scoring low in Google’s PageSpeed Insights tool. At Vendi, we take great pride in delivering sites that score high, both for mobile and desktop. Although we hadn’t built this site for this client, we did make a lot of changes to increase the site’s performance, so we jumped right in to figure out the problem.
First, a clarification: although we call it “site speed,” the score always measures a single page on the website, not the site as a whole.
A site’s speed score is not a measurement of a site’s quality. It is a technical approximation of something subjective: a user’s perception of “how fast a page loads,” with the idea being that users are more likely to navigate away from slower pages. In the real world, however, the user’s intent needs to be considered too. For instance, if a user intends to buy something on Amazon, they will be more likely to tolerate a slower page because their task is bound to that site. If a user is gathering information, however, they might abandon a slower site in favor of a faster one, change their query or simply give up. Google’s scoring algorithm effectively assumes the latter.
The scoring scale runs from 0 to 100, and it is practically impossible for a real-world site to score a perfect 100.
The lab vs. the real world
When a site leaves Vendi, we almost always have a score in the mid to high 90s, both for desktop and mobile. However, this is a “manufacturer recommended usage in ideal conditions” score and represents our best effort to make the site as fast as possible in our “lab environment.”
For sites that we host, we can ensure that our server optimizations are enforced. Some clients, however, have their own IT staff, and the best that we can do is provide our recommendations. For instance, HTTP/2 has been out for many years, but we still have several clients whose IT departments aren’t able to roll that feature out for various internal reasons.
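When we do get to make that recommendation, enabling HTTP/2 is usually a small server-configuration change. As a hedged sketch (the domain and certificate paths below are placeholders, not a real deployment), an nginx server block might look like this:

```nginx
# Sketch only: enabling HTTP/2 in nginx. HTTP/2 requires TLS in
# practice, so it is added on the HTTPS listener. Paths are placeholders.
server {
    listen 443 ssl http2;
    server_name example.com;

    ssl_certificate     /etc/ssl/example.com.crt;
    ssl_certificate_key /etc/ssl/example.com.key;
}
```

The actual change is often this small; the internal reasons that delay it tend to be organizational (change-control processes, legacy proxies, old TLS configurations) rather than technical.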
But even for sites that we host, there are still things outside of our control, and the most common culprit is third-party trackers.
Although we call them third-party trackers, which include things such as Facebook and Twitter pixels, we’re really talking about any third-party code that isn’t part of the initial site build. Other examples include HotJar for heatmap generation, Google Analytics for normal site metrics, and any chat bots that are required. Often these trackers are loaded through a single tool such as Google Tag Manager, which makes it easy for marketers to add their own tracking logic without needing to contact IT or the website developers.
In order for third-party tracking code to work, it has to be built in a way that guarantees it will always work and never breaks the site it is installed on. Those are good goals, but they also mean the code tends to be much larger, because it cannot be optimized for any specific site. Additionally, some third-party tracking code loads additional code, which loads even more code, which might load even more code; or it contacts a tracker that contacts another tracker, which contacts yet another. Each hop comes with performance costs: additional DNS lookups, TLS handshakes and plain byte transfer. We often use a tool such as the Request Map Generator to visualize these third-party assets.
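To make those costs concrete, here is a toy model of a sequential tracker chain. The millisecond figures are illustrative assumptions, not measurements: each *new* third-party domain pays a one-time DNS lookup and TLS handshake before any bytes transfer.

```python
# Toy cost model for a chained tracker load. All numbers below are
# illustrative assumptions for the sketch, not real measurements.

DNS_MS = 50              # assumed cost of a DNS lookup per new domain
TLS_MS = 100             # assumed cost of a TLS handshake per new domain
TRANSFER_MS_PER_KB = 2   # assumed byte-transfer cost

def chain_cost_ms(requests):
    """requests: list of (domain, kilobytes) tuples, loaded sequentially.

    New domains incur DNS + TLS setup; every request pays transfer time.
    """
    seen_domains = set()
    total = 0
    for domain, kilobytes in requests:
        if domain not in seen_domains:
            seen_domains.add(domain)
            total += DNS_MS + TLS_MS
        total += kilobytes * TRANSFER_MS_PER_KB
    return total

# A tag manager that loads one tracker, which pulls in two more trackers,
# each on its own domain (hypothetical domains and payload sizes):
chain = [
    ("tagmanager.example", 30),
    ("tracker-a.example", 80),
    ("tracker-b.example", 40),
    ("tracker-c.example", 40),
]
print(chain_cost_ms(chain))  # 4 new domains: 4*(50+100) + 190*2 = 980
```

Even with these modest assumed numbers, four chained third-party domains add nearly a second of sequential work, which is exactly the kind of cost a request map makes visible.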
To be clear, we’re not saying any of these tools and services are bad; we’re just pointing out that they come at a cost.