When it comes to website data tracking and tracking pixels, too much of a good thing can become a bad thing.
Every tracking tag (also called a tracking pixel) on a site slows it down, because the browser has to fetch that pixel from another server before it can do its work.
Over time, the accumulated pixels can slow a site to the point where they hurt its ability to generate revenue.
How long before the pixel tracking slowdown hurts revenues? About 1 second.
Our experience has shown that for an ecommerce site generating $1MM of revenue per year, every second of slowdown caused by tracking pixels equates to a loss of $75,000 in revenue per year (about 7.5% of annual revenue).
What is a tracking pixel?
A tracking pixel is a tiny (often 1x1) image embedded in a page that lets a third-party service collect data you can use to improve your site.
Need more scary tracking pixel news?
Tracking pixels can also be a security problem for a site.
Just as many brick-and-mortar retailers have been hacked through third-party systems, every third-party pixel a brand adds invites similar vulnerabilities into its ecommerce site.
The data that is valuable to a brand is also valuable to third parties, who can use the pixels on the brand's site to gather data about its customers.
Loss in revenue due to site speed
This deserves repeating: for an e-commerce site that generates $1MM of revenue per year, every second of slowdown caused by tracking pixels equates to a loss of $75,000 in revenue per year.
Many of the tracking pixels that brands use are actually gathering duplicate data.
Pixels that monitor scrolling or generate heat maps are great for short-term testing, but left in place full-time they only duplicate data that is generally already gathered via Google Analytics or Omniture.
The best advice is to use heat mapping and other performance-heavy pixels only during key testing periods, such as new page rollouts or after a site redesign.
What can you do?
Pixels, like tacos and beer, are best used in moderation. To keep tracking pixel creep in check, do the following:
- Use only what you need. Accumulating a mountain of duplicate data only creates a dearth of insight and analysis. Will you actually be able to consume all of that data, analyze it, and act on it? There are diminishing returns once the data set becomes overwhelming.
- Load pixels after the page finishes loading, and load them all at once. A good developer can help implement this best practice.
- Use intensive pixels only as needed, for instance right after the launch of a major redesign, to test.
- Secure your data (via HTTPS) to ensure known security vulnerabilities are plugged.
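To make the second point concrete, here is a minimal sketch of what "load pixels after the page loads, all at once" can look like in practice. The function names and pixel URLs are hypothetical, and the injector is parameterized so the batching logic can be exercised outside a browser; your developer's actual implementation will differ.

```typescript
// An injector receives a pixel URL and is responsible for firing it.
// In production this would append a 1x1 <img> to the DOM.
type PixelInjector = (src: string) => void;

// Fire every pixel URL in one batch, in a single pass.
function firePixels(urls: string[], inject: PixelInjector): void {
  for (const url of urls) {
    inject(url);
  }
}

// Defer all pixels until the browser's `load` event, so tracking never
// competes with product images and scripts for bandwidth. Outside a
// browser (no `window`), fall back to firing immediately.
function deferPixels(urls: string[], inject: PixelInjector): void {
  if (typeof window !== "undefined") {
    window.addEventListener("load", () => firePixels(urls, inject));
  } else {
    firePixels(urls, inject);
  }
}
```

In a browser, a brand site might call it once with its full pixel list, e.g. `deferPixels(pixelUrls, (src) => { const img = new Image(); img.src = src; })`, keeping every tracking request out of the critical rendering path.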
At The Good, we believe in data-informed digital decision making. But there is a limit to how much data is useful before it becomes overwhelming. There is also a limit to how much site performance (and revenue) a brand should be willing to lose to gain duplicate data.