Breaking Down Page Speed Insights, Part 1: Performance

For the longest time, one of the biggest pushes in the web world has been towards responsive design and development. While this has led to a great number of advances, in both specs and tooling, it has also shone a light on website performance.

With the spread of high-DPI screens and more powerful devices over the last decade, more and more pages on the web have become bloated - slowed down by unnecessary amounts of additional code and years of agencies selling the next big thing, adding snippets to your pages with reckless abandon.

However, more recently, this began to change as Google started heavily pushing its PageSpeed Insights tool. As more and more marketers, design leaders, and agencies became aware of it, it became the de facto standard for testing page speed (along with tools like GTmetrix). Agencies and development shops were finally given the command from their clients: make my site score better.

But what is better?

In this first part, we'll take a look at the area people currently tend to focus on the most: performance.

Let's break down what affects your performance score, why hitting 100% at all costs shouldn't be your endgame, and why those 100% scores sometimes aren't really fair.

First, here's what constitutes your performance score, according to Google:

First Contentful Paint (FCP)

The time it takes for the first visible bit of actual content to render on the page.

Speed Index (SI)

A measure of how quickly content visually populates the viewport (basically, how fast your “above the fold” content finishes rendering).
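One way to build intuition for Speed Index: it's the area above the visual-completeness curve over time, which tools approximate from a filmstrip of screenshots. Here's a simplified sketch of that calculation - the function name and sample data are illustrative, not part of any tool's API:

```javascript
// Sketch: approximate Speed Index from sampled visual-completeness
// snapshots, where each frame records how "done" the viewport looks
// (0 = blank, 1 = fully rendered) at a point in time.
// SI is the area above the completeness curve: lower is better.
function speedIndex(frames) {
  let si = 0;
  for (let i = 1; i < frames.length; i++) {
    const dt = frames[i].time - frames[i - 1].time;
    // Step integration: weight each interval by how incomplete
    // the page still was at its start.
    si += (1 - frames[i - 1].completeness) * dt;
  }
  return si;
}
```

A page that reaches 50% visual completeness at 1s and 100% at 2s would score 1500 here - the same final load time with faster early rendering scores lower, which is exactly what Speed Index is designed to reward.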

Largest Contentful Paint (LCP)

The time it takes for the largest single piece of visible content (an image or text block) to finish rendering within the viewport (again, usually part of your “above the fold” content - often just a hero image, for example).

Time to Interactive (TTI)

Google defines this as: “The page displays useful content, which is measured by the FCP, event handlers are registered for most visible page elements, and the page responds to user interactions within 50 milliseconds.”

Total Blocking Time (TBT)

This metric measures how much of the time between your FCP and TTI numbers the browser's main thread spends blocked by long tasks. Basically, it calculates the amount of time your browser is “locked up” before a user is able to do something.
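Concretely, Google only counts the portion of each long main-thread task beyond 50 ms - a 70 ms task contributes 20 ms of blocking time, while a 30 ms task contributes nothing. A rough sketch of the arithmetic (the function name and sample durations are illustrative):

```javascript
// Tasks shorter than 50 ms are considered non-blocking;
// only the excess over 50 ms counts toward TBT.
const BLOCKING_THRESHOLD_MS = 50;

// Sketch: Total Blocking Time from a list of long-task
// durations (in ms) observed between FCP and TTI.
function totalBlockingTime(taskDurations) {
  return taskDurations.reduce(
    (total, duration) => total + Math.max(0, duration - BLOCKING_THRESHOLD_MS),
    0
  );
}

// e.g. tasks of 70 ms, 30 ms, and 250 ms:
// 20 + 0 + 200 = 220 ms of blocking time
```

This is why one giant script hurts far more than several small ones: the same total work split into sub-50 ms chunks contributes zero blocking time.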

Cumulative Layout Shift (CLS)

This metric measures how much the page layout shifts around as items load - for example, text jumping down the page when an image or ad renders above it.
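Google scores each individual shift with a published formula - the impact fraction (how much of the viewport the moving elements affect) multiplied by the distance fraction (how far they moved) - and then accumulates those scores. A simplified sketch follows; note the real metric groups shifts into "session windows" and takes the worst window, while here we just sum them:

```javascript
// Score one layout shift, per Google's published CLS formula:
// impact fraction × distance fraction.
function shiftScore(impactFraction, distanceFraction) {
  return impactFraction * distanceFraction;
}

// Simplified cumulative score: a plain sum of individual shift
// scores. (The real metric takes the worst session window instead.)
function cumulativeLayoutShift(shifts) {
  return shifts.reduce(
    (total, s) => total + shiftScore(s.impact, s.distance),
    0
  );
}
```

So an element covering half the viewport that moves a quarter of the viewport's height scores 0.5 × 0.25 = 0.125 - already above Google's "good" threshold of 0.1 from a single shift.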

These are all valuable metrics. But let's talk about the one that, in our opinion, often gets hit the hardest and can occasionally cause concern among marketers and developers: Time to Interactive.

When working on the Eightfold site, we made the conscious decision that having live chat would be better for us than having contact forms on every page - it decreased on-page clutter, let us focus on the copy for each page, and let us inject some personality into our site... but it had a side effect: it drastically tanked our Time to Interactive score!

But why? We load the tag asynchronously, so why would it affect our score so drastically? Well, it's simple: PSI sees the script performing tasks (loading the chat icon, popping the dialog box) after the initial page load, and thus marks the interactivity of the page as low.

That's not the whole story though - our pages are snappy and fast, and we'd rather people start browsing before the live chat even loads, so we have no issue with how long it takes. And while we could build our own live chat tool with a leaner implementation, it doesn't make sense for us right now (building your own software versus using a pre-built solution is a topic for another blog post).
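One common middle ground - sketched here as a generic pattern, not any particular vendor's embed code - is to defer loading the chat script until the first user interaction, so it never competes with the initial render:

```javascript
// Sketch: run a loader (e.g. one that injects a chat vendor's
// <script> tag) only on the first user interaction. The event
// names and `target` parameter are illustrative; in a browser
// you would pass `window` as the target.
function deferUntilInteraction(
  load,
  target,
  events = ['scroll', 'keydown', 'pointerdown']
) {
  let loaded = false;
  const onFirstInteraction = () => {
    if (loaded) return;
    loaded = true;
    // Clean up: one interaction is all we need.
    events.forEach((name) => target.removeEventListener(name, onFirstInteraction));
    load(); // safe to inject the third-party script now
  };
  events.forEach((name) =>
    target.addEventListener(name, onFirstInteraction, { passive: true })
  );
}
```

In the browser, `load` would create a script element pointing at the chat vendor's tag and append it to the document head. The main thread stays free during initial render, and real users still get chat the moment they scroll or tap - though be aware this shades toward the score-gaming territory discussed below if your only motivation is the number.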

Something we often see as well is Google's own tools being dinged for certain aspects of page speed, like short cache times. This is unavoidable if you're using these tools. You have no control over the scripts that load, and while you could host and cache them yourself, you really shouldn't - Google, and other vendors, keep cache times low because they're constantly modifying the contents of these scripts, making improvements and tweaking things to better serve their users.

Lastly, something I've personally seen become more and more common is developers using tactics to fake their way to 100%, just to hit that number. Look, hitting 100% is a goal everyone should strive for, but doing so by removing scripts when the page is hit by PSI is not the way to go. We should be educating clients on what those scores mean, and how different things can affect them in different ways.

If you're going through a site rebuild now, be aware of the sacrifices you'll have to make, both in the ongoing development of your site and in the restrictions placed on your marketing teams when it comes to new scripts and tools for tracking user engagement - and how those may affect your page scores.