The release of the Core Web Vitals report in Google Search
Console has caused a few waves in the industry, with people discussing
potential impacts, focuses, and where the priorities should be. There are a lot
of articles at the moment which go into detail about what the new web vitals
are, but only a handful that look at what this means for Search. So, I’ve split
this post into three sections:
An overview of the Core Web Vitals
Use cases for the Core Web Vitals
What this means for Search
An Overview of the Core Web Vitals
The Core Web Vitals consist of three metrics:
Largest Contentful Paint (LCP) – measuring loading performance: the render time of the largest content element in the viewport
First Input Delay (FID) – measuring interactivity: the delay between a user's first interaction and the browser responding to it
Cumulative Layout Shift (CLS) – measuring visual stability: how much page content unexpectedly shifts during loading
These are all metrics which have previously been available
through other tools such as PageSpeed Insights and are powered by the Chrome
User Experience Report (CrUX), but having them directly accessible in GSC
opens up a few new opportunities. Alongside these three Core metrics, there are
other “supplemental metrics”
which capture a larger part of the experience and are used to help diagnose
issues with the Core Web Vitals:
Time to First Byte (TTFB) & First Contentful
Paint (FCP) – diagnose issues with LCP
Total Blocking Time (TBT) and Time to
Interactive (TTI) – diagnose issues with interactivity
The other point to note is that although these metrics are all available now, they still have room to grow and develop as more data becomes available. This is key for understanding whether the benchmarks will move in the future; right now, it doesn't look like they will.
Benchmarks across each metric can be seen in the image below for what constitutes a “Good”, “Needs Improvement” and “Poor” URL:
The aim at the moment is for the 75th percentile of page loads to meet the "Good" threshold for each of the Core metrics, on both desktop and mobile. If a fourth metric were added to the Core set, you would most likely still be measured at the 75th percentile, just across more metrics.
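The published thresholds can be expressed as a simple classifier. Below is a minimal Python sketch using the documented cut-offs (LCP in seconds, FID in milliseconds, CLS unitless); the function and variable names are my own:

```python
# Published Core Web Vitals thresholds: (good upper bound, poor lower bound).
# LCP is in seconds, FID in milliseconds, CLS is unitless.
THRESHOLDS = {
    "LCP": (2.5, 4.0),
    "FID": (100, 300),
    "CLS": (0.1, 0.25),
}

def classify(metric, value):
    """Classify a 75th-percentile metric value as Good / Needs Improvement / Poor."""
    good_max, poor_min = THRESHOLDS[metric]
    if value <= good_max:
        return "Good"
    if value <= poor_min:
        return "Needs Improvement"
    return "Poor"
```

For example, `classify("LCP", 2.1)` returns `"Good"`, while `classify("CLS", 0.3)` returns `"Poor"`.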
Digging a little bit deeper, the 75th percentile
was chosen so that a majority of visits to the site experienced the target
level of performance or better, while being less affected by outliers. There is a huge amount of information in this post on the criteria for these metrics and on making sure they are achievable, both of which are extremely valuable references to work from. For example, for a
threshold to be achievable they require at least 10% of origins to meet the
“good” threshold, and then also ensure that well-optimised content consistently
meets the “good” threshold to avoid misclassification due to variability. The
same post also breaks down the % of CrUX origins classified as “good” and
“poor” for each of the metrics, which are then used as the thresholds seen in
the image earlier:
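To make the 75th-percentile idea concrete, here is a small Python sketch: given per-visit LCP samples (the kind of data CrUX aggregates), it takes the nearest-rank 75th percentile and checks it against the "good" threshold. The sample values are made up for illustration:

```python
import math

def percentile(values, p):
    """Nearest-rank percentile: the value at or below which p% of samples fall."""
    ordered = sorted(values)
    rank = math.ceil(p / 100 * len(ordered))  # 1-based nearest rank
    return ordered[rank - 1]

# Hypothetical per-visit LCP samples (in seconds) for one URL.
lcp_samples = [1.8, 2.0, 2.1, 2.2, 2.3, 2.4, 2.6, 3.1, 4.2, 6.0]

p75 = percentile(lcp_samples, 75)  # 3.1
is_good = p75 <= 2.5               # False: the slow tail pushes it out of "Good"
```

Note how most visits here load well under 2.5 seconds, yet the URL still misses "Good" because the 75th percentile sits in the slow tail: the target is that the *majority* of visits, not just the average one, get a good experience.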
Use Cases for the Core Web Vitals
One of the biggest use cases has got to be bulk reports. Being
able to drill directly into the Mobile performance of Poor URLs by specific
issue, and then see the number of similar URLs affected by the same issue is a
huge time saver, and great for prioritising which sections to look at first.
Before, you’d have had to do a fair bit of that manually. Further examples include:
Analyse impact of Web Vitals performance on
Analyse impact on scroll depth/on-page
Identify problem “sections” of the site through
the “similar URLs” grouping
Identify focus areas from an optimisation
perspective at scale
Combine with Mobile Usability issues to
prioritise areas of the site
But the use cases don’t stop at the Core Web Vitals. We know the supplemental metrics are also being used, so pulling the broader spectrum of factors from tools that surface them (CrUX, PageSpeed Insights) will give you a greater base to work from for any of the areas above.
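For instance, the PageSpeed Insights v5 API returns both lab data and CrUX field data for a URL. A minimal Python sketch of pulling field-data categories (the helper names are my own, and the response shape shown in the test is abbreviated from the v5 API):

```python
import urllib.parse

PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

def build_psi_url(page_url, strategy="mobile", api_key=None):
    """Build a PageSpeed Insights v5 request URL for a page."""
    params = {"url": page_url, "strategy": strategy}
    if api_key:
        params["key"] = api_key
    return PSI_ENDPOINT + "?" + urllib.parse.urlencode(params)

def field_metric_category(psi_response, metric_key):
    """Read a CrUX field-data category (FAST / AVERAGE / SLOW) from a PSI response."""
    metrics = psi_response.get("loadingExperience", {}).get("metrics", {})
    return metrics.get(metric_key, {}).get("category")
```

Fetching the URL from `build_psi_url(...)` returns JSON whose `loadingExperience.metrics` block holds the field data, keyed by names like `FIRST_CONTENTFUL_PAINT_MS`.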
Google have said that the Core Web Vitals will form part of the Search ranking factors from next year (2021), that they will give “at least 6 months notice” before rolling it out, and that this will also incorporate page experience metrics into the ranking criteria for Top Stories in Search on mobile.
However, they have also said that there is no immediate
action to be taken.
Does that mean we don’t need to do anything right now?
Technically, yes. Should we start doing things right now? Almost certainly.
The more time you can give yourself, and your teams, to get
improvements implemented the better. Starting now, even though there is no
immediate need to, means you can get a head start instead of scrambling to get
fixes in place once the 6 month warning announcement comes out.
We’ve known for a long time now that page
speed plays a part in the ranking algorithms, with initial announcements
focused on mobile search. This announcement is building on that but applies to
Desktop and different search features as well as Mobile search.
So how do you really get ahead?
Explore GSC and the new Web Vitals reports, understand your data and the classification of URLs across your site.
Identify areas within the report that need improvement, then prioritise based on effort/scale/impact.
If you can see a small impact by changing a couple of things across a large number of URLs, you’re going to see more of an overall lift than you would if you spent a lot of effort fixing a handful of URLs.
Prioritise, then optimise.
Cross-reference the “Poor” URLs being flagged for each metric with other GSC and GA data, e.g.
Impressions (by device)
Sessions (by device)
Page engagement metrics (by device)
Cross-reference the “Poor” URLs with other tool data, e.g.
Google Ads (formerly AdWords) landing page experience reports – a factor of the landing page experience is mobile speed; if it’s flagged there, it’s probably also flagged in the Web Vitals report
PageSpeed Insights – to get supplemental metric data
Deploy implementation at scale for affected URLs (most often a template change will impact the whole site)
Measure impact across the site/all affected URLs
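The effort/scale/impact prioritisation in the steps above can be sketched in code. The scoring formula, weights, and issue data here are all hypothetical, purely to illustrate why template-level fixes tend to rank above one-off URL fixes:

```python
def priority_score(urls_affected, impact, effort):
    """Hypothetical score: scale x impact, discounted by effort. Bigger = fix first."""
    return (urls_affected * impact) / max(effort, 1)

# Made-up issue groups, e.g. from the GSC "similar URLs" grouping.
issues = [
    {"issue": "Slow LCP on product template", "urls": 4200, "impact": 3, "effort": 2},
    {"issue": "CLS from late-loading ad slot", "urls": 300, "impact": 2, "effort": 1},
    {"issue": "Long tasks on checkout page", "urls": 12, "impact": 3, "effort": 5},
]

ranked = sorted(
    issues,
    key=lambda i: priority_score(i["urls"], i["impact"], i["effort"]),
    reverse=True,
)
```

With these made-up numbers, the template-wide LCP issue comes out on top: a moderate fix touching thousands of URLs beats a hard fix touching a dozen.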
It might not be an immediate switch, but we all know development and implementation takes time. So, the sooner you start looking into the data, potential issues, potential causes, and required time to fix, the sooner you’ll be able to start prioritising the “problem” areas and getting them into a good state ahead of the upcoming rollout. And because we’re all marketers, and we all love competition, the sooner your site is sorted, the harder it will be for competitors to overtake you.