The algorithm that Google uses to determine which webpages appear in search results is a secret. A whole industry exists that monitors Google’s search results, trying to work out whether any changes have been made to the algorithm and, if so, what they are.
But occasionally, Google does tell us about specific changes, especially when it wants people who maintain websites to do something. That is happening right now with something called Core Web Vitals and the page experience.
Learn about these in the video below, or read the transcript underneath. This talk was given at one of our free, regular Link and Learn events. To get your ticket to our next free event, take a look at our events page to see when the next ones are coming up.
I’m Jason. I help run Padua Communications with Nicky.
I used to be a journalist. I did various journalist-y things, but probably the most relevant for today is that I ran a website called CNET in the UK for eight years. It’s the world’s biggest technology website. We’ve ended up doing quite a lot of work helping clients with websites: we’ve built some and we help maintain some. And obviously SEO is quite a big part of that. We’ve had to keep abreast of all the latest developments so that we can advise our clients on the best place to spend their time and money. I thought I’d talk about the latest stuff today.
What is the page experience?
Google is always tweaking its algorithm for the way in which it decides which pages appear where in its search results. Several times a month it makes changes and mostly doesn’t say very much about them publicly. Unless the traffic to your site is affected in some way (maybe it goes through the roof, or collapses, or maybe you’re a professional SEO person), you probably won’t notice. You don’t really need to.
But sometimes Google makes some changes that it wants people to know about, so that they can make the changes that are needed to their site. And that’s one of the changes I want to talk about today which is all about what Google calls, “the page experience”.
This is essentially how fast a page appears to load from the user’s perspective, and how quickly they can start interacting with that page. It’s been pushed back because of COVID, but the changes to the algorithm are coming in mid-June and will roll through towards the end of August. That’s the plan. They’ll do it in phases to see if there are any unintended consequences from their side.
The importance of site speed
For many people, Google is the gatekeeper to the internet. If you are on the first page of Google for your goods or services, it means big business normally. If Google is going to reward you for doing certain things, then you’re probably going to do them. For some time, Google has looked at various things such as whether a site is mobile friendly:
- Does it have any malware or nasty things on there?
- Is the page delivered over a secure connection?
- Are there any annoying interstitials?
Interstitials are those ads you used to see: remember when you clicked a link and, instead of going to what you wanted, you got a full-page ad you had to sit through before going any further? You don’t see many of those anymore. They are still around, but there are fewer of them. Most websites now have a half decent mobile version. Most use secure connections.
It is worth saying why page speed is important. There are lots of studies that show that the faster a webpage loads, the more time users spend on it, and the more likely they are to contact you or buy something from you. So for example, the BBC found that it lost 10% of its users for every additional second their site took to load. The retailer Furniture Village managed to reduce the time it took for its site to load by about 20% and it saw a 10% increase in conversion rates on mobiles.
What are LCP, FID and CLS?
So a fast page is a good thing, but what makes a fast page? There are lots of ways you can measure it, but Google has made it easier to focus your attention by coming up with a set of measurements it calls Core Web Vitals. As with many internet things, this sounds complicated, but it isn’t really. Let’s try some alphabet soup first: there’s LCP, FID and CLS. There we go. That’s explained everything, I think! (Maybe not.)
Now we will try the word salad version. Largest Contentful Paint, First Input Delay and Cumulative Layout Shift. My job’s done!
If you’re still scratching your head, which I was when I first came across this, I’ll explain each one in turn. Largest Contentful Paint: this is basically how long it takes for the biggest bit of content to appear. That could be text, an image or a video. It’s not how long it takes the entire page to load, just how long the most important bit of it takes. Google considers under two and a half seconds to be good, between two and a half and four seconds to be needing improvement, and anything over that to be poor. That’s LCP.
FID, or First Input Delay, is basically how long it takes for the site to react to the first interaction. Does the site do anything when you tap it? If the page looks like it’s loaded but nothing is happening, that is a frustrating experience. Good here is considered to be under 100 milliseconds (which is essentially immediate), and over 300 milliseconds is poor. That can be quite a tough one to reach.
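The thresholds just described can be written down as a pair of tiny classifier functions. This is only a sketch of the bucketing logic using the numbers above; in reality Google assesses these metrics across real visits (roughly, at the 75th percentile), not on a single measurement.

```python
# Sketch: classify a single LCP or FID measurement into Google's three
# buckets, using the thresholds quoted in the talk. Illustrative only.

def rate_lcp(seconds: float) -> str:
    """Largest Contentful Paint: good under 2.5 s, poor over 4 s."""
    if seconds <= 2.5:
        return "good"
    if seconds <= 4.0:
        return "needs improvement"
    return "poor"

def rate_fid(ms: float) -> str:
    """First Input Delay: good under 100 ms, poor over 300 ms."""
    if ms <= 100:
        return "good"
    if ms <= 300:
        return "needs improvement"
    return "poor"

print(rate_lcp(1.8), "/", rate_fid(250))  # → good / needs improvement
```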
The last one is probably the most important for users – Cumulative Layout Shift. That basically means: does the page move around while it’s loading? Perhaps the example below has happened to you: you look at something on your mobile, you go to tap it, and at the last second before your finger gets there, it moves. This is exactly what Google is trying to stop.
This is a pretend website where you’re maybe thinking about buying something: you’ve got 14 items, ready to confirm. You’re about to press “No, go back” when an ad or similar loads at the last second, just above the button, pushing everything down. In this case it means you click “Yes, place my order” instead of “No, go back.”
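The usual fix for this kind of shift is to reserve the space before the late content arrives. A minimal sketch, assuming a made-up `ad-slot` class name; the technique (explicit image dimensions and a `min-height` on containers that fill in later) is the standard one:

```html
<!-- Give images explicit dimensions so the browser can hold their
     space open before they download. -->
<img src="dog.webp" width="800" height="600" alt="A dog">

<style>
  /* Hypothetical ad container: reserve its height up front so that
     when the ad loads, nothing below it gets pushed down. */
  .ad-slot { min-height: 250px; }
</style>

<div class="ad-slot"><!-- ad loads here later --></div>
```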
As you probably gathered, most of these metrics are focused on mobile devices, where a slow-loading image, or a site that doesn’t do anything when you tap it, or moves around when you tap, is particularly frustrating. If you remember the chart from earlier, Google has taken all the things it has looked at for years and it has added Largest Contentful Paint, First Input Delay and Cumulative Layout Shift. It takes all of these signals and it feeds into its overall score for the page experience. There are thought to be about 200 signals that Google uses to determine every single search result. So you probably shouldn’t get too caught up with these things at the expense of, say, great content, but nevertheless, it’s still important to bear in mind.
How to measure site speed
That’s the theory, but how do you know how well a website is doing? There is a tool called PageSpeed Insights. You can put any URL into the field at the top, press analyse, and it will give you a report and a score.
The score for the Guardian on mobile is 39 – not particularly great. If you’re a popular website, you’ll also get a section labelled field data, which Google gathers from real Chrome users viewing your pages. So although the headline score isn’t great, there’s quite a lot of green here, and it’s probably not as bad as it looks. Further down it gives you the figures for each of the metrics we’ve talked about, and then a great big list of everything you could do to make your site faster, in varying degrees of complication. It’s worth looking at a couple of others just to show that even the big brands don’t get this right all the time.
This is the BBC. They’re only at 60, so there’s still some work they could do. They seem to have a particular problem with Cumulative Layout Shift: something is loading in at the last second that changes the way the page is displayed. Even the mighty Amazon only gets 76, so there’s still some stuff they can do there.
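If you want to pull these scores programmatically rather than through the web page, the same report is exposed as a JSON API (Google’s PageSpeed Insights API, version 5). A small sketch that just builds the request URL – the page address is a placeholder, and you’d fetch the resulting URL with any HTTP client:

```python
# Sketch: construct a request URL for the PageSpeed Insights API (v5),
# which returns the web tool's report as JSON. example.com is a placeholder.
from urllib.parse import urlencode

API = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

def psi_url(page: str, strategy: str = "mobile") -> str:
    """Return a PageSpeed Insights API request URL for the given page."""
    return API + "?" + urlencode({"url": page, "strategy": strategy})

print(psi_url("https://example.com"))
```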
There is also Google Search Console. If you’re not familiar with it, it’s probably the one tool you should spend some time with, because it’s free. It’s actually a really good tool and it contains loads of information about how Google sees your website. If there are any errors or suggestions for coding improvements, they’ll show up. You’ll also be able to see how much traffic you’re getting in a simple way, so you don’t have to dig through pages and pages of Google Analytics.
There’s a new section here called Core Web Vitals. You get a graph like this that shows how many URLs need improvement in Google’s opinion, how many are bad, and how many are good. Then there is another one called Page Experience, which rolls all this stuff together to give you a score.
Here you can see the Core Web Vitals – how many of your URLs have failed in some way – how easy Google considers your site to be to use on a mobile, and whether there are any security problems, which would be the most important thing to fix. Is it delivered over a secure connection? That’s what HTTPS means.
Making a start fixing site speed
The most practical thing to do is just look for something that you might be able to do yourself pretty easily. The thing that normally falls into that category is images. A typical website will have some images that are just too big, or not compressed enough, or using an old, inefficient file format. Just sorting those out on a few key pages, like your home page and your main service pages, would be a few hours well spent.
There are different ways you can compress an image. This might be a bit hard to see in any great detail over Zoom, but this slide shows the same picture delivered in three slightly different ways. One is the original JPEG of the dog, which comes to 824 kilobytes. The one in the middle uses lossy compression, which means that when you save it you remove some of the information. It may not come through on your screen, but the background is a bit blocky and some of the edges of the dog are a little jagged. Depending on what size the image is and what you’re trying to do with it, that might be fine, and you are certainly saving a lot of space: 76 kilobytes will load a lot faster than 824.
You can go all the way up to 80% compression, which is even smaller. But the picture really starts to break up now. The background has lots and lots of blocky bits on it. It is starting to look less like a dog with hair and more like a vague impression of something. But, you know, if it was used very small, that might be absolutely fine.
There’s also another file format. JPEG has been around forever and ever and ever. A newer one is called WebP, which most browsers now support. This slide shows the same picture: on the left is a JPEG version, which is over three megabytes – enormous for a website, really – and on the right is the same picture delivered in the WebP format. It looks exactly the same. The quality is identical, but the file size is less than half. So that’s a huge, huge deal. So images, depending on how comfortable you are with this sort of thing, can be something you do yourself, or at least make a start with.
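If you’d rather script this than open each image in an editor, the Pillow imaging library (an assumption here – any image tool with JPEG quality settings and WebP output does the same job) can produce the variants discussed above in a few lines. File names and quality values are illustrative:

```python
# Sketch: save one image three ways -- a high-quality JPEG, a heavily
# compressed JPEG, and a WebP copy -- and report the resulting byte sizes.
# Assumes the Pillow library is installed (it is not in the stdlib).
import os
from PIL import Image

def make_variants(img: Image.Image, stem: str = "photo") -> dict:
    """Save high- and low-quality JPEGs plus a WebP; return file sizes."""
    img = img.convert("RGB")
    img.save(f"{stem}-q95.jpg", "JPEG", quality=95)
    img.save(f"{stem}-q35.jpg", "JPEG", quality=35)  # lossy: smaller, blockier
    img.save(f"{stem}.webp", "WEBP", quality=80)     # newer format, widely supported
    return {p: os.path.getsize(p)
            for p in (f"{stem}-q95.jpg", f"{stem}-q35.jpg", f"{stem}.webp")}

# Demo with a generated gradient as a stand-in for a real photo:
sizes = make_variants(Image.linear_gradient("L").resize((800, 600)))
print(sizes)
```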
The key takeaway is that all of this is good stuff to do anyway. Even if Google hadn’t brought out this system to mark your homework, webpages should be fast to load. They should be free of nasty malware. You should be able to type a credit card number and an address in there and not have someone steal it. Web pages should do something when you tap them. They shouldn’t move all over the place. So it’s really an excuse to implement best practices for your website.
If you’ve been thinking, “maybe I need to look at my website”, use this as an excuse to make these sorts of changes. Hopefully that all made sense. I was trying to give you a whistle-stop tour through lots and lots of different things as quickly as possible.