Please do not read the title of this article and think I am advocating for ignoring website performance completely. That is not what this article is about.
We strongly believe in benchmarking a website’s performance, looking for areas to improve it, and routinely running technical SEO audits.
Routine SEO audits should include:
- Website Performance Testing
- Schema Markup Creation and Review
- Indexability Audits
- Stale Content Audits
Let’s set boundaries for this information!
This article aims to address the many companies that point to a Google Lighthouse report and say, “Look at these scary numbers. Everything you are doing is bad!”
Marketing companies that do this are being dishonest with you. More often than not, Google Lighthouse reports give only a limited view into why a page might not be performing.
There are over 200 signals (that we know of) that a single page can be affected by.
Although these scores can show areas to improve, they are not the only reason a single page is not ranking well.
I also recommend setting the scope here to local businesses only and, to narrow it further, to those in the United States.
Local business examples:
- The lawn care company that serves its main city and the five surrounding towns.
- The family dentist office that primarily serves residents in a couple of ZIP codes, etc.
Isn’t Google always right though?
Great question! The answer is no.
Before we get into the specifics of how the Lighthouse tool works, we need to review a website performance framework Google released, known as AMP. Understanding AMP’s failures helps explain how we ended up with Core Web Vitals.
Why did Google create AMP and what did it do?
Google often finds itself in the same catch-22: it has to cater to end users while also serving the advertisers on its platform. A good user experience means better results for the businesses advertising on its pages. So it is a no-brainer that Google would want a hand in overall website performance at all times.
AMP stands for Accelerated Mobile Pages. These pages are built wrapped in a special web component framework engineered by Google. By wrapping your pages in it, you gave Google a way to cache them directly on its servers.
This feature worked much like a content delivery network. Very similar to the way CloudFlare delivers a page to a visitor from its CDN, Google would serve your page straight from its own cache.
That’s great, right? Well, yes, but while AMP pages are very fast, they are also very boring to the eye. This makes them nearly impossible to convert leads with. So ultimately, if you were aiming to build landing pages for mobile, wrapping them in the AMP framework would not be the best option.
The AMP idea wasn’t all bad for website performance was it?
The purpose at the core of AMP Pages was to speed up the web. We think that’s a great idea and believe website performance is a huge deal. AMP just didn’t hit the mark. It really was a big swing and a miss for small businesses.
AMP also meant Google had complete control over the content as it now lived on their servers. I personally didn’t see anything bad going on with the data, but many others shook their fists at this.
In a nutshell, it was an ugly mess.
In the end, with much of the web turning its nose up at it and developers building ways to subvert it, Google began slowly putting it back in the box.
Browsers like Brave built features to de-AMP the web, while Reddit bots were deployed solely to give users links to non-AMP versions of pages whenever an AMP link was posted.
As you can imagine, the growing disgust for the service left AMP on the same path as Google Plus and many other failed Google products.
That is all I can say about AMP at this point; you can read about its life and slow death (it is somehow still dying as of this writing) here – AMP Pages.
What is bad about Google’s Lighthouse tool then?
The main flaw I see with using Lighthouse reports (again, targeting a small business in the USA) is simply the fact that the tool throttles down the connection speed. Countries with developed internet and mobile networks should not have to worry about a download speed of 1.5Mbps. The throttle aims to provide a glimpse into how your page loads over networks with slow 4G or fast 3G service.
Service at these speeds in 2022 is very unlikely, and as many early mobile networks reach end of life, it will become next to impossible. The average mobile connection speed is around 30Mbps download at the moment, and that figure is expected to climb in the coming years as 5G becomes more established.
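To put that throttle in perspective, here is the rough transfer-time arithmetic for a hypothetical 2MB page (the page size is an assumption chosen for illustration, not a figure from any real audit):

```python
def transfer_seconds(page_bytes: int, mbps: float) -> float:
    """Seconds to download page_bytes over a link of `mbps` megabits/second."""
    bits = page_bytes * 8
    return bits / (mbps * 1_000_000)

PAGE_BYTES = 2 * 1024 * 1024  # hypothetical 2 MB page

throttled = transfer_seconds(PAGE_BYTES, 1.5)  # Lighthouse-style throttle
typical = transfer_seconds(PAGE_BYTES, 30.0)   # average US mobile connection

print(f"1.5 Mbps: {throttled:.1f}s, 30 Mbps: {typical:.1f}s")
# → 1.5 Mbps: 11.2s, 30 Mbps: 0.6s
```

Raw transfer time is only one part of a page load, but the twenty-fold gap between the two numbers shows how different the throttled picture is from what a typical US mobile user sees.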
A look into the habits of US users points to 52% of their searches happening on mobile devices. Of those mobile users, 75% were connected over Wi-Fi (source).
In short, the vast majority of those in the US will not experience the picture that Google Lighthouse paints. It is much better suited to testing your website for global traffic.
Beyond throttling the test connection, is the rest valid?
I have other small issues with it, but one that stands out as another major flaw is that the scans do not take a CDN into account.
A CDN, like CloudFlare, caches copies of a page across its network of servers.
The specific page and any static resources attached to it (images, CSS files, etc.) get dispersed to 250+ nodes on the network, with expire headers set on the cached content. If a node receives very little traffic, it will drop that cached content over time.
Depending on Google Lighthouse’s testing location, your report might hit a node with zero cached copies, meaning the first scan isn’t seeing the speeds your local users actually experience. The scan should trigger CloudFlare to cache the page, but that happens after the request. By then the original scan has already completed with poor scores, an experience your primary user base doesn’t share.
This is why, with tools like GTMetrix, running a second scan will usually show better scores when a CDN is in use.
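The cold-cache effect above can be sketched as a toy model. The timings and TTL here are made up for illustration; they are not CloudFlare’s actual numbers:

```python
ORIGIN_MS = 800  # hypothetical slow fetch all the way to the origin server
EDGE_MS = 50     # hypothetical fast fetch from an edge node's cache

class EdgeNode:
    """Toy model of a CDN edge node with a TTL-based cache."""

    def __init__(self, ttl_seconds: float = 3600):
        self.ttl = ttl_seconds
        self.cache = {}  # url -> expiry timestamp

    def fetch(self, url: str, now: float) -> int:
        """Return a simulated response time in ms, populating the cache on a miss."""
        expiry = self.cache.get(url)
        if expiry is not None and now < expiry:
            return EDGE_MS  # cache HIT: served from the edge
        # Cache MISS: the node caches the page, but only AFTER this slow request.
        self.cache[url] = now + self.ttl
        return ORIGIN_MS

node = EdgeNode()
first = node.fetch("/index.html", now=0)    # cold cache, like Lighthouse's first scan
second = node.fetch("/index.html", now=10)  # warm cache, like a re-run scan
print(first, second)  # → 800 50
```

The first request pays the full origin cost and only then warms the cache, which is exactly why a one-off Lighthouse scan can report numbers your returning local visitors never see.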
Website Performance is better with Core Web Vitals
The good news is that as AMP began to go away, Google pulled out some key indicators of good performance. These new metrics, known as Core Web Vitals, give developers a few ways to spot where users may struggle when viewing a website:
- Largest Contentful Paint (LCP): measures loading performance, specifically how long the largest element in the viewport takes to render.
- First Input Delay (FID): measures interactivity, specifically the delay between a user’s first interaction with the page and the moment the browser is able to respond to it.
- Cumulative Layout Shift (CLS): measures visual stability. Does the page shift around while it loads everything in?
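Of the three, CLS is the least intuitive. Google scores each layout shift as the impact fraction (how much of the viewport the shifting content affects) multiplied by the distance fraction (how far it moved relative to the viewport). A quick sketch with made-up numbers:

```python
def layout_shift_score(impact_fraction: float, distance_fraction: float) -> float:
    """Per Google's CLS definition: score = impact fraction * distance fraction."""
    return impact_fraction * distance_fraction

# Example values are invented: an element covering 50% of the viewport
# shifts down by 25% of the viewport height.
score = layout_shift_score(0.5, 0.25)
print(score)  # → 0.125
```

The page’s CLS is built from individual shift scores like this one, so a single ad banner popping in above your content can account for most of a bad score.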
Do they matter? Yes!
Are they why your marketing campaign is failing? Likely not!
What tool works best for Small Business website performance?
I use several, but more often than not, GTMetrix.com is my go-to tool for viewing unthrottled website performance scores. Google also offers built-in features like Chrome DevTools in its browser. Specifically, the Network tab there gives you a way to apply your own custom throttles for live testing. Chrome DevTools also lets you visually preview your website on desktop and mobile. It’s quite powerful.
Don’t get me wrong. Lighthouse has value! However, it carries very little weight when working with US-based small businesses serving local markets.
These reports should not be used as a scare tactic.
Basic SEO factors like structuring your pages with proper HTML hierarchy, applying on-page SEO such as meta titles, descriptions, and JSON-LD Schema, and doing off-page SEO are all likely to carry more weight for your website.
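As a concrete illustration, here is a minimal JSON-LD block for a hypothetical family dentist, the kind of local business this article is scoped to. Every name and value below is a placeholder:

```json
{
  "@context": "https://schema.org",
  "@type": "Dentist",
  "name": "Example Family Dental",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Main St",
    "addressLocality": "Springfield",
    "addressRegion": "IL",
    "postalCode": "62701",
    "addressCountry": "US"
  },
  "telephone": "+1-555-555-0100",
  "url": "https://www.example.com"
}
```

A block like this goes inside a `<script type="application/ld+json">` tag in the page, giving search engines structured details about the business that no Lighthouse score will ever surface.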
Shaving off a few seconds for those 3G mobile users is not going to have a large impact on your lead funnel. For a US business, that traffic likely doesn’t even exist.