Technical SEO Audit: How to Perform an In-Depth Analysis

Read this in-depth tutorial on how to create a full technical SEO audit for your clients using a step-by-step process.

I’m not going to sugarcoat it: conducting a Technical SEO Audit is a big deal.

As an SEO consultant, there is nothing better than hearing, “Your audit looks great! When can we bring you on board?”

  • Are you afraid to begin?
  • Is this your first time performing an SEO audit?
  • Perhaps you simply don’t know where to start?

Sending an excellent SEO audit to a prospective client will put you in the best position possible.

So relax and take your time. Remember:

Your major goal with your site recommendations is to provide value to your client in both the immediate and long term.

In this article, I’ve included the fundamental steps for doing an SEO audit, as well as a peek inside the first part of my workflow when I first meet with a new client.

It is divided into sections below. Feel free to skip ahead to the next section if you feel confident in your understanding of a certain topic.

When Is It Advisable to Conduct an SEO Audit?

I set up an intro meeting when a potential client sends me an email expressing interest in working together (Skype or Google Hangouts is preferred).

Prior to the meeting, I conduct my own quick SEO audit (which takes at least an hour of manual research) based on their survey responses to familiarize myself with their market landscape.

I’ll also check them on Facebook, Twitter, Instagram, and any other public social media platforms.

Here’s an example of a survey I’ve created:

During your initial encounter with the client, you should ask the following critical questions:

  • What are your long-term company objectives? What are your business objectives (public relations, social media, etc.)?
  • Who do you want to reach out to?
  • Are you involved in any commercial collaborations?
  • When was the last time the website was updated? Is there a web developer or an IT department at your company?
  • Have you ever worked with a search engine optimization (SEO) consultant? Have you done any SEO work before?

After our consultation, if we feel we’re a good match, I’ll provide my formal proposal and contract.

To begin, I always give my clients the first month as a trial period to ensure that we are a good fit.

This allows the client and me to get to know each other before working together. During this first month, I devote time to conducting an in-depth SEO analysis.

Depending on the size of the website, these SEO audits might take anywhere from 40 to 60 hours.

The audits are divided into three sections and presented as a PDF.

  • Technical: crawling errors, indexing, hosting, etc.
  • Research: Keywords, competitor analysis, content maps, meta data, etc.
  • Link analysis: Backlink profile, growth tactics, etc.

If the client enjoys my work after the first month, we’ll start implementing the SEO audit recommendations. In the future, I’ll do a monthly mini-audit and a quarterly in-depth audit.

To summarize, I conduct an SEO audit for my clients as follows:

  • First month.
  • Monthly (mini-audit).
  • Quarterly (in-depth audit).

Before an SEO audit, here’s what you’ll need from a client.

When a client and I begin working together, I will provide them with a Google Doc requesting a list of passwords and vendors.

This includes the following:

  • Access to Google Analytics and any other third-party analytics tools.
  • Google Ads and Bing Ads accounts.
  • Webmaster tools accounts.
  • Access to the website’s backend.
  • Accounts in social media.
  • Vendors’ list.
  • Internal team members’ list (including any work they outsource).

Tools for SEO Audit

Before you begin your SEO audit, here’s a recap of the tools I use:

Conducting an SEO Audit on a Technical Level

Tools required for a technical SEO audit include:

  • Screaming Frog 
  • DeepCrawl.
  • Copyscape.
  • Integrity for Mac (or Xenu Sleuth for PC users).
  • Google Analytics  (if given access).
  • Google Search Console  (if given access).
  • Webmaster Tools from Bing (if given access).

Step 1: Add your client’s website to DeepCrawl and Screaming Frog.

Tools:

  • DeepCrawl.
  • Copyscape.
  • Screaming Frog.
  • Integrity.
  • Google Analytics.
  • Google Tag Manager.
  • Google Analytics code.

When Using DeepCrawl, What to Look for

I begin by adding my client’s website to DeepCrawl. The crawl may take a day or two to complete, depending on the size of your client’s site.

Here are the things I look for when I get my DeepCrawl results:

Content Duplication

To find duplicate content, look at the “Duplicate Pages” report.

If duplicate content is discovered, I’ll make rewriting these pages a major priority in my recommendations to the client, and in the meantime, I’ll tag the duplicate pages with the <meta name="robots" content="noindex, nofollow"> tag.

The following are some of the most common duplicate content problems you’ll come across:

  • Duplicate meta titles and meta descriptions.
  • Duplicate body content from tag pages (I’ll use Copyscape to help determine if something is being plagiarized).
  • Duplicate content across two domains.
  • Duplicate content across subdomains.
  • Similar content on a different domain.
  • Improperly implemented pagination pages (see below.)

What to do about it:

  • By adding the canonical tag to your pages, you are informing Google what you intend your preferred URL to be.
  • In the robots.txt file, disallow invalid URLs.
  • Rewrite the content (including body copy and metadata).
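To spot the first problem at scale, here’s a minimal Python sketch that flags duplicate meta titles and descriptions in a crawl export. The data shape (URL, title, description tuples) is hypothetical; in practice you’d load it from a Screaming Frog or DeepCrawl CSV export.

```python
from collections import defaultdict

def find_duplicate_meta(pages):
    """Group URLs that share the same meta title or meta description.

    `pages` is a list of (url, title, description) tuples, e.g. taken
    from a crawl export (hypothetical data shape).
    """
    by_title = defaultdict(list)
    by_desc = defaultdict(list)
    for url, title, desc in pages:
        by_title[title.strip().lower()].append(url)
        by_desc[desc.strip().lower()].append(url)
    # Keep only values shared by more than one URL.
    return {
        "titles": {t: urls for t, urls in by_title.items() if len(urls) > 1},
        "descriptions": {d: urls for d, urls in by_desc.items() if len(urls) > 1},
    }

pages = [
    ("/shoes", "Buy Shoes", "Shop our shoes."),
    ("/shoes?color=red", "Buy Shoes", "Shop our shoes."),
    ("/hats", "Buy Hats", "Shop our hats."),
]
print(find_duplicate_meta(pages)["titles"])
```

Any URL group that comes back here is a candidate for a rewrite or a canonical tag.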

Here’s an example of a duplicate content problem I encountered with a client: they had URL parameters with no canonical tag in place.

Here is what I did to fix the problem:

  • Fixed any 301 redirect issues.
  • Added a canonical tag to the page I want Google to crawl.
  • Removed, in Google Search Console, all parameters that do not produce unique content.
  • Added disallow rules to the robots.txt for the invalid URLs to improve crawl budget.


Pagination

There are two reports worth looking at:

  • First Pages: Review the “First Pages” report to see which pages use pagination, then manually inspect those pages on the site to see if pagination is correctly implemented.
  • Unlinked Pagination Pages: This report will tell you whether the rel=”next” and rel=”prev” tags are correctly linking to the previous and next pages, i.e., whether pagination is operating properly. In the following example, DeepCrawl helped me discover that a client had reciprocal pagination tags:

What to do about it:

  • Add the rel=”canonical” tag to any “view all” or “load more” pages. Crutchfield provides an example:
  • Add the standard rel=”next” and rel=”prev” markup if your content is split across distinct pages. Consider this Macy’s example:
  • If you’re using endless scrolling, make sure your javascript includes the appropriate paginated page URL. Take a look at this example from American Eagle.
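The rel=”next”/rel=”prev” markup described above can also be generated programmatically. Here’s a hypothetical Python helper that produces the link tags for page N of a series; the `?page=N` URL scheme is an assumption, and page 1 is treated as the parameter-free series URL.

```python
def pagination_tags(base_url, page, last_page):
    """Return the rel="prev"/rel="next" <link> tags for a paginated page.

    base_url is the series URL without a page parameter (assumes a
    hypothetical ?page=N URL scheme); page is 1-based.
    """
    tags = []
    if page > 1:
        # Page 1 is the clean series URL, so page 2 points back to it.
        prev_url = base_url if page == 2 else f"{base_url}?page={page - 1}"
        tags.append(f'<link rel="prev" href="{prev_url}">')
    if page < last_page:
        tags.append(f'<link rel="next" href="{base_url}?page={page + 1}">')
    return tags

print(pagination_tags("https://example.com/blog", 2, 5))
```

The first page emits only a rel=”next” tag and the last page only a rel=”prev” tag, which is exactly what you should see when you spot-check a client’s paginated series.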

Maximum Number of Redirections

To see all the pages that redirect more than four times, look at the “Max Redirections” report.

In 2015, John Mueller stated that if there are more than five redirects, Google will stop following them.

Crawl errors are sometimes referred to as “crawl budget eaters,” but Gary Illyes refers to them as “host load eaters.”

Because you want your host load to be used efficiently, it’s critical that your pages render properly.

Here’s a quick rundown of the answer codes you can encounter:

  • 301 – These make up the majority of the codes you’ll see during your research. 301 redirects are okay as long as there is only one redirect and no redirect loop.
  • 302 – These codes are okay, but if left longer than 3 months or so, I would manually change them to 301s so that they are permanent. This is an error code I’ll see often with ecommerce sites when a product is out of stock.
  • 400 – Users can’t get to the page.
  • 403 – Users are unauthorized to access the page.
  • 404 – The page is not found (usually meaning the client deleted a page without a 301 redirect).
  • 500 – Internal server error that you’ll need to connect with the web development team to determine the cause.

What to do about it:

  • Remove any internal links to old 404 pages and replace them with the internal link to the redirected page.
  • Undo redirect chains by removing the middle redirects. For example, if redirect A points to B, B to C, and C to D, you’ll want to undo redirects B and C. The end result will be a single redirect from A to D.
  • If you’re using Screaming Frog and Google Search Console, there are also ways to find redirect chains in each, shown below.
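The chain-flattening step above can be expressed in a few lines of Python. This is a sketch: it assumes you’ve exported your redirect map as source → destination pairs (e.g., from a Screaming Frog redirect chains report), and it collapses every chain so each source points directly at its final target.

```python
def flatten_redirects(redirects):
    """Collapse redirect chains so every source points at its final target.

    `redirects` maps source URL -> destination URL, e.g. {"/a": "/b"}.
    Raises ValueError on a redirect loop.
    """
    flat = {}
    for src in redirects:
        seen = {src}
        dest = redirects[src]
        # Follow the chain until we reach a URL that redirects nowhere.
        while dest in redirects:
            if dest in seen:
                raise ValueError(f"redirect loop involving {dest}")
            seen.add(dest)
            dest = redirects[dest]
        flat[src] = dest
    return flat

chain = {"/a": "/b", "/b": "/c", "/c": "/d"}
print(flatten_redirects(chain))
```

The output map is what you’d hand to the dev team: every old URL gets exactly one 301, straight to the final destination.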

When Using Screaming Frog, What to Look for

When I get a new client site, the second thing I do is enter their URL into Screaming Frog.

Depending on the size of the site, I may adjust the settings to crawl specific portions of the client’s site at a time.


My Screaming Frog spider setups are as follows:

You can accomplish this by omitting portions of the site or changing your crawler settings.

Here are the things I look for when I get my Screaming Frog results:

Google Analytics Code

Screaming Frog can help you determine which pages are missing the Google Analytics code (e.g., UA-1234568-9). To find the pages with missing Google Analytics code, follow these steps:

  • Go to Configuration in the navigation bar, then Custom.
  • Put analytics\.js in Filter 1, then change the drop-down to Does Not Contain.

What to do about it:

  • Contact your client’s developers and request that the code be added to the pages where it is currently missing.
  • Skip through to the Google Analytics section below for further information about Google Analytics.
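If you’d rather spot-check a page’s source directly, the same missing-code check can be done with a small regular expression. This is a hedged sketch: the HTML snippets are hypothetical, and a real check would fetch each URL from your crawl list.

```python
import re

# Universal Analytics property IDs follow the UA-XXXXXXX-N pattern.
UA_PATTERN = re.compile(r"UA-\d{4,10}-\d{1,4}")

def find_tracking_id(html):
    """Return the Universal Analytics ID found in the page HTML, or None."""
    match = UA_PATTERN.search(html)
    return match.group(0) if match else None

page_with_code = "<script>ga('create', 'UA-1234568-9', 'auto');</script>"
page_without = "<html><body>No analytics here</body></html>"
print(find_tracking_id(page_with_code))
print(find_tracking_id(page_without))
```

Pages that return None are the ones to flag for the client’s developers.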

Google Tag Manager

You can use similar steps in Screaming Frog to determine which pages are missing the Google Tag Manager snippet:

  • In the navigation bar, go to the Configuration menu, then Custom.
  • Put the Google Tag Manager iframe snippet source (<iframe src=”//www.googletagmanager.com/) in Filter 1, then change the drop-down to Does Not Contain.

What to do about it:

  • Go to Google Tag Manager to check for any issues and make any necessary changes.
  • Share the code with your client’s developers to see if they can re-implement it.


Schema Markup

You should also check whether your client’s website employs schema markup. Structured data, also known as schema, helps search engines determine what a page on a website is about.

Follow these procedures to check for schema markup in Screaming Frog:

  • In the navigation bar, go to the Configuration menu, then Custom.
  • With Contains selected in the Filter, add itemtype=”


Indexing

If you want to see how many of your client’s pages are indexable, follow these steps in Screaming Frog:

  • After the crawl has finished in Screaming Frog, go to Directives > Filter > Index to check which pages are set to be indexed.

What to do about it:

  • If the site is new, it’s possible that Google hasn’t yet indexed it.
  • Make sure you’re not blocking anything you want Google to crawl in the robots.txt file.
  • Make sure your client’s sitemap has been submitted to Google Search Console and Bing Webmaster Tools.
  • Perform manual research (seen below).


Flash

Due to poor website load times, Google announced in 2016 that Chrome would begin blocking Flash.

So, if you’re doing an audit, you’ll want to determine whether or not your new client is utilizing Flash.

To do so in Screaming Frog, follow these steps:

  • In the menu, go to Spider Configuration.
  • Make sure Check SWF is selected.
  • After the crawl is complete, filter the Internal tab by Flash.

How to fix:

  • Embed videos from YouTube (Google bought YouTube in 2006; it’s a no-brainer here).
  • Or, opt for HTML5 standards when adding a video.

Here’s an example of HTML5 code for adding a video:

<video controls width="320" height="240">
  <source src="/tutorials/media/Anna-Teaches-SEO-To-Small-Businesses.mp4" type="video/mp4">
  <source src="/tutorials/media/Anna-Teaches-SEO-To-Small-Businesses.ogg" type="video/ogg">
  Your browser does not support the video tag.
</video>


JavaScript

According to Google’s announcement in 2015, JavaScript is okay to use for your website as long as you’re not blocking anything in your robots.txt (we’ll dig into this deeper in a bit!). But you still want to take a peek at how the JavaScript is being delivered to your site.

How to fix:

  • Review Javascript to make sure it’s not being blocked by robots.txt
  • Make sure JavaScript is rendered on the server (this produces plain-text HTML rather than dynamically generated content).
  • If you’re running Angular JavaScript, check out this article by Ben Oren on why it might be killing your SEO efforts.
  • In Screaming Frog, go to the Spider Configuration in the navigation bar and click Check JavaScript. After the crawl is done, filter your results on the Internal tab by JavaScript.


Robots.txt

When you’re reviewing a robots.txt for the first time, you want to look to see if anything important is being blocked or disallowed.

For example, if you see this code:

User-agent: *

Disallow: /

Your client’s website is blocked from all web crawlers.

But if you have something like Zappos’ robots.txt file, you should be good to go.

# Global robots.txt as of 2012-06-19

User-agent: *
Disallow: /bin/
Disallow: /multiview/
Disallow: /product/review/add/
Disallow: /cart
Disallow: /login
Disallow: /logout
Disallow: /register
Disallow: /account

They are only blocking what they do not want web crawlers to locate. This content that is being blocked is not relevant or useful to the web crawler.

How to fix:

  • Your robots.txt is case-sensitive so update this to be all lowercase.
  • Remove any pages listed as Disallow that you want the search engines to crawl.
  • Screaming Frog by default will not be able to load any URLs disallowed by robots.txt. If you choose to switch up the default settings in Screaming Frog, it will ignore all the robots.txt.
  • You can also view blocked pages in Screaming Frog under the Response Codes tab, then filtered by Blocked by Robots.txt filter after you’ve completed your crawl.
  • If you have a site with multiple subdomains, you should have a separate robots.txt for each.
  • Make sure the sitemap is listed in the robots.txt.
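Python’s standard library can parse robots.txt rules the same way crawlers do, which makes it easy to spot-check that important pages are crawlable and junk pages are not. The rules below are modeled on the Zappos example above; the URLs being tested are hypothetical.

```python
from urllib.robotparser import RobotFileParser

# Rules modeled on the Zappos-style robots.txt shown above.
rules = """
User-agent: *
Disallow: /bin/
Disallow: /cart
Disallow: /login
""".splitlines()

parser = RobotFileParser()
parser.parse(rules)

# Spot-check: important pages should be crawlable, junk pages should not.
print(parser.can_fetch("*", "/product/12345"))
print(parser.can_fetch("*", "/cart"))
```

Running a list of your client’s money pages through `can_fetch` is a fast way to confirm nothing important is accidentally disallowed.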

Crawl Errors

I use DeepCrawl, Screaming Frog, and Google and Bing webmaster tools to find and cross-check my client’s crawl errors.

To find your crawl errors in Screaming Frog, follow these steps:

  • After the crawl is complete, go to Bulk Reports.
  • Scroll down to Response Codes, then export the server-side error report and the client error report.

How to fix:

  • For the client error reports, you should be able to 301 redirect the majority of the 404 errors in the backend of the site yourself.
  • For the server error reports, collaborate with the development team to determine the cause. Before fixing these errors in the root directory, be sure to back up the site. You may simply need to create a new .htaccess file or increase the PHP memory limit.
  • You’ll also want to remove any of these permanent redirects from the sitemap and any internal or external links.
  • You can also use the 404 page’s URL to help track errors in Google Analytics.

Redirect Chains

Redirect chains not only cause a poor user experience; they also slow down page speed, hurt conversion rates, and squander any link equity you may have earned.

Fixing redirect chains is a quick win for any company.

How to fix:

  • In Screaming Frog, after you’ve completed your crawl, go to Reports > Redirect Chains to view the crawl path of your redirects. In an Excel spreadsheet, you can track to make sure your 301 redirects remain 301 redirects. If you see a 404 error, you’ll want to clean this up.

Internal & External Links

When a user clicks on a link to your site and gets a 404 error, it’s not a good user experience.

And it doesn’t make search engines like you any better, either.

To find my broken internal and external links I use Integrity for Mac. You can also use Xenu Sleuth if you’re a PC user.

I’ll also show you how to find these internal and external links in Screaming Frog and DeepCrawl if you’re using that software.

How to fix:

  • If you’re using Integrity or Xenu Sleuth, run your client’s site URL and you’ll get a full list of broken URLs. You can either manually update these yourself or if you’re working with a dev team, ask them for help.
  • If you’re using Screaming Frog, after the crawl is completed, go to Bulk Export in the navigation bar, then All Outlinks. You can sort by URLs and see which pages are sending a 404 signal. Repeat the same step with All Inlinks.
  • If you’re using DeepCrawl, go to the Unique Broken Links tab under the Internal Links section.


URLs

Every time you take on a new client, you want to review their URL format. What am I looking for in the URLs?

  • Parameters – If the URL has weird characters like ?, =, or +, it’s a dynamic URL that can cause duplicate content if not optimized.
  • User-friendly – I like to keep the URLs short and simple while also removing any extra slashes.

How to fix:

  • You can search for parameter URLs in Google by doing inurl: “?” or whatever you think the parameter might include.
  • After you’ve run the crawl on Screaming Frog, take a look at URLs. If you see parameters listed that are creating duplicates of your content, you need to suggest the following:
    • Add a canonical tag to the main URL page. For example, if the main page also resolves at a parameter version of the URL, the canonical tag on the parameter version needs to point back to the main page.
    • Update your parameters in Google Search Console under Crawl > URL Parameters.
  • Disallow the duplicate URLs in the robots.txt.
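Finding parameter URLs and working out what their canonical target should be is mechanical enough to script. Here’s a sketch using only Python’s standard library; the example URLs are hypothetical.

```python
from urllib.parse import urlsplit, urlunsplit

def canonical_for(url):
    """If the URL carries query parameters, return the parameter-free
    version to use as its rel="canonical" target; otherwise return None."""
    parts = urlsplit(url)
    if not parts.query:
        return None  # already clean, no canonical needed
    return urlunsplit((parts.scheme, parts.netloc, parts.path, "", ""))

urls = [
    "https://example.com/shoes",
    "https://example.com/shoes?sessionid=abc123",
    "https://example.com/shoes?sort=price&color=red",
]
for u in urls:
    print(u, "->", canonical_for(u))
```

Note this treats every parameter as non-unique content; parameters that genuinely change the page (e.g., real pagination) should be excluded before applying the canonical.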

Step 2: Review Google Search Console and Bing Webmaster Tools.


  • Google Search Console.
  • Bing Webmaster Tools.
  • Sublime Text (or any text editor tool).

Set a Preferred Domain

Since the Panda update, it’s beneficial to clarify to the search engines the preferred domain. It also helps make sure all your links are giving one site the extra love instead of being spread across two sites.

How to fix:

  • In Google Search Console, click the gear icon in the upper right corner.
  • Choose which of the URLs is the preferred domain.
  • You don’t need to set the preferred domain in Bing Webmaster Tools, just submit your sitemap to help Bing determine your preferred domain.


Backlinks

With the announcement that Penguin runs in real time, it’s vital that your client’s backlinks meet Google’s standards.

If you notice a large chunk of backlinks coming to your client’s site from one page on a website, you’ll want to take the necessary steps to clean it up, and FAST!

How to fix:

  • In Google Search Console, go to Links > then sort your Top linking sites.
  • Contact the companies that are linking to you from one page to have them remove the links.
  • Or, add them to your disavow list. When adding companies to your disavow list, be very careful how and why you do this. You don’t want to remove valuable links.

My disavow file is a plain text list of the domains and URLs I’m asking Google to ignore.
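The disavow file format is simple: one entry per line, a `domain:` prefix to disavow an entire domain, full URLs for single pages, and `#` for comments. A sketch with hypothetical domains:

```
# Spammy directory linking sitewide (hypothetical examples)
domain:spammy-directory.example
domain:paid-link-farm.example
# A single bad page rather than the whole domain
http://bad-blog.example/paid-links.html
```

Upload the file through Google’s disavow links tool; again, be conservative, since disavowing a valuable link hurts rather than helps.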


Keywords

As an SEO consultant, it’s my job to start learning the market landscape of my client. I need to know who their target audience is, what they are searching for, and how they are searching. To start, I take a look at the keyword search terms they are already getting traffic from.

  • In Google Search Console, Search Traffic > Search Analytics will show you which keywords are already sending your client clicks.


Sitemap

Sitemaps are essential to get search engines to crawl your client’s website; a sitemap speaks their language. When creating sitemaps, there are a few things to know:

  • Do not include parameter URLs in your sitemap.
  • Do not include any non-indexable pages.
  • If the site has different subdomains for mobile and desktop, add the rel=”alternate” tag to the sitemap.
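The first two rules can be enforced programmatically. Here’s a Python sketch that strips parameter URLs out of a sitemap using only the standard library; the sitemap content is hypothetical, and a real script would also filter non-indexable pages from your crawl data.

```python
import xml.etree.ElementTree as ET
from urllib.parse import urlsplit

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def clean_sitemap(xml_text):
    """Drop parameter URLs from a sitemap and return the surviving locs."""
    root = ET.fromstring(xml_text)
    for url_el in list(root):
        loc = url_el.find(f"{{{NS}}}loc").text
        if urlsplit(loc).query:  # parameter URL: remove it from the sitemap
            root.remove(url_el)
    return [el.find(f"{{{NS}}}loc").text for el in root]

sitemap = f"""<urlset xmlns="{NS}">
  <url><loc>https://example.com/</loc></url>
  <url><loc>https://example.com/page?utm_source=news</loc></url>
</urlset>"""
print(clean_sitemap(sitemap))
```

Running the client’s live sitemap through a filter like this before submission keeps Google focused on the URLs you actually want indexed.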

How to fix:

  • Go to Google Search Console > Index > Sitemaps to compare the URLs indexed in the sitemap to the URLs in the web index.
  • Then, do a manual search to determine which pages are not getting indexed and why.
  • If you find old redirected URLs in your client’s sitemap, remove them. These old redirects will have an adverse impact on your SEO if you don’t remove them.
  • If the client is new, submit a new sitemap for them in both Bing Webmaster Tools and Google Search Console.


Crawl

Crawl errors are important to check because they’re not only bad for the user, they’re bad for your website rankings. And John Mueller has stated that a low crawl rate may be a sign of a low-quality site.

To check this in Google Search Console, go to Coverage > Details.


To check this in Bing Webmaster Tools, go to Reports & Data > Crawl Information.


How to fix:

  • Manually check your crawl errors to determine whether they are coming from old products that no longer exist, or whether they are URLs that should be disallowed in the robots.txt file.
  • Once you’ve determined where they are coming from, you can implement 301 redirects to similar pages that link to the dead pages.
  • You’ll also want to cross-check the crawl stats in Google Search Console with average load time in Google Analytics to see if there is a correlation between time spent downloading and the pages crawled per day.

Structured Data

As mentioned above in the schema section of Screaming Frog, you can review your client’s schema markup in Google Search Console.

Use the individual rich results status report in Google Search Console. (Note: The structured data report is no longer available).

This will help you determine what pages have structured data errors that you’ll need to fix down the road.

How to fix:

  • Google Search Console will tell you what is missing in the schema when you test the live version.
  • Based on your error codes, rewrite the schema in a text editor and send it to the web development team to update. I use Sublime Text for my text editing; any plain text editor will work.

Step 3: Review Google Analytics


  • Google Analytics.
  • Google Tag Manager Assistant Chrome Extension.
  • Annie Cushing’s Campaign Tagging Guide.


Views

When I first get a new client, I set up 3 different views in Google Analytics.

  • Reporting view.
  • Master view.
  • Test view.

These different views give me the flexibility to make changes without affecting the data.

How to fix:

  • In Google Analytics, go to Admin > View > View Settings to create the three different views above.
  • Make sure to check the Bot Filtering section to exclude all hits from bots and spiders.
  • Link Google Ads and Google Search Console.
  • Lastly, make sure the Site search Tracking is turned on.


Filtering

You want to make sure you add your IP address and your client’s IP address to the filters in Google Analytics so you don’t get any false traffic.

How to fix:

  • Go to Admin> View > Filters
  • Then set the filter to Exclude > traffic from the IP addresses > that are equal to, and enter the IP addresses to exclude.

Tracking Code

You can manually check the source code, or you can use my Screaming Frog technique from above.

If the code is there, you’ll want to verify that it’s firing in real time.

  • To check this, go to your client’s website and click around a bit on the site.
  • Then go to Google Analytics > Real-Time > Locations, your location should populate.
  • If you’re using Google Tag Manager, you can also check this with the Google Tag Assistant Chrome extension.

How to fix:

  • If the code isn’t firing, you’ll want to check the code snippet to make sure it’s the correct one. If you’re managing multiple sites, you may have added a different site’s code.
  • Use a text editor, not a word processor, to copy the snippet onto the website; a word processor can add extra characters or whitespace.
  • The functions are case-sensitive, so check to make sure everything in the code is lowercase.


Indexing

If you had a chance to play around in Google Search Console, you probably noticed the Coverage section.

When I’m auditing a client, I’ll review their indexing in Google Search Console compared to Google Analytics. Here’s how:

  • In Google Search Console, go to Coverage
  • In Google Analytics, go to Acquisition > Channels > Organic Search > Landing Page.
  • Once you’re here, go to Advanced > Site Usage > Sessions > 9.

How to fix:

  • Compare the numbers from Google Search Console with the numbers from Google Analytics. If the numbers are widely different, then you know that even though the pages are getting indexed, only a fraction are getting organic traffic.

Campaign Tagging

The last thing you’ll want to check in Google Analytics is whether your client is using campaign tagging correctly. You don’t want to lose credit for the work you’re doing because you forgot about campaign tagging.

How to fix:

  • Set up a consistent campaign tagging scheme (Annie Cushing’s Campaign Tagging Guide, listed in the tools above, is a good starting point) so every campaign URL uses the same UTM parameters.
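Campaign tagging itself is just appending UTM parameters to a URL, which is easy to standardize with a small helper. A sketch using Python’s standard library; the landing page URL and campaign names are hypothetical.

```python
from urllib.parse import urlencode

def tag_url(base_url, source, medium, campaign):
    """Append the standard UTM parameters to a landing page URL."""
    params = {
        "utm_source": source,
        "utm_medium": medium,
        "utm_campaign": campaign,
    }
    # Use & if the URL already has a query string, ? otherwise.
    sep = "&" if "?" in base_url else "?"
    return base_url + sep + urlencode(params)

print(tag_url("https://example.com/sale", "newsletter", "email", "spring-sale"))
```

Generating every campaign URL through one helper like this keeps naming consistent, which is what makes the campaign reports in Google Analytics usable.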

Keywords

You can use Google Analytics to gain insight into potential keyword gems for your client. To find keywords in Google Analytics, follow these steps:


Go to Google Analytics > Behavior > Site Search > Search Terms. This will give you a view of what customers are searching for on the website.

Next, I’ll use those search terms to create a New Segment in Google Analytics to see what pages on the site are already ranking for that particular keyword term.


Step 4: Manual Check


  • Google Analytics.
  • Access to client’s server and host.
  • You Get Signal.
  • Pingdom.
  • PageSpeed Tools.
  • Wayback Machine.

One Version of Your Client’s Site is Searchable

Check all the different ways someone could type in or link to your website; for example, the http vs. https and www vs. non-www versions of the domain.


As Highlander would say, “there can be only one” website that is searchable.

How to fix: Use a 301 redirect for all URLs that are not the primary site to the canonical site.


Indexing

Conduct a manual search in Google and Bing to determine how many pages are being indexed. This number won’t always match your Google Analytics and Google Search Console data, but it should give you a rough estimate.

To check, do the following:

  • Perform a site: search in the search engines.
  • When you search, manually scan to make sure only your client’s brand is appearing.
  • Check whether the homepage is appearing on the first page. (John Mueller has said it isn’t strictly necessary for the homepage to appear as the first result.)

How to fix:

  • If another brand is appearing in the search results, you have a bigger issue on your hands. You’ll want to dive into the analytics to diagnose the problem.
  • If the homepage isn’t appearing as the first result, perform a manual check of the website to see what it’s missing. This could also mean the site has a penalty or poor site architecture which is a bigger site redesign issue.
  • Cross-check the number of organic landing pages in Google Analytics to see if it matches the number of search results you saw in the search engine. This can help you determine what pages the search engines see as valuable.


Caching

I’ll run a quick check to see if the top pages are being cached by Google. Google uses these cached pages to connect your content with search queries.


To check whether Google is caching your client’s pages, search cache: followed by the URL (e.g., cache:example.com).

Make sure to toggle over to the Text-only version.

You can also check this in Wayback Machine.

How to fix:

  • Check the client’s server to see if it’s down or operating slower than usual. There might be an internal server error or a database connection failure. This can happen if multiple users are attempting to access the server at once.
  • Check to see who else is on your server with a reverse IP address check; you can use the You Get Signal website for this. You may need to upgrade your client’s server or start using a CDN if sketchy domains are sharing the server.
  • Check to see if the client is removing specific pages from the site.


Hosting

While this may get a little technical for some, it’s vital to your SEO success to check the hosting software associated with your client’s website. Hosting can harm SEO, and all your hard work will be for nothing.

You’ll need access to your client’s server to manually check any issues. The most common hosting issues I see are having the wrong TLD and slow site speed.

How to fix:

  • If your client has the wrong TLD, you need to make sure the country IP address is associated with the country your client is operating in the most. If your client has a .co domain and also a .com domain, then you’ll want to redirect the .co to your client’s primary domain on the .com.
  • If your client has slow site speed, you’ll want to address this quickly because site speed is a ranking factor. Find out what is making the site slow with tools like PageSpeed Tools and Pingdom. Here’s a look at some of the common page speed issues:
    • Host.
    • Large images.
    • Embedded videos.
    • Plugins.
    • Ads.
    • Theme.
    • Widgets.
    • Repetitive script or dense code.

Core Web Vitals Audit

Core Web Vitals is a collection of three metrics that represent a website’s user experience. They are important because Google is updating its algorithms in the spring of 2021 to incorporate Core Web Vitals as a ranking factor.

Although the ranking factor is expected to be a small factor, it’s still important to audit the Core Web Vitals scores and identify areas for improvement.

Why Is It Important to Include Core Web Vitals in Your Audit?

Improving Core Web Vitals scores will not only help search ranking; perhaps more importantly, it may pay off with more conversions and earnings.

Improvements to speed and page performance are associated with higher sales, traffic, and ad clicks.

Upgrading the web hosting and installing a new plugin may improve page speed but will have little (if any) effect on Core Web Vitals.

The measurement is done at the point where someone is literally downloading your site on their mobile phone.

That means the bottleneck is at their Internet connection and the mobile device. A fast server will not speed up a slow Internet connection on a budget mobile phone.

Similarly, because many of the solutions involve changing the code in a template or the core files of the content management system itself, a page speed plugin will be of very little use.

There are many resources to help understand solutions. But most solutions require the assistance of a developer who feels comfortable updating and changing core files in your content management system.

Fixing Core Web Vitals issues can be difficult. WordPress, Drupal, and other content management systems (CMS) were not built to score well for Core Web Vitals.

It is important to note that the process for improving Core Web Vitals involves changing the coding at the core of WordPress and other CMS.

Essentially, improving Core Web Vitals requires making a website do something that it was never intended to do when the developers created a theme or CMS.

The purpose of a Core Web Vitals audit is to identify what needs fixing and to hand that information over to a developer who can then make the necessary changes.

What Are Core Web Vitals?

Core Web Vitals consist of three metrics that collectively identify how fast the most important part of your page loads, how quickly a user can interact with the page (for example, click a button), and how long it takes for the web page to become stable without page elements shifting around.

They are:

  • Largest Contentful Paint (LCP).
  • First Input Delay (FID).
  • Cumulative Layout Shift (CLS).
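Each of the three metrics has published “good / needs improvement / poor” thresholds (for example, an LCP of 2.5 seconds or less is considered good). A minimal sketch of how an audit script might classify a measured value against those thresholds:

```python
# Google's published Core Web Vitals thresholds.
# LCP and FID are in milliseconds; CLS is a unitless score.
# Each tuple is (upper bound for "good", upper bound for "needs improvement").
THRESHOLDS = {
    "LCP": (2500, 4000),   # good <= 2500 ms, poor > 4000 ms
    "FID": (100, 300),     # good <= 100 ms,  poor > 300 ms
    "CLS": (0.1, 0.25),    # good <= 0.1,     poor > 0.25
}

def classify(metric: str, value: float) -> str:
    """Return 'good', 'needs improvement', or 'poor' for a metric value."""
    good, poor = THRESHOLDS[metric]
    if value <= good:
        return "good"
    if value <= poor:
        return "needs improvement"
    return "poor"
```

For example, `classify("LCP", 2000)` returns `"good"`, while `classify("CLS", 0.3)` returns `"poor"`.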

There are two kinds of scores for the Core Web Vitals:

  • Lab data.
  • Field data.

Lab Data

Lab data is generated when you run a page through Google Lighthouse or PageSpeed Insights.

Lab data consists of scores generated through a simulated device and Internet connection. The purpose is to give the person working on the site an idea of what parts of the Core Web Vitals need improvement.

The value of a tool like PageSpeed Insights is that it identifies specific code and page elements that are causing a page to score poorly.
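Lab data can also be pulled programmatically. The sketch below uses the public PageSpeed Insights v5 API; the API key is a placeholder you would replace with your own, and the parsing helper assumes the standard Lighthouse response shape (note that lab data has no real FID measurement, so Total Blocking Time is commonly used as a proxy):

```python
import json
import urllib.parse
import urllib.request

# Public PageSpeed Insights v5 API endpoint.
PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

def fetch_lab_report(url: str, api_key: str) -> dict:
    """Run a mobile Lighthouse audit for `url` and return the raw JSON response."""
    query = urllib.parse.urlencode(
        {"url": url, "strategy": "mobile", "key": api_key}
    )
    with urllib.request.urlopen(f"{PSI_ENDPOINT}?{query}") as resp:
        return json.load(resp)

def extract_lab_metrics(report: dict) -> dict:
    """Pull the lab Core Web Vitals out of a PSI response.

    Lab data has no field FID, so Total Blocking Time (TBT)
    is reported as the usual interactivity proxy.
    """
    audits = report["lighthouseResult"]["audits"]
    return {
        "LCP_ms": audits["largest-contentful-paint"]["numericValue"],
        "CLS": audits["cumulative-layout-shift"]["numericValue"],
        "TBT_ms": audits["total-blocking-time"]["numericValue"],
    }
```

The same response also contains the per-audit “opportunities” (unused CSS, offscreen images, and so on) that PageSpeed Insights surfaces in its web interface.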

Field Data

Field data are actual Core Web Vitals scores collected by the Google Chrome browser for the Chrome User Experience Report (also known as CrUX).

The field data is available in Google Search Console under the Enhancements section, via the link labeled Core Web Vitals.


The field data reported in Google Search Console comes from pages that have received a minimum number of visits and measurements. If Google doesn’t collect enough scores, Search Console will not report the metric.

Screaming Frog for Core Web Vitals Audit

Screaming Frog version 14.2 and later can display a pass or fail Core Web Vitals assessment. To enable this, you need to connect Screaming Frog to the PageSpeed Insights API with an API key.

To register your PageSpeed Insights API key with Screaming Frog, navigate to Configuration > API Access > PageSpeed Insights.

There, you will see a place to enter your API key and connect it to the service.

In the same PageSpeed Insights popup, you can also select the Metrics tab and tick off the boxes indicating what metrics you’d like to have reported.

Be sure to select Mobile for the device as that’s the metric that matters for ranking purposes.


If you select the Opportunities tab, then after the crawl Screaming Frog will show you a list of different kinds of improvements (such as defer offscreen images, remove unused CSS, etc.).


Note Before Crawling

There is generally no need to crawl an entire site and produce an exhaustive page-by-page accounting of what’s wrong with every single page of the website.

Before crawling, consider selecting a representative set of pages instead. Choose a group of pages that represents the types of pages common to each section or category of the website, then load the URLs into Screaming Frog via the Upload tab (from a spreadsheet, a text file, or by pasting them in manually).

Most sites contain pages and posts built with similar page structure and content. For example, all the pages in a “news” category will be fairly similar, and pages in a “reviews” category will likewise resemble each other.

Because of those similarities, the issues discovered will be similar too. Crawling a handful of representative pages from each category is usually enough to identify the issues specific to each section, as well as problems common to all pages sitewide, and it saves a great deal of crawl time.

The kinds of things being fixed are typically sitewide issues, like unused CSS loaded on every page or Cumulative Layout Shift caused by an ad unit in the left-hand area of the web pages.

Because modern websites are templated, the fixes will happen at the template level or with custom coding in the stylesheet, etc.
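The sampling step above can be sketched in a few lines. This is a minimal example (the per-section grouping by first URL path segment is an assumption about the site’s structure) that produces a short list suitable for Screaming Frog’s list-mode Upload tab:

```python
from collections import defaultdict
from urllib.parse import urlparse

def representative_sample(urls, per_section=3):
    """Group URLs by their first path segment (the "section")
    and keep up to `per_section` URLs from each group --
    a small, representative list for a list-mode crawl."""
    groups = defaultdict(list)
    for url in urls:
        path = urlparse(url).path.strip("/")
        section = path.split("/")[0] if path else "(root)"
        if len(groups[section]) < per_section:
            groups[section].append(url)
    # Flatten, keeping sections in a stable order.
    return [u for section in sorted(groups) for u in groups[section]]
```

Feeding this a full sitemap export and uploading the result keeps the crawl small while still covering every template on the site.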

Crawl the Site With Screaming Frog

Once the URLs are fully crawled, you can click on the PageSpeed tab and read all the recommendations and view the pass/fail notations for the various metrics.

Zoom In on URL Opportunities

A useful feature in the Screaming Frog Core Web Vitals Audit is the ability to select a URL from the list of URLs in the top pane and then see the opportunities for improvement in the bottom pane of the Screaming Frog display screen.


Below is a screenshot of the bottom screen, with an opportunity selected and the details of that improvement opportunity in the right-hand pane.


Official Google Tool

Google has also published its own tool that can provide a Core Web Vitals audit.


Insert a URL for an overview of the page’s performance. If you’re signed in, Google will track the page for you over time. Clicking the View Report link opens a new page detailing what is wrong, with links to guides that show how to fix each problem.

Image Credits

Featured Image: Paulo Bobita
All screenshots taken by author

Article Credits

Search Engine Journal

