Updated: April 16, 2024.

This guide shows you how to perform the kind of in-depth SEO audit that usually only experienced SEO experts can do.

SEO audit

This is a complete SEO audit checklist that will let you perform an in-depth SEO audit of any website. I personally use this checklist every time I do an SEO audit. And I invite you to use it as well! 

This checklist probably contains each and every technical SEO element you can think of. In response to your feedback, I also added a few less technical (but also vitally important) elements, such as E-A-T.

Check my SEO auditing services.

❓Looking to hire someone to audit your website or do SEO? Make sure to check the SEO services I offer, including SEO consultations and monthly SEO services.
👉 Contact me if you have any questions, or learn more about why you may want to hire me as your SEO consultant.

SEO Audits, Site Audit & SEO Auditing Purpose

Let’s start by answering these 5 starter questions so that you can establish the purpose of the SEO audit you are about to do. 

Here are a few questions to ask yourself: 

  • Do you want to find the reasons why the website has lost its traffic and visibility in search? 
  • Do you want to get an overview of technical issues that may hinder the website? 
  • Do you want to detect the most serious technical errors and suggest some quick fixes? 
  • Do you want to perform an in-depth technical SEO audit that analyzes all the more and less important technical SEO elements? 
  • Do you want to audit your own website? 

It’s great to know what you want to achieve with the audit before you get started. After all, you cannot hit a target you cannot see.

Once you know your purpose, it’s time to start the geeky part. 

⚡ Check my article with 100 top SEO mistakes I identified in the last 100 SEO audits I performed.

⚡ Check my guide on how to do a Google page experience audit.

And do not miss my Core Web Vitals audit.

Gathering basic information for your SEO audit

In these steps, you will be gathering basic information about the site you are about to audit.

How long does a (technical) SEO audit take?

Here is my video where I explain my thoughts on how long a technical SEO audit should take and how long it takes me.

No. 1 Do a manual review of the website

It is a great practice to start your audit by manually reviewing the website and simply taking notes of what draws your attention. This is an awesome pre-audit task that will help you take a different look at the data provided by the tools. Your initial analysis at this stage will let you easily connect the dots later once you have gathered the data from SEO tools.

No. 2: Check the website using Semrush or another SEO tool

The next step after manually reviewing the website and before getting into the details is to simply analyze the website using one of the most popular SEO tools, such as Semrush or Ahrefs.

Checking a domain in Semrush
This is how you overview a domain in Semrush to gather some basic data about it.

Run the website analysis and pay attention to things, such as domain metrics, backlinks, overall visibility, organic traffic, organic keywords, and traffic trends to get a general “feel”. This is an essential step!

No. 3: Check the backlink profile

Even though this is a technical SEO audit (not a link audit), you should still do at least a general backlink profile analysis. If the website is involved in low-quality link building, then you – as an auditor – must know about it because this can have a negative influence on the overall performance and visibility of the website in search.

Backlink audit in Semrush
Semrush has a very nice backlink audit tool that lets you quickly preview the backlink profile of the site audited.

You can do a general link audit using Semrush which will group links and indicate potentially low-quality links. Remember that this is only a tool and your human judgment is essential here.

No. 4: Check the CMS of the website

The chances are the website you are auditing uses a CMS (Content Management System). And there is a good chance it is WordPress (which currently powers almost 40% of websites). 

However, you – as an auditor – must know what CMS the website uses so that you can provide relevant recommendations on how to fix, for example, some of the CMS-specific issues. 

Checking the CMS with the CMS detect tool
You can use CMS Detect to check the CMS of the site audited.

To check the CMS of a website, you can use a tool like CMS Detect.
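If you audit many sites, this check is easy to script. Below is a minimal sketch that fingerprints a CMS from raw homepage HTML; the signature list is illustrative and far from exhaustive, so treat it as a starting point rather than a full detector:

```python
import re

# Illustrative footprints only; real CMS detectors use many more signals.
CMS_SIGNATURES = {
    "WordPress": [r"/wp-content/", r'content="WordPress'],
    "Shopify": [r"cdn\.shopify\.com"],
    "Drupal": [r'content="Drupal'],
    "Joomla": [r'content="Joomla'],
}

def detect_cms(html: str) -> str:
    """Return the first CMS whose footprint appears in the page source."""
    for cms, patterns in CMS_SIGNATURES.items():
        if any(re.search(p, html, re.IGNORECASE) for p in patterns):
            return cms
    return "Unknown"

print(detect_cms('<link href="/wp-content/themes/x/style.css">'))  # WordPress
```

Fetch the homepage source with your HTTP client of choice and pass it in; a check for the generator meta tag plus a few path patterns catches most mainstream CMSes.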

No. 5: Check the hosting provider of the website 

It’s also worth knowing the hosting provider of the website. With this knowledge, you can give detailed and relevant recommendations on things, such as how to adjust specific server settings or how to add an SSL certificate. 

Assuming that you already have some experience, you will instantly know if a given hosting provider is a good fit for this website. 

Checking the hosting provider with the Hosting Checker tool
The Hosting Checker tool lets you check the hosting provider of a site. However, if a site uses a CDN, you will get information about the CDN used instead.

Hosting Checker or any other similar tool will let you check this quickly. 

No. 6: Check if the website is on a shared hosting plan  

The opinions on whether shared hosting can or cannot impact a website’s rankings differ a lot. Anyway, it’s always good to know if the website is using shared hosting and what other websites are on the same IP. To learn that, you need to perform a reverse IP domain check.

Checking what websites are hosted on the same IP address.
The You Get Signal tool will let you quickly check what other websites are hosted on the same IP address.

You can do a reverse domain IP check using You Get Signal or any other similar tool.  
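Resolving the IP is the first half of this check and can be scripted with the standard library; the actual reverse IP lookup (listing co-hosted domains) still needs a service like You Get Signal, since plain DNS cannot enumerate them. A small sketch:

```python
import socket

def resolve_ipv4(domain: str) -> str:
    """Resolve a domain to its IPv4 address. Feed the resulting IP
    to a reverse-IP service to see which other sites share it."""
    return socket.gethostbyname(domain)

# Example (the result depends on the site's DNS records):
# resolve_ipv4("seosly.com")
```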

No. 7: Check the domain history 

You cannot get the whole picture of the website without knowing at least a bit about its history. Make sure to check the age of the domain, its registrant history, and who its current registrant is.

Checking the domain history with the help of WhoISrequest
With the WhoISrequest tool you can quickly check the domain history. This information may be especially useful if you are investigating the reasons for a drop in rankings.

A simple tool like WhoISrequest will let you check the domain history. 

No. 8: Check the Wayback Machine 

In addition to the domain history, it’s also great to know what the website looked like in the past and what type of content it had. This knowledge will let you determine if there was a redesign or some other important change on the website. 

Wayback Machine
The Wayback Machine is an essential tool for SEOs, especially if they are looking for the reasons behind a traffic drop.

You can use the Wayback Machine to see what the website looked like in the past. Also check my website redesign SEO checklist.

No. 9: Check if the site has undergone a major change or redesign recently 

It is good to know if there have been some major changes on the website, especially if you are auditing it to find the reasons why it lost traffic. To check if there were some redesign changes, you can also use the Wayback Machine.
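If you want snapshot dates without clicking through the Wayback Machine UI, the Internet Archive exposes a CDX API for listing captures. Here is a sketch that only builds the query URL (the parameter names follow the public CDX API; double-check them against the current docs before relying on this):

```python
from urllib.parse import urlencode

def wayback_cdx_url(domain: str, year_from: int, year_to: int) -> str:
    """Build a Wayback Machine CDX API query that lists snapshots
    of a domain in a date range, one JSON row per capture."""
    params = {
        "url": domain,
        "from": year_from,
        "to": year_to,
        "output": "json",
        "fl": "timestamp,original",
        "collapse": "digest",  # drop captures identical to the previous one
    }
    return "https://web.archive.org/cdx/search/cdx?" + urlencode(params)

print(wayback_cdx_url("seosly.com", 2019, 2021))
```

A sudden gap or a burst of new captures in the returned list often lines up with a redesign or migration.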

Now let’s dive into the data from Google tools

Analysis Of Data Provided By Tools

Google tools provide tons of useful data about the technical SEO aspects of a website. Any good and comprehensive technical SEO audit should start by looking at the data from those tools!  

Google Search Console (GSC)

GSC is the bread and butter of any technical SEO analysis. If you have access to the GSC data of the website, then start by taking a brief look at the elements discussed below. 

If you love Google Search Console as much as I do, then don’t miss my guide on how to audit a site with Google Search Console (only) and How To Use Google Search Console For Keyword Research.

No. 10: Check if the website has a GSC account set up. 

It is hard to believe but I still come across websites that have never even set up a GSC account! If the website you are analyzing is this rare case, then you can skip all of the points from this section. 

Instead… set up a Google Search Console account for the website. If you don’t have the power to do that, make it a priority that the client does that. 

No. 11: Check the Performance report

Open the Performance report to get an overview of the website’s performance in search. Check metrics, such as total clicks, total impressions, average position, and average CTR. 

Set the date range to the last 12 months. Take notice of any traffic trends.  

The Performance report in Google Search Console
The Google Search Console Performance report gives you tons of information about how your site is doing in organic search.

No. 12: Check the Index report

The Index report will show you:

  • what web pages are indexed and can appear in Google (Valid and Valid with warnings),
  • and what web pages are excluded and why (Error, Excluded). 

From a technical SEO standpoint, this is a very important report.  

The Coverage report in Google Search Console.
The Google Search Console Coverage report is the best tool for checking the indexation of a website and its web pages.

No. 13: Check XML sitemaps 

Now go to Index > Sitemaps to check if an XML sitemap or a sitemap index has been submitted to Google. If no sitemaps are submitted, then you can move to the next point. You will take a deeper look at this element later on. 

If under Status you see Success, then a sitemap has been processed successfully. You can also click on a specific sitemap to see its details.  

The sitemaps submitted in Google Search Console.
The Google Search Console Sitemap report lets you submit XML sitemaps and check their status.

If you are not very sitemap-savvy, make sure to read the sitemap guide from Google.  

If you are not sure if the website audited has an XML sitemap, learn how to find the sitemap of a website.

No. 14: Check Removals 

You also want to know if someone intentionally or unintentionally requested the removal of the content of the website. Simply navigate to Index > Removal and see what’s there. 

Removals in Google Search Console
The Removals tool in Google Search Console allows you to quickly remove any web page from the Google index.

No. 15: Check Enhancements  

The next step is to check if there are any enhancements to be made. Simply navigate to Enhancements and take a look at each element listed there. 

These are important SEO elements, such as Core Web Vitals, Mobile Usability, Breadcrumbs, FAQ, and more (depending on the type of structured data used on the website).  

Enhancements and mobile usability in Google Search Console

I would take a detailed look at Core Web Vitals and Mobile Usability. These are all very important reports which you should understand well.

UPDATE: Check the Page Experience report

Google Search Console recently got a new report, which can be found in the new Experience section that now contains three reports: Page Experience, Core Web Vitals, and Mobile Usability.

Go to Page Experience to see if the site meets all of the five Google page experience signals. Ideally, you want to see a lot of green here.

Google page experience report in GSC
Here is the new Page Experience report in Google Search Console.
Google page experience signals in GSC
Here are the reports for all the Google page experience signals.

No. 16: Check if the website has a manual action

Even though manual penalties are less common now, they still happen! You need to know if there is or was a manual penalty on the website. Simply go to Security & Manual Actions > Manual actions. Ideally, you should see no issues. 

Checking if there are manual actions in Google Search Console.
The Manual Actions report should be one of the first places to check in GSC when auditing a site.

If the website has a manual action, cleaning it up and submitting a reconsideration request should become a priority. Google explains manual actions in a very straightforward and comprehensive way in their article. 

No. 17: Check if the website has security issues 

Manual actions are bad and so are security issues. To check if the website has this problem, go to Security & Manual Actions > Security issues. In an ideal SEO world, you will not see anything there. 

Checking if there are security issues in Google Search Console.
The Security Issues report will let you know if the site audited has any security issues and may be demoted in search because of them.

No. 18: Check Links

Last but not least come links. This is not a link audit but you should have a general overview of the backlinks. Simply go to Links and check what’s there. You may also compare this data with the data from other tools like Ahrefs or Semrush.

Links in Google Search Console
Use the Google Search Console Links to check the backlinks of a website.

Your task now is to navigate to Links. You will see the tables with data on external and internal links.

Here is what to do:

  • Check what the Top linked pages tab says and if there is one or a few specific URLs that have some really huge numbers of links in comparison to other web pages of the website.
  • Overlay the above information with what you see in the Top linking sites tab. Do the majority of links come from one or a few sites only?
  • Analyze the Top linking text tab to make sure that exact-match keyword anchor texts are not overused. If they are, this should be a red flag for you. Click MORE to learn more details.
Top linking text in Google Search Console
Here is the Links report in GSC showing the top linking text for my links.

No. 19: Check Crawl Stats report

The Google crawl stats report lets you take a deeper look into how Google is crawling the website.

Navigate to Settings and then under Crawling click on OPEN REPORT next to Crawl stats.

Crawl Stats report in GSC
Here is how you open the Crawl Stats report in Google Search Console.

The Crawl Stats report will let you take a quick look at the total number of crawl requests, total download size, and average response time. In addition, in the Hosts section, you will see information about the health of your hosts (whether there have been issues with robots.txt fetching, DNS resolution, or server connectivity).

Crawl stats report in GSC
The GSC Crawl Stats Report is an awesome tool for understanding how Googlebot is crawling your site.

Make sure to check this section to quickly spot any crawling issues on the website.

No. 20: Check the disavow file

When auditing the website, it is also essential that you check if the disavow file was submitted and (possibly) if it indeed has the links it should have.

Checking whether the disavow file has been submitted is essential especially if you are auditing a website that has lost or has been losing its organic visibility.

Google Disavow tool
This is what the new version of the Disavow tool looks like.

You cannot access the Disavow tool from within Google Search Console because it is an advanced tool that can do a lot of harm if used incorrectly. You can only reach it via Google search or a direct link.

Your task here is to check if the disavow file has been submitted and if possible take a look at its content to make sure it has been used correctly and in line with its purpose.

No. 21: Check the primary crawler

Google announced that all websites would be switched to mobile-first indexing (the switch was originally scheduled for March 2021). Until you confirm it for the site audited, always check the primary crawler of the website.

To check the primary crawler of the website, log in to Google Search Console and navigate to Coverage. You will see the information about the primary crawler at the top of the page.

The primary crawler in GSC
The Coverage report in Google Search Console provides information about the primary crawler of the site.

The majority of websites have already been switched to mobile-first indexing. What if the website audited hasn’t been moved yet and its primary crawler is still Desktop?

  • It probably is a bit obsolete and has serious loading and display issues on mobile devices.
  • It may not be mobile-friendly. If that’s the case, then making it mobile-friendly should be the top priority.
  • It still has the m. version for mobile devices.
  • It may be a new website whose domain had been registered before. This is the only OK scenario for the website still using the desktop crawler (this is the case with my website).

Google Analytics

This is also a very general overview of the most important data from Google Analytics. Check my Google Analytics 4 basic SEO guide.

No. 22: Check if the site has a Google Analytics account

If your answer is NO, then you can skip the rest of the questions from this section. Instead, your task is to set up a GA account for the website. 

No. 23: Check if there are any visible trends in the data from the last 12-18 months

To check that, go to Audience > Overview and then set the date range to be at least the last 12 months.

Google Analytics audience overview
Here is the audience overview in a specified date range in GA.

No. 24: Check how the site acquires traffic

To check that, go to Acquisition > Overview. Organic search should bring the most traffic. But this is, of course, also case-specific.

Traffic acquisition in GA
The Acquisition Overview in GA will let you check if the site gets the majority of traffic from search engines.

If you are new to GA, check how to find organic traffic in Google Analytics.

No. 25: Check if traffic trends look similar in both Bing and Google

Comparing the organic traffic trajectory in Google with the traffic in Bing may be the key to understanding the causes of drops in traffic. To check and compare the organic traffic from Google and Bing, navigate to Acquisition > All Traffic > Source/Medium. Compare google / organic with bing / organic.

Organic traffic from Bing and Google in GA
Here is how you check the organic traffic from Google and Bing in GA.

Here are the 2 scenarios:

  • If the traffic trajectory is similar both in Google and in Bing, then it may mean that there are some technical issues with the website like some URLs not resolving.
  • If the traffic drop is visible only in Google, then the website may be suffering from a Google penalty.

No. 26: Check the bounce rate of the website

To check that, go to Behavior > Overview and you will see Bounce Rate there. 

Bounce rate in GA
Here is the bounce rate in GA for one of my sites.

No. 27: Check the average time spent on the pages of the site

To check that, go to Behavior > Overview and you will see Avg. Time on Page. 

Average time spent on site
The average time spent on the web pages of one of my sites

No. 28: Check where the majority of the audience comes from

To check that, go to Audience > Geo > Location.

No. 29: Check the language the majority of the audience uses

To check that, go to Audience > Geo > Language.

No. 30: Check the most often visited web pages of the website

To check the most visited web pages, go to Behavior > Site Content > All Pages.

The most popular web pages in Google Analytics
Here you can check the most popular web pages on your site.

No. 31: Check what types of devices the majority of users of the site use

Is it mobile or desktop? To check that, go to Audience > Mobile > Overview.

Webmaster Tools From Other Search Engines 

In most cases, websites get the majority of traffic from Google and a small percentage from Bing. But there are other search engines, such as Yandex or Baidu, and some websites indeed get a lot of traffic from them.

These cases are quite rare but they happen! That’s why we need to check if a website uses or should use the webmaster tools of other search engines. 

No. 32: Check if the website has a Bing Webmaster Tools account set up. Set it up if needed.

It’s good to verify the website with Bing because there are a few very nice tools within Bing Webmaster Tools, such as Site Scan, Robots.txt Tester, and Site Explorer.

I strongly recommend taking a look at those tools! 

Bing Webmaster Tools
Bing Webmaster Tools contain a bunch of really awesome SEO tools I strongly recommend checking.

No. 33: Check if the website has and should have a Yandex Webmaster account. Create one if needed.

It is for you to decide if the website needs a Yandex Webmaster account. Set up a Yandex Webmaster account if necessary. And check my list of Yandex search operators if you use this search engine.

No. 34: Check if the website has and should have a Baidu Webmaster Tools account. Create one if needed.

This is mainly for Chinese websites. Set up Baidu Webmaster Tools if applicable. 

Visibility in popular SEO tools

In addition to checking what Google tools have to say about the website, it’s crucial to analyze the visibility of the website using an SEO tool like Semrush or Ahrefs.

No. 35: Check the SEO visibility of the site in Semrush

Simply check the domain overview to get a general idea of how the website is performing.

Semrush Domain Overview
Here is the Domain Overview in Semrush.

The things to look at include Authority Score, Organic Search Traffic, Traffic Trend, Keywords Trend, SERP Features, Top Organic Keywords, and Organic Position Distribution.

Domain Overview of Moz in Semrush
Here is the Domain Overview for moz.com in Semrush.

This will give you a pretty good idea of how things are.

No. 36: Check the SEO visibility of the site in Ahrefs

Simply type the domain in Site Explorer in Ahrefs and hit enter.

Ahrefs Site Explorer
This is the Ahrefs Site Explorer you can use to check the visibility of the site.

The things to look at include UR, DR, Organic keywords, Organic traffic, and Organic positions. You may also want to check Top pages.

Backlink Profile

Any in-depth technical SEO audit would be incomplete without briefly analyzing the backlink profile of the website.

No. 37: Check the backlink profile of the website

The quickest and easiest way to analyze the backlink profile of the website is to use the Semrush Backlink Analytics and Backlink Audit tools.

Semrush backlinks
Here are the Backlink Analytics in Semrush.

Mobile-Friendly Test

The next vital step in your technical SEO analysis is to run the Mobile-Friendly Test. This is a quick way to check how Google sees and renders a web page, if there are any loading issues, or if it is mobile-friendly.

Mobile-friendly test
The Mobile-Friendly Test allows you to check if a site is mobile-friendly with one click.

Once you run the test, you will be able to see the rendered page and HTML code. 

Mobile-friendly test
Here are the Mobile-Friendly Test results for my website.

No. 38: Check if the website is mobile-friendly

Run the Mobile-Friendly Test to make sure the website is indeed mobile-friendly… in the eyes of Google. 

No. 39: Check if there are any loading issues. 

Next, click VIEW DETAILS under Page loading issues to check the details of page loading issues (if there are any). 

Checking page loading issues in Mobile-Friendly Test
My site, fortunately, does not have any loading issues. If there are any issues, you should investigate them further.

No. 40: Check the rendered screenshot and its HTML code

Compare what you see in the rendered screenshot with what you see in the browser. Check the HTML code of the rendered screenshot and make sure that the most important links (like navigation links) and content are indeed there.  

There is also a nice tool, JavaScript Rendering Check, that will check any URL for differences between the original source and the rendered HTML.

Rendered vs source code
The results of the JavaScript Rendering Check tool for my website.

Google PageSpeed Insights 

Google PageSpeed Insights is an awesome tool that both examines the speed of the website and gives actionable tips on how to improve its speed and performance. 

Note that Google PageSpeed Insights provides information on a per-page basis. This is not the score for the entire website but for the specific URL (web page) you test. 

Google PageSpeed Insights
The Google PageSpeed Insights tool lets you quickly check the speed and performance of any website (including Core Web Vitals).

⚡ If the website uses WordPress, then the WP Rocket plugin will probably solve all the performance and speed problems. Check my review of WP Rocket to learn more.

No. 41: Analyze the website with Google PageSpeed Insights 

Check the scores for both the mobile and desktop versions of the website. If the score is below 80/100, you should take a closer look at the issues indicated by the tool. Anything below 50 (RED) requires your immediate attention, examination, and action. 

Google PageSpeed Insights results for seosly.com
The Google PageSpeed Insights tool results for SEOSLY (using WP Rocket for optimizations).
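PageSpeed Insights also has a public API (v5), which is handy when you want scores for dozens of URLs instead of testing them one by one in the browser. This sketch only builds the request URL; treat the field names as something to verify in the API docs (an API key is optional for light use):

```python
from urllib.parse import urlencode

PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

def psi_request_url(page_url: str, strategy: str = "mobile") -> str:
    """Build a PageSpeed Insights API request for a single URL.

    In the JSON response, lighthouseResult.categories.performance.score
    is on a 0-1 scale; multiply by 100 for the familiar score."""
    return PSI_ENDPOINT + "?" + urlencode({"url": page_url, "strategy": strategy})

print(psi_request_url("https://seosly.com/", "mobile"))
```

Run it once with strategy "mobile" and once with "desktop" to mirror the two tabs of the web tool.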

No. 42: Check if the website passes Core Web Vitals 

Core Web Vitals has become an official ranking factor.

Technical SEO Audit
The Google page experience signals (including Core Web Vitals).

With that in mind, you should pay special attention to Core Web Vitals and make it a priority that the website passes this assessment. This is a great investment for the future! 

If the website does not pass the Core Web Vitals assessment, fixing this in the near future should be a priority.

⚡ Check my in-depth guide to Core Web Vitals. And my guides to Largest Contentful Paint, First Input Delay, and Cumulative Layout Shift.
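The "good" thresholds behind the assessment are public and easy to check mechanically: LCP at or under 2.5 seconds, FID at or under 100 ms, and CLS at or under 0.1, each measured at the 75th percentile of real-user page loads. A tiny sketch using the three metrics as this guide discusses them:

```python
def passes_core_web_vitals(lcp_s: float, fid_ms: float, cls: float) -> bool:
    """True if all three metrics meet the published 'good' thresholds
    (assessed at the 75th percentile of real-user page loads)."""
    return lcp_s <= 2.5 and fid_ms <= 100 and cls <= 0.1

print(passes_core_web_vitals(2.1, 80, 0.05))  # True
print(passes_core_web_vitals(3.4, 80, 0.05))  # False: LCP is too slow
```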

No. 43: Check if the tool indicates that images are not optimized 

Uncompressed and unoptimized images that slow down the website are usually the easiest and quickest issues to fix. 

If there are such images on the web page, the tool will indicate them along with potential savings. If possible, make it a priority to optimize all the images the tool indicates. 

No. 44: Check if the tool indicates that JS, HTML, and CSS code is not optimized 

These optimizations are also usually quite easy to implement, especially if the website is using WordPress. 

Google PageSpeed Insights
The tips on removing unused CSS provided by the Google PageSpeed Insights tool.

You will find my recommendations on WordPress plugins at the end of this guide. 

No. 45: Check if there are any other important opportunities or diagnostics indicated by the tool

The suggestions given under Opportunities and Diagnostics do not affect the performance score of the website. However, they can help the website load faster. Sometimes a lot!

Google PageSpeed Insights opportunities
The suggestions that may help a site load faster displayed by the Google PageSpeed Insights tool.
Google PageSpeed Insights diagnostics
The diagnostics displayed by Google PSI.

In most cases, the tool gives you all the information and suggestions you need to make a given web page faster and improve its performance score. 

Compliance with the Google quality guidelines

No. 46: Check if the website is compliant with the Google quality guidelines

Make sure the website is free of any glaring errors, such as keyword stuffing, doorway pages, sneaky redirects, or other obvious violations of the Google quality guidelines.

Technical SEO audit quality guidelines
An overview of the Google quality guidelines available at the Google Search Central site.

Many of those are obsolete black hat techniques but there are still websites that use them. 

By now you should have a pretty clear picture of the website, so you can move on to performing its in-depth technical SEO analysis. 

To perform most of the below tasks you will need a site crawler. I mainly use Screaming Frog and Semrush (I love their crawler and SEO tools) but you can complete these tasks with any other decent crawling tools as well. Most of the screenshots come from these two tools.

Indexing, Crawling & Rendering 

Let’s now get into details of how Google is indexing, crawling, and rendering your website.

Make sure to check my guide to crawl budget optimization.

Status in the index

Analyzing indexability with Sitebulb
Here is the overview of the indexability of my site in Sitebulb.

No. 47: Check how many web pages of the website are indexed 

Use the site: command to check the approximate number of the web pages indexed. In most cases, the homepage should be the first result of the site: search command. 

Here is how I check my domain: site:seosly.com

Site: command to check if the site is indexed
The site: command is the quickest way to check if a given URL or a site is in the index.

The site: command can be used to check the rough number of web pages in the Google index. And don’t forget that this works with Bing as well.

Checking the indexed web pages in Bing

⚡ And I have the whole guide about Google search operators and Bing search operators if you want to learn more.
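If you run these index checks often, the operator URLs are trivial to script; here is a small sketch that builds the site: query URL for either engine:

```python
from urllib.parse import quote_plus

SEARCH_BASES = {
    "google": "https://www.google.com/search?q=",
    "bing": "https://www.bing.com/search?q=",
}

def site_search_url(engine: str, domain: str) -> str:
    """Build a site: operator query URL for a quick index check."""
    return SEARCH_BASES[engine] + quote_plus(f"site:{domain}")

print(site_search_url("google", "seosly.com"))
# https://www.google.com/search?q=site%3Aseosly.com
```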

No. 48: Check if the number of the web pages indexed corresponds to the number of valid web pages in Google Search Console

The number of the web pages indexed shown by the site: command is approximate. But it should still be similar to the number of the canonical (indexable) web pages of the website. 

In Google Search Console, go to Index > Coverage > Valid to check the exact number of the web pages indexed. 

Valid pages in Google Search Console for seosly.com
The Coverage report in GSC lets you check how many web pages are currently indexed. Note that the data in GSC appear with a one-week delay.

Any discrepancy may need your special attention! 

No. 49: Check if any weird or irrelevant web pages are indexed  

The site: command is also very useful for checking if any weird or irrelevant web pages got indexed.  Simply take a look at 2-5 pages of search results returned with the site: command. You may get really surprised! You may also put your domain in quotes and do an exact match search like “seosly.com”.


Robots.txt

If you are not very robots.txt-savvy, start by reading the Google introduction to robots.txt.

No. 50: Check if the website has a robots.txt file 

Simply add /robots.txt to the website address to check if it has a robots.txt file and to see its content.  

For my website that would be https://seosly.com/robots.txt. Make sure the robots.txt URL returns a definitive status code (200 if the file exists, or 403, 404, or 410 if it does not); if it returns a server error instead, the website may not be crawled. 

The robots.txt file on my website. You can check the contents of your robots.txt file directly in Sitebulb.

⚡ In my other article you will learn how to access and modify robots.txt in WordPress.

No. 51: Check if the robots.txt file blocks website resources that should be indexed

Any errors in the content of the robots.txt file may result in valuable website resources not being discovered by search engine robots. Note that blocking a resource in the robots.txt file prevents it from being crawled, not from being indexed. If there is a link to the blocked resource anywhere on the internet, the resource may still get indexed and appear in search results. 

No. 52: Check if the robots.txt really blocks website resources that should be blocked

Really huge websites (with millions of pages) should make good use of the robots.txt file so that it blocks resources that should not be crawled, such as URLs with parameters or thin tag pages. It is up to you, the SEO specialist, to decide which web pages (if any) should be blocked from crawling.

No. 53: Check if the robots.txt file is valid

And you really want to be sure there are no typos or misspellings in the robots.txt file. Use a robots.txt validator, such as the Robots.txt Tester in Bing Webmaster Tools mentioned earlier, to check that the file is valid. 

No. 54: Check if there are any other errors in the robots.txt file

Robots.txt may be completely valid and free of syntax errors, but there may still be other mistakes, such as the wrong name of a directory to be blocked. Double-check that robots.txt really does what it is supposed to do and gives Googlebot the correct directives. 
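You can also sanity-check the directives against concrete URLs with Python’s built-in robotparser. A sketch (the ruleset is illustrative; note that the standard-library parser applies the first matching rule, which can differ slightly from Google’s longest-match behavior, so use it as a quick check rather than the final word):

```python
from urllib import robotparser

# A typical WordPress-style ruleset (contents are illustrative).
rules = """
User-agent: *
Allow: /wp-admin/admin-ajax.php
Disallow: /wp-admin/
Sitemap: https://seosly.com/sitemap_index.xml
""".splitlines()

rp = robotparser.RobotFileParser()
rp.parse(rules)

print(rp.can_fetch("*", "https://seosly.com/blog/"))      # True
print(rp.can_fetch("*", "https://seosly.com/wp-admin/"))  # False
print(rp.site_maps())  # Python 3.8+: lists the Sitemap: entries
```

For a live site, point the parser at the file with set_url() and read() instead of parse().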

No. 55: Check if the robots.txt file indicates an XML sitemap address 

The standard location of the sitemap (/sitemap.xml) does not need to be indicated in robots.txt. If the sitemap is not at a standard location or there are multiple XML sitemaps (or a sitemap index), then the sitemap URL should be indicated in the robots.txt file. 

In the case of my website, it looks like this:
Sitemap: https://seosly.com/sitemap_index.xml

You can host the XML sitemap on an entirely different domain. You just need to make sure to include it in the robots.txt. 

Robots meta tag

No. 56: Check if the robots meta tag blocks website resources which should be indexed 

Both the intended and unintended “noindex” value of the robots meta tag will block a page from being indexed. Crawl the website to check the indexability status of its web pages in bulk.  

Checking indexability in Sitebulb
Sitebulb lets you check the indexability of all the web pages of a site in bulk.

You can also use the SEO Indexability Check Chrome extension to check indexability on a per-page basis. Note that if there are two different robots meta tags, Google will choose the more restrictive one. This is a rare case but it happens!
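
If you prefer to script this check, here is a minimal sketch that collects every robots meta tag from a page’s HTML and applies the “most restrictive wins” rule mentioned above. It only looks at noindex, and the sample HTML in the comments is illustrative:

```python
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    """Collect the content of every <meta name="robots"> tag."""
    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and (a.get("name") or "").lower() == "robots":
            self.directives.append((a.get("content") or "").lower())

def is_indexable(html: str) -> bool:
    """A page is indexable only if no robots meta tag says noindex.
    This mirrors Google choosing the more restrictive value when
    two different robots meta tags are present."""
    parser = RobotsMetaParser()
    parser.feed(html)
    return not any("noindex" in d for d in parser.directives)
```

Run it over the raw HTML of any page you crawled to double-check what the crawler reported.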

SEO indexability check
The SEO Indexability Check Chrome extension from DeepCrawl.

No. 57: Check if there are indexable website resources which should be blocked by the robots meta tag 

The opposite can also be harmful… especially if we are talking about thousands of low-quality indexable web pages. Your task here is to analyze all the indexable web pages and assess if they are valuable enough to be indexed. Use your common SEO sense!  

Checking the values of meta robots tags across the website using Sitebulb.


No. 58: Check if the X-Robots-Tag blocks website resources which should not be blocked

You can also block a web page from being indexed with the help of the X-Robots-Tag. Your detective work is by no means over. Your task now is to detect and analyze the resources blocked by the X-Robots-Tag.

Checking X-Robots-Tag in Screaming Frog
Checking the X-Robots Tag values in Screaming Frog SEO Spider.
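
A quick way to script this check: the sketch below inspects an X-Robots-Tag header value for a noindex (or none) directive, including the form scoped to a specific user agent. In practice you would pull the headers from your HTTP client or crawler export; the dictionaries here are examples, and real header lookups should be case-insensitive:

```python
def x_robots_blocks_indexing(headers: dict, bot: str = "googlebot") -> bool:
    """Check whether an X-Robots-Tag response header blocks indexing.
    The header may apply to all bots ('noindex') or be scoped to one
    user agent ('googlebot: noindex'). Simplified: assumes the header
    name appears exactly as 'X-Robots-Tag' in the dict."""
    value = headers.get("X-Robots-Tag", "").lower()
    for part in value.split(","):
        directive = part.strip()
        if directive.startswith(bot + ":"):
            directive = directive.split(":", 1)[1].strip()
        if directive in ("noindex", "none"):
            return True
    return False
```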

Directives in X-Robots-Tag, robots.txt, and robots meta tag

No. 59: Check if the directives given in the robots.txt, robots meta tag, and X-Robots-Tag do not contradict one another 

Yes, you can instruct robots in multiple ways, such as with the use of the X-Robots-Tag, the robots.txt file, and the robots meta tag. Your task is to check if the directives are not contradicting one another, or bots will get confused and not behave as you want them to. 

No. 60: Check if the web pages like a privacy policy or terms of service are indexable 

The old school of SEO would advise against indexing the privacy policy, terms of service, and similar web pages. These pages, after all, rarely have unique content. This no longer seems to be true. You should indeed index those pages because they help build up your E-A-T. You will learn more about E-A-T at the end of this guide.


Rendering

No. 61: Check if the web pages of the website render correctly 

Rendering is seeing your web page through the eyes of a bot. It is really good to know how it sees the website! The Mobile-Friendly Test will render the web page and show the HTML code of the rendered page. 

I use either Sitebulb or Screaming Frog to render all the web pages of the website in bulk. 

In Sitebulb, once you start a new project, simply choose Chrome Crawler and Sitebulb will do all the dirty work for you together with analyzing the differences between the source and rendered code.

Chrome crawler in Sitebulb
When editing the project settings in Sitebulb, simply choose Chrome Crawler.
Response vs render in Sitebulb
After the Sitebulb crawl is done, go to Response vs Render to see if there are any problems or differences between the source and rendered code of your site.

If you want Screaming Frog to render JavaScript, go to Configuration > Spider > Rendering. Choose JavaScript and Googlebot Mobile.

JavaScript rendering in Screaming Frog
The JavaScript rendering settings in Screaming Frog SEO Spider.

XML Sitemap

Now let’s do some detective work on XML sitemaps. 

No. 62: Check if the website has an XML sitemap

Let’s now check if the website has an XML sitemap. A lack of an XML sitemap may prevent search engine robots from discovering all of the web pages of the website (especially the web pages that are deep in the structure). 

So where do you look for an XML sitemap?

  • Check the default location which is /sitemap.xml or sometimes /sitemap_index.xml.
  • Check the robots.txt file. 
  • Check Google Search Console (Index > Sitemaps). 
  • If possible and applicable, log in to the CMS of the website and look for sitemaps settings there. 
XML sitemap
This is the XML sitemap for my site.

If the website does not have a sitemap, you may skip this section. Instead, create or suggest creating an XML sitemap for the website. 

⚡ In my guide on how to find the sitemap of a website, I’m showing you 7 different ways to detect a sitemap.

No. 63: Check if the XML sitemap contains all the URLs it should contain 

By definition, an XML sitemap should contain all of the indexable & canonical web pages of the website. In practice, however, that is not always the case.

The quickest way to check if an XML sitemap contains the canonical URLs of your website is to crawl it. I strongly recommend using Sitebulb for that.

No. 64: Check if the sitemap contains incorrect entries

Make sure the sitemap is free from incorrect entries, such as redirected URLs, 4xx web pages, password-protected URLs, etc. Again, you can quickly check that by crawling the XML sitemaps of a website. 

Sitebulb will crawl the sitemap and show you any issues with it automatically.

Auditing XML sitemap
Sitebulb will automatically audit the XML sitemap of the site you are crawling and show you all the issues

Here is how you can analyze the content of sitemaps in Screaming Frog:

  • In Spider Configuration and under XML sitemaps, I check Crawl Linked XML Sitemaps and paste the URLs of the sitemaps of the website. 
Crawling XML sitemaps using Screaming Frog.
The settings in Screaming Frog SEO Spider that allow you to crawl XML sitemaps.
  • Once the crawl of the website is done, I run Crawl Analysis.
Crawl Analysis in Screaming Frog
This is how you run the Crawl Analysis in Screaming Frog SEO Spider.
  • Next, I analyze the contents of Sitemaps in the Overview tab. This tells me pretty much everything I need to know. 
Sitemap analysis in Screaming Frog
Under Sitemaps there are the results of the sitemap crawls.
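
If you want to extract the sitemap entries yourself, the following sketch parses the <loc> URLs out of a sitemap file with Python’s standard XML library. You would then check each URL for a 200 status, indexability, and a self-referencing canonical (not shown here); the sample sitemap is made up:

```python
import xml.etree.ElementTree as ET

# The official sitemap protocol namespace.
SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def sitemap_urls(xml_text: str) -> list:
    """Extract all <loc> URLs from an XML sitemap. Each of these
    URLs should be a live, indexable, canonical page - redirected,
    4xx, or noindexed entries are the incorrect entries to flag."""
    root = ET.fromstring(xml_text)
    return [loc.text.strip() for loc in root.iter(SITEMAP_NS + "loc")]
```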

No. 65: Check if the sitemap uses the deprecated <priority> and <changefreq> parameters 

Simply open the XML sitemap to check if there are <priority> and <changefreq> parameters. These parameters are currently ignored by Google, so there is really no need to have them in the sitemap. I usually recommend removing them entirely. 

No. 66: Check if the website uses the <lastmod> parameter. If the parameter is used, check if it is used correctly. 

Google is able to determine the actual last modification date of the website.  If the <lastmod> parameter is misused, Google will simply ignore it. 

If, for example, the <lastmod> parameter indicates the same date across all the sitemap entries, then you can be pretty sure it’s not used correctly. 

The <lastmod> parameter used in an XML sitemap
This is how the <lastmod> parameter looks in an actual XML sitemap.

In such a case, I usually recommend removing it entirely. 
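
The “same date everywhere” symptom is easy to detect programmatically. This sketch flags a sitemap whose <lastmod> values are all identical, which usually means the dates are auto-generated rather than real modification dates; the sample sitemaps in the test are illustrative:

```python
import xml.etree.ElementTree as ET

NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def lastmod_looks_suspicious(xml_text: str) -> bool:
    """Flag a sitemap whose <lastmod> values are all identical -
    a common sign that the dates are generated, not real, in which
    case Google will simply ignore them."""
    root = ET.fromstring(xml_text)
    dates = [el.text.strip() for el in root.iter(NS + "lastmod")]
    return len(dates) > 1 and len(set(dates)) == 1
```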

No. 67: Check if the website has an image sitemap 

If a website has a lot of valuable images and relies on Image Search, then it should definitely have an image sitemap. A lack thereof may result in search engine robots being unable to discover and index all of the images. Images can be added as a separate sitemap or together with the regular XML sitemap. 

Image sitemap
Here are the image URLs on my XML sitemap (generated by Rank Math).

No. 68: Check if the website has a video sitemap 

This applies only if a website hosts videos on its own server. A video sitemap is not intended for indicating YouTube videos.  

Language Versions

Skip this section if the website is not multilingual. 

Practically any website crawler will provide the information needed to assess if there are issues with hreflang implementation.

No. 69: Check if language versions are clearly divided

Multilingual websites should have clear language division. A lack thereof may result in incorrect indexation of particular language versions. For example, some web pages may be indexed only in one language while other web pages only in the other language. 

To avoid this issue, a website should:

  • (ideally) put different language versions in different directories, 
  • have a language switch visible on every web page,  
  • and use hreflang tags (see the point below). 

No. 70: Check if hreflang tags are used on the website 

Multilingual websites should use hreflang tags because: 

  • Hreflang tags let search engine robots discover alternate language versions of web pages. 
  • Hreflang tags help eliminate content duplication caused by the availability of web pages in the same language but meant for different regions. 

To analyze the hreflang tags on your site with Sitebulb (and get very clear suggestions and possible fixes), make sure to check International in project settings.

Make sure that this setting is checked so that Sitebulb can check the validity of hreflang annotations and HTML lang attributes on your site.

Screaming Frog will let you analyze hreflang tags too. 

Checking hreflang tags in Screaming Frog.
These are the hreflang issues that Screaming Frog SEO Spider checks for.

No. 71: Check if hreflang tags are used correctly 

Hreflang tags will not work unless they point to the correct (corresponding) web pages. Your task is to check if hreflang tags really link to the alternative language versions of the web pages. 

No. 72: Check if the return links are missing

Without return links, Google will ignore hreflang tags on the web page. This is a rare case where mutual linking is desirable. Screaming Frog or any other similar tool will let you check if return links are missing. 

Checking return links in Screaming Frog.
Screaming Frog will tell you if return links are missing or not.
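
Conceptually, the return-link check boils down to verifying that every hreflang target links back to its source. The simplified sketch below works on a mapping of page URL to its hreflang alternates (it ignores self-referencing annotations, which real implementations should also include); all URLs are examples:

```python
def missing_return_links(hreflang_map: dict) -> list:
    """Given {page_url: {lang: target_url, ...}}, report hreflang
    annotations whose target page does not link back to the source.
    Without return links, Google ignores the hreflang tags."""
    problems = []
    for page, alternates in hreflang_map.items():
        for lang, target in alternates.items():
            back = hreflang_map.get(target, {})
            if page not in back.values():
                problems.append((page, lang, target))
    return problems
```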

No. 73: Check if the x-default hreflang attribute is used

Each web page using hreflang tags should also point to the default language version. To learn more about hreflang tags and how they work, read this Google article about managing multilingual and multiregional sites.  

Checking x-default hreflang attribute in Screaming Frog
You can use Screaming Frog SEO Spider to check if X-Default has been used across the entire site.

Internal linking

Internal linking can make or break a site. This is a technical SEO aspect you should put a lot of focus on. A decent crawler like Sitebulb will analyze the internal linking structure of your site and notify you of any issues or areas of improvement.

Internal URLs in Sitebulb crawler
Here is the summary of the analysis of internal links in Sitebulb.

No. 74: Check if there is an excessive number of links on the homepage

If there are hundreds or even thousands of links on the homepage (including duplicate links pointing to the same URLs), then something is not quite right. There is no specific number to aim for, but anything above one hundred should be a red flag and a reason to investigate further. 

Sitebulb does a brilliant job of analyzing your internal links and letting you understand how internal linking is implemented on a site. Just open the audit and go to Link Explorer.

Homepage links
Here is the Sitebulb Link Explorer that allows for digging deep into the internal linking structure. Here I am checking the links from the homepage.

Here is how to check the number of links on the homepage in Screaming Frog: 

  • Navigate to Internal > HTML.
  • Click on the (canonical) URL of the homepage. 
  • Navigate to the Outlinks tab.  
  • Click Export to export all the links. 
  • Analyze!  
Checking outlinks on the homepage in Screaming Frog.
This is how you can check the outlinks on any page with the use of Screaming Frog.

No. 75: Check if every web page of the website has a link to the homepage

This issue rarely happens, but it does happen. Each web page should link to the homepage. The most common placement for this link is the logo and/or the “Home” link in the main navigation.

Link to the homepage
These are examples of links to the homepage on SEOSLY.


No. 76: Check if there are multiple links on the web pages of the website 

Multiple links to the same URL are less of an issue now. Even John Mueller said in one of the recent Google SEO office hours that the first-link-counts rule is obsolete and quite irrelevant. It is your task as an SEO to take a look at the duplicate links on the website and assess whether they are problematic or not. 

Checking multiple links using Screaming Frog
Screaming Frog SEO Spider allows you to easily check if there are multiple links on a page. You just need to sort the links by their name, and you will see the duplicates.

Again, you need to use your experience and common sense. If there are tens or hundreds of duplicate links on the web pages, then you may want to analyze them a bit more deeply. You can use Screaming Frog or any other website crawler to check that. 

No. 77: Check if lower-level web pages link to other thematically-related pages 

Product pages and blog articles are very helpful when it comes to using the potential of internal linking. Linking between thematically-related web pages strengthens both the linked and the linking page. For example, a product page should have links to other products. 

And it is also a great opportunity to use keyword-rich anchor texts, which helps search engine algorithms understand the topic of those pages.

No. 78: Check if the website makes use of contextual linking 

Contextual linking is a bit similar and is also very valuable. Contextual links also help search engines associate the linked web pages with specific keywords used in the anchor text. Using keyword-rich anchor texts in internal links is totally OK!

The best place for contextual links is blog articles that should link to product or offer pages. If the website has articles or tutorials, contextual linking should be put into action.  

Anchor text in Sitebulb
One way to check if the site uses valuable contextual links is to check the anchor text of internal links. In Sitebulb, go to Link Explorer > Internal Anchor Text.

No. 79. Check if there are links pointing to non-existent or blocked resources 

Links to blocked or non-existent resources may lead to a bad user experience which can negatively influence SEO. Your task is to remove the links pointing to non-existent or password-protected resources. You can also replace such links with working links returning status 200 (OK). 

Sitebulb will let you check both internal and external links very easily.

Links in Sitebulb
Sitebulb lets you quickly analyze the internal and external link status.

Low-value links

No. 80: Check if there are low-value text links with inappropriate anchor text 

Low-value text links make it harder or even impossible for search engine algorithms to associate the linked page with appropriate keywords related to its content. Links with low-value anchor text (like “Read more” or “Click here”) are a missed opportunity to give search engines information about the topic of the linked page. 

No. 81: Check if there are low-value graphic links with the inappropriate or missing ALT attribute

The ALT attribute in image links is like the anchor text in text links. Use it! Make sure that graphic links are high-value links! 

Website structure

Any crawling tool will help you analyze the structure of the website. However, I love how Sitebulb does that.

Sitebulb site structure
Here is the crawl tree generated by Sitebulb. It lets you easily visualize the structure of the site.

And here is how Screaming Frog does that.

Crawl depth in Screaming Frog
Screaming Frog SEO Spider provides a lot of useful data you can use to analyze the website structure.

But don’t blindly believe what the tool is saying. Make good use of your SEO experience and common sense here. The best website structure is the one that avoids any extremes. 

No. 82: Check if the website structure is too flat 

One extreme is when the homepage contains all the links to all the web pages. 

No. 83: Check if the website structure is too deep

Another extreme is when the structure is too deep with more than 4-5 levels and lots of orphan web pages. 


Breadcrumbs

Breadcrumbs help users and search engine robots navigate through the website and better understand its structure. Breadcrumb navigation (or a breadcrumb trail) creates a return path to the superordinate web pages of the currently browsed page (including the homepage).

No. 84: Check if the website has breadcrumbs 

It is generally a good practice to use breadcrumbs on both smaller and bigger websites. On huge websites, it is even a must! 

Example of breadcrumb navigation
This is an example of a breadcrumb on one of the web pages at SEOSLY.

No. 85: Check if breadcrumbs are implemented correctly 

Two things to note here:

  • Breadcrumb navigation should be implemented with the use of structured data. 
  • Breadcrumbs should not omit any web pages in the path and the last item (the page itself) should not be a link. 
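
For reference, here is what a minimal BreadcrumbList looks like when generated programmatically. This is a hedged sketch (the URLs and page names are examples, and it is not the markup any particular plugin produces); note that the last item carries no link, in line with the second rule above:

```python
import json

def breadcrumb_jsonld(trail: list) -> str:
    """Build schema.org BreadcrumbList JSON-LD from a list of
    (name, url) pairs ordered from the homepage down. The last
    item (the current page) is listed without a URL, matching
    the advice that it should not be a link."""
    items = []
    for position, (name, url) in enumerate(trail, start=1):
        item = {"@type": "ListItem", "position": position, "name": name}
        if position < len(trail):  # last crumb: name only, no link
            item["item"] = url
        items.append(item)
    data = {
        "@context": "https://schema.org",
        "@type": "BreadcrumbList",
        "itemListElement": items,
    }
    return json.dumps(data, indent=2)
```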

If the site you are auditing has breadcrumbs, you can quickly check if they are added with the use of Schema.org by simply going to Structured Data > Search Features in Sitebulb. You should see Breadcrumb there.

In this Sitebulb report you can see that breadcrumbs are added correctly to my site and there are no errors or warnings in their implementation.

No. 86: Check if breadcrumbs are used consistently across the entire website

The website should use breadcrumbs consistently on each page. You can learn more about breadcrumb trails directly from Google.


Main navigation

The main navigation informs both users and search engine robots about the most important web pages of the website. 

Sitebulb has an interesting feature that lets you analyze internal links in terms of their placement. For example, you can analyze the links added in the main or footer menu.

Internal links location
This is the breakdown of the location of the internal links of my site in Sitebulb.

No. 87: Check if the main navigation of the website contains links to the most important web pages 

Navigation should have links to main category pages, hub pages, or important info pages (contact or about pages). 

Navigation menu links
The navigation menu at SEOSLY contains the most important links.

No. 88: Check if the navigation of the website is implemented based on text links

I know this seems pretty basic and obvious but it’s still worth checking. 

No. 89: Check if list tags (<ul> and <li>) are used to build navigation elements  

Also make sure that the navigation is built with the use of list tags. 

No. 90: Check if navigation links are visible to search engine robots 

Since navigation links are the most important, you need to make sure that robots can really see those links. This is especially important for JavaScript-heavy websites. You can check that by simply comparing the source and rendered HTML code of the website.

No. 91: Check if navigation works correctly on a mobile device

In addition to being accessible to search engine robots, navigation links also need to work as expected from the user’s side.  Simply open the website on a mobile phone and check how navigation works. Does it drop down where it should? Does it open a new web page as expected?

Mobile menu
This is the main menu of SEOSLY on mobile.

External Links

To check external links in bulk (in Screaming Frog), go to Overview > SEO Elements > External

External links
Screaming Frog SEO Spider allows you to check all the external links in bulk.

No. 92: Check if external links that are not true recommendations have a rel=”nofollow” or rel=”sponsored” attribute

Any outbound link that is not a true recommendation of the website audited should have a “nofollow” or “sponsored” attribute. And, conversely, there should also be true quality non-sponsored dofollow links to other thematically related web pages. There needs to be some balance!
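
One way to surface candidates for review is to list every external link that carries no nofollow, sponsored, or ugc value. The sketch below does that with Python’s built-in HTML parser (the sample links are made up); deciding whether each remaining link is a true recommendation is still your job:

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

class ExternalLinkAudit(HTMLParser):
    """Collect external links that lack a nofollow/sponsored/ugc
    rel value - i.e. the dofollow links to review manually."""
    def __init__(self, site_host: str):
        super().__init__()
        self.site_host = site_host
        self.dofollow_external = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        a = dict(attrs)
        href = a.get("href") or ""
        host = urlparse(href).netloc
        if not host or host == self.site_host:
            return  # internal or relative link - out of scope here
        rel = (a.get("rel") or "").lower().split()
        if not {"nofollow", "sponsored", "ugc"} & set(rel):
            self.dofollow_external.append(href)

def external_dofollow_links(html: str, site_host: str) -> list:
    parser = ExternalLinkAudit(site_host)
    parser.feed(html)
    return parser.dofollow_external
```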

No. 93: Check if the links added by users have a rel=”ugc” attribute 

If there is user-generated content on the website, then the website should make use of a rel=”ugc” attribute. This is especially important for links in the comments and in the forum section. Make sure to check that! 

No. 94: Check if there are site-wide dofollow links 

In most cases, site-wide dofollow links should have a rel=”nofollow” attribute. There are still websites that try to use site-wide links to boost SEO!

No. 95: Check if there are external dofollow links to valuable resources

The website should have external dofollow links to high-quality resources. It is the way of the web to link out to the web pages that the website author considers valuable. Did you notice that I link to many external resources in my guide on how to do an SEO audit? I do that because I know these are valuable resources that may further help you.

URL addresses

To analyze URL addresses in bulk (in Screaming Frog), go to Overview > SEO Elements > URL.  

External URLs in Screaming Frog
Screaming Frog SEO Spider will analyze all the URLs of a site.

No. 96: Check if URLs contain parameters (e.g. session or user identifiers) that do not influence the content displayed 

URL addresses should not contain parameters that have no influence on the content displayed (e.g. session or user identifiers). If there are such addresses, then they should have a canonical link pointing to the URL version without parameters. 

No. 97: Check if URLs contain keywords 

In one of the recent Google SEO office hours, John Mueller said that keywords in URL play a minimal role. However, when it comes to users, that’s slightly different. Users like clear URLs and Google likes what users like!

URLs should contain appropriate keywords describing the topic of the web page instead of unfriendly characters like “/?p=123”. 

For example, an article talking about Google search operators should have these keywords in the URL like here: https://seosly.com/google-search-operators/

No. 98: Check if URLs contain words in a different language than the language of the website 

And, of course, URLs should contain words in the language of the web page. URLs in a different language might confuse both users and search engine robots. 

No. 99: Check if dash characters are used to divide words in URLs 

You should use dashes to separate words in URLs. Again, it’s both user and bot friendly. 

No. 100: Check if URLs contain unnecessary words

Ideally, URLs should not contain unnecessary words that would make them super long.


Redirects

Any crawling tool will let you check the redirects implemented on the website. In Screaming Frog, go to Overview > SEO Elements > Response Codes to examine all the redirects. 

Checking response codes in Screaming Frog
Checking the response codes in Screaming Frog SEO Spider.

No. 101: Check if there are multiple redirects (redirect chains)  

Ideally, one URL should be redirected only once. Note that Googlebot follows only a limited number of redirect hops (Google documents up to 10) before it gives up, and every extra hop wastes crawl budget. 
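
To see why chains matter, it helps to model redirects as a simple source-to-target map, which stands in for the Location headers a crawler records (the URLs below are examples). This sketch traces the full path so you can count the hops and collapse the chain into a single redirect:

```python
def trace_redirects(start_url: str, redirect_map: dict, max_hops: int = 10) -> list:
    """Follow a chain of redirects expressed as {source: target}
    and return the full path of URLs visited. More than one hop
    means a redirect chain that should be collapsed."""
    path = [start_url]
    while path[-1] in redirect_map:
        if len(path) > max_hops:
            raise RuntimeError("Redirect chain too long (possible loop)")
        path.append(redirect_map[path[-1]])
    return path
```

In practice you would build the map from a crawler export or from your HTTP client’s redirect history rather than by hand.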

No. 102: Check if there are redirects with incorrect statuses 

In most cases, you should use 301 (permanent) redirects. 302 (temporary) redirects are to indicate a temporary change. People often confuse these two and use 302 redirects for permanent site changes (like a redirect from non-HTTPS to HTTPS version). Make sure the website uses redirects in line with their purpose. 

No. 103: Check if there are meta refresh redirects

Unlike 301 or 302 redirects, which are server-side, meta refresh redirects are client-side. A meta refresh redirect instructs the browser to go to a different page after a specific time period. Google may treat meta refresh redirects as sneaky redirects.

Checking meta refresh redirects in Screaming Frog
This is how you check in Screaming Frog if a site has meta refresh redirects.

Meta refresh redirects should be replaced with regular HTTP redirects. You can learn more about sneaky redirects and 301 redirects straight from Google.

Status Codes

An HTTP status code is the server’s response to the browser’s request. Status codes indicate whether the request was successful (2xx), redirected (3xx), failed because of a client error (4xx), or failed because of a server error (5xx). 

No. 104: Check if there are web pages returning 5xx errors

A huge number of web pages returning status code 5xx may indicate that there are some problems with the server. The server may be overloaded or may need some additional configuration. 

No. 105: Check if there are web pages returning 4xx errors 

We have already touched upon this a bit. Lots of web pages returning status 404 (not found) or 410 (content removed) on the website may lead to a bad user experience. This applies to both internal and external links within a website.  

If there are backlinks pointing to those 4xx web pages, then Google will not count those links. The internal links to these 404 web pages should either be removed or replaced with working links. If there are external links pointing to those 404 URLs, then I recommend 301-redirecting these web pages to working URLs.

Error Page

And we are not done with status codes yet. 

No. 106: Check if an error page returns a 404 status code

A website should be able to handle error pages correctly. A non-existent page should return a 404 status code (not found) instead of 200 (OK). An error page returning the status code 200 may get indexed and become a soft 404. 

Google is getting better and better at handling soft 404 pages but you still should ensure that the website handles error pages in an optimal way.  

You can use the Link Redirect Trace Chrome extension to quickly check the status code of any web page. 

Error page
You can use the Link Redirect Trace Chrome extension to check if an error page returns 404.

No. 107: Check if the website has an error page

A blank error page that screams “ERROR” in red is not a good user experience.  A website should have an error page that clearly says that this is an error page and a user landed on it because the URL typed does not exist or cannot be found. 

No. 108: Check if the website has a dedicated error page 

In an ideal SEO world, an error page should also have links to the most important web pages of the website. Its layout and design should also be like the rest of the website. 

A dedicated error page is there mainly for users to make their page experience better. 


Duplicate content

Yup, it is true that Google is getting better and better at handling duplication. But you, as an effective SEO, can make it a lot easier for Google with just a few technical fixes. 

No. 109: Check if there is duplicate content caused by the incorrect technical implementation of the sorting of content 

The incorrect implementation of the sorting of the content may result in a lack of control over which URLs get indexed. A quick fix to this is usually to add a canonical link element that points to the URL without the sorting parameters. 

Note that this may vary from website to website. There may be situations where you want to index URLs with specific sorting or filtering parameters.  

No. 110: Check if the website is available both at the HTTPS and non-HTTPS URL versions 

We are coming back to redirections! I see this so often that I need to repeat myself. If the website has an SSL certificate, then all the non-HTTPS versions should permanently redirect (301) to the HTTPS versions. And there should be one redirect only. 

A redirect from HTTP to HTTPS
The Link Redirect Trace tool will let you trace the redirects on any page. Here I am checking if the redirection from HTTP to HTTPS works correctly.

No. 111: Check if the website is available both at the WWW and non-WWW URL versions

And the same applies to the WWW and non-WWW versions of the website. The website should not resolve at both the WWW and non-WWW URL versions. 

Redirect from WWW to NON-WWW
Here is how you can check if a site correctly redirects from WWW to non-WWW.

One version needs to be chosen as canonical and the other one needs to redirect (301) to the canonical version. Here again the Link Redirect Trace Chrome extension will come in very handy. 

No. 112: Check if the web pages of the website are available at URLs in which the letter case is insignificant

In an ideal SEO world, the letter case in URLs should not be significant. But if it is significant, then you may:

  • add a canonical link pointing to the lowercase URL,
  • or use the so-called URL rewriter (like URL Rewrite, URL Rewriter, or ISAPI Rewrite 2) that will rewrite URL addresses so that there are only lowercase characters.

No. 113: Check if pagination is handled correctly 

Make sure that the paginated web pages can be crawled by search engine robots. If you feel like having a long read, check this guide to pagination in SEO

No. 114: Check if there are duplicate or near-duplicate pages

It’s for you to judge if they really exist. Screaming Frog and Semrush may help you find such pages. 

In Sitebulb, go to Duplicate Content to check if there are duplicate content issues on the site.

Sitebulb duplicate content
This is how Sitebulb reports on duplicate content. As you can see there is no duplicate content on my site.

In Screaming Frog, go to Crawl Overview > Content and check what pages are listed under Exact Duplicates, Near Duplicates, Low Content Pages.  

Checking if there are duplicate pages using Screaming Frog.
Here are the content issues that Screaming Frog SEO Spider checks for.

No. 115: Check if the website is available at a different URL 

It’s better to be safe than sorry. So make sure there is no indexable copy or test version of the website somewhere on the internet. Use Copyscape to detect duplicate content. 

Copyscape is the most popular tool for checking if the content of a site is unique on the internet.

No. 116: Check if the content of the website is unique on the Internet

You can check if the website is unique by simply copying a unique block of text from the website and pasting it in quotes into Google. The only result returned should be the web page from which you copied the text.


Canonical link elements

Let’s now dive deep into canonicalization. Sitebulb or Screaming Frog or any other website crawler will let you check if there are canonicalization issues on the website.

When in Sitebulb, go to Indexability > URLs and then check what’s under Canonical URL.

Indexability in Sitebulb

When in Screaming Frog, go to Overview > Canonicals to check how and if canonicals are used on the website. 

Checking canonicals using Screaming Frog
You can check all the canonical URLs in bulk with Screaming Frog SEO Spider.

No. 117: Check if canonical link elements are used on the website

The SEO rule of thumb I recommend is that all the web pages of the website should have a canonical link element. Check if this is the case for the website you are auditing. 
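
If you want to script this check, the sketch below pulls the canonical link element out of a page’s HTML with Python’s built-in parser. It returns the URL only when exactly one canonical tag is present, because zero tags (missing) and multiple tags (conflicting) are both findings worth reporting:

```python
from html.parser import HTMLParser

class CanonicalParser(HTMLParser):
    """Collect the href of every <link rel="canonical"> element."""
    def __init__(self):
        super().__init__()
        self.canonicals = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "link" and (a.get("rel") or "").lower() == "canonical":
            self.canonicals.append(a.get("href"))

def canonical_url(html: str):
    """Return the canonical URL if exactly one canonical tag is
    present; otherwise None (missing or conflicting canonicals)."""
    parser = CanonicalParser()
    parser.feed(html)
    return parser.canonicals[0] if len(parser.canonicals) == 1 else None
```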

⚡ Make sure to check my guide on how to audit a site with JetOctopus.

No. 118: Check if canonical link elements are used correctly 

Having canonical link elements is one thing, but having them implemented correctly is another! Your task is to check canonicalized URLs and make sure that, for example, a group of unique web pages is not canonicalized to one general URL.

I once even saw all the web pages being canonicalized to the homepage.  

No. 119: Check if Google has chosen the same canonical URL for the most important web pages as the one indicated by the webmaster

You cannot force Google to choose the canonical URL you set for a given web page.  A canonical link is not a directive but only a hint for Google. 

Canonical address in GSC
This is how you can check what URL is set and chosen as canonical (the URL Inspect tool in Google Search Console).

But it’s good to know whether Google has taken that hint into account. The URL Inspection tool will help you check that.

Title Tags 

Page titles are more related to the content side of SEO but let’s briefly take a technical look at them. 

The <title> element is what users see directly in search results and in the tab name in modern web browsers. 

Here is the title tag of the homepage of SEOSLY.

Page titles provide both search engine robots and users with a lot of valuable information about the web page. I recommend using a crawler like Sitebulb or Screaming Frog to analyze the website’s meta tags in bulk. 

When in Sitebulb, go to On Page and scroll down until you see Title Length and Title Identification. Click on any to see more detail.

Title tags in Sitebulb
Here is how to check title tags in bulk in Sitebulb.

To analyze page titles in bulk using Screaming Frog, go to Overview > SEO Elements > Page Title

Analyzing page titles in Screaming Frog
Here is how you analyze title tags in Screaming Frog SEO Spider.

No 120: Check if the web pages of the website have the <title> tag 

In Sitebulb, go to On Page and then Title Identification and click on Missing. The link will be active if you have missing titles on any of the web pages on your site.

Missing page titles in Sitebulb
Here is how you can check in Sitebulb if there are missing titles on the site.

In Screaming Frog, to check if there are any web pages with missing page titles, simply navigate to Page Titles > Missing

Checking pages with missing titles in Screaming Frog
In Screaming Frog SEO Spider you can easily display the list of all the URLs that do not have a title tag.

Make sure to check all the web pages of the website. All valuable web pages of the website should have <title> tags with high-quality and relevant content. 

No. 121: Check if the <title> tags are of the recommended length (30-60 characters) 

Page titles should contain between 30 and 60 characters to look attractive in search results and avoid getting truncated.

In Sitebulb, go to On Page and Title Length to check the length of the title tags on the site.

Title length in Sitebulb
Here is how you check the title length in Sitebulb.

In Screaming Frog, go to Page Titles and check the web pages appearing under Over 60 Characters and Below 30 Characters. 

Length of page titles
Screaming Frog allows you to display the title tags that are too short (or too long).
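Outside a crawler, the same length check is easy to script. Here is a rough sketch; the 30-60 range mirrors the thresholds above, although in reality Google truncates titles by pixel width, so character counts are only an approximation:

```python
# Flag titles outside the recommended character range (illustrative data).
def audit_title_length(titles, low=30, high=60):
    """titles: dict of URL -> title. Returns URLs whose titles fall outside the range."""
    issues = {}
    for url, title in titles.items():
        n = len(title or "")
        if n < low:
            issues[url] = f"too short ({n} chars)"
        elif n > high:
            issues[url] = f"too long ({n} chars)"
    return issues

titles = {
    "/": "SEO Audit",                                     # 9 chars, too short
    "/guide/": "A" * 75,                                  # 75 chars, too long
    "/ok/": "Technical SEO Audit Checklist: 180+ Checks", # within range
}
print(audit_title_length(titles))
```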

No. 122: Check if the <title> tags used are duplicated

In Screaming Frog, go to Page Titles > Duplicate and you will see the list of web pages with duplicate page titles. 

Each web page of the website should have a unique title tag. You can learn more about page titles and descriptions straight from Google in this article

Checking duplicate titles in Screaming Frog
You can also display all the duplicate titles with just one click in Screaming Frog SEO Spider.
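Under the hood, the duplicate check boils down to grouping crawled URLs by their normalized title. A simple sketch with illustrative data, not tool output:

```python
from collections import defaultdict

def find_duplicate_titles(titles):
    """titles: dict of URL -> title. Returns {title: [urls...]} for duplicates."""
    groups = defaultdict(list)
    for url, title in titles.items():
        groups[(title or "").strip().lower()].append(url)
    return {t: urls for t, urls in groups.items() if len(urls) > 1}

titles = {
    "/red-shoes/":  "Buy Shoes Online | Example",
    "/blue-shoes/": "Buy Shoes Online | Example",   # duplicate of the above
    "/about/":      "About Us | Example",
}
print(find_duplicate_titles(titles))
```

The same grouping works for meta descriptions, which is exactly what check No. 129 below asks for.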

No. 123: Check if the <title> tags used have appropriate content

Page titles need to be unique and form a kind of summary of the content of the web page. 

In Sitebulb, you can quickly analyze the content of all the title tags by going to On Page > URLs and scanning the list.

Title tags in Sitebulb
Here is how Sitebulb lets you analyze all the title tags on your site.

No. 124: Check if the <title> tags used contain important keywords

Page titles should also contain the most important keywords. But please, no keyword stuffing! 

No. 125: Check if keywords are placed at the beginning of the <title> tag

Keywords should be placed at the beginning of the title (starting with the most important keyword). Again, no keyword stuffing!

No. 126: Check if the brand name appears at the end of the <title> tag

If you want to put the brand name in the title, then it should appear at the end of the page title. The only exception is the homepage where it can appear at the start. 

Page Descriptions

And now a technical look at another content-related SEO element: page descriptions. If the page description is attractive and gets to the point quickly, then users will be more likely to click the website’s snippet in search results.  

Here is the meta description of the homepage of SEOSLY.

In Sitebulb, go to On Page and scroll down until you see Meta Description Length and Meta Description Identification.

Meta descriptions in Sitebulb
Here is how you can quickly check the meta description elements of your site in Sitebulb.

To analyze page descriptions in bulk using Screaming Frog, go to Overview > SEO Elements > Meta Description

Checking meta description using Screaming Frog
You can easily get the list of web pages that do not have the meta description element with the help of Screaming Frog SEO Spider.

No. 127: Check if the web pages of the website have content in meta description tags 

In Sitebulb, go to On Page and Meta Description Identification. Check what’s under Missing.

On-page SEO in Sitebulb

In Screaming Frog, go to Meta Description > Missing to see the web pages without meta descriptions. If there is no content in the meta description element, then Google will generate it on its own. 

Having no meta description is not a big issue: even if a web page has its own unique description, Google will rewrite it more often than not. However, it is still good SEO practice to have unique page descriptions for at least the most important web pages of the website. 

No. 128: Check if the content of meta description tags is of the recommended length (140-155 characters)

Assuming that Google chooses our own meta description, we should keep it within 140-155 characters so that it does not get trimmed down in search results and is not too short. 70 characters is an absolute minimum length. 

In Sitebulb, go to On Page and Meta Description Length. Click to see the details.

Meta description length in Sitebulb
Here is how you can check the length of the meta description elements in Sitebulb.

In Screaming Frog, go to Meta Descriptions > Over 155 Characters and Below 70 Characters.

Checking meta description in Screaming Frog
Screaming Frog SEO Spider lets you view all the web pages with too-long meta descriptions.

No. 129: Check if meta description tags are duplicated

Just like with page titles, we want meta descriptions to be unique. Go to Meta Description > Duplicate to see the web pages with duplicate meta descriptions. 

Note that it is better to have no page description at all than to have this element duplicated across many web pages. You can learn more about page titles and descriptions straight from Google in this article. 

Meta description in Screaming Frog
With just one click in Screaming Frog SEO Spider, you can view all the duplicate meta description elements.

No. 130: Check if the content of page descriptions is appropriate

Unattractive or random content of meta description tags will make users much less likely to click the snippet in search results. 

No. 131: Check if the page descriptions contain keywords

Page descriptions should include the page’s most important keyword, its variation, and, if possible, a synonym. This time it’s mainly for users, not search engines. 


Headings

Whatever they say about them at the moment, headings are important in terms of SEO. They help both users (especially users of screen readers) and search engine robots understand the topic and subtopics of the web page. In this technical SEO checklist, we are taking a more technical look at them. 

H1 Heading

No. 132: Check if the web pages of the website have an H1 tag. 

Every web page of the website should have one H1 header.

To analyze headings in Sitebulb, go to On Page and check the hints.

On page and H1 tags in Sitebulb
Here is the On Page report in Sitebulb with hints of the H1 tags on my site.

To check H1 headings in bulk in Screaming Frog, go to Overview > SEO Elements > H1 > Missing and you will see the list of web pages without an H1 tag.  

Checking missing H1 tags
Screaming Frog SEO Spider will display the list of web pages that do not have an H1 tag.

A web page without an H1 tag is missing a huge opportunity to give search engine algorithms valuable information about itself. Each page of the website (including the homepage) should have exactly one unique H1 heading. 

No. 133: Check if there are multiple H1 headings

In Sitebulb, go to On Page and scroll down until you see H1 Tag Identification. Click to see details.

H1 tags in Sitebulb
Sitebulb lets you check if the site has H1 tags and if they are used more than once on a page.

In Screaming Frog, go to SEO Elements > H1 > Multiple to check if there are web pages with multiple H1 headings. 

Multiple H1 tags
Here is how you check what web pages have multiple H1 tags.

It is certainly better to have multiple H1 tags than to have no H1 tags at all. But, if possible, stick to one H1 tag! 

No. 134: Check if the content of H1 headers is SEO-friendly

H1 headers should contain the most important keyword for the web page to clearly communicate its topic both to users and search engines.

Structure of headings 

No. 135: Check if headings are used on the web pages of the website 

An H1 tag alone is not enough to provide users and search engine robots with information about the content structure of the web page. A web page with no headings at all, or just one heading, is difficult to understand for both users and robots. 

You can use Chrome plugins, such as Web Developer or Detailed SEO Extension to check the structure of headers on a web page.  

The structure of headings on SEOSLY
Here is the structure of headings on the SEO Blog web page of my site.

No. 136: Check if headings are used excessively 

Headings should be used to highlight the most important content and individual sections of the website. Excessive use of headings will confuse people using screen readers just as it will confuse search engine algorithms. 

No. 137: Check if the structure of headings is broken

Another great SEO rule of thumb is to have a logical order of headings. You should treat headings and subheadings as chapters and subchapters in a book. A web page is that book. 
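The chapter rule can be checked mechanically: given the heading levels in document order, flag a missing or repeated H1 and any jump that skips a level. A rough sketch (it ignores some edge cases, such as a page that opens with an H2 before its H1):

```python
def audit_heading_structure(levels):
    """levels: heading levels in document order, e.g. [1, 2, 3, 2]."""
    issues = []
    if levels.count(1) == 0:
        issues.append("missing H1")
    elif levels.count(1) > 1:
        issues.append("multiple H1 tags")
    # A heading may go at most one level deeper than the one before it.
    for prev, cur in zip(levels, levels[1:]):
        if cur > prev + 1:
            issues.append(f"skipped level: H{prev} -> H{cur}")
    return issues

print(audit_heading_structure([1, 2, 3, 3, 2]))   # [] -> clean structure
print(audit_heading_structure([2, 4, 1, 1]))      # multiple H1s, skipped level
```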

Chrome extensions for checking headers include Web Developer and Detailed SEO Extension.

Make sure to check my list of SEO Chrome extensions (79 extensions reviewed).

Graphic Elements 

Graphic elements, if optimized correctly, can give both search engine robots and users of screen readers a lot of extra information about the web page.

When in Sitebulb, go to Page Resources > Overview to see the overview of the page resources on your site.

Page resources in Sitebulb

Click on Images to check the details of the images (ALT attribute, file size, compression, etc.)

Image data in Sitebulb

To check images in bulk in Screaming Frog, go to Overview > SEO Elements > Images.

Checking images in bulk using Screaming Frog
Screaming Frog SEO Spider also lets you check the images and their SEO optimization in bulk.

No. 138: Check if images are embedded correctly

Google will not treat the images embedded with CSS as part of the content of the webpage. To check how a given image is embedded, simply right-click on it and click Inspect

Except for the images forming the layout of the website, graphics should be embedded with the <img> tag. 

Inspecting an image in Chrome
By inspecting an image you can quickly check if it is embedded correctly.

No. 139: Check if there are images with low-value ALT attributes

Low-value ALT attributes will confuse both users of screen readers and search engine algorithms. ALT attributes provide important information about the content of the image. Each unique image on a web page should have a unique and high-value ALT attribute. 

No. 140: Check if there are images with no ALT attribute at all 

Go to Overview > SEO Elements > Missing Alt Text to view all the images with missing ALT text. 

Checking images with missing ALT text in Screaming Frog
Screaming Frog SEO Spider can also show you all the images that do not have an ALT text.

You can also check images on a per-page basis using the Detailed SEO Chrome extension.

Images with and without ALT
The Detailed SEO Extension can also show you the data about the images used on a given web page.

Each unique image on a web page should have a high-value unique ALT text. 
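Here is a per-page sketch of that check using only the Python standard library; it lists every <img> whose alt attribute is missing or empty:

```python
from html.parser import HTMLParser

class ImgAltParser(HTMLParser):
    """Collects the src of every <img> with a missing or empty alt attribute."""
    def __init__(self):
        super().__init__()
        self.missing_alt = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attrs = dict(attrs)
            if not (attrs.get("alt") or "").strip():
                self.missing_alt.append(attrs.get("src", "(no src)"))

def images_missing_alt(html):
    parser = ImgAltParser()
    parser.feed(html)
    return parser.missing_alt

html = """
<img src="/logo.png" alt="SEOSLY logo">
<img src="/banner.jpg" alt="">
<img src="/chart.png">
"""
print(images_missing_alt(html))  # ['/banner.jpg', '/chart.png']
```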

No. 141: Check if there are images with low-value file names 

To check image filenames in bulk, go to Overview > SEO Elements > Images > All. Image file names are not as important as ALT attributes. If a website has high-value ALT attributes, then filenames are less important. However, it’s still a good practice to use SEO-friendly image filenames at least from now on.

No. 142: Check if the images used are of appropriate size 

In an ideal world, images should be served in their original (already compressed) size. A very common error I see is a web page that uses huge (often PNG) images and adjusts their display size with CSS/HTML.

Google PageSpeed Insights will notify you if a web page has this issue. If this is a common practice, then this can really decrease the speed and performance of the website. 

No. 143: Check if the images used are optimized

Test the web page with Google PageSpeed Insights to check if there is room for improvement. The images on a website (especially if it has lots of them) should be compressed and optimized. The next-gen format should be used if possible.

In the case of WordPress sites, there are lots of useful plugins for optimizing and compressing images. Check Google Image best practices.


No. 144: Check the code

This is very case-specific. Simply view the source code of the website audited and use your common SEO sense. You can view the code of any website by simply adding view-source before its address like in: view-source:https://seosly.com/ 

PRO TIP: To check the code of a website on a mobile device, simply add “view-source” before the address. 

If the homepage is relatively small and does not have a lot of content, but there are tens of thousands of lines of code, then something is not right. 

No. 145: Check if there are some unnecessary comments in the HTML code 

Check if the website has some unnecessary or weird comments in the code. You may be really surprised at what sometimes gets there! 

No. 146: Check if the HTML code is cluttered with JavaScript

The general rule is to place JavaScript <script> tags in the <head> section or just before the closing </body> tag. Your task is to check that they are not scattered all over the place. 

No. 147: Check if in-line styles are used 

In-line styles on rare occasions are OK. But they should be an exception rather than the rule. The HTML code should not contain excessive numbers of in-line styles. 
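A quick way to judge whether in-line styles are the exception or the rule is to count style= attributes in the page source. A rough regex heuristic, not a full HTML parser:

```python
import re

def count_inline_styles(html):
    """Counts style= attributes inside tags (a rough regex heuristic)."""
    return len(re.findall(r"<[^>]+\sstyle\s*=", html, flags=re.IGNORECASE))

html = '<p style="color:red">Hi</p><div class="x"><span STYLE="font-size:9px">y</span></div>'
print(count_inline_styles(html))  # 2
```

If the count is high relative to the number of elements on the page, styles probably belong in a stylesheet instead.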


A Technical Look At Content

In Sitebulb, go to Duplicate Content to see the technical analysis of the content on the site.

Duplicate content in Sitebulb
Here is how the Duplicate Content report looks in Sitebulb.

In Screaming Frog, go to Overview > SEO Elements > Content to take a technical look at the content side of things.  

Checking content with the help of Screaming Frog
You can use Screaming Frog SEO Spider to detect some of the common content issues on a website.

No. 148: Check if there are duplicate or near-duplicate web pages

Now go to Overview > SEO Elements > Content and check the web pages listed under Exact Duplicates and Near Duplicates.  In most cases, you don’t want Google to index those pages. These web pages should either be canonicalized or a “noindex” robots meta tag should be added to them. 
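To get an intuition for what the crawler does here, below is a toy near-duplicate check built on difflib from the standard library. Real crawlers use techniques such as shingling and minhash, but the idea is the same: flag page pairs whose text similarity crosses a threshold.

```python
from difflib import SequenceMatcher
from itertools import combinations

def near_duplicates(pages, threshold=0.9):
    """pages: dict of URL -> visible text. Returns similar (url, url, ratio) pairs."""
    pairs = []
    for (u1, t1), (u2, t2) in combinations(pages.items(), 2):
        ratio = SequenceMatcher(None, t1, t2).ratio()
        if ratio >= threshold:
            pairs.append((u1, u2, round(ratio, 2)))
    return pairs

pages = {
    "/red/":  "Our widget is the best widget on the market today.",
    "/blue/": "Our widget is the best widget on the market today!",
    "/faq/":  "Frequently asked questions about shipping and returns.",
}
print(near_duplicates(pages))  # [('/red/', '/blue/', 0.98)]
```

Note that pairwise comparison is quadratic in the number of pages, which is exactly why real tools use hashing tricks instead.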

No. 149: Check if the homepage has at least some text

The homepage is by far the most significant web page of the website. That’s why it should have at least some text (at least a few hundred words) and a clear heading structure. Use the Detailed SEO Chrome plugin to check the heading structure and number of words on any webpage. 

You can use the Detailed Chrome Extension to check the number of words on a page as well.

No. 150: Check if there are low-content web pages

Go to Overview > SEO Elements > Low Content Pages. A low-content page has few words and no unique content of its own. No one likes low-content web pages. In many cases, low-content pages should either be optimized (if they are category pages) or trimmed down. 
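A word count after stripping tags is the core of this check. A rough sketch; the 200-word threshold is an arbitrary illustrative cut-off, not a Google number:

```python
import re

def word_count(html):
    """Strips tags with a simple regex, then counts words."""
    text = re.sub(r"<[^>]+>", " ", html)
    return len(re.findall(r"\w+", text))

def is_low_content(html, threshold=200):
    return word_count(html) < threshold

html = "<h1>Category</h1><p>Red shoes.</p>"
print(word_count(html), is_low_content(html))  # 3 True
```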

No. 151: Check if the text is implemented in the form of images 

Google is getting better at understanding images but it is still a good practice to add text in the form of… text. To check if this issue is on the website simply view some of its most important web pages and analyze the images used there. 

No. 152: Check if flash elements are used instead of text 

I know this is an obsolete question. But I still advise you to check if Flash is used on the website. To check that, go to Overview > SEO Elements > Internal > Flash. You should see (0) there. 

Checking Flash content with the use of Screaming Frog
Screaming Frog will help you check if there are web pages that use Flash.

No. 153: Check if the content of the website is added with the use of iframes

The actual content of the web page should not be placed with the use of iframes. Make sure this does not happen. 

No. 154: Check if the website has relevant and topically-coherent content

A Technical Look At Keywords

Keywords are content but a technical approach and technical knowledge are required to make them work! Here are a few things to check regarding keywords.

No. 155: Check if correct tags are used for highlighting keywords

Highlighting keywords on a page can be quite helpful both for users and search engine robots.  To make keyword highlighting work, <strong> tags need to be used instead of <b>. Your task here is to take a look at the most important web pages, articles, or guides and check if keywords are highlighted correctly. 

No. 156: Check if keyword research has ever been done for the website

Your task as an auditor is to check if anyone has ever done keyword research for the website and if possible take a look at the keywords selected for the website. This provides some additional insights into the website examined.

Keyword research is not part of the technical SEO audit, but you may offer to do it for the client. You can use Semrush or Ahrefs to do keyword research. If you don’t know how to do keyword research, you might want to take the free Semrush course on keyword research.

No. 157: Check if specific keywords are mapped to specific web pages

If you don’t have information about the keyword research for the website, you may simply manually review a bunch of its web pages to check if they are mapped to keywords. A web page targeted at a specific keyword will usually contain this keyword in the title, URL, headings, and the first paragraphs of text.

The web page you are now reading is obviously targeting the keyword technical SEO audit. If you have any experience with SEO, this will be very easy to determine.

In the case of WordPress websites that have an SEO plugin installed, one way to check if a web page targets a specific keyword is to view the metadata of the page. Both Yoast SEO and Rank Math let you do that. Here is how it looks in Rank Math.

This is Rank Math on my site.

Note that to be able to check that you need WordPress admin access.
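The manual review in No. 157 can be expressed as a small helper: given a page's key fields, report which of them contain the target keyword. All names and data here are illustrative, not taken from any tool:

```python
def keyword_placement(keyword, page):
    """page: dict with 'title', 'url', 'h1', 'first_paragraph'. Returns fields containing the keyword."""
    kw = keyword.lower()
    # Hyphens are treated as spaces so URL slugs match multi-word keywords.
    return [field for field in ("title", "url", "h1", "first_paragraph")
            if kw in (page.get(field) or "").lower().replace("-", " ")]

page = {
    "title": "Technical SEO Audit Checklist",
    "url": "https://example.com/technical-seo-audit/",
    "h1": "How to Do a Technical SEO Audit",
    "first_paragraph": "This guide walks you through a technical SEO audit step by step.",
}
print(keyword_placement("technical seo audit", page))
# ['title', 'url', 'h1', 'first_paragraph']
```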

No. 158: Check if web pages are optimized for their specific keywords

This is the follow-up to the previous step. This time you want to make sure that the web page targeting a given keyword is actually optimized for it. In addition to having the keyword in the title, URL, headings, and the first paragraphs of text, the web page should also have valuable graphics (with ALT text), topically-relevant links to external resources, and more.

Structured data

Google uses structured data in order to better understand the content of the web page and to enable special search results features and enhancements (rich results). Every technical SEO audit should also analyze the website in terms of structured data. You can learn more about structured data in this Google article.

No. 159: Check if structured data are used on the website

The best way to check if structured data are used on the website is to run a crawl. You can use Sitebulb or Screaming Frog.

In Sitebulb, go to Structured Data > Schema to check what types of Schema are used on the site (and whether there are warnings or validation errors).

Schema entities in Sitebulb
Here is how Sitebulb shows the Schema entities it detected on my site.

If you are using Screaming Frog, make sure to check JSON-LD and Schema.org Validation under Structured Data in Spider Configuration, or the crawler will not check for structured data.

Structured data in Screaming Frog
Here are the Screaming Frog SEO Spider settings for checking and validating structured data.

Once the crawl is done, you can check if structured data is on the website. Navigate to Overview > Structured Data and Contains Structured Data.

Structured Data Validation in Screaming Frog
Structured Data Validation in Screaming Frog.

This is the list of URLs that have structured data. To check if structured data is used on a per-page basis, you can use the Detailed SEO Chrome extension.
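If you want to check a page by hand, you can also pull the JSON-LD blocks out of the HTML and list the Schema.org @type values they declare. A stdlib-only sketch; it extracts types but does not validate the markup:

```python
import json
import re

def jsonld_types(html):
    """Returns the @type values declared in application/ld+json script blocks."""
    types = []
    pattern = r'<script[^>]*type=["\']application/ld\+json["\'][^>]*>(.*?)</script>'
    for block in re.findall(pattern, html, flags=re.DOTALL | re.IGNORECASE):
        try:
            data = json.loads(block)
        except json.JSONDecodeError:
            types.append("(invalid JSON-LD)")
            continue
        items = data if isinstance(data, list) else [data]
        types += [item.get("@type", "(no @type)") for item in items]
    return types

html = '''<script type="application/ld+json">
{"@context": "https://schema.org", "@type": "Article", "headline": "SEO Audit"}
</script>'''
print(jsonld_types(html))  # ['Article']
```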

No. 160: Check if the structured data used is valid

To check if the structured data used on the website is valid, you can use the Schema Markup Validator (the successor to Google’s Structured Data Testing Tool) or the Rich Results Test to check if the web page is eligible for rich results.

If you are using Sitebulb, run the crawl and go to Structured Data to see a very in-depth and beautifully presented report of the structured data used on the site.

Structured data in Sitebulb
This is the structured data report in Sitebulb.

If you are using Screaming Frog, navigate to Overview > Structured data to check if structured data are used on the website. If they are, then you will see the number of URLs next to Contains Structured Data and in Missing there will be nothing or almost nothing.

Structured data in Screaming Frog
Here are the results of the Screaming Frog crawl regarding the structured data used on my site.

No. 161: Check if other types of structured data could be added to the website

Here your task is to analyze the most important web pages of the website, check the types of structured data they contain, and decide what other types of data can also be added. Here you can use a Chrome extension like Detailed SEO to check types of structured data on a per-page basis.

Schema.org on SEOSLY
You can also use the Detailed SEO Extension to check what types of structured data are used on a given web page.

If the website audited is based on WordPress, then you may think about upgrading to Rank Math Pro which allows for implementing different types of structured data.

Website Speed

I assume you have already tested the website with Google PageSpeed Insights. Let’s now get even more data about the speed of the website.  

Note that you can check the PSI scores of all your pages in bulk with the help of Sitebulb. When setting up the crawler, make sure to check Page Speed, Mobile Friendly, and Front-end.

Auditing page speed
Check this setting in Sitebulb to analyze the speed scores and issues across the entire website.

When the crawl is done, go to Page Speed to analyze this element in detail.

Page speed in Sitebulb
This is the Page Speed report in Sitebulb.

No. 162: Check the website speed with GTmetrix

GTmetrix is another great tool for analyzing the speed of the website. It also gives a lot of actionable tips and highlights specific problems. 

Here are the GTmetrix speed and performance results for SEOSLY.

No. 163: Check the website speed with the WebPageTest

And now check the web page with the WebPageTest. Make sure to test it as a mobile device.

Web Page Test
Here are the speed and performance test results for SEOSLY on WebPageTest.

No. 164: Check the website speed with Google PageSpeed Insights if you still have not done it.

Or rerun the test and compare its results with the results provided by other speed tools. 

Website Security

Most websites on the internet do not implement even basic security best practices. Your task is to ensure that the website you are auditing is not one of those websites. 

To take a thorough look at some security issues in Sitebulb, run the crawl and then navigate to Security where you will see a ton of different security elements and their assessment.

Security in Sitebulb
Sitebulb can also analyze the site in terms of the most popular security issues.

No. 165: Does the website have an SSL certificate? 

HTTPS has been a ranking factor since 2014. Today, each and every website should use HTTPS. A website not secured with HTTPS is marked as Not secure in Chrome and other browsers. 

A website without an SSL certificate
This is the warning that Chrome displays when a site does not have an SSL certificate.

Make sure the website uses HTTPS. If it does not, make it a priority to move it to HTTPS as soon as possible. 

SSL certificate at seosly.com
This is what you see if a site has an SSL certificate.

⚡ Check my guide on the difference between HTTP and HTTPS.

No. 166: Is there mixed content on the website? 

Mixed content occurs when website resources load both over HTTP and HTTPS. In Sitebulb, go to Security and see the issues. If the site does not have mixed content, then you will see it in the No Issue section.

Security score in Sitebulb
My site does not have mixed content issues so Sitebulb added this element to the No Issue section.

All the HTTP resources should be redirected to resources that load over HTTPS. Note, however, that modern versions of Chrome upgrade or block most mixed content on their own! 
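Mixed content is also easy to spot in raw HTML: on a page served over HTTPS, look for src/href attributes that still point at http:// resources. A rough regex sketch:

```python
import re

def mixed_content_urls(html):
    """Returns http:// URLs referenced from src= or href= attributes."""
    return re.findall(r'(?:src|href)=["\'](http://[^"\']+)["\']', html, flags=re.IGNORECASE)

html = '''
<script src="https://example.com/app.js"></script>
<img src="http://example.com/old-logo.png">
<link rel="stylesheet" href="http://example.com/legacy.css">
'''
print(mixed_content_urls(html))
# ['http://example.com/old-logo.png', 'http://example.com/legacy.css']
```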

No. 167: Are there at least some basic security best practices implemented? 

It is hard to exactly determine the scope of basic security practices. Generally, the more, the better.  A few simple and effective security practices include:

  • HTTPS, 
  • two-factor authorization for login panels, 
  • password-protection of the login panel,
  • strong passwords,
  • regularly scanning the website with security software,
  • doing regular backups,
  • making sure the website is not flagged by Google Safe Browsing,
  • and more.  

Technical SEO Audit: Server Log Analysis

No. 168: Check server logs 

Server logs will always tell you the truth and nothing but the truth. If possible, do a server log analysis. The areas to focus on include, for example, crawl volume, response code errors, crawl budget waste, temporary redirects, and last crawl date. Semrush offers a Log File Analyzer that will help you analyze the raw data and make sense of it.

Technical SEO audit - log file analysis
Semrush has a very nice log file analyzer that can do all the dirty work for you.

If you cannot access server logs, make sure to analyze the crawl stats report in Google Search Console.
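To show what a log analyzer does under the hood, here is a minimal sketch that parses combined-format access log lines and aggregates response codes for Googlebot hits. The sample lines are made up:

```python
import re
from collections import Counter

# Matches the request, status code, and user agent in a combined-format log line.
LINE = re.compile(r'"(?:GET|POST|HEAD) (?P<path>\S+) [^"]*" (?P<status>\d{3}) .*?"(?P<agent>[^"]*)"$')

def bot_status_counts(log_lines, bot="Googlebot"):
    """Counts response status codes for requests whose user agent mentions the bot."""
    counts = Counter()
    for line in log_lines:
        m = LINE.search(line)
        if m and bot in m.group("agent"):
            counts[m.group("status")] += 1
    return counts

logs = [
    '66.249.66.1 - - [16/Apr/2024:10:00:00 +0000] "GET /blog/ HTTP/1.1" 200 512 "-" "Mozilla/5.0 (compatible; Googlebot/2.1)"',
    '66.249.66.1 - - [16/Apr/2024:10:00:05 +0000] "GET /old-page/ HTTP/1.1" 404 0 "-" "Mozilla/5.0 (compatible; Googlebot/2.1)"',
    '10.0.0.7 - - [16/Apr/2024:10:00:09 +0000] "GET /blog/ HTTP/1.1" 200 512 "-" "Mozilla/5.0 (Windows NT 10.0)"',
]
print(bot_status_counts(logs))  # Counter({'200': 1, '404': 1})
```

A real analysis would also verify Googlebot by reverse DNS, since the user agent string alone can be spoofed.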

WordPress Technical Checks & Quick Fixes

Here are some of the plugins that will let you fix some of the issues discussed above. I am a heavy user of WordPress, so I can recommend some of my favorite plugins. You don’t necessarily need to install all of them if you are only doing a one-time technical audit, but you should at least indicate that they are an option.

If, on the other hand, you are auditing a website which you will be monitoring on a regular basis, you will make your life a lot easier with those plugins. 

No. 169: Install a backup plugin. 

Before making any changes to the website, back it up. There are a lot of ways to back up a site. One possible way is to install a backup plugin (like UpdraftPlus) that will automatically create a copy of all the files and database. Make sure that the backup is not stored in the same place as the rest of the files. If the website does not have any backup plan with its hosting, take care of this. 

Updraft Plus WordPress Plugin
UpdraftPlus lets you back up or restore your site with one click.

No. 170: Install a security plugin

WordPress websites are especially vulnerable to attacks and hacks. There are a few good WordPress security plugins (like iThemes Security) that will let you implement at least a basic level of security on a website. If you decide to buy a pro version of a security plugin, you can pretty much forget about this aspect.

UPDATE: Security plugins often can also slow down your site. Make sure to test this. Adding Cloudflare CDN to your site also increases its security.

No. 171: Install Really Simple SSL 

The mixed content issue is quite common on WordPress websites. There is very often something not quite right with the HTTP > HTTPS redirects. Fortunately, you can fix it with one click with the help of a plugin called Really Simple SSL. 

UPDATE: According to my recent tests, this plugin slows sites down. I recommend installing it, letting it fix the redirects, and then uninstalling it while keeping the settings.

No. 172: Update WordPress and plugins 

Both WordPress and plugins should be updated regularly. This is also a very important security best practice. Some websites, unfortunately, like to break with updates. That’s why doing regular backups is so crucial. 

WordPress update
This is the WordPress dashboard where you can update the plugins and WordPress.

Your task here is to back up the website and run updates (if you have the power to do it). 

No. 173:  Install Google Site Kit

I know that having too many plugins is not good but Google Site Kit is really worth installing. This is an official Google plugin that provides insights from different Google tools in one place: the WordPress dashboard.

Site Kit from Google
This is the Google Site Kit plugin.

UPDATE: I have recently been moving in the direction of minimizing the number of plugins. Unless you really need to have a dashboard in the WordPress panel, I don’t recommend installing this plugin anymore.

No. 174: Install an SEO plugin

Check if the website is using an SEO plugin. Install and configure one if needed. There are basically two major players here: Rank Math and Yoast SEO. It’s up to you to decide which one to choose. 

UPDATE: I recommend using Rank Math which does not slow down sites and offers many of the features that you need to pay for in Yoast.

WordPress plugins: Rank Math
Rank Math really offers tons of very useful SEO features and options.

No. 175: Optimize all the images in bulk.  

Images can really slow down the website if they are not optimized. Fortunately, there are a lot of plugins that will let you optimize images in bulk. I most often use Imagify. There are other good options like WP Smush

Imagify WordPress plugin
Here are the Imagify settings that let you adjust the level of optimization of images on your site.

No. 176: Improve the speed and performance. 

I have probably tested hundreds of different site speed and caching plugins. Some improved the Google PageSpeed Insights score a bit, others even decreased it. From my own experience, there is only one plugin for speed. WP Rocket!

Google PageSpeed Insights
WP Rocket is really one of the best caching and speed optimization plugins out there.

No. 177: Regularly check for broken links. 

Links change, come and go. Broken links can lead to a bad user experience. Fortunately, there is an easy way to monitor all your links and get notifications about broken links with a quick option to update the link. I use Broken Link Checker to regularly check my links. 

UPDATE: I don’t recommend using this plugin anymore. In my tests, it proved to slow sites down a lot. You can monitor your site for broken links with the help of Sitebulb.

Broken links in Sitebulb
When in Sitebulb, go to Links > Internal Link Status > Broken to check if there are any broken links on your site.

No. 178: Check if the website needs some cleaning regarding the unused/used plugins.

When in WordPress, navigate through the list of the plugins installed. Make sure that each plugin is actually used. If it is not, remove it. Make sure the website does not have multiple plugins doing the same things (e.g. multiple security or SEO plugins).

Technical SEO Audit: E-A-T

E-A-T (Expertise, Authoritativeness, Trustworthiness) is a big deal. That’s why I think you should analyze it at least briefly even when doing a technical audit.

No. 179: Check if the website has backlinks from authoritative sites in the same field

In the case of an SEO blog, these would be backlinks from SEO authorities like SEJ or Moz. A quick way to get an overview of backlinks is to run a Backlink Audit in Semrush. Once it is complete, you can sort the links by the authority score of the linking domains. In the example shown here, there are unfortunately no highly authoritative links for the domain examined.

Link audit
Here is how you can check backlinks and their quality using Semrush.

No. 180: Check if the website is mentioned on other authoritative websites

Mentions are not always links but are very important as well. Check if the website (or the brand name) is mentioned on authoritative websites. The simplest way is to perform an exact match search for the brand name. In the case of my website that would be "seosly".

Brand name search
This is an exact-match type of search on Google.

No. 181: Check if the content of the website is up to date

Depending on the topic of the website, there may not be an obvious way to verify content freshness. One possible check is to look at the publication or update date on articles. Here is how it looks in the case of my website.

Content freshness on SEOSLY
I always show the last update date and the publication date on each of my blog posts.

You can also check the last modified date in the XML sitemap for a given URL or all the URLs.
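
The lastmod check is easy to script. A minimal Python sketch that parses a sitemap and sorts URLs oldest-first, so stale pages surface at the top of the report (the sample XML is inline for illustration; point it at the audited site's live sitemap instead):

```python
"""List <lastmod> dates from an XML sitemap to spot stale URLs (a sketch)."""
import xml.etree.ElementTree as ET

# Standard sitemap protocol namespace (sitemaps.org)
SITEMAP_NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}


def parse_lastmod(xml_text: str) -> list:
    """Return (loc, lastmod) pairs for each <url> entry, oldest first."""
    root = ET.fromstring(xml_text)
    entries = []
    for url in root.findall("sm:url", SITEMAP_NS):
        loc = url.findtext("sm:loc", default="", namespaces=SITEMAP_NS)
        lastmod = url.findtext("sm:lastmod", default="", namespaces=SITEMAP_NS)
        entries.append((loc, lastmod))
    # Entries with no lastmod sort first, which is also worth flagging
    return sorted(entries, key=lambda entry: entry[1])


if __name__ == "__main__":
    sample = (
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">'
        "<url><loc>https://example.com/new/</loc><lastmod>2024-04-01</lastmod></url>"
        "<url><loc>https://example.com/old/</loc><lastmod>2019-06-15</lastmod></url>"
        "</urlset>"
    )
    for loc, lastmod in parse_lastmod(sample):
        print(lastmod or "(no lastmod)", loc)
    # Live site: parse_lastmod(urllib.request.urlopen(".../sitemap.xml").read().decode())
```

Remember that lastmod is only as trustworthy as the CMS that generates it; some plugins bump the date on every save, so treat it as a hint, not proof of freshness.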

No. 182: Check if the content is factually accurate

Of course, you may not be able to verify the factual accuracy of the content of the website (especially if its topic is very specific) but you should check if the website makes claims that contradict scientific consensus. This includes websites that promote conspiracy theories or alternative medicine treatments.

No. 183: Check if the authors of the website are recognized authorities in the field

One of the elements of E-A-T is authoritativeness, which means that the website and its authors should be recognized authorities in the field. A quick way to check is to perform an exact-match search for the author or authors of the website. Do their names appear in other trustworthy publications or websites? Are they referred to as authorities?

No. 184: Check if the website presents its credentials (awards, certifications, badges, etc.)

If the website or the brand has any kind of achievements (like awards, certifications, trust badges, etc.), they should all be presented on the website. The best places to show these achievements are the home page and the about page. In the case of my website, I put all of my achievements on the about page and on the home page. Here is how it looks on my SEO consultant page:

These are the FAQs that provide even more information about myself and why I am qualified to do what I do.

No. 185: Check if the website has genuine reviews (and if they are positive or negative)

There is nothing worse than a website or brand writing its own (fake) reviews. Again, this may not be easy to verify, but some digging should let you determine the quality and authenticity of the reviews. One or two negative reviews are not a problem and are part of the web. However, if after typing the brand name into the search box you see nothing but negative reviews and dissatisfied customers, then this needs to be addressed first.

No. 186: Check if the website has information about its authors (like author bios)

If the website has one author, then the information on the about page should do. However, if there are multiple authors, each author should have a bio in each of their articles. I am the only author of the content on SEOSLY, but I still add my bio at the end of my articles.

Author's bio
This is the author box I use on my articles.

No. 187: Check if the website has contact details (and a contact me page)

Any trustworthy website must give you an option to contact its owner. Ideally, there should be a contact page where all the possible ways to contact the owner are listed. There should ideally be an e-mail address, phone number, and physical address. Some contact details may also be placed in the footer. If there is no way to contact the website, then this is a red flag.

No. 188: Check if the website has a Wikipedia page

Most websites do not have a Wikipedia page, and that’s OK. It is extremely difficult to get one, but if the audited website (or its brand or author) has a Wikipedia page, then it has a decent level of E-A-T.

SEO Audit & SEO Auditing – Frequently Asked Questions

Here are answers to frequently asked questions about SEO auditing.

How is a Google sitemap different from an HTML sitemap?

A Google sitemap, typically in XML format, is a file that lists all the URLs on a site that Google should be aware of, while an HTML sitemap is designed for human visitors to help them navigate the site. XML sitemaps help search engines crawl and index your site more effectively, while HTML sitemaps improve user experience and site navigation.

How do you submit your XML sitemap to Google?

You can submit your XML sitemap to Google through Google Search Console. First, verify your website in Google Search Console. Then go to ‘Sitemaps’ in the ‘Index’ section, enter the URL of your sitemap, and click ‘Submit’.
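
Besides Search Console, you can also declare the sitemap location in robots.txt, which makes it discoverable to any crawler, not just Google. A minimal example (the URL is a placeholder):

```
# robots.txt — served from the site root, e.g. https://example.com/robots.txt
User-agent: *
Allow: /

# The Sitemap directive takes an absolute URL; you can list several of them
Sitemap: https://example.com/sitemap.xml
```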

What is an image XML sitemap?

An image XML sitemap is a sitemap that helps Google discover images that might otherwise be missed, such as images loaded by JavaScript. For each page, it lists the URLs of the images used on that page. This can improve the visibility of your images in Google image search results.
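
Here is a minimal example (all URLs are placeholders). It uses the standard sitemap namespace plus Google's image extension namespace, and nests one or more image:image entries under each page's url element:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:image="http://www.google.com/schemas/sitemap-image/1.1">
  <url>
    <loc>https://example.com/sample-page/</loc>
    <image:image>
      <image:loc>https://example.com/images/photo.jpg</image:loc>
    </image:image>
  </url>
</urlset>
```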

Can an SEO audit tool help me optimize my website for search engines?

Yes, an SEO audit tool can certainly help optimize your website for search engines. It can identify issues such as broken links, slow page load times, duplicate content, missing meta descriptions, and more. By addressing these issues, you can improve your site’s SEO performance, thereby improving its visibility in search engine results.

Book your SEO audit or request a quote

Olga Zarr is an SEO consultant with 10+ years of experience. She has done SEO for both the biggest brands in the world and small businesses, and has performed 200+ SEO audits so far. Olga has completed SEO courses and degrees at universities such as UC Davis, the University of Michigan, and Johns Hopkins University, as well as Moz Academy, and holds Google certifications. She keeps learning SEO and loves it. Olga is also a Google Product Expert specializing in areas such as Google Search and Google Webmasters.
Comments

  1. Sabbir Al Saba

    Very very informative post. I have learned so many things by reading your post. Thanks

    • Hello Sabbir! Thank you so much for your comment. I am very happy to hear that. I will keep creating such posts! Greets!

  2. Smita

    Fantastic post, Olga! I learned so much and can’t wait to apply all of these to my site!

    • Hi Smita! Thank you so much for the comment. I am happy you could learn something from my post! Good luck with your website 🙂

  3. Consulente SEO Bari - Pier Zotti

    Very useful post. Thanks a lot

  4. Tanya

    This is great and very similar to the in-depth technical SEO Audits I conduct for clients.

    One thing I sometimes struggle with is presenting the information back to clients, the more technically minded are more than happy with the Google Sheet/ Excel report I provide but alot of clients are scared by the amount of information in this.

    How do you present your findings and recommended actions back to clients in a way that they’ll understand? Do you transfer all your findings into a more visual summary report or presentation that they’ll find easier to understand?

    • Hi Tanya! Thank you for stopping by and commenting 🙂 That really depends on the client. If the client is not very technical, then I ask them to let me talk to someone from their technical or dev team. I usually do audits of this scope for big clients and big websites. In the case of smaller clients, I am willing to simply offer them some post-audit SEO consultation where I answer their questions, explain things, or send them some materials for further reading. I have not come up with a visual presentation yet other than a PowerPoint presentation with the main errors. I am very interested in learning how you do that!

  5. Ahmed Adel

    Fantastic post, Olga

  6. Ravi

    I have seen SEOs sending audit reports with just some random screenshots from web tools. This is one of the best articles I have come across on technical SEO audits.

    • I am so happy to hear that! I am now also working on adding detailed captions for each image so that everyone understands what each image really shows and what its purpose is. Thanks for stopping by!

  7. Isuamfon Offiong

    Hi, Olga,

    So, when a post is good we say it’s Fantastic. Now, we have to say that this one is Cokastic.

    As an ultimate guide writer, take nothing away from it, this took you nothing less than 2 weeks to come by.

    Frankly, it was a good read. Thanks for the weeks or months you put into this.

    • Hello Isuamfon! Thank you for your comment 🙂 You are right. It took me about 2 weeks to create this guide and about 8 years to gather the knowledge and experience to be able to create it. I am doing my best to make each new guide slightly better than the previous one.

  8. Elavenil

    Thank god I found this from Twitter! Very helpful template.

  9. Charmaine

    Olga, thank you so much for doing this. This is the most detailed technical audit I have ever seen. I have one of those sites that got hit by the December update and I am trying to fix it. The technical side of the business is really not my strong point but the details in your audit reports is remarkable. Thank you so much for doing this!!!

    • Hello Charmaine!
      You are most welcome! Thank you for your comment. I am SO happy to hear that you like my work. Reading comments like yours makes me want to keep going and producing an infinite number of similar guides. If you even need help, feel free to contact me. And good luck with your site 🙂

  10. Shikha Vashishth

    I also found this post via Twitter.
    It’s a great audit, thanks for sharing all these in detail.

    • Hello Shikha!
      Thanks for commenting 🙂 I am glad you like the guide.

    • Songnags

      Dear Olga, great article with lot of detailing. Would have really helped the readers if they were listed as per priority in terms of SEO and points which are common and are most applicable to all sites.

  11. Ryan Ip

    Hi Olga,

    What a great article you have written and it shows that you’ve put a lot of time and effort into it. A lot of detailing is in the writing.

    Keep up the good work!

  12. Nikos Taskos

    Amazing guide, thanks for sharing!

  13. Daniel

    One of the best guides I’ve seen recently. The templates are fantastic and I use it for my projects to which I have added actions for local seo. Best regards!

  14. Thanks for sharing great information and technical strategy.

    I have one question: after ranking on the 1st page, a lot of people copy the content from the ranking pages and use it on their own sites. Does that affect our ranking?

    Please advise.

  15. I will use only 3 characters to express what I think: WOW
    If you allow me I would add some of the things I am checking related to multilingual and multi-regional websites:
    * Keyword Research and Cultural Research for all additional languages and regions
    * Consistency in the multi-regional and multilingual structure
    * Using canonical tag to avoid duplicate content
    * Charset; NON ASCII characters in URLs;
    * length of elements for literally translated texts
    * lang attributes
    Here I have listed a few more: https://seosmoothie.com/seo-for-multilingual-websites
    Note that my article compared to yours is like a first grade homework intention versus a professor’s thesis.

    • Hello Martin!
      Thanks for stopping by and adding your bits to the article 🙂

  16. Nathan Bradshaw

    This post is criminally underrated! I’ve been looking for an SEO audit like this for over the last 2 months. Thank you very much for putting so much effort and value into a single post. Hope it gets on the first page so most people get benefit from it.

    • Thank you for your kind words. I am very happy you like my post 🙂 The first page for “technical SEO audit” is very competitive to get onto, but maybe one day 🙂

  17. Ahmet

    Very valuable technical SEO information. Thank you so much!

  18. Alex

    Hey Google Crawlers … Are you reading my comment? Put this post on the first position ASAP… This blog doesn’t deserve less than 2nd position of the first page.

    • Hi Alex 🙂 I really hope they are reading your comment and will soon comply with your request 🙂 Thank you!

  19. Michael Ruebcke

    wow olga,

    this is really one of the most comprehensive guides to the components of a tech-seo audit I’ve read so far. thanks for preparing, providing and of course sharing it.

    it’s always nice to discover and learn something new even with years of experience.

    keep up the good work, stay healthy and good luck!



    Wow. This is gold. The step-by-step guide on what to look out for and how to do it makes this a resource for decades. I am just starting my SEO consultancy journey and, besides getting clients, I am still trying to figure out the right pricing for clients. I would appreciate it if you could point me in the direction of content on your site that can help me figure that out. Great job, Olga!
