A technical SEO audit is vital to ensure your content has the best chance of ranking highly in Google search. You might have articles written by top-notch authors, but if Google has issues finding, crawling, and indexing them, they won’t rank well (if at all!).
Normally, a technical SEO audit can take several days to a week to complete. The audit document can run to fifty-plus pages, and presenting all that work to non-technical colleagues or clients is a challenge in itself: there’s a lot of industry jargon, and it’s hard to know what to fix first.
That’s why in this guide, I share a ten-part process that covers the most impactful areas of a technical audit and takes only a few hours to complete. For each check, I’ll explain why you need it, how the issue can affect your search rankings, and, most importantly, how to fix it.
For most websites, this will give you a clear plan to improve your website’s technical SEO performance.
Let’s begin!
1. Check important pages are indexed
One of the main reasons for performing a technical SEO audit is often because a website has lost traffic. If this is the case, checking that your important pages are indexed is a great starting point.
Sometimes, a noindex tag can be applied by mistake, and in rare cases, Google chooses to deindex a page it feels is no longer worth crawling.
How to check if your pages are indexed:
- Use a free SEO browser extension like Detailed. This shows whether the page allows indexing by Google.
- Use the URL Inspection tool in Google Search Console (GSC) to be 100% sure the page is indexed.
Pro Tip: This is a manual process that can be somewhat time-consuming. So, if you have a long list of URLs, I recommend using a crawling tool like Screaming Frog instead.
Connect Screaming Frog to the Google Search Console API and use “List mode” to check if the pages are indexed. The process takes only a few minutes.
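If you’d rather script a first pass yourself, here’s a minimal sketch, assuming the `requests` and `beautifulsoup4` packages and a hypothetical URL list. It only flags on-page and header “noindex” directives; GSC remains the source of truth for what Google has actually indexed.

```python
# First-pass indexability check: flag URLs that carry a "noindex" directive
# in either the robots meta tag or the X-Robots-Tag response header.
import requests
from bs4 import BeautifulSoup

urls = [
    "https://example.com/page1/",  # replace with your own list
    "https://example.com/page2/",
]

for url in urls:
    resp = requests.get(url, timeout=10)
    soup = BeautifulSoup(resp.text, "html.parser")
    meta = soup.find("meta", attrs={"name": "robots"})
    directives = meta["content"].lower() if meta and meta.has_attr("content") else ""
    directives += " " + resp.headers.get("X-Robots-Tag", "").lower()
    flag = "NOINDEX" if "noindex" in directives else "indexable (on-page)"
    print(f"{url} -> HTTP {resp.status_code}, {flag}")
```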
What to do if pages aren’t indexed
If you notice a “noindex” tag applied by mistake, it’s pretty easy to change.
- In WordPress, you can usually revert this in the installed SEO plugin, like Yoast.
- In Webflow, go to the SEO tab and check the settings.
Things get trickier if you notice Google has chosen not to index your page and applied a “Crawled - currently not indexed” status.
Here’s how to fix “Crawled - currently not indexed”.
- Check the page title and content to see if it’s too similar to other pages.
- Add some internal links from pages that get decent traffic.
- Run the page through GSC’s URL Inspection tool and submit it for reindexing.
Pro Tip: If the above doesn’t work, try linking to the page from social networks and adding some external backlinks if possible. Then, resubmit the URL via the URL Inspection tool.
2. Verify canonicals are set up correctly
A canonical tag is a directive that tells Google which is the “master copy” of a webpage.
If you notice a sudden traffic drop in your analytics tools, it could be because an incorrect canonical has been applied.
Here’s a common issue I often see when performing audits:
URL: https://example.com/page1/
Canonical: https://example.com/page1
A small mistake like forgetting the trailing “/” can cause a big problem. These are technically two different URLs, and Google might index the one without the trailing slash. This can explain why the first URL has experienced a traffic drop.
How to check canonicals have been set up correctly
Use the Detailed SEO browser extension to check if canonical tags have been set up correctly.
The tool helpfully shows when a canonical is “self-referencing,” which means the URL and the canonical URL are the same.
Remember: Use SEO crawling tools like Screaming Frog to check large volumes of pages.
Once you’ve crawled the website or your URL list, click over to the “Issues” tab and it will show any URLs whose canonicals point to different URLs (i.e., not “self-referencing”).
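To spot-check a handful of URLs without a crawler, here’s a minimal sketch along the same lines, again assuming `requests` and `beautifulsoup4` and a hypothetical URL list. It compares each URL against its canonical tag and flags mismatches, including the trailing-slash case above.

```python
# Flag pages whose canonical tag points somewhere other than the URL itself.
import requests
from bs4 import BeautifulSoup

urls = ["https://example.com/page1/"]  # replace with your own list

for url in urls:
    resp = requests.get(url, timeout=10)
    soup = BeautifulSoup(resp.text, "html.parser")
    link = soup.find("link", rel="canonical")
    canonical = link["href"] if link and link.has_attr("href") else None
    if canonical is None:
        print(f"{url} -> no canonical tag")
    elif canonical == url:
        print(f"{url} -> self-referencing, OK")
    else:
        # Even a missing trailing slash counts: these are different URLs.
        print(f"{url} -> canonical points to {canonical}")
```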
How to change canonical tags
To change the canonical tags of a page in WordPress, check your SEO plugin, for example Yoast.
In Webflow, add custom code to set canonicals.
3. Check a robots.txt file is present and correct
A robots.txt file tells Google’s crawlers which pages of your website should and should not be crawled. It’s the first resource a crawler checks on your website, so it’s important that it’s set up correctly.
Sometimes, you’ll find a robots.txt that simply lets Google crawl everything, and this is not ideal. Things that are usually “disallowed” include sensitive pages such as admin and user account login pages; internal search pages, to stop Google repeatedly crawling hundreds of pages with little to no value; and staging sites that cause duplicate content issues.
Larger websites like Tom’s Guide prevent the crawling of entire URL paths in their robots.txt file. An illustrative example of this kind of setup is shown below.
The URL of your sitemap should also be listed in your robots.txt file. This makes it easier for Google to discover and crawl your pages.
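Here’s what a typical setup might look like. The paths are hypothetical, not any specific site’s real rules; the point is that entire URL paths are disallowed and the sitemap is declared:

```
User-agent: *
Disallow: /wp-admin/
Disallow: /account/
Disallow: /?s=
Disallow: /staging/

Sitemap: https://example.com/sitemap.xml
```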
How to find a robots.txt file
You can usually find a website’s robots.txt file by appending /robots.txt to your root domain, like this: example.com/robots.txt.
Paste this into your browser and your robots.txt file should display.
What to check in a robots.txt file
Here are some common issues that might occur:
- The robots.txt file is missing. In this case, you’ll need to create one and add it to the root directory of your website. There are free tools you can use to generate a robots.txt file, and the WordPress plugin Yoast also makes this easy.
- The robots.txt commands block areas of the website that you would like crawled.
- There isn’t a sitemap specified.
For the last two issues, you’ll need to:
- Download the robots.txt from your host. You can do this via FTP or using the File Manager in your host’s cPanel.
- Edit the robots.txt using Notepad++ or a similar text editor.
- Re-upload and replace the existing file using the same method as step one.
4. Check if your sitemap is in GSC without any errors
Uploading a sitemap to Google Search Console makes it much easier for Google to crawl, index, and rank your website. However, I often find when performing technical SEO audits that a lot of webmasters forget to do this.
To check if yours is present and correct:
- Go into the “Sitemaps” section of Google Search Console.
- You should see the name of your sitemap, a recent “last read” date, and a green “Success” status.
How to upload a missing sitemap
A sitemap can be submitted to Google in the “Sitemaps” tab in the “Indexing” section of GSC. It should only take a few seconds for Google to read your sitemap and discover all the URLs.
Pro Tip: If you can’t find the address of your XML Sitemap (typically example.com/sitemap.xml), first check robots.txt. It should be specified there. If it’s not, it needs to be added using the robots.txt editing method mentioned above.
The XML sitemap address can also be found in WordPress SEO plugins like Yoast or in the SEO tab in Webflow.
Common problems with sitemaps
A common problem is submitting a sitemap that contains URLs that redirect or have a 404 error code. This is an issue because Google expects all of the URLs to be status code 200 “OK”.
Usually, Google will flag this kind of problem in Search Console, and you’ll receive an email notification.
Pro Tip: If you want to be proactive, audit your sitemap with an SEO crawling tool like Screaming Frog. This will check the status codes of your URLs and flag any pages on your website that aren’t included in the sitemap.
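You can also script the status check. Here’s a minimal sketch, assuming `requests` and a standard single-file XML sitemap (not a sitemap index) at a hypothetical address. It pulls every `<loc>` entry and reports anything that doesn’t return a 200:

```python
# Check that every URL in the sitemap returns HTTP 200, as Google expects.
import requests
import xml.etree.ElementTree as ET

SITEMAP_URL = "https://example.com/sitemap.xml"  # hypothetical address
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

root = ET.fromstring(requests.get(SITEMAP_URL, timeout=10).content)
for loc in root.findall(".//sm:loc", NS):
    url = loc.text.strip()
    resp = requests.get(url, timeout=10, allow_redirects=False)
    if resp.status_code != 200:
        print(f"{url} -> HTTP {resp.status_code}")  # redirect or error
```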
How to fix sitemap problems
If you use WordPress or Webflow, your XML sitemap is most likely auto-generated. For WordPress users, check that you have the latest version of your SEO plugin (like Yoast) installed, then clear the cache to generate a new sitemap.
It’s also possible to create a custom sitemap if you’re still having problems. This help page from Google explains how to do this and submit it to Google Search Console.
5. Check the internal link structure
Google assumes your most important pages will be close to your home page. A problem larger sites often face is older content becoming harder to find for both Google and users as new pieces are published.
To check if this is a problem on your website, start at the home page and manually click through until you reach your target page. This is called “click depth,” and you want to aim for three clicks or less.
As always, you can do this at scale with an SEO site audit tool. These tools usually refer to click depth as “crawl depth,” but they’re essentially the same thing.
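Under the hood, these tools measure crawl depth with a breadth-first crawl from the home page. Here’s a minimal sketch of the idea, assuming `requests` and `beautifulsoup4` and following same-site links only; a real crawler would also respect robots.txt and rate limits.

```python
# Breadth-first crawl that records each internal page's click depth from home.
from collections import deque
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup

HOME = "https://example.com/"  # hypothetical home page
MAX_DEPTH = 3  # the "three clicks or less" target

depth = {HOME: 0}
queue = deque([HOME])
while queue:
    url = queue.popleft()
    if depth[url] >= MAX_DEPTH:
        continue  # don't expand pages already at the depth limit
    try:
        soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    except requests.RequestException:
        continue
    for a in soup.find_all("a", href=True):
        link = urljoin(url, a["href"]).split("#")[0]
        same_site = urlparse(link).netloc == urlparse(HOME).netloc
        if same_site and link not in depth:
            depth[link] = depth[url] + 1
            queue.append(link)

for url, d in sorted(depth.items(), key=lambda kv: kv[1]):
    print(d, url)
```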
How to reduce click depth
If you discover pages that have lost traffic and want to reduce their click/crawl depth, you can:
- Add internal links from pages with very low crawl depth.
- Reevaluate your navigation menus, categories, and/or tags.
- Remove outdated or older content that’s “in front” of your page.
Pro Tip: A site: search is a quick and easy way to find relevant pages for internal links. For example, imagine we have a running website and have just written a guide about running in winter. Use this Google search operator to find all pages that mention “winter”:
site:example.com “winter”
6. Analyze click-through rate (CTR)
Even if you haven’t lost any rankings, optimizing click-through rate (CTR) is a quick, easy way to gain more clicks from the same number of impressions.
CTR becomes a factor only for pages in the top 10 search results, and sometimes, a declining CTR can indicate wider SEO issues.
For example, Google might have changed the search intent or profile of the top-ranking search results. Imagine you have a guide on working from home, but “COVID-19” or “lockdown” still appears in the page title and meta description. A searcher might think your page is outdated and choose another search result.
How to find pages with low CTR
Start by finding your website’s average CTR in your GSC dashboard. Any pages below that average have a low CTR.
You can also select individual pages and view a CTR graph over different time periods, then export the data from GSC into Google Sheets or Excel.
Alternatively, you can get this data from a site audit tool with the GSC API connected.
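As a quick worked example, here’s a sketch using `pandas` on a GSC performance export. The filename and column names (“Top pages”, “Clicks”, “Impressions”) are assumptions based on a typical pages export, so adjust them to match your file. It lists pages whose CTR falls below the site-wide average:

```python
# List pages whose CTR falls below the site-wide average, from a GSC export.
import pandas as pd

df = pd.read_csv("gsc_pages.csv")  # hypothetical export filename

# Recompute CTR from raw counts so the average is impression-weighted.
site_ctr = df["Clicks"].sum() / df["Impressions"].sum()
df["ctr"] = df["Clicks"] / df["Impressions"]

low = df[df["ctr"] < site_ctr].sort_values("Impressions", ascending=False)
print(f"Site average CTR: {site_ctr:.2%}")
print(low[["Top pages", "Impressions", "ctr"]].head(20))
```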
How to fix pages with low CTR
The two main factors that affect click-through rate are:
- Page titles
- Meta descriptions
You can see how a page looks in Google search by entering the URL into your browser’s address bar and pressing the down arrow (↓). This brings up a suggestion with a magnifying glass icon.
Clicking the magnifying glass will show you the search result.
For both of these elements, I recommend looking at the search results and observing how the top-ranking pages look.
For page titles, check:
- The title isn’t missing words that appear repeatedly in top-ranking titles, like tips, itinerary, guide, 2024, etc.
- The title isn’t so short (under 30 characters) that Google rewrites it for better context.
- The title isn’t so long (over 60 characters) that it gets truncated in the search results.
Here’s a great guide on writing title tags, including how to do it quickly in ChatGPT.
For meta descriptions, check:
- The description attracts curiosity and makes people want to click the search result by using action words like read, check, find, discover, etc.
- A meta description has actually been written. If it hasn’t, Google will write one for you, often with suboptimal results.
- The meta description isn’t so long (over 160 characters) that it gets truncated.
- The meta description isn’t so short (under 100 characters) that Google rewrites it.
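To audit those length thresholds on individual pages, here’s a small sketch assuming `requests` and `beautifulsoup4`. It treats the character limits above as rough guides rather than hard rules:

```python
# Flag page titles and meta descriptions outside the rough length guidelines.
import requests
from bs4 import BeautifulSoup

def check_snippet_lengths(url: str) -> None:
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")

    title = soup.title.string.strip() if soup.title and soup.title.string else ""
    tag = soup.find("meta", attrs={"name": "description"})
    desc = tag["content"].strip() if tag and tag.has_attr("content") else ""

    if not (30 <= len(title) <= 60):
        print(f"Title is {len(title)} chars (aim for 30-60): {title!r}")
    if not desc:
        print("No meta description: Google will write one for you.")
    elif not (100 <= len(desc) <= 160):
        print(f"Description is {len(desc)} chars (aim for 100-160).")

check_snippet_lengths("https://example.com/page1/")  # hypothetical URL
```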
You can also use ChatGPT to write meta descriptions. I usually use a prompt that includes:
“...keep the meta description under 160 characters and free from any buzzwords or jargon.”
7. Check bounce rates
An extremely high bounce rate can indicate underlying SEO and user experience (UX) issues. These can include very slow pages and broken images or videos.
Ads and pop-ups blocking large parts of the page are also an issue. Google calls these intrusive interstitials, which can “devalue” the pages they appear on.
As ads and pop-ups are often site-wide, it’s worth checking the bounce rate if you use them and have experienced a traffic drop.
Note: An exception is pages that have affiliate links. A high bounce rate is not concerning here as visitors might be leaving to check out the offer.
How to find pages with high bounce rates
Rather unhelpfully, Google has removed the bounce rate metric from Google Analytics 4 (GA4). You now have to calculate it manually.
To do this:
- Find the “Sessions” and “Engaged Sessions” metrics.
- Subtract “Engaged Sessions” from “Sessions” to get the number of bounces.
- Divide the number of bounces by sessions to get the bounce rate.
For example, if your engaged sessions were 100 and sessions were 200, the bounce rate would be: (200-100) / 200 = 50%.
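The same calculation as a tiny helper, matching the worked example above:

```python
def bounce_rate(sessions: int, engaged_sessions: int) -> float:
    """Bounce rate = (sessions - engaged sessions) / sessions."""
    return (sessions - engaged_sessions) / sessions

print(f"{bounce_rate(200, 100):.0%}")  # -> 50%
```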
Recommended reading: MeasureSchool’s guide on setting up bounce rate as a custom metric in GA4.
8. Check your 404 page
A 404 (not found) page plays an often underappreciated role on a website. If it’s not set up correctly, broken pages can go unnoticed in analytics reports, and if those pages have backlinks, your website will lose some link equity, too.
Broken pages without 404 pages also provide a poor user experience. Users could become frustrated and lose trust in your website.
How to check your 404 page is set up properly
To check if your 404 page is set up properly, simply add some random text to your domain name. For example: cnet.com/xyz
This displays CNET’s 404 page, which has branding and a link back to the home page to keep visitors on the site.
In addition to checking that the 404 page loads, we also need to check that it returns a 404 HTTP status code (and not a 200). To do this, use an online HTTP status checker like the one at ipvoid.com.
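You can also run the status check from a short script. Here’s a sketch, assuming `requests` and a made-up path, that confirms the server answers with a real 404 rather than a 200 (a so-called soft 404):

```python
# Confirm a made-up path returns a real 404, not a 200 "soft 404".
import requests

resp = requests.get("https://example.com/xyz-this-should-not-exist", timeout=10)
if resp.status_code == 404:
    print("OK: server returns a proper 404 status.")
else:
    print(f"Check this: server returned HTTP {resp.status_code} instead of 404.")
```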
How to set up a 404 page correctly
If your website uses WordPress, the 404 page will likely be in your SEO plugin settings. Here is a help page from Yoast that explains how to set one up.
There is also a help page from Webflow University for setting up a custom 404 page.
9. Investigate page speed
In my experience, page speed isn’t a significant ranking factor in all industries, provided the website loads in a reasonable amount of time. It can be in some competitive niches, such as car rental services, but it seems to matter a lot less than great content and decent backlinks.
That’s not to say you shouldn’t care about website speed. Google rankings aside, historical data shows that a slow site frustrates users and lowers conversion rates. From a technical SEO perspective, a slower site takes longer for Google to crawl — and its robots have a long list to get through!
How to check page speed
To check if Google thinks your website is on the slow side, take a look in Google Search Console under the “Core Web Vitals” tab. Currently, there are two graphs for the desktop and mobile views that show if your URLs are “Poor”, “Need Improvement”, or “Good”.
If your pages are graded “poor”, I recommend using Google’s PageSpeed Insights tool for further investigation. It will quickly give you a mobile and desktop score and areas where you can improve.
However, this works on a page-by-page basis. You can check page speed in bulk using a Screaming Frog crawl; check out their guide on bulk-testing page speed to make things easier.
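PageSpeed Insights also has a public API, so you can script checks across a list of URLs yourself. Here’s a minimal sketch assuming `requests`; light use works without an API key, though Google recommends one for volume:

```python
# Fetch the Lighthouse performance score for a URL via the PSI v5 API.
import requests

PSI = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

def psi_score(url: str, strategy: str = "mobile") -> float:
    """Return the 0-100 Lighthouse performance score for the given URL."""
    resp = requests.get(PSI, params={"url": url, "strategy": strategy}, timeout=60)
    resp.raise_for_status()
    data = resp.json()
    return data["lighthouseResult"]["categories"]["performance"]["score"] * 100

print(psi_score("https://example.com/"))  # hypothetical URL
```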
Common issues that slow down websites
Several issues can affect your website speed:
- A slow server
- Lazy loading not enabled
- Large images
- Incorrectly sized images
- CSS issues
Fixing these issues will help improve your website speed.
How fast should your page be?
It’s easy to say “as fast as possible,” but constantly optimizing for speed can lead to diminishing returns. I suggest using the PageSpeed Insights testing tool to compare the desktop and mobile scores for competitor websites in your niche.
Pro Tip: Do not compare home pages if this is not the most “important” page for your website/niche. For example, if your blog converts well then you should run a blog page through the test. Different pages have different templates and this can affect page speed significantly.
10. Check for toxic backlinks
Toxic backlinks are links from low-quality or spam websites that can make it appear your website is gaming Google’s search algorithm.
Google doesn’t take too kindly to this kind of thing and could slap your website with a link penalty or drop your rankings without any warning.
Many SEOs have debated Google’s ability to spot and ignore these “negative SEO” attacks. My experience is they can significantly impact your search traffic. I once removed a site-wide footer link I suspected was toxic. A few days later, traffic rebounded from less than 100 clicks a day to over 1,000.
So, it’s worth examining your existing backlink profile and regularly monitoring the new backlinks coming in.
How to check for toxic backlinks
You can often tell something is wrong if the number of new backlinks increases for no apparent reason.
If you think your website has attracted toxic backlinks, the best action is to remove them. This might be possible if you paid for a link-building service or swapped links with a site in your niche. If you can’t get a link removed, you can submit a link disavow file to Google.
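For reference, a disavow file is a plain .txt upload with one rule per line. Google’s documented format accepts full URLs, domain: rules, and # comments. A purely illustrative example:

```
# Paid footer link we could not get removed
https://spam.example/widgets/footer-links.html

# Entire domain is part of a link network
domain:shadyseo.example
```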
Recommended reading: Semrush has a great guide on dealing with toxic backlinks.
One Golden Rule of Technical SEO Audits
We always recommend keeping a change log before you change any website or hosting settings. Things can and do often go wrong or not turn out as expected.
In these instances, having a written record of what was done and by whom can help you revert the change quickly and easily.
Summary
It’s important to remember that websites are always changing. Whether it’s adding new content, updating existing articles, or changing the theme, there are lots of moving parts.
Performing a technical SEO audit periodically will keep your traffic and conversions high. It will also help you resolve sudden traffic losses, align with major Google updates (e.g., the Helpful Content Update), and fix issues that crop up during website changes and migrations. This will help Google crawl and index your website as easily as possible and give your content the very best chance of ranking highly.
Using our guide, you’ll be able to quickly perform a technical SEO audit, identify any issues, and fix them — in hours, not weeks.