A technical SEO checklist can help you identify the issues that most often get in the way of leveraging the ranking and traffic potential of your content campaigns, the issues that leave you feeling like you're spinning your wheels without seeing organic search growth. Having done hundreds of technical SEO audits in my career, I can tell you that I've never seen a site "ace" one. So the question becomes: how much is too much, and when do you move on to activities that could lead to larger improvements?
While I wouldn't claim that this checklist covers every possible technical SEO best practice, it does cover the items that can cause the most harm to your marketing efforts and that we find most often when taking clients through a more comprehensive technical SEO audit.
Why is a Technical SEO Checklist So Important, Anyway?
Often described as the "foundation" for your marketing efforts, technical SEO really comes down to two basic principles:
- Can Google see your site and find important content easily and quickly, regardless of how they got there?
- Can users find important content easily and quickly, regardless of how they got there?
At its core, Google is determined to provide the best answers to a user's query, and technical SEO is the practice of making it as simple as possible for Google to crawl and index your answers and to trust that they're giving Google's users a good experience. Over the years, as the needs of Google's users have changed, so too have the technical priorities for sites. For example, remember when mobile sites were a "nice to have"? Or when sites didn't always have to be secure? As users shifted to smart devices and grew more concerned about their personal data, Google emphasized the importance of these elements for anyone who wanted to succeed in organic search. While the elements of importance might change, the end goal remains the same, which is why the two basic principles have held true.
Google Search Console (GSC)
Not setting up Google Search Console is actually a fairly common mistake. Google Search Console is designed to reinforce the priorities that Google has for websites, and it is typically as close as you're going to get to finding out what Google actually thinks about your site. Make sure you have GSC profiles created for all of your site variations (or verify the site at the domain level using DNS information), for example:
- http://example.com
- http://www.example.com
- https://example.com
- https://www.example.com
Creating these profiles will allow you to check configurations on the valid URL but also look at duplicate content and potential security issues that can impact other variations of your site. Having GSC verified can provide insight into technical problems like:
- Site crawling and indexation
- Security issues
- Mobile usability
- XML Sitemap Errors
- Site Performance Beyond Analytics (not what visitors do once they reach the site, but how often your site is shown, and for which keywords)
- Duplicate Content Issues
There is No Duplicate Content Penalty, But…
Believe it or not, the most common technical problem found when crawling a site is actually an issue with content. Duplicate content across the site is a very common problem when reviewing the indexation of a site. Don't buy the hype, though: there is no "Duplicate Content Penalty." The issue arises when you give Google two different URLs to index but the content is exactly (or nearly) the same. This splits your authority between the URLs and leaves Google in the dark, left to its own devices. And generally speaking, if your site creates confusion for search engines about which page to rank, neither is going to do well.
The best way to check whether you have issues is to crawl the site using software like Screaming Frog or Sitebulb. These tools are relatively affordable, and if you're only using one for a single audit, you don't necessarily need a monthly subscription. If you don't want to spend money on a tool, you can check how many pages Google is indexing with a site: command within Google search results (ex. site:v9digital.com), or pull the data from Google Search Console. For the site: command, leave off the https:// or www., as this can also show you whether other variations of your URLs are being indexed that would be seen as duplicate content. Doing this manually is relatively simple if you have a small site, but we recommend saving yourself a headache and investing in a site crawling tool if your site is larger (> 100 URLs).
Whether it's a technical configuration issue (like both the HTTP and HTTPS versions being crawled and indexed) or a CMS feature that spins up multiple pages using the same content, this is important to address. Doing so ensures there is no confusion and that Google is indexing only one URL for each unique piece of content. Don't leave the outcome of the game up to the refs (Google, in this analogy), because complaining about bad calls after leaving Google to its own devices won't help. And for professional baseball fans, think of Google as Joe West, the worst umpire ever.
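If you'd rather script a quick check yourself, here's a minimal Python sketch (an illustration using made-up example.com URLs, not part of any of the tools above) that groups crawled URLs that collapse to the same page once the scheme, www., trailing slashes, and index.html are stripped away; any group with more than one member is a duplicate-content candidate:

```python
from collections import defaultdict
from urllib.parse import urlsplit

def normalize(url: str) -> str:
    """Collapse scheme, www., trailing-slash, and index.html variants
    into a single key so duplicates can be spotted."""
    parts = urlsplit(url.lower())
    host = parts.netloc
    if host.startswith("www."):
        host = host[4:]
    path = parts.path
    if path.endswith("/index.html"):
        path = path[: -len("index.html")]
    path = path.rstrip("/") or "/"
    return host + path

def duplicate_groups(urls):
    """Return only the groups where 2+ crawled URLs map to one page."""
    groups = defaultdict(list)
    for url in urls:
        groups[normalize(url)].append(url)
    return {key: found for key, found in groups.items() if len(found) > 1}
```

Feeding it http://example.com/about/, https://www.example.com/about, and https://www.example.com/about/index.html returns all three under the single key example.com/about, which is exactly the kind of split a crawler report (or the site: check above) should surface.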
XML Sitemaps
Having well-configured XML sitemaps is also critical to your foundational SEO and can help you spot indexation problems. Most companies never look at these after they are created, which is why you need to take some time to dig into what's going on with your XML sitemaps. Remember when we said it was important to verify Search Console? Submit your XML sitemap there, and Google will help you identify these issues.
First, you want to make sure the XML sitemap includes every organic landing page on the site; then make sure you're not including URLs that aren't meant to drive organic traffic or that lead to errors.
Other things to consider when evaluating your XML sitemaps include:
- The percentage of content getting indexed versus what was submitted
- For large sites, consider sitemap segmentation: breaking files into smaller groups of like content
- Optional tags like <changefreq> and <priority> can get in the way of natural indexation, so we suggest you remove them.
- Lastly, keep it updated. We highly recommend using a plugin or add-on to make sure XML sitemaps include pages as they’re added and stay consistently updated.
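For reference, a minimal, well-formed sitemap entry looks like the sketch below (example.com and the lastmod date are placeholders). It keeps the useful lastmod tag while skipping the optional tags mentioned above:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per organic landing page: no redirects,
       error pages, or URLs you don't want driving organic traffic -->
  <url>
    <loc>https://www.example.com/services/</loc>
    <lastmod>2021-06-15</lastmod>
  </url>
</urlset>
```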
Canonicalization
To help counteract duplicate content that is driven by the CMS or the overall website configuration, canonicals are a great way of controlling what Google sees versus what Google indexes. Improper canonicalization, or a lack of canonicalization, is one of the more common issues found when conducting a technical SEO audit.
301 redirects are the best way to handle canonicalization at the site level. If https://www.example.com is the valid, canonical version of your site, you want to make sure you have 301 redirects in place when you enter alternate site variations like:
- http://example.com
- http://www.example.com
- https://example.com
You also want to make sure that page-level variations like trailing slash and index.html are handled with 301 redirects.
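As an example of what those redirects can look like, here's a sketch for an Apache .htaccess file (assuming Apache with mod_rewrite, with example.com as a placeholder; other web servers and most CMS or SEO plugins have their own equivalents):

```apache
RewriteEngine On

# Send any non-canonical host or scheme to https://www.example.com with a 301
RewriteCond %{HTTP_HOST} !^www\.example\.com$ [NC,OR]
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://www.example.com/$1 [R=301,L]

# Collapse the index.html page-level variation onto the directory URL
RewriteRule ^(.*/)?index\.html$ https://www.example.com/$1 [R=301,L]
```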
On top of adding redirects to handle URL variations caused by the CMS, it's also best practice to add a "canonical tag" to every page of your site. This can serve as a stop-gap if redirects aren't working, or if the site uses search/filtering functionality that relies on query strings (URLs with a question mark in them). A canonical isn't as strong a signal as a 301 redirect in Google's eyes, but it serves as a recommendation to Google on which URL variation is the correct one to index.
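In practice, a canonical tag is a single line in the head of the page. In this sketch (example.com URLs are placeholders), both /services/?sort=price and /services/index.html would point Google back to the same clean URL:

```html
<head>
  <!-- Recommends which URL variation Google should index,
       even when this page is reached via a query-string URL -->
  <link rel="canonical" href="https://www.example.com/services/" />
</head>
```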
On-Page Content Indexation
You can’t have good SEO if Google can’t index the content on your pages, so checking whether their crawler is seeing everything on your pages is critical.
Use GSC's URL Inspection tool (which replaced the old Fetch and Render feature) on several pages; this will give you insight into how your pages render when Google loads them. While design files like CSS and JavaScript have historically often been blocked from being crawled, Google's focus on improving the mobile experience means you should be allowing access to these files/resources.
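The usual culprit is a robots.txt file that still blocks theme or script directories. A sketch of the fix (the directory name below is a typical WordPress path, used purely as an example) is simply to stop disallowing the resources Google needs to render the page:

```text
User-agent: *
# Don't block the CSS and JavaScript Google needs to render pages
Allow: /*.css$
Allow: /*.js$
# e.g., avoid blanket rules like "Disallow: /wp-includes/" that
# hide render-critical scripts from Googlebot
```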
Page Speed
Both users and Googlebot love fast sites! With 5G being adopted by more carriers and high-speed internet becoming more common, you might think your site is already fast, but it's important to remember that not everyone is on the fastest plan and not everyone has unlimited data. Going back, again, to Google wanting to improve the experience for users even after they've left their initial Google search: ensuring that the site loads quickly is a lightweight ranking factor for now, but it's only rational to think it will become a bigger focus.
We like to evaluate page speed based on what Google Analytics is reporting, and then use tools like Pingdom's Page Speed Tester to evaluate the specific issues affecting site speed. Doing so allows us to create a burndown list of opportunities we can pass off to a development team to improve load time. The biggest culprit of site speed issues that we tend to see with B2B and B2C clients alike is image size!
Google’s Mobile Page Speed Checker helps you look at your site’s load time specifically from a mobile user perspective. This application will email you a list of things to improve after making an initial evaluation.
When running these tools, there are a few things to remember:
- The score in Google Page Speed Insights is based on what is implemented, not on the actual load speed of the site.
- Both tools are meant for developers, so if worse comes to worst, pull the emergency handle and bring in a developer. If, however, you want additional definitions for what you're seeing in Google's Page Speed Insights, here is a guide.
- The goal for page size is ~1MB per URL. Remember, for a user on a slow connection or without unlimited data, viewing multiple pages on the site can add up quickly.
- The load time should be under 3 seconds. Beyond 3 seconds you will begin to lose a larger and larger share of your potential audience who give up and leave.
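To see how the ~1MB and 3-second targets interact, here's a back-of-the-envelope Python sketch (bandwidth-only arithmetic; it ignores latency, server response, and render time, so real loads are slower than this):

```python
def transfer_seconds(page_bytes: int, mbps: float) -> float:
    """Seconds just to download page_bytes over a connection of
    `mbps` megabits per second (1 byte = 8 bits)."""
    return (page_bytes * 8) / (mbps * 1_000_000)

# A ~1MB page on a slow ~1.6 Mbps mobile connection blows the
# 3-second budget on bandwidth alone:
slow = transfer_seconds(1_000_000, 1.6)   # 5.0 seconds
fast = transfer_seconds(1_000_000, 10.0)  # 0.8 seconds
```

The takeaway: a page that feels instant on office Wi-Fi can already be over budget for the mobile users you most need to keep.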
Mobile-Friendliness
You can no longer expect to compete in SEO if your pages are not mobile-friendly. Going back nearly a decade, as user behavior started to shift toward smart devices, Google has leaned in hard on creating a great mobile experience and has gone as far as making it a component in determining a site's rankings. Over time, this has gone from pushing all sites to have a mobile site, to having responsive sites, to having fast responsive sites, to updating the search algorithm to evaluate rankings based on the mobile site. Even changes in the past couple of years, with desktop results aligning to updated mobile search results and the addition of featured snippets and voice search, show Google continuing to lean in on the importance of mobile-friendliness.
Just because your site is responsive doesn't mean that your smartphone users, or Google, are happy with the experience.
Use Google's Mobile-Friendly Test as your first line of defense. That tool doesn't require a verified GSC account, but it checks one page at a time; with a verified site, the mobile usability report in Search Console will flag problems as they come up while Google crawls the site for you. Both of these tools look at what is now the bare minimum expectation for mobile-friendliness. Is the text large enough? Do you have clickable items that are too close together? Are the viewports set so a user gets the appropriate experience on different devices and across different browsers?
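On the viewport question, the standard fix is a one-line meta tag in the head of every page (shown here as a generic sketch; responsive themes usually include it already):

```html
<!-- Tells mobile browsers to render at the device's width instead of
     a zoomed-out desktop layout -->
<meta name="viewport" content="width=device-width, initial-scale=1">
```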
We also like to use Chrome Developer Tools to "spoof" various mobile devices and then take some time surfing the site. We'll look at conversion points such as mobile call functionality or poorly configured forms that may have usability problems. Beyond the basic expectations, use your own experience testing the site and the data you see from mobile users in Google Analytics. If you want to get fancy, install a CRO tool like Lucky Orange to see if users are having issues that Google isn't looking for, such as with forms or CTAs. For most of you reading this blog, with a few exceptions (largely B2B), the number of mobile users visiting your site will exceed the number of desktop users.
While this technical SEO checklist intends to highlight some of the more important and common findings from a technical SEO audit, it really only scratches the surface. The most important thing to remember is that the realistic goal isn't a perfectly configured website that follows every single best practice. While I won't talk you out of it if you want to nerd out with me and my closest developer friends, having a site that follows every known technical best practice isn't going to lead to 10x growth in organic sessions (assuming, of course, that your site is being crawled and indexed). There are always compromises coming out of a technical SEO audit, as not every site is built the same. Shoot for the moon, but don't be disappointed if you land amongst the stars.
With the priorities signaled by Google's algorithm updates and general search engine results page treatments, focusing a larger part of your effort on creating great content that answers common questions is more likely to drive significant results, especially in the short term. Don't get lost in the technical weeds and lose sight of what your website is intended to accomplish. Make technical SEO an ongoing priority, because it will undoubtedly help long term, but don't be afraid of the words "good enough for now." And please, don't tell Google I said that.
Need more help fixing your technical SEO issues or not sure where to even start? Contact the SEO team at Volume Nine today – we’re here to help!