On-page, off-page, and technical SEO can be thought of as the three pillars of organic search engine optimization.
Of the three, technical SEO is the most often neglected, most likely because it’s the trickiest to master. However, with how competitive search results are now, marketers cannot afford to shy away from the challenges of technical SEO: having a crawlable, fast, and secure site has never been more essential to making sure your site performs well and ranks well in search engines.
Because technical SEO is such a vast and growing topic, this piece won’t cover everything needed for a full technical SEO audit. However, it will address six fundamental aspects of technical SEO that you should be looking at to improve your website’s performance and keep it healthy and effective. Once you’ve got these six bases covered, you can move on to more advanced technical SEO tactics. But first...
How can conducting a technical SEO audit help your website rank better?
A technical SEO audit is a process during which you review the technical aspects of your website’s SEO. It checks the health of a website and identifies the fixes that might be needed to improve it.
Search engine bots crawl the web to find pages and websites. The bots then evaluate your pages against the different ranking factors before ranking your website in the search results.
SEO is changing constantly, and your competitors are keeping up with the changes too. For this reason, you need to stay up to date to remain competitive and relevant. If you don’t look after your website’s health, you may end up losing traffic to your competitors.
As search algorithms and technology change regularly, it is good practice to perform mini-audits monthly. You should also make sure you conduct a full technical SEO audit every 4-5 months.
Technical SEO audits are necessary because even if you’ve spent a long time creating excellent content, your users may never see it if there are issues with your website’s crawlability or indexability.
However, even if your site can be found by internet users, its rankings can still be harmed by performance-related technical factors. Page load time is one such ranking factor, which means that a slow website is unlikely to reach the top positions in SERPs (search engine results pages).
Internet users are even less patient than Google’s crawlers and will leave your site if it takes too long to load.
Likewise, a poorly structured website can confuse your users. A site that is easy to navigate delivers a better user experience and, consequently, generates more leads.
During a technical SEO audit, you may also find that mobile users face numerous problems while browsing your website. Given that mobile devices account for more than half of all web traffic, such issues could cost you a significant amount of revenue. Let’s also not forget that mobile-friendliness is an important ranking factor.
The main factors that affect your website’s SEO can be broadly grouped into three categories.
You need to audit each of these factors at regular intervals. This will ensure that you’re always up to date with the changing conditions in the industry.
Note: One of the most important factors to keep in mind is the mobile readiness of your website. As around 60% of all searches happen on mobile, Google has started giving more weight to mobile-friendly websites.
Now, let’s review the six things you should check to conduct a quick technical SEO audit.
There’s no use writing pages of great content if search engines cannot crawl, render, and index those pages. Therefore, you should start by reviewing your robots.txt file. This file is the first port of call for any web-crawling software when it arrives at your site. Your robots.txt file determines which parts of your website should and should not be crawled by “allowing” or “disallowing” the behavior of specific user agents. The robots.txt file is publicly accessible and can be found by adding /robots.txt to the end of any root domain. Here is the example for the SEO Tools Kit site:
By specifying where these user agents are not allowed to go, you save bandwidth, server resources, and crawl budget. You also want to make sure you haven’t stopped search engine bots from crawling important parts of your website by accidentally “disallowing” them. Since robots.txt is the first file a bot sees when crawling your site, it is also good practice to point to your sitemap from it.
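If you prefer to test these rules programmatically, Python’s standard library ships with a basic robots.txt parser. The snippet below is a minimal sketch: the robots.txt contents and the example.com URLs are purely illustrative (not SEO Tools Kit’s actual file), so swap in your own domain and rules.

```python
from urllib import robotparser

# A purely illustrative robots.txt: it blocks every user agent from two
# hypothetical sections of the site and points crawlers to the sitemap.
EXAMPLE_ROBOTS_TXT = """User-agent: *
Disallow: /wp-admin/
Disallow: /cart/

Sitemap: https://www.example.com/sitemap.xml
""".splitlines()

parser = robotparser.RobotFileParser()
parser.parse(EXAMPLE_ROBOTS_TXT)

# Check whether Googlebot may crawl a handful of URLs under these rules.
for url in ["https://www.example.com/", "https://www.example.com/cart/checkout"]:
    verdict = "crawlable" if parser.can_fetch("Googlebot", url) else "blocked"
    print(url, "->", verdict)

# Python 3.8+ also exposes any Sitemap lines the file declares.
print("Sitemaps declared:", parser.site_maps())
```

To test your live file instead, call parser.set_url("https://your-domain.com/robots.txt") followed by parser.read() before running the same checks.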
Note: You can also edit and test your robots.txt file in the old Google Search Console.
While Google has done a relatively good job of carrying the most important features of the old tool over into the new Google Search Console, for many digital marketers the new version still offers less functionality than the old one. This is particularly relevant when it comes to technical SEO. At the time of writing, the crawl stats area in the old Search Console is still viewable and is essential for understanding how your site is being crawled.
The report shows three main graphs with data from the last 90 days: pages crawled per day, kilobytes downloaded per day, and time spent downloading a page (in milliseconds). Together, these summarize your website’s crawl rate and its relationship with search engine bots. Ideally, you want your website to have a consistently high crawl rate; this means that search engine bots visit your website regularly and find it fast and easy to crawl.
If you find notable crawl errors or fluctuations in either the crawl stats or coverage reports, you can investigate further by carrying out a log file analysis. Obtaining the raw data from your server logs can be a bit of a pain, and the analysis is quite advanced, but it can help you identify exactly which pages can and cannot be crawled, which pages are prioritized, where crawl budget is being wasted, and which server responses bots encounter while crawling your website.
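As a starting point, even a short script can pull useful numbers out of a raw access log. The sketch below assumes your server writes the common “combined” log format and that the log has been exported to a file called access.log; both the path and the format are assumptions you should adjust for your own setup.

```python
import re
from collections import Counter

# Matches the common "combined" log format and captures the request path,
# the HTTP status code, and the user-agent string.
LOG_LINE = re.compile(
    r'\S+ \S+ \S+ \[.*?\] "(?P<method>\S+) (?P<path>\S+) [^"]*" '
    r'(?P<status>\d{3}) \S+ "[^"]*" "(?P<agent>[^"]*)"'
)

status_counts, crawled_paths = Counter(), Counter()

with open("access.log", encoding="utf-8", errors="ignore") as log:
    for line in log:
        match = LOG_LINE.search(line)
        if not match or "Googlebot" not in match.group("agent"):
            continue  # we only care about Googlebot hits here
        status_counts[match.group("status")] += 1
        crawled_paths[match.group("path")] += 1

print("Googlebot responses by status code:", dict(status_counts))
print("Most-crawled URLs:", crawled_paths.most_common(10))
```

Bear in mind that anyone can fake a Googlebot user agent, so for a serious analysis you would also verify that the requests really come from Google’s IP ranges.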
Now that we’ve analyzed whether Googlebot can actually crawl our site, we need to check whether the pages on our site are being indexed. There are several ways to do this. Diving back into the Google Search Console coverage report, we can look at the status of every page of the website.
In this report we can see which pages are indexed and which are not, grouped by status: errors, valid pages with warnings, valid pages, and excluded pages.
You can also analyze specific URLs using the URL Inspection tool. Perhaps you want to check whether a new page you’ve added has been indexed, or troubleshoot a URL if there has been a drop in traffic to one of your main web pages.
Another good way to check the indexability of your site is to run a crawl. One of the most powerful and handy pieces of crawling software is Screaming Frog. Depending on the size of your website, you can use the free version, which has a crawl limit of 500 URLs and more limited capabilities, or the paid version, which costs £149 per year and offers no crawl limit, greater functionality, and access to APIs.
Once the crawl has run, you can see two columns related to indexing: Indexability, which shows whether a URL is indexable or non-indexable, and Indexability Status, which gives the reason why, such as a noindex directive or a canonical pointing to another URL.
This tool is also a great way of auditing your site in bulk to understand which pages are being indexed, and will consequently appear in the search results, and which are non-indexable. Sort the columns and look for anomalies. Using the Google Analytics API is a good way of identifying your most important pages so you can prioritize checking their indexability.
Finally, the easiest way of checking how many of your web pages are indexed is by using the site: search operator. In the search bar, type site: followed by your domain and press enter. The search results will show you every page on your website that has been indexed by Google. Here’s an example:
Here we see that seotoolskit.co has around 66 URLs indexed. This function gives you a very good idea of how many pages Google is currently storing. If you notice a large discrepancy between the number of pages you think you have and the number of pages being indexed, it is worth investigating further.
Using these three techniques, you can build up a good picture of how your site is being indexed by Google and other search engines and make changes accordingly.
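For a quick, repeatable spot-check of individual URLs, you can also look for the two most common blocking signals yourself: a noindex value in the X-Robots-Tag response header and a robots meta tag in the HTML. The sketch below uses the third-party requests library and a hypothetical URL list; it is a rough heuristic, not a substitute for the Search Console reports.

```python
import requests

# Hypothetical URLs to check; replace with your own important pages.
URLS = ["https://www.example.com/", "https://www.example.com/blog/"]

for url in URLS:
    response = requests.get(url, timeout=10, headers={"User-Agent": "indexability-spot-check"})
    header_noindex = "noindex" in response.headers.get("X-Robots-Tag", "").lower()
    # Crude check: looks for a robots meta tag and a noindex value anywhere in the HTML.
    html = response.text.lower()
    meta_noindex = 'name="robots"' in html and "noindex" in html
    verdict = "non-indexable signal found" if (header_noindex or meta_noindex) else "no blocking signals found"
    print(f"{response.status_code} {url} -> {verdict}")
```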
The value of a comprehensive and well-structured sitemap cannot be underestimated when it comes to SEO. Your XML sitemap is a map of your site for Google and other search engine bots. Essentially, it helps these crawlers find and rank your site’s web pages.
There are some crucial elements to consider when it comes to an effective sitemap: it should be formatted as a valid XML document, follow the XML sitemap protocol, include only the canonical, indexable versions of your URLs, and be updated whenever new pages are added.
If you use the Yoast SEO plugin, it can generate an XML sitemap for you automatically. If you’re using Screaming Frog, its sitemap analysis is also very detailed: you can see which URLs are in your sitemap, which are missing, and which are orphaned.
Make sure your sitemap includes all your most important pages, doesn’t include pages you don’t want Google to index, and is structured correctly. Once you have done all this, resubmit your sitemap to Google Search Console.
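A simple script can help with that last check by reading your sitemap and flagging entries that no longer return a 200 response. This sketch assumes a standard XML sitemap at /sitemap.xml on a hypothetical domain and uses the third-party requests library; sitemap index files and very large sitemaps would need extra handling.

```python
import xml.etree.ElementTree as ET
import requests

SITEMAP_URL = "https://www.example.com/sitemap.xml"  # hypothetical; use your own
NAMESPACE = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

sitemap = requests.get(SITEMAP_URL, timeout=10)
root = ET.fromstring(sitemap.content)

# Every <loc> in the sitemap should point at a live, indexable URL.
for loc in root.findall(".//sm:loc", NAMESPACE):
    url = loc.text.strip()
    # Some servers reject HEAD requests; switch to requests.get if so.
    status = requests.head(url, timeout=10, allow_redirects=False).status_code
    if status != 200:
        print(f"Check this sitemap entry: {url} returned {status}")
```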
Google recently announced the rollout of mobile-first indexing. This means that instead of using the desktop version of a page for ranking and indexing, Google uses the mobile version of your page. This is all part of keeping up with how your users are engaging with content online. With 52% of global internet traffic now coming from mobile devices, ensuring your website is mobile-friendly is more important than ever.
Google’s Mobile-Friendly Test is a free tool you can use to check whether your page is mobile responsive and easy to use. Enter the URL of your page, and it will show you how the page is rendered on mobile and indicate whether it is mobile-friendly.
It’s important to check your site manually, too. Use your mobile phone to navigate across your site, spotting any errors along your key conversion pathways. Check that all contact forms, phone numbers, and key service pages are functioning correctly. If you’re on desktop, you can right-click and inspect the web page to emulate a mobile device.
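One basic signal worth checking on every template is the responsive viewport meta tag; pages without it almost always render poorly on phones. The sketch below uses the third-party requests library and a hypothetical list of pages, and it only confirms the tag is present, so treat it as a complement to the Mobile-Friendly Test rather than a replacement.

```python
import requests

# Hypothetical key pages; swap in your own conversion pathways.
PAGES = ["https://www.example.com/", "https://www.example.com/contact/"]

for page in PAGES:
    html = requests.get(page, timeout=10).text.lower()
    has_viewport = 'name="viewport"' in html
    status = "viewport meta tag found" if has_viewport else "no viewport meta tag, check the template"
    print(f"{page}: {status}")
```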
If you haven’t built your website to be mobile-friendly, you should consider addressing this immediately. Many of your competitors will have already done so, and the longer you neglect it, the further behind you’ll fall. Don’t miss out on the traffic and potential conversions that mobile-friendliness brings by leaving it any longer.
Page speed is now an important ranking factor in its own right. Having a site that is fast, responsive, and user-friendly is the name of the game for Google and other search engines.
You can assess your site’s speed with a whole variety of tools available on the SEO Tools Kit.
Google PageSpeed Insights
Google PageSpeed Insights is another powerful, efficient, and free Google tool. It gives your site a rating of “Fast,” “Average,” or “Slow” on both mobile and desktop, and includes recommendations for improving your page speed.
Test your homepage and core pages to see where your site is falling short and what you can do to improve its speed.
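If you want to test more than a handful of pages, PageSpeed Insights also has a public API you can script against. The sketch below assumes the v5 endpoint and the usual Lighthouse response structure at the time of writing, uses the third-party requests library, and runs against a hypothetical page list; check Google’s PSI documentation if the response fields have changed, and add an API key for anything beyond light usage.

```python
import requests

PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"
PAGES = ["https://www.example.com/", "https://www.example.com/services/"]  # hypothetical

for page in PAGES:
    data = requests.get(
        PSI_ENDPOINT, params={"url": page, "strategy": "mobile"}, timeout=60
    ).json()
    # The Lighthouse performance score is returned as a value between 0 and 1.
    score = data["lighthouseResult"]["categories"]["performance"]["score"]
    print(f"{page}: mobile performance score {score * 100:.0f}/100")
```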
It’s important to understand that when marketers talk about page speed, they aren’t just referring to how fast the page loads for a person, but also how easy and fast it is for search engines to crawl. This is why it’s good practice to minify and bundle your CSS and JavaScript files. Don’t rely on checking how the page looks to the naked eye; use online tools to fully analyze how the page loads for users and search engines.
Google has another free tool for site speed that focuses on mobile in particular, reinforcing how important mobile site speed is to Google. Test My Site is a great tool from Google that provides an in-depth analysis of how your website performs on mobile, including:
How fast your site is on different connections.
It will show your page loading speed in seconds, give it a rating, and tell you whether it is slowing down or speeding up.
This is especially valuable if you own an eCommerce site, because it shows how much potential revenue you are losing from poor mobile site speed and the positive impact small improvements can make to your bottom line.
Conveniently, all of this can be compiled into a free, easy-to-understand report.
Google Analytics
You can also use Google Analytics to see comprehensive diagnostics of how to improve your site speed. The site speed area in Google Analytics, found in Behaviour > Site Speed, is stuffed full of useful data including how specific pages perform in different browsers and countries. You can check this against your page views to make sure you are prioritizing and optimizing your most important pages.
Your page load speed depends on several different factors, but there are some common fixes you can look at once you’ve done your analysis, including compressing images, enabling text compression and browser caching, minifying CSS and JavaScript, reducing redirects, and serving static assets through a CDN.
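Two of those fixes, text compression and browser caching, are easy to spot-check from response headers alone. This sketch uses the third-party requests library and hypothetical asset URLs; point it at your own pages, stylesheets, and scripts.

```python
import requests

# Hypothetical assets; use your own HTML pages, CSS, and JavaScript URLs.
ASSETS = ["https://www.example.com/", "https://www.example.com/static/main.css"]

for asset in ASSETS:
    headers = requests.get(asset, timeout=10, headers={"Accept-Encoding": "gzip, br"}).headers
    encoding = headers.get("Content-Encoding", "none")  # gzip/br means compression is on
    cache = headers.get("Cache-Control", "not set")     # tells browsers how long to cache
    print(f"{asset}\n  Content-Encoding: {encoding}\n  Cache-Control: {cache}")
```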
Finally, it’s time to look at duplicate content on your website. As most people in the digital marketing industry know, duplicate content is a big no-no for SEO. Although there is no Google penalty for duplicate content, Google doesn’t like duplicated copies of the same information. They serve little purpose to the user, and Google can struggle to decide which page to rank in the SERPs, ultimately meaning it’s more likely to serve one of your competitors’ pages.
There is one quick check you can conduct using Google search operators. Enter “info:www.your-domain-name.com” into the search bar.
Then head to the last page of the search results. If you have duplicate content on your site, you may see the following message:
If you have duplicate content showing here, it’s worth conducting a crawl using Screaming Frog. You can then sort by Page Title to see whether any pages on your website share the same title.
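If you would rather script that check, the sketch below compares the title tags of a list of URLs and reports any that are shared. The URL list is hypothetical; in practice you might feed in the URLs exported from your Screaming Frog crawl. It uses the third-party requests library and a deliberately simple regex, so treat matching titles as a prompt to investigate rather than proof of duplication.

```python
import re
from collections import defaultdict

import requests

# Hypothetical URLs; trailing-slash and parameter variants are common culprits.
URLS = [
    "https://www.example.com/",
    "https://www.example.com/about/",
    "https://www.example.com/about",
]

titles = defaultdict(list)
for url in URLS:
    html = requests.get(url, timeout=10).text
    match = re.search(r"<title[^>]*>(.*?)</title>", html, re.IGNORECASE | re.DOTALL)
    if match:
        titles[match.group(1).strip()].append(url)

for title, pages in titles.items():
    if len(pages) > 1:
        print(f'Possible duplicates sharing the title "{title}": {pages}')
```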
These are the basics for checking your technical SEO health, and any digital marketer worth their salt will have these fundamentals in place for any website they manage. What is really fascinating is how much deeper you can go into technical SEO. It may seem daunting, but hopefully, once you’ve done your first technical audit, you’ll be keen to see what other improvements you can make to your website. These six steps are a great start for any digital marketer looking to make sure their website works well for search engines. Most importantly, they are all free, so go get started with your technical SEO audit!