Doing an SEO audit is one of those things that feels very complicated but is, in fact, simple. After doing many audits, I came to realize that we SEOs tend to overcomplicate it. This article aims to fix that.
Let’s dive in.
The first thing we need to take care of is to make sure Google is indexing your site.
If Google isn’t able to index your site then none of the things you do will matter, because Google won’t be able to access your site to see them.
Go to Google and type the URL of your site. Add “site:” at the beginning and click search.
If you can see your site in the search results, it means that Google is able to index it and you’re good to go.
If you want to make sure Google is indexing all pages on your site, scroll down and look for each of them. You can also paste the URL of the page you want to check directly into Google, just make sure you add “site:” at the beginning.
Doing SEO without crawling and indexing is like living without breathing. If your website is not being crawled by Google you might as well forget about any organic traffic.
So, how does Google work?
The first step of SEO optimization is making sure that Google’s spiders are able to efficiently crawl your website. The spiders will store your site in the index and make sure it shows up on the SERP.
There are many automation tools for this process because executing it manually would be unthinkable, especially when it comes to large sites with hundreds of thousands of pages.
The most common (and free) tool for performing such an audit is Screaming Frog.
There’s a free and a paid version, but the free version will be enough for most of the things you’ll need -> https://www.screamingfrog.co.uk/seo-spider/
There are different sections that analyze different elements of the site.
- Title tag – duplicated, missing, too long or too short.
- Meta description – duplicated, missing, too long or too short.
- H1 tag – duplicated, missing, more than one or too long.
- H2 tag – duplicated, missing or too long.
- URL – too long, uppercase, lowercase, parameters, etc.
- Images – missing or too long ALT tag.
- Response codes – internal redirects, non-existent pages, server errors, and others.
- Directives – canonical, no canonical, next/prev, noindex, nofollow, etc.
What should you look out for?
1. Make sure there are not too many temporary redirects (302)
302 is the response code for temporary redirects which can be a problem if you want to redirect a page permanently. In this case, you’ll need to use 301 redirects instead.
2. Fix the “not found” error (404)
404 is the response code for ‘not found’. People love deleting pages without thinking, and more often than not, some of these pages have value. So, by deleting them, you can lose much more than you bargained for.
To be safe, follow this rule:
If a page has no traffic and no backlinks, you’re probably better off deleting it. If it does have traffic and backlinks, redirect it to the relevant page instead.
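That rule can be expressed as a tiny helper. This is only an illustrative sketch: the function name and inputs are made up, and the traffic and backlink counts would come from your analytics and backlink tool exports.

```python
# Hypothetical sketch of the delete-vs-redirect rule for 404 pages.
# "traffic" and "backlinks" are counts you would pull from your
# analytics and backlink tools; the names here are illustrative.

def action_for_404(traffic: int, backlinks: int) -> str:
    """Return the suggested fix for a page that returns 404."""
    if traffic == 0 and backlinks == 0:
        # Nothing of value is lost: let the page stay deleted.
        return "delete"
    # The page still earns visits or links: 301-redirect it
    # to the most relevant live page instead.
    return "301 redirect to relevant page"
```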
3. Look for duplicate page titles, meta descriptions, H1s, H2s
Having a unique page description and structure is important. It shows Google that each page is unique which boosts the likelihood of it having a high ranking on Google.
4. Look for missing page titles, meta descriptions, H1s, H2s
Sometimes it can be easy to forget to add some of the important elements to a page. Make sure that’s not the case on your website by looking for missing page titles, meta descriptions, etc.
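As a rough illustration of how such a check can be automated with only Python’s standard library (a real audit would run this over every crawled page, which is what Screaming Frog does for you):

```python
# Sketch: parse a page's HTML and report which of the key on-page
# elements (title, meta description, H1, H2) are missing.
from html.parser import HTMLParser

class ElementAudit(HTMLParser):
    def __init__(self):
        super().__init__()
        self.found = set()

    def handle_starttag(self, tag, attrs):
        # HTMLParser lowercases tag and attribute names for us.
        if tag in ("title", "h1", "h2"):
            self.found.add(tag)
        elif tag == "meta" and dict(attrs).get("name") == "description":
            self.found.add("meta description")

def missing_elements(html: str) -> set:
    audit = ElementAudit()
    audit.feed(html)
    return {"title", "meta description", "h1", "h2"} - audit.found
```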
5. Make sure your images aren’t too heavy
Heavy images are going to slow your site down. As a rule of thumb, try not to use images heavier than 100k.
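A small sketch for finding heavy images in a local copy of your site. The folder path and extension list are assumptions; the 100 KB threshold follows the rule of thumb above.

```python
# Sketch: flag images heavier than ~100 KB in a local folder.
import os

MAX_BYTES = 100 * 1024  # ~100 KB rule of thumb

def heavy_images(folder: str) -> list[str]:
    """Return paths of image files that exceed MAX_BYTES."""
    exts = (".jpg", ".jpeg", ".png", ".gif", ".webp")
    heavy = []
    for root, _dirs, files in os.walk(folder):
        for name in files:
            if name.lower().endswith(exts):
                path = os.path.join(root, name)
                if os.path.getsize(path) > MAX_BYTES:
                    heavy.append(path)
    return heavy
```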
Google Cache Check-up
It is important to check if Google properly crawls and previews your pages. It’s possible that you have a block on some of your scripts, which can prevent Google from properly crawling your site. To check this:
type “cache:” followed by your URL into Google.
This is going to show you the cached version of your site, so you will be able to see if there are any problems with the preview.
You can also see the date and time, as well as the site in text only, without styling and formatting. All of the links should be visible on the page.
If everything is as it should be, the cached version of your site should look almost the same as the normal version.
Google Search Console
Another free tool you should use is Google Search Console.
If you’ve never used it before, here’s a great “how to” guide provided by Google -> click
What makes Google Search Console so good?
Firstly, it’s provided by Google. SEO optimization is essentially finding out the best way to work with Google, so most of the tools provided by tech giants are essential. Also, it’s completely free. SEO tools tend to be quite expensive, so this is a huge plus.
There are a lot of things you can do with the tool, including checking the status of the index, finding out how many pages are excluded from indexation, uncovering valid pages and errors, and inspecting URLs.
Here are the most important things to do:
Go to “Coverage”
Click on “Inspect URLs”. Here you can type any URL you want to see if it’s being crawled, its mobile usability and lots of other stuff.
Click on “Performance” to see how your website is actually performing on Google: how many impressions and clicks you’ve had for different keywords and pages.
These are just a few examples of what you can do with the tool. Google Search Console provides so much useful information about your site that it deserves its own article.
The structure of the site is an extremely important element of SEO optimization of the site. The site must have a clear structure and easy-to-use navigation. This way, users will be able to find the necessary information quickly and easily.
This is not only important for users but also for the so-called link juice and for crawling.
If your site has a good structure the crawlers will be able to easily locate your most valuable pages and your links will be able to flow in-between your pages making them even more powerful.
To ensure that your site has a good structure, you need to make sure it’s not too deep.
This refers to the accessibility of each page. In an ideal structure, every page must be accessible in a maximum of 3 clicks from the home page or anywhere else on the site. But when the structure is too deep, this isn’t the case. The easiest way to fix this is to add more internal links to your pages and to make sure that URLs on your site aren’t too long.
You want your URLs to look like this:
site.com/category/product
Rather than this:
site.com/category-1/category-2/category-3/category-4/product
As you can see from the first example, you’d be able to access the product page with just three clicks from the home page. Whereas, in the second example, the crawler would have to go through a lot of pages just to get to the product page.
If you have an ecommerce shop this is particularly important.
The rule of thumb is: the further a page is from the most powerful page (which is usually the home page), the less powerful it is and the less likely it is to get indexed.
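Counting URL path segments is a rough proxy for how deep a page sits in the structure (real click depth requires a full crawl). The helper below is only an illustrative sketch:

```python
# Sketch: estimate structural depth from the number of URL path segments.
# Real click depth needs a crawl; this is just a quick proxy.
from urllib.parse import urlparse

def path_depth(url: str) -> int:
    """Number of non-empty path segments in a URL."""
    path = urlparse(url).path
    return len([seg for seg in path.split("/") if seg])
```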
The speed of your website is not only important to Google but also to your users. There are plenty of studies showing that if your site takes more than 2-3 seconds to load, 50% of your visitors will bounce immediately.
A few years ago, Google also officially announced that speed is now a ranking factor.
As I mentioned, speed is extremely important not only for Google but also for your users, so this could be influencing your rankings on Google. There are a lot of handy tools to test the loading speed of your site.
For example, Google has an awesome tool for this: https://web.dev
Other popular tools include: GTmetrix, Pingdom, WebPageTest.
From our experience, we can safely say that one of these tools won’t be enough, so it would be smarter to check your site using a few of them. Most tools add helpful recommendations too, so you can implement their advice.
My personal favorite is WebPageTest.
It’s really important to remember that you don’t need to have a perfect result. Sure, the higher the speed the better: your conversion rate will improve and you’ll increase your chances of getting high rankings on Google.
But, and this is important, you need to cross-check your website against your competitors on Google to determine the difference in performance. Based on this you have to make a decision whether or not to take action.
In any case, if your website is slower than your competitors you need to do something about it.
Here are a few things to check:
- Your images are resized and are not too heavy -> https://squoosh.app/
- You have some kind of caching plugin installed, like Cache Enabler (if you’re using WordPress)
- You have a plugin for optimizing code installed, like Autoptimize (if you’re using WordPress)
- You don’t have too many heavy videos or animations on your pages
If you can tick most of these boxes, you should be good to go (in most cases).
At the end of the day, the single most important thing for speed is your hosting provider, and this takes time and resources to fix. As a rule of thumb, you don’t want your website to take more than 3 seconds to load or your time-to-first-byte (TTFB) to exceed 600ms.
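Those rules of thumb can be encoded as a quick checklist. The function name and inputs below are illustrative; the measurements themselves would come from a tool such as WebPageTest.

```python
# Sketch: apply the speed rules of thumb discussed above.
# load_s and competitor_load_s are full load times in seconds,
# ttfb_ms is time-to-first-byte in milliseconds.

def speed_verdict(load_s: float, ttfb_ms: float,
                  competitor_load_s: float) -> list[str]:
    """Return a list of speed issues; empty means you're fine."""
    issues = []
    if load_s > 3.0:
        issues.append("load time above 3 s")
    if ttfb_ms > 600:
        issues.append("TTFB above 600 ms")
    if load_s > competitor_load_s:
        issues.append("slower than competitor")
    return issues
```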
We’re officially in the mobile-first indexing era. This means that, from now on, Google is going to start the crawling process from the mobile version of your site. And that’s kind of a big deal.
If you don’t want your rankings to suffer, you’ll need to design your site with a lot more thought for the experience of smartphone users from now on. The good news is, you can test your site pretty quickly to find out how prepared you are for this:
A tip: run the tool a few times, as it can sometimes be a little inaccurate. Also, don’t forget to pull your phone out and look at your site on the device to make sure that it’s responsive and easy to navigate from a user’s perspective.
The use of encrypted links on your site is extremely important in order to protect the privacy of your visitors. There are still too many people out there who think that SSL certificates aren’t needed, but they’re wrong.
If your site doesn’t have one, Google is likely to hit you with a little penalty. “Let’s Encrypt” offers free SSL/TLS certificates for anyone who wants to use them, and all serious hosting companies now offer the installation of a free SSL certificate from “Let’s Encrypt” via cPanel.
So if your website URL currently looks like this:
http://yoursite.com
You need to take action to turn it into this:
https://yoursite.com
This is one of the most common problems we come across, which is why it needs special attention. Firstly, you need to start by checking for duplicate versions of your domain:
- http://yoursite.com
- http://www.yoursite.com
- https://yoursite.com
- https://www.yoursite.com
Enter each one of these versions of your site into the address bar at the top of your browser.
All four versions of the site should redirect to only one of them. If the redirects don’t happen as they should, you will need to add 301 redirects to resolve the issue. This will show Google that there is only one preferred version of your site. We’ve come across sites without a single one of these redirects, which results in Google thinking there are four versions of your website which are exactly the same.
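A tiny sketch for generating the four common versions of a domain (protocol × www) that you would then check for 301 redirects; the helper and domain are illustrative.

```python
# Sketch: build the four protocol/www combinations of a domain so they
# can each be tested for a redirect to the one canonical version.

def domain_versions(domain: str) -> list[str]:
    """e.g. "example.com" -> the four common URL variants."""
    return [
        f"http://{domain}",
        f"http://www.{domain}",
        f"https://{domain}",
        f"https://www.{domain}",
    ]
```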
Next, you need to check the following:
Does the page load with or without a slash after the URL (/) ?
Does the page load with several slashes after the URL (/) ?
Does the page load with index.html or index.php after the URL?
Does the page load with uppercase letters in the URL?
All of these errors can lead to the creation of duplicate content, because the same content is available through different URLs. There has been speculation as to whether or not Google penalizes sites with duplicate content.
However, one thing is clear, having duplicate content on your page lowers the power of your URLs, which in turn makes it more difficult to rank for more competitive terms. There are several solutions to avoid these problems.
- Add a permanent redirect with redirect 301 to the main (correct) URL.
- Set the server and site to return error 404 when accessing the incorrect address.
- Set the canonical tag [rel=“canonical”] to point to the primary (correct) URL.
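To illustrate what all three solutions aim for, here is a hedged sketch that normalizes the URL variants listed above (trailing and repeated slashes, index files, uppercase) to one canonical form, so duplicates can be spotted in a crawl export. The rules are simplified assumptions, not a production implementation.

```python
# Sketch: collapse common duplicate-URL variants to one canonical form.
# Query strings are ignored for simplicity.
import re
from urllib.parse import urlparse

def normalize(url: str) -> str:
    parts = urlparse(url.lower())
    path = re.sub(r"/+", "/", parts.path)             # collapse repeated slashes
    path = re.sub(r"/index\.(html|php)$", "/", path)  # drop index files
    if path != "/" and path.endswith("/"):
        path = path.rstrip("/")                       # unify trailing slash
    return f"{parts.scheme}://{parts.netloc}{path or '/'}"
```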
Another major problem is when one piece of content is repeated on multiple pages. Examples of this include content in the header, footer, and sidebar sections, because this content will be repeated on every page of your site.
If the content is small, this isn’t really an issue. However, uniqueness is one of the most important factors of SEO and, as you can imagine, if you have 500 words on a blog post and 30% aren’t unique this is going to hurt your page’s power. But how can you find these duplicates on your site?
- You can manually check the site elements that are repeated on each page, such as: the header, sidebar, footer, navigation, etc.
- You can use a tool like Siteliner to do the heavy lifting for you.
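If you want a rough, do-it-yourself version of the idea behind tools like Siteliner, the standard library’s difflib can estimate how much text two pages share. This is only an illustrative sketch, not how Siteliner actually works.

```python
# Sketch: estimate what share of text two pages have in common,
# using a word-sequence match from the standard library.
from difflib import SequenceMatcher

def shared_ratio(text_a: str, text_b: str) -> float:
    """0.0 = no overlap, 1.0 = identical text."""
    return SequenceMatcher(None, text_a.split(), text_b.split()).ratio()
```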
Low Quality Pages
Google’s primary goal is to show high quality pages to users, which are packed with useful content to help solve their problems. If you’re unable to provide this on your site Google will start paying less and less attention to you.
Think about a mediocre page that doesn’t rank well: no one visits or uses it, and it has no real use or purpose. The presence of multiple “mediocre” pages may result in the deterioration of Google’s ranking for the entire site.
The goal is to publish only pages with high quality content, which is useful and engaging to users. Ultimately, the objective, as with anything related to SEO optimization, is to pursue quality over quantity.
Once low quality “mediocre” pages have been identified, consideration should be given as to whether to consolidate them with other pages on the same subject or to delete them completely. Too many people now consider thin content to be anything under 500 words. The problem is that 500 words is just a number someone came up with on the fly.
So, how can you figure out if your content is thin?
- Go one-by-one and manually read all of your articles. Ask yourself if they answer the problem or question they aim to address.
If you feel like the content isn’t quite good enough and the answer isn’t complete then you need to improve this content.
- Check all of the results on the first page of Google for your target keyword. If the results include articles with significantly more words than your article, this is a good indication that you need to add more words to your content as well.
- Always follow the rule of thumb: write content for humans, but polish it up for machines. Never the other way around.
Internal links are extremely important for crawling, for link distribution and for the user. By properly utilizing internal links, you can make your site a powerful machine for traffic. But when we talk about internal links we must pay attention to link juice because the most important job of internal links is to distribute power around your site.
Let’s say you’ve acquired a few high quality backlinks. What you want to do is distribute these links around your most important pages to effectively distribute power.
The problem is that some people can overdo it and, in some cases, end up worse off than they were before. This is why we usually recommend to our clients to make sure that the links look good to the user. No one likes a page with thousands of links. Ultimately, it looks bad and you’re probably doing yourself a disservice.
If you want to power up some of your site’s pages, find the most powerful pages (usually the ones with the most links and the most traffic) and add internal links that point to the pages you want to strengthen.
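A minimal sketch of that selection step, with made-up sample data; a real version would score pages from your analytics and backlink exports, and the crude additive score below is just for illustration.

```python
# Sketch: pick the strongest pages (most backlinks + traffic, crudely
# scored) as the places to add internal links from. Sample data is made up.

pages = [
    {"url": "/",         "backlinks": 120, "traffic": 5000},
    {"url": "/blog/hit", "backlinks": 40,  "traffic": 2000},
    {"url": "/about",    "backlinks": 2,   "traffic": 100},
]

def strongest_pages(pages: list[dict], top: int = 2) -> list[dict]:
    """Return the `top` pages by a naive backlinks+traffic score."""
    return sorted(pages,
                  key=lambda p: p["backlinks"] + p["traffic"],
                  reverse=True)[:top]
```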
Backlinks are also incredibly important. In fact, they are the single most important ranking factor. This is why you need to make sure your site doesn’t have hundreds of ‘spam’ links pointing to it. If you’re not prepared to pay for a tool to address this, you can use Ahrefs’ free backlink checker.
Other options for a quick backlink check-up would be:
The truth is this, if you want to grow your site but your backlinks aren’t impressive you’ll need to focus on building some quality links. Good content is not enough.
This is the first file that search engine spiders crawl when they hit your site. In this file, you give instructions to the search spiders, showing them which pages and directories on your site to crawl and which not to. The presence and proper writing of this file should be checked in any SEO audit.
Through it, you can forbid crawling on certain pages or directories and therefore prevent them from being indexed. However, it is not advisable to forbid crawling and indexing through the robots.txt file, but rather to use noindex.
We don’t want Google to index our /wp-admin folder, which is why we’ve marked it with Disallow. If your site is small, you don’t need to overcomplicate things by forbidding certain pages. Just make sure your user-agent command looks exactly the same as the example above.
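You can verify rules like these with Python’s standard library. This sketch assumes a robots.txt that blocks only /wp-admin/, and the example.com URLs are placeholders.

```python
# Sketch: check that a robots.txt blocks /wp-admin/ while leaving
# normal pages crawlable, using the standard library's parser.
from urllib.robotparser import RobotFileParser

rules = """
User-agent: *
Disallow: /wp-admin/
""".strip().splitlines()

parser = RobotFileParser()
parser.parse(rules)

print(parser.can_fetch("*", "https://example.com/wp-admin/options.php"))  # False
print(parser.can_fetch("*", "https://example.com/blog/post"))             # True
```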
Each site, when completed and put into operation, must have a map of the site in XML format. The purpose of creating this map is to provide search engines with a complete list of the pages and addresses you want to be crawled and included in the search engine index.
You can also make a separate sitemap for your photos or you can include them in the main one. Once you’ve created the site map, you need to submit it for crawling and indexing.
- For Google, use the Google Search Console tool
- For Yahoo and Bing, use the Bing Webmaster tool
- For Yandex, use the Yandex Webmaster tool
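For a hand-rolled site, a minimal sitemap like the one described above can be generated with the standard library. This is an illustrative sketch with placeholder URLs; on a CMS you would let a plugin like Yoast SEO do this.

```python
# Sketch: build a bare-bones sitemap.xml for a handful of URLs.
import xml.etree.ElementTree as ET

def build_sitemap(urls: list[str]) -> str:
    """Return sitemap XML listing the given URLs."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for u in urls:
        url_el = ET.SubElement(urlset, "url")
        ET.SubElement(url_el, "loc").text = u
    return ET.tostring(urlset, encoding="unicode")

xml_out = build_sitemap(["https://example.com/", "https://example.com/about"])
```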
If your site is on WordPress, you can install a plugin called Yoast SEO. Once installed, click on General and go to Features to make sure XML sitemaps is ON. If your WordPress site is small, this will be enough in 80% of cases.
You can look at Robots.txt and Sitemap XML as complementary files.
With their help, you’re giving instructions to the search engines on how to crawl and index the pages on your site.
Next, you can do a Google search for your brand to see how your site qualifies. Ideally, the site should be ranked first and its internal pages should show as sitelinks in the results. If you have a Google My Business account, this information should also be displayed. However, if the site is not ranked first in a brand search, there may be several reasons for this:
1. The site was created very recently and does not yet have sufficient history and authority, so Google does not associate it with a brand
2. The site is an Exact Match Domain (EMD) and contains some very common words and/or phrases
3. The site has been penalized by Google
Current positions and rankings on Google
Each SEO audit should also uncover the current position of the site on Google. The positioning of each site is directly related to the traffic it generates from the search engine.
For this reason, we need to analyze the keywords and phrases on which the site is currently ranked in Google’s results. Both the pages and the keywords behind the ranking need to be identified.
There are a few basic tools to help you identify your current position on Google. Our favorite is Ahrefs,which is probably the best paid software out there.
If you’re not prepared to pay for SEO tools, you can always check your rankings with Google Search Console under the Performance tab.
You can then choose whether you want to view your organic traffic by Queries or Pages.
Since 2019, a proper SEO audit would not be complete without mentioning UX. User experience has become an incredibly important part of every SEO’s work.
That’s because, since 2019, user experience has become a ranking factor. If someone clicks on your site and then quickly bounces back, Google sees this as a clear signal that your site hasn’t satisfied the user.
How can you do a quick UX analysis of your site?
1. Check your Google Analytics account
Look closely for:
- Bounce rate – this shouldn’t be higher than 70% on the key pages
- Average time on page – this shouldn’t be lower than 20 seconds
2. Check speed and design
- Make sure your site is fast – not from your location but from where your customers live
- If you’re not confident that your site looks great visually, hire a designer to help – users are far more likely to bounce back because of an unattractive site.
3. Optimize your site for simplicity
- Simplify your site as much as you can. This is the single most important task. Even if your site is created by the world’s greatest designers, this won’t matter if it’s messy. People want ‘quick and easy’, so give it to them.
- Show your site to someone who has never seen it before and ask them if they understand what it is that you do.
- Make sure it’s really easy to find the most important pages on your site.
- Make the text as easy to read as possible by adding images, line breaks, spaces, etc.
This is another easy-to-miss, but extremely important, aspect of SEO optimization.
SEO is comparative. Google ranks sites on the SERP relative to one another. This is exactly why you need to know what your competitors are doing, especially if you want to outdo them.
When doing competition analysis, you will need to cover the following questions:
- Who are your competitors in your business niche?
- What strategy do they use to optimize their sites and how powerful are their backlinks?
- What is their current position on Google?
- What are their sources of traffic?
Although competition analysis is extremely important, it is vital not to copy the competition. Instead, we need to learn from them: not only implement similar working strategies, but also avoid the mistakes they’ve made.
An SEO audit can be viewed as the concrete foundation of a building. Every building needs a solid foundation on which to build, just as every website needs a solid foundation on which to operate successfully.
You can’t have a stable building without a strong foundation, right? Well, the same goes for websites and SEO.