Technical SEO can be super fun. You get to diagnose the technical site hygiene of your website, and make improvements to ensure your site is SEO-friendly.
It’s kinda like regular upkeep for your car, but for your website – routine maintenance to keep it performing at its best. Except you can do this one yourself without hiring an expensive agency 😉
So in this post, we’re going to break down the 3 most important technical SEO checks you need to run on any website that is more than 6 months old.
What is an SEO Audit?
Think of an SEO audit as spring cleaning for your website. There are hundreds of links, pages, images and pieces of content hosted on your site.
Performing a “clean-up” of your website – both its technical hygiene and its content – will ensure your site is SEO-friendly.
An SEO-friendly site is easily found by search engine crawlers (e.g., Googlebot) and has the best chance of climbing Google’s search results – meaning higher rankings and more traffic. Whoo!
Now this is just scratching the surface, because there are over 200 ranking factors on Google!
I singled out these three checks because they have the greatest impact on SEO.
First, let’s go through quick fixes you can implement without much technical expertise. These also happen to be the most common technical SEO issues websites face, so making these improvements alone can reap huge benefits for your SEO.
What are some common technical SEO issues your website might encounter?
- Internal Linking and Orphaned Pages
- Poor Crawlability
- Broken Links (404 status code)
Technical problem #1: Internal Linking and Orphaned Pages
Internal linking is one of the most common technical SEO issues, and it is also the easiest to fix. A lack of internal links leads to a problem called ‘orphaned pages’.
This hurts your chances to rank on search results because Google simply cannot find all your pages. Let me explain.
Definition: An orphaned page has no other pages on your website linking to it. Literally, orphaned.
It becomes very difficult for users or Googlebot (Google’s web crawler) to find it, because there’s no pathway to it – unless someone types that specific URL directly into the browser.
This often happens when we create new content without mapping our old, existing content to it, or vice versa. The result – pages that are not connected to any part of your main pages (literally orphaned) and a lacklustre site architecture.
And when Googlebot cannot find your pages, it cannot rank them.
Perform an internal link audit with a tool like ScreamingFrog (it’s free for up to 500 URLs). Run a crawl of your website and include as many pages as possible so you can review their internal links.
By crawling various entry points in your site, you can get a fuller audit of all your pages.
Of course, you can do this manually as well by checking each link on your website one by one, that is if you have a small/new website.
But if your website is already a year old and has at least 50 pages, then I suggest using a tool like ScreamingFrog. Don’t try to manually audit all the internal links on a large website! You might go insane :p
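Once you have a crawl export, spotting orphans is just set logic: any page that never appears as a link target is orphaned. Here’s a minimal sketch in Python – the page list and link pairs are made-up examples, not a real ScreamingFrog export format:

```python
# Hypothetical sketch: find orphaned pages from a crawl export.
# Assumes you have a list of all known pages (e.g. from your sitemap)
# plus (source, target) internal-link pairs from a crawler.

def find_orphaned_pages(all_pages, internal_links, entry_points=("/",)):
    """Return pages that no other page links to.

    Entry points like the homepage are excluded, since visitors
    reach them directly rather than via internal links.
    """
    linked_to = {target for _source, target in internal_links}
    linked_to |= set(entry_points)
    return sorted(set(all_pages) - linked_to)

pages = ["/", "/blog", "/blog/bali-guide", "/blog/old-post"]
links = [
    ("/", "/blog"),
    ("/blog", "/blog/bali-guide"),
]

print(find_orphaned_pages(pages, links))  # → ['/blog/old-post']
```

Here `/blog/old-post` is flagged because nothing links to it – exactly the situation Googlebot gets stuck on.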
Technical Problem #2: Poor Crawlability
Crawlability is about discoverability. It refers to search engines’ ability to find and crawl the content on your site.
There are billions of webpages on the Internet and Googlebot is trying its best to find and index every webpage.
Googlebot does this by following links on webpages. It goes from link to link, bringing information about each webpage it finds back to Google’s servers.
To rank well on Google’s search results, we just have to help Google find and crawl our pages faster and better.
An XML sitemap will improve your website’s crawlability.
What is an XML Sitemap?
An XML sitemap is a file that lists the web pages on your website.
It does two things:
- Tells search engines about the structure of your website and the important pages that should be crawled and indexed.
- Includes information about each URL, such as when it was last updated, how often it changes, and how important it is in relation to other URLs on your site.
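For reference, a minimal sitemap file looks like this (the domain and dates are placeholders – swap in your own pages):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2024-01-15</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>https://www.example.com/blog/bali-guide</loc>
    <lastmod>2024-01-10</lastmod>
    <changefreq>monthly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```

Each `<url>` entry is one page; `<lastmod>`, `<changefreq>` and `<priority>` are the optional hints mentioned above. Most CMSs and SEO plugins generate this file for you automatically.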
As your blog has grown, you might have neglected your sitemap (don’t worry, this happens to most of us!). This can affect how search engines index your pages and hurt your SEO.
Update your sitemap regularly! This helps search engines index your new pages quicker.
Use Google Search Console to review your sitemap and crawl reports.
Step 1: Log in to Search Console and open the ‘Sitemaps’ report from the left-hand menu.
Step 2: Under ‘Add a new sitemap’, enter the URL of your sitemap (e.g. /sitemap.xml) and click ‘Submit’.
Step 3: Once Google has processed your sitemap, the report will show a ‘Success’ status – or reveal any errors it found while crawling it.
Technical Problem #3: Broken Links
Ah, broken links. The bane of every website.
Broken links are dead links. They happen when somebody tries to reach a URL on your site that no longer exists.
The visitor lands on a 404 ‘page not found’ error instead.
Why are broken links a problem?
Broken links hurt user experience and prevent search engines from crawling your website properly.
Nobody creates broken links intentionally. Links become broken when these things happen:
- URL typo: when you or somebody makes a mistake typing your URL into the browser. This is out of your control, so no worries.
- Changing the URL name. This is when you change/update the URL of your blog post or page (also known as “permalink”). When you change your URL permalink, you are creating a new URL. That old, previous link becomes broken, because it no longer exists. So if that old URL had been ranking on Google, then sadly it becomes a dead link.
- Website migrations: When you undergo a website revamp or migration, such as migrating from HTTP to HTTPS or changing your domain name – if your old pages did not redirect to the new ones, that would result in broken endpoints.
When you change the permalink of a page, you might think, okay nobody is going to find that old page.
Well, that is if nobody finds it. But the problem arises when you have internal links or external links that have linked to that old page before.
So your old link, which no longer exists, is still being referenced somewhere – either on another page of your site or, worse, on someone else’s website.
When you change the URL of www.xyz.com/bali-travel-tips, that page no longer exists with that old URL. It now “belongs” to the new URL, www.xyz.com/bali-guide.
Every page that previously linked to www.xyz.com/bali-travel-tips now contains a broken link, because that page no longer exists at the old address.
Prevention is better than cure!
Make sure to 301 redirect every URL you changed to the new one. This will prevent a ton of headache in the future.
On WordPress, use a handy 301 redirect plugin – there are a few free ones you can try out.
On the server level, you can set up 301 redirect in your .htaccess file. This is the cleanest and fastest way to perform redirects.
Note that chaining multiple redirects will slow down your site load speed, so if you have a huge website with tons of decommissioned product pages, you might want to consider setting up permanent 301 redirects in your .htaccess file instead.
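For illustration, here’s what a couple of 301 rules might look like in .htaccess on an Apache server (paths and domain are hypothetical; `Redirect` needs mod_alias, `RewriteRule` needs mod_rewrite):

```apache
# Redirect a single renamed post to its new URL:
Redirect 301 /bali-travel-tips https://www.example.com/bali-guide

# Redirect an entire retired section to one landing page:
RewriteEngine On
RewriteRule ^old-products/.*$ /products [R=301,L]
```

A typo here can take your whole site down, which is exactly why the next piece of advice matters.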
Speak to your web developer or hosting provider if you are not familiar with code. Messing around with the .htaccess configuration file blindly is a no-no.
301 Redirects Simplified
Let’s break this down. The ‘301’ is an HTTP status code meaning ‘Moved Permanently’. The ‘redirect’ is just what it sounds like: sending visitors who request one page to another page instead.
So, the 301 redirect is simply a type of redirect.
In SEO, and for website migrations from HTTP to HTTPS, the 301 redirect is considered the best form of redirect.
That’s because it tells search engines the URL has moved permanently, so they can transfer the old page’s ranking signals to the new one. Now, we won’t get into programming specifics here, else I might scare you away! But if you want to learn more about redirects and the tech side of SEO, check out this article.
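To demystify it a little: when a browser (or Googlebot) requests the old URL, the server answers with a response along these lines, and the visitor is silently sent on to the new address:

```
HTTP/1.1 301 Moved Permanently
Location: https://www.xyz.com/bali-guide
```

That `Location` header is the whole trick – it points to the page’s new home, and a 301 status tells crawlers the move is permanent, not temporary.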
Audit your website for broken links
Now, there is no way to determine the health of your site without auditing it for broken links.
This is one of the most impactful technical SEO fixes you can do for your site.
Identify your broken links, find the most relevant page to redirect each one to, and find out which pages were linking to the old, broken page.
This audit is impractical to do manually. Use a tool like ScreamingFrog to crawl your entire website to (i) identify pages with broken links and (ii) check for any internal and external links pointing to them.
In advanced audits, you can check for redirects too.
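To make the audit output concrete, here’s a small sketch of step (i) and (ii) combined: given crawl results as (source page, link, HTTP status) tuples – an illustrative format, not ScreamingFrog’s actual export – it maps each broken URL to every page linking to it, so you know exactly where to fix or redirect:

```python
# Illustrative sketch: build a broken-link report from crawl results.
from collections import defaultdict

def broken_link_report(crawl_results):
    """Map each broken URL (status 4xx/5xx) to the pages linking to it."""
    report = defaultdict(list)
    for source, link, status in crawl_results:
        if status >= 400:  # 404 Not Found, 500 Server Error, etc.
            report[link].append(source)
    return dict(report)

results = [
    ("/blog", "/bali-travel-tips", 404),
    ("/about", "/bali-travel-tips", 404),
    ("/blog", "/bali-guide", 200),
]

print(broken_link_report(results))
# → {'/bali-travel-tips': ['/blog', '/about']}
```

Here the report tells you two things at once: `/bali-travel-tips` is dead, and both `/blog` and `/about` need their links updated (or the old URL needs a 301 redirect).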