3 Technical SEO Tips to Help Your Site Rank

There are few things I enjoy more than auditing a website and fixing the technical SEO issues that will improve its search rankings. Sad? Perhaps, but I love a good puzzle.

We have access to so much data that it can be difficult to decide where to focus our efforts. It can also be tempting to fix every single issue in the hopes that something helps to move the needle.

Here are three tips on how you can look in the right places to harness your technical SEO data and improve your presence in organic search.

1. Glean hints from the SERPs

I like to begin at the end and head straight to the search engine results pages (SERPs) to see which websites rank highly for my target keywords, then reverse engineer how they're ranking.

Competitor analysis won’t just look at technical SEO, of course; you’ll also be taking on-page, off-page and other factors into consideration to inform your complete strategy.

However, there are a number of technical SEO pointers you can tease out of top-ranking websites to help your own site rank better.

For example, you can determine whether there is any correlation between rankings, site architecture and indexing rules.

This can be especially useful when assessing how high-ranking ecommerce or recruitment websites use canonicalisation, noindex/nofollow and robots.txt to handle sub-categories, filters and parameters.
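To make that concrete, here's a minimal sketch (Python, standard library only) of how you might pull a single URL's indexing signals: whether robots.txt allows Googlebot to crawl it, and what its canonical tag and meta robots directive say. The example URL and User-Agent string are placeholders, not anything from a real audit.

```python
from html.parser import HTMLParser
from urllib import robotparser
from urllib.parse import urlparse
from urllib.request import Request, urlopen


class IndexingSignalParser(HTMLParser):
    """Collects the rel=canonical and meta robots values from a page."""

    def __init__(self):
        super().__init__()
        self.canonical = None
        self.meta_robots = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "link" and "canonical" in (attrs.get("rel") or "").lower():
            self.canonical = attrs.get("href")
        elif tag == "meta" and (attrs.get("name") or "").lower() == "robots":
            self.meta_robots = attrs.get("content")


def inspect_url(url):
    # Does robots.txt allow Googlebot to crawl this URL?
    parts = urlparse(url)
    rp = robotparser.RobotFileParser(f"{parts.scheme}://{parts.netloc}/robots.txt")
    rp.read()
    allowed = rp.can_fetch("Googlebot", url)

    # Fetch the page and pull out its canonical and meta robots tags.
    req = Request(url, headers={"User-Agent": "seo-audit-sketch/0.1"})
    html = urlopen(req, timeout=10).read().decode("utf-8", errors="replace")
    parser = IndexingSignalParser()
    parser.feed(html)

    print(f"URL:         {url}")
    print(f"robots.txt:  {'allowed' if allowed else 'disallowed'} for Googlebot")
    print(f"canonical:   {parser.canonical}")
    print(f"meta robots: {parser.meta_robots}")


# Hypothetical filtered category URL of the kind discussed above.
inspect_url("https://www.example.com/shoes?colour=red&sort=price")
```

Run something like this across a competitor's category and filter URLs and patterns in how they handle parameters start to emerge.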

I’m not saying you need to copy your competitors’ setups, but it’s something you should consider when looking at your technical SEO.

2. Look to the search engines

In my humble opinion, there's little point crawling a website, identifying problematic areas and tidying them up if search engines take no notice of those areas in the first place.

Luckily for us, search engines like Google and Bing are kind enough to tell us what they see when they look at our website.

Performing site: searches and using tools such as Google Search Console and Bing Webmaster Tools quickly tells you whether an issue you've discovered is actually causing problems for your site.
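As a rough illustration, assuming you've exported your crawl to crawl.csv and the indexed URLs reported by Search Console to gsc_indexed.csv (both file names, and their single "url" column, are assumptions), a few lines of Python can surface the gap between the two:

```python
import csv


def load_urls(path):
    with open(path, newline="") as f:
        return {row["url"].strip() for row in csv.DictReader(f)}


crawled = load_urls("crawl.csv")        # URLs found by your own crawler
indexed = load_urls("gsc_indexed.csv")  # indexed URLs exported from GSC

# Pages you can crawl but Google hasn't indexed: per the rule below,
# issues on these pages may not be worth much remediation effort.
for url in sorted(crawled - indexed):
    print("not indexed:", url)
```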

My general rule is that if it isn't indexed, I'm not going to recommend spending too much time on fixing it, beyond perhaps a pre-emptive disallow rule in robots.txt.

Heading straight to Search Console and Webmaster Tools can also flag up issues such as crawl errors, conflicting directives, rogue parameters and sitemap inconsistencies, which may be hindering the ranking performance of your pages.
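Sitemap inconsistencies in particular are easy to sanity-check yourself. Here's a minimal sketch that fetches a sitemap and flags any listed URLs that don't return a 200; the sitemap location is a placeholder.

```python
import xml.etree.ElementTree as ET
from urllib.request import Request, urlopen

SITEMAP = "https://www.example.com/sitemap.xml"  # placeholder location
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
UA = {"User-Agent": "seo-audit-sketch/0.1"}

root = ET.fromstring(urlopen(Request(SITEMAP, headers=UA), timeout=10).read())

for loc in root.findall(".//sm:loc", NS):
    url = loc.text.strip()
    try:
        status = urlopen(Request(url, headers=UA, method="HEAD"), timeout=10).status
    except Exception as exc:
        status = getattr(exc, "code", exc)  # HTTPError has .code; else print it
    if status != 200:
        print(status, url)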

I would focus on problems raised in GSC and BWT as a priority before anything you find through your own crawl of your website, as search engines are telling you directly which pages they’re having issues understanding.

3. Marry your crawl data with log file analysis

Checking Search Console and Webmaster Tools is a great starting point, but search engines come across plenty of other technical SEO issues that they won’t report on, which we can’t see in a standard crawl.

The answer to this problem is log file analysis. If you can get access to your log files, you can find out exactly which URLs get crawled by your most important search referrers, how often, and other incredibly useful insights.

Using the date range, the number of unique URLs crawled and the crawl rate in your log files, you can determine whether search engines are skipping pages you consider important to your site, as in the sketch below.
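Here's a minimal sketch of that check, assuming your server writes an Apache/Nginx combined-format log to access.log (the path and format are assumptions). It counts how often Googlebot requested each URL over the period the file covers:

```python
import re
from collections import Counter

# Matches the request path and user agent in a combined-format log line.
LINE = re.compile(r'"(?:GET|HEAD) (?P<path>\S+) HTTP/[\d.]+".*"(?P<ua>[^"]*)"$')

hits = Counter()
with open("access.log") as log:
    for line in log:
        m = LINE.search(line)
        # Note: the UA string can be spoofed; a real audit would verify
        # crawler IPs via reverse DNS before trusting these numbers.
        if m and "Googlebot" in m.group("ua"):
            hits[m.group("path")] += 1

print(len(hits), "unique URLs crawled by Googlebot")
for path, count in hits.most_common(20):
    print(f"{count:>6}  {path}")
```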

You can also check whether the pages being crawled generate more organic traffic than those that aren't being crawled.
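Continuing the sketch above, suppose you've written those crawled URLs to googlebot_urls.txt and exported organic sessions per URL to organic.csv (both files and their columns are assumptions); comparing the two groups is then a simple join:

```python
import csv

# URLs Googlebot crawled, one per line (e.g. written out by the previous
# snippet); the file name is an assumption.
with open("googlebot_urls.txt") as f:
    crawled = {line.strip() for line in f}

crawled_sessions = uncrawled_sessions = 0
with open("organic.csv", newline="") as f:
    for row in csv.DictReader(f):
        if row["url"] in crawled:
            crawled_sessions += int(row["sessions"])
        else:
            uncrawled_sessions += int(row["sessions"])

print("organic sessions on crawled pages:  ", crawled_sessions)
print("organic sessions on uncrawled pages:", uncrawled_sessions)
```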

If the numbers aren’t looking too healthy, you can dig further into the log file data to identify any pages you need to either discourage search engines from crawling, or encourage them to crawl.

I hope these three technical SEO tips get you into the mindset you need to identify and fix the right issues, the ones that will actually help your site rank higher in the SERPs.
