Search Engine Optimization (SEO) has played an essential role in the growth and sustainability of businesses since the first search engine launched in 1990. It's been reported that 60% of consumers begin their product research on a search engine, a figure that climbs to 77% for B2B research. So, contrary to what many ill-informed marketers might proclaim, SEO is not dead.
Whether you run a big business or a small one, a plumbing company or a retail store, and whether you're a web design newbie or a master of the SEO dark arts, performing an SEO audit should be step number one in growing your online visibility and brand recognition.
As with my earlier post, 4 DIY SEO Tips for Beginners, this will be a top-level view of what to review at each step rather than a highly technical article. We'll discuss the three core components of a successful SEO audit, as well as the tools you'll need to complete the job.
Building Your Toolkit
Before we dive into how to perform an SEO audit, we need to be sure we're working with the right set of tools. I highly recommend the Screaming Frog SEO Spider Tool. It allows you to quickly crawl, analyze and audit a site from an on-page SEO perspective, fetching key on-page elements, presenting them in tabs by type, and letting you filter for common SEO issues.
When it comes to seeing where your business ranks on a keyword level, it's hard to beat SEMrush. SEMrush will tell you how many keywords your business ranks for, each keyword's position, its cost per click, and its overall monthly search volume.
I also like Moz for a lot of reasons. It gets a bad rap in SEO nerd culture, but it does a good job of taking an otherwise complex and confusing topic and making it very easy for the average user. I prefer it for the Open Site Explorer function, which lets me review the backlinks of sites I'm auditing and perform a spam analysis on each backlink to determine whether I need to disavow it.
Reviewing Technical & Architectural SEO Elements
While on-page SEO gets all the love and attention in the C-Suite, it’s the technical and architectural considerations that really help a site climb the search engine results page (SERP).
These elements require us to look at page load times, crawl rate optimization, server response codes, URL canonicalization and more.
To begin reviewing technical items, I’ll usually start by crawling the domain with Screaming Frog.
Once the crawl report is complete, I'll skim through the Internal HTML pages crawled for any URLs not following best practices for length, link depth, formatting and other anomalies.
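If you'd rather script this skim than eyeball it, the checks can be approximated in a few lines. This is a minimal sketch; the thresholds below are my assumptions, not official limits:

```python
from urllib.parse import urlparse

# Rough heuristics (assumed thresholds, not official search engine limits):
MAX_URL_LENGTH = 115   # keep URLs reasonably short
MAX_LINK_DEPTH = 3     # deeply nested paths are harder to crawl

def url_issues(url):
    """Return a list of best-practice problems found in a single URL."""
    issues = []
    parsed = urlparse(url)
    if len(url) > MAX_URL_LENGTH:
        issues.append("too long")
    # Count non-empty path segments to estimate link depth
    depth = len([seg for seg in parsed.path.split("/") if seg])
    if depth > MAX_LINK_DEPTH:
        issues.append("too deep")
    if parsed.path != parsed.path.lower():
        issues.append("uppercase characters")
    if "_" in parsed.path:
        issues.append("underscores instead of hyphens")
    return issues
```

Running `url_issues` over the Internal HTML export gives you a quick shortlist of anomalies to review by hand.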
I'll look at the External crawl tab to see where we're sending the crawlers, then look at the Protocol report to make sure HTTPS pages aren't bleeding over to non-HTTPS resources.
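That protocol check can also be done directly against a page's HTML. Here's a minimal sketch that flags plain-http references on what should be a fully HTTPS page (the regex is a simplification, not a full HTML parse):

```python
import re

def insecure_references(html):
    """Return plain-http URLs referenced via href/src attributes in a page."""
    # "http://" will not match "https://" because of the colon position
    return sorted(set(re.findall(r'(?:href|src)=["\'](http://[^"\']+)', html)))
```

Any URL this returns on an HTTPS page is a mixed-protocol link worth fixing.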
Most importantly, I'll review the Response Codes tab. This is where you'll see which pages you are blocking search engines from accessing (the Blocked by Robots.txt filter), and you'll get a clean list of all redirections (3xx), missing pages (4xx errors), and server errors (5xx errors). In my opinion, addressing these server response codes is the top priority during the implementation phase.
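If you export the crawl as (url, status code) pairs, grouping them into those same buckets is straightforward. A minimal sketch:

```python
from collections import defaultdict

def bucket_status_codes(pages):
    """Group (url, status_code) pairs into SEO-relevant response classes."""
    buckets = defaultdict(list)
    for url, code in pages:
        if 300 <= code < 400:
            buckets["redirect (3xx)"].append(url)
        elif 400 <= code < 500:
            buckets["missing (4xx)"].append(url)
        elif code >= 500:
            buckets["server error (5xx)"].append(url)
        else:
            buckets["ok"].append(url)
    return dict(buckets)
```

The 4xx and 5xx buckets are your fix-first list; the 3xx bucket is worth reviewing for redirect chains.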
Within Screaming Frog you’ll also want to review the Directives tab and make sure all of your canonical URLs are correctly linked, and the pages you do not want to show in Google are set to noindex.
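To spot-check a single page's directives by hand, you can pull the canonical link and robots meta tag out of its HTML with the standard library. A minimal sketch:

```python
from html.parser import HTMLParser

class DirectiveParser(HTMLParser):
    """Collect the canonical URL and robots meta directive from a page's <head>."""
    def __init__(self):
        super().__init__()
        self.canonical = None
        self.robots = None

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "link" and a.get("rel", "").lower() == "canonical":
            self.canonical = a.get("href")
        elif tag == "meta" and a.get("name", "").lower() == "robots":
            self.robots = a.get("content", "").lower()

def audit_directives(html):
    """Return the page's canonical URL and whether it is set to noindex."""
    p = DirectiveParser()
    p.feed(html)
    return {"canonical": p.canonical,
            "noindex": bool(p.robots and "noindex" in p.robots)}
```

A page meant to rank should report its own URL as canonical and `noindex: False`; a page you want hidden should report `noindex: True`.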
Performing an On-page SEO Analysis
If technical and architectural SEO is the blueprint and foundation of a website, on-page SEO is the aesthetics and ambience that make the site easy for a search engine crawler to visit and hang around for a while.
Sticking with Screaming Frog, we’ll start with the Page Titles tab and investigate for any missing or duplicate page titles and character length considerations. Likewise, we’ll make these same considerations for the Meta Description, Meta Keywords and H1.
Consider this good housekeeping. These elements must be present on each page, uniquely written, used only once per page, and within the recommended character count lengths.
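That housekeeping check scripts nicely once you have a mapping of URLs to element text. A minimal sketch; the character limits are commonly cited rules of thumb, not official figures (real SERP truncation is pixel-based):

```python
from collections import Counter

# Commonly cited display ranges (assumptions, not official limits):
LIMITS = {"title": (30, 60), "description": (70, 160)}

def audit_element(kind, pages):
    """pages: dict of url -> element text.
    Flags missing, duplicate, and off-length values for one element type."""
    lo, hi = LIMITS[kind]
    counts = Counter(text for text in pages.values() if text)
    report = {}
    for url, text in pages.items():
        problems = []
        if not text:
            problems.append("missing")
        else:
            if counts[text] > 1:
                problems.append("duplicate")
            if not (lo <= len(text) <= hi):
                problems.append("length out of range")
        if problems:
            report[url] = problems
    return report
```

Run it once per element type (titles, descriptions, H1s) against the corresponding Screaming Frog export.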
This is also the point in our SEO auditing journey where we exit Screaming Frog and turn to our chosen keyword research tool (I recommend SEMrush), because now we need to know exactly which keywords we'll be targeting.
Using your keyword research tool, identify four or five unique keyword phrases per page with as little overlap as possible. On larger sites, siloing keywords becomes increasingly difficult, but it's necessary to prevent internal keyword cannibalization.
When researching keywords, we're looking for phrases that are relevant to the page's content, have medium-to-high search volume, and face low-to-medium competition.
Once our list of keywords is built, begin distributing them among your page titles, descriptions, meta keywords and primary headline (H1), keeping the keywords in line with each individual page's focus.
Don’t be afraid to use the most competitive keywords in all four of the mentioned elements. Just be sure to use them in a phrase to avoid keyword stuffing.
Conducting an Off-site SEO Audit
For many years, SEO consisted of only on-page and technical SEO. As marketers and business owners learned how to manipulate these items, search engines began favoring off-site signals, both to determine a site's relevance to a given search query and as a vote of trust between the site and the search engine.
The first thing I recommend looking at is a list of backlinks for the site. Backlinks are any third-party websites linking to the site being audited. To review them, I use Moz's Open Site Explorer tool. A manual review of all inbound links should make it pretty clear whether each backlink is legitimate or not. Any questionable backlinks should be removed by contacting the linking site or, as a last resort, submitted to Google for disavowal.
As part of the off-site audit, I'm also looking at business directory listings. All third-party business listings and social media sites must have consistent data, including name, address and phone number. The more consistent signals a business can send to a search engine, the more confident the search engine is in that information.
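A rough consistency check can be scripted once you've collected the name, address and phone number from each listing. This is a minimal sketch assuming US phone formats; real-world address matching needs fuzzier normalization:

```python
import re

def normalize_phone(raw):
    """Reduce a US phone number to its last 10 digits so formatting differences don't matter."""
    digits = re.sub(r"\D", "", raw)
    return digits[-10:] if len(digits) >= 10 else digits

def nap_consistent(listings):
    """listings: list of dicts with name/address/phone keys.
    True if all listings match after light normalization."""
    def key(listing):
        return (listing["name"].strip().lower(),
                listing["address"].strip().lower(),
                normalize_phone(listing["phone"]))
    return len({key(l) for l in listings}) <= 1
```

Any listing that breaks consistency is one to correct at the source directory.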
Yext is a great tool to quickly scan more than 200 third party listings for data errors.
While we didn't dive into the "how" of SEO, a semi-working knowledge of the items mentioned in this article is enough to get you asking the right questions and checking the boxes that need to be checked when auditing your business's website.
Being proactive, by understanding what to look for and knowing how to identify potential site problems before they harm your rankings, is a major piece of the puzzle when it comes to maintaining and growing your brand's awareness through search.
Dallas McLaughlin | Digital Marketing Specialist