
How to do a Technical SEO audit in 4 Steps

By Stephanie Velazquez · Published: 10/23/20

The technical SEO audit is a process that can take place at different times in the life of a website: creation, migration, redesign ... or when its performance needs to be optimized.

How do you identify the points to optimize? Which key criteria reveal areas for improvement? And which tools should you use to carry out these technical enhancements?

Discover all the steps to follow in order to improve your search engine positioning and, ultimately, your conversions!

What is a technical SEO audit?

Technical SEO audit: definition

A technical SEO audit is an analysis of a website against multiple technical and structural criteria, with the aim of optimizing it.

The starting point of any SEO approach is to review the main audit branches:

  • Indexability,
  • Architecture,
  • Web Performance,
  • User Experience,
  • Microdata,
  • International,
  • Accessibility.

Purpose of a technical SEO audit

The technical SEO audit is used to:

  • Identify technical obstacles that impact positioning on Google (which accounts for 95% of search engine users);
  • Identify operational and concrete solutions to make your site more efficient.

☝️ A technical SEO audit differs in this respect from the semantic SEO audit, which aims to analyze the quality of content and its interconnection.

This approach has positive effects on your site, improving:

  • visibility on search engines,
  • traffic,
  • user experience,
  • and, ideally, conversions.

Technical audit and page indexing

As a reminder, the presence of a site's pages in the Google index is essential to exist on the Internet.

Only 10% of web content is indexed by Google.

Indexing is the process a search engine follows to give visibility to a piece of content.

From the indexing phase to traffic engagement, which can lead to conversions, a URL will follow these steps:

  1. Creation of the URL (a page with content);
  2. Crawl of the URL by Googlebot;
  3. Addition (or not) of the URL to the Google index, with a hidden "score" assigned to the page during the indexing phase;
  4. Positioning (or not) of the page on certain queries, according to that quality scoring;
  5. Generation of visits (or traffic) on the positioned page;
  6. Conversions of Internet users.

How to perform an SEO audit: the prerequisites

  1. Identify the problem that prompted your SEO audit:
  • A loss of traffic after an update or a production release,
  • An algorithm update (e.g. the Speed Update),
  • The migration of your site (redirect plan, etc.).


  2. Define the scope the site audit will cover, keeping in mind that technical SEO serves the two other pillars of SEO: content and popularity. The goal is to determine the nature of each action to be taken:

  • The technical SEO criteria to analyze are mainly on-site, i.e. they involve working directly on the site;
  • Off-site parameters are better addressed during a link building audit;
  • Internal linking and site structure mix technical and semantic aspects.


  3. Have dedicated SEO audit tools, such as a:

  • crawler,
  • log analyzer,
  • Google Search Console,
  • Google Analytics account.

Use the expertise of a professional, such as an SEO consultant or an internal resource. This person will be able to interpret and exploit all this information, in order to recommend a prioritized action plan and bring you the best ROI.

☝️ A technical SEO audit is essential in all cases, but an outline is necessary, in order to determine how to proceed and prioritize the different aspects to be studied.

Complete SEO audit methodology: steps to follow

Here is an example of a process that will be adapted according to the problems identified during your analysis.

Step 1: Getting started with the Google Search Console

Before even launching the crawl, start by analyzing the site with the Google Search Console (or GSC, formerly Google Webmaster Tools).

What can the Google Search Console detect?

The GSC is the free website management tool provided by Google that allows you to obtain technical information on:

  • Crawl problems encountered,
  • Inconsistencies between the sitemap file and the explored site,
  • Detected technical defects (responsive or non-responsive site, response time, etc.).

Examples of checkpoints within the Google Search Console

  • Indexing errors: detects defects in your submitted sitemap, such as URLs that you do not want to see in the Google index.
  • Bad hreflang implementation: if it fails, Google won't recognize the same content available on pages in other languages (see the sanity-check sketch below).
  • Management of sitemaps: the sitemap.xml is the file that robots read, located by default at www.ndd.xx/sitemap.xml. The HTML sitemap, on the other hand, is used to surface URLs that are too deep or to propose recent pages for indexing.
  • Crawl errors: a URL pushed in the sitemap submitted to the Search Console is not accessible, but Google does not flag the page as abandoned.
  • Choice of a single version: it's advisable to choose the preferred displayed version (www or non-www), and then make sure there is no duplicate content.
  • Responsive design: the browsing experience must be adapted to the habits of Internet and mobile users... knowing that mobile traffic today represents more than 50% of web traffic!

Indexing coverage errors with Google Search Console
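
By way of illustration, the hreflang checkpoint above can be spot-checked with a short script. The following is only a minimal sketch, assuming the third-party requests and beautifulsoup4 Python packages and a placeholder URL; it only looks at <link rel="alternate"> annotations in the HTML, whereas hreflang can also be declared in HTTP headers or in the sitemap.

# Minimal hreflang sanity check (sketch).
# Assumptions: `requests` and `beautifulsoup4` are installed; the URL is a placeholder.
import requests
from bs4 import BeautifulSoup

def check_hreflang(url):
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")

    # Collect <link rel="alternate" hreflang="..." href="..."> annotations.
    alternates = {
        link.get("hreflang"): link.get("href")
        for link in soup.find_all("link", rel="alternate")
        if link.get("hreflang")
    }

    print(f"{len(alternates)} hreflang annotations found on {url}")
    for lang, href in alternates.items():
        print(f"  {lang} -> {href}")

    # Two frequent mistakes: no x-default, and no self-referencing annotation.
    if "x-default" not in alternates:
        print("  warning: no x-default annotation")
    if url not in alternates.values():
        print("  warning: the page does not reference itself in its hreflang set")

if __name__ == "__main__":
    check_hreflang("https://www.example.com/")  # placeholder URL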

As an SEO manager, you should check whether the published content is:

  • First, properly indexed,
  • Second, ranking well on the targeted keywords.

List of errors encountered on Google Search Console

  1. To verify that a URL is present in the Google index, you can submit the URL via the Search Console
  2. Check via the Search Console that the pages are crawlable and indexable by observing the coverage of the site by Google: has the site been crawled by Google?
  3. Then, check whether Google encountered errors during its exploration by auditing each of the error codes returned (a quick way to spot-check them is sketched below):
    1. Redirection loops,
    2. 301 redirects or broken redirects,
    3. Server errors (5XX),
    4. URLs declared as "noindex" although they are included in the sitemap,
    5. 404 errors,
    6. URLs whose access is blocked by the robots.txt file.
Submitted URL blocked by robots.txt
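
To spot-check these error types outside of the Search Console, a short script can fetch every URL declared in the sitemap and report its status code, redirect chain and any "noindex" directive. This is only a minimal sketch, assuming the third-party requests and beautifulsoup4 Python packages and a placeholder sitemap URL; it does not check robots.txt blocking, and a real crawler does all of this at scale.

# Spot-check of sitemap URLs (sketch): status codes, redirect chains, server errors,
# and "noindex" directives on URLs that are nevertheless declared in the sitemap.
# Assumptions: `requests` and `beautifulsoup4` installed; SITEMAP_URL is a placeholder.
import xml.etree.ElementTree as ET

import requests
from bs4 import BeautifulSoup

SITEMAP_URL = "https://www.example.com/sitemap.xml"  # placeholder
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def sitemap_urls(sitemap_url):
    """Return the <loc> entries of a (non-index) sitemap.xml file."""
    xml = requests.get(sitemap_url, timeout=10).content
    return [loc.text.strip() for loc in ET.fromstring(xml).findall(".//sm:loc", NS)]

def audit(url):
    issues = []
    try:
        resp = requests.get(url, timeout=10, allow_redirects=True)
    except requests.RequestException as exc:  # covers redirect loops (TooManyRedirects), timeouts...
        return [f"request failed: {exc}"]

    if resp.history:  # the URL answered with one or more redirects
        chain = " -> ".join(str(r.status_code) for r in resp.history)
        issues.append(f"redirect chain: {chain} -> {resp.status_code}")
    if resp.status_code == 404:
        issues.append("404 not found")
    if resp.status_code >= 500:
        issues.append(f"server error {resp.status_code}")

    if "noindex" in resp.headers.get("X-Robots-Tag", "").lower():
        issues.append('"noindex" sent via X-Robots-Tag although the URL is in the sitemap')
    meta = BeautifulSoup(resp.text, "html.parser").find("meta", attrs={"name": "robots"})
    if meta and "noindex" in (meta.get("content") or "").lower():
        issues.append('meta robots "noindex" although the URL is in the sitemap')
    return issues

if __name__ == "__main__":
    for url in sitemap_urls(SITEMAP_URL):
        for issue in audit(url):
            print(f"{url}: {issue}")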

Step 2: Perform a crawl analysis

Depending on the indicators found on the Search Console, you can launch the crawler by setting analysis objectives.

What can a crawler detect?

A crawler simulates the path that Google's robot would take to scan the site. It gives you a map of the site as a search engine perceives it, conducting its exploration through the links encountered.

Launching a crawl allows you to check, among other things:

  • The indexing of the contents,
  • The structure of the pages,
  • The coherence of the internal linking,
  • The absence of errors in the sitemap, by providing the list of URLs to the crawler.

Examples of on-site checkpoints covered by the crawler

  • Duplicate content: to find pages that are generally 90% similar and require:
    • canonicalization,
    • or another specific treatment (e.g. deletion, redirection, etc.). 
  • Canonical errors.
  • Redirect chains and loops of more than 4 consecutive redirects.
  • Server errors: the reason will be studied with the development team.
  • The 404s: they correspond to content that cannot be found or to an incorrect link.
  • Pagination: its relevance must be justified.
  • Monitoring the installation of the Google Analytics tag and Google Tag Manager (GTM), to ensure that tagging is applied to the desired locations on the site.
  • Schema microdata: it makes pages eligible for certain rich result formats displayed in the SERPs and can thus improve click-through rates.
  • JavaScript rendering: an option to activate when the site is built entirely on JavaScript, to help the crawler interpret the code.
  • Robots.txt tracking: this file contains the rules that the robots must follow on a site, to mark a path and restrict certain accesses.

Path to check the sitemap structure with the crawler

Start by configuring your crawler, indicating different types of rules depending on what you want to analyze (a minimal crawl sketch follows the list below):

  • Enabling JavaScript rendering,
  • Tracking the number of "nofollow" links,
  • Respecting "noindex" directives,
  • Tracking canonical URLs,
  • Following the provided sitemap.xml file,
  • Collecting the microdata encountered.
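
To make these rules concrete, here is a minimal crawl sketch under stated assumptions: the third-party requests and beautifulsoup4 Python packages, a placeholder start URL, no JavaScript rendering and no sitemap ingestion. It respects robots.txt, skips rel="nofollow" links, and records each page's status code, canonical URL and meta robots directive; a dedicated crawler handles all of the rules above (and much more) out of the box.

# Minimal crawl sketch: follows internal links, respects robots.txt,
# skips rel="nofollow" links, and records canonical / meta robots tags per page.
# Assumptions: `requests` and `beautifulsoup4` installed; START_URL is a placeholder;
# JavaScript rendering is NOT handled here.
from collections import deque
from urllib.parse import urljoin, urlparse
from urllib.robotparser import RobotFileParser

import requests
from bs4 import BeautifulSoup

START_URL = "https://www.example.com/"  # placeholder
MAX_PAGES = 200                         # safety limit for the sketch

def crawl(start_url, max_pages=MAX_PAGES):
    domain = urlparse(start_url).netloc
    robots = RobotFileParser(urljoin(start_url, "/robots.txt"))
    robots.read()

    queue, seen, report = deque([start_url]), {start_url}, {}
    while queue and len(report) < max_pages:
        url = queue.popleft()
        if not robots.can_fetch("*", url):
            report[url] = {"status": "blocked by robots.txt"}
            continue
        try:
            resp = requests.get(url, timeout=10)
        except requests.RequestException as exc:
            report[url] = {"status": f"request failed: {exc}"}
            continue
        soup = BeautifulSoup(resp.text, "html.parser")

        canonical = soup.find("link", rel="canonical")
        meta_robots = soup.find("meta", attrs={"name": "robots"})
        report[url] = {
            "status": resp.status_code,
            "canonical": canonical.get("href") if canonical else None,
            "meta_robots": meta_robots.get("content") if meta_robots else None,
        }

        # Queue internal links, ignoring rel="nofollow" ones.
        for a in soup.find_all("a", href=True):
            if "nofollow" in (a.get("rel") or []):
                continue
            link = urljoin(url, a["href"]).split("#")[0]
            if urlparse(link).netloc == domain and link not in seen:
                seen.add(link)
                queue.append(link)
    return report

if __name__ == "__main__":
    for url, data in crawl(START_URL).items():
        print(url, data)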

You do not necessarily have to start from the home page. Depending on your needs, first define whether you wish to obtain a map:

  • of the whole site,
  • of a part or section of the site.

1. Start by looking at the sitemap:

  1. Is the sitemap logical?
  2. Are there any internal linking defects between pages?
  3. Are some pages too deep (more than 4 clicks)?
  4. Can the essential content be made more easily accessible through internal navigation?
  5. Are there pages without canonical tags, or pages that are not self-canonical? If so, check what they point to.
  6. By cross-checking with the sitemap data, identify (see the classification sketch after this checklist):
    • Active pages: generating at least one user visit,
    • Inactive pages: crawled, but not generating any visits,
    • Active orphan pages: no longer linked from any other page, but still generating SEO visits.

2. Then, within the internal linking, identify:

  1. Pages with very low or no traffic, by cross-referencing the crawler data with Analytics or the Search Console,
  2. Duplicated pages, and whether they have a canonical page,
  3. Whether anchor texts within or leading to a category are logical.

3. After that, you can work on the site architecture by choosing a well-defined structure:

  1. A silo system,
  2. A semantic cocoon, etc.

4. Examine web performance by page type, to detect problems related to each page category and see if improvements are possible.
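
Point 6 of the first step above (active, inactive and orphan pages) can be prototyped in a few lines once you have exported two lists of URLs: those reached by the crawler through internal links, and those that generated at least one visit according to Analytics, the Search Console or the logs. This is a rough sketch with placeholder file names; the classification itself is just a matter of set operations.

# Classify pages as active, inactive or active-orphan (sketch).
# The input files are placeholders: one URL per line, exported from your crawler
# and from your analytics / Search Console / log data.

def load_urls(path):
    with open(path, encoding="utf-8") as f:
        return {line.strip() for line in f if line.strip()}

crawled = load_urls("crawled_urls.txt")  # URLs reached by the crawler via internal links
visited = load_urls("visited_urls.txt")  # URLs that generated at least one visit

active = crawled & visited            # crawled and generating visits
inactive = crawled - visited          # crawled but generating no visits
active_orphans = visited - crawled    # generating visits but no longer linked internally

print(f"{len(active)} active pages")
print(f"{len(inactive)} inactive pages")
print(f"{len(active_orphans)} active orphan pages, for example:")
for url in sorted(active_orphans)[:10]:
    print("  ", url)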

Step 3: Perform a log analysis

What can a log analyzer detect?

The role of the log analyzer is to report the real activity of Google's robot on your site. The log analyzer:

  • Extracts data on the actual visits of Googlebot and of visitors to the site;
  • Makes this data, recorded by the server, usable on the platform used.

It provides information on:

  • Pages seen (or crawled) by Googlebot,
  • Pages not crawled by Googlebot,
  • The crawl frequency and its impact on visits, which helps estimate the load Google needs to scan the entire site, etc.

☝️ This analysis is particularly interesting for sites with a high volume of pages, such as e-commerce sites. It allows you to:

  • Check that new content has been found and crawled by Google;
  • Evaluate the frequency of updates triggered by the crawl;
  • Ensure the crawl budget is not used unnecessarily;
  • Identify which optimization actions to carry out on the site, and in which order of priority.

Log analysis can be based on:

  • SEO tools dedicated to log analysis, such as Logit,
  • A complete technical SEO platform, combining a crawler and a log analysis functionality,
  • Linux commands,
  • Dynamic Excel pivot tables.

Examples of checkpoints with the log analyzer

  • Crawl rate: the share of the site that was actually visited by Googlebot;
  • Crawl/visits ratio, i.e. the ratio of the crawl rate to the number of visits generated;
  • Crawl frequency: the delay between two Googlebot crawls of the site concerned.
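
Here is a minimal sketch of how these metrics can be approximated from a raw access log in the common "combined" format. The log path and the total number of indexable URLs are placeholders, the user-agent filter is naive (a rigorous audit verifies Googlebot hits via reverse DNS), and the crawl frequency is approximated as the average delay between two hits on the same URL.

# Approximate crawl metrics from a raw access log (sketch).
# Assumptions: combined log format; LOG_PATH and TOTAL_SITE_URLS are placeholders;
# Googlebot is identified by user-agent only (no reverse DNS verification).
import re
from collections import defaultdict
from datetime import datetime

LOG_PATH = "access.log"   # placeholder
TOTAL_SITE_URLS = 5000    # placeholder: number of indexable URLs on the site

# ip - - [date] "METHOD /path HTTP/1.1" status size "referer" "user-agent"
LINE_RE = re.compile(
    r'\S+ \S+ \S+ \[(?P<date>[^\]]+)\] "(?P<method>\S+) (?P<path>\S+) [^"]*" '
    r'(?P<status>\d{3}) \S+ "[^"]*" "(?P<agent>[^"]*)"'
)

hits = defaultdict(list)  # path -> datetimes of Googlebot hits

with open(LOG_PATH, encoding="utf-8", errors="replace") as f:
    for line in f:
        m = LINE_RE.match(line)
        if not m or "Googlebot" not in m.group("agent"):
            continue
        ts = datetime.strptime(m.group("date"), "%d/%b/%Y:%H:%M:%S %z")
        hits[m.group("path")].append(ts)

crawled_urls = len(hits)
crawl_rate = crawled_urls / TOTAL_SITE_URLS

# Average delay between two Googlebot hits on the same URL (crawl frequency proxy).
gaps = []
for times in hits.values():
    times.sort()
    gaps += [(b - a).total_seconds() / 86400 for a, b in zip(times, times[1:])]

print(f"URLs crawled by Googlebot: {crawled_urls}")
print(f"Crawl rate: {crawl_rate:.1%} of the site's {TOTAL_SITE_URLS} URLs")
if gaps:
    print(f"Average delay between two crawls of the same URL: {sum(gaps) / len(gaps):.1f} days")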

Verification path related to the analysis of the logs

  1. Segment the site, in order to prioritize SEO objectives by group of pages (a segmentation sketch follows this list). The idea here is to get:
  • a view of the most important categories of the site,
  • or a view by page type (for example, a view of all pages identified as SEO conversion pages).

  2. Control internal links in order to optimize them: use semantically related anchor text, with unbroken internal linking up to the topic's parent page.

  3. Set up log monitoring.

  4. Identify problems with abandoned pages, which reveal a failure in the internal linking.
        *An abandoned page is a page that is not linked from any other page, usually because the URLs that were supposed to point to it no longer exist.

  5. Combine crawl analysis with log analysis, using a dedicated solution.
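
As a complement to point 1, the segmentation itself can be prototyped before reaching for a dedicated platform. The sketch below groups Googlebot hits by the first path segment, as a rough proxy for the site's categories; the input file name is a placeholder (one crawled path per line, for example exported from the log parsing sketch above).

# Segment Googlebot hits by page group (sketch), using the first path segment
# as a rough proxy for the site's categories. The input file is a placeholder.
from collections import Counter

def segment(path):
    # "/blog/my-article" -> "blog"; "/" -> "home"
    parts = [p for p in path.split("?")[0].split("/") if p]
    return parts[0] if parts else "home"

with open("googlebot_paths.txt", encoding="utf-8") as f:  # placeholder file
    counts = Counter(segment(line.strip()) for line in f if line.strip())

for group, n in counts.most_common():
    print(f"{group:20s} {n} Googlebot hits")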

The OnCrawl SEO platform is a complete tool for managing the effectiveness of a website, allowing comparisons of several data sources (Logs, Google Analytics, Google Search Console, Majestic, AT Internet, etc.).

Thanks to the cross-referencing of crawl data and log files, you can:

  • Follow the path of the indexing robots on your site,
  • Find out if your most strategic pages are visited and indexed,
  • Understand how your SEO performance influences your crawl budget.

Cross-referencing crawl and log analysis on OnCrawl

Step 4: Study Web Performance

You can then focus on web performance, which is equivalent to a site speed audit.

What can web performance reveal?

The study of performance allows you to verify whether the site meets the technical criteria recommended by Google, namely:

  • Accessibility of the site,
  • Speed of the site,
  • Crawlability.

Checkpoints related to web performance

Here are the main points to check when it comes to web performance:

  1. Mobile-first indexing: the site must be fully usable in its mobile version and declared "mobile-first". Sub-optimal mobile navigation can be harmful to the site.

    By using an SEO tool or the Search Console, you can find out whether or not a website is "mobile-first", by seeing whether visits come from desktop or mobile.

Desktop vs mobile visits on OnCrawl

  2. Accessibility: is the site accessible to blind and visually impaired users? For example, check the presence of descriptions in the image alt attributes.

  3. Speed of the site (a rough measurement sketch appears after this checklist):
    1. Does the site display content quickly? Loading time is an essential part of the user experience. A site that loads in less than 2 seconds is considered to perform well.
    2. What factors slow down the loading of the site? TTFB (time to first byte), content weight, caching, CDN usage, hosting to be reviewed, etc.
  4. Visitors:
    1. Does the site serve its content identically to robots and visitors?
    2. To verify that the same content is offered as in a standard browser, use the cache: operator (cache:insertwebsitepage.com) and consult the page cached by Google.
  5. Security: is the site served over HTTPS (SSL)?
  6. International: is the hreflang integration complete? (For international sites.)
  7. Microdata: does the site have the important microdata to be eligible for certain display formats in the SERP (FAQ, Carousel, etc.)? You can use the Google structured data testing tool: https://search.google.com/structured-data/testing-tool.
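
To complement the speed checkpoint above, here is a rough, minimal sketch that measures the time to first byte and the full HTML download time of a few URLs with the third-party requests package (the URLs are placeholders). These figures ignore rendering, JavaScript and front-end assets, so treat them only as a first signal before a full audit with a tool such as PageSpeed Insights.

# Rough loading-time check (sketch): time to first byte and HTML download time per URL.
# Assumptions: `requests` installed; the URLs below are placeholders. These figures
# ignore browser rendering, JavaScript and assets.
import time
import requests

URLS = [
    "https://www.example.com/",           # placeholder
    "https://www.example.com/category/",  # placeholder
]

for url in URLS:
    start = time.perf_counter()
    resp = requests.get(url, timeout=15)
    total = time.perf_counter() - start
    ttfb = resp.elapsed.total_seconds()  # time until the response headers were received
    https = "yes" if resp.url.startswith("https://") else "no"
    print(f"{url}\n  TTFB: {ttfb:.2f}s  full download: {total:.2f}s  HTTPS: {https}")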

A word from the SEO expert - Cédric Cherchi

In response to the constraints linked to the predominance of mobile usage, Google announced the Speed Update at the very beginning of 2018. Since then, loading performance has become a mobile ranking criterion, while "mobile-first" indexing was already starting to be rolled out. By March 2021, 100% of websites will be analyzed exclusively in their mobile version by Google. More than ever, technical SEO audits must focus on loading performance and page display on mobile devices!

Follow up on the technical audit!

The technical SEO audit reveals all possible SEO optimizations to make the navigation path simple and fluid for both users and robots.

It's the foundation of a high-performance website with clear navigation and rich media, capable of hosting and showcasing quality content.

Besides the technical aspects, you can focus on the other parts of an SEO audit: creating unique and engaging content for your readers, and working on the site's popularity with netlinking.

And you, have you already implemented a technical SEO audit in your company? Do you have additional checkpoints or feedback to share?

