
Full Technical SEO
Site Audit

All SEO campaigns should start with an SEO audit.
The purpose of running a technical SEO audit is to understand the current issues that need fixing, which in turn will improve your site's health and performance from an SEO perspective.

Our comprehensive audit covers over 120 individual aspects of your website's health. The report includes instructions on how to fix the issues it highlights.


Our Technical SEO Checklist

We perform a full technical SEO analysis and produce a site report. This report contains the following:

  • A historical view of your site's traffic: where it has been and where it is now.
  • A technical analysis covering a 120-point checklist performed on the website.

Below are the technical SEO areas we review:

  • AMP-related issues
  • Broken internal links
  • Duplicate content issues
  • External images that are broken
  • External links that are broken
  • Homepage does not use HTTPS encryption (no SSL)
  • Hreflang conflicts within page source code
  • HTTP URLs in sitemap.xml for HTTPS site
  • Images that don’t have alt attributes
  • Incorrect pages found in sitemap.xml
  • Internal images that are broken
  • Issues with blocked external resources in robots.txt
  • Issues with blocked internal resources in robots.txt
  • Issues with broken external JavaScript and CSS files
  • Issues with broken internal JavaScript and CSS files
  • Issues with duplicate title tags
  • Issues with expiring or expired certificates (SSL)
  • Issues with hreflang values
  • Issues with incorrect certificate name (SSL)
  • Issues with incorrect hreflang links
  • Issues with mixed content
  • Issues with old security protocols
  • Issues with uncached JavaScript and CSS files
  • Issues with uncompressed JavaScript and CSS files
  • Issues with unminified JavaScript and CSS files
  • Links on HTTPS pages that lead to an HTTP page
  • No redirect or canonical to the HTTPS homepage from the HTTP version
  • Non-secure pages
  • Orphaned pages in sitemaps
  • Outgoing external links that contain nofollow attributes
  • Outgoing internal links that contain the nofollow attribute
  • Pages blocked by the X-Robots-Tag: noindex HTTP header
  • Pages that are blocked from crawling
  • Pages that contain frames
  • Pages that couldn’t be crawled
  • Pages that couldn’t be crawled (DNS resolution issues)
  • Pages that couldn’t be crawled (incorrect URL formats)
  • Pages that don’t have a title tag
  • Pages that don’t have an h1 heading
  • Pages that don’t have character encoding declared
  • Pages that don’t have doctype declared
  • Pages that don’t have enough text within the title tags
  • Pages that don’t have meta descriptions
  • Pages that have a JavaScript and CSS total size that is too large
  • Pages that have a low word count
  • Pages that have a meta refresh tag
  • Pages that have a WWW resolve issue
  • Pages that have duplicate H1 and title tags
  • Pages that have duplicate meta descriptions
  • Pages that have hreflang language mismatch issues
  • Pages that have low text-to-HTML ratio
  • Pages that have more than one H1 tag
  • Pages that have multiple canonical URLs
  • Pages that have no hreflang and lang attributes
  • Pages that have no viewport tag
  • Pages that have only one incoming internal link
  • Pages that have slow load speed
  • Pages that have temporary redirects
  • Pages that have too large an HTML size
  • Pages that have too many on-page links
  • Pages that have too many parameters in their URLs
  • Pages that have too much text within the title tags
  • Pages that have underscores in the URL
  • Pages that need more than 3 clicks to be reached
  • Pages that returned a 4XX status code
  • Pages that returned a 5XX status code
  • Pages that use Flash
  • Pages that use too many JavaScript and CSS files
  • Pages with a broken canonical link
  • Redirect chains and loops
  • Robots.txt file format errors
  • Robots.txt not found
  • Sitemap.xml files are too large
  • Sitemap.xml files have format errors
  • Sitemap.xml not found
  • Sitemap.xml not indicated in robots.txt
  • Subdomains that don’t support HSTS
  • Subdomains that don’t support secure encryption algorithms
  • Subdomains that don’t support SNI
  • Uncompressed pages
  • URLs on pages that are too long
  • URLs with a permanent redirect
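To illustrate, a handful of the on-page checks in the list above (missing title tag, more than one h1, missing meta description, missing viewport tag, images without alt attributes) can be automated with nothing more than Python's standard-library HTML parser. This is a minimal sketch for illustration only, not our actual audit tooling, and the function name `audit_page` is hypothetical:

```python
from html.parser import HTMLParser


class AuditParser(HTMLParser):
    """Collects simple on-page SEO signals while parsing HTML."""

    def __init__(self):
        super().__init__()
        self.title_count = 0
        self.h1_count = 0
        self.has_meta_description = False
        self.has_viewport = False
        self.images_missing_alt = 0

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self.title_count += 1
        elif tag == "h1":
            self.h1_count += 1
        elif tag == "meta":
            name = (attrs.get("name") or "").lower()
            if name == "description":
                self.has_meta_description = True
            elif name == "viewport":
                self.has_viewport = True
        elif tag == "img" and not attrs.get("alt"):
            # Counts both a missing alt attribute and an empty alt="".
            self.images_missing_alt += 1


def audit_page(html: str) -> list[str]:
    """Return a list of issue descriptions found in the given HTML."""
    p = AuditParser()
    p.feed(html)
    issues = []
    if p.title_count == 0:
        issues.append("Page doesn't have a title tag")
    if p.h1_count == 0:
        issues.append("Page doesn't have an h1 heading")
    if p.h1_count > 1:
        issues.append("Page has more than one H1 tag")
    if not p.has_meta_description:
        issues.append("Page doesn't have a meta description")
    if not p.has_viewport:
        issues.append("Page has no viewport tag")
    if p.images_missing_alt:
        issues.append(f"{p.images_missing_alt} image(s) don't have alt attributes")
    return issues


if __name__ == "__main__":
    sample = (
        "<html><head><title>Home</title></head>"
        "<body><h1>A</h1><h1>B</h1><img src='x.png'></body></html>"
    )
    for issue in audit_page(sample):
        print(issue)
```

Checks that require crawling, DNS resolution, SSL inspection, or log analysis (the bulk of the list) obviously need far more machinery than this; the point is only that each checklist item maps to a concrete, testable condition.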