This guide explores Ubersuggest's Site Audit tool, detailing how it identifies On-Page SEO issues so they can be resolved to improve website performance and search rankings. It also covers common challenges encountered during a site audit and provides actionable recommendations to resolve them effectively.
- What Is the Site Audit
- How the Site Audit Works
- Troubleshooting Non-Crawled Websites
- Common Audit Issues and Resolutions
- Disabling Crawl on Specific Pages
What Is the Site Audit
The Site Audit feature is one of Ubersuggest's more advanced tools, providing a detailed technical analysis of a website. While it offers in-depth insights, its effectiveness depends on several external factors beyond Ubersuggest's control. In many cases, resolving certain issues may require the expertise of a webmaster or developer.
How the Site Audit Works
Ubersuggest starts the audit by checking the secure version of the entered site, e.g. https://example.com. If an SSL certificate is not available, it falls back to the HTTP version. If the main URL starts with ‘www’ and there is no proper redirect from the non-www version, users won’t be able to run the crawl with Ubersuggest. Ideally, all versions of the domain should point to the primary URL via redirects.
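For anyone who wants to verify this before running an audit, here is a minimal sketch (not part of Ubersuggest, and assuming the requests library is installed) that follows each common variant of a domain and reports where it ends up; example.com is a placeholder.

```python
# Minimal sketch: follow redirects from the common URL variants of a domain and
# report the final destination of each. "example.com" is a placeholder domain.
import requests

VARIANTS = [
    "http://example.com",
    "http://www.example.com",
    "https://example.com",
    "https://www.example.com",
]

def final_url(url: str) -> str:
    """Follow redirects and return the URL this variant finally resolves to."""
    try:
        response = requests.get(url, allow_redirects=True, timeout=10)
        return response.url
    except requests.RequestException as exc:
        return f"error: {exc}"

if __name__ == "__main__":
    for variant in VARIANTS:
        print(f"{variant} -> {final_url(variant)}")
    # If the destinations differ, at least one variant is missing a redirect to
    # the primary URL, which can prevent the audit from crawling the site.
```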
Remember that the tool can only audit HTML pages. Many users want full site crawls, but pages that are not served as HTML will not be included. The crawler navigates users' sites through their internal links; if a link carries a "nofollow" attribute, the page it points to will not be audited further. It's therefore important to make sure all internal links work properly and lead to the right pages to keep the site's SEO in good shape. Neil prepared a comprehensive guide on internal linking.
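To see which internal links on a given page carry a "nofollow" attribute, a small script can help. The sketch below is a simplified illustration (assuming the requests and beautifulsoup4 packages are installed), not a reproduction of Ubersuggest's crawler; example.com is a placeholder.

```python
# Minimal sketch: list the internal links on one page and flag any marked
# rel="nofollow", which a crawler honoring that attribute would not follow.
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup

PAGE = "https://example.com/"  # placeholder URL

def internal_links(page_url: str):
    html = requests.get(page_url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    site_host = urlparse(page_url).netloc
    for anchor in soup.find_all("a", href=True):
        absolute = urljoin(page_url, anchor["href"])
        if urlparse(absolute).netloc == site_host:
            # bs4 returns rel as a list of tokens when the attribute is present.
            nofollow = "nofollow" in anchor.get("rel", [])
            yield absolute, nofollow

if __name__ == "__main__":
    for url, is_nofollow in internal_links(PAGE):
        print(f"[{'NOFOLLOW' if is_nofollow else 'follow'}] {url}")
```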
Troubleshooting Non-Crawled Websites
Ubersuggest makes every effort to crawl as many URLs as possible, focusing specifically on HTML pages. However, certain factors can obstruct this process:
- Loading Times: If a page takes too long to load, the crawl may time out. Running a Google PageSpeed Insights test can help identify and address loading issues.
- Server or Host Blocks: Ubersuggest crawlers use various IPs, which may sometimes be mistaken for scraping attempts by hosts with firewalls or those using services like Cloudflare. To avoid this, adding Ubersuggest’s IPs as exceptions is recommended.
Other, less common issues include blocks set via robots.txt or .htaccess files and canonical loops. A quick way to self-check response time and robots.txt rules is sketched below.
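This is a minimal diagnostic sketch only: it assumes the requests library is installed, example.com is a placeholder, and the user-agent token is hypothetical (the actual crawler user agent should be confirmed with Ubersuggest support).

```python
# Minimal sketch: time how long a page takes to respond and check whether the
# site's robots.txt blocks a given user agent. Placeholders: example.com and
# the "ExampleAuditBot" user-agent token (not Ubersuggest's real user agent).
import time
from urllib import robotparser
from urllib.parse import urlparse

import requests

PAGE = "https://example.com/"
USER_AGENT = "ExampleAuditBot"  # hypothetical user-agent token

def response_time(url: str) -> float:
    """Return how many seconds the page took to respond."""
    start = time.monotonic()
    requests.get(url, timeout=30)
    return time.monotonic() - start

def allowed_by_robots(url: str, user_agent: str) -> bool:
    """Read the site's robots.txt and report whether the URL may be fetched."""
    parsed = urlparse(url)
    parser = robotparser.RobotFileParser()
    parser.set_url(f"{parsed.scheme}://{parsed.netloc}/robots.txt")
    parser.read()
    return parser.can_fetch(user_agent, url)

if __name__ == "__main__":
    print(f"Response time: {response_time(PAGE):.2f}s")
    print(f"Allowed by robots.txt: {allowed_by_robots(PAGE, USER_AGENT)}")
```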
Common Audit Issues and Resolutions
Errors Persisting Post-Fix
If errors continue to show up after making corrections, PRO users should consider performing a recrawl. This can often provide updated results. It’s also helpful to wait a bit between crawls and clear the browser cache if changes aren’t showing up.
Missing SSL Certificate and Sitemap
If Ubersuggest only crawls one page, it could mean there’s no SSL certificate or sitemap in place. It's important to check the sitemap structure and functionality using Google Search Console to ensure everything is set up correctly.
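As a rough self-check alongside Google Search Console, the sketch below (assuming the requests library is installed; example.com is a placeholder, and /sitemap.xml is only the conventional location, not a path Ubersuggest requires) confirms that HTTPS responds and that a sitemap is reachable.

```python
# Minimal sketch: confirm the HTTPS version of a site responds (requests verifies
# the SSL certificate by default) and that a sitemap is reachable at the
# conventional /sitemap.xml location. "example.com" is a placeholder domain.
import requests

DOMAIN = "example.com"

def has_working_https(domain: str) -> bool:
    """Return True if an HTTPS request succeeds with a valid certificate."""
    try:
        requests.get(f"https://{domain}/", timeout=10)
        return True
    except requests.exceptions.RequestException:
        return False

def sitemap_status(domain: str) -> int:
    """Return the HTTP status code of the conventional sitemap location."""
    return requests.get(f"https://{domain}/sitemap.xml", timeout=10).status_code

if __name__ == "__main__":
    if has_working_https(DOMAIN):
        print("HTTPS works")
        print(f"/sitemap.xml status: {sitemap_status(DOMAIN)}")
    else:
        print("HTTPS request failed; check the SSL certificate")
```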
Handling 404 Errors and Broken Links
Ubersuggest can identify pages returning 404 errors but may not always indicate where these errors originate. It’s important to check for typos in internal links, as even small mistakes can cause 404 errors. There are various tools and browser extensions available to help detect and fix these issues efficiently.
Learn how to fix broken link issues here.
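For tracing where a 404 is linked from, a small script can complement those tools. The sketch below is a simplified illustration (assuming the requests and beautifulsoup4 packages are installed; example.com is a placeholder) that checks the internal links on a single page and reports any that return a 404.

```python
# Minimal sketch: collect the internal links on one page and report any that
# return a 404, together with the page that links to them. Some servers reject
# HEAD requests; swap requests.head for requests.get if that happens.
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup

PAGE = "https://example.com/"  # placeholder URL

def broken_internal_links(page_url: str):
    html = requests.get(page_url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    host = urlparse(page_url).netloc
    seen = set()
    for anchor in soup.find_all("a", href=True):
        target = urljoin(page_url, anchor["href"])
        if urlparse(target).netloc != host or target in seen:
            continue
        seen.add(target)
        status = requests.head(target, allow_redirects=True, timeout=10).status_code
        if status == 404:
            yield target

if __name__ == "__main__":
    for broken in broken_internal_links(PAGE):
        print(f"404: {broken} (linked from {PAGE})")
```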
Duplicate Tags and Meta Descriptions on Paginated Pages
It’s common to see duplicate tags on paginated sequences (e.g., www.example.com/page1 and www.example.com/page2). This is generally acceptable since Google understands these pages as part of a sequence.
Learn more about pagination here.
Low Word Count on Non-Content Pages
Pages like "Contact Us" or "About" often have low word counts, which usually won’t impact users' SEO strategy if these pages aren’t meant to rank. Keep in mind that Ubersuggest flags pages with fewer than 200 words as having a low word count.
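To get a rough sense of a page's word count before the audit flags it, something like the sketch below works (assuming requests and beautifulsoup4 are installed; the URL is a placeholder, and this simple count will not necessarily match Ubersuggest's own method).

```python
# Minimal sketch: count the reader-visible words on a page and compare the
# total against the 200-word threshold mentioned above.
import requests
from bs4 import BeautifulSoup

PAGE = "https://example.com/contact"  # placeholder URL

def visible_word_count(url: str) -> int:
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    # Drop script/style content so only reader-visible text is counted.
    for tag in soup(["script", "style", "noscript"]):
        tag.decompose()
    return len(soup.get_text(separator=" ").split())

if __name__ == "__main__":
    count = visible_word_count(PAGE)
    print(f"{PAGE}: {count} words ({'below' if count < 200 else 'at or above'} 200)")
```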
Disabling Crawl on Specific Pages
To exclude specific pages from the Ubersuggest crawl, especially for domains with multiple subdomains, users should update the robots.txt file.
This will prevent Ubersuggest bots from crawling pages in designated directories, such as a "blog" directory or subdomain.
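A draft rule can be validated before it goes live using the Python standard library. This is only a sketch: the "UbersuggestBot" user-agent token is hypothetical (confirm the exact string with Ubersuggest support), and /blog/ stands in for whichever directory should be excluded.

```python
# Minimal sketch: validate a draft robots.txt rule before deploying it.
# "UbersuggestBot" is a hypothetical user-agent token (confirm the real one with
# Ubersuggest support) and /blog/ is a placeholder directory.
from urllib import robotparser

DRAFT_ROBOTS_TXT = """\
User-agent: UbersuggestBot
Disallow: /blog/
"""

parser = robotparser.RobotFileParser()
parser.parse(DRAFT_ROBOTS_TXT.splitlines())

for url in ("https://example.com/blog/post-1", "https://example.com/pricing"):
    allowed = parser.can_fetch("UbersuggestBot", url)
    print(f"{'allowed' if allowed else 'blocked'}: {url}")
```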
For more information or further assistance, please contact the Ubersuggest Support Team at support@ubersuggest.com. We're here to help ensure that every SEO strategy is as effective and efficient as possible.