Ubersuggest's Site Audit feature surfaces the on-page SEO errors that could be holding your website back from its full optimization potential.
The Site Audit is one of Ubersuggest's more advanced features. It gives you a technical picture of your site, but the effectiveness of a crawl also depends on several external factors that are outside Ubersuggest's control. In many cases, the issues it flags may require the help of a webmaster or developer.
In this guide, we cover the most common causes of audit problems and recommend ways to resolve them:
- How the Site Audit Works
- Troubleshooting Non-Crawled Websites
- Common Audit Issues and Resolutions
- Disabling Crawl on Specific Pages
How the Site Audit Works
Ubersuggest starts the audit by checking the secure version of the submitted site, e.g. https://example.com. If an SSL certificate is not available, it falls back to the HTTP version. If the main URL starts with ‘www’ and there isn’t a proper redirect from the non-www version, the crawl cannot be performed. Ideally, all versions of the domain should point to the primary URL via redirects.
Learn more about redirects.
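If you'd like to verify this yourself, the sketch below (using the third-party Python requests library, with example.com standing in for your domain) requests each variation of the domain and prints where it ends up, so you can confirm that every version redirects to the primary URL.

```python
# A minimal check (example.com is a placeholder): request each variation of the
# domain and confirm that all of them end up at the same primary URL.
import requests

VARIANTS = [
    "http://example.com",
    "http://www.example.com",
    "https://example.com",
    "https://www.example.com",
]

for url in VARIANTS:
    try:
        response = requests.get(url, allow_redirects=True, timeout=10)
        print(f"{url} -> {response.url} ({response.status_code})")
    except requests.RequestException as exc:
        print(f"{url} -> request failed: {exc}")
```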
Remember that the tool can only audit HTML pages. Many users expect a full site crawl, but pages that are not served as HTML will not be included. The crawler discovers pages by following internal links; if a link carries a "nofollow" attribute, the page it points to will not be audited any further. Make sure all internal links work properly and lead to the right pages to keep your SEO in good shape.
Explore our comprehensive guide on internal linking.
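To see which internal links on a page carry a nofollow attribute, you can run a quick check like the sketch below. It uses Python's built-in HTML parser plus the third-party requests library, and example.com is a placeholder for your own page.

```python
# A rough sketch (placeholder URL): list the internal links on a page and flag
# any marked rel="nofollow", since those links will not be followed by a crawler.
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse

import requests

PAGE = "https://example.com/"  # placeholder: the page you want to inspect

class LinkCollector(HTMLParser):
    """Collect (href, rel) pairs from <a> tags."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            attrs = dict(attrs)
            if attrs.get("href"):
                self.links.append((attrs["href"], attrs.get("rel") or ""))

collector = LinkCollector()
collector.feed(requests.get(PAGE, timeout=10).text)

site_host = urlparse(PAGE).netloc
for href, rel in collector.links:
    absolute = urljoin(PAGE, href)
    if urlparse(absolute).netloc == site_host:  # internal links only
        note = "  <- nofollow, will not be crawled" if "nofollow" in rel.lower() else ""
        print(absolute + note)
```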
Troubleshooting Non-Crawled Websites
Ubersuggest makes every effort to crawl as many URLs as possible, focusing specifically on HTML pages. However, certain factors can obstruct this process:
Loading Times: If a page takes too long to load, the crawl might time out. Running a Google Page Speed test can help identify and address any loading issues.
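A quick way to spot a slow page from your own machine is to time the response directly. The sketch below uses the third-party Python requests library and a placeholder URL; it measures the time to the server's response (not a full browser render), so treat it as a rough indicator.

```python
# A rough timing check (placeholder URL): pages that exceed the timeout here are
# likely to time out during a crawl as well. Measures time to the server's
# response, not a full browser render.
import requests

url = "https://example.com/"
try:
    response = requests.get(url, timeout=30)
    print(f"{url} responded in {response.elapsed.total_seconds():.2f}s "
          f"(status {response.status_code})")
except requests.Timeout:
    print(f"{url} did not respond within 30 seconds")
```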
Server or Host Blocks: Ubersuggest crawlers use various IPs, which may sometimes be mistaken for scraping attempts by hosts with firewalls or those using services like Cloudflare. To avoid this, adding Ubersuggest’s IPs as exceptions is recommended.
Other less common issues include blocks set via robots.txt, .htaccess files, or canonical loops.
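Canonical loops in particular can be hard to spot by eye. The following rough sketch (placeholder URL, simplified regex-based parsing, third-party requests library) follows each page's rel="canonical" tag and reports whether the chain ends normally or loops back on itself.

```python
# A rough canonical-loop check (placeholder URL, simplified regex parsing).
import re
from urllib.parse import urljoin

import requests

def canonical_of(url):
    """Return the canonical URL declared on a page, or None if there isn't one."""
    html = requests.get(url, timeout=10).text
    match = re.search(
        r'<link[^>]*rel=["\']canonical["\'][^>]*href=["\']([^"\']+)', html, re.I
    )
    return urljoin(url, match.group(1)) if match else None

url = "https://example.com/"  # placeholder starting page
seen = []
while url and url not in seen:
    seen.append(url)
    url = canonical_of(url)

if url is None:
    print("No canonical tag found on:", seen[-1])
elif url == seen[-1]:
    print("Self-referencing canonical (OK):", " -> ".join(seen))
else:
    print("Canonical loop detected:", " -> ".join(seen + [url]))
```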
Common Audit Issues and Resolutions
Errors Persisting Post-Fix: If errors continue to show up after making corrections, PRO users should consider performing a recrawl. This can often provide updated results. It’s also helpful to wait a bit between crawls and clear the browser cache if changes aren’t showing up.
Missing SSL Certificate and Sitemap: If Ubersuggest only crawls one page, it could mean there’s no SSL certificate or sitemap in place. It's important to check the sitemap structure and functionality using Google Search Console to ensure everything is set up correctly.
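You can also run a quick check from your own machine. The sketch below (third-party Python requests library, example.com as a placeholder) confirms that the HTTPS version of the site responds with a valid certificate and that a sitemap is reachable at the conventional /sitemap.xml location; note that some sites publish their sitemap elsewhere.

```python
# A quick connectivity check (placeholder domain): verify the SSL certificate and
# look for a sitemap at the conventional location.
import requests

domain = "example.com"

try:
    requests.get(f"https://{domain}", timeout=10)  # requests verifies SSL by default
    print("HTTPS/SSL: OK")
except requests.exceptions.SSLError as exc:
    print("HTTPS/SSL: certificate problem ->", exc)
except requests.RequestException as exc:
    print("HTTPS/SSL: request failed ->", exc)

try:
    sitemap = requests.get(f"https://{domain}/sitemap.xml", timeout=10)
    print("Sitemap:", "found" if sitemap.ok else f"not found (status {sitemap.status_code})")
except requests.RequestException as exc:
    print("Sitemap: request failed ->", exc)
```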
Handling 404 Errors and Broken Links: Ubersuggest can identify pages returning 404 errors but may not always indicate where these errors originate. It’s important to check for typos in internal links, as even small mistakes can cause 404 errors. There are various tools and browser extensions available to help detect and fix these issues efficiently.
Learn how to fix broken link issues here.
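If you already have a list of internal URLs (for example, exported from your CMS or sitemap), a simple status-code check like the sketch below can flag the ones returning 404. It uses the third-party Python requests library; some servers reject HEAD requests, in which case switching to GET works too.

```python
# A simple status-code check (placeholder URLs): flag internal links returning 404.
import requests

urls = [
    "https://example.com/",
    "https://example.com/about",
    "https://example.com/old-page",
]

for url in urls:
    try:
        # Some servers reject HEAD requests; switch to requests.get if needed.
        status = requests.head(url, allow_redirects=True, timeout=10).status_code
    except requests.RequestException:
        status = "request failed"
    print(f"{'BROKEN' if status == 404 else status}\t{url}")
```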
Duplicate Tags and Meta Descriptions on Paginated Pages: It’s common to see duplicate tags on paginated sequences (e.g., www.example.com/page1 and www.example.com/page2). This is generally acceptable since Google understands these pages as part of a sequence.
Learn more about pagination here.
Low Word Count on Non-Content Pages: Pages like "Contact Us" or "About" often have low word counts, which usually won’t impact users' SEO strategy if these pages aren’t meant to rank. Keep in mind that Ubersuggest flags pages with fewer than 200 words as having a low word count.
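If you want a rough idea of how a page measures up against that 200-word threshold before an audit, a quick approximation like the one below can help. It strips tags from the HTML and counts the remaining words, so the figure won't match Ubersuggest's count exactly (placeholder URL, third-party requests library).

```python
# A rough word-count approximation (placeholder URL): strips tags and counts the
# remaining words, so the figure only approximates Ubersuggest's own count.
import re

import requests

url = "https://example.com/contact"
html = requests.get(url, timeout=10).text

text = re.sub(r"(?s)<(script|style).*?</\1>", " ", html)  # drop scripts and styles
text = re.sub(r"<[^>]+>", " ", text)                       # drop remaining tags
words = len(text.split())

flag = " (below the 200-word threshold)" if words < 200 else ""
print(f"{url}: ~{words} words{flag}")
```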
Disabling Crawl on Specific Pages
To exclude specific pages from the Ubersuggest crawl, especially for domains with multiple subdomains, update the robots.txt file.
This will prevent our bots from crawling pages in designated directories or subdomains, such as a blog.
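As an illustration, the rules below exclude everything under a /blog/ directory. Note that User-agent: * applies to all crawlers, including search engines, so check Ubersuggest's documentation for its specific bot name if you only want to exclude Ubersuggest's crawler. The snippet uses Python's built-in urllib.robotparser to verify the rule before you rely on it.

```python
# Example robots.txt rules that exclude everything under /blog/, verified with
# Python's built-in robots.txt parser. "User-agent: *" applies to ALL crawlers,
# including search engines; use Ubersuggest's specific bot name (see their docs)
# if you only want to exclude Ubersuggest.
from urllib.robotparser import RobotFileParser

robots_txt = """\
User-agent: *
Disallow: /blog/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

print(parser.can_fetch("*", "https://example.com/blog/post-1"))  # False: excluded
print(parser.can_fetch("*", "https://example.com/contact"))      # True: still crawlable
```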
If you need additional help or run into more complex issues, don't hesitate to contact our support team at support@ubersuggest.com. We're here to ensure your SEO strategy is as effective and efficient as possible.