Ubersuggest doesn’t allow you to choose which URLs to crawl, because it analyzes the whole website.
There’s no guaranteed crawl order beyond the starting point, so Ubersuggest may crawl different pages each time you perform the audit.
Our crawlers are set to honor sitemaps and discover URLs on pages. Ubersuggest will always start with the homepage of the given website and, from there, follow the links it finds to crawl all other pages.
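The homepage-first discovery described above can be sketched as a breadth-first traversal. This is an illustrative example, not Ubersuggest's actual crawler: the `links` dictionary stands in for real HTTP fetches and HTML link extraction, and the `limit` parameter mimics a plan's page cap.

```python
# Sketch of a homepage-first, breadth-first crawl order.
# The link graph is a stand-in for fetching and parsing real pages.
from collections import deque

def crawl_order(homepage: str, links: dict[str, list[str]], limit: int = 10) -> list[str]:
    """Return pages in the order a breadth-first crawler would visit them."""
    visited, queue, order = {homepage}, deque([homepage]), []
    while queue and len(order) < limit:
        page = queue.popleft()
        order.append(page)
        for url in links.get(page, []):  # URLs discovered on this page
            if url not in visited:
                visited.add(url)
                queue.append(url)
    return order

site = {
    "/": ["/about", "/blog"],
    "/blog": ["/blog/post-1", "/about"],
}
print(crawl_order("/", site))
# ['/', '/about', '/blog', '/blog/post-1']
```

Note that pages reachable only from deep links are found later, which is one reason a sitemap helps the crawler cover the whole site.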
The sitemap plays a role in this crawling process as well: the crawlers will look for the sitemap in its default location (example.com/sitemap.xml) or in a location specified in the robots.txt file (example.com/robots.txt).
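The sitemap lookup order above can be sketched as follows. This is a hypothetical illustration, not Ubersuggest's implementation: it parses `Sitemap:` directives out of a robots.txt body and falls back to the conventional /sitemap.xml location when none are declared.

```python
# Sketch of sitemap discovery: prefer "Sitemap:" lines in robots.txt,
# otherwise fall back to the default /sitemap.xml location.
def discover_sitemaps(base_url: str, robots_txt: str) -> list[str]:
    """Return sitemap URLs declared in robots.txt, or the default location."""
    sitemaps = []
    for line in robots_txt.splitlines():
        # robots.txt directive names are case-insensitive, e.g. "Sitemap: <url>"
        key, _, value = line.partition(":")
        if key.strip().lower() == "sitemap" and value.strip():
            sitemaps.append(value.strip())
    return sitemaps or [base_url.rstrip("/") + "/sitemap.xml"]

robots = """User-agent: *
Disallow: /private/
Sitemap: https://example.com/sitemap_index.xml
"""
print(discover_sitemaps("https://example.com", robots))
# ['https://example.com/sitemap_index.xml']
print(discover_sitemaps("https://example.com", "User-agent: *"))
# ['https://example.com/sitemap.xml']
```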
Crawling the whole website will depend on the website's size and the plan limits.
The higher the plan tier, the more pages it crawls.
At the moment, the Site Audit can crawl a maximum of 10,000 pages. To make sure the tool finds issues across your whole website, it may be a good idea to upgrade your plan if necessary.
For any questions, contact us at firstname.lastname@example.org.