What is Crawl Budget?
Crawl budget is the number of pages a search engine will crawl on a website within a given timeframe (usually a day). It matters because a page that Googlebot never crawls cannot be indexed, and an unindexed page will not appear in search results for any query.
Crawl budget varies from day to day, though usually not by much. Googlebot may crawl 5, 5,000, or 50,000 pages on a website in a day. A site's crawl budget is typically determined by:
- Website size. Larger sites will often have a larger crawl budget.
- Website health based on the number of errors that crawlers encounter. A high number of errors could progressively depress the crawl budget.
- Number of links to the website. The more links a site has, the higher its crawl budget tends to be.
Crawl budget is not usually a problem for most websites. For large websites, however, a small crawl budget can considerably delay how long it takes search engines to notice and reflect changes to your site.
To check if a website has a crawl budget problem:
- Determine the total number of pages the website has. The number of URLs in the XML sitemaps could be a good starting point. If you have access to web server logs, that may provide an even more accurate metric.
- Go to Google Search Console -> Settings -> Crawl Stats.
- Note the average number of pages Google crawls each day. If the website has, say, 10 times more pages than Google crawls in a day, its crawl budget needs to be optimized.
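The check above can be scripted: count the URLs in the XML sitemap, then compare that total with the average daily crawl figure from Crawl Stats. A minimal sketch in Python; the inline sitemap and the crawl figure below are placeholders you would replace with your own site's data:

```python
import xml.etree.ElementTree as ET

# Standard sitemap namespace, per the sitemaps.org protocol.
SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def count_sitemap_urls(sitemap_xml: str) -> int:
    """Count <url> entries in a standard XML sitemap document."""
    root = ET.fromstring(sitemap_xml)
    return len(root.findall(f"{SITEMAP_NS}url"))

def crawl_budget_ratio(total_pages: int, avg_daily_crawls: int) -> float:
    """Pages per daily crawl; a ratio of ~10 or more suggests a crawl budget problem."""
    return total_pages / avg_daily_crawls

# Tiny inline sitemap for illustration; a real one would be fetched from the site.
sitemap = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc></url>
  <url><loc>https://example.com/about</loc></url>
</urlset>"""

print(count_sitemap_urls(sitemap))        # 2
print(crawl_budget_ratio(10_000, 500))    # 20.0 -> likely worth optimizing
```

For large sites split across a sitemap index, you would run the count over each child sitemap and sum the results.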
To improve crawl budget:
- Reduce errors by performing regular website maintenance.
- Block pages that do not need to be crawled or indexed (for example, via robots.txt or a noindex directive).
- Reduce the number of 301 redirects, since Google may queue both the original and the destination URL for crawling, which wastes crawl budget.
- Earn more links to the website through a sustained link-building effort.
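If server logs are available, the first and third points above can be monitored directly by tallying the status codes Googlebot receives: a rising share of 4xx/5xx errors or 301s is a signal to act. A minimal sketch, assuming logs in the common "combined" access-log format (field positions and user-agent strings may differ on your server):

```python
import re
from collections import Counter

# In combined log format, the status code is the first 3-digit number
# after the quoted request line, e.g. ... "GET / HTTP/1.1" 200 512 ...
STATUS_RE = re.compile(r'"\s(\d{3})\s')

def googlebot_status_counts(log_lines):
    """Tally HTTP status codes for requests whose user agent mentions Googlebot."""
    counts = Counter()
    for line in log_lines:
        if "Googlebot" not in line:
            continue  # skip ordinary visitors and other bots
        m = STATUS_RE.search(line)
        if m:
            counts[m.group(1)] += 1
    return counts

# Hypothetical sample lines for illustration.
sample = [
    '1.2.3.4 - - [01/Jan/2025:00:00:01 +0000] "GET / HTTP/1.1" 200 512 "-" "Mozilla/5.0 (compatible; Googlebot/2.1)"',
    '1.2.3.4 - - [01/Jan/2025:00:00:02 +0000] "GET /old HTTP/1.1" 301 0 "-" "Mozilla/5.0 (compatible; Googlebot/2.1)"',
    '5.6.7.8 - - [01/Jan/2025:00:00:03 +0000] "GET / HTTP/1.1" 200 512 "-" "SomeBrowser/1.0"',
]
print(googlebot_status_counts(sample))  # Counter({'200': 1, '301': 1})
```

Note that some crawlers spoof the Googlebot user agent, so for anything beyond a rough health check you would also verify the requesting IP belongs to Google.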