We are back to discuss technical issues in the new #AskGooglebot episode, the series in which Google offers tips and guidance in answer to questions from the international SEO community. Today’s theme is the crawl budget and the techniques for reducing a site’s impact on it; the video also touches on related aspects, such as caching and the reduction of embedded resources, including their effect on site speed for users.
Clarifications on Google’s crawl budget
The starting point for this reflection is a question submitted by a user on Twitter, who asks whether “an intense use of WRS can also reduce the crawl budget of a site”. The answer comes, as usual, from John Mueller, who begins by clarifying the definitions of the terms and activities involved.
The WRS is the Web Rendering Service, the system Googlebot uses to render pages as a browser would, so that it can index everything the same way users would see it.
The better-known term crawl budget refers to the system Google uses “to limit the number of requests we make to a server, so as not to cause problems during our crawling”.
How Google manages the crawl budget of a site
Mueller reiterates once again that the crawl budget is not something every site needs to worry about (a point also made in the SEO Mythbusting series), since generally “Google has no problem crawling enough URLs for most sites”.
Although there is no specific threshold or benchmark, according to the Google Webmaster Trends Analyst the crawl budget “is a topic that should mostly affect large websites, those with hundreds of thousands of URLs”.
In general, Google’s systems can automatically determine the maximum number of requests a server can handle over a given period of time. This operation is “done automatically and adjusted over time”, explains Mueller, because “as soon as we see that the server starts to slow down or to return server errors, we reduce the crawl budget available to our crawlers”.
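Since server errors are one of the signals that lead Google to throttle crawling, it can be useful to check how often Googlebot requests end in a 5xx response. The following is a minimal sketch, assuming access logs in the common Apache/nginx style; the function name and the sample lines are hypothetical, for illustration only.

```python
import re
from collections import Counter

# Hypothetical sketch: scan access-log lines in common log format and count
# the HTTP status codes returned to requests identifying as Googlebot, so
# you can spot the 5xx errors that push Google to reduce its crawl rate.
LOG_LINE = re.compile(r'"[A-Z]+ (?P<path>\S+) HTTP/[\d.]+" (?P<status>\d{3})')

def googlebot_status_counts(log_lines):
    """Count response status codes for requests whose user agent mentions Googlebot."""
    counts = Counter()
    for line in log_lines:
        if "Googlebot" not in line:
            continue  # ignore ordinary visitors and other bots
        match = LOG_LINE.search(line)
        if match:
            counts[match.group("status")] += 1
    return counts

# Hypothetical sample: two Googlebot hits (one of them a 503) and one browser hit.
sample = [
    '66.249.66.1 - - [01/Jan/2024] "GET /page HTTP/1.1" 200 512 "-" "Googlebot/2.1"',
    '66.249.66.1 - - [01/Jan/2024] "GET /slow HTTP/1.1" 503 0 "-" "Googlebot/2.1"',
    '10.0.0.5 - - [01/Jan/2024] "GET /page HTTP/1.1" 200 512 "-" "Mozilla/5.0"',
]
print(googlebot_status_counts(sample))  # only the two Googlebot lines are counted
```

A rising share of 5xx codes in a report like this is exactly the slowdown signal Mueller describes.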
Google and rendering
Google makes “wide use of caching to try to reduce the number of requests needed to render a page, but in most cases rendering requires more than a single request, so more than just the HTML file sent by the server”.
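The effect of such caching can be illustrated with a toy model; the snippet below is not Google’s implementation, just a sketch showing how memoizing fetched resources means that resources shared across pages cost only one request.

```python
from functools import lru_cache

# Illustrative sketch (not Google's actual system): a cached fetch function,
# so resources embedded by several pages are requested only once.
request_count = 0

@lru_cache(maxsize=None)
def fetch(url):
    global request_count
    request_count += 1  # a real request happens only on a cache miss
    return f"<contents of {url}>"

# Two hypothetical pages that both embed the same stylesheet and script:
pages = [["/site.css", "/app.js"], ["/site.css", "/app.js", "/extra.js"]]
for page_resources in pages:
    for url in page_resources:
        fetch(url)

print(request_count)  # 3 requests serve 5 resource references
```

The second page reuses the cached stylesheet and script, which is why rendering it adds only one request instead of three.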
Reducing embedded resources helps users as well
Ultimately, according to Mueller, especially when operating on large sites it can help crawling “to reduce the number of embedded resources needed to render a page”.
This technique also means faster pages for users, achieving two key results at once: more efficient crawling and a better user experience.
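To act on Mueller’s advice you first need to know how many embedded resources a page actually references. A minimal sketch using only the standard library is shown below; the class name is hypothetical, and the tag-to-attribute mapping is a simplification of what real pages contain.

```python
from html.parser import HTMLParser

# Hypothetical sketch: count the embedded resources (stylesheets, scripts,
# images, iframes) a page references, as a rough proxy for how many extra
# fetches rendering it will require beyond the HTML itself.
class ResourceCounter(HTMLParser):
    def __init__(self):
        super().__init__()
        self.resources = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "script" and "src" in attrs:
            self.resources.append(attrs["src"])
        elif tag == "link" and attrs.get("rel") == "stylesheet":
            self.resources.append(attrs.get("href"))
        elif tag in ("img", "iframe") and "src" in attrs:
            self.resources.append(attrs["src"])

# Hypothetical page with three embedded resources:
html = """<html><head>
<link rel="stylesheet" href="/site.css">
<script src="/app.js"></script>
</head><body><img src="/hero.png"><p>Inline text costs no extra fetch.</p>
</body></html>"""

parser = ResourceCounter()
parser.feed(html)
print(len(parser.resources))  # 3 resources to fetch besides the HTML
```

Trimming that list, for instance by combining stylesheets or dropping unused scripts, lightens both Google’s rendering work and the page load for users.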