THE DEALERON BLOG

Learn what a crawl budget is and how you can increase the rate at which pages are crawled in this week’s Wednesday Workshop.

Video Transcript

There are a lot of confusing concepts in SEO, especially when it comes to how the search algorithm evaluates your website.

One of the more confusing technical SEO elements is crawl budget.

There are a lot of misconceptions about what crawl budget actually is, so today we’re going to break the term down so you can better understand how your website is indexed.

The first thing we need to do is define it.

Very simply put, crawl budget is the number of pages on your website crawled and indexed by Googlebot in a given timeframe.

It balances the crawler’s attempt to crawl your domain with not overloading your servers.

Most of the time, crawl budget is not something you really need to worry about, as it is not a ranking factor for SEO.

However, understanding how it works will only make you more internet savvy, and help bolster other areas of optimization.

When discussing crawl budget, it is helpful to also know which factors affect it.

Site speed is a biggie.

Not only does improving site speed help user experience, it can also increase the rate at which your webpages are crawled.

Talk about a win-win!

Another factor is page errors.

Many 400-level or 500-level page errors can slow crawling down. This is just another good reason to make sure your site has proper redirects in place and no server errors.

Tools like Screaming Frog and Google Search Console can help you discover these errors.
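If you export a crawl report from one of those tools, you can bucket status codes yourself to see which URLs are wasting crawl budget. Here’s a minimal sketch, assuming a simple URL-to-status mapping pulled from a crawl export (the paths and function names are illustrative, not from the video):

```python
def classify_status(code: int) -> str:
    """Bucket an HTTP status code the way crawl reports usually do."""
    if 200 <= code < 300:
        return "ok"
    if 300 <= code < 400:
        return "redirect"
    if 400 <= code < 500:
        return "client error"   # e.g., 404s that waste crawl budget
    if 500 <= code < 600:
        return "server error"   # errors that can cause Googlebot to slow down
    return "other"

# Hypothetical crawl export: URL -> status code
crawl_results = {
    "/inventory": 200,
    "/old-specials": 404,
    "/contact": 500,
}

# Surface only the 400- and 500-level problems
errors = {url: code for url, code in crawl_results.items()
          if classify_status(code).endswith("error")}
print(errors)  # {'/old-specials': 404, '/contact': 500}
```

From there you can fix the 404s with redirects and investigate the server errors directly.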

So, now that you understand a little bit more about crawl budget, how can you increase the rate at which pages are crawled?

A no-brainer first step is to make sure that robots.txt is crawlable. This file tells Googlebot what it is and isn’t allowed to crawl in the first place.
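For reference, a minimal robots.txt looks something like this (the paths and sitemap URL are illustrative, not from the video):

```
# Allow all crawlers everywhere except the admin area
User-agent: *
Disallow: /admin/
Allow: /

# Point crawlers at the XML sitemap
Sitemap: https://www.example.com/sitemap.xml
```

The key thing is that the file itself is reachable at `/robots.txt` and doesn’t accidentally disallow pages you want indexed.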

Then you should make sure your XML sitemap is up to date.

Googlebot will have a much easier time understanding how to follow internal links and canonicals when you keep your XML sitemap updated both on your website and with Google.
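A sitemap entry is just a list of canonical URLs with optional metadata. A minimal sketch in the standard sitemaps.org format (the URL and date are illustrative, not from the video):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/inventory</loc>
    <lastmod>2021-06-01</lastmod>
  </url>
</urlset>
```

Once it’s published on your site, you can also submit the sitemap URL in Google Search Console so Google knows where to find it.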

Another easy step is to make sure you don’t have any daisy chain redirects on your webpages.

A daisy chain redirect occurs when you redirect a redirected page.

For example, if you have page A redirecting to page B, but you want pages A and B to redirect to page C, you shouldn’t simply redirect B to C; rather, redirect page A to page C as well.

Reducing chain redirects is very important for crawl speed and site speed.
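If you maintain your redirects as a simple source-to-destination mapping, flattening chains can be automated. A minimal sketch, assuming a dictionary of redirects (the function name and example paths are hypothetical):

```python
def flatten_redirects(redirects: dict) -> dict:
    """Resolve each source URL to its final destination, removing daisy chains."""
    flattened = {}
    for src in redirects:
        seen = {src}
        dest = redirects[src]
        while dest in redirects:      # keep following the chain
            if dest in seen:          # guard against redirect loops
                raise ValueError(f"redirect loop at {dest}")
            seen.add(dest)
            dest = redirects[dest]
        flattened[src] = dest
    return flattened

# Page A -> Page B -> Page C becomes two direct hops to Page C
chain = {"/page-a": "/page-b", "/page-b": "/page-c"}
print(flatten_redirects(chain))  # {'/page-a': '/page-c', '/page-b': '/page-c'}
```

After flattening, every redirect resolves in a single hop, which is exactly what you want for both crawlers and visitors.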

As you can see, there are a number of ways to impact your crawl budget and get more pages crawled in less time, and any improvement you make could go a long way for your website.

That’s all the time we have left for today’s workshop.

As always, if you have questions or comments, leave ‘em down below and we’ll get back to you shortly.

Thanks for watching.

We’ll see you next time with another Wednesday Workshop from DealerOn.

Author DealerOn Admin
