Google: Stop Using 403s or 404s To Reduce Googlebot Crawl Rates

Feb 17, 2023 - 7:51 am


Gary Illyes posted a new blog post on the Google Search Central site asking all of you to stop using 403 and 404 server status codes to reduce the crawl rate of Googlebot. He said they have seen an uptick in the number of sites and CDNs doing this and they need to cut it out.

Gary wrote, "Over the last few months we noticed an uptick in website owners and some content delivery networks (CDNs) attempting to use 404 and other 4xx client errors (but not 429) to attempt to reduce Googlebot's crawl rate." He added, "The short version of this blog post is: please don't do that."

Instead, he said Google has documentation about how to reduce Googlebot's crawl rate. "Read that instead and learn how to effectively manage Googlebot's crawl rate," he added.
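The distinction matters because 403/404 tell Googlebot the content is forbidden or gone, while 429 ("Too Many Requests"), the one 4xx code Gary's post exempts, actually signals "slow down." As an illustration, here is a minimal sketch of a sliding-window limiter that serves 429 instead of 403/404 when a crawler exceeds a request budget; the class name, thresholds, and API are hypothetical, not from Google's documentation.

```python
from collections import deque

class CrawlRateLimiter:
    """Hypothetical sketch: when a crawler exceeds max_requests within
    window_seconds, answer 429 (the 4xx Google exempts) rather than
    403/404, which would imply the content is forbidden or gone."""

    def __init__(self, max_requests: int, window_seconds: float):
        self.max_requests = max_requests
        self.window_seconds = window_seconds
        self.timestamps = deque()  # times of recently served requests

    def status_for_request(self, now: float) -> int:
        """Return the HTTP status to serve at time `now` (seconds)."""
        # Drop request timestamps that have aged out of the window.
        while self.timestamps and now - self.timestamps[0] > self.window_seconds:
            self.timestamps.popleft()
        if len(self.timestamps) >= self.max_requests:
            return 429  # ask the crawler to back off; content still exists
        self.timestamps.append(now)
        return 200
```

For example, with a budget of 2 requests per 60 seconds, the third request inside the window gets a 429, and a request after the window has elapsed is served normally again.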

Gary also posted on LinkedIn saying, "Friday rumble... ramble? One of those. Anyway: the 403 and 404 status codes will not help you quickly reduce crawl rate. If anything, they might have the opposite effect. We have documentation about how to reduce crawl rate and unsurprisingly 403/404 is not in them."

There are more details in the blog post.

Forum discussion at LinkedIn.
