Google Will Ignore Robots.txt Rules If It Serves A 4xx Status Code

Here is another PSA from Gary Illyes of Google. In short, if you serve a 4xx status code with your robots.txt file, then Google will ignore the rules you have defined in that file.

Why? 4xx status codes mean the file is not available, so Google won't check it, since the web server says it cannot be served. Gary posted this because he received a complaint or two about Google not respecting robots.txt rules.

Gary wrote on LinkedIn, "PSA from my inbox: if you serve your robotstxt with a 403 HTTP status code, all rules in the file will be ignored by Googlebot. Client errors (4xx, except 429) mean unavailable robotstxt, as in, a 404 and a 403 are equivalent in this case."

In short, make sure your robots.txt file serves a 200 status code and that Google can access it.

Forum discussion at LinkedIn.
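If you want to verify what your server actually returns for robots.txt, checking the HTTP status code is enough. Below is a minimal sketch using Python's standard library; the URL is a placeholder you would swap for your own site, and this is only an illustrative check, not anything Google provides.

# Minimal sketch: check the HTTP status code your robots.txt returns.
# The domain below is a placeholder; replace it with your own site.
import urllib.request
import urllib.error

ROBOTS_URL = "https://example.com/robots.txt"  # placeholder URL

try:
    with urllib.request.urlopen(ROBOTS_URL) as response:
        # 200 means the file is reachable and its rules can be obeyed.
        print(f"{ROBOTS_URL} -> {response.status}")
except urllib.error.HTTPError as err:
    # Per Gary's note, any 4xx (except 429) means an unavailable robots.txt,
    # so Googlebot would ignore every rule in the file.
    print(f"{ROBOTS_URL} -> {err.code} (rules would be ignored)")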
