Articles

Google To Drop Support For crawl-delay, nofollow, and noindex in robots.txt

by Pankaj Sharma Digital Marketing Professional

Google announced this morning that it will stop unofficially supporting the noindex, nofollow, and crawl-delay directives in robots.txt files. Google has advised against relying on these directives for years and recently hinted that this change was coming; now it is official.

Google wrote "While open-sourcing our parser library, we analyzed the usage of robots.txt rules. In particular, we focused on rules unsupported by the internet draft, such as crawl-delay, nofollow, and noindex. Since these rules were never documented by Google, naturally, their usage in relation to Googlebot is very low. Digging further, we saw their usage was contradicted by other rules in all but 0.001% of all robots.txt files on the internet. These mistakes hurt websites' presence in Google's search results in ways we don’t think webmasters intended."

In short, if your robots.txt file contains crawl-delay, nofollow, or noindex rules, Google will stop honoring them on September 1, 2019. Google currently honors some of these implementations, even though they are "unsupported and unpublished rules," but that ends on that date.
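As an illustration, here is what an affected robots.txt file might look like (the paths and domain are hypothetical). The commented alternatives reflect Google's documented, supported mechanisms: the robots meta tag or X-Robots-Tag HTTP header for indexing control, and Disallow for crawl control.

```text
# Hypothetical robots.txt using the unsupported rules that Google
# will stop honoring on September 1, 2019:
User-agent: *
Crawl-delay: 10      # unsupported: adjust crawl rate in Search Console instead
Noindex: /private/   # unsupported: use a robots meta tag or X-Robots-Tag header
Nofollow: /temp/     # unsupported: use rel="nofollow" or a robots meta tag

# Supported way to block crawling of a path:
User-agent: *
Disallow: /private/
```

To keep a page out of Google's index, the supported approach is `<meta name="robots" content="noindex">` in the page's HTML, or an `X-Robots-Tag: noindex` HTTP response header; note these only work if the page is crawlable (not blocked by Disallow).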

Google may send notifications via Google Search Console if you are using these unsupported rules in your robots.txt file.