Update Your Robots.txt Policy As Per New Rules

In a recent update, Google has clarified that unsupported fields in your robots.txt file will simply be ignored by its crawlers.

Any field that is not listed in Google’s robots.txt documentation will no longer be recognized.

So, what are these fields, and what is the purpose behind this move?

Basically, Google wants website owners and managers to keep their robots.txt files limited to the supported fields and directives listed in its documentation. It recommends running an audit and removing any unsupported directives, because Google’s crawlers will ignore any custom or third-party field or directive added to the file.
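
If you want to run that audit yourself, one simple approach is to flag any field name that is not on Google’s supported list. Here is a minimal sketch in Python; the file path is a placeholder for wherever you keep a local copy of your robots.txt:

    # Fields Google documents as supported in robots.txt
    SUPPORTED_FIELDS = {"user-agent", "allow", "disallow", "sitemap"}

    def audit_robots_txt(path="robots.txt"):
        """Print any field names Google no longer recognizes."""
        with open(path, encoding="utf-8") as f:
            for line_number, line in enumerate(f, start=1):
                # Drop comments and surrounding whitespace
                line = line.split("#", 1)[0].strip()
                if not line or ":" not in line:
                    continue
                field = line.split(":", 1)[0].strip().lower()
                if field not in SUPPORTED_FIELDS:
                    print(f"Line {line_number}: unsupported field '{field}'")

    if __name__ == "__main__":
        audit_robots_txt()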

Supported Fields

As per Google’s documentation, only these fields are supported in robots.txt files:

user-agent
allow
disallow
sitemap

If you have been using any other directive in your file, such as crawl-delay or noarchive, it will not be recognized by Google’s crawlers.
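
For reference, a robots.txt file that sticks to the supported fields might look like the following; the domain and paths are placeholders:

    User-agent: *
    Disallow: /private/
    Allow: /private/public-page.html
    Sitemap: https://www.example.com/sitemap.xml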

Disha

SEO Expert and Content Writer
