Google's Gary Illyes confirmed that he is slowly working on improving the robots.txt testing tool that is currently available in the old version of Google Search Console. He said on Twitter: "We're in the process of (*) integrating the tester into the production version of the parser."

Gary said this in response to a complaint that there was a bug either in the tester tool or in how Google reads a site's robots.txt file. Gary said it may just be the tool itself, and once the tool is updated to use the production version of the parser, you will know for sure.

As a reminder, Gary was involved in the effort to make the Robots Exclusion Protocol an official standard and in open sourcing Google's robots.txt parser code.

Here is Gary's tweet:

We'll see, but it's likely a bug in the tool. We're in the process of (*) integrating the tester into the production version of the parser, which would fix this in case it really is a bug in the tool.

* – it's my procrastination project, so don't hold your breath

— Gary Illyes (@methode) January 6, 2021

You can find out more about the complaint by scrolling through the Twitter thread.

Google revamped its robots.txt testing tool in 2014, and the tool has been updated several times both before and since then.

Forum discussion on Twitter.
