Webmasters can now ask Google to recrawl and process updated robots.txt files.
Previously, if webmasters changed their robots.txt file, they had to wait until the search engine next crawled the website before the updates took effect.
Now, webmasters can get their robots.txt files updated faster by submitting them to the search engine for processing.
To do this, webmasters need to go to Search Console, open the Crawl section, and choose the robots.txt Tester option.
They can then edit and download the updated code, upload it to the site's root directory, check that the live version matches, and ask Google to update the robots.txt file.
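Before submitting an updated file, it is worth confirming that the new directives actually block or allow the intended URLs. As a minimal sketch (the rules and URLs below are hypothetical, not from the article), Python's standard-library `urllib.robotparser` can check a draft robots.txt against sample paths:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical updated robots.txt content -- illustrative only.
robots_txt = """\
User-agent: *
Disallow: /private/
Allow: /
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Confirm the new rules behave as intended before submitting.
print(parser.can_fetch("*", "https://example.com/private/page.html"))  # False
print(parser.can_fetch("*", "https://example.com/index.html"))         # True
```

A quick check like this catches mistakes (such as an overly broad Disallow rule) before Google processes the new file.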