Google Says Don’t Update Your Robots.txt File Multiple Times Per Day – Search Engine Roundtable
Google’s John Mueller said that since Google caches the robots.txt file for about 24 hours, it does not make much sense to dynamically update your robots.txt file throughout the day to control crawling.
Google won’t necessarily see that you don’t want it to crawl a page at 7am but do want that page crawled at 9am; it may simply keep obeying whichever version of the file it cached.
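To make that concrete, here is a minimal Python sketch of the caching behavior. It is hypothetical, not Google's actual crawler code: the class and function names are invented, and the only assumption taken from the article is that robots.txt may be cached for up to 24 hours.

    # A minimal sketch (hypothetical, not Google's crawler) of why a
    # twice-daily robots.txt toggle fails: the fetched file is cached for
    # up to 24 hours and reused for every check inside that window.

    CACHE_TTL = 24 * 60 * 60  # seconds; "up to 24 hours" per Google's docs

    class RobotsCache:
        """Caches a fetched robots.txt; re-fetches only after the TTL expires."""

        def __init__(self):
            self.rules = None
            self.fetched_at = None

        def get(self, fetch, now):
            if self.rules is None or now - self.fetched_at >= CACHE_TTL:
                self.rules = fetch()
                self.fetched_at = now
            return self.rules

    def robots_served_at(hour):
        """The site's toggle: block Googlebot before noon, allow it after."""
        if hour < 12:
            return "User-agent: Googlebot\nDisallow: /"  # morning: block
        return "User-agent: Googlebot\nDisallow:"        # afternoon: allow

    cache = RobotsCache()
    for hour in (7, 9, 15):
        rules = cache.get(lambda h=hour: robots_served_at(h), now=hour * 3600)
        print(f"{hour:02d}:00 -> crawler obeys: {rules.splitlines()[-1]!r}")

    # All three checks print 'Disallow: /': the 07:00 copy is cached, so
    # the afternoon "allow" file is never seen within the 24-hour window.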
John Mueller wrote on Bluesky in response to this post:
QUESTION:
One of our technicians asked if they could upload a robots.txt file in the morning to block Googlebot and another one in the afternoon to allow it to crawl, as the website is extensive and they thought it might overload the server. Do you think this would be a good practice?
(Obviously, the crawl rate of Googlebot adapts to how well the server responds, but I found it an interesting question to ask you) Thanks!
ANSWER:
It’s a bad idea because robots.txt can be cached up to 24 hours ( developers.google.com/search/docs/… ). We don’t recommend dynamically changing your…
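For concreteness, the morning/afternoon toggle the technician described would amount to serving two different files like these (hypothetical contents; the original post does not include the actual files):

    # Morning file: block Googlebot entirely
    User-agent: Googlebot
    Disallow: /

    # Afternoon file: allow Googlebot to crawl everything
    User-agent: Googlebot
    Disallow:

Because of the cache, whichever file Googlebot happens to fetch can govern crawling for up to a full day, so the toggle does not deliver the intended half-day windows.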
source: https://news.oneseocompany.com/2025/01/20/google-says-dont-update-your-robotstxt-file-multiple-times-per-day-search-engine-roundtable_2025012059895.html