How to Create Dynamic Robots.txt File for Subdomains using .htaccess Redirect

What is a robots.txt file? Most websites have one of these very simple files, called "robots.txt", in the root directory of their server. The robots.txt file has been around for almost two decades, and it is now a standardized way of telling search engine bots (or crawlers) which pages they should and should not visit.
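For reference, a minimal robots.txt looks like this (the disallowed path is only an illustrative example, not from any real site):

```
User-agent: *
Disallow: /admin/
```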


What if your robots.txt file could adjust itself automatically, serving different rules depending on whether a request hits the test (sub-domain) site or the live site?

Below is the .htaccess code for serving a different robots.txt on a sub-domain while keeping the normal one on your main domain:

RewriteEngine on
RewriteCond %{HTTP_HOST} ^subdomain\.website\.com$ [NC]
RewriteRule ^robots\.txt$ robots-subdomain.txt [L]
Then create a file named robots-subdomain.txt in the same directory, containing:

User-agent: *
Disallow: /

Save both files. Requests for robots.txt on the sub-domain are now rewritten to robots-subdomain.txt, which blocks all crawlers, while the main domain's robots.txt is served unchanged.
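To see the idea behind the rewrite rule, the same host-based selection can be sketched in plain Python; the hostnames and file contents below are assumptions taken from the example above, not part of any real deployment:

```python
# Sketch of host-based robots.txt selection, mirroring the .htaccess
# rewrite above. Hostnames and bodies are illustrative assumptions.

ROBOTS_MAIN = "User-agent: *\nDisallow: /admin/\n"   # live site: mostly open
ROBOTS_SUBDOMAIN = "User-agent: *\nDisallow: /\n"    # test site: block everything

def robots_for_host(host: str) -> str:
    """Return the robots.txt body to serve for a given Host header."""
    # Host headers are case-insensitive and may include a port, so normalize,
    # just as the [NC] flag makes the RewriteCond case-insensitive.
    hostname = host.split(":")[0].lower()
    if hostname == "subdomain.website.com":
        return ROBOTS_SUBDOMAIN
    return ROBOTS_MAIN

print(robots_for_host("subdomain.website.com"))
print(robots_for_host("website.com"))
```

The design point is the same as in the .htaccess version: one URL path, /robots.txt, with the response chosen by the requesting hostname, so search engines never index the test site by accident.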

