Prevent crawlers from searching subdomain

Hey all,

can someone enlighten me on how to prevent crawlers from indexing and searching my subdomain, which I use for testing purposes only? I guess I need to edit or create a .htaccess file for that, but I have no idea what to put in there.

Any help much appreciated. :slight_smile:

  1. If the dev site has not been indexed yet, you can block crawlers via robots.txt, see below.

  2. If the dev site is already indexed and you want it removed, add a meta NOINDEX tag to all pages and allow the site to be crawled via robots.txt. (Reason: you want Google to crawl the pages and notice the noindex tag so that it removes them from search results. If the site is indexed and you block crawlers via robots.txt, Google will keep the pages indexed but won’t crawl them again.) Once the site is deindexed, you can block it via robots.txt again.

(by: How to prevent development website subdomain from being indexed? | SEO Q&A | Moz)
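The NOINDEX tag mentioned in step 2 goes in the <head> of every page on the dev subdomain; a minimal example (the meta name and content values are the standard ones, but double-check how your page templates emit the head section):

```html
<!-- Place inside <head> on every page of the dev subdomain -->
<meta name="robots" content="noindex">
```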

To add robots.txt:

Add a robots.txt file in the root folder of your subdomain with the following text:
User-agent: *
Disallow: /
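Since the original question mentioned .htaccess: on Apache you can send an equivalent "don't index" signal as an HTTP header, which also covers non-HTML files like PDFs and images. A sketch, assuming your host runs Apache with mod_headers enabled:

```apache
# .htaccess in the subdomain's document root
# Ask crawlers not to index anything served from this subdomain
<IfModule mod_headers.c>
    Header set X-Robots-Tag "noindex, nofollow"
</IfModule>
```

Note the same caveat as with the meta tag applies: crawlers must be allowed to fetch the pages in order to see the header.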


Danke Jan :slight_smile:

I just created the subdomain and it’s still empty. Going to put a robots.txt file in there.

Cheers,
Darian


From what I’ve read, robots can still get in, because robots.txt is really just a courtesy. To keep it totally secure I use Joe Workman’s Page Safe stack. Make it a partial on every page. Works great, but does cost some bucks. Page Safe Stack for RapidWeaver
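If your host runs Apache, a free alternative for locking a dev site down completely is HTTP basic auth via .htaccess: crawlers can’t index pages they can’t fetch. A sketch, with placeholder paths and names you’d replace with your own (the password file is created with the `htpasswd` tool and should live outside the web root):

```apache
# .htaccess in the subdomain's document root
AuthType Basic
AuthName "Dev site"
# Full server path to a password file created with:
#   htpasswd -c /path/to/.htpasswd yourusername
AuthUserFile /path/to/.htpasswd
Require valid-user
```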
