.htaccess - Don't spider certain pages on multiple domains on the same hosting
I have a hosting account with 2 domains parked on it; the websites show different content depending on which domain is used.
Google spiders it and lists the 2 domains as different websites.
So I have these listed on Google:
www.blue.com/index.php
www.pink.com/index.php
Now let's say I have a page I only want on the blue domain: www.blue.com/test.php. Because the domains are parked on the same hosting, it still works at www.pink.com/test.php.
This means it gets spidered there too, which I don't want.
How can I stop this?
Is it possible to have multiple .htaccess rules depending on the domain? Or could robots.txt stop the spidering - and how does that work with multiple domains?
What is the best solution for me?
You can redirect each domain to its own domain-specific robots_(blue|pink).txt in .htaccess:
<IfModule mod_rewrite.c>
RewriteEngine On

# Internal redirect to robots_blue.txt
RewriteCond %{HTTP_HOST} =www.blue.com
RewriteRule ^robots\.txt$ /robots_blue.txt [L]

# Internal redirect to robots_pink.txt
RewriteCond %{HTTP_HOST} =www.pink.com
RewriteRule ^robots\.txt$ /robots_pink.txt [L]

# Internal redirect to index_blue.php
RewriteCond %{HTTP_HOST} =www.blue.com
RewriteRule ^index\.php$ /index_blue.php [L]
# or "... /index.php?site=blue"

# External permanent redirect of test.php to index.php if the host is not www.blue.com
RewriteCond %{HTTP_HOST} !=www.blue.com
RewriteRule ^test\.php$ /index.php [L,R=301]

# Internal redirect to index_pink.php
RewriteCond %{HTTP_HOST} =www.pink.com
RewriteRule ^index\.php$ /index_pink.php [L]
</IfModule>
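A robots.txt Disallow only stops crawling; a URL can still show up in search results if it is linked from elsewhere. As a hypothetical alternative (not part of the original answer), you could send an X-Robots-Tag noindex header for test.php on the pink host, assuming mod_setenvif and mod_headers are available:

# Sketch: mark test.php as noindex when requested via the pink host
<IfModule mod_setenvif.c>
SetEnvIf Host "^www\.pink\.com$" PINK_HOST
</IfModule>
<IfModule mod_headers.c>
<Files "test.php">
Header set X-Robots-Tag "noindex, nofollow" env=PINK_HOST
</Files>
</IfModule>

With the 301 above in place this is redundant for www.pink.com, but it shows how per-host behaviour can be expressed without separate robots files. Note that a noindex header is only seen if the page stays crawlable, so don't combine it with a Disallow for the same URL.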
robots_blue.txt - test.php must not be crawled on www.blue.com:
User-agent: *
Sitemap: http://www.blue.com/sitemap.xml
Disallow: /test.php
Disallow: ...
robots_pink.txt - crawling is fully allowed on www.pink.com:
User-agent: *
Sitemap: http://www.pink.com/sitemap.xml
Disallow:
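Crawlers request robots.txt separately for each host, so with the internal rewrites above, www.blue.com/robots.txt and www.pink.com/robots.txt each return their own file; you can verify this by fetching both URLs in a browser.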
If the Disallow rules for www.blue.com were identical to those for www.pink.com, you could use robots_blue.txt as the robots.txt for both domains. That would work here, since after the 301 there is no test.php in use on www.pink.com. But because a domain-specific sitemap.xml is referenced in robots.txt, the per-domain files above are the right solution.
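For the per-domain content itself, the rewrite rules above dispatch to index_blue.php / index_pink.php (or to index.php?site=blue). Here is a minimal sketch of the same dispatch done inside index.php instead - the file names are the hypothetical ones from the rules above, not confirmed parts of the site:

<?php
// Sketch: serve different content per parked domain by branching
// on the Host header instead of using internal rewrites.
$host = isset($_SERVER['HTTP_HOST']) ? strtolower($_SERVER['HTTP_HOST']) : '';

if ($host === 'www.blue.com') {
    require __DIR__ . '/index_blue.php';
} elseif ($host === 'www.pink.com') {
    require __DIR__ . '/index_pink.php';
} else {
    header('HTTP/1.0 404 Not Found');
    echo 'Unknown host';
}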