I am using Aegir / Barracuda / Nginx to maintain a multisite setup. My "files" directory is symlinked into a mounted "files" directory, so when I clone a site for dev purposes, the clone uses the same "files" directory. The problem with the current practice of using sites/<domain>/files as the location of robots.txt is that I cannot put custom rules into my new development clone to keep crawlers from indexing it, and so I get penalized for duplicate content. What is an alternative option for me?
My files directory is symlinked because it contains lots of media files, and I don't want to recreate the complete files directory every time I clone a site.
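For context, the layout looks roughly like this (the paths are illustrative, not my exact ones): both the production site and the dev clone end up pointing at the same physical files directory, so a robots.txt dropped in there is shared by both:

    # production site: files is a symlink into the shared, mounted volume
    /var/aegir/platforms/drupal/sites/example.com/files -> /mnt/files/example.com

    # dev clone created by Aegir: files points at the very same target
    /var/aegir/platforms/drupal/sites/dev.example.com/files -> /mnt/files/example.com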
After giving it some thought: you don't have to let Drupal/Aegir handle the request at all. robots.txt is a plain text file, so there is no need to bootstrap Drupal/Aegir for it. Let Nginx handle the request directly:
    server {
        server_name server2;
        root /var/server2;

        # Tell Nginx that a request for robots.txt in the files directory
        # should be answered with the robots.txt in our document root.
        location = /files/robots.txt {
            alias $document_root/robots.txt;
        }

        # Serve robots.txt straight from disk, no need to bootstrap Drupal/Aegir.
        location = /robots.txt {
            try_files $uri =404;
        }

        location / {
            # your normal stuff
        }
    }
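With that in place, you can drop a robots.txt into the document root of the dev vhost (in the example above that would be /var/server2/robots.txt; adjust to your own paths) that blocks crawlers entirely, while the production vhost keeps serving its own copy:

    # /var/server2/robots.txt on the dev clone only
    User-agent: *
    Disallow: /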