From bcca8fdf1f042045be46068ed6398ea321677e4d Mon Sep 17 00:00:00 2001
From: "Kyle J. McKay"
Date: Sun, 26 Jan 2014 00:02:08 -0800
Subject: [PATCH] robots.txt: add basic exclusions

Crawlers should not be visiting the blame, graphiclog or snapshot
links.  Add specific exclusions to make it so.

Unfortunately the blame and snapshot links require wildcard support
to match, which some robots may not support.
---
 robots.txt | 7 +++++++
 1 file changed, 7 insertions(+)

diff --git a/robots.txt b/robots.txt
index e69de29..874fadc 100644
--- a/robots.txt
+++ b/robots.txt
@@ -0,0 +1,7 @@
+User-agent: *
+Disallow: /git-browser/
+Disallow: /w/*/snapshot/
+Disallow: /w/*/blame/
+Disallow: /w/*/blame_incremental/
+Disallow: /w/*/blame_data/
+Allow: /
-- 
2.11.4.GIT