Tag: robots

  • Robots in the Wiki

    We made some minor adjustments to our MediaWiki installation to keep robots such as Googlebot from indexing irrelevant pages like article edit pages and history pages. Essentially, we prepended a “/w/” to all non-article URLs and then used mod_rewrite to strip the /w/ so those pages still resolve normally. The robots.txt file then prohibits…
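
    A minimal sketch of the rewrite half of this setup, assuming Apache with mod_rewrite enabled (the exact rule below is illustrative, not the wiki's actual configuration):

    ```apache
    # Apache config (or .htaccess): strip the leading /w/ that is
    # prepended to non-article URLs, so the underlying MediaWiki
    # pages still resolve at their original paths.
    RewriteEngine On
    RewriteRule ^w/(.*)$ /$1 [L]
    ```

    With every non-article URL sharing the /w/ prefix, robots.txt only needs a single `Disallow: /w/` line under `User-agent: *` to keep crawlers out of edit, history, and other non-article pages.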