Robots.txt in Umbraco

The robots.txt file contains instructions for web crawlers visiting your website. It tells search engines which areas of your site they should (or should not) crawl and, for instance, where your sitemap is located. This makes robots.txt an important SEO tool.
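For illustration, a minimal robots.txt might look like the sketch below. The paths and sitemap URL are placeholders, not values from an actual Umbraco site:

```
User-agent: *
Disallow: /secret-page/
Sitemap: https://example.com/sitemap.xml
```

Here the `Disallow` line asks all crawlers to skip one page, while the `Sitemap` line points them at the sitemap.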

To make it a user-friendly task for web editors to allow or block specific pages with robots.txt, we have simply added a toggle in the SEO tab on every content page in the backend.
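To see the effect of blocking a page this way, you can check how a well-behaved crawler would interpret the resulting robots.txt using Python's standard-library `urllib.robotparser`. This is a minimal sketch; the path `/secret-page/` and the sitemap URL are hypothetical stand-ins for a page an editor has toggled off:

```python
from urllib import robotparser

# Hypothetical robots.txt content; "/secret-page/" stands in for a page
# an editor has blocked via the SEO tab (all names are illustrative).
lines = [
    "User-agent: *",
    "Disallow: /secret-page/",
    "Sitemap: https://example.com/sitemap.xml",
]

rp = robotparser.RobotFileParser()
rp.parse(lines)

print(rp.can_fetch("*", "/secret-page/"))  # False: crawlers should skip it
print(rp.can_fetch("*", "/blog/"))         # True: other pages remain crawlable
```

`can_fetch` answers the same question a crawler asks before requesting a URL, which makes it a handy way to verify your rules.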

If you want to learn more about robots.txt, you can visit the Web Robots Pages. You can also see the robots.txt file for our Umbraco demo site here.

Other SEO tools include XML Sitemap and Basic SEO setup.