robots.txt with Azure deployments
Does anyone have an idea how to add a robots.txt file to an Azure deployment of Sitefinity for different environments? For DEV and TEST I don't want search engines crawling my sites, so I want a robots.txt file like this:
User-agent: *
Disallow: /
However, for production I would want them to crawl it.
Other than adding the file to the web project, I'm not sure how to get it up to Azure. Adding it to the package is fine, but how do you vary the file based on the deployment environment?
Thanks
Hi,
From the Sitefinity backend, you can disable crawlers for a page from Title and Properties - "Allow external search engines to index this page and include in Sitemap".
Disabling that option adds the following to the page head:
<meta name="robots" content="noindex" />
If you want to turn this off for all frontend pages at once, you can set the Crawlable property through the API, for example:

// Get the page manager and all frontend page nodes for the current site
var pageMan = PageManager.GetManager();
var pages = pageMan.GetPageNodes()
    .Where(p => p.RootNodeId == SiteInitializer.CurrentFrontendRootNodeId);

// Mark every page as non-crawlable (this renders the "noindex" meta tag)
foreach (var page in pages)
{
    page.Crawlable = false;
}

pageMan.SaveChanges();
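For the environment-specific robots.txt itself, one option (a minimal sketch, not Sitefinity-specific; the "Environment" appSetting and the RobotsHandler class name are assumptions, not built-in names) is to serve robots.txt dynamically from an HTTP handler and key the Disallow rules off a configuration value. The same package then works for DEV, TEST, and production, and in Azure the setting can differ per environment without repackaging:

// Minimal sketch: serve robots.txt content based on an appSetting.
// The "Environment" setting name is an assumption; use whatever key
// your deployment configuration already defines.
using System;
using System.Configuration;
using System.Web;

public class RobotsHandler : IHttpHandler
{
    public bool IsReusable { get { return true; } }

    public void ProcessRequest(HttpContext context)
    {
        context.Response.ContentType = "text/plain";

        // Read the current environment from configuration.
        var environment = ConfigurationManager.AppSettings["Environment"];

        if (string.Equals(environment, "Production", StringComparison.OrdinalIgnoreCase))
        {
            // Production: allow crawling.
            context.Response.Write("User-agent: *\nDisallow:\n");
        }
        else
        {
            // DEV/TEST: block all crawlers.
            context.Response.Write("User-agent: *\nDisallow: /\n");
        }
    }
}

You would then map the robots.txt path to this handler in web.config (under system.webServer/handlers), so the file never has to exist on disk and nothing changes in the package between environments.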