robots.txt with azure deployments

Posted by Community Admin on 05-Aug-2018 18:01

All Replies

Posted by Community Admin on 20-May-2015 00:00

Does anyone have an idea how to add a robots.txt file to an Azure deployment of Sitefinity for different environments? For DEV and TEST I don't want search engines crawling my sites, so I want a robots.txt file like this:

User-agent: *
Disallow: /

However, for production I would want them to crawl it.

I'm not sure how to get the file up to Azure other than adding it to the web project. Adding it to the package is fine, but how do you add a different file per deployment?
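One way to handle this at packaging time (a sketch only, not Sitefinity-specific; the file names robots.Debug.txt and robots.Release.txt are my own convention) is to keep one robots file per build configuration in the web project and let an MSBuild target copy the right one into the Web Deploy package as robots.txt:

```xml
<!-- In the web project's .csproj. Assumes robots.Debug.txt and
     robots.Release.txt exist in the project root (hypothetical names). -->
<Target Name="CopyEnvironmentRobots"
        BeforeTargets="CopyAllFilesToSingleFolderForPackage">
  <!-- Pick the robots file matching the build configuration -->
  <Copy SourceFiles="robots.$(Configuration).txt"
        DestinationFiles="robots.txt" />
  <!-- Make sure the copied file ends up in the deployment package -->
  <ItemGroup>
    <FilesForPackagingFromProject Include="robots.txt">
      <DestinationRelativePath>robots.txt</DestinationRelativePath>
    </FilesForPackagingFromProject>
  </ItemGroup>
</Target>
```

Building the package with a Release configuration would then ship the permissive robots.txt, while Debug/Test builds ship the Disallow-everything version.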

Thanks

Posted by Community Admin on 25-May-2015 00:00

Hi,

For Sitefinity pages, you can disable crawlers per page from Title and Properties via the option "Allow external search engines to index this page and include in Sitemap".

This option adds the following in the head:

<meta name="robots" content="noindex" />

To easily turn this on and off, you can create a widget and do it programmatically. For example:

var pageMan = PageManager.GetManager();
var pages = pageMan.GetPageNodes()
    .Where(p => p.RootNodeId == SiteInitializer.CurrentFrontendRootNodeId);

foreach (var page in pages)
{
    page.Crawlable = false;
}

pageMan.SaveChanges();

That way you can easily disable crawling on test before or after deploying. After you propagate to production, you can re-enable it with the click of a button.
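If you want a single backend action for both directions, the snippet above can be wrapped in a small helper (a sketch only; CrawlerToggle and SetPagesCrawlable are my own names, while the Sitefinity calls are the same ones used above):

```csharp
using System.Linq;
using Telerik.Sitefinity.Abstractions;
using Telerik.Sitefinity.Modules.Pages;

public static class CrawlerToggle
{
    // Hypothetical helper: sets Crawlable on every frontend page
    // of the current site, in one transaction.
    public static void SetPagesCrawlable(bool crawlable)
    {
        var pageMan = PageManager.GetManager();
        var pages = pageMan.GetPageNodes()
            .Where(p => p.RootNodeId == SiteInitializer.CurrentFrontendRootNodeId);

        foreach (var page in pages)
        {
            page.Crawlable = crawlable;
        }

        pageMan.SaveChanges();
    }
}
```

A "disable crawling" button on test would call CrawlerToggle.SetPagesCrawlable(false), and the same helper with true restores indexing on production.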

Regards,
Atanas Valchev
Telerik
 
Do you want to have your say in the Sitefinity development roadmap? Do you want to know when a feature you requested is added or a bug you reported is fixed? Explore the Telerik Sitefinity CMS Ideas & Feedback Portal and vote to affect the priority of the items.
 

This thread is closed