I am working with Optimizely 11 and need to update the content of robots.txt for multiple sites programmatically. However, I haven't been able to find any documentation on how to achieve this.
Could someone provide guidance or a sample code snippet to help with this implementation?
https://support.optimizely.com/hc/en-us/articles/4413191642637-Edit-the-Robots-txt-file
There's no need to host robots.txt as a physical file in .NET. Instead, add an endpoint that serves a "virtual" robots file; you can then vary its contents in code, which is very practical when running multiple environments.
Here's how we do it in .NET 8 and CMS 12, but the same approach should be portable to your setup as well:
using Microsoft.AspNetCore.Mvc;

[Route("robots.txt")]
[ApiController]
public class RobotsController : ControllerBase
{
    private readonly IWebHostEnvironment _hostEnvironment;

    public RobotsController(IWebHostEnvironment hostEnvironment)
    {
        _hostEnvironment = hostEnvironment;
    }

    [HttpGet]
    public ContentResult GetRobotsFile()
    {
        var content = "User-agent: *\n";

        if (_hostEnvironment.EnvironmentName.StartsWith("Prod", StringComparison.OrdinalIgnoreCase))
        {
            // Add directives to your preference
            content += "Crawl-delay: 10\n";
            content += "Sitemap: https://www.yourdomain.org/sitemap.xml";
        }
        else
        {
            // Allow no crawling in public test environments
            content += "Disallow: /";
        }

        return Content(content, "text/plain");
    }
}
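Since the original question asks about multiple sites, the same controller can also vary its output by the request's host name. The sketch below is only one way to do it, and the host names and sitemap URLs in it are placeholders, not anything from Optimizely:

```csharp
[HttpGet]
public ContentResult GetRobotsFile()
{
    var content = "User-agent: *\n";

    if (!_hostEnvironment.EnvironmentName.StartsWith("Prod", StringComparison.OrdinalIgnoreCase))
    {
        // Block all crawling outside production
        content += "Disallow: /";
        return Content(content, "text/plain");
    }

    // Per-site rules keyed by the incoming host (hypothetical domains)
    var host = Request.Host.Host.ToLowerInvariant();
    content += host switch
    {
        "www.site-one.org" => "Sitemap: https://www.site-one.org/sitemap.xml",
        "www.site-two.org" => "Crawl-delay: 10\nSitemap: https://www.site-two.org/sitemap.xml",
        _ => "Disallow: /" // unknown hosts get no crawling
    };

    return Content(content, "text/plain");
}
```

In an Optimizely solution you could also resolve the current site through `ISiteDefinitionRepository` instead of matching raw host names, which keeps the mapping in CMS admin rather than in code.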