This package allows you to dynamically control the content of your website's robots.txt file based on an environment variable in your .NET project. This is useful for managing crawl behavior in different environments (e.g., development, staging, production).
Features:
- Generates robots.txt content based on a specified environment variable.
- Provides a test configuration for easy verification during development.
Installation:
- Install from NuGet:
  - Open your Umbraco project in Visual Studio.
  - Go to the NuGet Package Manager (Tools > NuGet Package Manager).
  - Search for "Dyfort.Umbraco.RobotsTxt" and install the package (version 1.0.0).
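Alternatively, the package can be installed from the command line with the standard dotnet CLI, using the package ID and version listed above:

dotnet add package Dyfort.Umbraco.RobotsTxt --version 1.0.0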
- Add robots.txt Files:
  - Create the following plain text files in the root directory of your Umbraco project (example content is sketched after this list):
    - robots.txt (default configuration)
    - robots.production.txt (production configuration)
    - robots.staging.txt (staging configuration)
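As an illustration of what these files might contain (standard robots.txt directives; the specific rules are assumptions you should adapt to your site), a typical setup allows crawling in production while blocking everything in staging:

robots.production.txt:

User-agent: *
Disallow: /umbraco/

robots.staging.txt:

User-agent: *
Disallow: /

Here /umbraco/ is the Umbraco back-office path, which you generally do not want indexed; the staging file blocks all crawlers so a pre-release site never appears in search results.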
Configuration:
- Set the Environment Variable:
  - Define an environment variable in your hosting configuration (e.g., web.config for an IIS-hosted site, or launchSettings.json for local development in ASP.NET Core).
  - The variable's value determines which robots.<environment>.txt file is served.
- Example (web.config):
<aspNetCore processPath=".\XXX.exe" stdoutLogEnabled="false" stdoutLogFile=".\logs\stdout" hostingModel="inprocess">
  <environmentVariables>
    <environmentVariable name="ASPNETCORE_ENVIRONMENT" value="Staging" />
  </environmentVariables>
</aspNetCore>
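For local development, the same variable is usually set in Properties/launchSettings.json rather than web.config; a minimal sketch (the profile name "MySite" is illustrative):

{
  "profiles": {
    "MySite": {
      "commandName": "Project",
      "environmentVariables": {
        "ASPNETCORE_ENVIRONMENT": "Staging"
      }
    }
  }
}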
Usage:
- By default, accessing https://localhost:<PORT>/robots.txt returns the content of robots.txt.
- If the environment variable is set to a value that has a matching file (e.g., "Staging" with robots.staging.txt present), the content of robots.staging.txt is returned instead.
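To make the behavior concrete, the lookup can be pictured as the following minimal ASP.NET Core sketch. This is an illustration of the selection logic only, not the package's actual implementation; the route handler and file-naming convention are assumptions based on the behavior described above.

// Program.cs -- illustrative sketch, not the package's actual code.
var builder = WebApplication.CreateBuilder(args);
var app = builder.Build();

app.MapGet("/robots.txt", async (IWebHostEnvironment env) =>
{
    // env.EnvironmentName reflects ASPNETCORE_ENVIRONMENT, e.g. "Staging".
    var candidate = Path.Combine(env.ContentRootPath,
        $"robots.{env.EnvironmentName.ToLowerInvariant()}.txt");
    var fallback = Path.Combine(env.ContentRootPath, "robots.txt");

    // Serve the environment-specific file when it exists; otherwise fall back.
    var path = File.Exists(candidate) ? candidate : fallback;
    return Results.Text(await File.ReadAllTextAsync(path), "text/plain");
});

app.Run();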