The robots.txt file is where search engine bots and other crawlers (including AI/LLM ones) look up the rules about what they can, or more importantly cannot, crawl on your domain.
Every website should include this file: if it is missing, search engines and AI crawlers assume they may crawl the whole site without restrictions.
Note: robots.txt is a standard protocol; however, not all crawlers support every directive. Some follow the rules only partially, and some intentionally ignore them.
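To see how a compliant crawler interprets these rules, here is a minimal sketch using Python's standard `urllib.robotparser`. The domain and paths are placeholders, and the rules are purely illustrative:

```python
from urllib import robotparser

# Hypothetical robots.txt rules; "yourdomain.com" and the paths are placeholders.
rules = [
    "User-agent: *",
    "Disallow: /private/",
    "Allow: /",
]

rp = robotparser.RobotFileParser()
rp.parse(rules)

# A well-behaved crawler checks each URL against the rules before fetching it.
print(rp.can_fetch("MyBot", "https://yourdomain.com/"))           # root is allowed
print(rp.can_fetch("MyBot", "https://yourdomain.com/private/x"))  # /private/ is disallowed
```

Compliance is voluntary: this check happens inside the crawler, which is why some bots can simply skip it.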
Short.io lets you create a custom robots.txt or, if you already host one on your own server, configure a redirect to it.
To configure the robots.txt file
Sign in to your Short.io account.
Navigate to Domain Settings -> Robots policy:
For the given domain you can:
configure a redirect by entering the full URL of the robots.txt file at your domain’s root (for example, https://yourdomain.com/robots.txt).
If the Immediate redirect option is enabled, matching short links are ignored and the request goes straight to the file:
or
enter the file's content directly into the text area on the Custom robots.txt tab (see this Google guide for more information on the syntax and structure of the rules):
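As an illustration, a minimal robots.txt you might paste into the Custom robots.txt tab could look like this (the user agents and paths are placeholders; adapt them to your site):

```
# Block all crawlers from a hypothetical private area
User-agent: *
Disallow: /private/

# Block a specific crawler entirely
User-agent: GPTBot
Disallow: /
```

Rules are grouped by User-agent line, and each Disallow (or Allow) line applies to the group above it.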
When you finish editing, click Create file.
You can now open the full robots.txt URL to review it.