The robots.txt.liquid template renders the robots.txt file, which is hosted at the /robots.txt URL. The robots.txt file tells search engines which pages can, or can't, be crawled on a site. It contains groups of rules, and each group has three main components:
- The user agent, which notes which crawler the group of rules applies to. For example, User-agent: * applies to all crawlers.
- The rules themselves, which note specific URLs that crawlers can, or can't, access.
- An optional sitemap URL.
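Put together, a single group with these three components might look like the following. The specific paths and domain here are illustrative, not Shopify defaults:

```text
# Applies to all crawlers
User-agent: *
# Rules: block the checkout path, allow everything else
Disallow: /checkout
Allow: /

# Optional sitemap URL
Sitemap: https://example.com/sitemap.xml
```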
Shopify generates a robots.txt file by default, which works for most shops, so the robots.txt.liquid template isn't included in themes by default. You only need to add it when you want to customize the rules.
The rules included in the default robots.txt file are mirrored through the Liquid robots object, which the robots.txt.liquid template uses to output the rules.
While you can replace all of the template content with plain-text rules, it's strongly recommended to use the provided Liquid objects whenever possible. Shopify updates the default rules regularly to ensure that SEO best practices are always applied, and templates that rely on the Liquid objects pick up those updates automatically.
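As a sketch of that recommendation, the template below iterates over robots.default_groups to output Shopify's default rules, then appends one hand-written rule to the group that applies to all crawlers. The Disallow path is a hypothetical customization, not part of the defaults:

```liquid
{%- comment -%} Output each default rule group via the robots object {%- endcomment -%}
{%- for group in robots.default_groups -%}
  {{ group.user_agent }}
  {%- for rule in group.rules %}
  {{ rule }}
  {%- endfor %}
  {%- comment -%} Hypothetical custom rule, added only to the catch-all group {%- endcomment -%}
  {%- if group.user_agent.value == '*' %}
  Disallow: /internal-search
  {%- endif %}
  {%- if group.sitemap != blank %}
  {{ group.sitemap }}
  {%- endif %}
{% endfor -%}
```

Because the defaults still come from the robots object rather than being copied in as plain text, any rules Shopify adds or changes later flow through without further template edits.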