The robots.txt.liquid template renders the robots.txt file, which is hosted at the /robots.txt URL.

The robots.txt file tells search engines which pages can, or can't, be crawled on a site. The file contains groups of rules, and each group has three main components:

  • The user agent, which notes which crawler the group of rules applies to. For example, adsbot-google.
  • The rules themselves, which note specific URLs that crawlers can, or can't, access.
  • An optional sitemap URL.
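Taken together, a group in the rendered file looks like the following sketch. The specific paths and sitemap URL here are illustrative, not Shopify's actual defaults:

```text
User-agent: adsbot-google
Disallow: /checkout
Sitemap: https://example.com/sitemap.xml
```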

Shopify generates a robots.txt file by default, which works for most stores, so this template isn't included in themes by default. You only need to add it when you want to customize the file.


The robots.txt.liquid template is located in the templates directory of the theme.
This template can't be a JSON template. It must be robots.txt.liquid.

The rules included in the default robots.txt file are mirrored through the Liquid robots object, which the robots.txt.liquid template uses to output the rules.

For example:
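The following sketch mirrors the default template by looping over the robots object's default_groups and outputting each group's user agent, rules, and sitemap (these property names come from Shopify's documented robots Liquid object):

```liquid
{%- comment -%} Output each default rule group via the robots object {%- endcomment -%}
{% for group in robots.default_groups %}
  {{- group.user_agent }}

  {%- for rule in group.rules %}
    {{ rule }}
  {%- endfor %}

  {%- if group.sitemap != blank %}
    {{ group.sitemap }}
  {%- endif %}
{% endfor %}
```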

While you can replace all of the template content with plain text rules, we strongly recommend using the provided Liquid objects whenever possible. Shopify updates the default rules regularly to ensure that SEO best practices are always applied.
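For instance, a custom rule can be added alongside the default output rather than replacing it. This sketch assumes the documented user_agent.value property on the robots object, and the Disallow path shown is hypothetical:

```liquid
{% for group in robots.default_groups %}
  {{- group.user_agent }}

  {%- for rule in group.rules %}
    {{ rule }}
  {%- endfor %}

  {%- comment -%} Add a custom rule to the group for all crawlers {%- endcomment -%}
  {%- if group.user_agent.value == '*' %}
    {{ 'Disallow: /internal-search/' }}
  {%- endif %}

  {%- if group.sitemap != blank %}
    {{ group.sitemap }}
  {%- endif %}
{% endfor %}
```

Because the default groups are still rendered by the loop, any rules Shopify adds or updates later are picked up automatically.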


To learn more about customizing this template, refer to Customize robots.txt.
