robots.txt.liquid

The robots.txt.liquid template renders the robots.txt file, which is hosted at the /robots.txt URL.

The robots.txt file tells search engines which pages can or can't be crawled on a site. It contains groups of rules, and each group has three main components:

  • The user agent, which identifies the crawler that the group of rules applies to. For example, adsbot-google.
  • The rules themselves, which specify the URLs that crawlers can or can't access.
  • An optional sitemap URL.
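For reference, a rendered robots.txt group with all three components might look like the following. This is an illustrative sketch, not Shopify's actual default output; the paths and the example.com domain are hypothetical.

```text
# User agent: this group applies to Google's ads crawler
User-agent: adsbot-google
# Rules: block the hypothetical /checkout path, allow everything else
Disallow: /checkout
Allow: /

# Optional sitemap URL (hypothetical domain)
Sitemap: https://example.com/sitemap.xml
```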

Shopify generates a robots.txt file by default, which works for most shops, so the robots.txt.liquid template isn't included in themes by default. Add it only when you need to customize the generated output.
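When the template is added, Shopify's robots Liquid object exposes the default rule groups so they can be rendered or adjusted. As a minimal sketch, assuming the documented robots.default_groups collection and its user_agent, rules, and sitemap properties, a template that reproduces the default output could look like:

```liquid
{%- comment -%}
  Sketch of a robots.txt.liquid template, assuming Shopify's
  documented `robots` object. Each group renders its user agent,
  its allow/disallow rules, and an optional sitemap URL.
{%- endcomment -%}
{% for group in robots.default_groups %}
  {{- group.user_agent }}
  {%- for rule in group.rules %}
    {{ rule }}
  {%- endfor %}
  {%- if group.sitemap != blank %}
    {{ group.sitemap }}
  {%- endif %}
{% endfor %}
```

Starting from a pass-through template like this keeps the default rules intact, so custom rules can be added alongside them rather than replacing Shopify's defaults entirely.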
