Customize robots.txt
The robots.txt file tells search engines which pages can, or can't, be crawled on a site. It contains groups of rules, and each group has three main components:
- The user agent, which notes which crawler the group of rules applies to. For example, adsbot-google.
- The rules themselves, which note specific URLs that crawlers can, or can't, access.
- An optional sitemap URL.
Shopify generates a default robots.txt file that works for most stores. However, you can add the robots.txt.liquid template to make customizations.
In this tutorial, you'll learn how you can customize the robots.txt.liquid template.
Requirements
Add the robots.txt.liquid template with the following steps:
- In the code editor for the theme you want to edit, open the Templates folder.
- Click Add a new template.
- Select robots.txt under the Create a new template for drop-down menu.
- Click Create template.
The robots.txt.liquid template supports only the following Liquid objects:
- robots
- group
- rule
- user_agent
- sitemap
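These objects work together to output the default rules. As a rough sketch (the exact default markup in your template may differ), the default rules can be rendered by looping over robots.default_groups and printing each group's user agent, rules, and sitemap:

```liquid
{% comment %}
  Sketch of how the default rules are rendered: loop over each default
  group and output its user agent line, its rules, and its sitemap URL.
{% endcomment %}
{% for group in robots.default_groups %}
  {{- group.user_agent }}

  {%- for rule in group.rules -%}
    {{ rule }}
  {%- endfor -%}

  {%- if group.sitemap != blank -%}
    {{ group.sitemap }}
  {%- endif -%}
{% endfor %}
```

The customizations in the following sections build on this loop.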
Customize robots.txt.liquid
You can make the following customizations:
- Add a new rule to an existing group
- Remove a default rule from an existing group
- Add custom rules
While you can replace all of the template content with plain text rules, it's strongly recommended to use the provided Liquid objects whenever possible. The default rules are updated regularly to ensure that SEO best practices are always applied.
Add a new rule to an existing group
If you want to add a new rule to an existing group, then you can adjust the Liquid for outputting the default rules to check for the associated group and include your rule.
For example, you can use the following approach to block all crawlers from accessing pages with the URL parameter ?q=:
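A sketch of what this could look like, assuming the default rules are output by looping over robots.default_groups and that the catch-all group uses the * user agent (the exact default markup in your template may differ):

```liquid
{% comment %}
  Output the default rules, and append a Disallow rule for ?q= pages
  to the group that applies to all crawlers (user agent *).
{% endcomment %}
{% for group in robots.default_groups %}
  {{- group.user_agent }}

  {%- for rule in group.rules -%}
    {{ rule }}
  {%- endfor -%}

  {%- if group.user_agent.value == '*' -%}
    {{ 'Disallow: /*?q=*' }}
  {%- endif -%}

  {%- if group.sitemap != blank -%}
    {{ group.sitemap }}
  {%- endif -%}
{% endfor %}
```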
Remove a default rule from an existing group
If you want to remove a default rule from an existing group, then you can adjust the Liquid for outputting the default rules to check for that rule and skip over it.
For example, you can use the following approach to remove the rule blocking crawlers from accessing the /policies/ page:
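A minimal sketch, assuming each default rule exposes directive and value attributes that can be matched against the Disallow: /policies/ rule:

```liquid
{% comment %}
  Output the default rules, skipping the rule that disallows /policies/.
{% endcomment %}
{% for group in robots.default_groups %}
  {{- group.user_agent }}

  {%- for rule in group.rules -%}
    {%- unless rule.directive == 'Disallow' and rule.value == '/policies/' -%}
      {{ rule }}
    {%- endunless -%}
  {%- endfor -%}

  {%- if group.sitemap != blank -%}
    {{ group.sitemap }}
  {%- endif -%}
{% endfor %}
```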
Add custom rules
If you want to add a new rule that's not part of a default group, then you can manually enter the rule outside of the Liquid for outputting the default rules.
Common examples of these custom rules are:
- Blocking certain crawlers
- Allowing certain crawlers
- Adding extra sitemap URLs
Block certain crawlers
If a crawler isn't in the default rule set, then you can manually add a rule to block it.
For example, the following directive would allow you to block the discobot crawler:
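Because this rule isn't part of a default group, it can be added as plain text at the bottom of the template, outside of the Liquid that outputs the default rules. For example:

```
# Block the discobot crawler from the entire site
User-agent: discobot
Disallow: /
```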
Allow certain crawlers
Similar to blocking certain crawlers, you can also manually add a rule to allow search engines to crawl a subdirectory or page.
For example, the following directive would allow the discobot crawler:
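As with blocking a crawler, this rule can be added as plain text outside of the default-rules Liquid. For example:

```
# Allow the discobot crawler to crawl the entire site
User-agent: discobot
Allow: /
```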
Add extra sitemap URLs
The following example, where [sitemap-url] is the sitemap URL, would allow you to include an extra sitemap URL:
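Like the other custom rules, this line can be added as plain text outside of the default-rules Liquid, keeping the [sitemap-url] placeholder until you substitute your own sitemap URL:

```
# Reference an additional sitemap
Sitemap: [sitemap-url]
```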