The rule object
The rule object returns an individual rule for the robots.txt file, which tells crawlers which pages can or can't be accessed. It consists of a directive, which can be either Allow or Disallow, and a value of the associated URL path.
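For context, rule objects are typically reached through a group object's rules inside the robots.txt.liquid template. The following is a minimal sketch, assuming the robots object's default_groups, each of which exposes a user_agent and a rules array; outputting a rule directly renders its directive and value as a single line.

```liquid
{%- comment -%}
  Minimal robots.txt.liquid sketch: render every default rule.
  Outputting a rule directly prints its directive and value
  together, e.g. "Disallow: /search".
{%- endcomment -%}
{%- for group in robots.default_groups -%}
{{ group.user_agent }}
{%- for rule in group.rules %}
{{ rule }}
{%- endfor %}
{%- endfor -%}
```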
The rule object has the following attributes:
directive
Returns the rule directive, which can be either Allow, to allow crawlers to access the specified URL, or Disallow, to block them. Both attributes are used in the sketch that follows this list.

value
Returns the associated URL path for the rule.
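When you need the parts separately, the two attributes can be recombined by hand. This is a hedged sketch: the group variable is assumed to come from an enclosing loop, as in the earlier example, and joining the attributes with a colon and space should produce the same line as outputting the rule directly.

```liquid
{%- comment -%}
  Hypothetical fragment: rebuild each rule line from its
  attributes. For a rule that blocks the /search path, this
  renders "Disallow: /search".
{%- endcomment -%}
{%- for rule in group.rules %}
{{ rule.directive }}: {{ rule.value }}
{%- endfor %}
```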