The rule object

The rule object returns an individual rule for the robots.txt file. A rule tells crawlers which pages they can, or can't, access. It consists of a directive, which can be either Allow or Disallow, and a value, which is the associated URL path.

For example, each rule renders as a single line of the robots.txt file. The following is a minimal sketch, assuming a Liquid template where a rule variable is already in scope (such as inside a loop over a group's rules); the /checkout path is purely illustrative:
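```liquid
{{ rule.directive }}: {{ rule.value }}
```

This would output a line such as:

```text
Disallow: /checkout
```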

The rule object has the following attributes:

rule.directive

Returns the rule's directive: either Allow, which allows crawlers to access the specified URL path, or Disallow, which blocks them from it.
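As an illustration of filtering on the directive, the sketch below renders only the Disallow rules. The group.rules collection it loops over is assumed context for this sketch, not something defined in this section:

```liquid
{% comment %} Render only the rules that block crawler access {% endcomment %}
{% for rule in group.rules %}
  {% if rule.directive == 'Disallow' %}
    {{ rule.directive }}: {{ rule.value }}
  {% endif %}
{% endfor %}
```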

rule.value

Returns the URL path associated with the rule.
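As a sketch of working with value, the snippet below skips any rule whose path contains a given segment, using Liquid's contains substring operator; the /search path is a hypothetical example:

```liquid
{% comment %} Skip rules whose path contains /search {% endcomment %}
{% unless rule.value contains '/search' %}
  {{ rule.directive }}: {{ rule.value }}
{% endunless %}
```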