WWW::RobotRules parses /robots.txt files as specified in "A Standard for
Robot Exclusion" (see www.robotstxt.org). Webmasters can use the /robots.txt
file to forbid conforming robots from accessing parts of their web site.
.
The parsed files are kept in a WWW::RobotRules object, and this object
provides methods to check if access to a given URL is prohibited. The same
WWW::RobotRules object can be used for one or more parsed /robots.txt files
on any number of hosts.
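.
A minimal usage sketch, assuming LWP::Simple is available to fetch the file;
the robot name, host, and URLs below are placeholders, not part of the module:
.
  use WWW::RobotRules;
  use LWP::Simple qw(get);

  # Identify the robot by the name it sends as its User-Agent.
  my $rules = WWW::RobotRules->new('MyBot/1.0');

  # Fetch and parse a site's /robots.txt file.
  my $robots_url = 'http://example.com/robots.txt';
  my $robots_txt = get($robots_url);
  $rules->parse($robots_url, $robots_txt) if defined $robots_txt;

  # Ask whether this robot may fetch a given URL on that host.
  my $url = 'http://example.com/private/page.html';
  print $rules->allowed($url) ? "allowed\n" : "disallowed\n";
.
The same $rules object can go on to parse /robots.txt files from other hosts;
allowed() selects the applicable rules from the host part of the URL it is
given.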
Installed Size: 36.9 kB
Architectures: all