K 10
svn:author
V 5
culot
K 8
svn:date
V 27
2011-04-14T13:18:39.000000Z
K 7
svn:log
V 357
WWW::RobotRules parses /robots.txt files which are used to forbid
conforming robots from accessing parts of a web site. The parsed
files are kept in a WWW::RobotRules object, and this object provides
methods to check if access to a given URL is prohibited.

WWW: http://search.cpan.org/dist/WWW-RobotRules/

This new port is needed to update www/p5-libwww.
END
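For context, a minimal sketch of how the module described in this log is typically used, following the synopsis style of WWW::RobotRules; the user-agent string and the example host below are assumptions for illustration only:

    #!/usr/bin/perl
    use strict;
    use warnings;
    use WWW::RobotRules;
    use LWP::Simple qw(get);

    # Create a rules object tied to our robot's user-agent name
    # ("ExampleBot/1.0" is a made-up name for this sketch).
    my $rules = WWW::RobotRules->new('ExampleBot/1.0');

    # Fetch and parse the site's robots.txt (example.org is a placeholder host).
    my $robots_url = 'http://example.org/robots.txt';
    my $robots_txt = get($robots_url);
    $rules->parse($robots_url, $robots_txt) if defined $robots_txt;

    # Ask the parsed rules whether a given URL may be fetched.
    my $url = 'http://example.org/private/page.html';
    if ($rules->allowed($url)) {
        print "Access to $url is allowed\n";
    } else {
        print "Access to $url is prohibited by robots.txt\n";
    }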