author     spz <spz>  2011-07-10 12:47:38 +0000
committer  spz <spz>  2011-07-10 12:47:38 +0000
commit     64ae19885bc050fad42ae665db5556d69bc7069e (patch)
tree       00b9b06b814a4d6d4f9e9f795cbe64cd88b6cade /www
parent     830b793b5ab8875152f0df00769fc1e8c8f4dea7 (diff)
download   pkgsrc-64ae19885bc050fad42ae665db5556d69bc7069e.tar.gz
The Perl 5 module WWW::RobotRules parses /robots.txt files as specified
in "A Standard for Robot Exclusion", at http://www.robotstxt.org/wc/norobots.htmls Webmasters can use the /robots.txt file to forbid conforming robots from accessing parts of their web site. The parsed files are kept in a WWW::RobotRules object, and this object provides methods to check if access to a given URL is prohibited. The same WWW::RobotRules object can be used for one or more parsed /robots.txt files on any number of hosts.
Diffstat (limited to 'www')
-rw-r--r--  www/p5-WWW-RobotRules/DESCR     | 10
-rw-r--r--  www/p5-WWW-RobotRules/Makefile  | 21
-rw-r--r--  www/p5-WWW-RobotRules/distinfo  |  5
3 files changed, 36 insertions(+), 0 deletions(-)
diff --git a/www/p5-WWW-RobotRules/DESCR b/www/p5-WWW-RobotRules/DESCR
new file mode 100644
index 00000000000..7901c8ef342
--- /dev/null
+++ b/www/p5-WWW-RobotRules/DESCR
@@ -0,0 +1,10 @@
+The Perl 5 module WWW::RobotRules parses /robots.txt files as specified
+in "A Standard for Robot Exclusion", at
+http://www.robotstxt.org/wc/norobots.htmls
+Webmasters can use the /robots.txt file to forbid conforming robots
+from accessing parts of their web site.
+
+The parsed files are kept in a WWW::RobotRules object, and this object
+provides methods to check if access to a given URL is prohibited.
+The same WWW::RobotRules object can be used for one or more parsed
+/robots.txt files on any number of hosts.
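The last point of the DESCR, that one object can track the rules of any
number of hosts, looks roughly like the sketch below. The LWP::Simple fetch
and the example host names are assumptions for illustration; they are not
dependencies of this package.

    use WWW::RobotRules;
    use LWP::Simple qw(get);

    # A single rules object can hold the parsed /robots.txt of several hosts.
    my $rules = WWW::RobotRules->new('MyRobot/1.0');

    for my $host ('www.example.com', 'www.example.org') {
        my $url = "http://$host/robots.txt";
        my $robots_txt = get($url);               # undef if the fetch fails
        $rules->parse($url, $robots_txt) if defined $robots_txt;
    }

    # Consult the same object before fetching any URL on those hosts.
    my $target = 'http://www.example.com/some/page.html';
    if ($rules->allowed($target)) {
        my $content = get($target);
        # ... process $content ...
    }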
diff --git a/www/p5-WWW-RobotRules/Makefile b/www/p5-WWW-RobotRules/Makefile
new file mode 100644
index 00000000000..e2dfd1898ae
--- /dev/null
+++ b/www/p5-WWW-RobotRules/Makefile
@@ -0,0 +1,21 @@
+# $NetBSD: Makefile,v 1.1.1.1 2011/07/10 12:47:38 spz Exp $
+
+DISTNAME= WWW-RobotRules-6.01
+PKGNAME= p5-${DISTNAME}
+PKGREVISION= 1
+CATEGORIES= www perl5
+MASTER_SITES= ${MASTER_SITE_PERL_CPAN:=WWW/}
+
+MAINTAINER= pkgsrc-users@NetBSD.org
+HOMEPAGE= http://search.cpan.org/dist/WWW-RobotRules/
+COMMENT= Perl 5 module to parse robots.txt files
+
+PKG_DESTDIR_SUPPORT= user-destdir
+
+USE_LANGUAGES= # empty
+PERL5_PACKLIST= auto/WWW/RobotRules/.packlist
+
+DEPENDS+= p5-URI>=1.27:../../www/p5-URI
+
+.include "../../lang/perl5/module.mk"
+.include "../../mk/bsd.pkg.mk"
diff --git a/www/p5-WWW-RobotRules/distinfo b/www/p5-WWW-RobotRules/distinfo
new file mode 100644
index 00000000000..821ff7e0cfa
--- /dev/null
+++ b/www/p5-WWW-RobotRules/distinfo
@@ -0,0 +1,5 @@
+$NetBSD: distinfo,v 1.1.1.1 2011/07/10 12:47:38 spz Exp $
+
+SHA1 (WWW-RobotRules-6.01.tar.gz) = 426920bbfc73a38dffa319dd2f53b0eb9b294b5b
+RMD160 (WWW-RobotRules-6.01.tar.gz) = 6f2c1bef375ad2b2f171b4feae721eec8e1007ec
+Size (WWW-RobotRules-6.01.tar.gz) = 9047 bytes