author:    adam <adam@pkgsrc.org>    2019-08-22 08:21:11 +0000
committer: adam <adam@pkgsrc.org>    2019-08-22 08:21:11 +0000
commit:    af126cf1057f02461c7acdbbcfe65744b6faadbb (patch)
tree:      981d6703d3450801cd3679ccca2c6a130617644f /www
parent:    c16d1aadcb2d1147838bd1be7b540b36db469804 (diff)
download:  pkgsrc-af126cf1057f02461c7acdbbcfe65744b6faadbb.tar.gz
py-scrapy: updated to 1.7.3
Scrapy 1.7.3:
- Enforce lxml 4.3.5 or lower for Python 3.4 (issue 3912, issue 3918).

Scrapy 1.7.2:
- Fix Python 2 support (issue 3889, issue 3893, issue 3896).

Scrapy 1.7.1:
- Re-packaging of Scrapy 1.7.0, which was missing some changes in PyPI.

Scrapy 1.7.0:
Highlights:
- Improvements for crawls targeting multiple domains
- A cleaner way to pass arguments to callbacks
- A new class for JSON requests
- Improvements for rule-based spiders
- New features for feed exports
Backward-incompatible changes:
- 429 is now part of the RETRY_HTTP_CODES setting by default. This change is backward incompatible; if you do not want to retry 429, you must override RETRY_HTTP_CODES accordingly.
- Crawler, CrawlerRunner.crawl and CrawlerRunner.create_crawler no longer accept a Spider subclass instance; they only accept a Spider subclass now. Spider subclass instances were never meant to work, and they were not working as one would expect: instead of using the passed Spider subclass instance, their from_crawler method was called to generate a new instance.
- Non-default values for the SCHEDULER_PRIORITY_QUEUE setting may stop working. Scheduler priority queue classes now need to handle Request objects instead of arbitrary Python data structures.
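For projects that must not retry 429 (for example, to back off manually on rate limiting), the opt-out is a one-line settings override. A minimal sketch; the list shown is assumed to be the pre-1.7 default set of retryable codes, so verify it against your Scrapy version's documentation:

```python
# settings.py -- opt out of the new 1.7 default of retrying HTTP 429.
# Listing the retryable status codes explicitly (here, the assumed
# pre-1.7 defaults) restores the previous behaviour.
RETRY_HTTP_CODES = [500, 502, 503, 504, 522, 524, 408]  # 429 intentionally omitted
```

Because RETRY_HTTP_CODES is replaced wholesale rather than merged, overriding it always pins the exact set of codes your project retries.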
New features:
- A new scheduler priority queue, scrapy.pqueues.DownloaderAwarePriorityQueue, may be enabled for a significant scheduling improvement on crawls targeting multiple web domains, at the cost of no CONCURRENT_REQUESTS_PER_IP support (issue 3520)
- A new Request.cb_kwargs attribute provides a cleaner way to pass keyword arguments to callback methods (issue 1138, issue 3563)
- A new JSONRequest class offers a more convenient way to build JSON requests (issue 3504, issue 3505)
- A process_request callback passed to the Rule constructor now receives the Response object that originated the request as its second argument (issue 3682)
- A new restrict_text parameter for the LinkExtractor constructor allows filtering links by linking text (issue 3622, issue 3635)
- A new FEED_STORAGE_S3_ACL setting allows defining a custom ACL for feeds exported to Amazon S3 (issue 3607)
- A new FEED_STORAGE_FTP_ACTIVE setting allows using FTP's active connection mode for feeds exported to FTP servers (issue 3829)
- A new METAREFRESH_IGNORE_TAGS setting allows overriding which HTML tags are ignored when searching a response for HTML meta tags that trigger a redirect (issue 1422, issue 3768)
- A new redirect_reasons request meta key exposes the reason (status code, meta refresh) behind every followed redirect (issue 3581, issue 3687)
- The SCRAPY_CHECK variable is now set to the string "true" during runs of the check command, which allows detecting contract check runs from code (issue 3704, issue 3739)
- A new Item.deepcopy() method makes it easier to deep-copy items (issue 1493, issue 3671)
- CoreStats also logs elapsed_time_seconds now (issue 3638)
- Exceptions from ItemLoader input and output processors are now more verbose (issue 3836, issue 3840)
- Crawler, CrawlerRunner.crawl and CrawlerRunner.create_crawler now fail gracefully if they receive a Spider subclass instance instead of the subclass itself (issue 2283, issue 3610, issue 3872)
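The Request.cb_kwargs mechanism can be pictured with a small stand-in: the engine keeps the dict on the request and expands it into the callback alongside the response, replacing the old pattern of stashing values in request.meta. This is an illustrative model, not Scrapy's actual classes:

```python
# Illustrative model of how Request.cb_kwargs reaches a callback
# (a sketch -- FakeRequest and deliver are stand-ins, not Scrapy code).
class FakeRequest:
    def __init__(self, url, callback, cb_kwargs=None):
        self.url = url
        self.callback = callback
        self.cb_kwargs = dict(cb_kwargs or {})

def deliver(request, response):
    # What the engine conceptually does when the response arrives:
    # the stored dict is splatted into the callback as keyword arguments.
    return request.callback(response, **request.cb_kwargs)

def parse_page(response, page_no):
    return f"{response} parsed as page {page_no}"

req = FakeRequest("https://example.com", parse_page, cb_kwargs={"page_no": 1})
print(deliver(req, "resp"))  # -> resp parsed as page 1
```

In a real spider the equivalent would be something like scrapy.Request(url, callback=self.parse_page, cb_kwargs={"page_no": 1}), with parse_page declaring page_no as a regular parameter.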
Bug fixes:
- process_spider_exception() is now also invoked for generators (issue 220, issue 2061)
- System exceptions like KeyboardInterrupt are no longer caught (issue 3726)
- ItemLoader.load_item() no longer makes later calls to ItemLoader.get_output_value() or ItemLoader.load_item() return empty data (issue 3804, issue 3819)
- The images pipeline (ImagesPipeline) no longer ignores these Amazon S3 settings: AWS_ENDPOINT_URL, AWS_REGION_NAME, AWS_USE_SSL, AWS_VERIFY (issue 3625)
- Fixed a memory leak in MediaPipeline affecting, for example, non-200 responses and exceptions from custom middlewares (issue 3813)
- Requests with private callbacks are now correctly unserialized from disk (issue 3790)
- FormRequest.from_response() now handles invalid methods like major web browsers
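Among the 1.7 features above, the JSONRequest class mainly saves the by-hand serialization step. A stdlib-only sketch of what that manual construction looks like, assuming (as the release notes suggest) that the class serializes a data dict into the body and sets the JSON Content-Type header:

```python
import json

# Building a JSON POST request by hand -- roughly the boilerplate that
# the new JSONRequest class is meant to automate. make_json_request is
# a hypothetical helper for illustration, not part of Scrapy's API.
def make_json_request(url, data):
    return {
        "url": url,
        "method": "POST",
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps(data),
    }

req = make_json_request("https://example.com/api", {"query": "scrapy"})
print(req["body"])  # -> {"query": "scrapy"}
```

With the new class, the equivalent is passing the dict directly (e.g. via a `data` keyword argument) and letting Scrapy handle serialization and headers; see the Scrapy 1.7 documentation for the exact signature.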
Diffstat (limited to 'www'):
-rw-r--r--  www/py-scrapy/Makefile  |  4
-rw-r--r--  www/py-scrapy/PLIST     | 23
-rw-r--r--  www/py-scrapy/distinfo  | 10
3 files changed, 14 insertions, 23 deletions
```diff
diff --git a/www/py-scrapy/Makefile b/www/py-scrapy/Makefile
index 416167e34ee..aa8bf355f97 100644
--- a/www/py-scrapy/Makefile
+++ b/www/py-scrapy/Makefile
@@ -1,6 +1,6 @@
-# $NetBSD: Makefile,v 1.8 2019/01/31 09:07:46 adam Exp $
+# $NetBSD: Makefile,v 1.9 2019/08/22 08:21:11 adam Exp $
 
-DISTNAME=	Scrapy-1.6.0
+DISTNAME=	Scrapy-1.7.3
 PKGNAME=	${PYPKGPREFIX}-${DISTNAME:tl}
 CATEGORIES=	www python
 MASTER_SITES=	${MASTER_SITE_PYPI:=S/Scrapy/}
diff --git a/www/py-scrapy/PLIST b/www/py-scrapy/PLIST
index 7d1074e3f15..ec89c941457 100644
--- a/www/py-scrapy/PLIST
+++ b/www/py-scrapy/PLIST
@@ -1,4 +1,4 @@
-@comment $NetBSD: PLIST,v 1.5 2019/01/31 09:07:46 adam Exp $
+@comment $NetBSD: PLIST,v 1.6 2019/08/22 08:21:11 adam Exp $
 bin/scrapy-${PYVERSSUFFIX}
 ${PYSITELIB}/${EGG_INFODIR}/PKG-INFO
 ${PYSITELIB}/${EGG_INFODIR}/SOURCES.txt
@@ -65,9 +65,6 @@ ${PYSITELIB}/scrapy/commands/version.pyo
 ${PYSITELIB}/scrapy/commands/view.py
 ${PYSITELIB}/scrapy/commands/view.pyc
 ${PYSITELIB}/scrapy/commands/view.pyo
-${PYSITELIB}/scrapy/conf.py
-${PYSITELIB}/scrapy/conf.pyc
-${PYSITELIB}/scrapy/conf.pyo
 ${PYSITELIB}/scrapy/contracts/__init__.py
 ${PYSITELIB}/scrapy/contracts/__init__.pyc
 ${PYSITELIB}/scrapy/contracts/__init__.pyo
@@ -248,6 +245,9 @@ ${PYSITELIB}/scrapy/http/request/__init__.pyo
 ${PYSITELIB}/scrapy/http/request/form.py
 ${PYSITELIB}/scrapy/http/request/form.pyc
 ${PYSITELIB}/scrapy/http/request/form.pyo
+${PYSITELIB}/scrapy/http/request/json_request.py
+${PYSITELIB}/scrapy/http/request/json_request.pyc
+${PYSITELIB}/scrapy/http/request/json_request.pyo
 ${PYSITELIB}/scrapy/http/request/rpc.py
 ${PYSITELIB}/scrapy/http/request/rpc.pyc
 ${PYSITELIB}/scrapy/http/request/rpc.pyo
@@ -296,9 +296,6 @@ ${PYSITELIB}/scrapy/loader/common.pyo
 ${PYSITELIB}/scrapy/loader/processors.py
 ${PYSITELIB}/scrapy/loader/processors.pyc
 ${PYSITELIB}/scrapy/loader/processors.pyo
-${PYSITELIB}/scrapy/log.py
-${PYSITELIB}/scrapy/log.pyc
-${PYSITELIB}/scrapy/log.pyo
 ${PYSITELIB}/scrapy/logformatter.py
 ${PYSITELIB}/scrapy/logformatter.pyc
 ${PYSITELIB}/scrapy/logformatter.pyo
@@ -321,6 +318,9 @@ ${PYSITELIB}/scrapy/pipelines/images.pyo
 ${PYSITELIB}/scrapy/pipelines/media.py
 ${PYSITELIB}/scrapy/pipelines/media.pyc
 ${PYSITELIB}/scrapy/pipelines/media.pyo
+${PYSITELIB}/scrapy/pqueues.py
+${PYSITELIB}/scrapy/pqueues.pyc
+${PYSITELIB}/scrapy/pqueues.pyo
 ${PYSITELIB}/scrapy/resolver.py
 ${PYSITELIB}/scrapy/resolver.pyc
 ${PYSITELIB}/scrapy/resolver.pyo
@@ -330,12 +330,6 @@ ${PYSITELIB}/scrapy/responsetypes.pyo
 ${PYSITELIB}/scrapy/selector/__init__.py
 ${PYSITELIB}/scrapy/selector/__init__.pyc
 ${PYSITELIB}/scrapy/selector/__init__.pyo
-${PYSITELIB}/scrapy/selector/csstranslator.py
-${PYSITELIB}/scrapy/selector/csstranslator.pyc
-${PYSITELIB}/scrapy/selector/csstranslator.pyo
-${PYSITELIB}/scrapy/selector/lxmlsel.py
-${PYSITELIB}/scrapy/selector/lxmlsel.pyc
-${PYSITELIB}/scrapy/selector/lxmlsel.pyo
 ${PYSITELIB}/scrapy/selector/unified.py
 ${PYSITELIB}/scrapy/selector/unified.pyc
 ${PYSITELIB}/scrapy/selector/unified.pyo
@@ -399,9 +393,6 @@ ${PYSITELIB}/scrapy/squeues.pyo
 ${PYSITELIB}/scrapy/statscollectors.py
 ${PYSITELIB}/scrapy/statscollectors.pyc
 ${PYSITELIB}/scrapy/statscollectors.pyo
-${PYSITELIB}/scrapy/telnet.py
-${PYSITELIB}/scrapy/telnet.pyc
-${PYSITELIB}/scrapy/telnet.pyo
 ${PYSITELIB}/scrapy/templates/project/module/__init__.py
 ${PYSITELIB}/scrapy/templates/project/module/__init__.pyc
 ${PYSITELIB}/scrapy/templates/project/module/__init__.pyo
diff --git a/www/py-scrapy/distinfo b/www/py-scrapy/distinfo
index 8a6fb3e039b..abf5657be97 100644
--- a/www/py-scrapy/distinfo
+++ b/www/py-scrapy/distinfo
@@ -1,6 +1,6 @@
-$NetBSD: distinfo,v 1.7 2019/01/31 09:07:46 adam Exp $
+$NetBSD: distinfo,v 1.8 2019/08/22 08:21:11 adam Exp $
 
-SHA1 (Scrapy-1.6.0.tar.gz) = 731714a49ee4974008182527b0d9fe35f69b6769
-RMD160 (Scrapy-1.6.0.tar.gz) = 8fbe6fea79ba57f9c2f03d0c54a7982ab51e9f60
-SHA512 (Scrapy-1.6.0.tar.gz) = 8c0581977d5d4e22afc535fbfff96d51dcc171dc60e21b3a2e35b327f83a484960b7979a5fc79502175441cff92a2f6dfa9511fd3de259eb7a0d4cfc28577e1e
-Size (Scrapy-1.6.0.tar.gz) = 926576 bytes
+SHA1 (Scrapy-1.7.3.tar.gz) = 905d01beac4a1deeb742e72308b34348c37e4ae5
+RMD160 (Scrapy-1.7.3.tar.gz) = ebe54257cfa20c6bc28995ecf6926a9b5c029ed8
+SHA512 (Scrapy-1.7.3.tar.gz) = 45638732829976443714988ddcd016f7c222b2796c7bd353d6a93186e0182782211af60d1417cdf0980fa5ed6113c2e94b89e2d13ac42999ec1e45457913382d
+Size (Scrapy-1.7.3.tar.gz) = 951640 bytes
```