Bug 1247994 - Upgrade vendored requests package to 2.9.1; r=mshal
We were previously running version 2.5.1 of requests. Newer versions contain several bug fixes and address a CVE. The source was obtained from https://pypi.python.org/packages/source/r/requests/requests-2.9.1.tar.gz and checked in without modification. This should be a rubber-stamp review.

MozReview-Commit-ID: 9tFSVJFfwGh
@@ -3,6 +3,232 @@
|
||||
Release History
|
||||
---------------
|
||||
|
||||
2.9.1 (2015-12-21)
|
||||
++++++++++++++++++
|
||||
|
||||
**Bugfixes**
|
||||
|
||||
- Resolve regression introduced in 2.9.0 that made it impossible to send binary
|
||||
strings as bodies in Python 3.
|
||||
- Fixed errors when calculating cookie expiration dates in certain locales.
|
||||
|
||||
**Miscellaneous**
|
||||
|
||||
- Updated bundled urllib3 to 1.13.1.
|
||||
|
||||
2.9.0 (2015-12-15)
|
||||
++++++++++++++++++
|
||||
|
||||
**Minor Improvements** (Backwards compatible)
|
||||
|
||||
- The ``verify`` keyword argument now supports being passed a path to a
|
||||
directory of CA certificates, not just a single-file bundle.
|
||||
- Warnings are now emitted when sending files opened in text mode.
|
||||
- Added the 511 Network Authentication Required status code to the status code
|
||||
registry.
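As an illustrative aside (not part of the upstream changelog or this commit), the directory form of ``verify`` described above can be exercised roughly as follows; the URL and path are placeholders, and the directory is assumed to hold certificates in the OpenSSL ``c_rehash`` layout::

    import requests

    # Hypothetical URL and path for illustration only.
    # Passing a directory (rather than a single bundle file) is supported
    # from requests 2.9.0 onward.
    response = requests.get('https://internal.example.org/',
                            verify='/etc/ssl/internal-ca-certs/')
    print(response.status_code)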
|
||||
|
||||
**Bugfixes**
|
||||
|
||||
- For file-like objects that are not seeked to the very beginning, we now
|
||||
send the content length for the number of bytes we will actually read, rather
|
||||
than the total size of the file, allowing partial file uploads.
|
||||
- When uploading file-like objects, if they are empty or have no obvious
|
||||
content length we set ``Transfer-Encoding: chunked`` rather than
|
||||
``Content-Length: 0``.
|
||||
- We correctly receive the response in buffered mode when uploading chunked
|
||||
bodies.
|
||||
- We now handle being passed a query string as a bytestring on Python 3, by
|
||||
decoding it as UTF-8.
|
||||
- Sessions are now closed in all cases (exceptional and not) when using the
|
||||
functional API rather than leaking and waiting for the garbage collector to
|
||||
clean them up.
|
||||
- Correctly handle digest auth headers with a malformed ``qop`` directive that
|
||||
contains no token, by treating it the same as if no ``qop`` directive was
|
||||
provided at all.
|
||||
- Minor performance improvements when removing specific cookies by name.
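A minimal sketch of the bytestring query-string handling noted above (endpoint and query are invented; behaviour assumed per the note, with the bytes decoded as UTF-8 on Python 3)::

    import requests

    # Illustrative only: a pre-encoded query string passed as bytes.
    response = requests.get('https://example.org/search',
                            params=b'q=requests&page=1')
    print(response.url)  # e.g. https://example.org/search?q=requests&page=1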
|
||||
|
||||
**Miscellaneous**
|
||||
|
||||
- Updated urllib3 to 1.13.
|
||||
|
||||
2.8.1 (2015-10-13)
|
||||
++++++++++++++++++
|
||||
|
||||
**Bugfixes**
|
||||
|
||||
- Update certificate bundle to match ``certifi`` 2015.9.6.2's weak certificate
|
||||
bundle.
|
||||
- Fix a bug in 2.8.0 where requests would raise ``ConnectTimeout`` instead of
|
||||
``ConnectionError``
|
||||
- When using the PreparedRequest flow, requests will now correctly respect the
|
||||
``json`` parameter. Broken in 2.8.0.
|
||||
- When using the PreparedRequest flow, requests will now correctly handle a
|
||||
Unicode-string method name on Python 2. Broken in 2.8.0.
|
||||
|
||||
2.8.0 (2015-10-05)
|
||||
++++++++++++++++++
|
||||
|
||||
**Minor Improvements** (Backwards Compatible)
|
||||
|
||||
- Requests now supports per-host proxies. This allows the ``proxies``
|
||||
dictionary to have entries of the form
|
||||
``{'<scheme>://<hostname>': '<proxy>'}``. Host-specific proxies will be used
|
||||
in preference to the previously-supported scheme-specific ones, but the
|
||||
previous syntax will continue to work.
|
||||
- ``Response.raise_for_status`` now prints the URL that failed as part of the
|
||||
exception message.
|
||||
- ``requests.utils.get_netrc_auth`` now takes an ``raise_errors`` kwarg,
|
||||
defaulting to ``False``. When ``True``, errors parsing ``.netrc`` files cause
|
||||
exceptions to be thrown.
|
||||
- Change to bundled projects import logic to make it easier to unbundle
|
||||
requests downstream.
|
||||
- Changed the default User-Agent string to avoid leaking data on Linux: now
|
||||
contains only the requests version.
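A short, illustrative sketch of the per-host proxy syntax described above (hostnames and proxy addresses are invented for the example, not taken from this commit)::

    import requests

    proxies = {
        # Scheme-wide default proxy.
        'http': 'http://proxy.example.com:8080',
        # Host-specific entry, preferred over the scheme-wide one (2.8.0+).
        'http://internal.example.org': 'http://intranet-proxy.example.org:3128',
    }
    response = requests.get('http://internal.example.org/status', proxies=proxies)
    print(response.status_code)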
|
||||
|
||||
**Bugfixes**
|
||||
|
||||
- The ``json`` parameter to ``post()`` and friends will now only be used if
|
||||
neither ``data`` nor ``files`` are present, consistent with the
|
||||
documentation.
|
||||
- We now ignore empty fields in the ``NO_PROXY`` environment variable.
|
||||
- Fixed problem where ``httplib.BadStatusLine`` would get raised if combining
|
||||
``stream=True`` with ``contextlib.closing``.
|
||||
- Prevented bugs where we would attempt to return the same connection back to
|
||||
the connection pool twice when sending a Chunked body.
|
||||
- Miscellaneous minor internal changes.
|
||||
- Digest Auth support is now thread safe.
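For illustration only (URL and payload are placeholders), the ``json`` parameter behaviour noted above looks like this; supplying ``data`` or ``files`` alongside it would cause ``json`` to be ignored::

    import requests

    # The payload is serialized to JSON and Content-Type is set accordingly.
    response = requests.post('https://example.org/api/items', json={'name': 'widget'})
    print(response.request.headers['Content-Type'])  # application/json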
|
||||
|
||||
**Updates**
|
||||
|
||||
- Updated urllib3 to 1.12.
|
||||
|
||||
2.7.0 (2015-05-03)
|
||||
++++++++++++++++++
|
||||
|
||||
This is the first release that follows our new release process. For more, see
|
||||
`our documentation
|
||||
<http://docs.python-requests.org/en/latest/community/release-process/>`_.
|
||||
|
||||
**Bugfixes**
|
||||
|
||||
- Updated urllib3 to 1.10.4, resolving several bugs involving chunked transfer
|
||||
encoding and response framing.
|
||||
|
||||
2.6.2 (2015-04-23)
|
||||
++++++++++++++++++
|
||||
|
||||
**Bugfixes**
|
||||
|
||||
- Fix regression where compressed data that was sent as chunked data was not
|
||||
properly decompressed. (#2561)
|
||||
|
||||
2.6.1 (2015-04-22)
|
||||
++++++++++++++++++
|
||||
|
||||
**Bugfixes**
|
||||
|
||||
- Remove VendorAlias import machinery introduced in v2.5.2.
|
||||
|
||||
- Simplify the PreparedRequest.prepare API: We no longer require the user to
|
||||
pass an empty list to the hooks keyword argument. (c.f. #2552)
|
||||
|
||||
- Resolve redirects now receives and forwards all of the original arguments to
|
||||
the adapter. (#2503)
|
||||
|
||||
- Handle UnicodeDecodeErrors when trying to deal with a unicode URL that
|
||||
cannot be encoded in ASCII. (#2540)
|
||||
|
||||
- Populate the parsed path of the URI field when performing Digest
|
||||
Authentication. (#2426)
|
||||
|
||||
- Copy a PreparedRequest's CookieJar more reliably when it is not an instance
|
||||
of RequestsCookieJar. (#2527)
|
||||
|
||||
2.6.0 (2015-03-14)
|
||||
++++++++++++++++++
|
||||
|
||||
**Bugfixes**
|
||||
|
||||
- CVE-2015-2296: Fix handling of cookies on redirect. Previously a cookie
|
||||
without a host value set would use the hostname for the redirected URL
|
||||
exposing requests users to session fixation attacks and potentially cookie
|
||||
stealing. This was disclosed privately by Matthew Daley of
|
||||
`BugFuzz <https://bugfuzz.com>`_. This affects all versions of requests from
|
||||
v2.1.0 to v2.5.3 (inclusive on both ends).
|
||||
|
||||
- Fix error when requests is an ``install_requires`` dependency and ``python
|
||||
setup.py test`` is run. (#2462)
|
||||
|
||||
- Fix error when urllib3 is unbundled and requests continues to use the
|
||||
vendored import location.
|
||||
|
||||
- Include fixes to ``urllib3``'s header handling.
|
||||
|
||||
- Requests' handling of unvendored dependencies is now more restrictive.
|
||||
|
||||
**Features and Improvements**
|
||||
|
||||
- Support bytearrays when passed as parameters in the ``files`` argument.
|
||||
(#2468)
|
||||
|
||||
- Avoid data duplication when creating a request with ``str``, ``bytes``, or
|
||||
``bytearray`` input to the ``files`` argument.
|
||||
|
||||
2.5.3 (2015-02-24)
|
||||
++++++++++++++++++
|
||||
|
||||
**Bugfixes**
|
||||
|
||||
- Revert changes to our vendored certificate bundle. For more context see
|
||||
(#2455, #2456, and http://bugs.python.org/issue23476)
|
||||
|
||||
2.5.2 (2015-02-23)
|
||||
++++++++++++++++++
|
||||
|
||||
**Features and Improvements**
|
||||
|
||||
- Add sha256 fingerprint support. (`shazow/urllib3#540`_)
|
||||
|
||||
- Improve the performance of headers. (`shazow/urllib3#544`_)
|
||||
|
||||
**Bugfixes**
|
||||
|
||||
- Copy pip's import machinery. When downstream redistributors remove
|
||||
requests.packages.urllib3 the import machinery will continue to let those
|
||||
same symbols work. Example usage in requests' documentation and 3rd-party
|
||||
libraries relying on the vendored copies of urllib3 will work without having
|
||||
to fallback to the system urllib3.
|
||||
|
||||
- Attempt to quote parts of the URL on redirect if unquoting and then quoting
|
||||
fails. (#2356)
|
||||
|
||||
- Fix filename type check for multipart form-data uploads. (#2411)
|
||||
|
||||
- Properly handle the case where a server issuing digest authentication
|
||||
challenges provides both auth and auth-int qop-values. (#2408)
|
||||
|
||||
- Fix a socket leak. (`shazow/urllib3#549`_)
|
||||
|
||||
- Fix multiple ``Set-Cookie`` headers properly. (`shazow/urllib3#534`_)
|
||||
|
||||
- Disable the built-in hostname verification. (`shazow/urllib3#526`_)
|
||||
|
||||
- Fix the behaviour of decoding an exhausted stream. (`shazow/urllib3#535`_)
|
||||
|
||||
**Security**
|
||||
|
||||
- Pulled in an updated ``cacert.pem``.
|
||||
|
||||
- Drop RC4 from the default cipher list. (`shazow/urllib3#551`_)
|
||||
|
||||
.. _shazow/urllib3#551: https://github.com/shazow/urllib3/pull/551
|
||||
.. _shazow/urllib3#549: https://github.com/shazow/urllib3/pull/549
|
||||
.. _shazow/urllib3#544: https://github.com/shazow/urllib3/pull/544
|
||||
.. _shazow/urllib3#540: https://github.com/shazow/urllib3/pull/540
|
||||
.. _shazow/urllib3#535: https://github.com/shazow/urllib3/pull/535
|
||||
.. _shazow/urllib3#534: https://github.com/shazow/urllib3/pull/534
|
||||
.. _shazow/urllib3#526: https://github.com/shazow/urllib3/pull/526
|
||||
|
||||
2.5.1 (2014-12-23)
|
||||
++++++++++++++++++
|
||||
|
||||
@@ -35,7 +261,7 @@ Release History
|
||||
**Bugfixes**
|
||||
|
||||
- Only parse the URL once (#2353)
|
||||
- Allow Content-Length header to always be overriden (#2332)
|
||||
- Allow Content-Length header to always be overridden (#2332)
|
||||
- Properly handle files in HTTPDigestAuth (#2333)
|
||||
- Cap redirect_cache size to prevent memory abuse (#2299)
|
||||
- Fix HTTPDigestAuth handling of redirects after authenticating successfully
|
||||
@@ -103,7 +329,7 @@ Release History
|
||||
- Support for connect timeouts! Timeout now accepts a tuple (connect, read) which is used to set individual connect and read timeouts.
|
||||
- Allow copying of PreparedRequests without headers/cookies.
|
||||
- Updated bundled urllib3 version.
|
||||
- Refactored settings loading from environment — new `Session.merge_environment_settings`.
|
||||
- Refactored settings loading from environment -- new `Session.merge_environment_settings`.
|
||||
- Handle socket errors in iter_content.
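An illustrative use of the (connect, read) timeout tuple mentioned above; the URL and values are arbitrary::

    import requests

    # 3.05 s to establish the connection, 27 s for the server to send data.
    response = requests.get('https://example.org/', timeout=(3.05, 27))
    print(response.elapsed)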
|
||||
|
||||
|
||||
@@ -347,7 +573,7 @@ This is not a backwards compatible change.
|
||||
- Improved mime-compatible JSON handling
|
||||
- Proxy fixes
|
||||
- Path hack fixes
|
||||
- Case-Insensistive Content-Encoding headers
|
||||
- Case-Insensitive Content-Encoding headers
|
||||
- Support for CJK parameters in form posts
|
||||
|
||||
|
||||
@@ -383,7 +609,7 @@ This is not a backwards compatible change.
|
||||
- Digest Authentication improvements.
|
||||
- Ensure proxy exclusions work properly.
|
||||
- Clearer UnicodeError exceptions.
|
||||
- Automatic casting of URLs to tsrings (fURL and such)
|
||||
- Automatic casting of URLs to strings (fURL and such)
|
||||
- Bugfixes.
|
||||
|
||||
0.13.6 (2012-08-06)
|
||||
@@ -434,8 +660,8 @@ This is not a backwards compatible change.
|
||||
+++++++++++++++++++
|
||||
|
||||
- Removal of Requests.async in favor of `grequests <https://github.com/kennethreitz/grequests>`_
|
||||
- Allow disabling of cookie persistiance.
|
||||
- New implimentation of safe_mode
|
||||
- Allow disabling of cookie persistence.
|
||||
- New implementation of safe_mode
|
||||
- cookies.get now supports default argument
|
||||
- Session cookies not saved when Session.request is called with return_response=False
|
||||
- Env: no_proxy support.
|
||||
@@ -552,7 +778,7 @@ This is not a backwards compatible change.
|
||||
|
||||
* ``Response.content`` is now bytes-only. (*Backwards Incompatible*)
|
||||
* New ``Response.text`` is unicode-only.
|
||||
* If no ``Response.encoding`` is specified and ``chardet`` is available, ``Respoonse.text`` will guess an encoding.
|
||||
* If no ``Response.encoding`` is specified and ``chardet`` is available, ``Response.text`` will guess an encoding.
|
||||
* Default to ISO-8859-1 (Western) encoding for "text" subtypes.
|
||||
* Removal of `decode_unicode`. (*Backwards Incompatible*)
|
||||
* New multiple-hooks system.
|
||||
@@ -672,7 +898,7 @@ This is not a backwards compatible change.
|
||||
0.7.5 (2011-11-04)
|
||||
++++++++++++++++++
|
||||
|
||||
* Response.content = None if there was an invalid repsonse.
|
||||
* Response.content = None if there was an invalid response.
|
||||
* Redirection auth handling.
|
||||
|
||||
0.7.4 (2011-10-26)
|
||||
@@ -759,7 +985,7 @@ This is not a backwards compatible change.
|
||||
++++++++++++++++++
|
||||
|
||||
* New callback hook system
|
||||
* New persistient sessions object and context manager
|
||||
* New persistent sessions object and context manager
|
||||
* Transparent Dict-cookie handling
|
||||
* Status code reference object
|
||||
* Removed Response.cached
|
||||
@@ -793,7 +1019,7 @@ This is not a backwards compatible change.
|
||||
* Redirect Fixes
|
||||
* settings.verbose stream writing
|
||||
* Querystrings for all methods
|
||||
* URLErrors (Connection Refused, Timeout, Invalid URLs) are treated as explicity raised
|
||||
* URLErrors (Connection Refused, Timeout, Invalid URLs) are treated as explicitly raised
|
||||
``r.requests.get('hwe://blah'); r.raise_for_status()``
|
||||
|
||||
|
||||
|
||||
@@ -1,4 +1,4 @@
|
||||
Copyright 2014 Kenneth Reitz
|
||||
Copyright 2015 Kenneth Reitz
|
||||
|
||||
Licensed under the Apache License, Version 2.0 (the "License");
|
||||
you may not use this file except in compliance with the License.
|
||||
|
||||
@@ -1,6 +1,6 @@
|
||||
Metadata-Version: 1.1
|
||||
Name: requests
|
||||
Version: 2.5.1
|
||||
Version: 2.9.1
|
||||
Summary: Python HTTP for Humans.
|
||||
Home-page: http://python-requests.org
|
||||
Author: Kenneth Reitz
|
||||
@@ -9,11 +9,13 @@ License: Apache 2.0
|
||||
Description: Requests: HTTP for Humans
|
||||
=========================
|
||||
|
||||
.. image:: https://badge.fury.io/py/requests.png
|
||||
:target: http://badge.fury.io/py/requests
|
||||
.. image:: https://img.shields.io/pypi/v/requests.svg
|
||||
:target: https://pypi.python.org/pypi/requests
|
||||
|
||||
.. image:: https://img.shields.io/pypi/dm/requests.svg
|
||||
:target: https://pypi.python.org/pypi/requests
|
||||
|
||||
|
||||
.. image:: https://pypip.in/d/requests/badge.png
|
||||
:target: https://crate.io/packages/requests/
|
||||
|
||||
|
||||
Requests is an Apache2 Licensed HTTP library, written in Python, for human
|
||||
@@ -83,7 +85,6 @@ Description: Requests: HTTP for Humans
|
||||
----------
|
||||
|
||||
#. Check for open issues or open a fresh issue to start a discussion around a feature idea or a bug. There is a `Contributor Friendly`_ tag for issues that should be ideal for people who are not very familiar with the codebase yet.
|
||||
#. If you feel uncomfortable or uncertain about an issue or your changes, feel free to email @sigmavirus24 and he will happily help you via email, Skype, remote pairing or whatever you are comfortable with.
|
||||
#. Fork `the repository`_ on GitHub to start making your changes to the **master** branch (or branch off of it).
|
||||
#. Write a test which shows that the bug was fixed or that the feature works as expected.
|
||||
#. Send a pull request and bug the maintainer until it gets merged and published. :) Make sure to add yourself to AUTHORS_.
|
||||
@@ -98,6 +99,232 @@ Description: Requests: HTTP for Humans
|
||||
Release History
|
||||
---------------
|
||||
|
||||
2.9.1 (2015-12-21)
|
||||
++++++++++++++++++
|
||||
|
||||
**Bugfixes**
|
||||
|
||||
- Resolve regression introduced in 2.9.0 that made it impossible to send binary
|
||||
strings as bodies in Python 3.
|
||||
- Fixed errors when calculating cookie expiration dates in certain locales.
|
||||
|
||||
**Miscellaneous**
|
||||
|
||||
- Updated bundled urllib3 to 1.13.1.
|
||||
|
||||
2.9.0 (2015-12-15)
|
||||
++++++++++++++++++
|
||||
|
||||
**Minor Improvements** (Backwards compatible)
|
||||
|
||||
- The ``verify`` keyword argument now supports being passed a path to a
|
||||
directory of CA certificates, not just a single-file bundle.
|
||||
- Warnings are now emitted when sending files opened in text mode.
|
||||
- Added the 511 Network Authentication Required status code to the status code
|
||||
registry.
|
||||
|
||||
**Bugfixes**
|
||||
|
||||
- For file-like objects that are not seeked to the very beginning, we now
|
||||
send the content length for the number of bytes we will actually read, rather
|
||||
than the total size of the file, allowing partial file uploads.
|
||||
- When uploading file-like objects, if they are empty or have no obvious
|
||||
content length we set ``Transfer-Encoding: chunked`` rather than
|
||||
``Content-Length: 0``.
|
||||
- We correctly receive the response in buffered mode when uploading chunked
|
||||
bodies.
|
||||
- We now handle being passed a query string as a bytestring on Python 3, by
|
||||
decoding it as UTF-8.
|
||||
- Sessions are now closed in all cases (exceptional and not) when using the
|
||||
functional API rather than leaking and waiting for the garbage collector to
|
||||
clean them up.
|
||||
- Correctly handle digest auth headers with a malformed ``qop`` directive that
|
||||
contains no token, by treating it the same as if no ``qop`` directive was
|
||||
provided at all.
|
||||
- Minor performance improvements when removing specific cookies by name.
|
||||
|
||||
**Miscellaneous**
|
||||
|
||||
- Updated urllib3 to 1.13.
|
||||
|
||||
2.8.1 (2015-10-13)
|
||||
++++++++++++++++++
|
||||
|
||||
**Bugfixes**
|
||||
|
||||
- Update certificate bundle to match ``certifi`` 2015.9.6.2's weak certificate
|
||||
bundle.
|
||||
- Fix a bug in 2.8.0 where requests would raise ``ConnectTimeout`` instead of
|
||||
``ConnectionError``
|
||||
- When using the PreparedRequest flow, requests will now correctly respect the
|
||||
``json`` parameter. Broken in 2.8.0.
|
||||
- When using the PreparedRequest flow, requests will now correctly handle a
|
||||
Unicode-string method name on Python 2. Broken in 2.8.0.
|
||||
|
||||
2.8.0 (2015-10-05)
|
||||
++++++++++++++++++
|
||||
|
||||
**Minor Improvements** (Backwards Compatible)
|
||||
|
||||
- Requests now supports per-host proxies. This allows the ``proxies``
|
||||
dictionary to have entries of the form
|
||||
``{'<scheme>://<hostname>': '<proxy>'}``. Host-specific proxies will be used
|
||||
in preference to the previously-supported scheme-specific ones, but the
|
||||
previous syntax will continue to work.
|
||||
- ``Response.raise_for_status`` now prints the URL that failed as part of the
|
||||
exception message.
|
||||
- ``requests.utils.get_netrc_auth`` now takes an ``raise_errors`` kwarg,
|
||||
defaulting to ``False``. When ``True``, errors parsing ``.netrc`` files cause
|
||||
exceptions to be thrown.
|
||||
- Change to bundled projects import logic to make it easier to unbundle
|
||||
requests downstream.
|
||||
- Changed the default User-Agent string to avoid leaking data on Linux: now
|
||||
contains only the requests version.
|
||||
|
||||
**Bugfixes**
|
||||
|
||||
- The ``json`` parameter to ``post()`` and friends will now only be used if
|
||||
neither ``data`` nor ``files`` are present, consistent with the
|
||||
documentation.
|
||||
- We now ignore empty fields in the ``NO_PROXY`` environment variable.
|
||||
- Fixed problem where ``httplib.BadStatusLine`` would get raised if combining
|
||||
``stream=True`` with ``contextlib.closing``.
|
||||
- Prevented bugs where we would attempt to return the same connection back to
|
||||
the connection pool twice when sending a Chunked body.
|
||||
- Miscellaneous minor internal changes.
|
||||
- Digest Auth support is now thread safe.
|
||||
|
||||
**Updates**
|
||||
|
||||
- Updated urllib3 to 1.12.
|
||||
|
||||
2.7.0 (2015-05-03)
|
||||
++++++++++++++++++
|
||||
|
||||
This is the first release that follows our new release process. For more, see
|
||||
`our documentation
|
||||
<http://docs.python-requests.org/en/latest/community/release-process/>`_.
|
||||
|
||||
**Bugfixes**
|
||||
|
||||
- Updated urllib3 to 1.10.4, resolving several bugs involving chunked transfer
|
||||
encoding and response framing.
|
||||
|
||||
2.6.2 (2015-04-23)
|
||||
++++++++++++++++++
|
||||
|
||||
**Bugfixes**
|
||||
|
||||
- Fix regression where compressed data that was sent as chunked data was not
|
||||
properly decompressed. (#2561)
|
||||
|
||||
2.6.1 (2015-04-22)
|
||||
++++++++++++++++++
|
||||
|
||||
**Bugfixes**
|
||||
|
||||
- Remove VendorAlias import machinery introduced in v2.5.2.
|
||||
|
||||
- Simplify the PreparedRequest.prepare API: We no longer require the user to
|
||||
pass an empty list to the hooks keyword argument. (c.f. #2552)
|
||||
|
||||
- Resolve redirects now receives and forwards all of the original arguments to
|
||||
the adapter. (#2503)
|
||||
|
||||
- Handle UnicodeDecodeErrors when trying to deal with a unicode URL that
|
||||
cannot be encoded in ASCII. (#2540)
|
||||
|
||||
- Populate the parsed path of the URI field when performing Digest
|
||||
Authentication. (#2426)
|
||||
|
||||
- Copy a PreparedRequest's CookieJar more reliably when it is not an instance
|
||||
of RequestsCookieJar. (#2527)
|
||||
|
||||
2.6.0 (2015-03-14)
|
||||
++++++++++++++++++
|
||||
|
||||
**Bugfixes**
|
||||
|
||||
- CVE-2015-2296: Fix handling of cookies on redirect. Previously a cookie
|
||||
without a host value set would use the hostname for the redirected URL
|
||||
exposing requests users to session fixation attacks and potentially cookie
|
||||
stealing. This was disclosed privately by Matthew Daley of
|
||||
`BugFuzz <https://bugfuzz.com>`_. This affects all versions of requests from
|
||||
v2.1.0 to v2.5.3 (inclusive on both ends).
|
||||
|
||||
- Fix error when requests is an ``install_requires`` dependency and ``python
|
||||
setup.py test`` is run. (#2462)
|
||||
|
||||
- Fix error when urllib3 is unbundled and requests continues to use the
|
||||
vendored import location.
|
||||
|
||||
- Include fixes to ``urllib3``'s header handling.
|
||||
|
||||
- Requests' handling of unvendored dependencies is now more restrictive.
|
||||
|
||||
**Features and Improvements**
|
||||
|
||||
- Support bytearrays when passed as parameters in the ``files`` argument.
|
||||
(#2468)
|
||||
|
||||
- Avoid data duplication when creating a request with ``str``, ``bytes``, or
|
||||
``bytearray`` input to the ``files`` argument.
|
||||
|
||||
2.5.3 (2015-02-24)
|
||||
++++++++++++++++++
|
||||
|
||||
**Bugfixes**
|
||||
|
||||
- Revert changes to our vendored certificate bundle. For more context see
|
||||
(#2455, #2456, and http://bugs.python.org/issue23476)
|
||||
|
||||
2.5.2 (2015-02-23)
|
||||
++++++++++++++++++
|
||||
|
||||
**Features and Improvements**
|
||||
|
||||
- Add sha256 fingerprint support. (`shazow/urllib3#540`_)
|
||||
|
||||
- Improve the performance of headers. (`shazow/urllib3#544`_)
|
||||
|
||||
**Bugfixes**
|
||||
|
||||
- Copy pip's import machinery. When downstream redistributors remove
|
||||
requests.packages.urllib3 the import machinery will continue to let those
|
||||
same symbols work. Example usage in requests' documentation and 3rd-party
|
||||
libraries relying on the vendored copies of urllib3 will work without having
|
||||
to fallback to the system urllib3.
|
||||
|
||||
- Attempt to quote parts of the URL on redirect if unquoting and then quoting
|
||||
fails. (#2356)
|
||||
|
||||
- Fix filename type check for multipart form-data uploads. (#2411)
|
||||
|
||||
- Properly handle the case where a server issuing digest authentication
|
||||
challenges provides both auth and auth-int qop-values. (#2408)
|
||||
|
||||
- Fix a socket leak. (`shazow/urllib3#549`_)
|
||||
|
||||
- Fix multiple ``Set-Cookie`` headers properly. (`shazow/urllib3#534`_)
|
||||
|
||||
- Disable the built-in hostname verification. (`shazow/urllib3#526`_)
|
||||
|
||||
- Fix the behaviour of decoding an exhausted stream. (`shazow/urllib3#535`_)
|
||||
|
||||
**Security**
|
||||
|
||||
- Pulled in an updated ``cacert.pem``.
|
||||
|
||||
- Drop RC4 from the default cipher list. (`shazow/urllib3#551`_)
|
||||
|
||||
.. _shazow/urllib3#551: https://github.com/shazow/urllib3/pull/551
|
||||
.. _shazow/urllib3#549: https://github.com/shazow/urllib3/pull/549
|
||||
.. _shazow/urllib3#544: https://github.com/shazow/urllib3/pull/544
|
||||
.. _shazow/urllib3#540: https://github.com/shazow/urllib3/pull/540
|
||||
.. _shazow/urllib3#535: https://github.com/shazow/urllib3/pull/535
|
||||
.. _shazow/urllib3#534: https://github.com/shazow/urllib3/pull/534
|
||||
.. _shazow/urllib3#526: https://github.com/shazow/urllib3/pull/526
|
||||
|
||||
2.5.1 (2014-12-23)
|
||||
++++++++++++++++++
|
||||
|
||||
@@ -130,7 +357,7 @@ Description: Requests: HTTP for Humans
|
||||
**Bugfixes**
|
||||
|
||||
- Only parse the URL once (#2353)
|
||||
- Allow Content-Length header to always be overriden (#2332)
|
||||
- Allow Content-Length header to always be overridden (#2332)
|
||||
- Properly handle files in HTTPDigestAuth (#2333)
|
||||
- Cap redirect_cache size to prevent memory abuse (#2299)
|
||||
- Fix HTTPDigestAuth handling of redirects after authenticating successfully
|
||||
@@ -198,7 +425,7 @@ Description: Requests: HTTP for Humans
|
||||
- Support for connect timeouts! Timeout now accepts a tuple (connect, read) which is used to set individual connect and read timeouts.
|
||||
- Allow copying of PreparedRequests without headers/cookies.
|
||||
- Updated bundled urllib3 version.
|
||||
- Refactored settings loading from environment — new `Session.merge_environment_settings`.
|
||||
- Refactored settings loading from environment -- new `Session.merge_environment_settings`.
|
||||
- Handle socket errors in iter_content.
|
||||
|
||||
|
||||
@@ -442,7 +669,7 @@ Description: Requests: HTTP for Humans
|
||||
- Improved mime-compatible JSON handling
|
||||
- Proxy fixes
|
||||
- Path hack fixes
|
||||
- Case-Insensistive Content-Encoding headers
|
||||
- Case-Insensitive Content-Encoding headers
|
||||
- Support for CJK parameters in form posts
|
||||
|
||||
|
||||
@@ -478,7 +705,7 @@ Description: Requests: HTTP for Humans
|
||||
- Digest Authentication improvements.
|
||||
- Ensure proxy exclusions work properly.
|
||||
- Clearer UnicodeError exceptions.
|
||||
- Automatic casting of URLs to tsrings (fURL and such)
|
||||
- Automatic casting of URLs to strings (fURL and such)
|
||||
- Bugfixes.
|
||||
|
||||
0.13.6 (2012-08-06)
|
||||
@@ -529,8 +756,8 @@ Description: Requests: HTTP for Humans
|
||||
+++++++++++++++++++
|
||||
|
||||
- Removal of Requests.async in favor of `grequests <https://github.com/kennethreitz/grequests>`_
|
||||
- Allow disabling of cookie persistiance.
|
||||
- New implimentation of safe_mode
|
||||
- Allow disabling of cookie persistence.
|
||||
- New implementation of safe_mode
|
||||
- cookies.get now supports default argument
|
||||
- Session cookies not saved when Session.request is called with return_response=False
|
||||
- Env: no_proxy support.
|
||||
@@ -647,7 +874,7 @@ Description: Requests: HTTP for Humans
|
||||
|
||||
* ``Response.content`` is now bytes-only. (*Backwards Incompatible*)
|
||||
* New ``Response.text`` is unicode-only.
|
||||
* If no ``Response.encoding`` is specified and ``chardet`` is available, ``Respoonse.text`` will guess an encoding.
|
||||
* If no ``Response.encoding`` is specified and ``chardet`` is available, ``Response.text`` will guess an encoding.
|
||||
* Default to ISO-8859-1 (Western) encoding for "text" subtypes.
|
||||
* Removal of `decode_unicode`. (*Backwards Incompatible*)
|
||||
* New multiple-hooks system.
|
||||
@@ -767,7 +994,7 @@ Description: Requests: HTTP for Humans
|
||||
0.7.5 (2011-11-04)
|
||||
++++++++++++++++++
|
||||
|
||||
* Response.content = None if there was an invalid repsonse.
|
||||
* Response.content = None if there was an invalid response.
|
||||
* Redirection auth handling.
|
||||
|
||||
0.7.4 (2011-10-26)
|
||||
@@ -854,7 +1081,7 @@ Description: Requests: HTTP for Humans
|
||||
++++++++++++++++++
|
||||
|
||||
* New callback hook system
|
||||
* New persistient sessions object and context manager
|
||||
* New persistent sessions object and context manager
|
||||
* Transparent Dict-cookie handling
|
||||
* Status code reference object
|
||||
* Removed Response.cached
|
||||
@@ -888,7 +1115,7 @@ Description: Requests: HTTP for Humans
|
||||
* Redirect Fixes
|
||||
* settings.verbose stream writing
|
||||
* Querystrings for all methods
|
||||
* URLErrors (Connection Refused, Timeout, Invalid URLs) are treated as explicity raised
|
||||
* URLErrors (Connection Refused, Timeout, Invalid URLs) are treated as explicitly raised
|
||||
``r.requests.get('hwe://blah'); r.raise_for_status()``
|
||||
|
||||
|
||||
@@ -1004,8 +1231,8 @@ Classifier: Intended Audience :: Developers
|
||||
Classifier: Natural Language :: English
|
||||
Classifier: License :: OSI Approved :: Apache Software License
|
||||
Classifier: Programming Language :: Python
|
||||
Classifier: Programming Language :: Python :: 2.6
|
||||
Classifier: Programming Language :: Python :: 2.7
|
||||
Classifier: Programming Language :: Python :: 3
|
||||
Classifier: Programming Language :: Python :: 3.3
|
||||
Classifier: Programming Language :: Python :: 3.4
|
||||
Classifier: Programming Language :: Python :: 3.5
|
||||
|
||||
@@ -1,11 +1,13 @@
|
||||
Requests: HTTP for Humans
|
||||
=========================
|
||||
|
||||
.. image:: https://badge.fury.io/py/requests.png
|
||||
:target: http://badge.fury.io/py/requests
|
||||
.. image:: https://img.shields.io/pypi/v/requests.svg
|
||||
:target: https://pypi.python.org/pypi/requests
|
||||
|
||||
.. image:: https://img.shields.io/pypi/dm/requests.svg
|
||||
:target: https://pypi.python.org/pypi/requests
|
||||
|
||||
|
||||
.. image:: https://pypip.in/d/requests/badge.png
|
||||
:target: https://crate.io/packages/requests/
|
||||
|
||||
|
||||
Requests is an Apache2 Licensed HTTP library, written in Python, for human
|
||||
@@ -75,7 +77,6 @@ Contribute
|
||||
----------
|
||||
|
||||
#. Check for open issues or open a fresh issue to start a discussion around a feature idea or a bug. There is a `Contributor Friendly`_ tag for issues that should be ideal for people who are not very familiar with the codebase yet.
|
||||
#. If you feel uncomfortable or uncertain about an issue or your changes, feel free to email @sigmavirus24 and he will happily help you via email, Skype, remote pairing or whatever you are comfortable with.
|
||||
#. Fork `the repository`_ on GitHub to start making your changes to the **master** branch (or branch off of it).
|
||||
#. Write a test which shows that the bug was fixed or that the feature works as expected.
|
||||
#. Send a pull request and bug the maintainer until it gets merged and published. :) Make sure to add yourself to AUTHORS_.
|
||||
|
||||
@@ -78,6 +78,7 @@ requests/packages/urllib3/poolmanager.py
|
||||
requests/packages/urllib3/request.py
|
||||
requests/packages/urllib3/response.py
|
||||
requests/packages/urllib3/contrib/__init__.py
|
||||
requests/packages/urllib3/contrib/appengine.py
|
||||
requests/packages/urllib3/contrib/ntlmpool.py
|
||||
requests/packages/urllib3/contrib/pyopenssl.py
|
||||
requests/packages/urllib3/packages/__init__.py
|
||||
|
||||
@@ -1,5 +1,5 @@
|
||||
|
||||
[security]
|
||||
pyOpenSSL
|
||||
pyOpenSSL>=0.13
|
||||
ndg-httpsclient
|
||||
pyasn1
|
||||
|
||||
@@ -6,7 +6,7 @@
|
||||
# /
|
||||
|
||||
"""
|
||||
requests HTTP library
|
||||
Requests HTTP library
|
||||
~~~~~~~~~~~~~~~~~~~~~
|
||||
|
||||
Requests is an HTTP library, written in Python, for human beings. Basic GET
|
||||
@@ -36,17 +36,17 @@ usage:
|
||||
The other HTTP methods are supported - see `requests.api`. Full documentation
|
||||
is at <http://python-requests.org>.
|
||||
|
||||
:copyright: (c) 2014 by Kenneth Reitz.
|
||||
:copyright: (c) 2015 by Kenneth Reitz.
|
||||
:license: Apache 2.0, see LICENSE for more details.
|
||||
|
||||
"""
|
||||
|
||||
__title__ = 'requests'
|
||||
__version__ = '2.5.1'
|
||||
__build__ = 0x020501
|
||||
__version__ = '2.9.1'
|
||||
__build__ = 0x020901
|
||||
__author__ = 'Kenneth Reitz'
|
||||
__license__ = 'Apache 2.0'
|
||||
__copyright__ = 'Copyright 2014 Kenneth Reitz'
|
||||
__copyright__ = 'Copyright 2015 Kenneth Reitz'
|
||||
|
||||
# Attempt to enable urllib3's SNI support, if possible
|
||||
try:
|
||||
@@ -62,7 +62,8 @@ from .sessions import session, Session
|
||||
from .status_codes import codes
|
||||
from .exceptions import (
|
||||
RequestException, Timeout, URLRequired,
|
||||
TooManyRedirects, HTTPError, ConnectionError
|
||||
TooManyRedirects, HTTPError, ConnectionError,
|
||||
FileModeWarning,
|
||||
)
|
||||
|
||||
# Set default logging handler to avoid "No handler found" warnings.
|
||||
@@ -75,3 +76,8 @@ except ImportError:
|
||||
pass
|
||||
|
||||
logging.getLogger(__name__).addHandler(NullHandler())
|
||||
|
||||
import warnings
|
||||
|
||||
# FileModeWarnings go off per the default.
|
||||
warnings.simplefilter('default', FileModeWarning, append=True)
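As an aside (illustrative, not part of the diff), the warning filter registered above fires when a file opened in text mode is uploaded; a sketch of the warning-free pattern, with made-up path and URL::

    import requests

    # Open upload files in binary mode to avoid FileModeWarning (2.9.0+).
    with open('report.pdf', 'rb') as fh:
        response = requests.post('https://example.org/upload', files={'file': fh})
    print(response.status_code)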
|
||||
|
||||
@@ -8,20 +8,24 @@ This module contains the transport adapters that Requests uses to define
|
||||
and maintain connections.
|
||||
"""
|
||||
|
||||
import os.path
|
||||
import socket
|
||||
|
||||
from .models import Response
|
||||
from .packages.urllib3 import Retry
|
||||
from .packages.urllib3.poolmanager import PoolManager, proxy_from_url
|
||||
from .packages.urllib3.response import HTTPResponse
|
||||
from .packages.urllib3.util import Timeout as TimeoutSauce
|
||||
from .packages.urllib3.util.retry import Retry
|
||||
from .compat import urlparse, basestring
|
||||
from .utils import (DEFAULT_CA_BUNDLE_PATH, get_encoding_from_headers,
|
||||
prepend_scheme_if_needed, get_auth_from_url, urldefragauth)
|
||||
prepend_scheme_if_needed, get_auth_from_url, urldefragauth,
|
||||
select_proxy)
|
||||
from .structures import CaseInsensitiveDict
|
||||
from .packages.urllib3.exceptions import ClosedPoolError
|
||||
from .packages.urllib3.exceptions import ConnectTimeoutError
|
||||
from .packages.urllib3.exceptions import HTTPError as _HTTPError
|
||||
from .packages.urllib3.exceptions import MaxRetryError
|
||||
from .packages.urllib3.exceptions import NewConnectionError
|
||||
from .packages.urllib3.exceptions import ProxyError as _ProxyError
|
||||
from .packages.urllib3.exceptions import ProtocolError
|
||||
from .packages.urllib3.exceptions import ReadTimeoutError
|
||||
@@ -35,6 +39,7 @@ from .auth import _basic_auth_str
|
||||
DEFAULT_POOLBLOCK = False
|
||||
DEFAULT_POOLSIZE = 10
|
||||
DEFAULT_RETRIES = 0
|
||||
DEFAULT_POOL_TIMEOUT = None
|
||||
|
||||
|
||||
class BaseAdapter(object):
|
||||
@@ -103,7 +108,7 @@ class HTTPAdapter(BaseAdapter):
|
||||
|
||||
def __setstate__(self, state):
|
||||
# Can't handle by adding 'proxy_manager' to self.__attrs__ because
|
||||
# because self.poolmanager uses a lambda function, which isn't pickleable.
|
||||
# self.poolmanager uses a lambda function, which isn't pickleable.
|
||||
self.proxy_manager = {}
|
||||
self.config = {}
|
||||
|
||||
@@ -181,10 +186,15 @@ class HTTPAdapter(BaseAdapter):
|
||||
raise Exception("Could not find a suitable SSL CA certificate bundle.")
|
||||
|
||||
conn.cert_reqs = 'CERT_REQUIRED'
|
||||
conn.ca_certs = cert_loc
|
||||
|
||||
if not os.path.isdir(cert_loc):
|
||||
conn.ca_certs = cert_loc
|
||||
else:
|
||||
conn.ca_cert_dir = cert_loc
|
||||
else:
|
||||
conn.cert_reqs = 'CERT_NONE'
|
||||
conn.ca_certs = None
|
||||
conn.ca_cert_dir = None
|
||||
|
||||
if cert:
|
||||
if not isinstance(cert, basestring):
|
||||
@@ -237,8 +247,7 @@ class HTTPAdapter(BaseAdapter):
|
||||
:param url: The URL to connect to.
|
||||
:param proxies: (optional) A Requests-style dictionary of proxies used on this request.
|
||||
"""
|
||||
proxies = proxies or {}
|
||||
proxy = proxies.get(urlparse(url.lower()).scheme)
|
||||
proxy = select_proxy(url, proxies)
|
||||
|
||||
if proxy:
|
||||
proxy = prepend_scheme_if_needed(proxy, 'http')
|
||||
@@ -271,12 +280,10 @@ class HTTPAdapter(BaseAdapter):
|
||||
:class:`HTTPAdapter <requests.adapters.HTTPAdapter>`.
|
||||
|
||||
:param request: The :class:`PreparedRequest <PreparedRequest>` being sent.
|
||||
:param proxies: A dictionary of schemes to proxy URLs.
|
||||
:param proxies: A dictionary of schemes or schemes and hosts to proxy URLs.
|
||||
"""
|
||||
proxies = proxies or {}
|
||||
proxy = select_proxy(request.url, proxies)
|
||||
scheme = urlparse(request.url).scheme
|
||||
proxy = proxies.get(scheme)
|
||||
|
||||
if proxy and scheme != 'https':
|
||||
url = urldefragauth(request.url)
|
||||
else:
|
||||
@@ -309,7 +316,6 @@ class HTTPAdapter(BaseAdapter):
|
||||
:class:`HTTPAdapter <requests.adapters.HTTPAdapter>`.
|
||||
|
||||
:param proxies: The url of the proxy being used for this request.
|
||||
:param kwargs: Optional additional keyword arguments.
|
||||
"""
|
||||
headers = {}
|
||||
username, password = get_auth_from_url(proxy)
|
||||
@@ -326,8 +332,8 @@ class HTTPAdapter(BaseAdapter):
|
||||
:param request: The :class:`PreparedRequest <PreparedRequest>` being sent.
|
||||
:param stream: (optional) Whether to stream the request content.
|
||||
:param timeout: (optional) How long to wait for the server to send
|
||||
data before giving up, as a float, or a (`connect timeout, read
|
||||
timeout <user/advanced.html#timeouts>`_) tuple.
|
||||
data before giving up, as a float, or a :ref:`(connect timeout,
|
||||
read timeout) <timeouts>` tuple.
|
||||
:type timeout: float or tuple
|
||||
:param verify: (optional) Whether to verify SSL certificates.
|
||||
:param cert: (optional) Any user-provided SSL certificate to be trusted.
|
||||
@@ -375,7 +381,7 @@ class HTTPAdapter(BaseAdapter):
|
||||
if hasattr(conn, 'proxy_pool'):
|
||||
conn = conn.proxy_pool
|
||||
|
||||
low_conn = conn._get_conn(timeout=timeout)
|
||||
low_conn = conn._get_conn(timeout=DEFAULT_POOL_TIMEOUT)
|
||||
|
||||
try:
|
||||
low_conn.putrequest(request.method,
|
||||
@@ -394,7 +400,15 @@ class HTTPAdapter(BaseAdapter):
|
||||
low_conn.send(b'\r\n')
|
||||
low_conn.send(b'0\r\n\r\n')
|
||||
|
||||
r = low_conn.getresponse()
|
||||
# Receive the response from the server
|
||||
try:
|
||||
# For Python 2.7+ versions, use buffering of HTTP
|
||||
# responses
|
||||
r = low_conn.getresponse(buffering=True)
|
||||
except TypeError:
|
||||
# For compatibility with Python 2.6 versions and back
|
||||
r = low_conn.getresponse()
|
||||
|
||||
resp = HTTPResponse.from_httplib(
|
||||
r,
|
||||
pool=conn,
|
||||
@@ -407,22 +421,24 @@ class HTTPAdapter(BaseAdapter):
|
||||
# Then, reraise so that we can handle the actual exception.
|
||||
low_conn.close()
|
||||
raise
|
||||
else:
|
||||
# All is well, return the connection to the pool.
|
||||
conn._put_conn(low_conn)
|
||||
|
||||
except (ProtocolError, socket.error) as err:
|
||||
raise ConnectionError(err, request=request)
|
||||
|
||||
except MaxRetryError as e:
|
||||
if isinstance(e.reason, ConnectTimeoutError):
|
||||
raise ConnectTimeout(e, request=request)
|
||||
# TODO: Remove this in 3.0.0: see #2811
|
||||
if not isinstance(e.reason, NewConnectionError):
|
||||
raise ConnectTimeout(e, request=request)
|
||||
|
||||
if isinstance(e.reason, ResponseError):
|
||||
raise RetryError(e, request=request)
|
||||
|
||||
raise ConnectionError(e, request=request)
|
||||
|
||||
except ClosedPoolError as e:
|
||||
raise ConnectionError(e, request=request)
|
||||
|
||||
except _ProxyError as e:
|
||||
raise ProxyError(e)
|
||||
|
||||
|
||||
@@ -16,7 +16,6 @@ from . import sessions
|
||||
|
||||
def request(method, url, **kwargs):
|
||||
"""Constructs and sends a :class:`Request <Request>`.
|
||||
Returns :class:`Response <Response>` object.
|
||||
|
||||
:param method: method for the new :class:`Request` object.
|
||||
:param url: URL for the new :class:`Request` object.
|
||||
@@ -28,15 +27,17 @@ def request(method, url, **kwargs):
|
||||
:param files: (optional) Dictionary of ``'name': file-like-objects`` (or ``{'name': ('filename', fileobj)}``) for multipart encoding upload.
|
||||
:param auth: (optional) Auth tuple to enable Basic/Digest/Custom HTTP Auth.
|
||||
:param timeout: (optional) How long to wait for the server to send data
|
||||
before giving up, as a float, or a (`connect timeout, read timeout
|
||||
<user/advanced.html#timeouts>`_) tuple.
|
||||
before giving up, as a float, or a :ref:`(connect timeout, read
|
||||
timeout) <timeouts>` tuple.
|
||||
:type timeout: float or tuple
|
||||
:param allow_redirects: (optional) Boolean. Set to True if POST/PUT/DELETE redirect following is allowed.
|
||||
:type allow_redirects: bool
|
||||
:param proxies: (optional) Dictionary mapping protocol to the URL of the proxy.
|
||||
:param verify: (optional) if ``True``, the SSL cert will be verified. A CA_BUNDLE path can also be provided.
|
||||
:param verify: (optional) whether the SSL cert will be verified. A CA_BUNDLE path can also be provided. Defaults to ``True``.
|
||||
:param stream: (optional) if ``False``, the response content will be immediately downloaded.
|
||||
:param cert: (optional) if String, path to ssl client cert file (.pem). If Tuple, ('cert', 'key') pair.
|
||||
:return: :class:`Response <Response>` object
|
||||
:rtype: requests.Response
|
||||
|
||||
Usage::
|
||||
|
||||
@@ -45,31 +46,34 @@ def request(method, url, **kwargs):
|
||||
<Response [200]>
|
||||
"""
|
||||
|
||||
session = sessions.Session()
|
||||
response = session.request(method=method, url=url, **kwargs)
|
||||
# By explicitly closing the session, we avoid leaving sockets open which
|
||||
# can trigger a ResourceWarning in some cases, and look like a memory leak
|
||||
# in others.
|
||||
session.close()
|
||||
return response
|
||||
# By using the 'with' statement we are sure the session is closed, thus we
|
||||
# avoid leaving sockets open which can trigger a ResourceWarning in some
|
||||
# cases, and look like a memory leak in others.
|
||||
with sessions.Session() as session:
|
||||
return session.request(method=method, url=url, **kwargs)
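User code that wants the same guarantee can use the session as a context manager as well; a minimal sketch::

    import requests

    # The connection pool is closed when the block exits, even if the
    # request inside it raises.
    with requests.Session() as session:
        response = session.get('https://example.org/')
        print(response.status_code)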
def get(url, **kwargs):
|
||||
"""Sends a GET request. Returns :class:`Response` object.
|
||||
def get(url, params=None, **kwargs):
|
||||
"""Sends a GET request.
|
||||
|
||||
:param url: URL for the new :class:`Request` object.
|
||||
:param params: (optional) Dictionary or bytes to be sent in the query string for the :class:`Request`.
|
||||
:param \*\*kwargs: Optional arguments that ``request`` takes.
|
||||
:return: :class:`Response <Response>` object
|
||||
:rtype: requests.Response
|
||||
"""
|
||||
|
||||
kwargs.setdefault('allow_redirects', True)
|
||||
return request('get', url, **kwargs)
|
||||
return request('get', url, params=params, **kwargs)
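With ``params`` now a named argument, a query string can be passed straight through; an illustrative call::

    import requests

    # Encoded as https://example.org/search?q=requests&page=2
    r = requests.get('https://example.org/search',
                     params={'q': 'requests', 'page': 2})
    print(r.url)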
def options(url, **kwargs):
|
||||
"""Sends a OPTIONS request. Returns :class:`Response` object.
|
||||
"""Sends a OPTIONS request.
|
||||
|
||||
:param url: URL for the new :class:`Request` object.
|
||||
:param \*\*kwargs: Optional arguments that ``request`` takes.
|
||||
:return: :class:`Response <Response>` object
|
||||
:rtype: requests.Response
|
||||
"""
|
||||
|
||||
kwargs.setdefault('allow_redirects', True)
|
||||
@@ -77,10 +81,12 @@ def options(url, **kwargs):
|
||||
|
||||
|
||||
def head(url, **kwargs):
|
||||
"""Sends a HEAD request. Returns :class:`Response` object.
|
||||
"""Sends a HEAD request.
|
||||
|
||||
:param url: URL for the new :class:`Request` object.
|
||||
:param \*\*kwargs: Optional arguments that ``request`` takes.
|
||||
:return: :class:`Response <Response>` object
|
||||
:rtype: requests.Response
|
||||
"""
|
||||
|
||||
kwargs.setdefault('allow_redirects', False)
|
||||
@@ -88,44 +94,52 @@ def head(url, **kwargs):
|
||||
|
||||
|
||||
def post(url, data=None, json=None, **kwargs):
|
||||
"""Sends a POST request. Returns :class:`Response` object.
|
||||
"""Sends a POST request.
|
||||
|
||||
:param url: URL for the new :class:`Request` object.
|
||||
:param data: (optional) Dictionary, bytes, or file-like object to send in the body of the :class:`Request`.
|
||||
:param json: (optional) json data to send in the body of the :class:`Request`.
|
||||
:param \*\*kwargs: Optional arguments that ``request`` takes.
|
||||
:return: :class:`Response <Response>` object
|
||||
:rtype: requests.Response
|
||||
"""
|
||||
|
||||
return request('post', url, data=data, json=json, **kwargs)
|
||||
|
||||
|
||||
def put(url, data=None, **kwargs):
|
||||
"""Sends a PUT request. Returns :class:`Response` object.
|
||||
"""Sends a PUT request.
|
||||
|
||||
:param url: URL for the new :class:`Request` object.
|
||||
:param data: (optional) Dictionary, bytes, or file-like object to send in the body of the :class:`Request`.
|
||||
:param \*\*kwargs: Optional arguments that ``request`` takes.
|
||||
:return: :class:`Response <Response>` object
|
||||
:rtype: requests.Response
|
||||
"""
|
||||
|
||||
return request('put', url, data=data, **kwargs)
|
||||
|
||||
|
||||
def patch(url, data=None, **kwargs):
|
||||
"""Sends a PATCH request. Returns :class:`Response` object.
|
||||
"""Sends a PATCH request.
|
||||
|
||||
:param url: URL for the new :class:`Request` object.
|
||||
:param data: (optional) Dictionary, bytes, or file-like object to send in the body of the :class:`Request`.
|
||||
:param \*\*kwargs: Optional arguments that ``request`` takes.
|
||||
:return: :class:`Response <Response>` object
|
||||
:rtype: requests.Response
|
||||
"""
|
||||
|
||||
return request('patch', url, data=data, **kwargs)
|
||||
|
||||
|
||||
def delete(url, **kwargs):
|
||||
"""Sends a DELETE request. Returns :class:`Response` object.
|
||||
"""Sends a DELETE request.
|
||||
|
||||
:param url: URL for the new :class:`Request` object.
|
||||
:param \*\*kwargs: Optional arguments that ``request`` takes.
|
||||
:return: :class:`Response <Response>` object
|
||||
:rtype: requests.Response
|
||||
"""
|
||||
|
||||
return request('delete', url, **kwargs)
|
||||
|
||||
@@ -11,6 +11,7 @@ import os
|
||||
import re
|
||||
import time
|
||||
import hashlib
|
||||
import threading
|
||||
|
||||
from base64 import b64encode
|
||||
|
||||
@@ -63,19 +64,26 @@ class HTTPDigestAuth(AuthBase):
|
||||
def __init__(self, username, password):
|
||||
self.username = username
|
||||
self.password = password
|
||||
self.last_nonce = ''
|
||||
self.nonce_count = 0
|
||||
self.chal = {}
|
||||
self.pos = None
|
||||
self.num_401_calls = 1
|
||||
# Keep state in per-thread local storage
|
||||
self._thread_local = threading.local()
|
||||
|
||||
def init_per_thread_state(self):
|
||||
# Ensure state is initialized just once per-thread
|
||||
if not hasattr(self._thread_local, 'init'):
|
||||
self._thread_local.init = True
|
||||
self._thread_local.last_nonce = ''
|
||||
self._thread_local.nonce_count = 0
|
||||
self._thread_local.chal = {}
|
||||
self._thread_local.pos = None
|
||||
self._thread_local.num_401_calls = None
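Because the mutable handshake state now lives in ``threading.local()``, one ``HTTPDigestAuth`` instance can safely be shared by several threads; a hedged sketch (URL and credentials are placeholders)::

    import threading
    import requests
    from requests.auth import HTTPDigestAuth

    auth = HTTPDigestAuth('user', 'passwd')  # one shared instance

    def fetch():
        # Each thread keeps its own nonce counter and challenge, so
        # concurrent 401 handshakes do not clobber each other.
        r = requests.get('https://example.org/protected', auth=auth)
        print(r.status_code)

    threads = [threading.Thread(target=fetch) for _ in range(4)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()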
def build_digest_header(self, method, url):
|
||||
|
||||
realm = self.chal['realm']
|
||||
nonce = self.chal['nonce']
|
||||
qop = self.chal.get('qop')
|
||||
algorithm = self.chal.get('algorithm')
|
||||
opaque = self.chal.get('opaque')
|
||||
realm = self._thread_local.chal['realm']
|
||||
nonce = self._thread_local.chal['nonce']
|
||||
qop = self._thread_local.chal.get('qop')
|
||||
algorithm = self._thread_local.chal.get('algorithm')
|
||||
opaque = self._thread_local.chal.get('opaque')
|
||||
|
||||
if algorithm is None:
|
||||
_algorithm = 'MD5'
|
||||
@@ -103,7 +111,8 @@ class HTTPDigestAuth(AuthBase):
|
||||
# XXX not implemented yet
|
||||
entdig = None
|
||||
p_parsed = urlparse(url)
|
||||
path = p_parsed.path
|
||||
#: path is request-uri defined in RFC 2616 which should not be empty
|
||||
path = p_parsed.path or "/"
|
||||
if p_parsed.query:
|
||||
path += '?' + p_parsed.query
|
||||
|
||||
@@ -113,30 +122,32 @@ class HTTPDigestAuth(AuthBase):
|
||||
HA1 = hash_utf8(A1)
|
||||
HA2 = hash_utf8(A2)
|
||||
|
||||
if nonce == self.last_nonce:
|
||||
self.nonce_count += 1
|
||||
if nonce == self._thread_local.last_nonce:
|
||||
self._thread_local.nonce_count += 1
|
||||
else:
|
||||
self.nonce_count = 1
|
||||
ncvalue = '%08x' % self.nonce_count
|
||||
s = str(self.nonce_count).encode('utf-8')
|
||||
self._thread_local.nonce_count = 1
|
||||
ncvalue = '%08x' % self._thread_local.nonce_count
|
||||
s = str(self._thread_local.nonce_count).encode('utf-8')
|
||||
s += nonce.encode('utf-8')
|
||||
s += time.ctime().encode('utf-8')
|
||||
s += os.urandom(8)
|
||||
|
||||
cnonce = (hashlib.sha1(s).hexdigest()[:16])
|
||||
noncebit = "%s:%s:%s:%s:%s" % (nonce, ncvalue, cnonce, qop, HA2)
|
||||
if _algorithm == 'MD5-SESS':
|
||||
HA1 = hash_utf8('%s:%s:%s' % (HA1, nonce, cnonce))
|
||||
|
||||
if qop is None:
|
||||
if not qop:
|
||||
respdig = KD(HA1, "%s:%s" % (nonce, HA2))
|
||||
elif qop == 'auth' or 'auth' in qop.split(','):
|
||||
noncebit = "%s:%s:%s:%s:%s" % (
|
||||
nonce, ncvalue, cnonce, 'auth', HA2
|
||||
)
|
||||
respdig = KD(HA1, noncebit)
|
||||
else:
|
||||
# XXX handle auth-int.
|
||||
return None
|
||||
|
||||
self.last_nonce = nonce
|
||||
self._thread_local.last_nonce = nonce
|
||||
|
||||
# XXX should the partial digests be encoded too?
|
||||
base = 'username="%s", realm="%s", nonce="%s", uri="%s", ' \
|
||||
@@ -155,28 +166,27 @@ class HTTPDigestAuth(AuthBase):
|
||||
def handle_redirect(self, r, **kwargs):
|
||||
"""Reset num_401_calls counter on redirects."""
|
||||
if r.is_redirect:
|
||||
self.num_401_calls = 1
|
||||
self._thread_local.num_401_calls = 1
|
||||
|
||||
def handle_401(self, r, **kwargs):
|
||||
"""Takes the given response and tries digest-auth, if needed."""
|
||||
|
||||
if self.pos is not None:
|
||||
if self._thread_local.pos is not None:
|
||||
# Rewind the file position indicator of the body to where
|
||||
# it was to resend the request.
|
||||
r.request.body.seek(self.pos)
|
||||
num_401_calls = getattr(self, 'num_401_calls', 1)
|
||||
r.request.body.seek(self._thread_local.pos)
|
||||
s_auth = r.headers.get('www-authenticate', '')
|
||||
|
||||
if 'digest' in s_auth.lower() and num_401_calls < 2:
|
||||
if 'digest' in s_auth.lower() and self._thread_local.num_401_calls < 2:
|
||||
|
||||
self.num_401_calls += 1
|
||||
self._thread_local.num_401_calls += 1
|
||||
pat = re.compile(r'digest ', flags=re.IGNORECASE)
|
||||
self.chal = parse_dict_header(pat.sub('', s_auth, count=1))
|
||||
self._thread_local.chal = parse_dict_header(pat.sub('', s_auth, count=1))
|
||||
|
||||
# Consume content and release the original connection
|
||||
# to allow our new request to reuse the same one.
|
||||
r.content
|
||||
r.raw.release_conn()
|
||||
r.close()
|
||||
prep = r.request.copy()
|
||||
extract_cookies_to_jar(prep._cookies, r.request, r.raw)
|
||||
prep.prepare_cookies(prep._cookies)
|
||||
@@ -189,21 +199,25 @@ class HTTPDigestAuth(AuthBase):
|
||||
|
||||
return _r
|
||||
|
||||
self.num_401_calls = 1
|
||||
self._thread_local.num_401_calls = 1
|
||||
return r
|
||||
|
||||
def __call__(self, r):
|
||||
# Initialize per-thread state, if needed
|
||||
self.init_per_thread_state()
|
||||
# If we have a saved nonce, skip the 401
|
||||
if self.last_nonce:
|
||||
if self._thread_local.last_nonce:
|
||||
r.headers['Authorization'] = self.build_digest_header(r.method, r.url)
|
||||
try:
|
||||
self.pos = r.body.tell()
|
||||
self._thread_local.pos = r.body.tell()
|
||||
except AttributeError:
|
||||
# In the case of HTTPDigestAuth being reused and the body of
|
||||
# the previous request was a file-like object, pos has the
|
||||
# file position of the previous body. Ensure it's set to
|
||||
# None.
|
||||
self.pos = None
|
||||
self._thread_local.pos = None
|
||||
r.register_hook('response', self.handle_401)
|
||||
r.register_hook('response', self.handle_redirect)
|
||||
self._thread_local.num_401_calls = 1
|
||||
|
||||
return r
|
||||
|
||||
File diff suppressed because it is too large
@@ -21,58 +21,6 @@ is_py2 = (_ver[0] == 2)
|
||||
#: Python 3.x?
|
||||
is_py3 = (_ver[0] == 3)
|
||||
|
||||
#: Python 3.0.x
|
||||
is_py30 = (is_py3 and _ver[1] == 0)
|
||||
|
||||
#: Python 3.1.x
|
||||
is_py31 = (is_py3 and _ver[1] == 1)
|
||||
|
||||
#: Python 3.2.x
|
||||
is_py32 = (is_py3 and _ver[1] == 2)
|
||||
|
||||
#: Python 3.3.x
|
||||
is_py33 = (is_py3 and _ver[1] == 3)
|
||||
|
||||
#: Python 3.4.x
|
||||
is_py34 = (is_py3 and _ver[1] == 4)
|
||||
|
||||
#: Python 2.7.x
|
||||
is_py27 = (is_py2 and _ver[1] == 7)
|
||||
|
||||
#: Python 2.6.x
|
||||
is_py26 = (is_py2 and _ver[1] == 6)
|
||||
|
||||
#: Python 2.5.x
|
||||
is_py25 = (is_py2 and _ver[1] == 5)
|
||||
|
||||
#: Python 2.4.x
|
||||
is_py24 = (is_py2 and _ver[1] == 4) # I'm assuming this is not by choice.
|
||||
|
||||
|
||||
# ---------
|
||||
# Platforms
|
||||
# ---------
|
||||
|
||||
|
||||
# Syntax sugar.
|
||||
_ver = sys.version.lower()
|
||||
|
||||
is_pypy = ('pypy' in _ver)
|
||||
is_jython = ('jython' in _ver)
|
||||
is_ironpython = ('iron' in _ver)
|
||||
|
||||
# Assume CPython, if nothing else.
|
||||
is_cpython = not any((is_pypy, is_jython, is_ironpython))
|
||||
|
||||
# Windows-based system.
|
||||
is_windows = 'win32' in str(sys.platform).lower()
|
||||
|
||||
# Standard Linux 2+ system.
|
||||
is_linux = ('linux' in str(sys.platform).lower())
|
||||
is_osx = ('darwin' in str(sys.platform).lower())
|
||||
is_hpux = ('hpux' in str(sys.platform).lower()) # Complete guess.
|
||||
is_solaris = ('solar==' in str(sys.platform).lower()) # Complete guess.
|
||||
|
||||
try:
|
||||
import simplejson as json
|
||||
except (ImportError, SyntaxError):
|
||||
@@ -99,7 +47,6 @@ if is_py2:
|
||||
basestring = basestring
|
||||
numeric_types = (int, long, float)
|
||||
|
||||
|
||||
elif is_py3:
|
||||
from urllib.parse import urlparse, urlunparse, urljoin, urlsplit, urlencode, quote, unquote, quote_plus, unquote_plus, urldefrag
|
||||
from urllib.request import parse_http_list, getproxies, proxy_bypass
|
||||
|
||||
@@ -6,7 +6,9 @@ Compatibility code to be able to use `cookielib.CookieJar` with requests.
|
||||
requests.utils imports from here, so be careful with imports.
|
||||
"""
|
||||
|
||||
import copy
|
||||
import time
|
||||
import calendar
|
||||
import collections
|
||||
from .compat import cookielib, urlparse, urlunparse, Morsel
|
||||
|
||||
@@ -142,10 +144,13 @@ def remove_cookie_by_name(cookiejar, name, domain=None, path=None):
|
||||
"""
|
||||
clearables = []
|
||||
for cookie in cookiejar:
|
||||
if cookie.name == name:
|
||||
if domain is None or domain == cookie.domain:
|
||||
if path is None or path == cookie.path:
|
||||
clearables.append((cookie.domain, cookie.path, cookie.name))
|
||||
if cookie.name != name:
|
||||
continue
|
||||
if domain is not None and domain != cookie.domain:
|
||||
continue
|
||||
if path is not None and path != cookie.path:
|
||||
continue
|
||||
clearables.append((cookie.domain, cookie.path, cookie.name))
|
||||
|
||||
for domain, path, name in clearables:
|
||||
cookiejar.clear(domain, path, name)
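The early-``continue`` form clears exactly the same cookies as the nested conditions did: those matching the name, plus the domain and path filters when they are given. A short usage sketch::

    from requests.cookies import RequestsCookieJar, remove_cookie_by_name

    jar = RequestsCookieJar()
    jar.set('session', 'abc', domain='example.org', path='/')
    jar.set('session', 'xyz', domain='other.org', path='/')

    # Only the cookie on example.org is cleared; the other survives.
    remove_cookie_by_name(jar, 'session', domain='example.org')
    print(len(jar))  # 1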
@@ -157,26 +162,28 @@ class CookieConflictError(RuntimeError):
|
||||
|
||||
|
||||
class RequestsCookieJar(cookielib.CookieJar, collections.MutableMapping):
|
||||
"""Compatibility class; is a cookielib.CookieJar, but exposes a dict interface.
|
||||
"""Compatibility class; is a cookielib.CookieJar, but exposes a dict
|
||||
interface.
|
||||
|
||||
This is the CookieJar we create by default for requests and sessions that
|
||||
don't specify one, since some clients may expect response.cookies and
|
||||
session.cookies to support dict operations.
|
||||
|
||||
Don't use the dict interface internally; it's just for compatibility with
|
||||
with external client code. All `requests` code should work out of the box
|
||||
with externally provided instances of CookieJar, e.g., LWPCookieJar and
|
||||
FileCookieJar.
|
||||
|
||||
Caution: dictionary operations that are normally O(1) may be O(n).
|
||||
Requests does not use the dict interface internally; it's just for
|
||||
compatibility with external client code. All requests code should work
|
||||
out of the box with externally provided instances of ``CookieJar``, e.g.
|
||||
``LWPCookieJar`` and ``FileCookieJar``.
|
||||
|
||||
Unlike a regular CookieJar, this class is pickleable.
|
||||
"""
|
||||
|
||||
.. warning:: dictionary operations that are normally O(1) may be O(n).
|
||||
"""
def get(self, name, default=None, domain=None, path=None):
|
||||
"""Dict-like get() that also supports optional domain and path args in
|
||||
order to resolve naming collisions from using one cookie jar over
|
||||
multiple domains. Caution: operation is O(n), not O(1)."""
|
||||
multiple domains.
|
||||
|
||||
.. warning:: operation is O(n), not O(1)."""
|
||||
try:
|
||||
return self._find_no_duplicates(name, domain, path)
|
||||
except KeyError:
|
||||
@@ -199,37 +206,38 @@ class RequestsCookieJar(cookielib.CookieJar, collections.MutableMapping):
|
||||
return c
|
||||
|
||||
def iterkeys(self):
|
||||
"""Dict-like iterkeys() that returns an iterator of names of cookies from the jar.
|
||||
See itervalues() and iteritems()."""
|
||||
"""Dict-like iterkeys() that returns an iterator of names of cookies
|
||||
from the jar. See itervalues() and iteritems()."""
|
||||
for cookie in iter(self):
|
||||
yield cookie.name
|
||||
|
||||
def keys(self):
|
||||
"""Dict-like keys() that returns a list of names of cookies from the jar.
|
||||
See values() and items()."""
|
||||
"""Dict-like keys() that returns a list of names of cookies from the
|
||||
jar. See values() and items()."""
|
||||
return list(self.iterkeys())
|
||||
|
||||
def itervalues(self):
|
||||
"""Dict-like itervalues() that returns an iterator of values of cookies from the jar.
|
||||
See iterkeys() and iteritems()."""
|
||||
"""Dict-like itervalues() that returns an iterator of values of cookies
|
||||
from the jar. See iterkeys() and iteritems()."""
|
||||
for cookie in iter(self):
|
||||
yield cookie.value
|
||||
|
||||
def values(self):
|
||||
"""Dict-like values() that returns a list of values of cookies from the jar.
|
||||
See keys() and items()."""
|
||||
"""Dict-like values() that returns a list of values of cookies from the
|
||||
jar. See keys() and items()."""
|
||||
return list(self.itervalues())
|
||||
|
||||
def iteritems(self):
|
||||
"""Dict-like iteritems() that returns an iterator of name-value tuples from the jar.
|
||||
See iterkeys() and itervalues()."""
|
||||
"""Dict-like iteritems() that returns an iterator of name-value tuples
|
||||
from the jar. See iterkeys() and itervalues()."""
|
||||
for cookie in iter(self):
|
||||
yield cookie.name, cookie.value
|
||||
|
||||
def items(self):
|
||||
"""Dict-like items() that returns a list of name-value tuples from the jar.
|
||||
See keys() and values(). Allows client-code to call "dict(RequestsCookieJar)
|
||||
and get a vanilla python dict of key value pairs."""
|
||||
"""Dict-like items() that returns a list of name-value tuples from the
|
||||
jar. See keys() and values(). Allows client-code to call
|
||||
``dict(RequestsCookieJar)`` and get a vanilla python dict of key value
|
||||
pairs."""
|
||||
return list(self.iteritems())
|
||||
|
||||
def list_domains(self):
|
||||
@@ -259,8 +267,9 @@ class RequestsCookieJar(cookielib.CookieJar, collections.MutableMapping):
|
||||
return False # there is only one domain in jar
|
||||
|
||||
def get_dict(self, domain=None, path=None):
|
||||
"""Takes as an argument an optional domain and path and returns a plain old
|
||||
Python dict of name-value pairs of cookies that meet the requirements."""
|
||||
"""Takes as an argument an optional domain and path and returns a plain
|
||||
old Python dict of name-value pairs of cookies that meet the
|
||||
requirements."""
|
||||
dictionary = {}
|
||||
for cookie in iter(self):
|
||||
if (domain is None or cookie.domain == domain) and (path is None
|
||||
@@ -269,21 +278,24 @@ class RequestsCookieJar(cookielib.CookieJar, collections.MutableMapping):
|
||||
return dictionary
|
||||
|
||||
def __getitem__(self, name):
|
||||
"""Dict-like __getitem__() for compatibility with client code. Throws exception
|
||||
if there are more than one cookie with name. In that case, use the more
|
||||
explicit get() method instead. Caution: operation is O(n), not O(1)."""
|
||||
"""Dict-like __getitem__() for compatibility with client code. Throws
|
||||
exception if there are more than one cookie with name. In that case,
|
||||
use the more explicit get() method instead.
|
||||
|
||||
.. warning:: operation is O(n), not O(1)."""
|
||||
|
||||
return self._find_no_duplicates(name)
|
||||
|
||||
def __setitem__(self, name, value):
|
||||
"""Dict-like __setitem__ for compatibility with client code. Throws exception
|
||||
if there is already a cookie of that name in the jar. In that case, use the more
|
||||
explicit set() method instead."""
|
||||
"""Dict-like __setitem__ for compatibility with client code. Throws
|
||||
exception if there is already a cookie of that name in the jar. In that
|
||||
case, use the more explicit set() method instead."""
|
||||
|
||||
self.set(name, value)
|
||||
|
||||
def __delitem__(self, name):
|
||||
"""Deletes a cookie given a name. Wraps cookielib.CookieJar's remove_cookie_by_name()."""
|
||||
"""Deletes a cookie given a name. Wraps ``cookielib.CookieJar``'s
|
||||
``remove_cookie_by_name()``."""
|
||||
remove_cookie_by_name(self, name)
|
||||
|
||||
def set_cookie(self, cookie, *args, **kwargs):
|
||||
@@ -295,15 +307,16 @@ class RequestsCookieJar(cookielib.CookieJar, collections.MutableMapping):
|
||||
"""Updates this jar with cookies from another CookieJar or dict-like"""
|
||||
if isinstance(other, cookielib.CookieJar):
|
||||
for cookie in other:
|
||||
self.set_cookie(cookie)
|
||||
self.set_cookie(copy.copy(cookie))
|
||||
else:
|
||||
super(RequestsCookieJar, self).update(other)
|
||||
|
||||
def _find(self, name, domain=None, path=None):
|
||||
"""Requests uses this method internally to get cookie values. Takes as args name
|
||||
and optional domain and path. Returns a cookie.value. If there are conflicting cookies,
|
||||
_find arbitrarily chooses one. See _find_no_duplicates if you want an exception thrown
|
||||
if there are conflicting cookies."""
|
||||
"""Requests uses this method internally to get cookie values. Takes as
|
||||
args name and optional domain and path. Returns a cookie.value. If
|
||||
there are conflicting cookies, _find arbitrarily chooses one. See
|
||||
_find_no_duplicates if you want an exception thrown if there are
|
||||
conflicting cookies."""
|
||||
for cookie in iter(self):
|
||||
if cookie.name == name:
|
||||
if domain is None or cookie.domain == domain:
|
||||
@@ -313,10 +326,11 @@ class RequestsCookieJar(cookielib.CookieJar, collections.MutableMapping):
|
||||
raise KeyError('name=%r, domain=%r, path=%r' % (name, domain, path))
|
||||
|
||||
def _find_no_duplicates(self, name, domain=None, path=None):
|
||||
"""__get_item__ and get call _find_no_duplicates -- never used in Requests internally.
|
||||
Takes as args name and optional domain and path. Returns a cookie.value.
|
||||
Throws KeyError if cookie is not found and CookieConflictError if there are
|
||||
multiple cookies that match name and optionally domain and path."""
|
||||
"""Both ``__get_item__`` and ``get`` call this function: it's never
|
||||
used elsewhere in Requests. Takes as args name and optional domain and
|
||||
path. Returns a cookie.value. Throws KeyError if cookie is not found
|
||||
and CookieConflictError if there are multiple cookies that match name
|
||||
and optionally domain and path."""
|
||||
toReturn = None
|
||||
for cookie in iter(self):
|
||||
if cookie.name == name:
|
||||
@@ -350,6 +364,21 @@ class RequestsCookieJar(cookielib.CookieJar, collections.MutableMapping):
|
||||
return new_cj
|
||||
|
||||
|
||||
def _copy_cookie_jar(jar):
|
||||
if jar is None:
|
||||
return None
|
||||
|
||||
if hasattr(jar, 'copy'):
|
||||
# We're dealing with an instance of RequestsCookieJar
|
||||
return jar.copy()
|
||||
# We're dealing with a generic CookieJar instance
|
||||
new_jar = copy.copy(jar)
|
||||
new_jar.clear()
|
||||
for cookie in jar:
|
||||
new_jar.set_cookie(copy.copy(cookie))
|
||||
return new_jar
|
||||
|
||||
|
||||
def create_cookie(name, value, **kwargs):
|
||||
"""Make a cookie from underspecified parameters.
|
||||
|
||||
@@ -390,11 +419,15 @@ def morsel_to_cookie(morsel):
|
||||
|
||||
expires = None
|
||||
if morsel['max-age']:
|
||||
expires = time.time() + morsel['max-age']
|
||||
try:
|
||||
expires = int(time.time() + int(morsel['max-age']))
|
||||
except ValueError:
|
||||
raise TypeError('max-age: %s must be integer' % morsel['max-age'])
|
||||
elif morsel['expires']:
|
||||
time_template = '%a, %d-%b-%Y %H:%M:%S GMT'
|
||||
expires = time.mktime(
|
||||
time.strptime(morsel['expires'], time_template)) - time.timezone
|
||||
expires = calendar.timegm(
|
||||
time.strptime(morsel['expires'], time_template)
|
||||
)
|
||||
return create_cookie(
|
||||
comment=morsel['comment'],
|
||||
comment_url=bool(morsel['comment']),
|
||||
@@ -440,7 +473,7 @@ def merge_cookies(cookiejar, cookies):
|
||||
"""
|
||||
if not isinstance(cookiejar, cookielib.CookieJar):
|
||||
raise ValueError('You can only merge into CookieJar')
|
||||
|
||||
|
||||
if isinstance(cookies, dict):
|
||||
cookiejar = cookiejar_from_dict(
|
||||
cookies, cookiejar=cookiejar, overwrite=False)
|
||||
|
||||
@@ -97,3 +97,18 @@ class StreamConsumedError(RequestException, TypeError):
|
||||
|
||||
class RetryError(RequestException):
|
||||
"""Custom retries logic failed"""
|
||||
|
||||
|
||||
# Warnings
|
||||
|
||||
|
||||
class RequestsWarning(Warning):
|
||||
"""Base warning for Requests."""
|
||||
pass
|
||||
|
||||
|
||||
class FileModeWarning(RequestsWarning, DeprecationWarning):
|
||||
"""
|
||||
A file was opened in text mode, but Requests determined its binary length.
|
||||
"""
|
||||
pass
|
||||
|
||||
@@ -12,34 +12,23 @@ Available hooks:
|
||||
The response generated from a Request.
|
||||
|
||||
"""
|
||||
|
||||
|
||||
HOOKS = ['response']
|
||||
|
||||
|
||||
def default_hooks():
|
||||
hooks = {}
|
||||
for event in HOOKS:
|
||||
hooks[event] = []
|
||||
return hooks
|
||||
return dict((event, []) for event in HOOKS)
|
||||
|
||||
# TODO: response is the only one
|
||||
|
||||
|
||||
def dispatch_hook(key, hooks, hook_data, **kwargs):
|
||||
"""Dispatches a hook dictionary on a given piece of data."""
|
||||
|
||||
hooks = hooks or dict()
|
||||
|
||||
if key in hooks:
|
||||
hooks = hooks.get(key)
|
||||
|
||||
hooks = hooks.get(key)
|
||||
if hooks:
|
||||
if hasattr(hooks, '__call__'):
|
||||
hooks = [hooks]
|
||||
|
||||
for hook in hooks:
|
||||
_hook_data = hook(hook_data, **kwargs)
|
||||
if _hook_data is not None:
|
||||
hook_data = _hook_data
|
||||
|
||||
return hook_data
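Hooks passed with a request end up dispatched through this function; a minimal usage sketch (the callback name is illustrative)::

    import requests

    def log_response(response, *args, **kwargs):
        # Returning None keeps the original response object.
        print('fetched', response.url, response.status_code)

    r = requests.get('https://example.org/',
                     hooks={'response': log_response})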
@@ -15,7 +15,7 @@ from .hooks import default_hooks
|
||||
from .structures import CaseInsensitiveDict
|
||||
|
||||
from .auth import HTTPBasicAuth
|
||||
from .cookies import cookiejar_from_dict, get_cookie_header
|
||||
from .cookies import cookiejar_from_dict, get_cookie_header, _copy_cookie_jar
|
||||
from .packages.urllib3.fields import RequestField
|
||||
from .packages.urllib3.filepost import encode_multipart_formdata
|
||||
from .packages.urllib3.util import parse_url
|
||||
@@ -30,7 +30,8 @@ from .utils import (
|
||||
iter_slices, guess_json_utf, super_len, to_native_string)
|
||||
from .compat import (
|
||||
cookielib, urlunparse, urlsplit, urlencode, str, bytes, StringIO,
|
||||
is_py2, chardet, json, builtin_str, basestring)
|
||||
is_py2, chardet, builtin_str, basestring)
|
||||
from .compat import json as complexjson
|
||||
from .status_codes import codes
|
||||
|
||||
#: The set of HTTP status codes that indicate an automatically
|
||||
@@ -42,12 +43,11 @@ REDIRECT_STATI = (
|
||||
codes.temporary_redirect, # 307
|
||||
codes.permanent_redirect, # 308
|
||||
)
|
||||
|
||||
DEFAULT_REDIRECT_LIMIT = 30
|
||||
CONTENT_CHUNK_SIZE = 10 * 1024
|
||||
ITER_CHUNK_SIZE = 512
|
||||
|
||||
json_dumps = json.dumps
|
||||
|
||||
|
||||
class RequestEncodingMixin(object):
|
||||
@property
|
||||
@@ -143,13 +143,13 @@ class RequestEncodingMixin(object):
|
||||
else:
|
||||
fn = guess_filename(v) or k
|
||||
fp = v
|
||||
if isinstance(fp, str):
|
||||
fp = StringIO(fp)
|
||||
if isinstance(fp, bytes):
|
||||
fp = BytesIO(fp)
|
||||
|
||||
rf = RequestField(name=k, data=fp.read(),
|
||||
filename=fn, headers=fh)
|
||||
if isinstance(fp, (str, bytes, bytearray)):
|
||||
fdata = fp
|
||||
else:
|
||||
fdata = fp.read()
|
||||
|
||||
rf = RequestField(name=k, data=fdata, filename=fn, headers=fh)
|
||||
rf.make_multipart(content_type=ft)
|
||||
new_fields.append(rf)
|
||||
|
||||
@@ -192,7 +192,7 @@ class Request(RequestHooksMixin):
|
||||
:param headers: dictionary of headers to send.
|
||||
:param files: dictionary of {filename: fileobject} files to multipart upload.
|
||||
:param data: the body to attach to the request. If a dictionary is provided, form-encoding will take place.
|
||||
:param json: json for the body to attach to the request (if data is not specified).
|
||||
:param json: json for the body to attach to the request (if files or data is not specified).
|
||||
:param params: dictionary of URL parameters to append to the URL.
|
||||
:param auth: Auth handler or (user, pass) tuple.
|
||||
:param cookies: dictionary or CookieJar of cookies to attach to this request.
|
||||
@@ -206,17 +206,8 @@ class Request(RequestHooksMixin):
|
||||
<PreparedRequest [GET]>
|
||||
|
||||
"""
|
||||
def __init__(self,
|
||||
method=None,
|
||||
url=None,
|
||||
headers=None,
|
||||
files=None,
|
||||
data=None,
|
||||
params=None,
|
||||
auth=None,
|
||||
cookies=None,
|
||||
hooks=None,
|
||||
json=None):
|
||||
def __init__(self, method=None, url=None, headers=None, files=None,
|
||||
data=None, params=None, auth=None, cookies=None, hooks=None, json=None):
|
||||
|
||||
# Default empty dicts for dict params.
|
||||
data = [] if data is None else data
|
||||
@@ -295,8 +286,7 @@ class PreparedRequest(RequestEncodingMixin, RequestHooksMixin):
|
||||
self.hooks = default_hooks()
|
||||
|
||||
def prepare(self, method=None, url=None, headers=None, files=None,
|
||||
data=None, params=None, auth=None, cookies=None, hooks=None,
|
||||
json=None):
|
||||
data=None, params=None, auth=None, cookies=None, hooks=None, json=None):
|
||||
"""Prepares the entire request with the given parameters."""
|
||||
|
||||
self.prepare_method(method)
|
||||
@@ -305,6 +295,7 @@ class PreparedRequest(RequestEncodingMixin, RequestHooksMixin):
|
||||
self.prepare_cookies(cookies)
|
||||
self.prepare_body(data, files, json)
|
||||
self.prepare_auth(auth, url)
|
||||
|
||||
# Note that prepare_auth must be last to enable authentication schemes
|
||||
# such as OAuth to work on a fully prepared request.
|
||||
|
||||
@@ -319,7 +310,7 @@ class PreparedRequest(RequestEncodingMixin, RequestHooksMixin):
|
||||
p.method = self.method
|
||||
p.url = self.url
|
||||
p.headers = self.headers.copy() if self.headers is not None else None
|
||||
p._cookies = self._cookies.copy() if self._cookies is not None else None
|
||||
p._cookies = _copy_cookie_jar(self._cookies)
|
||||
p.body = self.body
|
||||
p.hooks = self.hooks
|
||||
return p
|
||||
@@ -328,12 +319,12 @@ class PreparedRequest(RequestEncodingMixin, RequestHooksMixin):
|
||||
"""Prepares the given HTTP method."""
|
||||
self.method = method
|
||||
if self.method is not None:
|
||||
self.method = self.method.upper()
|
||||
self.method = to_native_string(self.method.upper())
|
||||
|
||||
def prepare_url(self, url, params):
|
||||
"""Prepares the given HTTP URL."""
|
||||
#: Accept objects that have string representations.
|
||||
#: We're unable to blindy call unicode/str functions
|
||||
#: We're unable to blindly call unicode/str functions
|
||||
#: as this will include the bytestring indicator (b'')
|
||||
#: on python 3.x.
|
||||
#: https://github.com/kennethreitz/requests/pull/2238
|
||||
@@ -356,8 +347,10 @@ class PreparedRequest(RequestEncodingMixin, RequestHooksMixin):
|
||||
raise InvalidURL(*e.args)
|
||||
|
||||
if not scheme:
|
||||
raise MissingSchema("Invalid URL {0!r}: No schema supplied. "
|
||||
"Perhaps you meant http://{0}?".format(url))
|
||||
error = ("Invalid URL {0!r}: No schema supplied. Perhaps you meant http://{0}?")
|
||||
error = error.format(to_native_string(url, 'utf8'))
|
||||
|
||||
raise MissingSchema(error)
|
||||
|
||||
if not host:
|
||||
raise InvalidURL("Invalid URL %r: No host supplied" % url)
|
||||
@@ -392,6 +385,9 @@ class PreparedRequest(RequestEncodingMixin, RequestHooksMixin):
|
||||
if isinstance(fragment, str):
|
||||
fragment = fragment.encode('utf-8')
|
||||
|
||||
if isinstance(params, (str, bytes)):
|
||||
params = to_native_string(params)
|
||||
|
||||
enc_params = self._encode_params(params)
|
||||
if enc_params:
|
||||
if query:
|
||||
@@ -421,9 +417,9 @@ class PreparedRequest(RequestEncodingMixin, RequestHooksMixin):
|
||||
content_type = None
|
||||
length = None
|
||||
|
||||
if json is not None:
|
||||
if not data and json is not None:
|
||||
content_type = 'application/json'
|
||||
body = json_dumps(json)
|
||||
body = complexjson.dumps(json)
|
||||
|
||||
is_stream = all([
|
||||
hasattr(data, '__iter__'),
|
||||
@@ -441,7 +437,7 @@ class PreparedRequest(RequestEncodingMixin, RequestHooksMixin):
|
||||
if files:
|
||||
raise NotImplementedError('Streamed bodies and files are mutually exclusive.')
|
||||
|
||||
if length is not None:
|
||||
if length:
|
||||
self.headers['Content-Length'] = builtin_str(length)
|
||||
else:
|
||||
self.headers['Transfer-Encoding'] = 'chunked'
|
||||
@@ -450,7 +446,7 @@ class PreparedRequest(RequestEncodingMixin, RequestHooksMixin):
|
||||
if files:
|
||||
(body, content_type) = self._encode_files(files, data)
|
||||
else:
|
||||
if data and json is None:
|
||||
if data:
|
||||
body = self._encode_params(data)
|
||||
if isinstance(data, basestring) or hasattr(data, 'read'):
|
||||
content_type = None
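Together with the ``if not data and json is not None`` guard above, this means ``data`` takes precedence when both are supplied; a hedged illustration (URL is a placeholder)::

    import requests

    # When both arguments are given, the form-encoded `data` becomes the
    # body and the `json` argument is ignored, so pass only one of them.
    r = requests.post('https://example.org/api',
                      data={'a': '1'}, json={'a': 1})
    print(r.request.headers.get('Content-Type'))
    # application/x-www-form-urlencoded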
@@ -500,7 +496,15 @@ class PreparedRequest(RequestEncodingMixin, RequestHooksMixin):
|
||||
self.prepare_content_length(self.body)
|
||||
|
||||
def prepare_cookies(self, cookies):
|
||||
"""Prepares the given HTTP cookie data."""
|
||||
"""Prepares the given HTTP cookie data.
|
||||
|
||||
This function eventually generates a ``Cookie`` header from the
|
||||
given cookies using cookielib. Due to cookielib's design, the header
|
||||
will not be regenerated if it already exists, meaning this function
|
||||
can only be called once for the life of the
|
||||
:class:`PreparedRequest <PreparedRequest>` object. Any subsequent calls
|
||||
to ``prepare_cookies`` will have no actual effect, unless the "Cookie"
|
||||
header is removed beforehand."""
|
||||
|
||||
if isinstance(cookies, cookielib.CookieJar):
|
||||
self._cookies = cookies
|
||||
@@ -513,6 +517,10 @@ class PreparedRequest(RequestEncodingMixin, RequestHooksMixin):
|
||||
|
||||
def prepare_hooks(self, hooks):
|
||||
"""Prepares the given hooks."""
|
||||
# hooks can be passed as None to the prepare method and to this
|
||||
# method. To prevent iterating over None, simply use an empty list
|
||||
# if hooks is False-y
|
||||
hooks = hooks or []
|
||||
for event in hooks:
|
||||
self.register_hook(event, hooks[event])
|
||||
|
||||
@@ -523,16 +531,8 @@ class Response(object):
|
||||
"""
|
||||
|
||||
__attrs__ = [
|
||||
'_content',
|
||||
'status_code',
|
||||
'headers',
|
||||
'url',
|
||||
'history',
|
||||
'encoding',
|
||||
'reason',
|
||||
'cookies',
|
||||
'elapsed',
|
||||
'request',
|
||||
'_content', 'status_code', 'headers', 'url', 'history',
|
||||
'encoding', 'reason', 'cookies', 'elapsed', 'request'
|
||||
]
|
||||
|
||||
def __init__(self):
|
||||
@@ -572,7 +572,11 @@ class Response(object):
|
||||
self.cookies = cookiejar_from_dict({})
|
||||
|
||||
#: The amount of time elapsed between sending the request
|
||||
#: and the arrival of the response (as a timedelta)
|
||||
#: and the arrival of the response (as a timedelta).
|
||||
#: This property specifically measures the time taken between sending
|
||||
#: the first byte of the request and finishing parsing the headers. It
|
||||
#: is therefore unaffected by consuming the response content or the
|
||||
#: value of the ``stream`` keyword argument.
|
||||
self.elapsed = datetime.timedelta(0)
|
||||
|
||||
#: The :class:`PreparedRequest <PreparedRequest>` object to which this
|
||||
@@ -630,7 +634,7 @@ class Response(object):
|
||||
|
||||
@property
|
||||
def is_permanent_redirect(self):
|
||||
"""True if this Response one of the permanant versions of redirect"""
|
||||
"""True if this Response one of the permanent versions of redirect"""
|
||||
return ('location' in self.headers and self.status_code in (codes.moved_permanently, codes.permanent_redirect))
|
||||
|
||||
@property
|
||||
@@ -648,9 +652,10 @@ class Response(object):
|
||||
If decode_unicode is True, content will be decoded using the best
|
||||
available encoding based on the response.
|
||||
"""
|
||||
|
||||
def generate():
|
||||
try:
|
||||
# Special case for urllib3.
|
||||
# Special case for urllib3.
|
||||
if hasattr(self.raw, 'stream'):
|
||||
try:
|
||||
for chunk in self.raw.stream(chunk_size, decode_content=True):
|
||||
yield chunk
|
||||
@@ -660,7 +665,7 @@ class Response(object):
|
||||
raise ContentDecodingError(e)
|
||||
except ReadTimeoutError as e:
|
||||
raise ConnectionError(e)
|
||||
except AttributeError:
|
||||
else:
|
||||
# Standard file-like object.
|
||||
while True:
|
||||
chunk = self.raw.read(chunk_size)
|
||||
@@ -688,6 +693,8 @@ class Response(object):
|
||||
"""Iterates over the response data, one line at a time. When
|
||||
stream=True is set on the request, this avoids reading the
|
||||
content at once into memory for large responses.
|
||||
|
||||
.. note:: This method is not reentrant safe.
|
||||
"""
|
||||
|
||||
pending = None
|
||||
@@ -789,14 +796,16 @@ class Response(object):
|
||||
encoding = guess_json_utf(self.content)
|
||||
if encoding is not None:
|
||||
try:
|
||||
return json.loads(self.content.decode(encoding), **kwargs)
|
||||
return complexjson.loads(
|
||||
self.content.decode(encoding), **kwargs
|
||||
)
|
||||
except UnicodeDecodeError:
|
||||
# Wrong UTF codec detected; usually because it's not UTF-8
|
||||
# but some other 8-bit codec. This is an RFC violation,
|
||||
# and the server didn't bother to tell us what codec *was*
|
||||
# used.
|
||||
pass
|
||||
return json.loads(self.text, **kwargs)
|
||||
return complexjson.loads(self.text, **kwargs)
|
||||
|
||||
@property
|
||||
def links(self):
|
||||
@@ -822,10 +831,10 @@ class Response(object):
|
||||
http_error_msg = ''
|
||||
|
||||
if 400 <= self.status_code < 500:
|
||||
http_error_msg = '%s Client Error: %s' % (self.status_code, self.reason)
|
||||
http_error_msg = '%s Client Error: %s for url: %s' % (self.status_code, self.reason, self.url)
|
||||
|
||||
elif 500 <= self.status_code < 600:
|
||||
http_error_msg = '%s Server Error: %s' % (self.status_code, self.reason)
|
||||
http_error_msg = '%s Server Error: %s for url: %s' % (self.status_code, self.reason, self.url)
|
||||
|
||||
if http_error_msg:
|
||||
raise HTTPError(http_error_msg, response=self)
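With the URL folded into the message, failures are easier to trace from logs alone; a short sketch::

    import requests

    r = requests.get('https://example.org/missing-page')
    try:
        r.raise_for_status()
    except requests.exceptions.HTTPError as exc:
        # e.g. "404 Client Error: Not Found for url: https://example.org/missing-page"
        print(exc)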
@@ -836,4 +845,7 @@ class Response(object):
|
||||
|
||||
*Note: Should not normally need to be called explicitly.*
|
||||
"""
|
||||
if not self._content_consumed:
|
||||
return self.raw.close()
|
||||
|
||||
return self.raw.release_conn()
|
||||
|
||||
@@ -1,3 +1,36 @@
|
||||
from __future__ import absolute_import
|
||||
'''
|
||||
Debian and other distributions "unbundle" requests' vendored dependencies, and
|
||||
rewrite all imports to use the global versions of ``urllib3`` and ``chardet``.
|
||||
The problem with this is that not only requests itself imports those
|
||||
dependencies, but third-party code outside of the distros' control too.
|
||||
|
||||
from . import urllib3
|
||||
In reaction to these problems, the distro maintainers replaced
|
||||
``requests.packages`` with a magical "stub module" that imports the correct
|
||||
modules. The implementations were varying in quality and all had severe
|
||||
problems. For example, a symlink (or hardlink) that links the correct modules
|
||||
into place introduces problems regarding object identity, since you now have
|
||||
two modules in `sys.modules` with the same API, but different identities::
|
||||
|
||||
requests.packages.urllib3 is not urllib3
|
||||
|
||||
With version ``2.5.2``, requests started to maintain its own stub, so that
|
||||
distro-specific breakage would be reduced to a minimum, even though the whole
|
||||
issue is not requests' fault in the first place. See
|
||||
https://github.com/kennethreitz/requests/pull/2375 for the corresponding pull
|
||||
request.
|
||||
'''
|
||||
|
||||
from __future__ import absolute_import
|
||||
import sys
|
||||
|
||||
try:
|
||||
from . import urllib3
|
||||
except ImportError:
|
||||
import urllib3
|
||||
sys.modules['%s.urllib3' % __name__] = urllib3
|
||||
|
||||
try:
|
||||
from . import chardet
|
||||
except ImportError:
|
||||
import chardet
|
||||
sys.modules['%s.chardet' % __name__] = chardet
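The net effect is that the dotted name keeps working whether the dependency is vendored or provided by the distribution; a quick check::

    # Resolves to the bundled copy on a vendored install, or to the
    # system urllib3 on an unbundled (distro) install.
    from requests.packages import urllib3

    print(urllib3.__name__)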
@@ -2,10 +2,8 @@
|
||||
urllib3 - Thread-safe connection pooling and re-using.
|
||||
"""
|
||||
|
||||
__author__ = 'Andrey Petrov (andrey.petrov@shazow.net)'
|
||||
__license__ = 'MIT'
|
||||
__version__ = 'dev'
|
||||
|
||||
from __future__ import absolute_import
|
||||
import warnings
|
||||
|
||||
from .connectionpool import (
|
||||
HTTPConnectionPool,
|
||||
@@ -32,8 +30,30 @@ except ImportError:
|
||||
def emit(self, record):
|
||||
pass
|
||||
|
||||
__author__ = 'Andrey Petrov (andrey.petrov@shazow.net)'
|
||||
__license__ = 'MIT'
|
||||
__version__ = '1.13.1'
|
||||
|
||||
__all__ = (
|
||||
'HTTPConnectionPool',
|
||||
'HTTPSConnectionPool',
|
||||
'PoolManager',
|
||||
'ProxyManager',
|
||||
'HTTPResponse',
|
||||
'Retry',
|
||||
'Timeout',
|
||||
'add_stderr_logger',
|
||||
'connection_from_url',
|
||||
'disable_warnings',
|
||||
'encode_multipart_formdata',
|
||||
'get_host',
|
||||
'make_headers',
|
||||
'proxy_from_url',
|
||||
)
|
||||
|
||||
logging.getLogger(__name__).addHandler(NullHandler())
|
||||
|
||||
|
||||
def add_stderr_logger(level=logging.DEBUG):
|
||||
"""
|
||||
Helper for quickly adding a StreamHandler to the logger. Useful for
|
||||
@@ -55,9 +75,16 @@ def add_stderr_logger(level=logging.DEBUG):
|
||||
del NullHandler
|
||||
|
||||
|
||||
# Set security warning to only go off once by default.
|
||||
import warnings
|
||||
warnings.simplefilter('always', exceptions.SecurityWarning)
|
||||
# SecurityWarning's always go off by default.
|
||||
warnings.simplefilter('always', exceptions.SecurityWarning, append=True)
|
||||
# SubjectAltNameWarning's should go off once per host
|
||||
warnings.simplefilter('default', exceptions.SubjectAltNameWarning)
|
||||
# InsecurePlatformWarning's don't vary between requests, so we keep it default.
|
||||
warnings.simplefilter('default', exceptions.InsecurePlatformWarning,
|
||||
append=True)
|
||||
# SNIMissingWarnings should go off only once.
|
||||
warnings.simplefilter('default', exceptions.SNIMissingWarning)
|
||||
|
||||
|
||||
def disable_warnings(category=exceptions.HTTPWarning):
|
||||
"""
|
||||
|
||||
@@ -1,7 +1,8 @@
|
||||
from __future__ import absolute_import
|
||||
from collections import Mapping, MutableMapping
|
||||
try:
|
||||
from threading import RLock
|
||||
except ImportError: # Platform-specific: No threads available
|
||||
except ImportError: # Platform-specific: No threads available
|
||||
class RLock:
|
||||
def __enter__(self):
|
||||
pass
|
||||
@@ -10,11 +11,11 @@ except ImportError: # Platform-specific: No threads available
|
||||
pass
|
||||
|
||||
|
||||
try: # Python 2.7+
|
||||
try: # Python 2.7+
|
||||
from collections import OrderedDict
|
||||
except ImportError:
|
||||
from .packages.ordered_dict import OrderedDict
|
||||
from .packages.six import iterkeys, itervalues
|
||||
from .packages.six import iterkeys, itervalues, PY3
|
||||
|
||||
|
||||
__all__ = ['RecentlyUsedContainer', 'HTTPHeaderDict']
|
||||
@@ -129,25 +130,82 @@ class HTTPHeaderDict(MutableMapping):
|
||||
'foo=bar, baz=quxx'
|
||||
>>> headers['Content-Length']
|
||||
'7'
|
||||
|
||||
If you want to access the raw headers with their original casing
|
||||
for debugging purposes you can access the private ``._data`` attribute
|
||||
which is a normal python ``dict`` that maps the case-insensitive key to a
|
||||
list of tuples stored as (case-sensitive-original-name, value). Using the
|
||||
structure from above as our example:
|
||||
|
||||
>>> headers._data
|
||||
{'set-cookie': [('Set-Cookie', 'foo=bar'), ('set-cookie', 'baz=quxx')],
|
||||
'content-length': [('content-length', '7')]}
|
||||
"""
|
||||
|
||||
def __init__(self, headers=None, **kwargs):
|
||||
self._data = {}
|
||||
if headers is None:
|
||||
headers = {}
|
||||
self.update(headers, **kwargs)
|
||||
super(HTTPHeaderDict, self).__init__()
|
||||
self._container = {}
|
||||
if headers is not None:
|
||||
if isinstance(headers, HTTPHeaderDict):
|
||||
self._copy_from(headers)
|
||||
else:
|
||||
self.extend(headers)
|
||||
if kwargs:
|
||||
self.extend(kwargs)
|
||||
|
||||
def add(self, key, value):
|
||||
def __setitem__(self, key, val):
|
||||
self._container[key.lower()] = (key, val)
|
||||
return self._container[key.lower()]
|
||||
|
||||
def __getitem__(self, key):
|
||||
val = self._container[key.lower()]
|
||||
return ', '.join(val[1:])
|
||||
|
||||
def __delitem__(self, key):
|
||||
del self._container[key.lower()]
|
||||
|
||||
def __contains__(self, key):
|
||||
return key.lower() in self._container
|
||||
|
||||
def __eq__(self, other):
|
||||
if not isinstance(other, Mapping) and not hasattr(other, 'keys'):
|
||||
return False
|
||||
if not isinstance(other, type(self)):
|
||||
other = type(self)(other)
|
||||
return (dict((k.lower(), v) for k, v in self.itermerged()) ==
|
||||
dict((k.lower(), v) for k, v in other.itermerged()))
|
||||
|
||||
def __ne__(self, other):
|
||||
return not self.__eq__(other)
|
||||
|
||||
if not PY3: # Python 2
|
||||
iterkeys = MutableMapping.iterkeys
|
||||
itervalues = MutableMapping.itervalues
|
||||
|
||||
__marker = object()
|
||||
|
||||
def __len__(self):
|
||||
return len(self._container)
|
||||
|
||||
def __iter__(self):
|
||||
# Only provide the originally cased names
|
||||
for vals in self._container.values():
|
||||
yield vals[0]
|
||||
|
||||
def pop(self, key, default=__marker):
|
||||
'''D.pop(k[,d]) -> v, remove specified key and return the corresponding value.
|
||||
If key is not found, d is returned if given, otherwise KeyError is raised.
|
||||
'''
|
||||
# Using the MutableMapping function directly fails due to the private marker.
|
||||
# Using ordinary dict.pop would expose the internal structures.
|
||||
# So let's reinvent the wheel.
|
||||
try:
|
||||
value = self[key]
|
||||
except KeyError:
|
||||
if default is self.__marker:
|
||||
raise
|
||||
return default
|
||||
else:
|
||||
del self[key]
|
||||
return value
|
||||
|
||||
def discard(self, key):
|
||||
try:
|
||||
del self[key]
|
||||
except KeyError:
|
||||
pass
|
||||
|
||||
def add(self, key, val):
|
||||
"""Adds a (name, value) pair, doesn't overwrite the value if it already
|
||||
exists.
|
||||
|
||||
@@ -156,43 +214,111 @@ class HTTPHeaderDict(MutableMapping):
|
||||
>>> headers['foo']
|
||||
'bar, baz'
|
||||
"""
|
||||
self._data.setdefault(key.lower(), []).append((key, value))
|
||||
key_lower = key.lower()
|
||||
new_vals = key, val
|
||||
# Keep the common case aka no item present as fast as possible
|
||||
vals = self._container.setdefault(key_lower, new_vals)
|
||||
if new_vals is not vals:
|
||||
# new_vals was not inserted, as there was a previous one
|
||||
if isinstance(vals, list):
|
||||
# If already several items got inserted, we have a list
|
||||
vals.append(val)
|
||||
else:
|
||||
# vals should be a tuple then, i.e. only one item so far
|
||||
# Need to convert the tuple to list for further extension
|
||||
self._container[key_lower] = [vals[0], vals[1], val]
|
||||
|
||||
def extend(self, *args, **kwargs):
|
||||
"""Generic import function for any type of header-like object.
|
||||
Adapted version of MutableMapping.update in order to insert items
|
||||
with self.add instead of self.__setitem__
|
||||
"""
|
||||
if len(args) > 1:
|
||||
raise TypeError("extend() takes at most 1 positional "
|
||||
"arguments ({0} given)".format(len(args)))
|
||||
other = args[0] if len(args) >= 1 else ()
|
||||
|
||||
if isinstance(other, HTTPHeaderDict):
|
||||
for key, val in other.iteritems():
|
||||
self.add(key, val)
|
||||
elif isinstance(other, Mapping):
|
||||
for key in other:
|
||||
self.add(key, other[key])
|
||||
elif hasattr(other, "keys"):
|
||||
for key in other.keys():
|
||||
self.add(key, other[key])
|
||||
else:
|
||||
for key, value in other:
|
||||
self.add(key, value)
|
||||
|
||||
for key, value in kwargs.items():
|
||||
self.add(key, value)
|
||||
|
||||
def getlist(self, key):
|
||||
"""Returns a list of all the values for the named field. Returns an
|
||||
empty list if the key doesn't exist."""
|
||||
return self[key].split(', ') if key in self else []
|
||||
try:
|
||||
vals = self._container[key.lower()]
|
||||
except KeyError:
|
||||
return []
|
||||
else:
|
||||
if isinstance(vals, tuple):
|
||||
return [vals[1]]
|
||||
else:
|
||||
return vals[1:]
|
||||
|
||||
def copy(self):
|
||||
h = HTTPHeaderDict()
|
||||
for key in self._data:
|
||||
for rawkey, value in self._data[key]:
|
||||
h.add(rawkey, value)
|
||||
return h
|
||||
|
||||
def __eq__(self, other):
|
||||
if not isinstance(other, Mapping):
|
||||
return False
|
||||
other = HTTPHeaderDict(other)
|
||||
return dict((k1, self[k1]) for k1 in self._data) == \
|
||||
dict((k2, other[k2]) for k2 in other._data)
|
||||
|
||||
def __getitem__(self, key):
|
||||
values = self._data[key.lower()]
|
||||
return ', '.join(value[1] for value in values)
|
||||
|
||||
def __setitem__(self, key, value):
|
||||
self._data[key.lower()] = [(key, value)]
|
||||
|
||||
def __delitem__(self, key):
|
||||
del self._data[key.lower()]
|
||||
|
||||
def __len__(self):
|
||||
return len(self._data)
|
||||
|
||||
def __iter__(self):
|
||||
for headers in itervalues(self._data):
|
||||
yield headers[0][0]
|
||||
# Backwards compatibility for httplib
|
||||
getheaders = getlist
|
||||
getallmatchingheaders = getlist
|
||||
iget = getlist
|
||||
|
||||
def __repr__(self):
|
||||
return '%s(%r)' % (self.__class__.__name__, dict(self.items()))
|
||||
return "%s(%s)" % (type(self).__name__, dict(self.itermerged()))
|
||||
|
||||
def _copy_from(self, other):
|
||||
for key in other:
|
||||
val = other.getlist(key)
|
||||
if isinstance(val, list):
|
||||
# Don't need to convert tuples
|
||||
val = list(val)
|
||||
self._container[key.lower()] = [key] + val
|
||||
|
||||
def copy(self):
|
||||
clone = type(self)()
|
||||
clone._copy_from(self)
|
||||
return clone
|
||||
|
||||
def iteritems(self):
|
||||
"""Iterate over all header lines, including duplicate ones."""
|
||||
for key in self:
|
||||
vals = self._container[key.lower()]
|
||||
for val in vals[1:]:
|
||||
yield vals[0], val
|
||||
|
||||
def itermerged(self):
|
||||
"""Iterate over all headers, merging duplicate ones together."""
|
||||
for key in self:
|
||||
val = self._container[key.lower()]
|
||||
yield val[0], ', '.join(val[1:])
|
||||
|
||||
def items(self):
|
||||
return list(self.iteritems())
|
||||
|
||||
@classmethod
|
||||
def from_httplib(cls, message): # Python 2
|
||||
"""Read headers from a Python 2 httplib message object."""
|
||||
# python2.7 does not expose a proper API for exporting multiheaders
|
||||
# efficiently. This function re-reads raw lines from the message
|
||||
# object and extracts the multiheaders properly.
|
||||
headers = []
|
||||
|
||||
for line in message.headers:
|
||||
if line.startswith((' ', '\t')):
|
||||
key, value = headers[-1]
|
||||
headers[-1] = (key, value + '\r\n' + line.rstrip())
|
||||
continue
|
||||
|
||||
key, value = line.split(':', 1)
|
||||
headers.append((key, value.strip()))
|
||||
|
||||
return cls(headers)
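A short sketch of the duplicate-header behaviour the rewritten container preserves (values are illustrative)::

    from requests.packages.urllib3._collections import HTTPHeaderDict

    headers = HTTPHeaderDict()
    headers['Content-Length'] = '7'
    headers.add('Set-Cookie', 'foo=bar')
    headers.add('Set-Cookie', 'baz=quxx')  # appended, not overwritten

    print(headers['Set-Cookie'])          # 'foo=bar, baz=quxx'
    print(headers.getlist('Set-Cookie'))  # ['foo=bar', 'baz=quxx']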
@@ -1,23 +1,20 @@
|
||||
from __future__ import absolute_import
|
||||
import datetime
|
||||
import os
|
||||
import sys
|
||||
import socket
|
||||
from socket import timeout as SocketTimeout
|
||||
from socket import error as SocketError, timeout as SocketTimeout
|
||||
import warnings
|
||||
from .packages import six
|
||||
|
||||
try: # Python 3
|
||||
from http.client import HTTPConnection as _HTTPConnection, HTTPException
|
||||
from http.client import HTTPConnection as _HTTPConnection
|
||||
from http.client import HTTPException # noqa: unused in this module
|
||||
except ImportError:
|
||||
from httplib import HTTPConnection as _HTTPConnection, HTTPException
|
||||
|
||||
|
||||
class DummyConnection(object):
|
||||
"Used to detect a failed ConnectionCls import."
|
||||
pass
|
||||
|
||||
from httplib import HTTPConnection as _HTTPConnection
|
||||
from httplib import HTTPException # noqa: unused in this module
|
||||
|
||||
try: # Compiled with SSL?
|
||||
HTTPSConnection = DummyConnection
|
||||
import ssl
|
||||
BaseSSLError = ssl.SSLError
|
||||
except (ImportError, AttributeError): # Platform-specific: No SSL.
|
||||
@@ -36,9 +33,10 @@ except NameError: # Python 2:
|
||||
|
||||
|
||||
from .exceptions import (
|
||||
NewConnectionError,
|
||||
ConnectTimeoutError,
|
||||
SubjectAltNameWarning,
|
||||
SystemTimeWarning,
|
||||
SecurityWarning,
|
||||
)
|
||||
from .packages.ssl_match_hostname import match_hostname
|
||||
|
||||
@@ -60,6 +58,11 @@ port_by_scheme = {
|
||||
RECENT_DATE = datetime.date(2014, 1, 1)
|
||||
|
||||
|
||||
class DummyConnection(object):
|
||||
"""Used to detect a failed ConnectionCls import."""
|
||||
pass
|
||||
|
||||
|
||||
class HTTPConnection(_HTTPConnection, object):
|
||||
"""
|
||||
Based on httplib.HTTPConnection but provides an extra constructor
|
||||
@@ -133,11 +136,15 @@ class HTTPConnection(_HTTPConnection, object):
|
||||
conn = connection.create_connection(
|
||||
(self.host, self.port), self.timeout, **extra_kw)
|
||||
|
||||
except SocketTimeout:
|
||||
except SocketTimeout as e:
|
||||
raise ConnectTimeoutError(
|
||||
self, "Connection to %s timed out. (connect timeout=%s)" %
|
||||
(self.host, self.timeout))
|
||||
|
||||
except SocketError as e:
|
||||
raise NewConnectionError(
|
||||
self, "Failed to establish a new connection: %s" % e)
|
||||
|
||||
return conn
|
||||
|
||||
def _prepare_conn(self, conn):
|
||||
@@ -185,19 +192,25 @@ class VerifiedHTTPSConnection(HTTPSConnection):
|
||||
"""
|
||||
cert_reqs = None
|
||||
ca_certs = None
|
||||
ca_cert_dir = None
|
||||
ssl_version = None
|
||||
assert_fingerprint = None
|
||||
|
||||
def set_cert(self, key_file=None, cert_file=None,
|
||||
cert_reqs=None, ca_certs=None,
|
||||
assert_hostname=None, assert_fingerprint=None):
|
||||
assert_hostname=None, assert_fingerprint=None,
|
||||
ca_cert_dir=None):
|
||||
|
||||
if (ca_certs or ca_cert_dir) and cert_reqs is None:
|
||||
cert_reqs = 'CERT_REQUIRED'
|
||||
|
||||
self.key_file = key_file
|
||||
self.cert_file = cert_file
|
||||
self.cert_reqs = cert_reqs
|
||||
self.ca_certs = ca_certs
|
||||
self.assert_hostname = assert_hostname
|
||||
self.assert_fingerprint = assert_fingerprint
|
||||
self.ca_certs = ca_certs and os.path.expanduser(ca_certs)
|
||||
self.ca_cert_dir = ca_cert_dir and os.path.expanduser(ca_cert_dir)
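This is the plumbing behind letting ``verify`` point at a directory of CA certificates rather than a single bundle; at the requests level the call might look like this (paths are illustrative, and the directory is assumed to be prepared with OpenSSL's ``c_rehash``)::

    import requests

    # `verify` may be a directory of CA certificates instead of one
    # bundle file; OpenSSL looks certificates up by hashed file name.
    r = requests.get('https://internal.example.org/',
                     verify='/etc/ssl/internal-cas/')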
def connect(self):
|
||||
# Add certificate verification
|
||||
@@ -234,6 +247,7 @@ class VerifiedHTTPSConnection(HTTPSConnection):
|
||||
self.sock = ssl_wrap_socket(conn, self.key_file, self.cert_file,
|
||||
cert_reqs=resolved_cert_reqs,
|
||||
ca_certs=self.ca_certs,
|
||||
ca_cert_dir=self.ca_cert_dir,
|
||||
server_hostname=hostname,
|
||||
ssl_version=resolved_ssl_version)
|
||||
|
||||
@@ -245,18 +259,30 @@ class VerifiedHTTPSConnection(HTTPSConnection):
|
||||
cert = self.sock.getpeercert()
|
||||
if not cert.get('subjectAltName', ()):
|
||||
warnings.warn((
|
||||
'Certificate has no `subjectAltName`, falling back to check for a `commonName` for now. '
|
||||
'This feature is being removed by major browsers and deprecated by RFC 2818. '
|
||||
'(See https://github.com/shazow/urllib3/issues/497 for details.)'),
|
||||
SecurityWarning
|
||||
'Certificate for {0} has no `subjectAltName`, falling back to check for a '
|
||||
'`commonName` for now. This feature is being removed by major browsers and '
|
||||
'deprecated by RFC 2818. (See https://github.com/shazow/urllib3/issues/497 '
|
||||
'for details.)'.format(hostname)),
|
||||
SubjectAltNameWarning
|
||||
)
|
||||
match_hostname(cert, self.assert_hostname or hostname)
|
||||
|
||||
self.is_verified = (resolved_cert_reqs == ssl.CERT_REQUIRED
|
||||
or self.assert_fingerprint is not None)
|
||||
# In case the hostname is an IPv6 address, strip the square
|
||||
# brackets from it before using it to validate. This is because
|
||||
# a certificate with an IPv6 address in it won't have square
|
||||
# brackets around that address. Sadly, match_hostname won't do this
|
||||
# for us: it expects the plain host part without any extra work
|
||||
# that might have been done to make it palatable to httplib.
|
||||
asserted_hostname = self.assert_hostname or hostname
|
||||
asserted_hostname = asserted_hostname.strip('[]')
|
||||
match_hostname(cert, asserted_hostname)
|
||||
|
||||
self.is_verified = (resolved_cert_reqs == ssl.CERT_REQUIRED or
|
||||
self.assert_fingerprint is not None)
|
||||
|
||||
|
||||
if ssl:
|
||||
# Make a copy for testing.
|
||||
UnverifiedHTTPSConnection = HTTPSConnection
|
||||
HTTPSConnection = VerifiedHTTPSConnection
|
||||
else:
|
||||
HTTPSConnection = DummyConnection
|
||||
|
||||
@@ -1,3 +1,4 @@
|
||||
from __future__ import absolute_import
|
||||
import errno
|
||||
import logging
|
||||
import sys
|
||||
@@ -10,13 +11,15 @@ try: # Python 3
|
||||
from queue import LifoQueue, Empty, Full
|
||||
except ImportError:
|
||||
from Queue import LifoQueue, Empty, Full
|
||||
import Queue as _ # Platform-specific: Windows
|
||||
# Queue is imported for side effects on MS Windows
|
||||
import Queue as _unused_module_Queue # noqa: unused
|
||||
|
||||
|
||||
from .exceptions import (
|
||||
ClosedPoolError,
|
||||
ProtocolError,
|
||||
EmptyPoolError,
|
||||
HeaderParsingError,
|
||||
HostChangedError,
|
||||
LocationValueError,
|
||||
MaxRetryError,
|
||||
@@ -25,6 +28,7 @@ from .exceptions import (
|
||||
SSLError,
|
||||
TimeoutError,
|
||||
InsecureRequestWarning,
|
||||
NewConnectionError,
|
||||
)
|
||||
from .packages.ssl_match_hostname import CertificateError
|
||||
from .packages import six
|
||||
@@ -32,15 +36,16 @@ from .connection import (
|
||||
port_by_scheme,
|
||||
DummyConnection,
|
||||
HTTPConnection, HTTPSConnection, VerifiedHTTPSConnection,
|
||||
HTTPException, BaseSSLError, ConnectionError
|
||||
HTTPException, BaseSSLError,
|
||||
)
|
||||
from .request import RequestMethods
|
||||
from .response import HTTPResponse
|
||||
|
||||
from .util.connection import is_connection_dropped
|
||||
from .util.response import assert_header_parsing
|
||||
from .util.retry import Retry
|
||||
from .util.timeout import Timeout
|
||||
from .util.url import get_host
|
||||
from .util.url import get_host, Url
|
||||
|
||||
|
||||
xrange = six.moves.xrange
|
||||
@@ -50,7 +55,7 @@ log = logging.getLogger(__name__)
|
||||
_Default = object()
|
||||
|
||||
|
||||
## Pool objects
|
||||
# Pool objects
|
||||
class ConnectionPool(object):
|
||||
"""
|
||||
Base class for all connection pools, such as
|
||||
@@ -64,14 +69,28 @@ class ConnectionPool(object):
|
||||
if not host:
|
||||
raise LocationValueError("No host specified.")
|
||||
|
||||
# httplib doesn't like it when we include brackets in ipv6 addresses
|
||||
self.host = host.strip('[]')
|
||||
self.host = host
|
||||
self.port = port
|
||||
|
||||
def __str__(self):
|
||||
return '%s(host=%r, port=%r)' % (type(self).__name__,
|
||||
self.host, self.port)
|
||||
|
||||
def __enter__(self):
|
||||
return self
|
||||
|
||||
def __exit__(self, exc_type, exc_val, exc_tb):
|
||||
self.close()
|
||||
# Return False to re-raise any potential exceptions
|
||||
return False
|
||||
|
||||
def close():
|
||||
"""
|
||||
Close all pooled connections and disable the pool.
|
||||
"""
|
||||
pass
|
||||
|
||||
|
||||
# This is taken from http://hg.python.org/cpython/file/7aaba721ebc0/Lib/socket.py#l252
|
||||
_blocking_errnos = set([errno.EAGAIN, errno.EWOULDBLOCK])
|
||||
|
||||
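The __enter__/__exit__ pair added to ConnectionPool above means pools can now be used as context managers so their sockets are released deterministically. A minimal usage sketch (httpbin.org is only a placeholder host, not part of the diff):

    from requests.packages.urllib3 import HTTPConnectionPool

    with HTTPConnectionPool('httpbin.org', maxsize=2) as pool:
        response = pool.request('GET', '/get')
        print(response.status)
    # __exit__() called pool.close(), so every pooled connection is closed here.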
@@ -105,7 +124,7 @@ class HTTPConnectionPool(ConnectionPool, RequestMethods):
|
||||
|
||||
:param maxsize:
|
||||
Number of connections to save that can be reused. More than 1 is useful
|
||||
in multithreaded situations. If ``block`` is set to false, more
|
||||
in multithreaded situations. If ``block`` is set to False, more
|
||||
connections will be created but they will not be saved once they've
|
||||
been used.
|
||||
|
||||
@@ -266,6 +285,10 @@ class HTTPConnectionPool(ConnectionPool, RequestMethods):
|
||||
"""
|
||||
pass
|
||||
|
||||
def _prepare_proxy(self, conn):
|
||||
# Nothing to do for HTTP connections.
|
||||
pass
|
||||
|
||||
def _get_timeout(self, timeout):
|
||||
""" Helper that always returns a :class:`urllib3.util.Timeout` """
|
||||
if timeout is _Default:
|
||||
@@ -349,7 +372,7 @@ class HTTPConnectionPool(ConnectionPool, RequestMethods):
|
||||
|
||||
# Receive the response from the server
|
||||
try:
|
||||
try: # Python 2.7+, use buffering of HTTP responses
|
||||
try: # Python 2.7, use buffering of HTTP responses
|
||||
httplib_response = conn.getresponse(buffering=True)
|
||||
except TypeError: # Python 2.6 and older
|
||||
httplib_response = conn.getresponse()
|
||||
@@ -362,8 +385,19 @@ class HTTPConnectionPool(ConnectionPool, RequestMethods):
|
||||
log.debug("\"%s %s %s\" %s %s" % (method, url, http_version,
|
||||
httplib_response.status,
|
||||
httplib_response.length))
|
||||
|
||||
try:
|
||||
assert_header_parsing(httplib_response.msg)
|
||||
except HeaderParsingError as hpe: # Platform-specific: Python 3
|
||||
log.warning(
|
||||
'Failed to parse headers (url=%s): %s',
|
||||
self._absolute_url(url), hpe, exc_info=True)
|
||||
|
||||
return httplib_response
|
||||
|
||||
def _absolute_url(self, path):
|
||||
return Url(scheme=self.scheme, host=self.host, port=self.port, path=path).url
|
||||
|
||||
def close(self):
|
||||
"""
|
||||
Close all pooled connections and disable the pool.
|
||||
@@ -510,11 +544,18 @@ class HTTPConnectionPool(ConnectionPool, RequestMethods):
|
||||
|
||||
try:
|
||||
# Request a connection from the queue.
|
||||
timeout_obj = self._get_timeout(timeout)
|
||||
conn = self._get_conn(timeout=pool_timeout)
|
||||
|
||||
conn.timeout = timeout_obj.connect_timeout
|
||||
|
||||
is_new_proxy_conn = self.proxy is not None and not getattr(conn, 'sock', None)
|
||||
if is_new_proxy_conn:
|
||||
self._prepare_proxy(conn)
|
||||
|
||||
# Make the request on the httplib connection object.
|
||||
httplib_response = self._make_request(conn, method, url,
|
||||
timeout=timeout,
|
||||
timeout=timeout_obj,
|
||||
body=body, headers=headers)
|
||||
|
||||
# If we're going to release the connection in ``finally:``, then
|
||||
@@ -542,26 +583,30 @@ class HTTPConnectionPool(ConnectionPool, RequestMethods):
|
||||
# Close the connection. If a connection is reused on which there
|
||||
# was a Certificate error, the next request will certainly raise
|
||||
# another Certificate error.
|
||||
if conn:
|
||||
conn.close()
|
||||
conn = None
|
||||
conn = conn and conn.close()
|
||||
release_conn = True
|
||||
raise SSLError(e)
|
||||
|
||||
except (TimeoutError, HTTPException, SocketError, ConnectionError) as e:
|
||||
if conn:
|
||||
# Discard the connection for these exceptions. It will be
|
||||
# be replaced during the next _get_conn() call.
|
||||
conn.close()
|
||||
conn = None
|
||||
except SSLError:
|
||||
# Treat SSLError separately from BaseSSLError to preserve
|
||||
# traceback.
|
||||
conn = conn and conn.close()
|
||||
release_conn = True
|
||||
raise
|
||||
|
||||
stacktrace = sys.exc_info()[2]
|
||||
if isinstance(e, SocketError) and self.proxy:
|
||||
except (TimeoutError, HTTPException, SocketError, ProtocolError) as e:
|
||||
# Discard the connection for these exceptions. It will be
|
||||
# replaced during the next _get_conn() call.
|
||||
conn = conn and conn.close()
|
||||
release_conn = True
|
||||
|
||||
if isinstance(e, (SocketError, NewConnectionError)) and self.proxy:
|
||||
e = ProxyError('Cannot connect to proxy.', e)
|
||||
elif isinstance(e, (SocketError, HTTPException)):
|
||||
e = ProtocolError('Connection aborted.', e)
|
||||
|
||||
retries = retries.increment(method, url, error=e,
|
||||
_pool=self, _stacktrace=stacktrace)
|
||||
retries = retries.increment(method, url, error=e, _pool=self,
|
||||
_stacktrace=sys.exc_info()[2])
|
||||
retries.sleep()
|
||||
|
||||
# Keep track of the error for the retry warning.
|
||||
@@ -593,26 +638,31 @@ class HTTPConnectionPool(ConnectionPool, RequestMethods):
|
||||
retries = retries.increment(method, url, response=response, _pool=self)
|
||||
except MaxRetryError:
|
||||
if retries.raise_on_redirect:
|
||||
# Release the connection for this response, since we're not
|
||||
# returning it to be released manually.
|
||||
response.release_conn()
|
||||
raise
|
||||
return response
|
||||
|
||||
log.info("Redirecting %s -> %s" % (url, redirect_location))
|
||||
return self.urlopen(method, redirect_location, body, headers,
|
||||
retries=retries, redirect=redirect,
|
||||
assert_same_host=assert_same_host,
|
||||
timeout=timeout, pool_timeout=pool_timeout,
|
||||
release_conn=release_conn, **response_kw)
|
||||
return self.urlopen(
|
||||
method, redirect_location, body, headers,
|
||||
retries=retries, redirect=redirect,
|
||||
assert_same_host=assert_same_host,
|
||||
timeout=timeout, pool_timeout=pool_timeout,
|
||||
release_conn=release_conn, **response_kw)
|
||||
|
||||
# Check if we should retry the HTTP response.
|
||||
if retries.is_forced_retry(method, status_code=response.status):
|
||||
retries = retries.increment(method, url, response=response, _pool=self)
|
||||
retries.sleep()
|
||||
log.info("Forced retry: %s" % url)
|
||||
return self.urlopen(method, url, body, headers,
|
||||
retries=retries, redirect=redirect,
|
||||
assert_same_host=assert_same_host,
|
||||
timeout=timeout, pool_timeout=pool_timeout,
|
||||
release_conn=release_conn, **response_kw)
|
||||
return self.urlopen(
|
||||
method, url, body, headers,
|
||||
retries=retries, redirect=redirect,
|
||||
assert_same_host=assert_same_host,
|
||||
timeout=timeout, pool_timeout=pool_timeout,
|
||||
release_conn=release_conn, **response_kw)
|
||||
|
||||
return response
|
||||
|
||||
@@ -629,10 +679,10 @@ class HTTPSConnectionPool(HTTPConnectionPool):
|
||||
``assert_hostname`` and ``host`` in this order to verify connections.
|
||||
If ``assert_hostname`` is False, no verification is done.
|
||||
|
||||
The ``key_file``, ``cert_file``, ``cert_reqs``, ``ca_certs`` and
|
||||
``ssl_version`` are only used if :mod:`ssl` is available and are fed into
|
||||
:meth:`urllib3.util.ssl_wrap_socket` to upgrade the connection socket
|
||||
into an SSL socket.
|
||||
The ``key_file``, ``cert_file``, ``cert_reqs``, ``ca_certs``,
|
||||
``ca_cert_dir``, and ``ssl_version`` are only used if :mod:`ssl` is
|
||||
available and are fed into :meth:`urllib3.util.ssl_wrap_socket` to upgrade
|
||||
the connection socket into an SSL socket.
|
||||
"""
|
||||
|
||||
scheme = 'https'
|
||||
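Because the per-connection set_cert() shown earlier now treats either ca_certs or ca_cert_dir as a request for verification, an HTTPS pool built with only a CA directory still ends up verifying with CERT_REQUIRED. A hedged sketch of the new keyword (the directory path is a placeholder for an OpenSSL-style hashed CA directory):

    from requests.packages.urllib3 import HTTPSConnectionPool

    pool = HTTPSConnectionPool(
        'example.com', 443,
        ca_cert_dir='/etc/ssl/certs',  # placeholder path
    )
    # cert_reqs was not given, so set_cert() upgrades it to CERT_REQUIRED
    # and the handshake verifies against the certificates in ca_cert_dir.
    response = pool.request('GET', '/')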
@@ -645,15 +695,20 @@ class HTTPSConnectionPool(HTTPConnectionPool):
|
||||
key_file=None, cert_file=None, cert_reqs=None,
|
||||
ca_certs=None, ssl_version=None,
|
||||
assert_hostname=None, assert_fingerprint=None,
|
||||
**conn_kw):
|
||||
ca_cert_dir=None, **conn_kw):
|
||||
|
||||
HTTPConnectionPool.__init__(self, host, port, strict, timeout, maxsize,
|
||||
block, headers, retries, _proxy, _proxy_headers,
|
||||
**conn_kw)
|
||||
|
||||
if ca_certs and cert_reqs is None:
|
||||
cert_reqs = 'CERT_REQUIRED'
|
||||
|
||||
self.key_file = key_file
|
||||
self.cert_file = cert_file
|
||||
self.cert_reqs = cert_reqs
|
||||
self.ca_certs = ca_certs
|
||||
self.ca_cert_dir = ca_cert_dir
|
||||
self.ssl_version = ssl_version
|
||||
self.assert_hostname = assert_hostname
|
||||
self.assert_fingerprint = assert_fingerprint
|
||||
@@ -669,28 +724,31 @@ class HTTPSConnectionPool(HTTPConnectionPool):
|
||||
cert_file=self.cert_file,
|
||||
cert_reqs=self.cert_reqs,
|
||||
ca_certs=self.ca_certs,
|
||||
ca_cert_dir=self.ca_cert_dir,
|
||||
assert_hostname=self.assert_hostname,
|
||||
assert_fingerprint=self.assert_fingerprint)
|
||||
conn.ssl_version = self.ssl_version
|
||||
|
||||
if self.proxy is not None:
|
||||
# Python 2.7+
|
||||
try:
|
||||
set_tunnel = conn.set_tunnel
|
||||
except AttributeError: # Platform-specific: Python 2.6
|
||||
set_tunnel = conn._set_tunnel
|
||||
|
||||
if sys.version_info <= (2, 6, 4) and not self.proxy_headers: # Python 2.6.4 and older
|
||||
set_tunnel(self.host, self.port)
|
||||
else:
|
||||
set_tunnel(self.host, self.port, self.proxy_headers)
|
||||
|
||||
# Establish tunnel connection early, because otherwise httplib
|
||||
# would improperly set Host: header to proxy's IP:port.
|
||||
conn.connect()
|
||||
|
||||
return conn
|
||||
|
||||
def _prepare_proxy(self, conn):
|
||||
"""
|
||||
Establish tunnel connection early, because otherwise httplib
|
||||
would improperly set Host: header to proxy's IP:port.
|
||||
"""
|
||||
# Python 2.7+
|
||||
try:
|
||||
set_tunnel = conn.set_tunnel
|
||||
except AttributeError: # Platform-specific: Python 2.6
|
||||
set_tunnel = conn._set_tunnel
|
||||
|
||||
if sys.version_info <= (2, 6, 4) and not self.proxy_headers: # Python 2.6.4 and older
|
||||
set_tunnel(self.host, self.port)
|
||||
else:
|
||||
set_tunnel(self.host, self.port, self.proxy_headers)
|
||||
|
||||
conn.connect()
|
||||
|
||||
def _new_conn(self):
|
||||
"""
|
||||
Return a fresh :class:`httplib.HTTPSConnection`.
|
||||
@@ -700,7 +758,6 @@ class HTTPSConnectionPool(HTTPConnectionPool):
|
||||
% (self.num_connections, self.host))
|
||||
|
||||
if not self.ConnectionCls or self.ConnectionCls is DummyConnection:
|
||||
# Platform-specific: Python without ssl
|
||||
raise SSLError("Can't connect to HTTPS URL because the SSL "
|
||||
"module is not available.")
|
||||
|
||||
|
||||
223
python/requests/requests/packages/urllib3/contrib/appengine.py
Normal file
@@ -0,0 +1,223 @@
|
||||
from __future__ import absolute_import
|
||||
import logging
|
||||
import os
|
||||
import warnings
|
||||
|
||||
from ..exceptions import (
|
||||
HTTPError,
|
||||
HTTPWarning,
|
||||
MaxRetryError,
|
||||
ProtocolError,
|
||||
TimeoutError,
|
||||
SSLError
|
||||
)
|
||||
|
||||
from ..packages.six import BytesIO
|
||||
from ..request import RequestMethods
|
||||
from ..response import HTTPResponse
|
||||
from ..util.timeout import Timeout
|
||||
from ..util.retry import Retry
|
||||
|
||||
try:
|
||||
from google.appengine.api import urlfetch
|
||||
except ImportError:
|
||||
urlfetch = None
|
||||
|
||||
|
||||
log = logging.getLogger(__name__)
|
||||
|
||||
|
||||
class AppEnginePlatformWarning(HTTPWarning):
|
||||
pass
|
||||
|
||||
|
||||
class AppEnginePlatformError(HTTPError):
|
||||
pass
|
||||
|
||||
|
||||
class AppEngineManager(RequestMethods):
|
||||
"""
|
||||
Connection manager for Google App Engine sandbox applications.
|
||||
|
||||
This manager uses the URLFetch service directly instead of using the
|
||||
emulated httplib, and is subject to URLFetch limitations as described in
|
||||
the App Engine documentation here:
|
||||
|
||||
https://cloud.google.com/appengine/docs/python/urlfetch
|
||||
|
||||
Notably it will raise an AppEnginePlatformError if:
|
||||
* URLFetch is not available.
|
||||
* If you attempt to use this on GAEv2 (Managed VMs), as full socket
|
||||
support is available.
|
||||
* If a request size is more than 10 megabytes.
|
||||
* If a response size is more than 32 megabytes.
|
||||
* If you use an unsupported request method such as OPTIONS.
|
||||
|
||||
Beyond those cases, it will raise normal urllib3 errors.
|
||||
"""
|
||||
|
||||
def __init__(self, headers=None, retries=None, validate_certificate=True):
|
||||
if not urlfetch:
|
||||
raise AppEnginePlatformError(
|
||||
"URLFetch is not available in this environment.")
|
||||
|
||||
if is_prod_appengine_mvms():
|
||||
raise AppEnginePlatformError(
|
||||
"Use normal urllib3.PoolManager instead of AppEngineManager"
|
||||
"on Managed VMs, as using URLFetch is not necessary in "
|
||||
"this environment.")
|
||||
|
||||
warnings.warn(
|
||||
"urllib3 is using URLFetch on Google App Engine sandbox instead "
|
||||
"of sockets. To use sockets directly instead of URLFetch see "
|
||||
"https://urllib3.readthedocs.org/en/latest/contrib.html.",
|
||||
AppEnginePlatformWarning)
|
||||
|
||||
RequestMethods.__init__(self, headers)
|
||||
self.validate_certificate = validate_certificate
|
||||
|
||||
self.retries = retries or Retry.DEFAULT
|
||||
|
||||
def __enter__(self):
|
||||
return self
|
||||
|
||||
def __exit__(self, exc_type, exc_val, exc_tb):
|
||||
# Return False to re-raise any potential exceptions
|
||||
return False
|
||||
|
||||
def urlopen(self, method, url, body=None, headers=None,
|
||||
retries=None, redirect=True, timeout=Timeout.DEFAULT_TIMEOUT,
|
||||
**response_kw):
|
||||
|
||||
retries = self._get_retries(retries, redirect)
|
||||
|
||||
try:
|
||||
response = urlfetch.fetch(
|
||||
url,
|
||||
payload=body,
|
||||
method=method,
|
||||
headers=headers or {},
|
||||
allow_truncated=False,
|
||||
follow_redirects=(
|
||||
redirect and
|
||||
retries.redirect != 0 and
|
||||
retries.total),
|
||||
deadline=self._get_absolute_timeout(timeout),
|
||||
validate_certificate=self.validate_certificate,
|
||||
)
|
||||
except urlfetch.DeadlineExceededError as e:
|
||||
raise TimeoutError(self, e)
|
||||
|
||||
except urlfetch.InvalidURLError as e:
|
||||
if 'too large' in str(e):
|
||||
raise AppEnginePlatformError(
|
||||
"URLFetch request too large, URLFetch only "
|
||||
"supports requests up to 10mb in size.", e)
|
||||
raise ProtocolError(e)
|
||||
|
||||
except urlfetch.DownloadError as e:
|
||||
if 'Too many redirects' in str(e):
|
||||
raise MaxRetryError(self, url, reason=e)
|
||||
raise ProtocolError(e)
|
||||
|
||||
except urlfetch.ResponseTooLargeError as e:
|
||||
raise AppEnginePlatformError(
|
||||
"URLFetch response too large, URLFetch only supports"
|
||||
"responses up to 32mb in size.", e)
|
||||
|
||||
except urlfetch.SSLCertificateError as e:
|
||||
raise SSLError(e)
|
||||
|
||||
except urlfetch.InvalidMethodError as e:
|
||||
raise AppEnginePlatformError(
|
||||
"URLFetch does not support method: %s" % method, e)
|
||||
|
||||
http_response = self._urlfetch_response_to_http_response(
|
||||
response, **response_kw)
|
||||
|
||||
# Check for redirect response
|
||||
if (http_response.get_redirect_location() and
|
||||
retries.raise_on_redirect and redirect):
|
||||
raise MaxRetryError(self, url, "too many redirects")
|
||||
|
||||
# Check if we should retry the HTTP response.
|
||||
if retries.is_forced_retry(method, status_code=http_response.status):
|
||||
retries = retries.increment(
|
||||
method, url, response=http_response, _pool=self)
|
||||
log.info("Forced retry: %s" % url)
|
||||
retries.sleep()
|
||||
return self.urlopen(
|
||||
method, url,
|
||||
body=body, headers=headers,
|
||||
retries=retries, redirect=redirect,
|
||||
timeout=timeout, **response_kw)
|
||||
|
||||
return http_response
|
||||
|
||||
def _urlfetch_response_to_http_response(self, urlfetch_resp, **response_kw):
|
||||
|
||||
if is_prod_appengine():
|
||||
# Production GAE handles deflate encoding automatically, but does
|
||||
# not remove the encoding header.
|
||||
content_encoding = urlfetch_resp.headers.get('content-encoding')
|
||||
|
||||
if content_encoding == 'deflate':
|
||||
del urlfetch_resp.headers['content-encoding']
|
||||
|
||||
return HTTPResponse(
|
||||
# In order for decoding to work, we must present the content as
|
||||
# a file-like object.
|
||||
body=BytesIO(urlfetch_resp.content),
|
||||
headers=urlfetch_resp.headers,
|
||||
status=urlfetch_resp.status_code,
|
||||
**response_kw
|
||||
)
|
||||
|
||||
def _get_absolute_timeout(self, timeout):
|
||||
if timeout is Timeout.DEFAULT_TIMEOUT:
|
||||
return 5 # 5s is the default timeout for URLFetch.
|
||||
if isinstance(timeout, Timeout):
|
||||
if timeout.read is not timeout.connect:
|
||||
warnings.warn(
|
||||
"URLFetch does not support granular timeout settings, "
|
||||
"reverting to total timeout.", AppEnginePlatformWarning)
|
||||
return timeout.total
|
||||
return timeout
|
||||
|
||||
def _get_retries(self, retries, redirect):
|
||||
if not isinstance(retries, Retry):
|
||||
retries = Retry.from_int(
|
||||
retries, redirect=redirect, default=self.retries)
|
||||
|
||||
if retries.connect or retries.read or retries.redirect:
|
||||
warnings.warn(
|
||||
"URLFetch only supports total retries and does not "
|
||||
"recognize connect, read, or redirect retry parameters.",
|
||||
AppEnginePlatformWarning)
|
||||
|
||||
return retries
|
||||
|
||||
|
||||
def is_appengine():
|
||||
return (is_local_appengine() or
|
||||
is_prod_appengine() or
|
||||
is_prod_appengine_mvms())
|
||||
|
||||
|
||||
def is_appengine_sandbox():
|
||||
return is_appengine() and not is_prod_appengine_mvms()
|
||||
|
||||
|
||||
def is_local_appengine():
|
||||
return ('APPENGINE_RUNTIME' in os.environ and
|
||||
'Development/' in os.environ['SERVER_SOFTWARE'])
|
||||
|
||||
|
||||
def is_prod_appengine():
|
||||
return ('APPENGINE_RUNTIME' in os.environ and
|
||||
'Google App Engine/' in os.environ['SERVER_SOFTWARE'] and
|
||||
not is_prod_appengine_mvms())
|
||||
|
||||
|
||||
def is_prod_appengine_mvms():
|
||||
return os.environ.get('GAE_VM', False) == 'true'
|
||||
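The contrib module added above exposes AppEngineManager as a RequestMethods implementation backed by URLFetch. A minimal sketch of how sandboxed application code might use it (the URL is a placeholder, not part of the diff):

    from requests.packages.urllib3.contrib.appengine import (
        AppEngineManager, is_appengine_sandbox)

    if is_appengine_sandbox():
        # Only the total retry count is honoured; connect/read/redirect
        # granularity triggers an AppEnginePlatformWarning.
        http = AppEngineManager(retries=2, validate_certificate=True)
        response = http.request('GET', 'https://www.example.com/')
        print(response.status)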
@@ -3,6 +3,7 @@ NTLM authenticating pool, contributed by erikcederstran
|
||||
|
||||
Issue #10, see: http://code.google.com/p/urllib3/issues/detail?id=10
|
||||
"""
|
||||
from __future__ import absolute_import
|
||||
|
||||
try:
|
||||
from http.client import HTTPSConnection
|
||||
|
||||
@@ -38,13 +38,12 @@ Module Variables
|
||||
----------------
|
||||
|
||||
:var DEFAULT_SSL_CIPHER_LIST: The list of supported SSL/TLS cipher suites.
|
||||
Default: ``ECDH+AESGCM:DH+AESGCM:ECDH+AES256:DH+AES256:ECDH+AES128:DH+AES:
|
||||
ECDH+3DES:DH+3DES:RSA+AESGCM:RSA+AES:RSA+3DES:!aNULL:!MD5:!DSS``
|
||||
|
||||
.. _sni: https://en.wikipedia.org/wiki/Server_Name_Indication
|
||||
.. _crime attack: https://en.wikipedia.org/wiki/CRIME_(security_exploit)
|
||||
|
||||
'''
|
||||
from __future__ import absolute_import
|
||||
|
||||
try:
|
||||
from ndg.httpsclient.ssl_peer_verification import SUBJ_ALT_NAME_SUPPORT
|
||||
@@ -55,7 +54,7 @@ except SyntaxError as e:
|
||||
import OpenSSL.SSL
|
||||
from pyasn1.codec.der import decoder as der_decoder
|
||||
from pyasn1.type import univ, constraint
|
||||
from socket import _fileobject, timeout
|
||||
from socket import _fileobject, timeout, error as SocketError
|
||||
import ssl
|
||||
import select
|
||||
|
||||
@@ -73,6 +72,12 @@ _openssl_versions = {
|
||||
ssl.PROTOCOL_TLSv1: OpenSSL.SSL.TLSv1_METHOD,
|
||||
}
|
||||
|
||||
if hasattr(ssl, 'PROTOCOL_TLSv1_1') and hasattr(OpenSSL.SSL, 'TLSv1_1_METHOD'):
|
||||
_openssl_versions[ssl.PROTOCOL_TLSv1_1] = OpenSSL.SSL.TLSv1_1_METHOD
|
||||
|
||||
if hasattr(ssl, 'PROTOCOL_TLSv1_2') and hasattr(OpenSSL.SSL, 'TLSv1_2_METHOD'):
|
||||
_openssl_versions[ssl.PROTOCOL_TLSv1_2] = OpenSSL.SSL.TLSv1_2_METHOD
|
||||
|
||||
try:
|
||||
_openssl_versions.update({ssl.PROTOCOL_SSLv3: OpenSSL.SSL.SSLv3_METHOD})
|
||||
except AttributeError:
|
||||
@@ -81,27 +86,14 @@ except AttributeError:
|
||||
_openssl_verify = {
|
||||
ssl.CERT_NONE: OpenSSL.SSL.VERIFY_NONE,
|
||||
ssl.CERT_OPTIONAL: OpenSSL.SSL.VERIFY_PEER,
|
||||
ssl.CERT_REQUIRED: OpenSSL.SSL.VERIFY_PEER
|
||||
+ OpenSSL.SSL.VERIFY_FAIL_IF_NO_PEER_CERT,
|
||||
ssl.CERT_REQUIRED:
|
||||
OpenSSL.SSL.VERIFY_PEER + OpenSSL.SSL.VERIFY_FAIL_IF_NO_PEER_CERT,
|
||||
}
|
||||
|
||||
# A secure default.
|
||||
# Sources for more information on TLS ciphers:
|
||||
#
|
||||
# - https://wiki.mozilla.org/Security/Server_Side_TLS
|
||||
# - https://www.ssllabs.com/projects/best-practices/index.html
|
||||
# - https://hynek.me/articles/hardening-your-web-servers-ssl-ciphers/
|
||||
#
|
||||
# The general intent is:
|
||||
# - Prefer cipher suites that offer perfect forward secrecy (DHE/ECDHE),
|
||||
# - prefer ECDHE over DHE for better performance,
|
||||
# - prefer any AES-GCM over any AES-CBC for better performance and security,
|
||||
# - use 3DES as fallback which is secure but slow,
|
||||
# - disable NULL authentication, MD5 MACs and DSS for security reasons.
|
||||
DEFAULT_SSL_CIPHER_LIST = "ECDH+AESGCM:DH+AESGCM:ECDH+AES256:DH+AES256:" + \
|
||||
"ECDH+AES128:DH+AES:ECDH+3DES:DH+3DES:RSA+AESGCM:RSA+AES:RSA+3DES:" + \
|
||||
"!aNULL:!MD5:!DSS"
|
||||
DEFAULT_SSL_CIPHER_LIST = util.ssl_.DEFAULT_CIPHERS
|
||||
|
||||
# OpenSSL will only write 16K at a time
|
||||
SSL_WRITE_BLOCKSIZE = 16384
|
||||
|
||||
orig_util_HAS_SNI = util.HAS_SNI
|
||||
orig_connection_ssl_wrap_socket = connection.ssl_wrap_socket
|
||||
@@ -121,7 +113,7 @@ def extract_from_urllib3():
|
||||
util.HAS_SNI = orig_util_HAS_SNI
|
||||
|
||||
|
||||
### Note: This is a slightly bug-fixed version of same from ndg-httpsclient.
|
||||
# Note: This is a slightly bug-fixed version of same from ndg-httpsclient.
|
||||
class SubjectAltName(BaseSubjectAltName):
|
||||
'''ASN.1 implementation for subjectAltNames support'''
|
||||
|
||||
@@ -132,7 +124,7 @@ class SubjectAltName(BaseSubjectAltName):
|
||||
constraint.ValueSizeConstraint(1, 1024)
|
||||
|
||||
|
||||
### Note: This is a slightly bug-fixed version of same from ndg-httpsclient.
|
||||
# Note: This is a slightly bug-fixed version of same from ndg-httpsclient.
|
||||
def get_subj_alt_name(peer_cert):
|
||||
# Search through extensions
|
||||
dns_name = []
|
||||
@@ -189,6 +181,11 @@ class WrappedSocket(object):
|
||||
except OpenSSL.SSL.SysCallError as e:
|
||||
if self.suppress_ragged_eofs and e.args == (-1, 'Unexpected EOF'):
|
||||
return b''
|
||||
else:
|
||||
raise SocketError(e)
|
||||
except OpenSSL.SSL.ZeroReturnError as e:
|
||||
if self.connection.get_shutdown() == OpenSSL.SSL.RECEIVED_SHUTDOWN:
|
||||
return b''
|
||||
else:
|
||||
raise
|
||||
except OpenSSL.SSL.WantReadError:
|
||||
@@ -216,13 +213,21 @@ class WrappedSocket(object):
|
||||
continue
|
||||
|
||||
def sendall(self, data):
|
||||
while len(data):
|
||||
sent = self._send_until_done(data)
|
||||
data = data[sent:]
|
||||
total_sent = 0
|
||||
while total_sent < len(data):
|
||||
sent = self._send_until_done(data[total_sent:total_sent + SSL_WRITE_BLOCKSIZE])
|
||||
total_sent += sent
|
||||
|
||||
def shutdown(self):
|
||||
# FIXME rethrow compatible exceptions should we ever use this
|
||||
self.connection.shutdown()
|
||||
|
||||
def close(self):
|
||||
if self._makefile_refs < 1:
|
||||
return self.connection.shutdown()
|
||||
try:
|
||||
return self.connection.close()
|
||||
except OpenSSL.SSL.Error:
|
||||
return
|
||||
else:
|
||||
self._makefile_refs -= 1
|
||||
|
||||
@@ -263,7 +268,7 @@ def _verify_callback(cnx, x509, err_no, err_depth, return_code):
|
||||
|
||||
def ssl_wrap_socket(sock, keyfile=None, certfile=None, cert_reqs=None,
|
||||
ca_certs=None, server_hostname=None,
|
||||
ssl_version=None):
|
||||
ssl_version=None, ca_cert_dir=None):
|
||||
ctx = OpenSSL.SSL.Context(_openssl_versions[ssl_version])
|
||||
if certfile:
|
||||
keyfile = keyfile or certfile # Match behaviour of the normal python ssl library
|
||||
@@ -272,9 +277,9 @@ def ssl_wrap_socket(sock, keyfile=None, certfile=None, cert_reqs=None,
|
||||
ctx.use_privatekey_file(keyfile)
|
||||
if cert_reqs != ssl.CERT_NONE:
|
||||
ctx.set_verify(_openssl_verify[cert_reqs], _verify_callback)
|
||||
if ca_certs:
|
||||
if ca_certs or ca_cert_dir:
|
||||
try:
|
||||
ctx.load_verify_locations(ca_certs, None)
|
||||
ctx.load_verify_locations(ca_certs, ca_cert_dir)
|
||||
except OpenSSL.SSL.Error as e:
|
||||
raise ssl.SSLError('bad ca_certs: %r' % ca_certs, e)
|
||||
else:
|
||||
@@ -294,10 +299,12 @@ def ssl_wrap_socket(sock, keyfile=None, certfile=None, cert_reqs=None,
|
||||
try:
|
||||
cnx.do_handshake()
|
||||
except OpenSSL.SSL.WantReadError:
|
||||
select.select([sock], [], [])
|
||||
rd, _, _ = select.select([sock], [], [], sock.gettimeout())
|
||||
if not rd:
|
||||
raise timeout('select timed out')
|
||||
continue
|
||||
except OpenSSL.SSL.Error as e:
|
||||
raise ssl.SSLError('bad handshake', e)
|
||||
raise ssl.SSLError('bad handshake: %r' % e)
|
||||
break
|
||||
|
||||
return WrappedSocket(cnx, sock)
|
||||
|
||||
@@ -1,16 +1,17 @@
|
||||
from __future__ import absolute_import
|
||||
# Base Exceptions
|
||||
|
||||
## Base Exceptions
|
||||
|
||||
class HTTPError(Exception):
|
||||
"Base exception used by this module."
|
||||
pass
|
||||
|
||||
|
||||
class HTTPWarning(Warning):
|
||||
"Base warning used by this module."
|
||||
pass
|
||||
|
||||
|
||||
|
||||
class PoolError(HTTPError):
|
||||
"Base exception for errors caused within a pool."
|
||||
def __init__(self, pool, message):
|
||||
@@ -57,7 +58,7 @@ class ProtocolError(HTTPError):
|
||||
ConnectionError = ProtocolError
|
||||
|
||||
|
||||
## Leaf Exceptions
|
||||
# Leaf Exceptions
|
||||
|
||||
class MaxRetryError(RequestError):
|
||||
"""Raised when the maximum number of retries is exceeded.
|
||||
@@ -113,6 +114,11 @@ class ConnectTimeoutError(TimeoutError):
|
||||
pass
|
||||
|
||||
|
||||
class NewConnectionError(ConnectTimeoutError, PoolError):
|
||||
"Raised when we fail to establish a new connection. Usually ECONNREFUSED."
|
||||
pass
|
||||
|
||||
|
||||
class EmptyPoolError(PoolError):
|
||||
"Raised when a pool runs out of connections and no more are allowed."
|
||||
pass
|
||||
@@ -149,6 +155,11 @@ class SecurityWarning(HTTPWarning):
|
||||
pass
|
||||
|
||||
|
||||
class SubjectAltNameWarning(SecurityWarning):
|
||||
"Warned when connecting to a host with a certificate missing a SAN."
|
||||
pass
|
||||
|
||||
|
||||
class InsecureRequestWarning(SecurityWarning):
|
||||
"Warned when making an unverified HTTPS request."
|
||||
pass
|
||||
@@ -157,3 +168,34 @@ class InsecureRequestWarning(SecurityWarning):
|
||||
class SystemTimeWarning(SecurityWarning):
|
||||
"Warned when system time is suspected to be wrong"
|
||||
pass
|
||||
|
||||
|
||||
class InsecurePlatformWarning(SecurityWarning):
|
||||
"Warned when certain SSL configuration is not available on a platform."
|
||||
pass
|
||||
|
||||
|
||||
class SNIMissingWarning(HTTPWarning):
|
||||
"Warned when making a HTTPS request without SNI available."
|
||||
pass
|
||||
|
||||
|
||||
class ResponseNotChunked(ProtocolError, ValueError):
|
||||
"Response needs to be chunked in order to read it as chunks."
|
||||
pass
|
||||
|
||||
|
||||
class ProxySchemeUnknown(AssertionError, ValueError):
|
||||
"ProxyManager does not support the supplied scheme"
|
||||
# TODO(t-8ch): Stop inheriting from AssertionError in v2.0.
|
||||
|
||||
def __init__(self, scheme):
|
||||
message = "Not supported proxy scheme %s" % scheme
|
||||
super(ProxySchemeUnknown, self).__init__(message)
|
||||
|
||||
|
||||
class HeaderParsingError(HTTPError):
|
||||
"Raised by assert_header_parsing, but we convert it to a log.warning statement."
|
||||
def __init__(self, defects, unparsed_data):
|
||||
message = '%s, unparsed data: %r' % (defects or 'Unknown', unparsed_data)
|
||||
super(HeaderParsingError, self).__init__(message)
|
||||
|
||||
@@ -1,3 +1,4 @@
|
||||
from __future__ import absolute_import
|
||||
import email.utils
|
||||
import mimetypes
|
||||
|
||||
|
||||
@@ -1,3 +1,4 @@
|
||||
from __future__ import absolute_import
|
||||
import codecs
|
||||
|
||||
from uuid import uuid4
|
||||
|
||||
@@ -2,3 +2,4 @@ from __future__ import absolute_import
|
||||
|
||||
from . import ssl_match_hostname
|
||||
|
||||
__all__ = ('ssl_match_hostname', )
|
||||
|
||||
@@ -1,3 +1,4 @@
|
||||
from __future__ import absolute_import
|
||||
import logging
|
||||
|
||||
try: # Python 3
|
||||
@@ -8,7 +9,7 @@ except ImportError:
|
||||
from ._collections import RecentlyUsedContainer
|
||||
from .connectionpool import HTTPConnectionPool, HTTPSConnectionPool
|
||||
from .connectionpool import port_by_scheme
|
||||
from .exceptions import LocationValueError
|
||||
from .exceptions import LocationValueError, MaxRetryError, ProxySchemeUnknown
|
||||
from .request import RequestMethods
|
||||
from .util.url import parse_url
|
||||
from .util.retry import Retry
|
||||
@@ -25,7 +26,7 @@ pool_classes_by_scheme = {
|
||||
log = logging.getLogger(__name__)
|
||||
|
||||
SSL_KEYWORDS = ('key_file', 'cert_file', 'cert_reqs', 'ca_certs',
|
||||
'ssl_version')
|
||||
'ssl_version', 'ca_cert_dir')
|
||||
|
||||
|
||||
class PoolManager(RequestMethods):
|
||||
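Adding 'ca_cert_dir' to SSL_KEYWORDS lets PoolManager accept the new option while stripping it from plain-HTTP pools. A sketch, assuming the directory path is a placeholder:

    from requests.packages.urllib3 import PoolManager

    http = PoolManager(cert_reqs='CERT_REQUIRED', ca_cert_dir='/etc/ssl/certs')
    # HTTPS pools receive ca_cert_dir; for an http:// URL the keyword is
    # filtered out via SSL_KEYWORDS so httplib never sees it.
    r = http.request('GET', 'https://example.com/')
    print(r.status)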
@@ -64,6 +65,14 @@ class PoolManager(RequestMethods):
|
||||
self.pools = RecentlyUsedContainer(num_pools,
|
||||
dispose_func=lambda p: p.close())
|
||||
|
||||
def __enter__(self):
|
||||
return self
|
||||
|
||||
def __exit__(self, exc_type, exc_val, exc_tb):
|
||||
self.clear()
|
||||
# Return False to re-raise any potential exceptions
|
||||
return False
|
||||
|
||||
def _new_pool(self, scheme, host, port):
|
||||
"""
|
||||
Create a new :class:`ConnectionPool` based on host, port and scheme.
|
||||
@@ -167,7 +176,14 @@ class PoolManager(RequestMethods):
|
||||
if not isinstance(retries, Retry):
|
||||
retries = Retry.from_int(retries, redirect=redirect)
|
||||
|
||||
kw['retries'] = retries.increment(method, redirect_location)
|
||||
try:
|
||||
retries = retries.increment(method, url, response=response, _pool=conn)
|
||||
except MaxRetryError:
|
||||
if retries.raise_on_redirect:
|
||||
raise
|
||||
return response
|
||||
|
||||
kw['retries'] = retries
|
||||
kw['redirect'] = redirect
|
||||
|
||||
log.info("Redirecting %s -> %s" % (url, redirect_location))
|
||||
@@ -212,8 +228,8 @@ class ProxyManager(PoolManager):
|
||||
port = port_by_scheme.get(proxy.scheme, 80)
|
||||
proxy = proxy._replace(port=port)
|
||||
|
||||
assert proxy.scheme in ("http", "https"), \
|
||||
'Not supported proxy scheme %s' % proxy.scheme
|
||||
if proxy.scheme not in ("http", "https"):
|
||||
raise ProxySchemeUnknown(proxy.scheme)
|
||||
|
||||
self.proxy = proxy
|
||||
self.proxy_headers = proxy_headers or {}
|
||||
|
||||
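With the assert replaced by a real exception above, an unsupported proxy scheme now fails loudly even under python -O. A small sketch:

    from requests.packages.urllib3 import ProxyManager
    from requests.packages.urllib3.exceptions import ProxySchemeUnknown

    try:
        ProxyManager('socks5://127.0.0.1:1080')
    except ProxySchemeUnknown as exc:
        print(exc)  # "Not supported proxy scheme socks5"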
@@ -1,3 +1,4 @@
|
||||
from __future__ import absolute_import
|
||||
try:
|
||||
from urllib.parse import urlencode
|
||||
except ImportError:
|
||||
@@ -71,14 +72,22 @@ class RequestMethods(object):
|
||||
headers=headers,
|
||||
**urlopen_kw)
|
||||
|
||||
def request_encode_url(self, method, url, fields=None, **urlopen_kw):
|
||||
def request_encode_url(self, method, url, fields=None, headers=None,
|
||||
**urlopen_kw):
|
||||
"""
|
||||
Make a request using :meth:`urlopen` with the ``fields`` encoded in
|
||||
the url. This is useful for request methods like GET, HEAD, DELETE, etc.
|
||||
"""
|
||||
if headers is None:
|
||||
headers = self.headers
|
||||
|
||||
extra_kw = {'headers': headers}
|
||||
extra_kw.update(urlopen_kw)
|
||||
|
||||
if fields:
|
||||
url += '?' + urlencode(fields)
|
||||
return self.urlopen(method, url, **urlopen_kw)
|
||||
|
||||
return self.urlopen(method, url, **extra_kw)
|
||||
|
||||
def request_encode_body(self, method, url, fields=None, headers=None,
|
||||
encode_multipart=True, multipart_boundary=None,
|
||||
@@ -125,7 +134,8 @@ class RequestMethods(object):
|
||||
|
||||
if fields:
|
||||
if 'body' in urlopen_kw:
|
||||
raise TypeError('request got values for both \'fields\' and \'body\', can only specify one.')
|
||||
raise TypeError(
|
||||
"request got values for both 'fields' and 'body', can only specify one.")
|
||||
|
||||
if encode_multipart:
|
||||
body, content_type = encode_multipart_formdata(fields, boundary=multipart_boundary)
|
||||
|
||||
@@ -1,13 +1,18 @@
|
||||
from __future__ import absolute_import
|
||||
from contextlib import contextmanager
|
||||
import zlib
|
||||
import io
|
||||
from socket import timeout as SocketTimeout
|
||||
from socket import error as SocketError
|
||||
|
||||
from ._collections import HTTPHeaderDict
|
||||
from .exceptions import ProtocolError, DecodeError, ReadTimeoutError
|
||||
from .packages.six import string_types as basestring, binary_type
|
||||
from .exceptions import (
|
||||
ProtocolError, DecodeError, ReadTimeoutError, ResponseNotChunked
|
||||
)
|
||||
from .packages.six import string_types as basestring, binary_type, PY3
|
||||
from .packages.six.moves import http_client as httplib
|
||||
from .connection import HTTPException, BaseSSLError
|
||||
from .util.response import is_fp_closed
|
||||
|
||||
from .util.response import is_fp_closed, is_response_to_head
|
||||
|
||||
|
||||
class DeflateDecoder(object):
|
||||
@@ -21,6 +26,9 @@ class DeflateDecoder(object):
|
||||
return getattr(self._obj, name)
|
||||
|
||||
def decompress(self, data):
|
||||
if not data:
|
||||
return data
|
||||
|
||||
if not self._first_try:
|
||||
return self._obj.decompress(data)
|
||||
|
||||
@@ -36,9 +44,23 @@ class DeflateDecoder(object):
|
||||
self._data = None
|
||||
|
||||
|
||||
class GzipDecoder(object):
|
||||
|
||||
def __init__(self):
|
||||
self._obj = zlib.decompressobj(16 + zlib.MAX_WBITS)
|
||||
|
||||
def __getattr__(self, name):
|
||||
return getattr(self._obj, name)
|
||||
|
||||
def decompress(self, data):
|
||||
if not data:
|
||||
return data
|
||||
return self._obj.decompress(data)
|
||||
|
||||
|
||||
def _get_decoder(mode):
|
||||
if mode == 'gzip':
|
||||
return zlib.decompressobj(16 + zlib.MAX_WBITS)
|
||||
return GzipDecoder()
|
||||
|
||||
return DeflateDecoder()
|
||||
|
||||
@@ -76,9 +98,10 @@ class HTTPResponse(io.IOBase):
|
||||
strict=0, preload_content=True, decode_content=True,
|
||||
original_response=None, pool=None, connection=None):
|
||||
|
||||
self.headers = HTTPHeaderDict()
|
||||
if headers:
|
||||
self.headers.update(headers)
|
||||
if isinstance(headers, HTTPHeaderDict):
|
||||
self.headers = headers
|
||||
else:
|
||||
self.headers = HTTPHeaderDict(headers)
|
||||
self.status = status
|
||||
self.version = version
|
||||
self.reason = reason
|
||||
@@ -100,6 +123,16 @@ class HTTPResponse(io.IOBase):
|
||||
if hasattr(body, 'read'):
|
||||
self._fp = body
|
||||
|
||||
# Are we using the chunked-style of transfer encoding?
|
||||
self.chunked = False
|
||||
self.chunk_left = None
|
||||
tr_enc = self.headers.get('transfer-encoding', '').lower()
|
||||
# Don't incur the penalty of creating a list and then discarding it
|
||||
encodings = (enc.strip() for enc in tr_enc.split(","))
|
||||
if "chunked" in encodings:
|
||||
self.chunked = True
|
||||
|
||||
# If requested, preload the body.
|
||||
if preload_content and not self._body:
|
||||
self._body = self.read(decode_content=decode_content)
|
||||
|
||||
@@ -140,6 +173,93 @@ class HTTPResponse(io.IOBase):
|
||||
"""
|
||||
return self._fp_bytes_read
|
||||
|
||||
def _init_decoder(self):
|
||||
"""
|
||||
Set up the _decoder attribute if necessary.
|
||||
"""
|
||||
# Note: content-encoding value should be case-insensitive, per RFC 7230
|
||||
# Section 3.2
|
||||
content_encoding = self.headers.get('content-encoding', '').lower()
|
||||
if self._decoder is None and content_encoding in self.CONTENT_DECODERS:
|
||||
self._decoder = _get_decoder(content_encoding)
|
||||
|
||||
def _decode(self, data, decode_content, flush_decoder):
|
||||
"""
|
||||
Decode the data passed in and potentially flush the decoder.
|
||||
"""
|
||||
try:
|
||||
if decode_content and self._decoder:
|
||||
data = self._decoder.decompress(data)
|
||||
except (IOError, zlib.error) as e:
|
||||
content_encoding = self.headers.get('content-encoding', '').lower()
|
||||
raise DecodeError(
|
||||
"Received response with content-encoding: %s, but "
|
||||
"failed to decode it." % content_encoding, e)
|
||||
|
||||
if flush_decoder and decode_content:
|
||||
data += self._flush_decoder()
|
||||
|
||||
return data
|
||||
|
||||
def _flush_decoder(self):
|
||||
"""
|
||||
Flushes the decoder. Should only be called if the decoder is actually
|
||||
being used.
|
||||
"""
|
||||
if self._decoder:
|
||||
buf = self._decoder.decompress(b'')
|
||||
return buf + self._decoder.flush()
|
||||
|
||||
return b''
|
||||
|
||||
@contextmanager
|
||||
def _error_catcher(self):
|
||||
"""
|
||||
Catch low-level python exceptions, instead re-raising urllib3
|
||||
variants, so that low-level exceptions are not leaked in the
|
||||
high-level api.
|
||||
|
||||
On exit, release the connection back to the pool.
|
||||
"""
|
||||
try:
|
||||
try:
|
||||
yield
|
||||
|
||||
except SocketTimeout:
|
||||
# FIXME: Ideally we'd like to include the url in the ReadTimeoutError but
|
||||
# there is yet no clean way to get at it from this context.
|
||||
raise ReadTimeoutError(self._pool, None, 'Read timed out.')
|
||||
|
||||
except BaseSSLError as e:
|
||||
# FIXME: Is there a better way to differentiate between SSLErrors?
|
||||
if 'read operation timed out' not in str(e): # Defensive:
|
||||
# This shouldn't happen but just in case we're missing an edge
|
||||
# case, let's avoid swallowing SSL errors.
|
||||
raise
|
||||
|
||||
raise ReadTimeoutError(self._pool, None, 'Read timed out.')
|
||||
|
||||
except (HTTPException, SocketError) as e:
|
||||
# This includes IncompleteRead.
|
||||
raise ProtocolError('Connection broken: %r' % e, e)
|
||||
|
||||
except Exception:
|
||||
# The response may not be closed but we're not going to use it anymore
|
||||
# so close it now to ensure that the connection is released back to the pool.
|
||||
if self._original_response and not self._original_response.isclosed():
|
||||
self._original_response.close()
|
||||
|
||||
# Closing the response may not actually be sufficient to close
|
||||
# everything, so if we have a hold of the connection close that
|
||||
# too.
|
||||
if self._connection is not None:
|
||||
self._connection.close()
|
||||
|
||||
raise
|
||||
finally:
|
||||
if self._original_response and self._original_response.isclosed():
|
||||
self.release_conn()
|
||||
|
||||
def read(self, amt=None, decode_content=None, cache_content=False):
|
||||
"""
|
||||
Similar to :meth:`httplib.HTTPResponse.read`, but with two additional
|
||||
@@ -161,12 +281,7 @@ class HTTPResponse(io.IOBase):
|
||||
after having ``.read()`` the file object. (Overridden if ``amt`` is
|
||||
set.)
|
||||
"""
|
||||
# Note: content-encoding value should be case-insensitive, per RFC 7230
|
||||
# Section 3.2
|
||||
content_encoding = self.headers.get('content-encoding', '').lower()
|
||||
if self._decoder is None:
|
||||
if content_encoding in self.CONTENT_DECODERS:
|
||||
self._decoder = _get_decoder(content_encoding)
|
||||
self._init_decoder()
|
||||
if decode_content is None:
|
||||
decode_content = self.decode_content
|
||||
|
||||
@@ -174,67 +289,36 @@ class HTTPResponse(io.IOBase):
|
||||
return
|
||||
|
||||
flush_decoder = False
|
||||
data = None
|
||||
|
||||
try:
|
||||
try:
|
||||
if amt is None:
|
||||
# cStringIO doesn't like amt=None
|
||||
data = self._fp.read()
|
||||
with self._error_catcher():
|
||||
if amt is None:
|
||||
# cStringIO doesn't like amt=None
|
||||
data = self._fp.read()
|
||||
flush_decoder = True
|
||||
else:
|
||||
cache_content = False
|
||||
data = self._fp.read(amt)
|
||||
if amt != 0 and not data: # Platform-specific: Buggy versions of Python.
|
||||
# Close the connection when no data is returned
|
||||
#
|
||||
# This is redundant to what httplib/http.client _should_
|
||||
# already do. However, versions of python released before
|
||||
# December 15, 2012 (http://bugs.python.org/issue16298) do
|
||||
# not properly close the connection in all cases. There is
|
||||
# no harm in redundantly calling close.
|
||||
self._fp.close()
|
||||
flush_decoder = True
|
||||
else:
|
||||
cache_content = False
|
||||
data = self._fp.read(amt)
|
||||
if amt != 0 and not data: # Platform-specific: Buggy versions of Python.
|
||||
# Close the connection when no data is returned
|
||||
#
|
||||
# This is redundant to what httplib/http.client _should_
|
||||
# already do. However, versions of python released before
|
||||
# December 15, 2012 (http://bugs.python.org/issue16298) do
|
||||
# not properly close the connection in all cases. There is
|
||||
# no harm in redundantly calling close.
|
||||
self._fp.close()
|
||||
flush_decoder = True
|
||||
|
||||
except SocketTimeout:
|
||||
# FIXME: Ideally we'd like to include the url in the ReadTimeoutError but
|
||||
# there is yet no clean way to get at it from this context.
|
||||
raise ReadTimeoutError(self._pool, None, 'Read timed out.')
|
||||
|
||||
except BaseSSLError as e:
|
||||
# FIXME: Is there a better way to differentiate between SSLErrors?
|
||||
if not 'read operation timed out' in str(e): # Defensive:
|
||||
# This shouldn't happen but just in case we're missing an edge
|
||||
# case, let's avoid swallowing SSL errors.
|
||||
raise
|
||||
|
||||
raise ReadTimeoutError(self._pool, None, 'Read timed out.')
|
||||
|
||||
except HTTPException as e:
|
||||
# This includes IncompleteRead.
|
||||
raise ProtocolError('Connection broken: %r' % e, e)
|
||||
|
||||
if data:
|
||||
self._fp_bytes_read += len(data)
|
||||
|
||||
try:
|
||||
if decode_content and self._decoder:
|
||||
data = self._decoder.decompress(data)
|
||||
except (IOError, zlib.error) as e:
|
||||
raise DecodeError(
|
||||
"Received response with content-encoding: %s, but "
|
||||
"failed to decode it." % content_encoding, e)
|
||||
|
||||
if flush_decoder and decode_content and self._decoder:
|
||||
buf = self._decoder.decompress(binary_type())
|
||||
data += buf + self._decoder.flush()
|
||||
data = self._decode(data, decode_content, flush_decoder)
|
||||
|
||||
if cache_content:
|
||||
self._body = data
|
||||
|
||||
return data
|
||||
|
||||
finally:
|
||||
if self._original_response and self._original_response.isclosed():
|
||||
self.release_conn()
|
||||
return data
|
||||
|
||||
def stream(self, amt=2**16, decode_content=None):
|
||||
"""
|
||||
@@ -252,11 +336,15 @@ class HTTPResponse(io.IOBase):
|
||||
If True, will attempt to decode the body based on the
|
||||
'content-encoding' header.
|
||||
"""
|
||||
while not is_fp_closed(self._fp):
|
||||
data = self.read(amt=amt, decode_content=decode_content)
|
||||
if self.chunked:
|
||||
for line in self.read_chunked(amt, decode_content=decode_content):
|
||||
yield line
|
||||
else:
|
||||
while not is_fp_closed(self._fp):
|
||||
data = self.read(amt=amt, decode_content=decode_content)
|
||||
|
||||
if data:
|
||||
yield data
|
||||
if data:
|
||||
yield data
|
||||
|
||||
@classmethod
|
||||
def from_httplib(ResponseCls, r, **response_kw):
|
||||
@@ -267,14 +355,17 @@ class HTTPResponse(io.IOBase):
|
||||
Remaining parameters are passed to the HTTPResponse constructor, along
|
||||
with ``original_response=r``.
|
||||
"""
|
||||
headers = r.msg
|
||||
|
||||
headers = HTTPHeaderDict()
|
||||
for k, v in r.getheaders():
|
||||
headers.add(k, v)
|
||||
if not isinstance(headers, HTTPHeaderDict):
|
||||
if PY3: # Python 3
|
||||
headers = HTTPHeaderDict(headers.items())
|
||||
else: # Python 2
|
||||
headers = HTTPHeaderDict.from_httplib(headers)
|
||||
|
||||
# HTTPResponse objects in Python 3 don't have a .strict attribute
|
||||
strict = getattr(r, 'strict', 0)
|
||||
return ResponseCls(body=r,
|
||||
resp = ResponseCls(body=r,
|
||||
headers=headers,
|
||||
status=r.status,
|
||||
version=r.version,
|
||||
@@ -282,6 +373,7 @@ class HTTPResponse(io.IOBase):
|
||||
strict=strict,
|
||||
original_response=r,
|
||||
**response_kw)
|
||||
return resp
|
||||
|
||||
# Backwards-compatibility methods for httplib.HTTPResponse
|
||||
def getheaders(self):
|
||||
@@ -331,3 +423,92 @@ class HTTPResponse(io.IOBase):
|
||||
else:
|
||||
b[:len(temp)] = temp
|
||||
return len(temp)
|
||||
|
||||
def _update_chunk_length(self):
|
||||
# First, we'll figure out length of a chunk and then
|
||||
# we'll try to read it from socket.
|
||||
if self.chunk_left is not None:
|
||||
return
|
||||
line = self._fp.fp.readline()
|
||||
line = line.split(b';', 1)[0]
|
||||
try:
|
||||
self.chunk_left = int(line, 16)
|
||||
except ValueError:
|
||||
# Invalid chunked protocol response, abort.
|
||||
self.close()
|
||||
raise httplib.IncompleteRead(line)
|
||||
|
||||
def _handle_chunk(self, amt):
|
||||
returned_chunk = None
|
||||
if amt is None:
|
||||
chunk = self._fp._safe_read(self.chunk_left)
|
||||
returned_chunk = chunk
|
||||
self._fp._safe_read(2) # Toss the CRLF at the end of the chunk.
|
||||
self.chunk_left = None
|
||||
elif amt < self.chunk_left:
|
||||
value = self._fp._safe_read(amt)
|
||||
self.chunk_left = self.chunk_left - amt
|
||||
returned_chunk = value
|
||||
elif amt == self.chunk_left:
|
||||
value = self._fp._safe_read(amt)
|
||||
self._fp._safe_read(2) # Toss the CRLF at the end of the chunk.
|
||||
self.chunk_left = None
|
||||
returned_chunk = value
|
||||
else: # amt > self.chunk_left
|
||||
returned_chunk = self._fp._safe_read(self.chunk_left)
|
||||
self._fp._safe_read(2) # Toss the CRLF at the end of the chunk.
|
||||
self.chunk_left = None
|
||||
return returned_chunk
|
||||
|
||||
def read_chunked(self, amt=None, decode_content=None):
|
||||
"""
|
||||
Similar to :meth:`HTTPResponse.read`, but with an additional
|
||||
parameter: ``decode_content``.
|
||||
|
||||
:param decode_content:
|
||||
If True, will attempt to decode the body based on the
|
||||
'content-encoding' header.
|
||||
"""
|
||||
self._init_decoder()
|
||||
# FIXME: Rewrite this method and make it a class with a better structured logic.
|
||||
if not self.chunked:
|
||||
raise ResponseNotChunked(
|
||||
"Response is not chunked. "
|
||||
"Header 'transfer-encoding: chunked' is missing.")
|
||||
|
||||
# Don't bother reading the body of a HEAD request.
|
||||
if self._original_response and is_response_to_head(self._original_response):
|
||||
self._original_response.close()
|
||||
return
|
||||
|
||||
with self._error_catcher():
|
||||
while True:
|
||||
self._update_chunk_length()
|
||||
if self.chunk_left == 0:
|
||||
break
|
||||
chunk = self._handle_chunk(amt)
|
||||
decoded = self._decode(chunk, decode_content=decode_content,
|
||||
flush_decoder=False)
|
||||
if decoded:
|
||||
yield decoded
|
||||
|
||||
if decode_content:
|
||||
# On CPython and PyPy, we should never need to flush the
|
||||
# decoder. However, on Jython we *might* need to, so
|
||||
# lets defensively do it anyway.
|
||||
decoded = self._flush_decoder()
|
||||
if decoded: # Platform-specific: Jython.
|
||||
yield decoded
|
||||
|
||||
# Chunk content ends with \r\n: discard it.
|
||||
while True:
|
||||
line = self._fp.fp.readline()
|
||||
if not line:
|
||||
# Some sites may not end with '\r\n'.
|
||||
break
|
||||
if line == b'\r\n':
|
||||
break
|
||||
|
||||
# We read everything; close the "file".
|
||||
if self._original_response:
|
||||
self._original_response.close()
|
||||
|
||||
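The read_chunked() path above is what stream() now dispatches to for Transfer-Encoding: chunked responses. A hedged usage sketch (the URL is a placeholder for an endpoint that streams chunks):

    from requests.packages.urllib3 import PoolManager

    http = PoolManager()
    r = http.request('GET', 'https://example.com/events', preload_content=False)
    for chunk in r.stream(1024, decode_content=True):
        # Each chunk has already been run through _decode(), so gzip/deflate
        # content arrives decompressed.
        print(len(chunk))
    r.release_conn()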
@@ -1,3 +1,4 @@
|
||||
from __future__ import absolute_import
|
||||
# For backwards compatibility, provide imports that used to be here.
|
||||
from .connection import is_connection_dropped
|
||||
from .request import make_headers
|
||||
@@ -22,3 +23,22 @@ from .url import (
|
||||
split_first,
|
||||
Url,
|
||||
)
|
||||
|
||||
__all__ = (
|
||||
'HAS_SNI',
|
||||
'SSLContext',
|
||||
'Retry',
|
||||
'Timeout',
|
||||
'Url',
|
||||
'assert_fingerprint',
|
||||
'current_time',
|
||||
'is_connection_dropped',
|
||||
'is_fp_closed',
|
||||
'get_host',
|
||||
'parse_url',
|
||||
'make_headers',
|
||||
'resolve_cert_reqs',
|
||||
'resolve_ssl_version',
|
||||
'split_first',
|
||||
'ssl_wrap_socket',
|
||||
)
|
||||
|
||||
@@ -1,3 +1,4 @@
|
||||
from __future__ import absolute_import
|
||||
import socket
|
||||
try:
|
||||
from select import poll, POLLIN
|
||||
@@ -60,6 +61,8 @@ def create_connection(address, timeout=socket._GLOBAL_DEFAULT_TIMEOUT,
|
||||
"""
|
||||
|
||||
host, port = address
|
||||
if host.startswith('['):
|
||||
host = host.strip('[]')
|
||||
err = None
|
||||
for res in socket.getaddrinfo(host, port, 0, socket.SOCK_STREAM):
|
||||
af, socktype, proto, canonname, sa = res
|
||||
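Bracket handling for IPv6 literals moved from the pool (which now keeps self.host verbatim) into create_connection() above. A small sketch using the documentation prefix 2001:db8:: as a placeholder address:

    from requests.packages.urllib3 import PoolManager

    http = PoolManager()
    # The pool keeps '[2001:db8::1]' unchanged; create_connection() strips the
    # brackets before calling getaddrinfo(). (Placeholder address; the request
    # itself would fail to connect.)
    r = http.request('GET', 'http://[2001:db8::1]:8080/', retries=False)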
@@ -78,15 +81,16 @@ def create_connection(address, timeout=socket._GLOBAL_DEFAULT_TIMEOUT,
|
||||
sock.connect(sa)
|
||||
return sock
|
||||
|
||||
except socket.error as _:
|
||||
err = _
|
||||
except socket.error as e:
|
||||
err = e
|
||||
if sock is not None:
|
||||
sock.close()
|
||||
sock = None
|
||||
|
||||
if err is not None:
|
||||
raise err
|
||||
else:
|
||||
raise socket.error("getaddrinfo returns an empty list")
|
||||
|
||||
raise socket.error("getaddrinfo returns an empty list")
|
||||
|
||||
|
||||
def _set_socket_options(sock, options):
|
||||
|
||||
@@ -1,3 +1,4 @@
|
||||
from __future__ import absolute_import
|
||||
from base64 import b64encode
|
||||
|
||||
from ..packages.six import b
|
||||
|
||||
@@ -1,3 +1,9 @@
|
||||
from __future__ import absolute_import
|
||||
from ..packages.six.moves import http_client as httplib
|
||||
|
||||
from ..exceptions import HeaderParsingError
|
||||
|
||||
|
||||
def is_fp_closed(obj):
|
||||
"""
|
||||
Checks whether a given file-like object is closed.
|
||||
@@ -20,3 +26,49 @@ def is_fp_closed(obj):
|
||||
pass
|
||||
|
||||
raise ValueError("Unable to determine whether fp is closed.")
|
||||
|
||||
|
||||
def assert_header_parsing(headers):
|
||||
"""
|
||||
Asserts whether all headers have been successfully parsed.
|
||||
Extracts encountered errors from the result of parsing headers.
|
||||
|
||||
Only works on Python 3.
|
||||
|
||||
:param headers: Headers to verify.
|
||||
:type headers: `httplib.HTTPMessage`.
|
||||
|
||||
:raises urllib3.exceptions.HeaderParsingError:
|
||||
If parsing errors are found.
|
||||
"""
|
||||
|
||||
# This will fail silently if we pass in the wrong kind of parameter.
|
||||
# To make debugging easier add an explicit check.
|
||||
if not isinstance(headers, httplib.HTTPMessage):
|
||||
raise TypeError('expected httplib.Message, got {0}.'.format(
|
||||
type(headers)))
|
||||
|
||||
defects = getattr(headers, 'defects', None)
|
||||
get_payload = getattr(headers, 'get_payload', None)
|
||||
|
||||
unparsed_data = None
|
||||
if get_payload: # Platform-specific: Python 3.
|
||||
unparsed_data = get_payload()
|
||||
|
||||
if defects or unparsed_data:
|
||||
raise HeaderParsingError(defects=defects, unparsed_data=unparsed_data)
|
||||
|
||||
|
||||
def is_response_to_head(response):
|
||||
"""
|
||||
Checks whether the request of a response was a HEAD request.
|
||||
Handles the quirks of AppEngine.
|
||||
|
||||
:param conn:
|
||||
:type conn: :class:`httplib.HTTPResponse`
|
||||
"""
|
||||
# FIXME: Can we do this somehow without accessing private httplib _method?
|
||||
method = response._method
|
||||
if isinstance(method, int): # Platform-specific: Appengine
|
||||
return method == 3
|
||||
return method.upper() == 'HEAD'
|
||||
|
||||
@@ -1,3 +1,4 @@
|
||||
from __future__ import absolute_import
|
||||
import time
|
||||
import logging
|
||||
|
||||
@@ -94,7 +95,7 @@ class Retry(object):
|
||||
|
||||
seconds. If the backoff_factor is 0.1, then :func:`.sleep` will sleep
|
||||
for [0.1s, 0.2s, 0.4s, ...] between retries. It will never be longer
|
||||
than :attr:`Retry.MAX_BACKOFF`.
|
||||
than :attr:`Retry.BACKOFF_MAX`.
|
||||
|
||||
By default, backoff is disabled (set to 0).
|
||||
|
||||
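To make the backoff_factor wording above concrete, the sleep sequence from the docstring example can be written out directly (purely illustrative arithmetic, not code from the diff):

    backoff_factor = 0.1
    sleeps = [backoff_factor * (2 ** n) for n in range(5)]
    # -> [0.1, 0.2, 0.4, 0.8, 1.6]; each value is additionally capped at
    # Retry.BACKOFF_MAX before sleep() uses it.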
@@ -126,7 +127,7 @@ class Retry(object):
|
||||
self.method_whitelist = method_whitelist
|
||||
self.backoff_factor = backoff_factor
|
||||
self.raise_on_redirect = raise_on_redirect
|
||||
self._observed_errors = _observed_errors # TODO: use .history instead?
|
||||
self._observed_errors = _observed_errors # TODO: use .history instead?
|
||||
|
||||
def new(self, **kw):
|
||||
params = dict(
|
||||
@@ -190,7 +191,7 @@ class Retry(object):
|
||||
return isinstance(err, (ReadTimeoutError, ProtocolError))
|
||||
|
||||
def is_forced_retry(self, method, status_code):
|
||||
""" Is this method/response retryable? (Based on method/codes whitelists)
|
||||
""" Is this method/status code retryable? (Based on method/codes whitelists)
|
||||
"""
|
||||
if self.method_whitelist and method.upper() not in self.method_whitelist:
|
||||
return False
|
||||
@@ -206,7 +207,8 @@ class Retry(object):
|
||||
|
||||
return min(retry_counts) < 0
|
||||
|
||||
def increment(self, method=None, url=None, response=None, error=None, _pool=None, _stacktrace=None):
|
||||
def increment(self, method=None, url=None, response=None, error=None,
|
||||
_pool=None, _stacktrace=None):
|
||||
""" Return a new Retry object with incremented retry counters.
|
||||
|
||||
:param response: A response object, or None, if the server did not
|
||||
@@ -274,7 +276,6 @@ class Retry(object):
|
||||
|
||||
return new_retry
|
||||
|
||||
|
||||
def __repr__(self):
|
||||
return ('{cls.__name__}(total={self.total}, connect={self.connect}, '
|
||||
'read={self.read}, redirect={self.redirect})').format(
|
||||
|
||||
@@ -1,17 +1,45 @@
|
||||
from binascii import hexlify, unhexlify
|
||||
from hashlib import md5, sha1
|
||||
from __future__ import absolute_import
|
||||
import errno
|
||||
import warnings
|
||||
import hmac
|
||||
|
||||
from ..exceptions import SSLError
|
||||
from binascii import hexlify, unhexlify
|
||||
from hashlib import md5, sha1, sha256
|
||||
|
||||
from ..exceptions import SSLError, InsecurePlatformWarning, SNIMissingWarning
|
||||
|
||||
|
||||
SSLContext = None
|
||||
HAS_SNI = False
|
||||
create_default_context = None
|
||||
|
||||
import errno
|
||||
import ssl
|
||||
# Maps the length of a digest to a possible hash function producing this digest
|
||||
HASHFUNC_MAP = {
|
||||
32: md5,
|
||||
40: sha1,
|
||||
64: sha256,
|
||||
}
|
||||
|
||||
|
||||
def _const_compare_digest_backport(a, b):
"""
Compare two digests of equal length in constant time.

The digests must be of type str/bytes.
Returns True if the digests match, and False otherwise.
"""
result = abs(len(a) - len(b))
for l, r in zip(bytearray(a), bytearray(b)):
result |= l ^ r
return result == 0


_const_compare_digest = getattr(hmac, 'compare_digest',
_const_compare_digest_backport)
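# Editor's note: a self-contained usage sketch of the constant-time digest
# comparison above. The backport is repeated here only so the snippet runs on
# its own; real callers use _const_compare_digest exactly as defined above.
import hmac
from hashlib import sha256


def _const_compare_digest_backport(a, b):
    result = abs(len(a) - len(b))
    for l, r in zip(bytearray(a), bytearray(b)):
        result |= l ^ r
    return result == 0


compare = getattr(hmac, 'compare_digest', _const_compare_digest_backport)

good = sha256(b'certificate bytes').digest()
evil = sha256(b'tampered bytes').digest()
print(compare(good, good))  # True
print(compare(good, evil))  # False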
try: # Test for SSL features
|
||||
import ssl
|
||||
from ssl import wrap_socket, CERT_NONE, PROTOCOL_SSLv23
|
||||
from ssl import HAS_SNI # Has SNI?
|
||||
except ImportError:
|
||||
@@ -24,14 +52,24 @@ except ImportError:
|
||||
OP_NO_SSLv2, OP_NO_SSLv3 = 0x1000000, 0x2000000
|
||||
OP_NO_COMPRESSION = 0x20000
|
||||
|
||||
try:
|
||||
from ssl import _DEFAULT_CIPHERS
|
||||
except ImportError:
|
||||
_DEFAULT_CIPHERS = (
|
||||
'ECDH+AESGCM:DH+AESGCM:ECDH+AES256:DH+AES256:ECDH+AES128:DH+AES:ECDH+HIGH:'
|
||||
'DH+HIGH:ECDH+3DES:DH+3DES:RSA+AESGCM:RSA+AES:RSA+HIGH:RSA+3DES:ECDH+RC4:'
|
||||
'DH+RC4:RSA+RC4:!aNULL:!eNULL:!MD5'
|
||||
)
|
||||
# A secure default.
|
||||
# Sources for more information on TLS ciphers:
|
||||
#
|
||||
# - https://wiki.mozilla.org/Security/Server_Side_TLS
|
||||
# - https://www.ssllabs.com/projects/best-practices/index.html
|
||||
# - https://hynek.me/articles/hardening-your-web-servers-ssl-ciphers/
|
||||
#
|
||||
# The general intent is:
|
||||
# - Prefer cipher suites that offer perfect forward secrecy (DHE/ECDHE),
|
||||
# - prefer ECDHE over DHE for better performance,
|
||||
# - prefer any AES-GCM over any AES-CBC for better performance and security,
|
||||
# - use 3DES as fallback which is secure but slow,
|
||||
# - disable NULL authentication, MD5 MACs and DSS for security reasons.
|
||||
DEFAULT_CIPHERS = (
|
||||
'ECDH+AESGCM:DH+AESGCM:ECDH+AES256:DH+AES256:ECDH+AES128:DH+AES:ECDH+HIGH:'
|
||||
'DH+HIGH:ECDH+3DES:DH+3DES:RSA+AESGCM:RSA+AES:RSA+HIGH:RSA+3DES:!aNULL:'
|
||||
'!eNULL:!MD5'
|
||||
)
|
||||
|
||||
try:
|
||||
from ssl import SSLContext # Modern SSL?
|
||||
@@ -39,7 +77,8 @@ except ImportError:
|
||||
import sys
|
||||
|
||||
class SSLContext(object): # Platform-specific: Python 2 & 3.1
|
||||
supports_set_ciphers = sys.version_info >= (2, 7)
|
||||
supports_set_ciphers = ((2, 7) <= sys.version_info < (3,) or
|
||||
(3, 2) <= sys.version_info)
|
||||
|
||||
def __init__(self, protocol_version):
|
||||
self.protocol = protocol_version
|
||||
@@ -56,8 +95,11 @@ except ImportError:
|
||||
self.certfile = certfile
|
||||
self.keyfile = keyfile
|
||||
|
||||
def load_verify_locations(self, location):
|
||||
self.ca_certs = location
|
||||
def load_verify_locations(self, cafile=None, capath=None):
|
||||
self.ca_certs = cafile
|
||||
|
||||
if capath is not None:
|
||||
raise SSLError("CA directories not supported in older Pythons")
|
||||
|
||||
def set_ciphers(self, cipher_suite):
|
||||
if not self.supports_set_ciphers:
|
||||
@@ -69,6 +111,14 @@ except ImportError:
|
||||
self.ciphers = cipher_suite
|
||||
|
||||
def wrap_socket(self, socket, server_hostname=None):
|
||||
warnings.warn(
|
||||
'A true SSLContext object is not available. This prevents '
|
||||
'urllib3 from configuring SSL appropriately and may cause '
|
||||
'certain SSL connections to fail. For more information, see '
|
||||
'https://urllib3.readthedocs.org/en/latest/security.html'
|
||||
'#insecureplatformwarning.',
|
||||
InsecurePlatformWarning
|
||||
)
|
||||
kwargs = {
|
||||
'keyfile': self.keyfile,
|
||||
'certfile': self.certfile,
|
||||
@@ -92,30 +142,21 @@ def assert_fingerprint(cert, fingerprint):
|
||||
Fingerprint as string of hexdigits, can be interspersed by colons.
|
||||
"""
|
||||
|
||||
# Maps the length of a digest to a possible hash function producing
|
||||
# this digest.
|
||||
hashfunc_map = {
|
||||
16: md5,
|
||||
20: sha1
|
||||
}
|
||||
|
||||
fingerprint = fingerprint.replace(':', '').lower()
|
||||
digest_length, odd = divmod(len(fingerprint), 2)
|
||||
|
||||
if odd or digest_length not in hashfunc_map:
|
||||
raise SSLError('Fingerprint is of invalid length.')
|
||||
digest_length = len(fingerprint)
|
||||
hashfunc = HASHFUNC_MAP.get(digest_length)
|
||||
if not hashfunc:
|
||||
raise SSLError(
|
||||
'Fingerprint of invalid length: {0}'.format(fingerprint))
|
||||
|
||||
# We need encode() here for py32; works on py2 and p33.
|
||||
fingerprint_bytes = unhexlify(fingerprint.encode())
|
||||
|
||||
hashfunc = hashfunc_map[digest_length]
|
||||
|
||||
cert_digest = hashfunc(cert).digest()
|
||||
|
||||
if not cert_digest == fingerprint_bytes:
|
||||
if not _const_compare_digest(cert_digest, fingerprint_bytes):
|
||||
raise SSLError('Fingerprints did not match. Expected "{0}", got "{1}".'
|
||||
.format(hexlify(fingerprint_bytes),
|
||||
hexlify(cert_digest)))
|
||||
.format(fingerprint, hexlify(cert_digest)))
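# Editor's note: a hedged sketch of the fingerprint-pinning flow above, using
# the new HASHFUNC_MAP (md5/sha1/sha256 keyed by hex-digest length) and a
# constant-time comparison. The certificate bytes below are fabricated; real
# callers go through urllib3's assert_fingerprint(cert, fingerprint).
import hmac
from binascii import hexlify, unhexlify
from hashlib import md5, sha1, sha256

HASHFUNC_MAP = {32: md5, 40: sha1, 64: sha256}

cert_der = b'not a real DER certificate'
pinned = hexlify(sha256(cert_der).digest()).decode('ascii')  # 64 hex chars

fingerprint = pinned.replace(':', '').lower()
hashfunc = HASHFUNC_MAP[len(fingerprint)]  # sha256 for a 64-char digest
matches = hmac.compare_digest(hashfunc(cert_der).digest(),
                              unhexlify(fingerprint.encode()))
print(matches)  # True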
def resolve_cert_reqs(candidate):
|
||||
@@ -157,7 +198,7 @@ def resolve_ssl_version(candidate):
|
||||
return candidate
|
||||
|
||||
|
||||
def create_urllib3_context(ssl_version=None, cert_reqs=ssl.CERT_REQUIRED,
|
||||
def create_urllib3_context(ssl_version=None, cert_reqs=None,
|
||||
options=None, ciphers=None):
|
||||
"""All arguments have the same meaning as ``ssl_wrap_socket``.
|
||||
|
||||
@@ -194,6 +235,9 @@ def create_urllib3_context(ssl_version=None, cert_reqs=ssl.CERT_REQUIRED,
|
||||
"""
|
||||
context = SSLContext(ssl_version or ssl.PROTOCOL_SSLv23)
|
||||
|
||||
# Setting the default here, as we may have no ssl module on import
|
||||
cert_reqs = ssl.CERT_REQUIRED if cert_reqs is None else cert_reqs
|
||||
|
||||
if options is None:
|
||||
options = 0
|
||||
# SSLv2 is easily broken and is considered harmful and dangerous
|
||||
@@ -207,20 +251,23 @@ def create_urllib3_context(ssl_version=None, cert_reqs=ssl.CERT_REQUIRED,
|
||||
context.options |= options
|
||||
|
||||
if getattr(context, 'supports_set_ciphers', True): # Platform-specific: Python 2.6
|
||||
context.set_ciphers(ciphers or _DEFAULT_CIPHERS)
|
||||
context.set_ciphers(ciphers or DEFAULT_CIPHERS)
|
||||
|
||||
context.verify_mode = cert_reqs
|
||||
if getattr(context, 'check_hostname', None) is not None: # Platform-specific: Python 3.2
|
||||
context.check_hostname = (context.verify_mode == ssl.CERT_REQUIRED)
|
||||
# We do our own verification, including fingerprints and alternative
|
||||
# hostnames. So disable it here
|
||||
context.check_hostname = False
|
||||
return context
|
||||
|
||||
|
||||
def ssl_wrap_socket(sock, keyfile=None, certfile=None, cert_reqs=None,
|
||||
ca_certs=None, server_hostname=None,
|
||||
ssl_version=None, ciphers=None, ssl_context=None):
|
||||
ssl_version=None, ciphers=None, ssl_context=None,
|
||||
ca_cert_dir=None):
|
||||
"""
|
||||
All arguments except for server_hostname and ssl_context have the same
|
||||
meaning as they do when using :func:`ssl.wrap_socket`.
|
||||
All arguments except for server_hostname, ssl_context, and ca_cert_dir have
|
||||
the same meaning as they do when using :func:`ssl.wrap_socket`.
|
||||
|
||||
:param server_hostname:
|
||||
When SNI is supported, the expected hostname of the certificate
|
||||
@@ -230,15 +277,19 @@ def ssl_wrap_socket(sock, keyfile=None, certfile=None, cert_reqs=None,
|
||||
:param ciphers:
|
||||
A string of ciphers we wish the client to support. This is not
|
||||
supported on Python 2.6 as the ssl module does not support it.
|
||||
:param ca_cert_dir:
|
||||
A directory containing CA certificates in multiple separate files, as
|
||||
supported by OpenSSL's -CApath flag or the capath argument to
|
||||
SSLContext.load_verify_locations().
|
||||
"""
|
||||
context = ssl_context
|
||||
if context is None:
|
||||
context = create_urllib3_context(ssl_version, cert_reqs,
|
||||
ciphers=ciphers)
|
||||
|
||||
if ca_certs:
|
||||
if ca_certs or ca_cert_dir:
|
||||
try:
|
||||
context.load_verify_locations(ca_certs)
|
||||
context.load_verify_locations(ca_certs, ca_cert_dir)
|
||||
except IOError as e: # Platform-specific: Python 2.6, 2.7, 3.2
|
||||
raise SSLError(e)
|
||||
# Py33 raises FileNotFoundError which subclasses OSError
|
||||
@@ -247,8 +298,20 @@ def ssl_wrap_socket(sock, keyfile=None, certfile=None, cert_reqs=None,
|
||||
if e.errno == errno.ENOENT:
|
||||
raise SSLError(e)
|
||||
raise
|
||||
|
||||
if certfile:
|
||||
context.load_cert_chain(certfile, keyfile)
|
||||
if HAS_SNI: # Platform-specific: OpenSSL with enabled SNI
|
||||
return context.wrap_socket(sock, server_hostname=server_hostname)
|
||||
|
||||
warnings.warn(
|
||||
'An HTTPS request has been made, but the SNI (Subject Name '
|
||||
'Indication) extension to TLS is not available on this platform. '
|
||||
'This may cause the server to present an incorrect TLS '
|
||||
'certificate, which can cause validation failures. For more '
|
||||
'information, see '
|
||||
'https://urllib3.readthedocs.org/en/latest/security.html'
|
||||
'#snimissingwarning.',
|
||||
SNIMissingWarning
|
||||
)
|
||||
return context.wrap_socket(sock)
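# Editor's note: the caller-facing side of the new ca_cert_dir plumbing. As of
# requests 2.9.0 the ``verify`` argument may name a directory of CA
# certificates (OpenSSL -CApath style) instead of a single bundle file. The
# URL and paths below are placeholders.
import requests

# A single-file bundle, as before:
requests.get('https://example.internal/', verify='/etc/ssl/certs/ca-bundle.crt')

# New in 2.9.0: a directory of individual CA certificates. OpenSSL expects the
# directory to be prepared with c_rehash so certificates can be looked up.
requests.get('https://example.internal/', verify='/etc/ssl/certs/')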
@@ -1,3 +1,4 @@
|
||||
from __future__ import absolute_import
|
||||
# The default socket timeout, used by httplib to indicate that no timeout was
|
||||
# specified by the user
|
||||
from socket import _GLOBAL_DEFAULT_TIMEOUT
|
||||
@@ -9,6 +10,7 @@ from ..exceptions import TimeoutStateError
|
||||
# urllib3
|
||||
_Default = object()
|
||||
|
||||
|
||||
def current_time():
|
||||
"""
|
||||
Retrieve the current time. This function is mocked out in unit testing.
|
||||
@@ -226,9 +228,9 @@ class Timeout(object):
|
||||
has not yet been called on this object.
|
||||
"""
|
||||
if (self.total is not None and
|
||||
self.total is not self.DEFAULT_TIMEOUT and
|
||||
self._read is not None and
|
||||
self._read is not self.DEFAULT_TIMEOUT):
|
||||
self.total is not self.DEFAULT_TIMEOUT and
|
||||
self._read is not None and
|
||||
self._read is not self.DEFAULT_TIMEOUT):
|
||||
# In case the connect timeout has not yet been established.
|
||||
if self._start_connect is None:
|
||||
return self._read
|
||||
|
||||
@@ -1,3 +1,4 @@
|
||||
from __future__ import absolute_import
|
||||
from collections import namedtuple
|
||||
|
||||
from ..exceptions import LocationParseError
|
||||
@@ -15,6 +16,8 @@ class Url(namedtuple('Url', url_attrs)):
|
||||
|
||||
def __new__(cls, scheme=None, auth=None, host=None, port=None, path=None,
|
||||
query=None, fragment=None):
|
||||
if path and not path.startswith('/'):
|
||||
path = '/' + path
|
||||
return super(Url, cls).__new__(cls, scheme, auth, host, port, path,
|
||||
query, fragment)
|
||||
|
||||
@@ -83,6 +86,7 @@ class Url(namedtuple('Url', url_attrs)):
|
||||
def __str__(self):
|
||||
return self.url
|
||||
|
||||
|
||||
def split_first(s, delims):
|
||||
"""
|
||||
Given a string and an iterable of delimiters, split on the first found
|
||||
@@ -113,7 +117,7 @@ def split_first(s, delims):
|
||||
if min_idx is None or min_idx < 0:
|
||||
return s, '', None
|
||||
|
||||
return s[:min_idx], s[min_idx+1:], min_delim
|
||||
return s[:min_idx], s[min_idx + 1:], min_delim
|
||||
|
||||
|
||||
def parse_url(url):
|
||||
@@ -204,6 +208,7 @@ def parse_url(url):
|
||||
|
||||
return Url(scheme, auth, host, port, path, query, fragment)
|
||||
|
||||
|
||||
def get_host(url):
|
||||
"""
|
||||
Deprecated. Use :func:`.parse_url` instead.
|
||||
|
||||
@@ -62,12 +62,11 @@ def merge_setting(request_setting, session_setting, dict_class=OrderedDict):
|
||||
merged_setting = dict_class(to_key_val_list(session_setting))
|
||||
merged_setting.update(to_key_val_list(request_setting))
|
||||
|
||||
# Remove keys that are set to None.
|
||||
for (k, v) in request_setting.items():
|
||||
if v is None:
|
||||
del merged_setting[k]
|
||||
|
||||
merged_setting = dict((k, v) for (k, v) in merged_setting.items() if v is not None)
|
||||
# Remove keys that are set to None. Extract keys first to avoid altering
|
||||
# the dictionary during iteration.
|
||||
none_keys = [k for (k, v) in merged_setting.items() if v is None]
|
||||
for key in none_keys:
|
||||
del merged_setting[key]
|
||||
|
||||
return merged_setting
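# Editor's note: a standalone illustration of why the None-valued keys are
# collected before deletion above: mutating a dict while iterating over
# items() raises RuntimeError on Python 3. Header names below are arbitrary.
from collections import OrderedDict

merged = OrderedDict([('Accept', 'application/json'),
                      ('Accept-Encoding', None),
                      ('User-Agent', 'demo/1.0')])

none_keys = [k for (k, v) in merged.items() if v is None]
for key in none_keys:
    del merged[key]

print(list(merged.items()))  # 'Accept-Encoding' has been dropped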
@@ -90,7 +89,7 @@ def merge_hooks(request_hooks, session_hooks, dict_class=OrderedDict):
|
||||
|
||||
class SessionRedirectMixin(object):
|
||||
def resolve_redirects(self, resp, req, stream=False, timeout=None,
|
||||
verify=True, cert=None, proxies=None):
|
||||
verify=True, cert=None, proxies=None, **adapter_kwargs):
|
||||
"""Receives a Response. Returns a generator of Responses."""
|
||||
|
||||
i = 0
|
||||
@@ -171,7 +170,10 @@ class SessionRedirectMixin(object):
|
||||
except KeyError:
|
||||
pass
|
||||
|
||||
extract_cookies_to_jar(prepared_request._cookies, prepared_request, resp.raw)
|
||||
# Extract any cookies sent on the response to the cookiejar
|
||||
# in the new request. Because we've mutated our copied prepared
|
||||
# request, use the old one that we haven't yet touched.
|
||||
extract_cookies_to_jar(prepared_request._cookies, req, resp.raw)
|
||||
prepared_request._cookies.update(self.cookies)
|
||||
prepared_request.prepare_cookies(prepared_request._cookies)
|
||||
|
||||
@@ -190,6 +192,7 @@ class SessionRedirectMixin(object):
|
||||
cert=cert,
|
||||
proxies=proxies,
|
||||
allow_redirects=False,
|
||||
**adapter_kwargs
|
||||
)
|
||||
|
||||
extract_cookies_to_jar(self.cookies, prepared_request, resp.raw)
|
||||
@@ -270,7 +273,13 @@ class Session(SessionRedirectMixin):
|
||||
>>> import requests
|
||||
>>> s = requests.Session()
|
||||
>>> s.get('http://httpbin.org/get')
|
||||
200
|
||||
<Response [200]>
|
||||
|
||||
Or as a context manager::
|
||||
|
||||
>>> with requests.Session() as s:
|
||||
>>> s.get('http://httpbin.org/get')
|
||||
<Response [200]>
|
||||
"""
|
||||
|
||||
__attrs__ = [
|
||||
@@ -290,9 +299,9 @@ class Session(SessionRedirectMixin):
|
||||
#: :class:`Request <Request>`.
|
||||
self.auth = None
|
||||
|
||||
#: Dictionary mapping protocol to the URL of the proxy (e.g.
|
||||
#: {'http': 'foo.bar:3128'}) to be used on each
|
||||
#: :class:`Request <Request>`.
|
||||
#: Dictionary mapping protocol or protocol and host to the URL of the proxy
|
||||
#: (e.g. {'http': 'foo.bar:3128', 'http://host.name': 'foo.bar:4012'}) to
|
||||
#: be used on each :class:`Request <Request>`.
|
||||
self.proxies = {}
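# Editor's note: a short configuration sketch for the per-host proxy mapping
# documented above (added in requests 2.8.0). Hostnames and proxy URLs are
# placeholders.
import requests

s = requests.Session()
s.proxies = {
    'http': 'http://proxy.example:3128',                      # scheme-wide
    'http://internal.example': 'http://proxy2.example:4012',  # host-specific
}
# Requests to http://internal.example/... use proxy2.example; any other
# http:// URL falls back to the scheme-wide entry.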
#: Event-handling hooks.
|
||||
@@ -316,7 +325,8 @@ class Session(SessionRedirectMixin):
|
||||
#: limit, a :class:`TooManyRedirects` exception is raised.
|
||||
self.max_redirects = DEFAULT_REDIRECT_LIMIT
|
||||
|
||||
#: Should we trust the environment?
|
||||
#: Trust environment settings for proxy configuration, default
|
||||
#: authentication and similar.
|
||||
self.trust_env = True
|
||||
|
||||
#: A CookieJar containing all currently outstanding cookies set on this
|
||||
@@ -401,8 +411,8 @@ class Session(SessionRedirectMixin):
|
||||
:param url: URL for the new :class:`Request` object.
|
||||
:param params: (optional) Dictionary or bytes to be sent in the query
|
||||
string for the :class:`Request`.
|
||||
:param data: (optional) Dictionary or bytes to send in the body of the
|
||||
:class:`Request`.
|
||||
:param data: (optional) Dictionary, bytes, or file-like object to send
|
||||
in the body of the :class:`Request`.
|
||||
:param json: (optional) json to send in the body of the
|
||||
:class:`Request`.
|
||||
:param headers: (optional) Dictionary of HTTP Headers to send with the
|
||||
@@ -414,23 +424,20 @@ class Session(SessionRedirectMixin):
|
||||
:param auth: (optional) Auth tuple or callable to enable
|
||||
Basic/Digest/Custom HTTP Auth.
|
||||
:param timeout: (optional) How long to wait for the server to send
|
||||
data before giving up, as a float, or a (`connect timeout, read
|
||||
timeout <user/advanced.html#timeouts>`_) tuple.
|
||||
data before giving up, as a float, or a :ref:`(connect timeout,
|
||||
read timeout) <timeouts>` tuple.
|
||||
:type timeout: float or tuple
|
||||
:param allow_redirects: (optional) Set to True by default.
|
||||
:type allow_redirects: bool
|
||||
:param proxies: (optional) Dictionary mapping protocol to the URL of
|
||||
the proxy.
|
||||
:param proxies: (optional) Dictionary mapping protocol or protocol and
|
||||
hostname to the URL of the proxy.
|
||||
:param stream: (optional) whether to immediately download the response
|
||||
content. Defaults to ``False``.
|
||||
:param verify: (optional) if ``True``, the SSL cert will be verified.
|
||||
A CA_BUNDLE path can also be provided.
|
||||
:param verify: (optional) whether the SSL cert will be verified.
|
||||
A CA_BUNDLE path can also be provided. Defaults to ``True``.
|
||||
:param cert: (optional) if String, path to ssl client cert file (.pem).
|
||||
If Tuple, ('cert', 'key') pair.
|
||||
"""
|
||||
|
||||
method = to_native_string(method)
|
||||
|
||||
# Create the Request.
|
||||
req = Request(
|
||||
method = method.upper(),
|
||||
@@ -557,10 +564,6 @@ class Session(SessionRedirectMixin):
|
||||
# Set up variables needed for resolve_redirects and dispatching of hooks
|
||||
allow_redirects = kwargs.pop('allow_redirects', True)
|
||||
stream = kwargs.get('stream')
|
||||
timeout = kwargs.get('timeout')
|
||||
verify = kwargs.get('verify')
|
||||
cert = kwargs.get('cert')
|
||||
proxies = kwargs.get('proxies')
|
||||
hooks = request.hooks
|
||||
|
||||
# Get the appropriate adapter to use
|
||||
@@ -588,12 +591,7 @@ class Session(SessionRedirectMixin):
|
||||
extract_cookies_to_jar(self.cookies, request, r.raw)
|
||||
|
||||
# Redirect resolving generator.
|
||||
gen = self.resolve_redirects(r, request,
|
||||
stream=stream,
|
||||
timeout=timeout,
|
||||
verify=verify,
|
||||
cert=cert,
|
||||
proxies=proxies)
|
||||
gen = self.resolve_redirects(r, request, **kwargs)
|
||||
|
||||
# Resolve redirects if allowed.
|
||||
history = [resp for resp in gen] if allow_redirects else []
|
||||
@@ -636,7 +634,7 @@ class Session(SessionRedirectMixin):
|
||||
'cert': cert}
|
||||
|
||||
def get_adapter(self, url):
|
||||
"""Returns the appropriate connnection adapter for the given URL."""
|
||||
"""Returns the appropriate connection adapter for the given URL."""
|
||||
for (prefix, adapter) in self.adapters.items():
|
||||
|
||||
if url.lower().startswith(prefix):
|
||||
|
||||
@@ -78,11 +78,12 @@ _codes = {
|
||||
507: ('insufficient_storage',),
|
||||
509: ('bandwidth_limit_exceeded', 'bandwidth'),
|
||||
510: ('not_extended',),
|
||||
511: ('network_authentication_required', 'network_auth', 'network_authentication'),
|
||||
}
|
||||
|
||||
codes = LookupDict(name='status_codes')
|
||||
|
||||
for (code, titles) in list(_codes.items()):
|
||||
for code, titles in _codes.items():
|
||||
for title in titles:
|
||||
setattr(codes, title, code)
|
||||
if not title.startswith('\\'):
|
||||
|
||||
@@ -25,10 +25,11 @@ from . import __version__
|
||||
from . import certs
|
||||
from .compat import parse_http_list as _parse_list_header
|
||||
from .compat import (quote, urlparse, bytes, str, OrderedDict, unquote, is_py2,
|
||||
builtin_str, getproxies, proxy_bypass, urlunparse)
|
||||
builtin_str, getproxies, proxy_bypass, urlunparse,
|
||||
basestring)
|
||||
from .cookies import RequestsCookieJar, cookiejar_from_dict
|
||||
from .structures import CaseInsensitiveDict
|
||||
from .exceptions import InvalidURL
|
||||
from .exceptions import InvalidURL, FileModeWarning
|
||||
|
||||
_hush_pyflakes = (RequestsCookieJar,)
|
||||
|
||||
@@ -47,26 +48,47 @@ def dict_to_sequence(d):
|
||||
|
||||
|
||||
def super_len(o):
|
||||
total_length = 0
|
||||
current_position = 0
|
||||
|
||||
if hasattr(o, '__len__'):
|
||||
return len(o)
|
||||
total_length = len(o)
|
||||
|
||||
if hasattr(o, 'len'):
|
||||
return o.len
|
||||
elif hasattr(o, 'len'):
|
||||
total_length = o.len
|
||||
|
||||
if hasattr(o, 'fileno'):
|
||||
elif hasattr(o, 'getvalue'):
|
||||
# e.g. BytesIO, cStringIO.StringIO
|
||||
total_length = len(o.getvalue())
|
||||
|
||||
elif hasattr(o, 'fileno'):
|
||||
try:
|
||||
fileno = o.fileno()
|
||||
except io.UnsupportedOperation:
|
||||
pass
|
||||
else:
|
||||
return os.fstat(fileno).st_size
|
||||
total_length = os.fstat(fileno).st_size
|
||||
|
||||
if hasattr(o, 'getvalue'):
|
||||
# e.g. BytesIO, cStringIO.StringIO
|
||||
return len(o.getvalue())
|
||||
# Having used fstat to determine the file length, we need to
|
||||
# confirm that this file was opened up in binary mode.
|
||||
if 'b' not in o.mode:
|
||||
warnings.warn((
|
||||
"Requests has determined the content-length for this "
|
||||
"request using the binary size of the file: however, the "
|
||||
"file has been opened in text mode (i.e. without the 'b' "
|
||||
"flag in the mode). This may lead to an incorrect "
|
||||
"content-length. In Requests 3.0, support will be removed "
|
||||
"for files in text mode."),
|
||||
FileModeWarning
|
||||
)
|
||||
|
||||
if hasattr(o, 'tell'):
|
||||
current_position = o.tell()
|
||||
|
||||
return max(0, total_length - current_position)
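# Editor's note: a tiny demonstration of the behaviour implemented above: the
# reported length is the number of bytes left from the current position, which
# is what allows partial uploads of already-seeked file-like objects in 2.9.0.
import io

buf = io.BytesIO(b'0123456789')
buf.seek(4)

total_length = len(buf.getvalue())  # 10, via the getvalue() branch above
current_position = buf.tell()       # 4
print(max(0, total_length - current_position))  # 6 bytes would be sent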
def get_netrc_auth(url):
|
||||
def get_netrc_auth(url, raise_errors=False):
|
||||
"""Returns the Requests tuple auth for a given url from netrc."""
|
||||
|
||||
try:
|
||||
@@ -93,8 +115,12 @@ def get_netrc_auth(url):
|
||||
|
||||
ri = urlparse(url)
|
||||
|
||||
# Strip port numbers from netloc
|
||||
host = ri.netloc.split(':')[0]
|
||||
# Strip port numbers from netloc. This weird `if...encode`` dance is
|
||||
# used for Python 3.2, which doesn't support unicode literals.
|
||||
splitstr = b':'
|
||||
if isinstance(url, str):
|
||||
splitstr = splitstr.decode('ascii')
|
||||
host = ri.netloc.split(splitstr)[0]
|
||||
|
||||
try:
|
||||
_netrc = netrc(netrc_path).authenticators(host)
|
||||
@@ -104,8 +130,9 @@ def get_netrc_auth(url):
|
||||
return (_netrc[login_i], _netrc[2])
|
||||
except (NetrcParseError, IOError):
|
||||
# If there was a parsing error or a permissions issue reading the file,
|
||||
# we'll just skip netrc auth
|
||||
pass
|
||||
# we'll just skip netrc auth unless explicitly asked to raise errors.
|
||||
if raise_errors:
|
||||
raise
|
||||
|
||||
# AppEngine hackiness.
|
||||
except (ImportError, AttributeError):
|
||||
@@ -115,7 +142,8 @@ def get_netrc_auth(url):
|
||||
def guess_filename(obj):
|
||||
"""Tries to guess the filename of the given object."""
|
||||
name = getattr(obj, 'name', None)
|
||||
if name and isinstance(name, builtin_str) and name[0] != '<' and name[-1] != '>':
|
||||
if (name and isinstance(name, basestring) and name[0] != '<' and
|
||||
name[-1] != '>'):
|
||||
return os.path.basename(name)
|
||||
|
||||
|
||||
@@ -418,10 +446,18 @@ def requote_uri(uri):
|
||||
This function passes the given URI through an unquote/quote cycle to
|
||||
ensure that it is fully and consistently quoted.
|
||||
"""
|
||||
# Unquote only the unreserved characters
|
||||
# Then quote only illegal characters (do not quote reserved, unreserved,
|
||||
# or '%')
|
||||
return quote(unquote_unreserved(uri), safe="!#$%&'()*+,/:;=?@[]~")
|
||||
safe_with_percent = "!#$%&'()*+,/:;=?@[]~"
|
||||
safe_without_percent = "!#$&'()*+,/:;=?@[]~"
|
||||
try:
|
||||
# Unquote only the unreserved characters
|
||||
# Then quote only illegal characters (do not quote reserved,
|
||||
# unreserved, or '%')
|
||||
return quote(unquote_unreserved(uri), safe=safe_with_percent)
|
||||
except InvalidURL:
|
||||
# We couldn't unquote the given URI, so let's try quoting it, but
|
||||
# there may be unquoted '%'s in the URI. We need to make sure they're
|
||||
# properly quoted so they do not cause issues elsewhere.
|
||||
return quote(uri, safe=safe_without_percent)
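# Editor's note: a simplified sketch of the two-step quoting above, assuming
# the same safe character sets but using the standard library's unquote in
# place of requests' unquote_unreserved. Paths are illustrative.
try:
    from urllib.parse import quote, unquote  # Python 3
except ImportError:
    from urllib import quote, unquote        # Python 2

safe_with_percent = "!#$%&'()*+,/:;=?@[]~"
safe_without_percent = "!#$&'()*+,/:;=?@[]~"

# A URI that unquotes cleanly is re-quoted with '%' treated as safe:
print(quote(unquote('/path/with%20space'), safe=safe_with_percent))

# A URI with a bare '%' cannot be round-tripped, so it is quoted directly and
# the stray '%' gets escaped because '%' is absent from the safe set:
print(quote('/path/with % literal', safe=safe_without_percent))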
def address_in_network(ip, net):
|
||||
@@ -488,7 +524,9 @@ def should_bypass_proxies(url):
|
||||
if no_proxy:
|
||||
# We need to check whether we match here. We need to see if we match
|
||||
# the end of the netloc, both with and without the port.
|
||||
no_proxy = no_proxy.replace(' ', '').split(',')
|
||||
no_proxy = (
|
||||
host for host in no_proxy.replace(' ', '').split(',') if host
|
||||
)
|
||||
|
||||
ip = netloc.split(':')[0]
|
||||
if is_ipv4_address(ip):
|
||||
@@ -526,36 +564,22 @@ def get_environ_proxies(url):
|
||||
else:
|
||||
return getproxies()
|
||||
|
||||
def select_proxy(url, proxies):
|
||||
"""Select a proxy for the url, if applicable.
|
||||
|
||||
:param url: The url being for the request
|
||||
:param proxies: A dictionary of schemes or schemes and hosts to proxy URLs
|
||||
"""
|
||||
proxies = proxies or {}
|
||||
urlparts = urlparse(url)
|
||||
proxy = proxies.get(urlparts.scheme+'://'+urlparts.hostname)
|
||||
if proxy is None:
|
||||
proxy = proxies.get(urlparts.scheme)
|
||||
return proxy
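# Editor's note: a worked example of the lookup order implemented above: an
# exact 'scheme://host' entry wins over a bare scheme entry. Hosts and proxy
# URLs are placeholders.
try:
    from urllib.parse import urlparse  # Python 3
except ImportError:
    from urlparse import urlparse      # Python 2

proxies = {
    'http': 'http://proxy.example:3128',
    'http://internal.example': 'http://proxy2.example:4012',
}


def pick_proxy(url):
    parts = urlparse(url)
    return (proxies.get(parts.scheme + '://' + parts.hostname) or
            proxies.get(parts.scheme))


print(pick_proxy('http://internal.example/api'))  # http://proxy2.example:4012
print(pick_proxy('http://other.example/'))        # http://proxy.example:3128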
def default_user_agent(name="python-requests"):
|
||||
"""Return a string representing the default user agent."""
|
||||
_implementation = platform.python_implementation()
|
||||
|
||||
if _implementation == 'CPython':
|
||||
_implementation_version = platform.python_version()
|
||||
elif _implementation == 'PyPy':
|
||||
_implementation_version = '%s.%s.%s' % (sys.pypy_version_info.major,
|
||||
sys.pypy_version_info.minor,
|
||||
sys.pypy_version_info.micro)
|
||||
if sys.pypy_version_info.releaselevel != 'final':
|
||||
_implementation_version = ''.join([_implementation_version, sys.pypy_version_info.releaselevel])
|
||||
elif _implementation == 'Jython':
|
||||
_implementation_version = platform.python_version() # Complete Guess
|
||||
elif _implementation == 'IronPython':
|
||||
_implementation_version = platform.python_version() # Complete Guess
|
||||
else:
|
||||
_implementation_version = 'Unknown'
|
||||
|
||||
try:
|
||||
p_system = platform.system()
|
||||
p_release = platform.release()
|
||||
except IOError:
|
||||
p_system = 'Unknown'
|
||||
p_release = 'Unknown'
|
||||
|
||||
return " ".join(['%s/%s' % (name, __version__),
|
||||
'%s/%s' % (_implementation, _implementation_version),
|
||||
'%s/%s' % (p_system, p_release)])
|
||||
return '%s/%s' % (name, __version__)
|
||||
|
||||
|
||||
def default_headers():
|
||||
|
||||
@@ -1,4 +1,6 @@
|
||||
py==1.4.12
|
||||
pytest==2.3.4
|
||||
pytest-cov==1.6
|
||||
py==1.4.30
|
||||
pytest==2.8.1
|
||||
pytest-cov==2.1.0
|
||||
pytest-httpbin==0.0.7
|
||||
httpbin==0.4.0
|
||||
wheel
|
||||
|
||||
@@ -1,10 +1,9 @@
|
||||
#!/usr/bin/env python
|
||||
|
||||
import os
|
||||
import re
|
||||
import sys
|
||||
|
||||
import requests
|
||||
|
||||
from codecs import open
|
||||
|
||||
try:
|
||||
@@ -29,6 +28,14 @@ packages = [
|
||||
|
||||
requires = []
|
||||
|
||||
version = ''
|
||||
with open('requests/__init__.py', 'r') as fd:
|
||||
version = re.search(r'^__version__\s*=\s*[\'"]([^\'"]*)[\'"]',
|
||||
fd.read(), re.MULTILINE).group(1)
|
||||
|
||||
if not version:
|
||||
raise RuntimeError('Cannot find version information')
|
||||
|
||||
with open('README.rst', 'r', 'utf-8') as f:
|
||||
readme = f.read()
|
||||
with open('HISTORY.rst', 'r', 'utf-8') as f:
|
||||
@@ -36,7 +43,7 @@ with open('HISTORY.rst', 'r', 'utf-8') as f:
|
||||
|
||||
setup(
|
||||
name='requests',
|
||||
version=requests.__version__,
|
||||
version=version,
|
||||
description='Python HTTP for Humans.',
|
||||
long_description=readme + '\n\n' + history,
|
||||
author='Kenneth Reitz',
|
||||
@@ -55,14 +62,13 @@ setup(
|
||||
'Natural Language :: English',
|
||||
'License :: OSI Approved :: Apache Software License',
|
||||
'Programming Language :: Python',
|
||||
'Programming Language :: Python :: 2.6',
|
||||
'Programming Language :: Python :: 2.7',
|
||||
'Programming Language :: Python :: 3',
|
||||
'Programming Language :: Python :: 3.3',
|
||||
'Programming Language :: Python :: 3.4'
|
||||
|
||||
'Programming Language :: Python :: 3.4',
|
||||
'Programming Language :: Python :: 3.5',
|
||||
),
|
||||
extras_require={
|
||||
'security': ['pyOpenSSL', 'ndg-httpsclient', 'pyasn1'],
|
||||
'security': ['pyOpenSSL>=0.13', 'ndg-httpsclient', 'pyasn1'],
|
||||
},
|
||||
)
|
||||
|
||||
@@ -9,6 +9,7 @@ import os
|
||||
import pickle
|
||||
import unittest
|
||||
import collections
|
||||
import contextlib
|
||||
|
||||
import io
|
||||
import requests
|
||||
@@ -16,7 +17,9 @@ import pytest
|
||||
from requests.adapters import HTTPAdapter
|
||||
from requests.auth import HTTPDigestAuth, _basic_auth_str
|
||||
from requests.compat import (
|
||||
Morsel, cookielib, getproxies, str, urljoin, urlparse, is_py3, builtin_str)
|
||||
Morsel, cookielib, getproxies, str, urljoin, urlparse, is_py3,
|
||||
builtin_str, OrderedDict
|
||||
)
|
||||
from requests.cookies import cookiejar_from_dict, morsel_to_cookie
|
||||
from requests.exceptions import (ConnectionError, ConnectTimeout,
|
||||
InvalidSchema, InvalidURL, MissingSchema,
|
||||
@@ -32,6 +35,11 @@ try:
|
||||
except ImportError:
|
||||
import io as StringIO
|
||||
|
||||
try:
|
||||
from multiprocessing.pool import ThreadPool
|
||||
except ImportError:
|
||||
ThreadPool = None
|
||||
|
||||
if is_py3:
|
||||
def u(s):
|
||||
return s
|
||||
@@ -40,20 +48,33 @@ else:
|
||||
return s.decode('unicode-escape')
|
||||
|
||||
|
||||
@pytest.fixture
|
||||
def httpbin(httpbin):
|
||||
# Issue #1483: Make sure the URL always has a trailing slash
|
||||
httpbin_url = httpbin.url.rstrip('/') + '/'
|
||||
|
||||
def inner(*suffix):
|
||||
return urljoin(httpbin_url, '/'.join(suffix))
|
||||
|
||||
return inner
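# Editor's note: an illustrative (not part of the suite) example of how the
# fixture above is consumed after the conversion away from the HTTPBIN global:
# converted tests accept ``httpbin`` and build URLs through it.
import requests


def test_get_returns_200(httpbin):
    r = requests.get(httpbin('get'))  # e.g. http://127.0.0.1:<port>/get
    assert r.status_code == 200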
@pytest.fixture
|
||||
def httpsbin_url(httpbin_secure):
|
||||
# Issue #1483: Make sure the URL always has a trailing slash
|
||||
httpbin_url = httpbin_secure.url.rstrip('/') + '/'
|
||||
|
||||
def inner(*suffix):
|
||||
return urljoin(httpbin_url, '/'.join(suffix))
|
||||
|
||||
return inner
|
||||
|
||||
|
||||
# Requests to this URL should always fail with a connection timeout (nothing
|
||||
# listening on that port)
|
||||
TARPIT = "http://10.255.255.1"
|
||||
HTTPBIN = os.environ.get('HTTPBIN_URL', 'http://httpbin.org/')
|
||||
# Issue #1483: Make sure the URL always has a trailing slash
|
||||
HTTPBIN = HTTPBIN.rstrip('/') + '/'
|
||||
|
||||
|
||||
def httpbin(*suffix):
|
||||
"""Returns url for HTTPBIN resource."""
|
||||
return urljoin(HTTPBIN, '/'.join(suffix))
|
||||
|
||||
|
||||
class RequestsTestCase(unittest.TestCase):
|
||||
class TestRequests(object):
|
||||
|
||||
_multiprocess_can_split_ = True
|
||||
|
||||
@@ -97,13 +118,13 @@ class RequestsTestCase(unittest.TestCase):
|
||||
assert pr.url == req.url
|
||||
assert pr.body == 'life=42'
|
||||
|
||||
def test_no_content_length(self):
|
||||
def test_no_content_length(self, httpbin):
|
||||
get_req = requests.Request('GET', httpbin('get')).prepare()
|
||||
assert 'Content-Length' not in get_req.headers
|
||||
head_req = requests.Request('HEAD', httpbin('head')).prepare()
|
||||
assert 'Content-Length' not in head_req.headers
|
||||
|
||||
def test_override_content_length(self):
|
||||
def test_override_content_length(self, httpbin):
|
||||
headers = {
|
||||
'Content-Length': 'not zero'
|
||||
}
|
||||
@@ -124,19 +145,35 @@ class RequestsTestCase(unittest.TestCase):
|
||||
"http://example.com/path?key=value#fragment", params={"a": "b"}).prepare()
|
||||
assert request.url == "http://example.com/path?key=value&a=b#fragment"
|
||||
|
||||
def test_mixed_case_scheme_acceptable(self):
|
||||
def test_params_original_order_is_preserved_by_default(self):
|
||||
param_ordered_dict = OrderedDict((('z', 1), ('a', 1), ('k', 1), ('d', 1)))
|
||||
session = requests.Session()
|
||||
request = requests.Request('GET', 'http://example.com/', params=param_ordered_dict)
|
||||
prep = session.prepare_request(request)
|
||||
assert prep.url == 'http://example.com/?z=1&a=1&k=1&d=1'
|
||||
|
||||
def test_params_bytes_are_encoded(self):
|
||||
request = requests.Request('GET', 'http://example.com',
|
||||
params=b'test=foo').prepare()
|
||||
assert request.url == 'http://example.com/?test=foo'
|
||||
|
||||
def test_binary_put(self):
|
||||
request = requests.Request('PUT', 'http://example.com',
|
||||
data=u"ööö".encode("utf-8")).prepare()
|
||||
assert isinstance(request.body, bytes)
|
||||
|
||||
def test_mixed_case_scheme_acceptable(self, httpbin):
|
||||
s = requests.Session()
|
||||
s.proxies = getproxies()
|
||||
parts = urlparse(httpbin('get'))
|
||||
schemes = ['http://', 'HTTP://', 'hTTp://', 'HttP://',
|
||||
'https://', 'HTTPS://', 'hTTps://', 'HttPs://']
|
||||
schemes = ['http://', 'HTTP://', 'hTTp://', 'HttP://']
|
||||
for scheme in schemes:
|
||||
url = scheme + parts.netloc + parts.path
|
||||
r = requests.Request('GET', url)
|
||||
r = s.send(r.prepare())
|
||||
assert r.status_code == 200, 'failed for scheme {0}'.format(scheme)
|
||||
|
||||
def test_HTTP_200_OK_GET_ALTERNATIVE(self):
|
||||
def test_HTTP_200_OK_GET_ALTERNATIVE(self, httpbin):
|
||||
r = requests.Request('GET', httpbin('get'))
|
||||
s = requests.Session()
|
||||
s.proxies = getproxies()
|
||||
@@ -145,7 +182,7 @@ class RequestsTestCase(unittest.TestCase):
|
||||
|
||||
assert r.status_code == 200
|
||||
|
||||
def test_HTTP_302_ALLOW_REDIRECT_GET(self):
|
||||
def test_HTTP_302_ALLOW_REDIRECT_GET(self, httpbin):
|
||||
r = requests.get(httpbin('redirect', '1'))
|
||||
assert r.status_code == 200
|
||||
assert r.history[0].status_code == 302
|
||||
@@ -155,7 +192,7 @@ class RequestsTestCase(unittest.TestCase):
|
||||
# r = requests.post(httpbin('status', '302'), data={'some': 'data'})
|
||||
# self.assertEqual(r.status_code, 200)
|
||||
|
||||
def test_HTTP_200_OK_GET_WITH_PARAMS(self):
|
||||
def test_HTTP_200_OK_GET_WITH_PARAMS(self, httpbin):
|
||||
heads = {'User-agent': 'Mozilla/5.0'}
|
||||
|
||||
r = requests.get(httpbin('user-agent'), headers=heads)
|
||||
@@ -163,25 +200,25 @@ class RequestsTestCase(unittest.TestCase):
|
||||
assert heads['User-agent'] in r.text
|
||||
assert r.status_code == 200
|
||||
|
||||
def test_HTTP_200_OK_GET_WITH_MIXED_PARAMS(self):
|
||||
def test_HTTP_200_OK_GET_WITH_MIXED_PARAMS(self, httpbin):
|
||||
heads = {'User-agent': 'Mozilla/5.0'}
|
||||
|
||||
r = requests.get(httpbin('get') + '?test=true', params={'q': 'test'}, headers=heads)
|
||||
assert r.status_code == 200
|
||||
|
||||
def test_set_cookie_on_301(self):
|
||||
def test_set_cookie_on_301(self, httpbin):
|
||||
s = requests.session()
|
||||
url = httpbin('cookies/set?foo=bar')
|
||||
s.get(url)
|
||||
assert s.cookies['foo'] == 'bar'
|
||||
|
||||
def test_cookie_sent_on_redirect(self):
|
||||
def test_cookie_sent_on_redirect(self, httpbin):
|
||||
s = requests.session()
|
||||
s.get(httpbin('cookies/set?foo=bar'))
|
||||
r = s.get(httpbin('redirect/1')) # redirects to httpbin('get')
|
||||
assert 'Cookie' in r.json()['headers']
|
||||
|
||||
def test_cookie_removed_on_expire(self):
|
||||
def test_cookie_removed_on_expire(self, httpbin):
|
||||
s = requests.session()
|
||||
s.get(httpbin('cookies/set?foo=bar'))
|
||||
assert s.cookies['foo'] == 'bar'
|
||||
@@ -194,18 +231,18 @@ class RequestsTestCase(unittest.TestCase):
|
||||
)
|
||||
assert 'foo' not in s.cookies
|
||||
|
||||
def test_cookie_quote_wrapped(self):
|
||||
def test_cookie_quote_wrapped(self, httpbin):
|
||||
s = requests.session()
|
||||
s.get(httpbin('cookies/set?foo="bar:baz"'))
|
||||
assert s.cookies['foo'] == '"bar:baz"'
|
||||
|
||||
def test_cookie_persists_via_api(self):
|
||||
def test_cookie_persists_via_api(self, httpbin):
|
||||
s = requests.session()
|
||||
r = s.get(httpbin('redirect/1'), cookies={'foo': 'bar'})
|
||||
assert 'foo' in r.request.headers['Cookie']
|
||||
assert 'foo' in r.history[0].request.headers['Cookie']
|
||||
|
||||
def test_request_cookie_overrides_session_cookie(self):
|
||||
def test_request_cookie_overrides_session_cookie(self, httpbin):
|
||||
s = requests.session()
|
||||
s.cookies['foo'] = 'bar'
|
||||
r = s.get(httpbin('cookies'), cookies={'foo': 'baz'})
|
||||
@@ -213,13 +250,13 @@ class RequestsTestCase(unittest.TestCase):
|
||||
# Session cookie should not be modified
|
||||
assert s.cookies['foo'] == 'bar'
|
||||
|
||||
def test_request_cookies_not_persisted(self):
|
||||
def test_request_cookies_not_persisted(self, httpbin):
|
||||
s = requests.session()
|
||||
s.get(httpbin('cookies'), cookies={'foo': 'baz'})
|
||||
# Sending a request with cookies should not add cookies to the session
|
||||
assert not s.cookies
|
||||
|
||||
def test_generic_cookiejar_works(self):
|
||||
def test_generic_cookiejar_works(self, httpbin):
|
||||
cj = cookielib.CookieJar()
|
||||
cookiejar_from_dict({'foo': 'bar'}, cj)
|
||||
s = requests.session()
|
||||
@@ -230,7 +267,7 @@ class RequestsTestCase(unittest.TestCase):
|
||||
# Make sure the session cj is still the custom one
|
||||
assert s.cookies is cj
|
||||
|
||||
def test_param_cookiejar_works(self):
|
||||
def test_param_cookiejar_works(self, httpbin):
|
||||
cj = cookielib.CookieJar()
|
||||
cookiejar_from_dict({'foo': 'bar'}, cj)
|
||||
s = requests.session()
|
||||
@@ -238,13 +275,13 @@ class RequestsTestCase(unittest.TestCase):
|
||||
# Make sure the cookie was sent
|
||||
assert r.json()['cookies']['foo'] == 'bar'
|
||||
|
||||
def test_requests_in_history_are_not_overridden(self):
|
||||
def test_requests_in_history_are_not_overridden(self, httpbin):
|
||||
resp = requests.get(httpbin('redirect/3'))
|
||||
urls = [r.url for r in resp.history]
|
||||
req_urls = [r.request.url for r in resp.history]
|
||||
assert urls == req_urls
|
||||
|
||||
def test_history_is_always_a_list(self):
|
||||
def test_history_is_always_a_list(self, httpbin):
|
||||
"""
|
||||
Show that even with redirects, Response.history is always a list.
|
||||
"""
|
||||
@@ -254,7 +291,7 @@ class RequestsTestCase(unittest.TestCase):
|
||||
assert isinstance(resp.history, list)
|
||||
assert not isinstance(resp.history, tuple)
|
||||
|
||||
def test_headers_on_session_with_None_are_not_sent(self):
|
||||
def test_headers_on_session_with_None_are_not_sent(self, httpbin):
|
||||
"""Do not send headers in Session.headers with None values."""
|
||||
ses = requests.Session()
|
||||
ses.headers['Accept-Encoding'] = None
|
||||
@@ -262,7 +299,7 @@ class RequestsTestCase(unittest.TestCase):
|
||||
prep = ses.prepare_request(req)
|
||||
assert 'Accept-Encoding' not in prep.headers
|
||||
|
||||
def test_user_agent_transfers(self):
|
||||
def test_user_agent_transfers(self, httpbin):
|
||||
|
||||
heads = {
|
||||
'User-agent': 'Mozilla/5.0 (github.com/kennethreitz/requests)'
|
||||
@@ -278,15 +315,15 @@ class RequestsTestCase(unittest.TestCase):
|
||||
r = requests.get(httpbin('user-agent'), headers=heads)
|
||||
assert heads['user-agent'] in r.text
|
||||
|
||||
def test_HTTP_200_OK_HEAD(self):
|
||||
def test_HTTP_200_OK_HEAD(self, httpbin):
|
||||
r = requests.head(httpbin('get'))
|
||||
assert r.status_code == 200
|
||||
|
||||
def test_HTTP_200_OK_PUT(self):
|
||||
def test_HTTP_200_OK_PUT(self, httpbin):
|
||||
r = requests.put(httpbin('put'))
|
||||
assert r.status_code == 200
|
||||
|
||||
def test_BASICAUTH_TUPLE_HTTP_200_OK_GET(self):
|
||||
def test_BASICAUTH_TUPLE_HTTP_200_OK_GET(self, httpbin):
|
||||
auth = ('user', 'pass')
|
||||
url = httpbin('basic-auth', 'user', 'pass')
|
||||
|
||||
@@ -301,48 +338,55 @@ class RequestsTestCase(unittest.TestCase):
|
||||
r = s.get(url)
|
||||
assert r.status_code == 200
|
||||
|
||||
def test_connection_error(self):
|
||||
def test_connection_error_invalid_domain(self):
|
||||
"""Connecting to an unknown domain should raise a ConnectionError"""
|
||||
with pytest.raises(ConnectionError):
|
||||
requests.get("http://fooobarbangbazbing.httpbin.org")
|
||||
requests.get("http://doesnotexist.google.com")
|
||||
|
||||
def test_connection_error_invalid_port(self):
|
||||
"""Connecting to an invalid port should raise a ConnectionError"""
|
||||
with pytest.raises(ConnectionError):
|
||||
requests.get("http://httpbin.org:1")
|
||||
requests.get("http://localhost:1", timeout=1)
|
||||
|
||||
def test_LocationParseError(self):
|
||||
"""Inputing a URL that cannot be parsed should raise an InvalidURL error"""
|
||||
with pytest.raises(InvalidURL):
|
||||
requests.get("http://fe80::5054:ff:fe5a:fc0")
|
||||
|
||||
def test_basicauth_with_netrc(self):
|
||||
def test_basicauth_with_netrc(self, httpbin):
|
||||
auth = ('user', 'pass')
|
||||
wrong_auth = ('wronguser', 'wrongpass')
|
||||
url = httpbin('basic-auth', 'user', 'pass')
|
||||
|
||||
def get_netrc_auth_mock(url):
|
||||
return auth
|
||||
requests.sessions.get_netrc_auth = get_netrc_auth_mock
|
||||
old_auth = requests.sessions.get_netrc_auth
|
||||
|
||||
# Should use netrc and work.
|
||||
r = requests.get(url)
|
||||
assert r.status_code == 200
|
||||
try:
|
||||
def get_netrc_auth_mock(url):
|
||||
return auth
|
||||
requests.sessions.get_netrc_auth = get_netrc_auth_mock
|
||||
|
||||
# Given auth should override and fail.
|
||||
r = requests.get(url, auth=wrong_auth)
|
||||
assert r.status_code == 401
|
||||
# Should use netrc and work.
|
||||
r = requests.get(url)
|
||||
assert r.status_code == 200
|
||||
|
||||
s = requests.session()
|
||||
# Given auth should override and fail.
|
||||
r = requests.get(url, auth=wrong_auth)
|
||||
assert r.status_code == 401
|
||||
|
||||
# Should use netrc and work.
|
||||
r = s.get(url)
|
||||
assert r.status_code == 200
|
||||
s = requests.session()
|
||||
|
||||
# Given auth should override and fail.
|
||||
s.auth = wrong_auth
|
||||
r = s.get(url)
|
||||
assert r.status_code == 401
|
||||
# Should use netrc and work.
|
||||
r = s.get(url)
|
||||
assert r.status_code == 200
|
||||
|
||||
def test_DIGEST_HTTP_200_OK_GET(self):
|
||||
# Given auth should override and fail.
|
||||
s.auth = wrong_auth
|
||||
r = s.get(url)
|
||||
assert r.status_code == 401
|
||||
finally:
|
||||
requests.sessions.get_netrc_auth = old_auth
|
||||
|
||||
def test_DIGEST_HTTP_200_OK_GET(self, httpbin):
|
||||
|
||||
auth = HTTPDigestAuth('user', 'pass')
|
||||
url = httpbin('digest-auth', 'auth', 'user', 'pass')
|
||||
@@ -358,7 +402,7 @@ class RequestsTestCase(unittest.TestCase):
|
||||
r = s.get(url)
|
||||
assert r.status_code == 200
|
||||
|
||||
def test_DIGEST_AUTH_RETURNS_COOKIE(self):
|
||||
def test_DIGEST_AUTH_RETURNS_COOKIE(self, httpbin):
|
||||
url = httpbin('digest-auth', 'auth', 'user', 'pass')
|
||||
auth = HTTPDigestAuth('user', 'pass')
|
||||
r = requests.get(url)
|
||||
@@ -367,14 +411,14 @@ class RequestsTestCase(unittest.TestCase):
|
||||
r = requests.get(url, auth=auth)
|
||||
assert r.status_code == 200
|
||||
|
||||
def test_DIGEST_AUTH_SETS_SESSION_COOKIES(self):
|
||||
def test_DIGEST_AUTH_SETS_SESSION_COOKIES(self, httpbin):
|
||||
url = httpbin('digest-auth', 'auth', 'user', 'pass')
|
||||
auth = HTTPDigestAuth('user', 'pass')
|
||||
s = requests.Session()
|
||||
s.get(url, auth=auth)
|
||||
assert s.cookies['fake'] == 'fake_value'
|
||||
|
||||
def test_DIGEST_STREAM(self):
|
||||
def test_DIGEST_STREAM(self, httpbin):
|
||||
|
||||
auth = HTTPDigestAuth('user', 'pass')
|
||||
url = httpbin('digest-auth', 'auth', 'user', 'pass')
|
||||
@@ -385,7 +429,7 @@ class RequestsTestCase(unittest.TestCase):
|
||||
r = requests.get(url, auth=auth, stream=False)
|
||||
assert r.raw.read() == b''
|
||||
|
||||
def test_DIGESTAUTH_WRONG_HTTP_401_GET(self):
|
||||
def test_DIGESTAUTH_WRONG_HTTP_401_GET(self, httpbin):
|
||||
|
||||
auth = HTTPDigestAuth('user', 'wrongpass')
|
||||
url = httpbin('digest-auth', 'auth', 'user', 'pass')
|
||||
@@ -401,7 +445,7 @@ class RequestsTestCase(unittest.TestCase):
|
||||
r = s.get(url)
|
||||
assert r.status_code == 401
|
||||
|
||||
def test_DIGESTAUTH_QUOTES_QOP_VALUE(self):
|
||||
def test_DIGESTAUTH_QUOTES_QOP_VALUE(self, httpbin):
|
||||
|
||||
auth = HTTPDigestAuth('user', 'pass')
|
||||
url = httpbin('digest-auth', 'auth', 'user', 'pass')
|
||||
@@ -409,7 +453,7 @@ class RequestsTestCase(unittest.TestCase):
|
||||
r = requests.get(url, auth=auth)
|
||||
assert '"auth"' in r.request.headers['Authorization']
|
||||
|
||||
def test_POSTBIN_GET_POST_FILES(self):
|
||||
def test_POSTBIN_GET_POST_FILES(self, httpbin):
|
||||
|
||||
url = httpbin('post')
|
||||
post1 = requests.post(url).raise_for_status()
|
||||
@@ -427,7 +471,7 @@ class RequestsTestCase(unittest.TestCase):
|
||||
with pytest.raises(ValueError):
|
||||
requests.post(url, files=['bad file data'])
|
||||
|
||||
def test_POSTBIN_GET_POST_FILES_WITH_DATA(self):
|
||||
def test_POSTBIN_GET_POST_FILES_WITH_DATA(self, httpbin):
|
||||
|
||||
url = httpbin('post')
|
||||
post1 = requests.post(url).raise_for_status()
|
||||
@@ -446,17 +490,17 @@ class RequestsTestCase(unittest.TestCase):
|
||||
with pytest.raises(ValueError):
|
||||
requests.post(url, files=['bad file data'])
|
||||
|
||||
def test_conflicting_post_params(self):
|
||||
def test_conflicting_post_params(self, httpbin):
|
||||
url = httpbin('post')
|
||||
with open('requirements.txt') as f:
|
||||
pytest.raises(ValueError, "requests.post(url, data='[{\"some\": \"data\"}]', files={'some': f})")
|
||||
pytest.raises(ValueError, "requests.post(url, data=u('[{\"some\": \"data\"}]'), files={'some': f})")
|
||||
|
||||
def test_request_ok_set(self):
|
||||
def test_request_ok_set(self, httpbin):
|
||||
r = requests.get(httpbin('status', '404'))
|
||||
assert not r.ok
|
||||
|
||||
def test_status_raising(self):
|
||||
def test_status_raising(self, httpbin):
|
||||
r = requests.get(httpbin('status', '404'))
|
||||
with pytest.raises(requests.exceptions.HTTPError):
|
||||
r.raise_for_status()
|
||||
@@ -464,11 +508,11 @@ class RequestsTestCase(unittest.TestCase):
|
||||
r = requests.get(httpbin('status', '500'))
|
||||
assert not r.ok
|
||||
|
||||
def test_decompress_gzip(self):
|
||||
def test_decompress_gzip(self, httpbin):
|
||||
r = requests.get(httpbin('gzip'))
|
||||
r.content.decode('ascii')
|
||||
|
||||
def test_unicode_get(self):
|
||||
def test_unicode_get(self, httpbin):
|
||||
url = httpbin('/get')
|
||||
requests.get(url, params={'foo': 'føø'})
|
||||
requests.get(url, params={'føø': 'føø'})
|
||||
@@ -476,29 +520,29 @@ class RequestsTestCase(unittest.TestCase):
|
||||
requests.get(url, params={'foo': 'foo'})
|
||||
requests.get(httpbin('ø'), params={'foo': 'foo'})
|
||||
|
||||
def test_unicode_header_name(self):
|
||||
def test_unicode_header_name(self, httpbin):
|
||||
requests.put(
|
||||
httpbin('put'),
|
||||
headers={str('Content-Type'): 'application/octet-stream'},
|
||||
data='\xff') # compat.str is unicode.
|
||||
|
||||
def test_pyopenssl_redirect(self):
|
||||
requests.get('https://httpbin.org/status/301')
|
||||
def test_pyopenssl_redirect(self, httpsbin_url, httpbin_ca_bundle):
|
||||
requests.get(httpsbin_url('status', '301'), verify=httpbin_ca_bundle)
|
||||
|
||||
def test_urlencoded_get_query_multivalued_param(self):
|
||||
def test_urlencoded_get_query_multivalued_param(self, httpbin):
|
||||
|
||||
r = requests.get(httpbin('get'), params=dict(test=['foo', 'baz']))
|
||||
assert r.status_code == 200
|
||||
assert r.url == httpbin('get?test=foo&test=baz')
|
||||
|
||||
def test_different_encodings_dont_break_post(self):
|
||||
def test_different_encodings_dont_break_post(self, httpbin):
|
||||
r = requests.post(httpbin('post'),
|
||||
data={'stuff': json.dumps({'a': 123})},
|
||||
params={'blah': 'asdf1234'},
|
||||
files={'file': ('test_requests.py', open(__file__, 'rb'))})
|
||||
assert r.status_code == 200
|
||||
|
||||
def test_unicode_multipart_post(self):
|
||||
def test_unicode_multipart_post(self, httpbin):
|
||||
r = requests.post(httpbin('post'),
|
||||
data={'stuff': u('ëlïxr')},
|
||||
files={'file': ('test_requests.py', open(__file__, 'rb'))})
|
||||
@@ -519,7 +563,7 @@ class RequestsTestCase(unittest.TestCase):
|
||||
files={'file': ('test_requests.py', open(__file__, 'rb'))})
|
||||
assert r.status_code == 200
|
||||
|
||||
def test_unicode_multipart_post_fieldnames(self):
|
||||
def test_unicode_multipart_post_fieldnames(self, httpbin):
|
||||
filename = os.path.splitext(__file__)[0] + '.py'
|
||||
r = requests.Request(method='POST',
|
||||
url=httpbin('post'),
|
||||
@@ -530,13 +574,24 @@ class RequestsTestCase(unittest.TestCase):
|
||||
assert b'name="stuff"' in prep.body
|
||||
assert b'name="b\'stuff\'"' not in prep.body
|
||||
|
||||
def test_unicode_method_name(self):
|
||||
def test_unicode_method_name(self, httpbin):
|
||||
files = {'file': open('test_requests.py', 'rb')}
|
||||
r = requests.request(
|
||||
method=u('POST'), url=httpbin('post'), files=files)
|
||||
assert r.status_code == 200
|
||||
|
||||
def test_custom_content_type(self):
|
||||
def test_unicode_method_name_with_request_object(self, httpbin):
|
||||
files = {'file': open('test_requests.py', 'rb')}
|
||||
s = requests.Session()
|
||||
req = requests.Request(u("POST"), httpbin('post'), files=files)
|
||||
prep = s.prepare_request(req)
|
||||
assert isinstance(prep.method, builtin_str)
|
||||
assert prep.method == "POST"
|
||||
|
||||
resp = s.send(prep)
|
||||
assert resp.status_code == 200
|
||||
|
||||
def test_custom_content_type(self, httpbin):
|
||||
r = requests.post(
|
||||
httpbin('post'),
|
||||
data={'stuff': json.dumps({'a': 123})},
|
||||
@@ -546,38 +601,38 @@ class RequestsTestCase(unittest.TestCase):
|
||||
assert r.status_code == 200
|
||||
assert b"text/py-content-type" in r.request.body
|
||||
|
||||
def test_hook_receives_request_arguments(self):
|
||||
def test_hook_receives_request_arguments(self, httpbin):
|
||||
def hook(resp, **kwargs):
|
||||
assert resp is not None
|
||||
assert kwargs != {}
|
||||
|
||||
requests.Request('GET', HTTPBIN, hooks={'response': hook})
|
||||
requests.Request('GET', httpbin(), hooks={'response': hook})
|
||||
|
||||
def test_session_hooks_are_used_with_no_request_hooks(self):
|
||||
def test_session_hooks_are_used_with_no_request_hooks(self, httpbin):
|
||||
hook = lambda x, *args, **kwargs: x
|
||||
s = requests.Session()
|
||||
s.hooks['response'].append(hook)
|
||||
r = requests.Request('GET', HTTPBIN)
|
||||
r = requests.Request('GET', httpbin())
|
||||
prep = s.prepare_request(r)
|
||||
assert prep.hooks['response'] != []
|
||||
assert prep.hooks['response'] == [hook]
|
||||
|
||||
def test_session_hooks_are_overriden_by_request_hooks(self):
|
||||
def test_session_hooks_are_overridden_by_request_hooks(self, httpbin):
|
||||
hook1 = lambda x, *args, **kwargs: x
|
||||
hook2 = lambda x, *args, **kwargs: x
|
||||
assert hook1 is not hook2
|
||||
s = requests.Session()
|
||||
s.hooks['response'].append(hook2)
|
||||
r = requests.Request('GET', HTTPBIN, hooks={'response': [hook1]})
|
||||
r = requests.Request('GET', httpbin(), hooks={'response': [hook1]})
|
||||
prep = s.prepare_request(r)
|
||||
assert prep.hooks['response'] == [hook1]
|
||||
|
||||
def test_prepared_request_hook(self):
|
||||
def test_prepared_request_hook(self, httpbin):
|
||||
def hook(resp, **kwargs):
|
||||
resp.hook_working = True
|
||||
return resp
|
||||
|
||||
req = requests.Request('GET', HTTPBIN, hooks={'response': hook})
|
||||
req = requests.Request('GET', httpbin(), hooks={'response': hook})
|
||||
prep = req.prepare()
|
||||
|
||||
s = requests.Session()
|
||||
@@ -586,7 +641,7 @@ class RequestsTestCase(unittest.TestCase):
|
||||
|
||||
assert hasattr(resp, 'hook_working')
|
||||
|
||||
def test_prepared_from_session(self):
|
||||
def test_prepared_from_session(self, httpbin):
|
||||
class DummyAuth(requests.auth.AuthBase):
|
||||
def __call__(self, r):
|
||||
r.headers['Dummy-Auth-Test'] = 'dummy-auth-test-ok'
|
||||
@@ -739,7 +794,7 @@ class RequestsTestCase(unittest.TestCase):
|
||||
# make sure one can use items multiple times
|
||||
assert list(items) == list(items)
|
||||
|
||||
def test_time_elapsed_blank(self):
|
||||
def test_time_elapsed_blank(self, httpbin):
|
||||
r = requests.get(httpbin('get'))
|
||||
td = r.elapsed
|
||||
total_seconds = ((td.microseconds + (td.seconds + td.days * 24 * 3600)
|
||||
@@ -778,7 +833,7 @@ class RequestsTestCase(unittest.TestCase):
|
||||
chunks = r.iter_content(decode_unicode=True)
|
||||
assert all(isinstance(chunk, str) for chunk in chunks)
|
||||
|
||||
def test_request_and_response_are_pickleable(self):
|
||||
def test_request_and_response_are_pickleable(self, httpbin):
|
||||
r = requests.get(httpbin('get'))
|
||||
|
||||
# verify we can pickle the original request
|
||||
@@ -810,8 +865,8 @@ class RequestsTestCase(unittest.TestCase):
|
||||
url = 'http://user:pass%23pass@complex.url.com/path?query=yes'
|
||||
assert ('user', 'pass#pass') == requests.utils.get_auth_from_url(url)
|
||||
|
||||
def test_cannot_send_unprepared_requests(self):
|
||||
r = requests.Request(url=HTTPBIN)
|
||||
def test_cannot_send_unprepared_requests(self, httpbin):
|
||||
r = requests.Request(url=httpbin())
|
||||
with pytest.raises(ValueError):
|
||||
requests.Session().send(r)
|
||||
|
||||
@@ -825,7 +880,7 @@ class RequestsTestCase(unittest.TestCase):
|
||||
assert str(error) == 'message'
|
||||
assert error.response == response
|
||||
|
||||
def test_session_pickling(self):
|
||||
def test_session_pickling(self, httpbin):
|
||||
r = requests.Request('GET', httpbin('get'))
|
||||
s = requests.Session()
|
||||
|
||||
@@ -835,7 +890,7 @@ class RequestsTestCase(unittest.TestCase):
|
||||
r = s.send(r.prepare())
|
||||
assert r.status_code == 200
|
||||
|
||||
def test_fixes_1329(self):
|
||||
def test_fixes_1329(self, httpbin):
|
||||
"""
|
||||
Ensure that header updates are done case-insensitively.
|
||||
"""
|
||||
@@ -848,7 +903,7 @@ class RequestsTestCase(unittest.TestCase):
|
||||
assert headers['Accept'] == 'application/json'
|
||||
assert headers['ACCEPT'] == 'application/json'
|
||||
|
||||
def test_uppercase_scheme_redirect(self):
|
||||
def test_uppercase_scheme_redirect(self, httpbin):
|
||||
parts = urlparse(httpbin('html'))
|
||||
url = "HTTP://" + parts.netloc + parts.path
|
||||
r = requests.get(httpbin('redirect-to'), params={'url': url})
|
||||
@@ -893,14 +948,14 @@ class RequestsTestCase(unittest.TestCase):
|
||||
assert 'http://' in s2.adapters
|
||||
assert 'https://' in s2.adapters
|
||||
|
||||
def test_header_remove_is_case_insensitive(self):
|
||||
def test_header_remove_is_case_insensitive(self, httpbin):
|
||||
# From issue #1321
|
||||
s = requests.Session()
|
||||
s.headers['foo'] = 'bar'
|
||||
r = s.get(httpbin('get'), headers={'FOO': None})
|
||||
assert 'foo' not in r.request.headers
|
||||
|
||||
def test_params_are_merged_case_sensitive(self):
|
||||
def test_params_are_merged_case_sensitive(self, httpbin):
|
||||
s = requests.Session()
|
||||
s.params['foo'] = 'bar'
|
||||
r = s.get(httpbin('get'), params={'FOO': 'bar'})
|
||||
@@ -915,7 +970,7 @@ class RequestsTestCase(unittest.TestCase):
|
||||
r = requests.Request('GET', url).prepare()
|
||||
assert r.url == url
|
||||
|
||||
def test_header_keys_are_native(self):
|
||||
def test_header_keys_are_native(self, httpbin):
|
||||
headers = {u('unicode'): 'blah', 'byte'.encode('ascii'): 'blah'}
|
||||
r = requests.Request('GET', httpbin('get'), headers=headers)
|
||||
p = r.prepare()
|
||||
@@ -925,7 +980,7 @@ class RequestsTestCase(unittest.TestCase):
|
||||
assert 'unicode' in p.headers.keys()
|
||||
assert 'byte' in p.headers.keys()
|
||||
|
||||
def test_can_send_nonstring_objects_with_files(self):
|
||||
def test_can_send_nonstring_objects_with_files(self, httpbin):
|
||||
data = {'a': 0.0}
|
||||
files = {'b': 'foo'}
|
||||
r = requests.Request('POST', httpbin('post'), data=data, files=files)
|
||||
@@ -933,7 +988,20 @@ class RequestsTestCase(unittest.TestCase):

        assert 'multipart/form-data' in p.headers['Content-Type']

    def test_can_send_file_object_with_non_string_filename(self):
    def test_can_send_bytes_bytearray_objects_with_files(self, httpbin):
        # Test bytes:
        data = {'a': 'this is a string'}
        files = {'b': b'foo'}
        r = requests.Request('POST', httpbin('post'), data=data, files=files)
        p = r.prepare()
        assert 'multipart/form-data' in p.headers['Content-Type']
        # Test bytearrays:
        files = {'b': bytearray(b'foo')}
        r = requests.Request('POST', httpbin('post'), data=data, files=files)
        p = r.prepare()
        assert 'multipart/form-data' in p.headers['Content-Type']

    def test_can_send_file_object_with_non_string_filename(self, httpbin):
        f = io.BytesIO()
        f.name = 2
        r = requests.Request('POST', httpbin('post'), files={'f': f})
@@ -941,7 +1009,7 @@ class RequestsTestCase(unittest.TestCase):

        assert 'multipart/form-data' in p.headers['Content-Type']

    def test_autoset_header_values_are_native(self):
    def test_autoset_header_values_are_native(self, httpbin):
        data = 'this is a string'
        length = '16'
        req = requests.Request('POST', httpbin('post'), data=data)
@@ -960,7 +1028,7 @@ class RequestsTestCase(unittest.TestCase):
        preq = req.prepare()
        assert test_url == preq.url

    def test_auth_is_stripped_on_redirect_off_host(self):
    def test_auth_is_stripped_on_redirect_off_host(self, httpbin):
        r = requests.get(
            httpbin('redirect-to'),
            params={'url': 'http://www.google.co.uk'},
@@ -969,14 +1037,14 @@ class RequestsTestCase(unittest.TestCase):
        assert r.history[0].request.headers['Authorization']
        assert not r.request.headers.get('Authorization', '')

    def test_auth_is_retained_for_redirect_on_host(self):
    def test_auth_is_retained_for_redirect_on_host(self, httpbin):
        r = requests.get(httpbin('redirect/1'), auth=('user', 'pass'))
        h1 = r.history[0].request.headers['Authorization']
        h2 = r.request.headers['Authorization']

        assert h1 == h2

    def test_manual_redirect_with_partial_body_read(self):
    def test_manual_redirect_with_partial_body_read(self, httpbin):
        s = requests.Session()
        r1 = s.get(httpbin('redirect/2'), allow_redirects=False, stream=True)
        assert r1.is_redirect
@@ -1009,7 +1077,7 @@ class RequestsTestCase(unittest.TestCase):

        adapter.build_response = build_response

    def test_redirect_with_wrong_gzipped_header(self):
    def test_redirect_with_wrong_gzipped_header(self, httpbin):
        s = requests.Session()
        url = httpbin('redirect/1')
        self._patch_adapter_gzipped_redirect(s, url)
@@ -1020,7 +1088,7 @@ class RequestsTestCase(unittest.TestCase):
        assert isinstance(s, builtin_str)
        assert s == "Basic dGVzdDp0ZXN0"

    def test_requests_history_is_saved(self):
    def test_requests_history_is_saved(self, httpbin):
        r = requests.get(httpbin('redirect/5'))
        total = r.history[-1].history
        i = 0
@@ -1028,7 +1096,7 @@ class RequestsTestCase(unittest.TestCase):
            assert item.history == total[0:i]
            i = i + 1

    def test_json_param_post_content_type_works(self):
    def test_json_param_post_content_type_works(self, httpbin):
        r = requests.post(
            httpbin('post'),
            json={'life': 42}
@@ -1037,6 +1105,39 @@ class RequestsTestCase(unittest.TestCase):
        assert 'application/json' in r.request.headers['Content-Type']
        assert {'life': 42} == r.json()['json']

    def test_json_param_post_should_not_override_data_param(self, httpbin):
        r = requests.Request(method='POST', url=httpbin('post'),
                             data={'stuff': 'elixr'},
                             json={'music': 'flute'})
        prep = r.prepare()
        assert 'stuff=elixr' == prep.body

    def test_response_iter_lines(self, httpbin):
        r = requests.get(httpbin('stream/4'), stream=True)
        assert r.status_code == 200

        it = r.iter_lines()
        next(it)
        assert len(list(it)) == 3

    def test_unconsumed_session_response_closes_connection(self, httpbin):
        s = requests.session()

        with contextlib.closing(s.get(httpbin('stream/4'), stream=True)) as response:
            pass

        assert response._content_consumed is False
        assert response.raw.closed

    @pytest.mark.xfail
    def test_response_iter_lines_reentrant(self, httpbin):
        """Response.iter_lines() is not reentrant safe"""
        r = requests.get(httpbin('stream/4'), stream=True)
        assert r.status_code == 200

        next(r.iter_lines())
        assert len(list(r.iter_lines())) == 3


class TestContentEncodingDetection(unittest.TestCase):

@@ -1182,6 +1283,7 @@ class TestCaseInsensitiveDict(unittest.TestCase):
        del othercid['spam']
        assert cid != othercid
        assert cid == {'spam': 'blueval', 'eggs': 'redval'}
        assert cid != object()

    def test_setdefault(self):
        cid = CaseInsensitiveDict({'Spam': 'blueval'})
@@ -1219,6 +1321,16 @@ class TestCaseInsensitiveDict(unittest.TestCase):
        assert frozenset(cid.keys()) == keyset
        assert frozenset(cid) == keyset

    def test_copy(self):
        cid = CaseInsensitiveDict({
            'Accept': 'application/json',
            'user-Agent': 'requests',
        })
        cid_copy = cid.copy()
        assert cid == cid_copy
        cid['changed'] = True
        assert cid != cid_copy


class UtilsTestCase(unittest.TestCase):

@@ -1244,6 +1356,13 @@ class UtilsTestCase(unittest.TestCase):
        assert super_len(
            cStringIO.StringIO('but some how, some way...')) == 25

    def test_super_len_correctly_calculates_len_of_partially_read_file(self):
        """Ensure that we handle partially consumed file like objects."""
        from requests.utils import super_len
        s = StringIO.StringIO()
        s.write('foobarbogus')
        assert super_len(s) == 0

    def test_get_environ_proxies_ip_ranges(self):
        """Ensures that IP addresses are correctly matches with ranges
        in no_proxy variable."""
@@ -1265,6 +1384,41 @@ class UtilsTestCase(unittest.TestCase):
            'http://localhost.localdomain:5000/v1.0/') == {}
        assert get_environ_proxies('http://www.requests.com/') != {}

    def test_select_proxies(self):
        """Make sure we can select per-host proxies correctly."""
        from requests.utils import select_proxy
        proxies = {'http': 'http://http.proxy',
                   'http://some.host': 'http://some.host.proxy'}
        assert select_proxy('hTTp://u:p@Some.Host/path', proxies) == 'http://some.host.proxy'
        assert select_proxy('hTTp://u:p@Other.Host/path', proxies) == 'http://http.proxy'
        assert select_proxy('hTTps://Other.Host', proxies) is None

    def test_guess_filename_when_int(self):
        from requests.utils import guess_filename
        assert None is guess_filename(1)

    def test_guess_filename_when_filename_is_an_int(self):
        from requests.utils import guess_filename
        fake = type('Fake', (object,), {'name': 1})()
        assert None is guess_filename(fake)

    def test_guess_filename_with_file_like_obj(self):
        from requests.utils import guess_filename
        from requests import compat
        fake = type('Fake', (object,), {'name': b'value'})()
        guessed_name = guess_filename(fake)
        assert b'value' == guessed_name
        assert isinstance(guessed_name, compat.bytes)

    def test_guess_filename_with_unicode_name(self):
        from requests.utils import guess_filename
        from requests import compat
        filename = b'value'.decode('utf-8')
        fake = type('Fake', (object,), {'name': filename})()
        guessed_name = guess_filename(fake)
        assert filename == guessed_name
        assert isinstance(guessed_name, compat.str)

    def test_is_ipv4_address(self):
        from requests.utils import is_ipv4_address
        assert is_ipv4_address('8.8.8.8')
@@ -1301,6 +1455,22 @@ class UtilsTestCase(unittest.TestCase):
        assert username == percent_encoding_test_chars
        assert password == percent_encoding_test_chars

    def test_requote_uri_with_unquoted_percents(self):
        """Ensure we handle unquoted percent signs in redirects.

        See: https://github.com/kennethreitz/requests/issues/2356
        """
        from requests.utils import requote_uri
        bad_uri = 'http://example.com/fiz?buz=%ppicture'
        quoted = 'http://example.com/fiz?buz=%25ppicture'
        assert quoted == requote_uri(bad_uri)

    def test_requote_uri_properly_requotes(self):
        """Ensure requoting doesn't break expectations."""
        from requests.utils import requote_uri
        quoted = 'http://example.com/fiz?buz=%25ppicture'
        assert quoted == requote_uri(quoted)


class TestMorselToCookieExpires(unittest.TestCase):

@@ -1361,13 +1531,13 @@ class TestMorselToCookieMaxAge(unittest.TestCase):


class TestTimeout:
    def test_stream_timeout(self):
    def test_stream_timeout(self, httpbin):
        try:
            requests.get(httpbin('delay/10'), timeout=2.0)
        except requests.exceptions.Timeout as e:
            assert 'Read timed out' in e.args[0].args[0]

    def test_invalid_timeout(self):
    def test_invalid_timeout(self, httpbin):
        with pytest.raises(ValueError) as e:
            requests.get(httpbin('get'), timeout=(3, 4, 5))
        assert '(connect, read)' in str(e)
@@ -1376,7 +1546,7 @@ class TestTimeout:
            requests.get(httpbin('get'), timeout="foo")
        assert 'must be an int or float' in str(e)

    def test_none_timeout(self):
    def test_none_timeout(self, httpbin):
        """ Check that you can set None as a valid timeout value.

        To actually test this behavior, we'd want to check that setting the
@@ -1388,7 +1558,7 @@ class TestTimeout:
        r = requests.get(httpbin('get'), timeout=None)
        assert r.status_code == 200

    def test_read_timeout(self):
    def test_read_timeout(self, httpbin):
        try:
            requests.get(httpbin('delay/10'), timeout=(None, 0.1))
            assert False, "The recv() request should time out."
@@ -1410,7 +1580,7 @@ class TestTimeout:
        except ConnectTimeout:
            pass

    def test_encoded_methods(self):
    def test_encoded_methods(self, httpbin):
        """See: https://github.com/kennethreitz/requests/issues/2316"""
        r = requests.request(b'GET', httpbin('get'))
        assert r.ok
@@ -1461,7 +1631,7 @@ class TestRedirects:
        'proxies': {},
    }

    def test_requests_are_updated_each_time(self):
    def test_requests_are_updated_each_time(self, httpbin):
        session = RedirectSession([303, 307])
        prep = requests.Request('POST', httpbin('post')).prepare()
        r0 = session.send(prep)
@@ -1539,12 +1709,11 @@ def test_prepare_unicode_url():
    p.prepare(
        method='GET',
        url=u('http://www.example.com/üniçø∂é'),
        hooks=[]
    )
    assert_copy(p, p.copy())


def test_urllib3_retries():
def test_urllib3_retries(httpbin):
    from requests.packages.urllib3.util import Retry
    s = requests.Session()
    s.mount('http://', HTTPAdapter(max_retries=Retry(
@@ -1554,5 +1723,24 @@ def test_urllib3_retries():
    with pytest.raises(RetryError):
        s.get(httpbin('status/500'))


def test_urllib3_pool_connection_closed(httpbin):
    s = requests.Session()
    s.mount('http://', HTTPAdapter(pool_connections=0, pool_maxsize=0))

    try:
        s.get(httpbin('status/200'))
    except ConnectionError as e:
        assert u"Pool is closed." in str(e)


def test_vendor_aliases():
    from requests.packages import urllib3
    from requests.packages import chardet

    with pytest.raises(ImportError):
        from requests.packages import webbrowser


if __name__ == '__main__':
    unittest.main()