| Commit message | Author | Age | Files | Lines |
|
(John J Lee)
Minor code clarification and simplification.
|
removing some duplication and gaining some flexibility in the process.
|
Pass the full URL to find_user_password(), in particular so that hosts
with port numbers can be looked up.
Also specify the digest algorithm, even if it's MD5. Titus Brown
verified that this fixes a problem with LiveJournal.
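For illustration, a Python 2 sketch of that lookup; the host, port and credentials are made-up examples, and HTTPPasswordMgrWithDefaultRealm stands in for whatever password manager is in use:

    import urllib2

    # Register credentials against a URL that carries an explicit port.
    mgr = urllib2.HTTPPasswordMgrWithDefaultRealm()
    mgr.add_password(None, "http://www.example.com:8080/", "user", "secret")

    # Passing the full request URL lets the manager match entries that are
    # keyed by "host:port" rather than by bare host name.
    user, password = mgr.find_user_password(
        None, "http://www.example.com:8080/protected/index.html")

    # A digest handler built on the same manager can then emit an
    # Authorization header that names the algorithm explicitly (e.g. MD5).
    opener = urllib2.build_opener(urllib2.HTTPDigestAuthHandler(mgr))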
|
used to replace rfc822.formatdate for protocols like HTTP (where 'GMT' must
be the timezone string).
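For illustration only (http_date is a hypothetical helper, not the function this change introduced), one way to produce such a string in Python 2:

    import time

    def http_date(timeval=None):
        # HTTP headers such as Date: and If-Modified-Since: want the literal
        # string "GMT" as the zone, e.g. "Sun, 06 Nov 1994 08:49:37 GMT".
        if timeval is None:
            timeval = time.time()
        return time.strftime("%a, %d %b %Y %H:%M:%S GMT", time.gmtime(timeval))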
|
The change to use the newer httplib interface admitted the possibility
that we'd get an HTTP/1.1 chunked response, but the code didn't handle
it correctly. The raw socket object can't be passed to addinfourl(),
because it would read the undecoded response. Instead, addinfourl()
must call HTTPResponse.read(), which will handle the decoding.
One extra wrinkle is that the HTTPResponse object can't be passed to
addinfourl() either, because it doesn't implement readline() or
readlines(). As a quick hack, use socket._fileobject(), which
implements those methods on top of a read buffer. (suggested by mwh)
Finally, add some tests based on test_urllibnet.
Thanks to Andrew Sawyers for originally reporting the chunked problem.
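A rough sketch of the wrapping described above, assuming Python 2's httplib and the undocumented socket._fileobject helper (wrap_response is an illustrative name):

    import socket
    from urllib import addinfourl

    def wrap_response(req_url, r):
        # r is an httplib.HTTPResponse.  Its read() knows how to decode a
        # chunked HTTP/1.1 body, but it has no readline()/readlines(), and
        # the raw socket would return the undecoded stream.
        # socket._fileobject() expects a recv() method, so alias it to
        # read() and let the file object provide the line-oriented API.
        r.recv = r.read
        fp = socket._fileobject(r)
        return addinfourl(fp, r.msg, req_url)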
|
Will backport to 2.3.
|
Modified Files:
urllib2.py test/test_urllib2.py
|
legal if a Range: header was supplied.
(Actually, should the first 'if' statement be modified to allow any 2xx status code?)
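As a sketch of the check being discussed (the class name is made up for the example):

    import urllib2

    class LenientErrorProcessor(urllib2.BaseHandler):
        def http_response(self, request, response):
            code = response.code
            # 206 (Partial Content) is a success when the request carried a
            # Range: header, so only other codes go to the error handlers.
            if code not in (200, 206):
                response = self.parent.error(
                    "http", request, response, code, response.msg,
                    response.info())
            return response

        https_response = http_response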
|
Cleanup: use a condition consistent with the code above.
CookieJar is in cookielib.
|
implemented in patch [ 851736 ].
|
Backported to 2.3.
|
Invoke the standard error handlers for non-200 responses.
Always supply a "Connection: close" header to prevent the server from
leaving the connection open. Downstream users of the socket may
attempt recv()/read() with no arguments, which would block if the
connection were kept open.
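Seen from the caller's side, the two behaviours look roughly like this (Python 2; the handler adds the Connection: close header itself, it is spelled out here only for illustration):

    import urllib2

    req = urllib2.Request("http://www.example.com/")
    # Ask the server to close the connection after responding, so a later
    # read()/recv() with no arguments cannot block on a kept-alive socket.
    req.add_unredirected_header("Connection", "close")

    try:
        resp = urllib2.urlopen(req)
    except urllib2.HTTPError, err:
        # Non-200 responses are now routed through the standard error
        # handlers, which by default raise HTTPError.
        print err.code, err.msg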
|
Backported to 2.3.
|
Backported to 2.3.
|
The chief benefit of this change is that requests will now use
HTTP/1.1 instead of HTTP/1.0. Bump the module version number as part
of the change.
There are two possible incompatibilities that we'll need to watch out
for when we get to an alpha release. We may get a different class of
exceptions out of httplib, and the do_open() method changed its
signature. The latter is only important if anyone actually subclasses
AbstractHTTPHandler.
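A sketch of a handler written against the newer interface; it mirrors what the stock HTTPHandler does and shows the two-argument do_open() signature mentioned above:

    import httplib, urllib2

    class MyHTTPHandler(urllib2.AbstractHTTPHandler):
        def http_open(self, req):
            # do_open() now takes the connection class as its first
            # argument; passing httplib.HTTPConnection (rather than the
            # old httplib.HTTP class) is what yields HTTP/1.1 requests.
            return self.do_open(httplib.HTTPConnection, req)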
|
Keep close() methods for backwards compatibility.
Does anyone call close() explicitly?
|
John J. Lee writes: "the patch makes it possible to implement
functionality like HTTP cookie handling, Refresh handling,
etc. etc. using handler objects. At the moment urllib2's handler
objects aren't quite up to the job, which results in a lot of
cut-n-paste and subclassing. I believe the changes are
backwards-compatible, with the exception of people who've
reimplemented build_opener()'s functionality -- those people would
need to call opener.add_handler(HTTPErrorProcessor).
The main change is allowing handlers to implement methods like:
    http_request(request)
    http_response(request, response)
in addition to the usual
    http_open(request)
    http_error{_*}(...)
"
Note that the change isn't well documented, at least in part because
handlers aren't well documented at all. Need to fix this.
Add a bunch of new tests. It appears that none of these tests
actually use the network, so they don't need to be guarded by a
resource flag.
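A small, hypothetical handler showing the new hooks (the class name and header value are invented for the example):

    import urllib2

    class UserAgentProcessor(urllib2.BaseHandler):
        def http_request(self, request):
            # Runs before the request is sent; must return a request.
            if not request.has_header("User-agent"):
                request.add_header("User-agent", "example-client/0.1")
            return request

        def http_response(self, request, response):
            # Runs after the response arrives; must return a response.
            print "got", response.code, "for", request.get_full_url()
            return response

        https_request = http_request
        https_response = http_response

    opener = urllib2.build_opener(UserAgentProcessor())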
|
The patch was tweaked slightly. It uses a different mechanism for
generating the cnonce, drawing on /dev/urandom when possible to
produce less easily guessed random input.
Also rearrange the imports so that they are alphabetical and
duplicates are eliminated.
Add a few XXX comments about things left undone and things that could
be improved.
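A hedged sketch of that sort of helper, using Python 2's sha module (the function names are illustrative, not necessarily those in the patch):

    import os, random, time, sha

    def randombytes(n):
        # Prefer the kernel's entropy pool; fall back to the random module
        # where /dev/urandom does not exist.
        if os.path.exists("/dev/urandom"):
            f = open("/dev/urandom")
            data = f.read(n)
            f.close()
            return data
        return "".join([chr(random.randrange(0, 256)) for _ in range(n)])

    def make_cnonce(nonce, nonce_count):
        # Mix the server nonce, a request counter, the current time and some
        # random bytes so the client nonce is hard to guess.
        dig = sha.new("%s:%s:%s:%s" % (nonce_count, nonce, time.ctime(),
                                       randombytes(8))).hexdigest()
        return dig[:16]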
|
(From SF patch #810751)
|
Not sure if this fix is great, but it's probably a small improvement.
|
capitalize in AbstractHTTPHandler before inserting headers into HTTP instance.
Closes bug #649742, again.
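For illustration, the effect of normalizing header names with str.capitalize():

    headers = {"content-type": "text/plain", "X-Custom-Header": "1"}
    normalized = dict([(k.capitalize(), v) for k, v in headers.items()])
    # {'Content-type': 'text/plain', 'X-custom-header': '1'}
    # Lookups and overrides no longer depend on how the caller spelled the
    # header name, since every key is reduced to the same canonical form.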
|
have to insert it in front of other classes, nor do dirty tricks like
inserting a "dummy" HTTPHandler after a ProxyHandler when building an
opener with proxy support.
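For example, proxy support now just needs the ProxyHandler passed to build_opener(); the proxy URL below is a placeholder:

    import urllib2

    proxy = urllib2.ProxyHandler({"http": "http://proxy.example.com:3128/"})
    opener = urllib2.build_opener(proxy)
    # No need to put the handler in front of the defaults or to append a
    # "dummy" HTTPHandler; build_opener() sorts out the ordering.
    opener.open("http://www.python.org/")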
|
the loop.
|
back to item calls.
|
Closes patch #639139.
|
headers and not have any dependency on case. Closes patch #649742.
Also changed all instances of dict.items to dict.iteritems where appropriate.
|
The latest changes to the redirect handler couldn't possibly have been
tested, because they did not compute a newurl and failed with a
NameError. The __name__ == "__main__": block has a test for
redirects.
Also, fix SF bug 723831. A urlopen() that failed because the host was
not found raised a socket.gaierror, unlike earlier versions of
urllib2. The problem is that httplib actually establishes the
connection at a different point starting with Python 2.2. Move the
try/except to endheaders(), which is where the connection gets
established.
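A sketch of where the try/except ends up, assuming Python 2's httplib connection API (send_request is an illustrative helper, not the actual do_open() code):

    import socket, httplib, urllib2

    def send_request(host, selector, headers):
        h = httplib.HTTPConnection(host)
        h.putrequest("GET", selector)
        for name, value in headers.items():
            h.putheader(name, value)
        try:
            # Since Python 2.2, httplib does not connect until the headers
            # are sent, so this is where a bad host name surfaces.
            h.endheaders()
        except socket.error, err:
            # Turn low-level failures (including socket.gaierror) back into
            # the URLError that urlopen() callers expect.
            raise urllib2.URLError(err)
        return h.getresponse()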
|
(contributed by John J Lee)
|