When using urllib2 to download through an HTTP proxy that requires
authorisation, a broken HTTP request is sent. The initial request might
work, but subsequent requests sent using the same socket definitely
fail.
The problem occurs on Fedora Core 3 with Python 2.3.4. The buggy code
still exists in the Python library in 2.4.1.
I found the problem while using yum to download files via my company's
Microsoft ISA web proxy. The proxy requires authorisation. I set the
HTTP_PROXY environment variable to define the proxy like this:

export HTTP_PROXY=http://username:password@proxy.example.com:8080/
Analysis from my yum bugzilla report
(http://devel.linux.duke.edu/bugzilla/show_bug.cgi?id=441) follows.
The location is:

File: urllib2.py
Class: ProxyHandler
Function: proxy_open()
The basic proxy authorisation string is created using
base64.encodestring() and passed to the add_header() method of a
Request object. However, base64.encodestring() specifically appends a
trailing '\n', and when the headers are sent over the socket each one
is followed by '\r\n'. The server sees this double newline as the end
of the HTTP request, and treats the remaining headers as a second,
invalid request.
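The trailing newline is easy to demonstrate. A minimal sketch (in
modern Python the function is named base64.encodebytes(); the old
base64.encodestring() behaved the same way):

```python
import base64

creds = b"username:password"
encoded = base64.encodebytes(creds)  # appends a trailing b'\n'
print(repr(encoded))
print(repr(encoded.strip()))  # stripped, safe to use as a header value
```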
The broken request looks like this:
GET ...
Host: ...
Accept-Encoding: identity
Proxy-authorization: Basic xxxxxxxxxxxxxxxx
<-- Blank line which shouldn't be there
User-agent: urlgrabber/2.9.2
<-- Blank line ending HTTP request
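The corruption above can be reproduced in a few lines. This is a
sketch, not urllib2's actual serialisation code; the token is a dummy
placeholder:

```python
# Header lines are terminated with '\r\n' on the wire, so a header
# value that already ends in '\n' produces a premature blank line.
value = "Basic xxxxxxxxxxxxxxxx\n"  # unstripped encodestring() output
wire = ("Proxy-authorization: %s\r\n"
        "User-agent: urlgrabber/2.9.2\r\n") % value
# The '\n' followed by '\r\n' reads as a blank line mid-headers,
# which the server takes as the end of the request.
print(repr(wire))
```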
The fix is simply to remove the '\n' that base64.encodestring() added
before calling add_header(). Just use the string method strip(), as is
done at the only other place base64.encodestring() is used in
urllib2.py.
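A sketch of the fixed header construction. The helper name is
hypothetical (urllib2 builds the value inline); base64.encodebytes()
is the modern spelling of encodestring():

```python
import base64

def make_proxy_auth_header(user, password):
    # Hypothetical helper mirroring the fix: strip the newline that
    # base64 encoding appends before using the value as a header.
    raw = ("%s:%s" % (user, password)).encode("ascii")
    token = base64.encodebytes(raw).strip().decode("ascii")
    return "Basic " + token

print(make_proxy_auth_header("username", "password"))
```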