[Yum-devel] RHN Support

Panu Matilainen pmatilai at welho.com
Thu Mar 24 08:41:56 UTC 2005


On Tue, 22 Mar 2005, Michael Stenner wrote:

> On Tue, Mar 22, 2005 at 10:40:11AM +0200, Panu Matilainen wrote:
>> For some reason urlgrabber (urllib actually) doesn't seem to like our
>> proxies with https however, I just get "urlgrabber.grabber.URLGrabError:
>> [Errno 4] IOError: HTTP Error 500: Server Error" response from the proxy
>> where the error text from proxy is "'Error occurred:<P>\n',
>> '[code=PARENT_NEEDED] Unable to service this URL without parent cache."
>>
>> Some added "magic" is apparently needed since rhnlib is able to go through
>> it but urllib isn't. Actually the same problem happens with wget vs curl:
>> curl is able to go through the proxy with https, wget isn't, with the same
>> exact proxy config. Any ideas?
>
> Gah... proxies are really getting on my nerves.  I'd recommend
> packet-sniffing with all four (rhnlib, urlgrabber/urllib2, wget, and
> curl) and see if a pattern drops out.  I guess if it only happens with
> https, packet-sniffing is a pain.  For the record, new (>=2)
> urlgrabber uses urllib2, and only urllib for a few functions
> (splitting and recombining urls, for example).

It indeed only happens with https :-/

Looking at the curl output in verbose mode and the rhnlib sources... I don't 
know for sure, but maybe the relevant hint is here:

...
* Establish HTTP proxy tunnel to bugzilla.redhat.com:443
< HTTP/1.0 200 Connection established
< Proxy-Agent: NetCache NetApp/5.6.2
<
* Proxy replied OK to CONNECT request
...
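
If I read that right, curl first asks the proxy for a raw tunnel with a 
CONNECT request and only starts the SSL handshake once the proxy has 
answered 200 on that tunnel. At the socket level I imagine it boils down 
to roughly this (the proxy address below is made up, this is just to 
illustrate what I think is going on, not actual yum/urlgrabber code):

    import socket

    proxy_host, proxy_port = "proxy.example.com", 3128   # made-up proxy
    target = "bugzilla.redhat.com:443"

    sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    sock.connect((proxy_host, proxy_port))

    # Step 1: ask the proxy to open a raw tunnel to the target
    sock.sendall("CONNECT %s HTTP/1.0\r\n\r\n" % target)

    # Step 2: the proxy answers with its own status line, e.g.
    # "HTTP/1.0 200 Connection established", terminated by a blank line
    reply = ""
    while "\r\n\r\n" not in reply:
        chunk = sock.recv(1024)
        if not chunk:
            raise IOError("proxy closed the connection during CONNECT")
        reply += chunk
    print reply.splitlines()[0]

    # Step 3: only now does the client do the SSL handshake, over the
    # tunneled socket, as if it were talking to the target directly
    ssl = socket.ssl(sock)
    ssl.write("GET / HTTP/1.0\r\nHost: bugzilla.redhat.com\r\n\r\n")
    print ssl.read(200)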

Rhnlib has a specific HTTPSProxyConnection class which apparently does 
something special related to this:
     def connect(self):
         # Set the connection with the proxy
         HTTPProxyConnection.connect(self)
         # Use the stock HTTPConnection putrequest
         host = "%s:%s" % (self._host, self._port)
         HTTPConnection.putrequest(self, "CONNECT", host)
         # Add proxy-specific stuff
         self._add_proxy_headers()
         # And send the request
         HTTPConnection.endheaders(self)
         # Save the response class
         response_class = self.response_class
         # And replace the response class with our own one, which does not
         # close the connection after
         self.response_class = HTTPSProxyResponse
         response = HTTPConnection.getresponse(self)
         # Restore the response class
         self.response_class = response_class
         ...

...and I don't see urllib[2] or urlgrabber doing anything like that. 
Again, this is just a braindump of what I've gathered so far; I haven't had 
time to actually try this out in a separate piece of code.
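
If I had to guess, the missing piece on the urllib2/httplib side would look 
something like the sketch below -- completely untested, the class name is my 
own invention, and error handling is hand-waved:

    import httplib, socket

    class ProxyHTTPSConnection(httplib.HTTPSConnection):
        """An HTTPSConnection that tunnels through a proxy via CONNECT."""

        def __init__(self, proxy_host, proxy_port, real_host, real_port=443):
            # We actually connect() to the proxy, not to the real host
            httplib.HTTPSConnection.__init__(self, proxy_host, proxy_port)
            self._real_host = real_host
            self._real_port = real_port

        def connect(self):
            # Plain TCP connection to the proxy
            sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
            sock.connect((self.host, self.port))
            # Ask the proxy for a tunnel to the real target
            sock.sendall("CONNECT %s:%d HTTP/1.0\r\n\r\n"
                         % (self._real_host, self._real_port))
            # Read (and throw away) the proxy's reply headers
            reply = ""
            while "\r\n\r\n" not in reply:
                chunk = sock.recv(1024)
                if not chunk:
                    raise socket.error("proxy closed connection during CONNECT")
                reply += chunk
            if " 200 " not in reply.splitlines()[0]:
                raise socket.error("proxy refused CONNECT: %s"
                                   % reply.splitlines()[0])
            # Only now start SSL, over the tunneled socket, the same way
            # the stock HTTPSConnection.connect() does for a direct connection
            ssl = socket.ssl(sock, self.key_file, self.cert_file)
            self.sock = httplib.FakeSocket(sock, ssl)

Hooking that up to urllib2 would presumably need a custom HTTPSHandler on 
top, but one step at a time.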

 	- Panu -


