[Yum-devel] [PATCH 11/14] Implement parallel urlgrab()s

James Antill james at fedoraproject.org
Fri Oct 21 16:10:24 UTC 2011


On Fri, 2011-10-21 at 16:28 +0200, Zdeněk Pavlas wrote:
>          
>  #####################################################################
> +#  Downloader
> +#####################################################################
> +
> +class _AsyncCurlFile(PyCurlFileObject):
> +    def _do_open(self):
> +        self.curl_obj = pycurl.Curl() # don't reuse _curl_cache
> +        self._set_opts()
> +        self._do_open_fo() # open the file but don't grab

 Why can't we reuse _curl_cache ... what problems does that cause? IIRC
keepalive stops working when a fresh Curl handle is created for every
request.
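 The keepalive concern can be illustrated without pycurl at all (which
may not be installed); the stdlib analogue of reusing one Curl handle
is reusing one http.client connection object, which keeps the TCP
connection open across requests. This is a minimal, self-contained
sketch against a throwaway local server; the Handler class and server
setup are hypothetical scaffolding, not part of the patch.

```python
# Sketch: reusing one connection object preserves keep-alive across
# requests; a fresh object per request would open a new connection.
# Stdlib only, with a throwaway local HTTP server for self-containment.
import http.client
import http.server
import threading

class Handler(http.server.BaseHTTPRequestHandler):
    protocol_version = "HTTP/1.1"   # HTTP/1.1 => keep-alive by default
    def do_GET(self):
        body = b"ok"
        self.send_response(200)
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)
    def log_message(self, *args):
        pass                        # keep the demo quiet

server = http.server.ThreadingHTTPServer(("127.0.0.1", 0), Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()

conn = http.client.HTTPConnection("127.0.0.1", server.server_address[1])
bodies = []
for _ in range(2):                  # two requests over the SAME connection
    conn.request("GET", "/")
    bodies.append(conn.getresponse().read())
conn.close()
server.shutdown()
print(bodies)
```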

> +    running = {}
> +    multi = pycurl.CurlMulti()

 Again, _if_ we want to use this then it should be an optimisation. I've
had _so many problems_ with non-multi curl talking to multiple hosts
that I have no expectation that using multi curl will make anything
better/easier/whatever.
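 For contrast, the "non-multi" baseline being argued for amounts to one
independent transfer per worker, run in parallel, rather than pycurl's
CurlMulti multiplexing many transfers on one thread. A minimal,
hypothetical sketch of that shape, stdlib only (pycurl may not be
installed), with a throwaway local server so it is self-contained; the
grab() helper and server scaffolding are illustration, not yum code.

```python
# Sketch: parallel downloads with one independent connection per worker
# thread, i.e. the non-multi baseline. CurlMulti would instead drive all
# transfers from a single select()-style loop.
import concurrent.futures
import http.server
import threading
import urllib.request

class Handler(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        body = self.path.encode()   # echo the requested path back
        self.send_response(200)
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)
    def log_message(self, *args):
        pass                        # keep the demo quiet

server = http.server.ThreadingHTTPServer(("127.0.0.1", 0), Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()
base = "http://127.0.0.1:%d" % server.server_address[1]

def grab(path):
    # Each call opens its own connection: simple, isolated per transfer.
    with urllib.request.urlopen(base + path) as resp:
        return resp.read()

paths = ["/a", "/b", "/c"]
with concurrent.futures.ThreadPoolExecutor(max_workers=3) as pool:
    results = list(pool.map(grab, paths))
server.shutdown()
print(results)
```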


