[Yum-devel] curlmulti based parallel downloads

Zdeněk Pavlas zpavlas at redhat.com
Mon Sep 26 16:06:16 UTC 2011


First, fix some issues with MultiFileMeter.

[PATCH 1/9] Prevent float division by zero
[PATCH 2/9] MultiFileMeter: show correct finished size
[PATCH 3/9] TextMultiFileMeter: use 'text' instead of 'basename'.
[PATCH 4/9] Use re.total instead of total_size.
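
The division-by-zero fix boils down to guarding the rate/fraction math
in the meter against a zero elapsed time or an unknown total size; a
minimal sketch of the idea (illustrative names, not the actual
progress.py code):

    def _safe_rate(amount_read, elapsed):
        # Avoid float division by zero when no time has elapsed yet.
        if elapsed <= 0:
            return 0.0
        return amount_read / float(elapsed)

    def _safe_fraction(amount_read, total):
        # The meter may be asked to draw before the total size is known.
        if not total:
            return 0.0
        return min(float(amount_read) / total, 1.0)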

Then, move parts of the PyCurlFileObject code into separate
functions, so they can be shared by the new code.

[PATCH 5/9] move pycurl.error handling to _do_perform_exc()
[PATCH 6/9] move opening of target file to _do_open_fo().
[PATCH 7/9] move closing of target file to _do_close_fo().

Implement parallel downloads, and the functions parallel_begin()
and parallel_end().

[PATCH 8/9] Implement parallel urlgrab()s
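
The intended calling pattern is to bracket a batch of urlgrab() calls;
roughly like this (a sketch only: the import location of
parallel_begin()/parallel_end() and the example URLs are assumptions):

    from urlgrabber.grabber import urlgrab, parallel_begin, parallel_end

    downloads = [('http://example.com/a.rpm', '/tmp/a.rpm'),
                 ('http://example.com/b.rpm', '/tmp/b.rpm')]

    # The idea: between begin and end, the urlgrab() calls are queued
    # and driven by curl's multi interface instead of running one by one.
    parallel_begin()
    try:
        for url, path in downloads:
            urlgrab(url, filename=path)
    finally:
        parallel_end()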

Implement a useful callback for handling failed downloads.

[PATCH 9/9] Implement 'failfunc' callback.
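
Presumably the failfunc lets a failed download report back to the
caller instead of raising in the middle of the batch.  What exactly
gets passed to the callback is an assumption in this sketch; here it
is treated as an object carrying the URL and the exception:

    failed = []

    def log_failure(cb):
        # Collect failures so the caller can retry or report them once
        # the whole batch has finished.
        failed.append((cb.url, cb.exception))

    # Hypothetical usage, passed along with the other urlgrab() options:
    # urlgrab(url, filename=path, failfunc=log_failure)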

Open issues:

1) Connection limit: This should probably change from a simple 'global'
limit to a 'per-host' one.  We should also honour limits from metalink
files.  (A small bookkeeping sketch follows this list.)

2) It might be useful to advance the global mirror list selection after
every *started* download, to spread the load over more hosts
and increase the number of active connections.

3) Downloading from a separate process: Not done yet.
Nothing really needs it ATM, and there are no implementation issues
that I know of, so I'd like to get the general concept acked first.
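
For the per-host limit in 1), one way to do the bookkeeping is to
count active connections per hostname and only start a transfer while
its host is under the limit; a minimal sketch (names and the default
limit are illustrative, not part of the patch set):

    import urlparse  # urllib.parse on Python 3
    from collections import defaultdict

    MAX_PER_HOST = 3
    _active = defaultdict(int)

    def may_start(url, limit=MAX_PER_HOST):
        # True if starting this download keeps its host under the limit.
        host = urlparse.urlsplit(url).netloc
        return _active[host] < limit

    def mark_started(url):
        _active[urlparse.urlsplit(url).netloc] += 1

    def mark_finished(url):
        _active[urlparse.urlsplit(url).netloc] -= 1

A per-host limit taken from a metalink file would then just replace
MAX_PER_HOST with whatever that host advertises.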


More information about the Yum-devel mailing list