[yum-commits] Changes to 'external-multi-downloader'
zpavlas at osuosl.org
Tue Nov 22 15:28:15 UTC 2011
New branch 'external-multi-downloader' available with the following commits:
commit 44af1ab85e3da4188fbeb31667f82a68930296da
Author: Zdeněk Pavlas <zpavlas at redhat.com>
Date: Tue Nov 22 14:37:43 2011 +0100
AVOID_CURL_MULTI flag
When using the 'external' download process it's not strictly necessary
to use CurlMulti(), as we can use ordinary blocking code and just throw
more processes at it.
AVOID_CURL_MULTI = True: Each download runs in a separate process.
Processes are reused when downloading files from the same host.
AVOID_CURL_MULTI = False: Fork a single process that handles all the
downloading. Should be somewhat more efficient.
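For illustration, a rough sketch of what the AVOID_CURL_MULTI = True strategy
amounts to: plain blocking downloads, one worker process per host, with the
worker reused for later files from the same host. Names and structure below
are assumptions for the sketch, not the actual grabber.py code.

    import multiprocessing
    import urllib.request
    from urllib.parse import urlparse

    def download_blocking(queue):
        # Worker: ordinary blocking downloads, no CurlMulti() needed.
        # A None request tells the worker to exit.
        for url, dest in iter(queue.get, None):
            urllib.request.urlretrieve(url, dest)

    workers = {}  # host -> (process, queue); reused per host

    def enqueue(url, dest):
        host = urlparse(url).netloc
        if host not in workers:
            q = multiprocessing.Queue()
            p = multiprocessing.Process(target=download_blocking, args=(q,))
            p.start()
            workers[host] = (p, q)
        workers[host][1].put((url, dest))

    def shutdown():
        for p, q in workers.values():
            q.put(None)   # sentinel: stop the worker
            p.join()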
commit 3432549d4b4a72df776bae62d2c7ba05988c49d1
Author: Zdeněk Pavlas <zpavlas at redhat.com>
Date: Mon Nov 7 11:02:33 2011 +0100
Optional/throttled progress reporting in download_process
CurlFileObject updates the progress meter on every write. _ProxyProgress
pipes this to the parent, but it's often ignored there.
- make updates conditional
- throttle updates to at most one per 0.31s
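The throttling itself is a simple pattern: drop a progress update unless
enough time has passed since the last one that was forwarded. A minimal
sketch (the 0.31s interval comes from the commit message; the rest is
assumed):

    import time

    _last_update = 0.0

    def maybe_report(amount_read, interval=0.31):
        # Forward a progress update to the parent at most once per
        # `interval` seconds; intermediate updates are simply dropped.
        global _last_update
        now = time.time()
        if now - _last_update < interval:
            return
        _last_update = now
        print('progress %d' % amount_read, flush=True)  # stand-in for the pipe write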
commit d128d40a5fef43a4538b46dbed5c6146f89c782d
Author: Zdeněk Pavlas <zpavlas at redhat.com>
Date: Fri Nov 4 09:25:58 2011 +0100
Improve ctrl-c handling
We don't detach the downloader process from the TTY, so it receives
SIGINT as well, and may even exit sooner than Python raises
KeyboardInterrupt in the parent process.
- downloader: don't print ctrl-c traceback
- parent: handle EINTR and EOF as ctrl-c
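On the parent side this amounts to treating an interrupted read (EINTR) or an
unexpected EOF on the downloader pipe the same as ctrl-c. A sketch of that
pattern, mirroring 2011-era Python behaviour (not the actual grabber.py code;
Python 3.5+ retries EINTR automatically):

    import errno

    def read_reply(pipe):
        # The downloader shares our TTY, so on ctrl-c it may die before
        # Python raises KeyboardInterrupt here; map both EINTR and EOF
        # to KeyboardInterrupt so the two races look the same.
        try:
            line = pipe.readline()
        except IOError as e:
            if e.errno == errno.EINTR:
                raise KeyboardInterrupt
            raise
        if not line:          # EOF: the child already exited on SIGINT
            raise KeyboardInterrupt
        return line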
commit 9ba1ffe2612a66f27a60b55ff554bd32917d73ba
Author: Zdeněk Pavlas <zpavlas at redhat.com>
Date: Fri Oct 21 10:42:59 2011 +0200
External downloading
Add 'external = True' flag to parallel_wait()
to relay download requests to external process.
commit ad0e8d8331c7ab2fc3f06a3d83775bf52f7e5a8d
Author: Zdeněk Pavlas <zpavlas at redhat.com>
Date: Fri Nov 4 08:35:15 2011 +0100
Downloader process
When executed with a single argument 'DOWNLOADER', grabber.py
parses download requests on stdin, and reports the results to stdout.
Conflicts:
urlgrabber/grabber.py
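In outline the downloader mode is a read-download-report loop over
stdin/stdout. A hypothetical minimal version, only to show the shape of the
loop; the real request format parsed by grabber.py is not reproduced here:

    import sys
    import urllib.request

    def downloader_main():
        # One "url destination" request per line on stdin; one status
        # line per finished download on stdout.
        for line in sys.stdin:
            url, dest = line.split()
            try:
                urllib.request.urlretrieve(url, dest)
                sys.stdout.write('OK %s\n' % url)
            except Exception as e:
                sys.stdout.write('FAIL %s %s\n' % (url, e))
            sys.stdout.flush()

    if __name__ == '__main__' and sys.argv[1:] == ['DOWNLOADER']:
        downloader_main()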
commit d4fc935db758bfbd87400417af4c0693d4de0601
Author: Zdeněk Pavlas <zpavlas at redhat.com>
Date: Fri Nov 4 08:17:02 2011 +0100
_dumps + _loads: custom serializer/parser
commit 14dfbb856dd5f4002a5a77939b554a450dd7ef0d
Author: Zdeněk Pavlas <zpavlas at redhat.com>
Date: Tue Oct 25 13:54:43 2011 +0200
Reuse curl objects (per host)
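Keeping one curl handle per host lets keep-alive connections survive across
downloads from the same mirror. A sketch of the idea with pycurl; the cache
layout here is an assumption, not the branch's code:

    import pycurl
    from urllib.parse import urlparse

    _curl_cache = {}  # host -> pycurl.Curl, reused so connections stay open

    def fetch(url, dest):
        host = urlparse(url).netloc
        curl = _curl_cache.get(host)
        if curl is None:
            curl = _curl_cache[host] = pycurl.Curl()
        curl.setopt(pycurl.URL, url)
        with open(dest, 'wb') as f:
            curl.setopt(pycurl.WRITEDATA, f)
            curl.perform()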
commit ad3f3bdedb18cbef05a8212fdb12be9b9d2599c4
Author: Zdeněk Pavlas <zpavlas at redhat.com>
Date: Tue Oct 4 17:24:20 2011 +0200
Implement parallel urlgrab()s
opts.async = (key, limit):
async urlgrab() with conn limiting.
parallel_wait():
wait until all grabs have finished.
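Driving it would look roughly like this. The option is spelled 'async' as in
the commit message (a reserved word in current Python, hence the **kwargs
workaround), and the URLs are placeholders:

    from urlgrabber.grabber import URLGrabber, parallel_wait

    g = URLGrabber()
    for url in ('http://mirror.example/pkg1.rpm',
                'http://mirror.example/pkg2.rpm'):
        # (key, limit): downloads sharing 'key' are limited to 'limit'
        # concurrent connections.
        g.urlgrab(url, **{'async': ('mirror.example', 3)})

    parallel_wait()   # block until every queued grab has finished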
commit e38ec824eefdba0d39e216482e5ad886b097b47e
Author: Zdeněk Pavlas <zpavlas at redhat.com>
Date: Fri Oct 7 12:37:00 2011 +0200
Obsolete the _make_callback() method
Use _run_callback() instead.
commit 56c6ffec8db54a76d9af7389583bf8e1154c784d
Author: Zdeněk Pavlas <zpavlas at redhat.com>
Date: Tue Sep 6 14:41:27 2011 +0200
Implement 'failfunc' callback.
This callback is called when a urlgrab request fails.
If the grab is wrapped in a mirror group, only the mirror
group issues the callback.
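A hedged usage sketch; the exact callback signature and how the failure
object is passed are assumptions here, not taken from the branch:

    from urlgrabber.grabber import URLGrabber

    def on_failure(cb_obj):
        # Called once per failed urlgrab request; when the grab goes
        # through a mirror group, only the group reports the failure.
        print('download failed:', cb_obj)

    g = URLGrabber(failfunc=on_failure)
    g.urlgrab('http://mirror.example/missing.rpm', 'missing.rpm')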
commit c2aef937b331b3b92a17ef16aebabeb236e32d65
Author: Zdeněk Pavlas <zpavlas at redhat.com>
Date: Tue Oct 25 12:52:23 2011 +0200
TextMultiFileMeter: minor tweaks
Remove _do_end(): individual finished files were already handled
in end_meter, and _do_update_meter(None) fails.
Remove the _do_update_meter() call at the end of _do_end_meter():
the finished_files counter has already been bumped, so
_do_update_meter() would report the (N+1)-th download.