[yum-commits] Changes to 'external-multi-downloader'
zpavlas at osuosl.org
Mon Nov 28 15:10:51 UTC 2011
New branch 'external-multi-downloader' available with the following commits:
commit 4e3d40f4aa3e21dfe1afff526abb0ba4af4c055a
Author: Zdeněk Pavlas <zpavlas at redhat.com>
Date: Fri Nov 25 08:17:33 2011 +0100
Python2.7 bug: epoll.__del__() not closing the fd.
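A minimal sketch of the workaround this implies: close the epoll object explicitly instead of relying on the destructor (the surrounding poll loop is elided).

    import select

    poller = select.epoll()
    try:
        pass  # register downloader fds and poll() here
    finally:
        # Python 2.7's epoll.__del__() may leak the fd, so close explicitly.
        poller.close()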
commit 37566ca40cfc9fd7a17d3f0a3873fc50760f9b18
Author: Zdeněk Pavlas <zpavlas at redhat.com>
Date: Thu Nov 24 15:30:06 2011 +0100
Initial epoll() support in ExternalDownloaderPool.
Handy if there are 1000+ downloaders to handle :)
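A rough sketch of what epoll-based multiplexing over downloader pipes can look like; the 'downloaders' mapping and function names are illustrative, not the actual ExternalDownloaderPool code.

    import select

    epoll = select.epoll()
    downloaders = {}          # fd -> downloader whose stdout pipe we watch

    def register(dl):
        fd = dl.stdout.fileno()
        downloaders[fd] = dl
        epoll.register(fd, select.EPOLLIN)

    def wait_for_results(timeout=-1):
        # epoll scales to thousands of registered fds, unlike select()
        for fd, events in epoll.poll(timeout):
            yield downloaders[fd], events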
commit 1a4959931ab32bc08da9f622b613c50e337c864e
Author: Zdeněk Pavlas <zpavlas at redhat.com>
Date: Tue Nov 22 14:37:43 2011 +0100
AVOID_CURL_MULTI flag
When using the 'external' download process it's not strictly necessary
to use CurlMulti(), as we can use ordinary blocking code and just throw
more processes at it.
AVOID_CURL_MULTI = True: Each download runs in a separate process.
Processes are reused when downloading files from the same host.
AVOID_CURL_MULTI = False: Fork a single process that handles all the
downloading. Should be somewhat more efficient.
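An illustrative sketch of the dispatch decision the flag implies; the helper below is an assumption, not the actual grabber.py code.

    AVOID_CURL_MULTI = True

    def group_requests(urls):
        """Group per host when avoiding CurlMulti, else batch everything."""
        if AVOID_CURL_MULTI:
            per_host = {}
            for url in urls:
                host = url.split('/')[2]        # crude host extraction
                per_host.setdefault(host, []).append(url)
            return per_host                     # one blocking process per host
        return {None: list(urls)}               # one CurlMulti-driven process

    print(group_requests(['http://a.example/x.rpm', 'http://b.example/y.rpm']))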
commit 46faf5025719492124d1a07a46bb9fd069c6ed11
Author: Zdeněk Pavlas <zpavlas at redhat.com>
Date: Mon Nov 7 11:02:33 2011 +0100
Optional/throttled progress reporting in download_process
CurlFileObject updates the progress meter on every write. _ProxyProgress
pipes this to the parent, but it's often ignored there.
- make updates conditional
- throttle the update rate to at most one update per 0.31s
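A minimal sketch of that throttling, using the 0.31s interval from the message above; the callback wiring is an assumption.

    import time

    UPDATE_INTERVAL = 0.31
    _last_update = [0.0]

    def maybe_report(amount_read, total_size):
        # skip updates that arrive less than UPDATE_INTERVAL after the last one
        now = time.time()
        if now - _last_update[0] >= UPDATE_INTERVAL:
            _last_update[0] = now
            print('%d/%d bytes' % (amount_read, total_size))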
commit c75190fd3f37316a008d859f18d21a8f7eb75af3
Author: Zdeněk Pavlas <zpavlas at redhat.com>
Date: Fri Nov 4 09:25:58 2011 +0100
Improve ctrl-c handling
We don't detach the downloader process from the TTY, so it receives
SIGINT as well, and may even exit sooner than Python raises
KeyboardInterrupt in the parent process.
- downloader: don't print ctrl-c traceback
- parent: handle EINTR and EOF as ctrl-c
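A hedged sketch of the parent-side handling; 'pipe' is any file object reading the downloader's stdout, and the function name is illustrative.

    import errno

    def read_result(pipe):
        try:
            line = pipe.readline()
        except IOError as e:
            if e.errno == errno.EINTR:   # read interrupted by SIGINT
                raise KeyboardInterrupt
            raise
        if not line:                     # EOF: the downloader exited first
            raise KeyboardInterrupt
        return line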
commit 60ec6f8ff1f1407f4459944ea046693a5b0e4403
Author: Zdeněk Pavlas <zpavlas at redhat.com>
Date: Fri Oct 21 10:42:59 2011 +0200
External downloading
Add an 'external = True' flag to parallel_wait()
to relay download requests to an external process.
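A hypothetical sketch of spawning the external process that requests are relayed to; the 'DOWNLOADER' argument follows the 'Downloader process' commit below, but the spawn code itself is an assumption.

    import subprocess, sys

    downloader = subprocess.Popen(
        [sys.executable, 'grabber.py', 'DOWNLOADER'],
        stdin=subprocess.PIPE, stdout=subprocess.PIPE)
    # requests are written to downloader.stdin, results read from its stdout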
commit 4223c6ec77b16144e79329ae4bf3d3f462016e37
Author: Zdeněk Pavlas <zpavlas at redhat.com>
Date: Fri Nov 4 08:35:15 2011 +0100
Downloader process
When executed with a single argument 'DOWNLOADER', grabber.py
parses download requests on stdin, and reports the results to stdout.
Conflicts:
urlgrabber/grabber.py
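A minimal sketch of the request/response loop described above; the one-request-per-line format shown here is an assumption, not the actual protocol.

    import sys

    def downloader_loop():
        for line in sys.stdin:
            request = line.rstrip('\n')
            if not request:
                continue
            # ... perform the download described by 'request' here ...
            sys.stdout.write('OK %s\n' % request)
            sys.stdout.flush()   # let the parent read results as they complete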
commit db657a6379d1d4f9b96062965e20a53bc23de313
Author: Zdeněk Pavlas <zpavlas at redhat.com>
Date: Fri Nov 4 08:17:02 2011 +0100
_dumps + _loads: custom serializer/parser
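A hedged sketch of a matching serializer/parser pair; the real _dumps/_loads wire format in grabber.py may well differ.

    def _dumps(opts):
        # keep each request on a single line of the pipe
        return ' '.join('%s=%s' % (k, str(v).replace('\n', '\\n'))
                        for k, v in sorted(opts.items()))

    def _loads(line):
        opts = {}
        for item in line.split(' '):
            key, _, value = item.partition('=')
            opts[key] = value.replace('\\n', '\n')
        return opts

    assert _loads(_dumps({'url': 'http://example.com/x.rpm'}))['url'] == \
        'http://example.com/x.rpm'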
commit 06624ba76f58b132ab6449000fcfd5165b1d45cc
Author: Zdeněk Pavlas <zpavlas at redhat.com>
Date: Tue Oct 25 13:54:43 2011 +0200
Reuse curl objects (per host)
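A hedged sketch of a per-host cache of reusable curl handles; the cache name is an assumption, not the actual grabber.py code.

    import pycurl

    _curl_cache = {}

    def curl_for_host(host):
        # reusing the same handle keeps the connection to that host alive
        if host not in _curl_cache:
            _curl_cache[host] = pycurl.Curl()
        return _curl_cache[host]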
commit 811ab7ccac581538e294f8c0f2c1790e2f0fef5c
Author: Zdeněk Pavlas <zpavlas at redhat.com>
Date: Tue Oct 4 17:24:20 2011 +0200
Implement parallel urlgrab()s
opts.async = (key, limit):
async urlgrab() with connection limiting.
parallel_wait():
wait until all grabs have finished.
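A rough Python 2-era usage sketch based on the commit message; the exact option spelling may differ in the final API.

    from urlgrabber.grabber import URLGrabber, parallel_wait

    g = URLGrabber()
    for url in ('http://example.com/a.rpm', 'http://example.com/b.rpm'):
        # async=(key, limit): at most 5 connections for the 'example.com' key
        g.urlgrab(url, async=('example.com', 5))
    parallel_wait()   # block until all queued grabs have finished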
commit b01de1eabef26c92aa499a5498f703d764c67422
Author: Zdeněk Pavlas <zpavlas at redhat.com>
Date: Fri Oct 7 12:37:00 2011 +0200
Obsolete the _make_callback() method
Use _run_callback() instead.
commit ae3265686ea13aac5b0e93af75ba806dd146da34
Author: Zdeněk Pavlas <zpavlas at redhat.com>
Date: Tue Sep 6 14:41:27 2011 +0200
Implement 'failfunc' callback.
This callback is called when a urlgrab request fails.
If the grab is wrapped in a mirror group, only the mirror
group issues the callback.
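A hedged sketch of such a callback; passing it as the 'failfunc' option follows the message above, while the callback-object attributes are assumptions.

    def failfunc(cb_obj):
        # invoked when a urlgrab request fails; with a mirror group,
        # only the group issues this callback
        print('download failed: %r' % (cb_obj,))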
commit c5026d7eb19bd6f1631c45e43058338729f01b19
Author: Zdeněk Pavlas <zpavlas at redhat.com>
Date: Tue Oct 25 12:52:23 2011 +0200
TextMultiFileMeter: minor tweaks
Remove _do_end(), because individual finished files were already
handled in end_meter, and _do_update_meter(None) fails.
Remove _do_update_meter() at the end of _do_end_meter():
we have already bumped the finished_files counter, and
_do_update_meter() would report an (N+1)-th download.