[Yum-devel] [UG] openers and keepalive
Michael Stenner
mstenner at linux.duke.edu
Wed Mar 31 05:40:19 UTC 2004
On Tue, Mar 30, 2004 at 10:23:23PM -0500, Ryan Tomayko wrote:
> On Mon, 2004-03-29 at 13:13, Michael Stenner wrote:
> > 2) opener-caching broken
> >
> > The opener-caching behavior seems to not quite be working right
> > yet, so I disabled it for now.
> <snip>
> > This only showed up in the test suites where LOTS
> > of stuff was happening. All my simple tests worked.
>
> I cannot seem to recreate this (tried under python 2.2 and python 2.3).
> Do you remember the test cases that failed? I'm building the opener with
> the CacheOpenerDirectory and with urllib2 then dumping the handler list
> and the handle_error dict of both but I'm not seeing any real
> difference. I'm running the entire suite though so I might be missing
> something.
>
> Also, were any tests failing? I'm getting all ok's using the cache but I
> get the feeling that tests weren't actually failing for you but that the
> caching was causing problems with other expected things to happen.
I ran the whole suite as well using python 2.2 on an FC1 box. Here's
the only change from HEAD (including your opener option patch):
self._opener = CachedOpenerDirector(*handlers)
#self._opener = urllib2.build_opener(*handlers)
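(For anyone following along: the idea behind a CachedOpenerDirector is to reuse one OpenerDirector per handler set instead of building a fresh one for every request. This is only a sketch of that idea in modern urllib.request terms -- the original code used urllib2, and urlgrabber's actual implementation may key and build the cache differently.)

```python
import urllib.request

# Cache of OpenerDirector instances, keyed by the classes of the
# handlers used to build them, so requests with the same handler set
# share one opener (and any keepalive connections its handlers hold).
_opener_cache = {}

def CachedOpenerDirector(*handlers):
    key = tuple(sorted(h.__class__.__name__ for h in handlers))
    opener = _opener_cache.get(key)
    if opener is None:
        opener = urllib.request.build_opener(*handlers)
        _opener_cache[key] = opener
    return opener
```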
I'm attaching the test output. I also just tried to run this at home
and do NOT see the errors. The only difference I can think of is that
at work, where I saw the failures, I'm running an ftp server, so the
ftp tests run as well. Maybe there's some conflict there. You might
want to run a local server too; I'm running vsftpd. I'm also
attaching my notes on how I set it up. Following them, it should take
you about 3 minutes to get it up and running.
-Michael
--
Michael D. Stenner mstenner at ece.arizona.edu
ECE Department, the University of Arizona 520-626-1619
1230 E. Speedway Blvd., Tucson, AZ 85721-0104 ECE 524G
-------------- next part --------------
export PYTHONPATH=.; python test/runtests.py
urlgrabber tests
grabber.py tests
FTPRegetTests
exception raised for illegal reget mode ... ok
simple (forced) reget ... ok
NO reget when server version is newer than local ... skip
reget when server version is older than local ... skip
FileObjectTests
URLGrabberFileObject .read() method ... ok
URLGrabberFileObject .readline() method ... ok
URLGrabberFileObject .readlines() method ... ok
URLGrabberFileObject .read(N) with small N ... ok
FileRegetTests
exception raised for illegal reget mode ... ok
simple (forced) reget ... ok
NO reget when server version is newer than local ... ok
reget when server version is older than local ... ok
HTTPRegetTests
exception raised for illegal reget mode ... ok
simple (forced) reget ... ERROR
NO reget when server version is newer than local ... ERROR
reget when server version is older than local ... ERROR
HTTPTests
download reference file via HTTP ... ok
Test module level functions defined in grabber.py
module-level urlgrab() function ... ok
module-level urlopen() function ... ok
module-level urlread() function ... ok
Test grabber.URLGrabber class
grabber.URLGrabber.__init__() **kwargs handling. ... ok
grabber.URLGrabber._parse_url() ... ok
grabber.URLGrabber._parse_url('/local/file/path') ... ok
grabber.URLGrabber._parse_url() with .prefix ... ok
byterange.py tests
Test module level functions defined in range.py
byterange.range_header_to_tuple() ... ok
byterange.range_tuple_normalize() ... ok
byterange.range_tuple_to_header() ... ok
Test range.RangeableFileObject class
RangeableFileObject.seek() poor mans version.. ... ok
RangeableFileObject.read() ... ok
RangeableFileObject.read(): to end of file. ... ok
RangeableFileObject.readline() ... ok
RangeableFileObject.seek() ... ok
RangeableFileObject.tell() ... ok
mirror.py tests
ActionTests
test the effects of a callback-returned action ... ok
test default action policy ... ok
test the effects of passed-in default_action ... ok
test the effects of method-level default_action ... ok
BadMirrorTests
test that a bad mirror raises URLGrabError ... ok
BasicTests
MirrorGroup.urlgrab ... ok
MirrorGroup.urlopen ... ok
MirrorGroup.urlread ... ok
CallbackTests
test that the callback can correctly re-raise the exception ... ok
test that MG executes the failure callback correctly ... ok
FailoverTests
test that a the MG fails over past a bad mirror ... ok
SubclassTests
MGRandomOrder.urlgrab ... ok
MGRandomStart.urlgrab ... ok
keepalive.py tests
CorruptionTests
download a file with mixed readline() and read(23) calls ... ok
download a file with a single call to read() ... ok
download a file with multiple calls to readline() ... ok
download a file with a single call to readlines() ... ok
download a file with multiple calls to read(23) ... ok
DroppedConnectionTests
testing connection restarting (20-second delay, ctrl-c to skip) ... ok
HTTPErrorTests
test that 200 works without fancy handler ... ok
test that 200 works with fancy handler ... ok
test that 403 works without fancy handler ... ok
test that 403 works with fancy handler ... ok
test that 404 works without fancy handler ... ok
test that 404 works with fancy handler ... ok
ThreadingTests
use 3 threads, each getting a file 4 times ... ok
===============================================================================
ERROR: simple (forced) reget
-------------------------------------------------------------------------------
Traceback (most recent call last):
File "test/test_grabber.py", line 235, in test_basic_reget
self.grabber.urlgrab(self.url, self.filename, reget='simple')
File "test/../urlgrabber/grabber.py", line 536, in urlgrab
return self._retry(opts, retryfunc, url, filename)
File "test/../urlgrabber/grabber.py", line 478, in _retry
return apply(func, (opts,) + args, {})
File "test/../urlgrabber/grabber.py", line 526, in retryfunc
fo = URLGrabberFileObject(url, filename, opts)
File "test/../urlgrabber/grabber.py", line 640, in __init__
self._do_open()
File "test/../urlgrabber/grabber.py", line 685, in _do_open
fo, hdr = self._make_request(req, opener)
File "test/../urlgrabber/grabber.py", line 759, in _make_request
raise URLGrabError(4, _('IOError: %s') % (e, ))
URLGrabError: [Errno 4] IOError: HTTP Error 206: Partial Content
===============================================================================
ERROR: NO reget when server version is newer than local
-------------------------------------------------------------------------------
Traceback (most recent call last):
File "test/test_grabber.py", line 266, in test_newer_check_timestamp
self.grabber.urlgrab(self.url, self.filename, reget='check_timestamp')
File "test/../urlgrabber/grabber.py", line 536, in urlgrab
return self._retry(opts, retryfunc, url, filename)
File "test/../urlgrabber/grabber.py", line 478, in _retry
return apply(func, (opts,) + args, {})
File "test/../urlgrabber/grabber.py", line 526, in retryfunc
fo = URLGrabberFileObject(url, filename, opts)
File "test/../urlgrabber/grabber.py", line 640, in __init__
self._do_open()
File "test/../urlgrabber/grabber.py", line 685, in _do_open
fo, hdr = self._make_request(req, opener)
File "test/../urlgrabber/grabber.py", line 759, in _make_request
raise URLGrabError(4, _('IOError: %s') % (e, ))
URLGrabError: [Errno 4] IOError: HTTP Error 206: Partial Content
===============================================================================
ERROR: reget when server version is older than local
-------------------------------------------------------------------------------
Traceback (most recent call last):
File "test/test_grabber.py", line 253, in test_older_check_timestamp
self.grabber.urlgrab(self.url, self.filename, reget='check_timestamp')
File "test/../urlgrabber/grabber.py", line 536, in urlgrab
return self._retry(opts, retryfunc, url, filename)
File "test/../urlgrabber/grabber.py", line 478, in _retry
return apply(func, (opts,) + args, {})
File "test/../urlgrabber/grabber.py", line 526, in retryfunc
fo = URLGrabberFileObject(url, filename, opts)
File "test/../urlgrabber/grabber.py", line 640, in __init__
self._do_open()
File "test/../urlgrabber/grabber.py", line 685, in _do_open
fo, hdr = self._make_request(req, opener)
File "test/../urlgrabber/grabber.py", line 759, in _make_request
raise URLGrabError(4, _('IOError: %s') % (e, ))
URLGrabError: [Errno 4] IOError: HTTP Error 206: Partial Content
-------------------------------------------------------------------------------
Ran 59 tests in 33.243s
FAILED (errors=3, skipped=2)
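(A note on the "HTTP Error 206" tracebacks above: in the urllib2 of the Python 2.2/2.3 era, only a 200 response was treated as success; anything else, including the 206 Partial Content that a Range request legitimately returns, was routed through the opener's error handlers. So an opener that is missing a handler with an http_error_206 method will surface every reget as HTTPError 206 -- consistent with the cached opener dropping a handler somewhere. The sketch below shows the shape of such a handler in modern urllib.request terms; it is an illustration of the mechanism, not urlgrabber's actual byterange.py code.)

```python
import urllib.request

class HTTPRangeHandler(urllib.request.BaseHandler):
    """Accept 206 Partial Content as success so Range requests work.

    Sketch only; urlgrabber's real range handler may differ in detail.
    """
    def http_error_206(self, req, fp, code, msg, hdrs):
        # 206 is the expected response to a byte-range request:
        # hand the file object back instead of raising HTTPError.
        return fp

    def http_error_416(self, req, fp, code, msg, hdrs):
        # Requested range not satisfiable: this one really is an error.
        raise urllib.request.HTTPError(req.full_url, code, msg, hdrs, fp)
```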
-------------- next part --------------
/usr/sbin/adduser -d /var/ftp/ -M -s /sbin/nologin ftptest
/var/ftp/test/ contains the standard "reference" and "short_reference"
files.
########### /etc/vsftpd.user_list
# vsftpd userlist
# If userlist_deny=NO, only allow users in this file
# If userlist_deny=YES (default), never allow users in this file, and
# do not even prompt for a password.
# Note that the default vsftpd pam config also checks /etc/vsftpd.ftpusers
# for users that are denied.
ftptest
anonymous
######### /etc/vsftpd/vsftpd.conf
anonymous_enable=YES
local_enable=YES
#xferlog_enable=YES
# Make sure PORT transfer connections originate from port 20 (ftp-data).
connect_from_port_20=YES
# You may change the default value for timing out an idle session.
#idle_session_timeout=600
# You may change the default value for timing out a data connection.
#data_connection_timeout=120
nopriv_user=ftptest
pam_service_name=vsftpd
userlist_enable=YES
#enable for standalone mode
listen=YES
tcp_wrappers=YES
userlist_deny=NO
userlist_file=/etc/vsftpd.user_list
## the userlist_file contains ONLY: ftptest, anonymous
guest_enable=YES
guest_username=ftptest
ftp_username=ftptest
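(To sanity-check the server before pointing the test suite at it, something like this should list the test files. The host and directory here are assumptions based on the notes above -- adjust to your setup.)

```python
from ftplib import FTP

def check_ftp(host="localhost"):
    # Log in anonymously; per guest_enable/guest_username above,
    # vsftpd maps the session to the ftptest account.
    ftp = FTP(host)
    ftp.login()
    # /var/ftp/test should contain 'reference' and 'short_reference'.
    names = ftp.nlst("test")
    ftp.quit()
    return names
```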