[Yum-devel] [urlgrabber] keepalive broken
Florian La Roche
laroche at redhat.com
Thu Dec 14 13:49:29 UTC 2006
On Wed, Dec 13, 2006 at 01:14:39PM -0700, Michael Stenner wrote:
> On Tue, Dec 12, 2006 at 10:17:01PM +0100, Florian La Roche wrote:
> > On Thu, Dec 07, 2006 at 05:20:44PM -0700, Michael Stenner wrote:
> > > woo-hoo, I finally have an excuse to do "make daily"!
> > >
> > > http://bird.ece.arizona.edu/~mstenner/urlgrabber-20061207.tar.gz
> >
> > the following script will also leak file descriptors, so it will
> > also fail once the usual limit of 1024 fds is reached, and not
> > just have keepalive not working:
>
> Florian, did you use the recent urlgrabber version from above? I
> haven't been able to recreate the problem you describe in HEAD, which
> includes a fix for the caching problem you described last week. Is
> there some other problem that leads to file descriptor leaks or is
> that it?
We are currently at the 3.1.0 release plus the keepalive patch from
Dec. 5th. That level is leaking file descriptors.
I've tried again with the above 20061207 version, and that one more or
less ignores keepalive and creates new connections, but it also does not
leak fds. So it opens a new connection for each request, but at least
nothing breaks. One way to still see leaking fds with this version is to
reference a non-existent file. The script below then still stops after
running past the 1024 fd limit:
#!/usr/bin/python
import urlgrabber

for i in xrange(2000):
    fd = urlgrabber.urlopen("http://127.0.0.1/mirror/K-does-not-exist",
                            timeout=10.0, retry=3, keepalive=1)
    fd.read(10)
    fd.close()
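
For what it's worth, a quick way to watch the leak as it happens is to
count the entries under /proc/self/fd while the loop runs. The snippet
below is only a sketch of that idea (Linux-specific, reusing the URL and
arguments from the script above; the try/except is just mine, so the
count gets printed whether or not the missing file raises an error):

#!/usr/bin/python
# Sketch: watch the descriptor count while hammering the same URL as above.
# Linux-specific: /proc/self/fd has one entry per open descriptor.
import os
import urlgrabber
from urlgrabber.grabber import URLGrabError

def open_fds():
    # number of descriptors currently open in this process
    return len(os.listdir("/proc/self/fd"))

for i in xrange(2000):
    try:
        fd = urlgrabber.urlopen("http://127.0.0.1/mirror/K-does-not-exist",
                                timeout=10.0, retry=3, keepalive=1)
        fd.read(10)
        fd.close()
    except URLGrabError:
        # the missing file may surface as an error; we only care about fds
        pass
    if i % 100 == 0:
        print i, open_fds()

If keepalive and cleanup work, that count should stay roughly flat after
the first few requests; a leak shows up as a number that keeps climbing
until the 1024 limit is hit.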
regards,
Florian La Roche