[yum-commits] 2 commits - urlgrabber/grabber.py
skvidal at osuosl.org
Fri Feb 19 20:05:13 UTC 2010
urlgrabber/grabber.py | 10 +++++++++-
1 file changed, 9 insertions(+), 1 deletion(-)
New commits:
commit 1103dbe5fef67a1890af46a7d69f6c6cdcefd38f
Merge: d27763d... c20d160...
Author: Seth Vidal <skvidal at fedoraproject.org>
Date: Fri Feb 19 15:04:45 2010 -0500
Merge branch 'master' of ssh://yum.baseurl.org/srv/projects/yum/git/urlgrabber
* 'master' of ssh://yum.baseurl.org/srv/projects/yum/git/urlgrabber:
Change default timeout to 300 (seconds), update documentation.
Move .cvsignore to .gitignore, add *.pyo.
commit d27763d1cb035f457b066195bdc86db4055ecef7
Author: Seth Vidal <skvidal at fedoraproject.org>
Date: Fri Feb 19 15:03:29 2010 -0500
Add reset_curl_obj() to the top layer of urlgrabber.grabber to close and reopen the cached curl object
in the event the network info changes significantly.
Marginally improve the error output for a bogus cert.
diff --git a/urlgrabber/grabber.py b/urlgrabber/grabber.py
index 0023fed..88754d3 100644
--- a/urlgrabber/grabber.py
+++ b/urlgrabber/grabber.py
@@ -1299,7 +1299,7 @@ class PyCurlFileObject():
raise err
elif errcode == 60:
- msg = _("client cert cannot be verified or client cert incorrect")
+ msg = _("Peer cert cannot be verified or peer cert invalid")
err = URLGrabError(14, msg)
err.url = self.url
raise err
@@ -1640,6 +1640,14 @@ class PyCurlFileObject():
_curl_cache = pycurl.Curl() # make one and reuse it over and over and over
+def reset_curl_obj():
+ """To make sure curl has reread the network/dns info we force a reload"""
+ global _curl_cache
+ _curl_cache.close()
+ _curl_cache = pycurl.Curl()
+
+
+
#####################################################################
# DEPRECATED FUNCTIONS