[Yum] Maintaining my own copy of UPDATES

Les Mikesell lesmikesell at gmail.com
Thu Mar 26 19:13:35 UTC 2009


Seth Vidal wrote:
> 
>> With equal sarcasm, how would you feel if someone told you that you 
>> had to edit some config file or make a special setup for every 
>> different http page you want to visit just to get behavior that isn't 
>> antisocial?
> 
> Antisocial? Seriously? Antisocial is the pejorative you want there?

What do you call using 10x the bandwidth and mirror resources necessary 
to do updates (or however many mirrors are in the list - you'll end 
up cluttering your cache with a copy from each of them)?  And wasting 
the administrator's time.

> If I need to configure a special case for myself, it seems reasonable to 
> me that I should have to configure something.

I don't want a special case.  I want repeatable, standard behavior that 
works with standard infrastructure.  This should work for everyone who 
is behind a caching proxy (which is probably most people with multiple 
machines anyway).
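(For what it's worth, yum can already be pointed at a proxy today - the 
problem is the rotating URLs, not the proxy itself.  A minimal 
/etc/yum.conf fragment, with a made-up hostname and port:

```ini
[main]
# Route all repository traffic through the site's caching proxy.
# The host and port below are placeholders, not real values.
proxy=http://squid.example.com:3128
```

Every machine on the LAN can share that one setting, but it only pays 
off if they all ask for the same URLs.)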

>> If you would permit caching to work the way it is intended, distros 
>> probably wouldn't need all those mirrors anyway and other people 
>> wouldn't have had to invent a dozen different ways to work around what 
>> yum does when updating multiple machines.
> 
> "Caching to work the way it is intended" - I don't even remotely know 
> what you mean. More to the point, I bet if I asked 100 people what that 
> means I'd get roughly 400 different answers.

I mean, if you ask for a file or http page that your cache has already 
retrieved, you get the locally cached copy.  This isn't a new concept. 
Every distribution that includes yum almost certainly includes squid and 
perhaps other components with this functionality - and they provide it 
for everything going through them.  It doesn't matter what your 100 people 
think.  What matters is that there are standards defining this 
behavior, and just about everything http-aware follows them.

However, instead of providing repeatable behavior, yum deliberately goes 
out of its way to request a copy of that same file from a different 
location each time to force the cache to bother some other mirror and 
waste internet bandwidth on both ends - and your time if you are running 
interactively - to get another identical copy.  As far as I can tell, it 
tries to make it as unlikely as possible that you will re-use the file 
your local proxy cache already has.
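The workaround people end up using is to pin each repo to a single 
mirror so the URLs stay stable.  A sketch of a repo file - the mirror 
hostname here is an example, substitute one near you:

```ini
# /etc/yum.repos.d/updates.repo - comment out the rotating mirror list
# and point at one mirror, so every machine requests the same URL and
# the proxy can serve its cached copy.
[updates]
name=Updates
#mirrorlist=http://mirrorlist.example.org/?repo=updates
baseurl=http://mirror.example.org/linux/updates/
enabled=1
```

Which is exactly the kind of per-repo special-case editing I shouldn't 
have to do to get standard caching behavior.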

-- 
   Les Mikesell
     lesmikesell at gmail.com




