A fine thought. You could add a specific refresh_pattern for those URLs to
try to hold them longer, and use wget or the client program that ships with
Squid in a simple script to load them up beforehand. A rough sketch follows.
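A minimal sketch, assuming Squid listens on localhost port 3128; the regex,
the refresh times, and the urls.txt file are placeholders to adapt to your
setup:

    # squid.conf: keep matching objects "fresh" longer
    # refresh_pattern <regex> <min (minutes)> <percent> <max (minutes)>
    refresh_pattern ^http://www\.example\.ac\.uk/teaching/ 10080 100% 43200

    #!/bin/sh
    # prefetch.sh: warm the cache by fetching each URL through the proxy
    http_proxy=http://localhost:3128/
    export http_proxy
    while read url
    do
        wget -q -O /dev/null "$url"
        # or, using the client program bundled with Squid:
        # client -h localhost -p 3128 "$url" > /dev/null
    done < urls.txt

Run it from cron the night before, with one URL per line in urls.txt. Note
this won't truly lock the objects in: refresh_pattern only controls
freshness, and Squid can still evict objects if the cache runs short of
disk space.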
D
Jeremy M. Harmer wrote:
> Hi,
>
> I guess this is an FAQ - and I guess I should read the documentation ;-) but:
>
> Is it possible to tell Squid to go and grab specified pages ready for later
> use? We're looking for a solution where we can make sure files are cached
> ready for a teaching class to use the following day (or week or whatever).
> Can I then somehow lock these files in the cache so they won't be deleted
> if it runs out of disk space?
>
> Thanks,
>
> Jeremy M. Harmer
> University of Leeds
--
Note to evil sorcerers and mad scientists: don't ever, ever summon
powerful demons or rip holes in the fabric of space and time. It's never
a good idea.
ICQ UIN: 3225440