We've come up with a scheme for improving cache delivery of very popular
pages, and I would like to get feedback, tips, and such. I'll try to
explain it succinctly:
1. You have a squid running on machine X which is entirely devoted to a
popular site, say netscape.com . Each night this machine runs a prefetch
program through squid to pull fresh copies of pages into its cache, and
it is configured exclusively and optimally for this purpose. This machine
is basically mirroring netscape.com.
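The prefetch step above amounts to a small crawler that walks one domain
while routing its requests through the mirror squid, so every page it
touches lands in the cache. A minimal sketch in Python (the `fetch`
callable is injected; in a real job it would issue the request via the
squid proxy on port 3128 — all names here are illustrative):

```python
import html.parser
from collections import deque
from urllib.parse import urljoin, urlparse

class LinkExtractor(html.parser.HTMLParser):
    """Collects href targets from <a> tags on a fetched page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(start_url, domain, fetch, max_pages=100):
    """Breadth-first crawl restricted to `domain`.

    `fetch(url)` returns a page body as a string; pointing it at the
    mirror squid makes the crawl double as a cache-warming pass.
    """
    seen = set()
    queue = deque([start_url])
    while queue and len(seen) < max_pages:
        url = queue.popleft()
        host = urlparse(url).hostname
        # Skip already-fetched pages and anything outside the domain.
        if url in seen or host is None or not host.endswith(domain):
            continue
        seen.add(url)
        parser = LinkExtractor()
        parser.feed(fetch(url))
        for link in parser.links:
            queue.append(urljoin(url, link))
    return seen
```

Injecting `fetch` keeps the crawl logic testable offline and leaves the
choice of HTTP client (and proxy configuration) to the operator.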
2. Other caches insert lines like the following into squid.conf in
order to take advantage of the "mirror squid":
cache_host mirrorsquid.X.utk.edu sibling 3128 3130
cache_host_domain mirrorsquid.X.utk.edu .netscape.com
The "mirror squid" is thus only queried regarding objects in the domain
to which it is dedicated.
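For clarity, the two port numbers in the cache_host line are the mirror's
HTTP port and ICP port respectively; annotated, the fragment above reads
(host names illustrative):

```
# Query the dedicated mirror as a sibling: HTTP on 3128, ICP on 3130.
cache_host        mirrorsquid.X.utk.edu  sibling  3128  3130
# ...but only ask it about objects in the one domain it mirrors.
cache_host_domain mirrorsquid.X.utk.edu  .netscape.com
```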
3. The "mirror squid" attaches an accurate Expires header to the pages
it serves so that they can be reliably cached locally, thus minimizing
traffic to the "mirror". All objects would expire at the time when the
prefetch program runs.
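The expiry rule in step 3 is simple to pin down: every object expires at
the moment of the next nightly prefetch run, so downstream caches can
hold it with confidence until then. A sketch of that computation (the
02:00 run time is an assumption):

```python
from datetime import datetime, timedelta

# Assumed nightly prefetch hour; pick whatever cron slot the mirror uses.
PREFETCH_HOUR = 2

def next_expiry(now):
    """Return the datetime of the next prefetch run after `now`.

    This is the value the mirror would place in the Expires header of
    every object it serves.
    """
    run = now.replace(hour=PREFETCH_HOUR, minute=0,
                      second=0, microsecond=0)
    if run <= now:
        # Today's run already happened; expire at tomorrow's run.
        run += timedelta(days=1)
    return run
```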
Essentially, this is an exclusive mirror for the squid network.
Does this sound feasible? Can anyone recommend a prefetch program,
preferably non-commercial, that can be programmed/configured to crawl
through a particular domain?
Perhaps the most exciting thing about this concept is that it is
exclusive to the cache network, so it would serve as an incentive for
more people to implement caching. Your thoughts?
Richard Hall
Network Services
University of Tennessee
Received on Tue Jan 14 1997 - 14:27:43 MST