I've got a bunch of machines at various locations on various ISPs for
which I need to filter web sites. I've set up Squid w/ authentication
as a proxy for these machines and put ACLs in place that allow me to
whitelist certain sites.
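For context, the parent side boils down to something like this (I'm assuming basic auth via ncsa_auth just for the example; the helper path, password file, and whitelist file are placeholders for whatever is actually in use):

    # parent squid.conf (sketch)
    auth_param basic program /usr/lib/squid/ncsa_auth /etc/squid/passwd
    auth_param basic realm Filtered proxy

    # only authenticated users, and only to whitelisted destinations
    acl authed proxy_auth REQUIRED
    acl whitelist dstdomain "/etc/squid/whitelist.txt"
    http_access allow authed whitelist
    http_access deny all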
The problem I am having is that the users of these machines are about
two steps ahead of sea turtles on the intelligence scale and get
confused when prompted for a user/pass on opening Firefox. They
*do* have a user/pass for their email, so you can see how they would
get confused having to remember *TWO* _different_ passwords. *roll
eyes*
At any rate, I started investigating the transparent proxy option for
two main reasons: 1) they don't have to enter a user/pass, and 2) they
can't get around the proxy settings by changing browsers, etc. I
found the FAQ that says you can't use transparent proxying w/
proxy_auth, so I started reading over the lists. These machines are at
various locations on various ISPs, so I cannot use any special
messages in DHCP or anything like that as has been suggested
previously. I eventually came across the login parameter to the
cache_peer config option and that got me thinking...
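If I'm reading the docs right, the child-side lines would be something like this (hostname, port, and credentials are made up):

    # forward everything to the authenticating parent
    cache_peer parent.example.com parent 3128 0 no-query default login=filteruser:filterpass
    never_direct allow all

The never_direct line matters: it forces every request through the parent, so the child can't quietly go direct and skip the filtering.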
Could I set up Squid on each of the remote boxes as a transparent
cache, but configure it so that all requests are forwarded to the
parent cache that uses authentication? The parent cache would
then filter content based on the whitelist already created, or
whatever filters we decide to put in place in the future. This would
solve both problems b/c users wouldn't have to enter a u/p (the Squid
config would handle that), and they can't get around the proxy b/c it's
transparent on each machine (rough config sketch below). Has anyone
done this? Any thoughts on this setup? Security concerns?
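For concreteness, here's the rough shape of what I mean on each remote box (Squid 2.5-style interception syntax; hostname and credentials are made up, and the iptables rule assumes Squid runs as the 'squid' user):

    # local squid.conf (sketch)
    http_port 127.0.0.1:3128

    # 2.5-style transparent interception
    httpd_accel_host virtual
    httpd_accel_port 80
    httpd_accel_with_proxy on
    httpd_accel_uses_host_header on

    # hand everything to the authenticating parent; never go direct
    cache_peer parent.example.com parent 3128 0 no-query default login=filteruser:filterpass
    never_direct allow all

    # only the box itself should be able to talk to this squid
    acl local_src src 127.0.0.1/255.255.255.255
    http_access allow local_src
    http_access deny all

    # redirect the box's own outbound port-80 traffic into the local squid,
    # skipping squid's own user so it doesn't loop on itself
    iptables -t nat -A OUTPUT -p tcp --dport 80 -m owner ! --uid-owner squid -j REDIRECT --to-ports 3128

One caveat I can already see: the parent credentials sit in plain text in squid.conf on every remote box, so anyone with root on one of those machines can read them.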
TIA