On Tuesday 06 December 2005 16:24, Pedro Bastos wrote:
> I am in charge of Internet access in a university lab. Some
> addresses must be forbidden (mostly pornographic), so I manually
> look for suspicious address names and put them in my deny-sites
> regexp. The issue is that the brilliant students find the IP
> address of most websites and start using them by IP. This is
> very common with anonymous proxies and instant-messaging sites
> as well.
I know I am repeating myself, but a moderately intelligent human being
in front of a computer (the younger, the more dangerous) will have no
trouble getting around blacklisted regexps in Squid. Think about:
- accessing through IP (as you describe)
- images.google.com
- web based proxies on the internet
So if you are serious about content filtering, then:
(1) Throw money at a commercial URL filter.
(2) Use whitelisting (block everything except a few trusted URLs).
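For example, option (2) could look something like this in squid.conf. This is only a minimal sketch; the ACL name and the whitelist file path are illustrative, not something from your setup:

```
# Allow only domains listed in the whitelist file; deny everything else.
# /etc/squid/whitelist.txt would contain one domain per line,
# e.g. ".example.edu" (the leading dot matches subdomains too).
acl allowed_sites dstdomain "/etc/squid/whitelist.txt"
http_access allow allowed_sites
http_access deny all
```

Because the deny rule is last and matches everything, access by raw IP address fails along with everything else not explicitly listed.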
Everything else is IMHO a waste of resources.
Christoph
Received on Tue Dec 06 2005 - 09:09:50 MST
This archive was generated by hypermail pre-2.1.9 : Sat Dec 31 2005 - 12:00:02 MST