Squid 2.6.STABLE14
I've got a small dilemma. I've written an external Perl helper that returns
"OK" or "ERR" depending on regular expressions and/or domains stored in an
external PostgreSQL database.
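
For context, here is the general shape of such a helper (a minimal sketch,
not my production code; the blocked_patterns table, its pattern column, and
the connection details are placeholders):

  #!/usr/bin/perl
  use strict;
  use warnings;
  use DBI;

  $| = 1;    # unbuffered stdout, so Squid sees each reply immediately

  # Placeholder connection details -- adjust for your site.
  my $dbh = DBI->connect('dbi:Pg:dbname=squidacl;host=localhost',
                         'squid', 'secret',
                         { RaiseError => 1, AutoCommit => 1 });

  # One prepared statement, reused for every lookup, so the query is
  # only planned once per helper process. '~' is PostgreSQL's regex
  # match operator, covering the regular-expression patterns.
  my $sth = $dbh->prepare(
      'SELECT 1 FROM blocked_patterns WHERE ? ~ pattern LIMIT 1');

  while (my $line = <STDIN>) {
      chomp $line;
      my ($url) = split ' ', $line;    # Squid sends the %URI token
      $sth->execute($url);
      print $sth->fetchrow_array ? "OK\n" : "ERR\n";
  }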
What I've noticed is that for each URL/FQDN requested, Squid passes every
URL embedded in the web page, one by one, to the external helpers that are
running (until I add the asynchronous piece that allows a single helper
instance to run, which then forks child processes as needed). My external
helper makes a call to a PostgreSQL database and checks each domain/URL
against a single table of 50K entries, and this KILLS performance and the
end-user experience.
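
For reference, this is the sort of squid.conf wiring involved. The ttl= and
negative_ttl= options make Squid cache each helper verdict, which looks like
one obvious lever here; the helper path and values below are illustrative:

  external_acl_type urlcheck ttl=300 negative_ttl=300 children=10 %URI /usr/local/squid/libexec/pg_check.pl
  acl blocked external urlcheck
  http_access deny blocked

If the decision is really per-domain rather than per-URL, keying the lookup
on %DST instead of %URI would let one cached verdict cover every object
embedded from that site, and an index on the 50K-row table should help too.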
Does anyone have any suggestions as to how to make this work well? Or
Henrik, do you have any suggestions as to where I might start looking in
the Squid code, to see how I could modify 'how' URLs are passed to the
external helper?
Thanks, list!
--
Louis Gonzales
louis.gonzales@linuxlouis.net
http://www.linuxlouis.net