Sorry if this is a total newbie question, but I want to store the
actual page content in a database. Has anyone out there done anything
like this? Do you have any pointers on where I should start?
I'm not looking to run the caching service itself out of a database, as the
overhead would no doubt be too great. I just want to take a copy of the
surfed web pages for further analysis.
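To give an idea of what I mean, here is a minimal sketch of the kind of storage I'm after, using Python's built-in sqlite3 module (the table name, schema, and example URL are just placeholders, not anything Squid provides):

```python
import sqlite3

# Sketch: keep a copy of each fetched page body, keyed by URL,
# so it can be queried later for analysis.
conn = sqlite3.connect(":memory:")  # a real setup would use a file path
conn.execute("""
    CREATE TABLE pages (
        url        TEXT PRIMARY KEY,
        fetched_at TEXT DEFAULT CURRENT_TIMESTAMP,
        body       BLOB
    )
""")

def store_page(url, body):
    # INSERT OR REPLACE so re-fetching a URL overwrites the old copy
    conn.execute(
        "INSERT OR REPLACE INTO pages (url, body) VALUES (?, ?)",
        (url, body),
    )
    conn.commit()

store_page("http://example.com/", b"<html>...</html>")
row = conn.execute(
    "SELECT body FROM pages WHERE url = ?", ("http://example.com/",)
).fetchone()
print(row[0])
```

The open question for me is how best to get the page bodies out of Squid and into a loop like this in the first place.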
Cheers
I'm also trying to figure out why I get:

FATAL: neighborsUdpPing: There is no ICP socket!
Aborted

when I've started up Squid and try to request a page. If it's an easy
fix, then just let me sweat out the answer. My Squid 3 needs to speak to a
2.4 parent to get web access, so I'm guessing it's something to do with
that config, so I'll go search.
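For anyone searching later: my working guess from the docs is that this error shows up when a cache_peer line implies ICP queries to the parent but no ICP socket is open. A sketch of the squid.conf lines I mean (the parent hostname and ports here are placeholders, not my actual config):

```
# Guess 1: open an ICP socket so the parent can be queried
icp_port 3130
cache_peer parent.example.com parent 3128 3130

# Guess 2: skip ICP entirely and just forward to the parent
# (icp-port set to 0, with the no-query option)
#cache_peer parent.example.com parent 3128 0 no-query default
```

I haven't verified which of these applies to talking to a 2.4 parent, so treat it as a starting point.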
Cheers for any help that you guys can offer
--
Martin Ritchie
the Kelvin Institute
50, George Street
+44 (0) 141 548 5719

Received on Tue Oct 21 2003 - 05:31:12 MDT