>
> I'm looking for a script or program that can sort sites (e.g.
> www.google.com) by how much traffic, counted in megabytes, the
> clients have requested, and I'm also looking for a script or
> program that can do the same for the clients.
>
> I have looked at more or less all the log analysers I could find at
> http://www.squid-cache.org/scripts/ but I could not find what I'm
> looking for.
> It doesn't matter what the output is or how it looks. It should just
> be possible to see how much traffic sites and clients use through
> the proxy.
>
should be easy using perl. run through the log and build a hash with
key = site and value = amount of traffic, then do the same for the clients.
something like this (it seems to work, but test it yourself):
---------------------------------------------------------------------------------------
#!/usr/bin/perl -w
use strict;

my ( %SITES, %CLIENTS, $key );

# Parse squid's native access.log lines:
# timestamp elapsed client-IP code/status bytes method URL ...
while( <> )
{
    if( /^\d+\.\d+\s*\d+\s*([\d\.]+)\s+\w+\/\d+\s+(\d+)\s+GET\s+http:\/\/([^\s]+?)\/.*$/ )
    {
        # $1 = client IP, $2 = bytes transferred, $3 = site hostname
        $SITES{$3}   += $2;
        $CLIENTS{$1} += $2;
    }
}

foreach $key (keys %SITES) {
    print "site: $key amount: $SITES{$key} bytes\n";
}
foreach $key (keys %CLIENTS) {
    print "client: $key amount: $CLIENTS{$key} bytes\n";
}
---------------------------------------------------------------------------------------
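
if you also want the output sorted by traffic (as the original question asks),
you could sort the hash keys by value before printing and convert the totals
to megabytes. a minimal, untested sketch, reusing the %SITES and %CLIENTS
hashes from the script above:
---------------------------------------------------------------------------------------
# print sites and clients in descending order of transferred bytes, in MB
foreach $key (sort { $SITES{$b} <=> $SITES{$a} } keys %SITES) {
    printf "site: %s amount: %.2f MB\n", $key, $SITES{$key} / (1024*1024);
}
foreach $key (sort { $CLIENTS{$b} <=> $CLIENTS{$a} } keys %CLIENTS) {
    printf "client: %s amount: %.2f MB\n", $key, $CLIENTS{$key} / (1024*1024);
}
---------------------------------------------------------------------------------------
run it against your access log, e.g. if you saved it as traffic.pl:
perl traffic.pl /var/log/squid/access.log
(the log path may differ on your system.)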
ciao -ap
System Administration
VIRBUS AG
Fon +49(0)341-979-7424
Fax +49(0)341-979-7409
andreas.piesk@virbus.de
www.virbus.de