Unix and Linux Systems

2008/03/18



I wanted to gather some stats on the activity of the Premium
Consultants listings. It's easy enough to get the raw
activity: each click passes through my "conlinkp.pl" script (that
just records a little bit and does a "Location" redirect to send it
on to the real link). So to find out clicks on "conlinkp.pl", I
can just "grep conlinkp.pl access_log". Well, except that not all
of these are clicks. A lot of them are search engine and other 'bots
tracing their way through my pages.


I could easily stop the 'bots from accessing those links,
but I really don't want to as there may be value in their
finding the sites. However, those are not people, so I'd
like to filter those out for reporting.


Well, one way to do that would be to have a list of 'bot ip
addresses, but that's a big, big list and is constantly changing.
A better way is to look at something 'bots don't usually care
about: Javascript - an ip that fetched our Javascript files is
probably a real browser. Unfortunately that's not foolproof either.
However, 'bots should ask for "robots.txt", so if we also filter out
the ips that did that, we should have what we want: real users (maybe).
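
As an aside, collecting the ips that did ask for "robots.txt" is the
same sort of one-liner, and "grep -v -f" can then subtract them from
the conlinkp.pl hits. Something along these lines should do it (the
"/tmp/botips" file is just an illustrative scratch name):

grep "robots.txt" logs/access_log | sed 's/- .*//' | sort -u > /tmp/botips
grep conlinkp.pl logs/access_log | grep -v -f /tmp/botips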


However, that's not really my point here. In the process
of playing with this, I constructed a fairly long command line
and realized that breaking it down could be helpful for those
of you just learning your way around Unixish shells. To
make it easier to read, I broke it down into one command per
line first:



grep conlinkp.pl logs/access_log
| sed 's/- .*//'
| sort -u
| xargs -n 1 -J foo grep foo logs/access_log
| grep ".js HTTP"
| sed 's/- .*//'
| sort -u
| xargs -n 1 -J foo grep foo logs/access_log
| grep conlinkp.pl > ~/conlinks.list

Now let's look at that in detail. I'm going to show a few sample
lines from each step of the pipeline so you can see what actually
happens each step of the way.



grep conlinkp.pl logs/access_log

89.37.222.137 - - [15/Mar/2008:11:38:48 +0000] "GET
/cgi-bin/conlinkp.pl?http://www.cleverminds.net HTTP/1.1" 302 210
"-" "Java/1.6.0_03"
202.111.175.186 - - [15/Mar/2008:12:49:40 +0000] "GET
/cgi-bin/conlinkp.pl?http%3A%2F%2Fwww.landi-sempach-emmen.ch%2Faktionen%2Fimage%2Fzafecez%2Fiji%2F
HTTP/1.0" 302 205 "-" "Mozilla/4.0 (compatible; MSIE 7.0; Windows
NT 5.1; .NET CLR 2.0.50727; .NET CLR 1.1.4322)"
202.111.175.186 - - [15/Mar/2008:12:49:41 +0000] "GET
/cgi-bin/conlinkp.pl?http%3A%2F%2Fwww.vlopezalvarez.com%2FPersonal%2FFotos%2FViajes%2Fxaj%2Fyit%2F
HTTP/1.0" 302 205 "-" "Mozilla/4.0 (compatible; MSIE 7.0; Windows
NT 5.1; .NET CLR 2.0.50727; .NET CLR 1.1.4322)"
202.111.175.186 - - [15/Mar/2008:12:49:41 +0000] "GET
/cgi-bin/conlinkp.pl?http%3A%2F%2Fwww.cjp.spb.ru%2Fen%2Faki%2Fucuyupi%2F
HTTP/1.0" 302 205 "-" "Mozilla/4.0 (compatible; MSIE 7.0; Windows
NT 5.1; .NET CLR 2.0.50727; .NET CLR 1.1.4322)"
87.217.117.201 - - [15/Mar/2008:14:06:23 +0000] "GET
/cgi-bin/conlinkp.pl?http://echo3.net HTTP/1.1" 302 205 "-"
"Java/1.6.0_03"
213.46.86.86 - - [15/Mar/2008:14:09:42 +0000] "GET
/cgi-bin/conlinkp.pl?http://echo3.net HTTP/1.1" 302 205 "-"
"Java/1.5.0_06"

That's simple enough, right? The "grep" just pulls matching lines
from our web access log. Nothing complicated there. I've shown just six lines from the output, though actually there would be hundreds.
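
(If all you wanted was a raw count of those hits, 'bots and all,
"grep -c" does it in one step.)

grep -c conlinkp.pl logs/access_log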



| sed 's/- .*//'

89.37.222.137
202.111.175.186
202.111.175.186
202.111.175.186
87.217.117.201
213.46.86.86

That line calls "sed" to delete the first "- " and everything after it in each line.
The result is still six lines, but we've lost everything but the ip
addresses.
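
By the way, "awk" gets the same list a little more directly: the ip
is the first whitespace-separated field of each log line, so printing
field one does the job (and skips the trailing space the sed leaves
behind):

grep conlinkp.pl logs/access_log | awk '{print $1}'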




| sort -u

202.111.175.186
213.46.86.86
87.217.117.201
89.37.222.137

By running that output through "sort -u", we sort the list and cut away
the duplicate ip's. Again, here we only show four, but the original hundreds
of lines would be down to about 150 in the full output. So what do we have
now? Just a list of ip addresses, each of which had looked at "conlinkp.pl"
at some point. The next line starts to get interesting.



| xargs -n 1 -J foo grep foo logs/access_log

89.37.222.137 - - [08/Mar/2008:06:22:23 +0000] "GET
/Unixart/new_address.html HTTP/1.1" 200 16529 "-" "Java/1.6.0_03"
89.37.222.137 - - [08/Mar/2008:06:22:23 +0000] "GET /Web/4qsurveys.html
HTTP/1.1" 200 18504 "-" "Java/1.6.0_03"
89.37.222.137 - - [08/Mar/2008:06:22:24 +0000] "GET
/Web/social_blogging.html HTTP/1.1" 200 30665 "-" "Java/1.6.0_03"
89.37.222.137 - - [08/Mar/2008:06:22:24 +0000] "GET /cgi-bin/comingsoon.pl
HTTP/1.1" 200 9374 "-" "Java/1.6.0_03"
89.37.222.137 - - [08/Mar/2008:06:22:25 +0000] "GET
/cgi-bin/indexget.pl?Basics HTTP/1.1" 200 37476 "-" "Java/1.6.0_03"
89.37.222.137 - - [08/Mar/2008:06:22:26 +0000] "GET /cgi-bin/randompage.pl
HTTP/1.1" 200 91 "-" "Java/1.6.0_03"

This line produces a lot of output and is a bit tricky to understand.
What it's doing is finding every match for every ip address the previous
commands have produced. The output is every entry in the log for
every ip that accessed "conlinkp.pl". How does it
manage that from one big list of ip's? Well, there are several
ways I could have done that, but here I used "xargs". Xargs is
normally used to make commands more efficient; for examples,
see "How can I recursively grep through sub-directories?" and "Using xargs". Here, we're using it for a different
purpose.


The first problem is to limit xargs to invoking grep with only
one argument - normally it wants to use as many as possible. The
"-n 1" tells it to do that. Next, we need to rearrange the command
line a little: if we just used "xargs -n 1 grep logs/access_log",
we'd end up with grep being called like this:



grep logs/access_log 89.37.222.137
grep logs/access_log 202.111.175.186

and so on, and that won't work. The "-J foo" provides the
magic we need. We can see it at work if we momentarily change our
command to substitute "echo" for grep; the result would look something
like this:



89.37.222.137 logs/access_log
202.111.175.186 logs/access_log
87.217.117.201 logs/access_log
213.46.86.86 logs/access_log

Another way to see what xargs would do is to use "-p" in the command
line - xargs will echo each invocation and wait for you to confirm
with "y" or "n" before proceeding.


The choice of "foo" is arbitrary; you can use any word at all to
act as a place holder. What happens is the "foo" shows "xargs" where
you want its input to appear in its output: you are controlling the
command line it builds. This gives us what we need.
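
One caution: "-J" is a BSD xargs option. On a Linux system, GNU xargs
doesn't have it, but its "-I" option does the same placeholder job
(and takes one line of input per command on its own, so "-n 1" isn't
needed there):

| xargs -I foo grep foo logs/access_log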



| grep ".js HTTP"

The next three lines are going to filter this output back to a smaller
set of lines again. We're looking for only those lines that have ".js HTTP".
Let's review: we found the lines that referenced "conlinkp.pl", we used
the ip addresses from those to find all accesses, and now we're grepping
out only the ".js HTTP" lines. Couldn't we have saved a step here?


Well, yes, but the quoting gets difficult. We want something
like this:



| xargs -n 1 -J foo grep \"foo .*.js HTTP\" logs/access_log

However, if we quote "foo",
xargs loses its interpolation. We end up with



grep "foo .*.js HTTP" logs/access_log 89.37.222.137

I could solve that by writing a script that reads stdin and
constructs the command line I want, but this isn't about writing
scripts, so we'll live with the inefficiency - after all, if I were really
concerned about how long this takes, I wouldn't be using command line
tools at all. I'd write a Perl script to do the whole task.
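
If you're curious, that doesn't even need to be a separate script: a
shell while-read loop builds exactly the grep we wanted, something
along these lines:

grep conlinkp.pl logs/access_log | sed 's/- .*//' | sort -u |
while read ip
do
    grep "^$ip .*\.js HTTP" logs/access_log
done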


The next two lines should be understandable as they do just what we
did before:



| sed 's/- .*//'
| sort -u

We're back to a simple list of ip's again, but now it has been
filtered down to only those ip's that accessed "conlinkp.pl" and
also accessed one or more Javascript files. Finally we go back
to the logs once more to extract the original lines:



| xargs -n 1 -J foo grep foo logs/access_log
| grep conlinkp.pl > ~/conlinks.list

This is just a repeat of what was done earlier so you should understand
it. If not, use "-p" with xargs to follow along. The end result is
a listing of the actual "conlinkp.pl" lines where the originating ip had
also accessed a Javascript file.
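
From here, the stats I was after are just more of the same; for
example, a rough count of how many times each destination link was
followed (assuming every line has the "conlinkp.pl?url" form shown
earlier) would be:

sed 's/.*conlinkp.pl?//; s/ HTTP.*//' ~/conlinks.list | sort | uniq -c | sort -rn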


Now remember: this actually isn't a useful exercise. Some bots can
and do access Javascript, and this pipeline would be very slow and
clumsy to run. The purpose here is just to show how command lines
can be manipulated with xargs, sed and sort. To actually do
this, I'll use a Perl script like this:



#!/usr/bin/perl
# One pass through the log: remember conlinkp.pl hits per ip,
# and note which ips fetched robots.txt or a Javascript file.
open(I,"logs/access_log") or die "access_log $!";
while (<I>) {
    chomp;
    $ip = $_;
    $ip =~ s/- .*//;                          # keep just the ip address
    $isconlink{$ip} = $_ if /conlinkp.pl/;    # last conlinkp.pl hit for this ip
    $isrobots{$ip} = 1 if /robots.txt HTTP/;
    $isjavascript{$ip} = 1 if /.js HTTP/;
}
foreach (keys %isconlink) {
    next if $isrobots{$_};                    # skip ips that read robots.txt
    next if not $isjavascript{$_};            # keep only ips that got javascript
    print "$isconlink{$_}\n";
}
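
If you save that as, say, "conlinks.pl" - any name will do - and run
it from the directory that contains "logs/", capturing the output is
just:

perl conlinks.pl > ~/conlinks.list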
