Using Perl to Monitor Website / Server Status

Website monitor in Perl, sample #3

Sample 3 is the most complete and requires the most work, including the ability to schedule a cron job. It splits the server monitor into two separate functions that can run separately.

In this version of the program we will write a file for each server or website and use a meta refresh to display a status screen with color-coded tables. The script is expanded to e-mail the administrator if there is a problem. The program runs dependably and notifies you even when you are not in front of the display screen. It can also run from a remote server without the display feature, relying only on e-mail notification when a problem is detected. Since most cell phones accept e-mail messages, this can alert you 24/7 very effectively.
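The e-mail notification described above can be sketched as a small helper that pipes a message to the system mailer. This is illustrative only, not the actual servermonitor3.cgi code; the sub names, addresses, and the sendmail path are assumptions.

```perl
#!/usr/bin/perl
# Sketch of the e-mail alert (illustrative; the sub names and the
# sendmail path are assumptions, not the actual servermonitor3.cgi code).
use strict;
use warnings;

# Build the alert text; kept as a separate sub so it is easy to test.
sub alert_message {
    my ($to, $server) = @_;
    return "To: $to\n"
         . "Subject: WARNING: $server is not responding\n\n"
         . "No response from $server at " . scalar(localtime) . "\n";
}

# Pipe the message to the system sendmail binary (path assumed).
sub notify {
    my ($to, $server) = @_;
    open(my $mail, '|-', '/usr/sbin/sendmail -t')
        or die "Cannot run sendmail: $!";
    print $mail alert_message($to, $server);
    close($mail);
}

# Example call (uncomment with a real address):
# notify('admin@example.com', 'www.example.com');
```

Because most carriers give phones an e-mail address, pointing the `To:` header at that address is all it takes to get the page on your phone.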

To make the script even more complete, I have added a trace route to log the route and record where the interruption occurred. This will be important in determining whether the failure was on your server or just somewhere along the way to it.
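As a rough sketch of that idea (this is not the actual script; the sub name, log path, and traceroute flags are assumptions), the system traceroute can be captured with backticks and appended to the log:

```perl
#!/usr/bin/perl
# Sketch of traceroute logging (illustrative, not the real
# servermonitor3.cgi): capture the route and append it to the log
# so a failure can be located along the path to the server.
use strict;
use warnings;

sub log_traceroute {
    my ($host, $logfile) = @_;
    # -m 5 limits hops so a dead route does not hang the cron job;
    # assumes a Linux-style traceroute is on the PATH.
    my $route = `traceroute -m 5 $host 2>&1` // '';
    open(my $log, '>>', $logfile) or die "Cannot open $logfile: $!";
    print $log scalar(localtime), " traceroute to $host:\n$route\n";
    close($log);
}

log_traceroute('example.com', '/tmp/monitor.log');
```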

Download the 3 scripts needed:

Upload mystatus3.cgi (in ASCII/text format) to each website you want to monitor and chmod it 0755; that is all. To see if the script is working, access it with your browser.
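The real mystatus3.cgi is in the download above, but the idea can be as simple as this sketch: a CGI script that answers with a known marker the monitor can look for in the response body.

```perl
#!/usr/bin/perl
# Minimal sketch of a status endpoint (illustrative, not the actual
# mystatus3.cgi): the monitor fetches this URL and checks the body
# for the OK marker. If the web server or site is down, the fetch
# fails and the monitor flags a WARNING.
use strict;
use warnings;

print "Content-type: text/plain\n\n";
print "STATUS: OK\n";
```

If you can load the URL in a browser and see the marker text, the endpoint is working.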

Configure the servermonitor3.cgi script, adding in all the websites you want to check and the path to the log file. Upload servermonitor3.cgi to a remote website or home server, chmod 0755.
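A sketch of what that checking loop looks like conceptually (illustrative; the real logic lives in servermonitor3.cgi, and the site list and log path here are placeholders). It fetches each status URL, classifies the response, and appends a line to the log:

```perl
#!/usr/bin/perl
# Sketch of the checking loop (illustrative, not the actual
# servermonitor3.cgi): fetch each status URL and log OK or WARNING.
use strict;
use warnings;
use HTTP::Tiny;   # core module since Perl 5.14

# Classify the body returned by mystatus3.cgi.
sub status_of {
    my ($body) = @_;
    return (defined $body && $body =~ /OK/) ? 'OK' : 'WARNING';
}

my @sites   = ('http://www.example.com/mystatus3.cgi');  # sites to check
my $logfile = '/tmp/status.log';                         # path to the log

open(my $log, '>>', $logfile) or die "Cannot open $logfile: $!";
for my $url (@sites) {
    my $resp   = HTTP::Tiny->new(timeout => 10)->get($url);
    my $status = status_of($resp->{success} ? $resp->{content} : undef);
    print $log scalar(localtime), " $url $status\n";
}
close($log);
```

A timeout matters here: without one, a single hung server would stall every check behind it in the loop.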

Make sure you add the script to your /etc/crontab:
*/15 * * * * root /pathto/servermonitor3.cgi
This line runs the script every 15 minutes. (Note: "15 * * * *" would run it only once an hour, at 15 minutes past; and run-parts is for running every script in a directory, not a single file.)

Configure the viewstatus3.cgi with the path to the directory with the status logs. Upload, chmod 0755 and open in a browser window.
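The display side can be sketched as follows (illustrative, not the actual viewstatus3.cgi; the server names and refresh interval are placeholders). A meta refresh reloads the page on its own, and each row is colored by the last logged status:

```perl
#!/usr/bin/perl
# Sketch of the status display (illustrative, not the real
# viewstatus3.cgi): meta refresh reloads the page, and each server
# row is colored red (WARNING) or green (OK).
use strict;
use warnings;

# Build one color-coded table row for a server.
sub status_row {
    my ($server, $status) = @_;
    my $color = $status =~ /WARNING/i ? 'FF0000' : '00FF00';
    return qq{<tr><td bgcolor="#$color">$server: $status</td></tr>\n};
}

print "Content-type: text/html\n\n";
print qq{<html><head><meta http-equiv="refresh" content="60"></head><body>\n};
print "<table>\n";

# In the real script %status would be read from the log files.
my %status = ('www.example.com' => 'OK', 'mail.example.com' => 'WARNING');
print status_row($_, $status{$_}) for sort keys %status;

print "</table></body></html>\n";
```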

You can access the datafile and see the past history of server responses.

Limitations: This is a great script, but the log is going to fill up. You will likely want to clear or rotate the log weekly or daily, depending on how many servers you plan to monitor. We monitor a considerable number of servers, so our solution is to create a new log each day using the /bin/date command.

for example:
$logname = `date '+%m_%d_%y'`;
chomp $logname;  # strip the trailing newline from date's output
This gives us a logfile name of mm_dd_yy, which is easy to understand and starts a new file each day. Even if you load all the files in one directory, you would only need to clean up the old files after maybe 3 years.
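The same mm_dd_yy name can also be built without shelling out to /bin/date, using Perl's own localtime (a small sketch of an alternative, not what the script above does):

```perl
#!/usr/bin/perl
# Build the same mm_dd_yy logfile name in pure Perl instead of
# running /bin/date in backticks (no chomp needed this way).
use strict;
use warnings;

my ($day, $mon, $year) = (localtime)[3, 4, 5];
my $logname = sprintf "%02d_%02d_%02d",
    $mon + 1,              # localtime months are 0-11
    $day,
    ($year + 1900) % 100;  # localtime years count from 1900
print "$logname\n";
```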

Add-ons: One handy addition to the display script would be an audio alarm. This can be done very easily using a standard WAV or MPG file. Just look for a down server and print an embed tag on the display page pointing to an audio file. Then you will get an audible alarm when a server is down.

To do this, in the display script, change the line:
if ($base =~ /WARNING/i){$statuscolor = "FF0000";}

Change to:
if ($base =~ /WARNING/i){$statuscolor = "FF0000";
$soundalarm = 1;}

Then simply add a print command: if $soundalarm is 1, print the embed tag for the audio file.

# define the alarm (qq{} lets us keep the double quotes inside the HTML)
$alarm = qq{<embed src="$wavfile" autostart="true" hidden="true" loop="true"></embed><noembed><bgsound src="$wavfile" loop="100"></noembed>};
if ($soundalarm == 1){print "$alarm";}

If you need a handy WAV file, here is a sample for you.
Download Alarm WAV