Using Perl to Monitor Website / Server Status

Website monitor in Perl, sample #2

Building on the basic code in sample 1, we will add a log that records each time the server or website is checked and the response it returns. We will also add some server status checking so the response script can return the server's load average and uptime.

First we will add the uptime command to the response script.

# run the system uptime command and print its output as part of the response
open( UPTIME, "/usr/bin/uptime |" ) or print "uptime unavailable\n";
while (<UPTIME>) { print; }
close( UPTIME );

This tells us the number of users logged in, how long the server has been running, and the current load average. It is a simple hack, but it shows how to expand the response script to report whatever server characteristics you want to monitor.
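For example, you could report disk usage for the root filesystem with the same piped-open pattern. This is just a sketch and not part of the downloadable script; the df path and options are assumptions, so adjust them for your system.

# hypothetical extension: report disk usage for / alongside uptime
open( DISK, "/bin/df -h / |" ) or print "df unavailable\n";
while (<DISK>) { print; }
close( DISK );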

Since we don't want to share that information with the general public, we will restrict the response script so it only answers requests from a single authorized IP address.
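A minimal sketch of that check, placed near the top of mystatus2.cgi: $allowed_ip is an assumed name, so set it to the address of the machine running the monitoring script. The standard CGI environment variable REMOTE_ADDR carries the requester's address.

# hypothetical IP restriction for the response script
# $allowed_ip is an assumed name; set it to your monitoring server's address
my $allowed_ip = '203.0.113.10';

if ( $ENV{'REMOTE_ADDR'} ne $allowed_ip ) {
    # send a minimal header in case none has been printed yet, then stop
    print "Content-type: text/plain\n\n";
    print "Access denied\n";
    exit;
}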

In the monitoring script, we will add a simple logging feature to record the responses. In this sample, we will dump all the data into one giant log, but you can break the logging into a daily log or even separate logs for each server. In our case, we just want to log the data to one text file.

# append the latest response to the status log, holding an exclusive lock
open( FILE, ">>$statusfile" ) or die "Cannot open $statusfile: $!";
flock( FILE, 2 );    # 2 = LOCK_EX, take an exclusive lock
print FILE $response;
flock( FILE, 8 );    # 8 = LOCK_UN, release the lock
close( FILE );
chmod 0777, $statusfile;
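If you would rather break the logging into daily files, as mentioned above, one approach (a sketch; $dailyfile and the naming scheme are assumptions) is to build the filename from the current date and open that instead of $statusfile.

# hypothetical variation: one log file per day, named from the current date
my ( $mday, $mon, $year ) = ( localtime() )[ 3, 4, 5 ];
my $dailyfile = sprintf( "%s.%04d-%02d-%02d.log",
                         $statusfile, $year + 1900, $mon + 1, $mday );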

By logging the responses, you can check back later and verify whether a server was down.
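That history is only useful if you know when each response was recorded, so one small addition (not part of the original snippet, using Perl's built-in localtime) is to prefix each entry with a timestamp when it is written, replacing the plain print FILE line above.

# hypothetical addition: timestamp each logged response
my $stamp = scalar localtime();
print FILE "[$stamp] $response";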

Download the 2 scripts needed:

mystatus2.cgi

servermonitor2.cgi

Installation

Upload mystatus2.cgi (in ASCII/text mode) to each website you want to monitor and chmod it 0755; that is all. To see if the script is working, access it with your browser.

Configure the servermonitor2.cgi script, adding all the websites you want to check and the path to the log file. Make sure you either create a blank log file chmodded 0777 or point to a directory the web server can write to. Upload servermonitor2.cgi to a remote website or home server, chmod it 0755, then open it in a web browser and watch it go.
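As a rough idea of what that configuration looks like (a sketch; @urls and $statusfile are assumed variable names, so match them to whatever the actual script uses), you list the mystatus2.cgi URLs to poll and the path to the log file.

# hypothetical configuration block for servermonitor2.cgi
@urls = (
    'http://www.example.com/mystatus2.cgi',
    'http://www.example.org/mystatus2.cgi',
);
$statusfile = '/home/youruser/logs/serverstatus.log';  # must be writable by the web server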

You can open the log file at any time and review the history of server responses.

Limitations: This script still uses the meta refresh, so the limitations of sample 1 still apply. In sample 3 we will use a script run by cron every 15 minutes to write to a datafile, and use the meta refresh only to read that datafile. That would be the preferred configuration, but not everyone can get the access needed to schedule cron jobs.

If you are running your own server from your home or office, sample 3 will give you the best bang for your buck. That is as much bang as you can get out of something that is free and you have to install and configure yourself.

Still, there is a need for multiple samples based on what the end user's resources are and what they want to accomplish. If you already knew how to do it, you wouldn't be downloading the script. So we want to give you ample ideas to build on.