
Google Spreadsheet Server Monitoring

Monitor your websites using a Google Spreadsheet and some PHP

What Do You Mean the Website is Down?

So, your client calls you and tells you that the contact form on their website isn't working. One of their customers called them to tell them, and it looks like it's been down for a while. Your client wonders why you're the last one to know - why do they pay for maintenance anyway?

This is the situation we faced too many times, years ago, and it's why we started monitoring our servers. We quickly went from being the last to know when a website stopped working properly to being the first. We also began collecting a lot of valuable data about the quality of our web hosting services. Furthermore, we did a kind of testing that really meant something to us. Instead of just checking to see if a server was up, we created "sensors" that we placed on client websites to do things like make a simple call to the website's actual database, emulating what the website did as closely as possible. This told us more about the actual user experience, and about whether our servers were doing what they were supposed to, than simply pinging a server to see if it was up.

A few weeks ago, I was thinking about server monitoring software and I realized that most of the mechanics behind it are actually pretty simple; the more difficult part is the reporting. Fortunately, Google Spreadsheets can read data from external sources and has wonderful graphs and gadgets (like speedometers) for turning monitoring data into something meaningful and handy to display. I admit I'm a big fan of the Google Docs web apps, and I decided, mostly for fun, to try my hand at writing a server monitor with a little PHP and one Google spreadsheet.

Google Docs to the Rescue?

I decided to make a project out of building a server monitor that used a Google Spreadsheet for its front end. I was right in that the core was quite simple, but I admit I added a few unforeseen yet indispensable "enhancements" along the way (like data "archiving"). It worked (and was fun to do), so I decided to write this post about how it works and show people how they might build something like it for themselves (and hopefully inspire people to create similar things - I may also do a series on other projects along these lines, like search engine ranking reports). I provide all my code here, along with instructions for creating the spreadsheet.

I started by defining what I wanted it to do, partly inspired by the kinds of things I know I can do with Google Docs. This was my list:

  1. It would send out an email/SMS notification if something went down
  2. It would send out an email/SMS if it went back up again
  3. I could view the status of all the sensors
  4. I could view detailed history for any single sensor
  5. It would email me a daily report
  6. I could share the live data¹ with other people, and publish it back out to the Internet (again, live data)

Well, it worked out rather well. The only real drawback is that it's not as immediate as I'd have liked (I want data updated by the second if I can get it). However, Google Docs isn't going to poll your data source every second (and for good reason!), so immediate updates aren't going to happen. You can force an update if you need to, though, and it refreshes often enough, I think, to stay quite useful.

Below I describe how you can make one for yourself. I'm not sure if I need to say this, but: I did this project and wrote this post to amuse myself, and I provide this information here for your own benefit/amusement. It's up to you to decide whether it's as dependable as you want a server monitor to be, and I'm not responsible if it doesn't work as well as you think it should (nor is Google).


What You'll Need

You'll need a Google account (of course) and a server (I used a shared Linux server at LunarPages) that isn't one of the webservers you are monitoring. You'll have to write some server-side script (I used PHP) and you'll need a database (I used MySQL). The sensors you create will be written in whatever you already use on your website (I show a couple of examples further on). You will also have to add two cron jobs, so make sure you have permission to create them (most hosting providers offer an interface for creating cron jobs in their control panel).

How It Works

The basic design is a group of small pieces (scripts and worksheets) working together.

A cron job calls a script that tests all the sensors and sends out notifications if necessary; the results are put in a database. The spreadsheet populates itself with the data from the database by calling some scripts, which pass the data back as CSV, and the spreadsheet uses that data to create all of our fancy charts and graphs. The settings for the application (e.g. the list of sensors) are also stored in the spreadsheet, and the PHP scripts use that information to determine which sensors to call. Finally, a daily script sends out a summary email report and "compresses" old data to save space.

Step 1: Creating a Sensor

A sensor, in our terms, is a fairly simple thing. In fact, it could be an existing web page if all you want to do is see whether the site's web server is serving up pages. The server monitor simply tests to see if a sensor (which is a web page) returns an error code in its response header. With something like a database sensor, we "artificially" return an error code in the response header if a query to the database fails.

Really, you can create a sensor to test anything you want, even things at an application level (e.g. test whether a variable has an expected value). Ideally a sensor would be able to tell that a website is acting entirely the way it is supposed to, and short of regularly [24/7] parsing each page on the website for error codes, missing images, broken links - basically a rigorous testing régime - I think the sensor approach is about the best one can do (I would love to hear that I'm wrong - please comment below if you think I am).

I suggest, however, that even if you are just going to test whether a webserver is serving up pages, you make a special page - something simple, with no linked resources - that is only called by your server monitor. Something like this:

HTML
<html>
<body>
Web server is functioning properly
</body>
</html>

and call it something like /sensors/websensor.html.

For a database sensor, I suggest making a very simple call to one of the database tables actually used by your website. Furthermore, if you use an include file for connecting to your database on your website, I suggest the sensor use that same include file. More than once a sensor has told us of a problem when someone accidentally overwrote a connection string file with a file from a test/staging server (human error is actually the biggest problem).

A typical database sensor, written in PHP, might look something like this:

PHP
<?php
require_once('../Connections/your_connection_string_include_file.php');

$sql = "SELECT id FROM your_table LIMIT 1";
$dbtest = mysql_query($sql, $database_connection) or die(mysql_error());
$dbtest_totalrows = mysql_num_rows($dbtest);

if($dbtest_totalrows > 0) {
 ?>Database is functioning properly<?php
} else {
 header("http_response_code: 500");
 ?>Database not functioning properly<?php
}
?>

Now we have a sensor that will tell you whether the webserver is serving up pages AND able to access your database. Please note that it doesn't matter what the server-side language is here; I just used PHP in my example. We have sensors in multiple languages testing multiple aspects of our websites; all that matters is whether the sensor returns an error code in the response header or not.
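For example, here is a minimal sketch of an application-level sensor - my own illustration rather than one of the sensors described above - that checks that a configuration value the site depends on looks sane, and returns an error status otherwise (the include file and the ORDERS_EMAIL constant are hypothetical placeholders):

PHP
<?php
// Hypothetical application-level sensor: verify a config value the site relies on.
// Replace the include and the ORDERS_EMAIL constant with something from your own site.
require_once('../Connections/your_site_config_include_file.php');

if(defined('ORDERS_EMAIL') && strpos(ORDERS_EMAIL, '@') !== false) {
 ?>Application configuration is functioning properly<?php
} else {
 header("HTTP/1.1 500 Internal Server Error");
 ?>Application configuration is not functioning properly<?php
}
?>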

Step 2: Creating our Spreadsheet

We're going to step away from our text editors/IDEs long enough to start our spreadsheet now. We'll start by creating the first worksheet² in the spreadsheet, which I call the "Sensor List". I strongly suggest that you build your spreadsheet just the way I did, in terms of labels and which rows and columns data is put in, and then play with it afterward once it's all working. It will be easiest to follow me if you start out more or less exactly as I describe.

This first worksheet is going to do two things. It's the place where we list the sensors that the application will test (our sensor-testing script is going to read this list to determine which sensors to call). Also, beside each sensor on the list (columns A & B), we're going to display the sensor's current status as green, yellow and red "lights" (but we'll save that for later, once we have status data to display). Set up your spreadsheet so that it looks like this:

(Note that I have left the A and B columns blank for now - that's where we're going to display each sensor's status later on)

The first column in our data (column C, the "Sensor ID" column) is what the server logs are keyed to. I used this approach for two reasons: using an integer means less storage space in the database (each log record can be associated with the appropriate sensor with as little as one byte of data), and if I keyed the logs to an existing field (e.g. the sensor name) I wouldn't be able to edit that field without orphaning the sensor's previous data.

The second column is the name that the application will use when referring to the sensor (for instance, email notifications use this name in their alerts). It is also the name used on graphs, so try not to make these labels long if you can avoid it.

The third column is the full URL of the sensor page that the monitor will request. The fourth column is the email address used when sending out notifications for this sensor (use commas to list more than one address). Personally, I like to have the monitor text-message my cell phone when a server goes down; this can usually be done quite easily, as many cell phone providers give you an email address that you can use to text your phone.

That is all we have to do to create a new sensor: create the sensor page itself, install it on the corresponding server, and add the sensor to this list. The sensor will automatically begin being scanned, we will be notified when there are issues, and we can see the sensor's status and read reports on it (as soon as there is enough data to do so).

IMPORTANT: Please note that Google Docs doesn't update the published document immediately after you make changes; there is a lag between when you change the document and when it is republished. You can usually make the spreadsheet update the published version immediately by turning publishing off and then back on again.

We also need to create another worksheet to hold some settings. We'll call this worksheet "Sensor and Report Settings". In it, cell B1 (the green cell) holds the ID of the sensor you want the report worksheets to display, B2 (the yellow cell) shows that sensor's name, and cells B3:B6 hold the settings the scripts read: the email recipients for the daily report, the number of failed tests before a sensor is considered down, the number of minutes to wait between retries, and the number of days of detailed data to keep before archiving.

The yellow cell, B2, should contain the following formula:

=VLookup(B1,'Sensor List'!C3:D7, 2, FALSE)

Rather than create a separate set of worksheets for each sensor report, we're going to make one set of worksheets that can display the data from any sensor. We control which sensor is being reported on by changing the number (the sensorID) in the green box on this worksheet. The formula above gives you a little positive feedback by displaying the name of the sensor that you've just selected.

Before our application can read these settings, we must publish the spreadsheet. At the top, select Share > Publish as web page... and you will get a dialog box where you can publish the document. Click Publish Now, then click More publishing options at the bottom of the dialog box; this is where you can create a URL for a specific range of data. We're going to generate two URLs: one for the sensor list on the first worksheet, and one for the settings on the second worksheet. In the pop-up dialog, set the File Format to CSV, under What sheets select Sheet "Sensor List" only, and under What cells enter C3:F50 (I picked 50 at random; the number only has to be higher than the last sensor on your list, but equal to or less than the number of rows currently in the worksheet). Copy this URL for yourself, then generate a second one for the settings on the Sensor and Report Settings worksheet (cell range B3:B6).
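The URL you get back should look something like this (the key and gid values here are placeholders; yours will be specific to your spreadsheet and to the worksheet you selected):

http://spreadsheets.google.com/pub?key=XXXXXXXXXXXXXXXX&output=csv&gid=0&range=C3:F50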

Step 3: Testing our Sensors

Okay, now we have:

  1. a list of sensors to test
  2. at least one sensor script installed on a web server that we will test

Now, on the server that will be doing the testing (again, this should not be a server that you plan to test), we will add some PHP and a MySQL database that will do the actual testing, send out notifications (if needed) and store the test results. Clearly this is the nexus of our application.

Let's start by creating our database (I named my database sensors). Here is the SQL for creating the tables that I am using:

SQL
DROP TABLE IF EXISTS `sensor_log`;

CREATE TABLE IF NOT EXISTS `sensor_log` (
 `ID` int(11) NOT NULL auto_increment,
 `sensorID` smallint(2) NOT NULL,
 `lag` smallint(2) NOT NULL,
 `status_code` smallint(2) NOT NULL,
 `created_date` timestamp NOT NULL default CURRENT_TIMESTAMP,
 PRIMARY KEY  (`ID`),
 KEY `sensorID` (`sensorID`),
 KEY `status_code` (`status_code`)
) ENGINE=MyISAM  DEFAULT CHARSET=latin1 AUTO_INCREMENT=1 ;

DROP TABLE IF EXISTS `sensor_log_archive`;

CREATE TABLE IF NOT EXISTS `sensor_log_archive` (
 `ID` int(11) NOT NULL auto_increment,
 `sensorID` int(2) NOT NULL,
 `average_lag` int(2) NOT NULL,
 `downtime` int(3) NOT NULL,
 `sensor_date` date NOT NULL,
 `created_date` timestamp NOT NULL default CURRENT_TIMESTAMP,
 PRIMARY KEY  (`ID`),
 KEY `sensorID` (`sensorID`)
) ENGINE=MyISAM  DEFAULT CHARSET=latin1 AUTO_INCREMENT=1 ;

This is a very straightforward pair of tables. The sensor_log table is where we store the results of our tests, and the sensor_log_archive table is where we store our "compressed" data (our script archives data by taking the aggregate results for an entire day for each sensor and inserting them into the archive table; with tests every 15 minutes that's roughly 96 readings per sensor per day reduced to a single row, so we cut the space used by a factor of nearly 100 to 1).

Now we'll start creating our testing script. I call this script testallsensors.php, and I keep it in a folder called sensors. The first thing you're going to need is a connection string to your database. As always, I keep this in a separate file, which I call database_connection.php. My database connection looks something like this:

database_connection.php

PHP
<?php

$database_hostname = "localhost";
$database_username = "dbusername";
$database_password = "dbpassword";
$database_name = "dbname";

$database_connection = mysql_pconnect($database_hostname,
$database_username, $database_password) or trigger_error(mysql_error(),E_USER_ERROR);

mysql_select_db($database_name, $database_connection);
?>

(As you can see, I connect to the server and select my database in my connection script. I rarely have an application that accesses two databases on the same server so I find this most useful.)

We're also going to read some data from our spreadsheet. I use a readCSV function that I found in the comments of one of the PHP manual pages and modified very slightly for this purpose (see http://www.php.net/fgetcsv). I keep the function in an include file as I use it in multiple scripts. It looks like this:

csv.php

PHP
<?php

define("CSV_Start", 0);
define("CSV_Quoted",   1);
define("CSV_Quoted2",  2);
define("CSV_Unquoted", 3);

function readCSV($fh, $len, $delimiter = ',', $enclosure = '"') {
 $data = Array();
 $fildNr = 0;
 $state = CSV_Start;

 $data[0] = "";
 do {
  if(($line = fgets($fh, $len)) == FALSE) return FALSE;
  for ($ix = 0; $ix < strlen($line); $ix++) {
   if ($line[$ix] == $delimiter) {
    if ($state != CSV_Quoted) {
     $fildNr++;
     $data[$fildNr] = "";
     $state = CSV_Start;
    } else {
     $data[$fildNr] .= $line[$ix];
    }
   } elseif ($line[$ix] == $enclosure) {
    if ($state == CSV_Start) {
     $state = CSV_Quoted;
    } elseif ($state == CSV_Quoted) {
     $state = CSV_Quoted2;
    } elseif ($state == CSV_Quoted2) {
     $data[$fildNr] .= $line[$ix];
     $state = CSV_Quoted;
    } else {
     $data[$fildNr] .= $line[$ix];
    }
   } else {
    $data[$fildNr] .= $line[$ix];
    if ($state == CSV_Quoted2) {
     break;
    } elseif ($state == CSV_Start) {
     $state = CSV_Unquoted;
    }
   }
  }
 } while ($state == CSV_Quoted);

 return $data;
}

?>

Okay, now we'll start testallsensors.php. The first thing we'll do is import the settings from our spreadsheet. We'll start by echoing the settings to the script output so we can verify that they are being imported correctly. (Make sure you replace the URL with the settings URL you generated in Step 2.)

testallsensors.php (stage one)

PHP
<?php

require_once('database_connection.php');
require_once('csv.php');

$handle = fopen("http://spreadsheets.google.com/pub?
key=XXXXXXXXXXXXXXXX&output=csv&gid=2&range=B3:B6", "r");

if(($results = readCSV($handle, 1000)) == FALSE) break;
$settings_recipients = str_replace("\n", "", $results[0]);

if(($results = readCSV($handle, 1000)) == FALSE) break;
$settings_failures = str_replace("\n", "", $results[0]);

if(($results = readCSV($handle, 1000)) == FALSE) break;
$settings_retry_minutes = str_replace("\n", "", $results[0]);

if(($results = readCSV($handle, 1000)) == FALSE) break;
$settings_archive_days = str_replace("\n", "", $results[0]);

fclose($handle);

echo("$settings_recipients|$settings_failures|
$settings_retry_minutes|$settings_archive_days");

?>

If everything went well, you should see the settings you entered into your spreadsheet when you run this script. If it didn't work, check the URL and make sure that your spreadsheet is Published.

Let's go ahead and check out the rest of this script. I'll discuss it in detail below:

testallsensors.php (stage two)

PHP
<?php

require_once('database_connection.php');
require_once('csv.php');

function timestamp() {
 // current time in milliseconds
 return round(microtime(true) * 1000);
}

// Get the maintenance settings
$handle = fopen("http://spreadsheets.google.com/pub?
key=XXXXXXXXXXXX&output=csv&gid=2&range=B3:B6", "r");

if(($results = readCSV($handle, 1000)) == FALSE) break;
$settings_recipients = str_replace("\n", "", $results[0]);

if(($results = readCSV($handle, 1000)) == FALSE) break;
$settings_failures = str_replace("\n", "", $results[0]);

if(($results = readCSV($handle, 1000)) == FALSE) break;
$settings_retry_minutes = str_replace("\n", "", $results[0]);

if(($results = readCSV($handle, 1000)) == FALSE) break;
$settings_archive_days = str_replace("\n", "", $results[0]);

fclose($handle);

// The back end could serve multiple spreadsheets each
// containing their own list of sensors
// if you want to do this, add each additional sensor list to this array:

$handle_list = array (
 "http://spreadsheets.google.com/pub?key=XXXXXXXXXXXXXXX&output=csv&gid=0&range=C3:F99"
);

$retries = 0;
$retry = false;

// we keep looping until we are sure a sensor has not gone down
// (i.e. a sensor that was previously up has not failed $settings_failures times).
do {
 $any_failures = false;
 for($handle_index=0 ; $handle_index<count($handle_list) ; $handle_index++) {
 
  $handle = fopen($handle_list[$handle_index], "r");
  
  while(($results = readCSV($handle, 1000)) != FALSE) {
 
   // go through each sensor on the list, test it, record it,
   // and send out a notification if necessary
   $sensor_sensorID = intval($results[0]);
   $sensor_name = trim($results[1]);
   $sensor_url = trim($results[2]);
   $sensor_emails = trim($results[3]);
 
   if(!$sensor_sensorID || $sensor_name=="" || $sensor_url=="" ||
	$sensor_emails=="") continue;

   // if it's a retry, then find out if this sensor failed last time
   $sensor_failed = false;
   if($retry) {
   
    $sql = "SELECT status_code FROM sensor_log WHERE
		sensorID=$sensor_sensorID ORDER BY created_date DESC LIMIT 1;";
 
    $result = mysql_query($sql, $database_connection) or die(mysql_error());
    $row = mysql_fetch_assoc($result);
    $sensor_failed = (intval($row['status_code'])==200) ? false : true;
    mysql_free_result($result);
   
   }

   if(($retry && $sensor_failed) || !$retry) {
   
    $start = timestamp();
    $http_response_header = null; // clear headers left over from a previous request
    $response = @ file_get_contents($sensor_url);
    if(isset($http_response_header[0])) {
     list($version,$status_code,$msg) = explode(' ',$http_response_header[0], 3);
    } else {
     $status_code = 0; // no HTTP response at all (e.g. DNS failure or timeout)
    }
    $end = timestamp();
    $difference = $end-$start;

    // check to see if this sensor has had previous trouble
    $sql = "SELECT status_code FROM sensor_log WHERE sensorID=
		$sensor_sensorID ORDER BY ID DESC LIMIT $settings_failures";
    $dbtest = mysql_query($sql, $database_connection) or die(mysql_error());
    $dbtest_totalrows = mysql_num_rows($dbtest);
    $error_count = $settings_failures-$dbtest_totalrows;
    for($i=0 ; $i<$dbtest_totalrows ; $i++) {
     $row = mysql_fetch_assoc($dbtest);
     if($row['status_code'] != 200) $error_count++;
    }

    // add this last test to the database
    $sql = "INSERT INTO sensor_log (sensorID, status_code, lag)
		VALUES ($sensor_sensorID, $status_code, $difference)";
    $dbtest = mysql_query($sql, $database_connection) or die(mysql_error());
    // if this is the first good reading in awhile, then the server just went back up.
    if($error_count == $settings_failures && $status_code == 200) {
     // sensor has come back up after being down
     $subject =  "$sensor_name is back up!";
     $msg = "The server went back up!\n\n";
     ini_set('sendmail_from', 'nopreply@yoururl.com');
     $to = $sensor_emails; // separate email addresses with commas
     $mailheaders = "From: Server Monitor <nopreply@yoururl.com> \n";
     mail($to, $subject, $msg, $mailheaders);
    }
    $any_failures = ($status_code != 200 && $error_count
		< $settings_failures) ? true : $any_failures;

    if($retries == $settings_failures && $status_code != 200) {
     // send the email notification
     $subject =  "$sensor_name has gone down!";
     $msg = "The server went down!\n\n";
     ini_set('sendmail_from', 'nopreply@yoururl.com');
     $to = $sensor_emails; // separate email addresses with commas
     $mailheaders = "From: Server Monitor <noreply@yoururl.com> \n";
     mail($to, $subject, $msg, $mailheaders);
    }
   }
  }
  fclose($handle);
 }
 if($any_failures) {
  $retries++;
  sleep(abs($settings_retry_minutes * 60));
  $retry = true;
 } else {
  $retry = false;
 }
} while ($retry);
?>

Okay, there are a few things that need explaining here. I'll describe in plain language what's going on:

First the script reads the settings, as we reviewed in testallsensors.php (stage one).

The outermost loop is the retry loop. If any sensor that wasn't failing previously fails (i.e. returns a status code other than 200), this loop continues until the sensor is either good again or the script has exhausted its number of retries (the number of retries, and the length of time the script sleeps between each retry, are determined by our settings). Each time a test is made, the lag time and the response code are INSERTed into the sensor_log table.

The loop nested inside the retry loop goes through each sensor list (if there is more than one). I built the back end so it could serve more than one list of sensors (i.e. lists from multiple spreadsheets); I do this by looping through an array of sensor list URLs (the example has just one URL, but you can add others). As it reads each sensor from the list, the script calls it, gets the result and times the response (what I refer to as lag).

I built it so one can have multiple sensor lists for organizational purposes; I thought there was a pretty good chance that people would want to create separate spreadsheets for different customers, etc. There's a lot to be said for having only one list, though (one place to go to view the status of all of your sensors), and even with one sensor list you can, of course, create as many reports as you want for any sensorID on any number of spreadsheets. Just remember never to reuse a sensorID (at least not with the same back end and database). If you use the same sensorID on any sensor list twice, with the same back end/database, their data will get mixed together.

Within the inner loop, we run the sensor test. If a sensor comes back up after previously being down, an "Up" notification goes out; if a sensor returns $settings_failures failures after previously being up, a "Down" notification goes out. And whatever happens, the result and the lag are inserted into the database with the current timestamp and the sensorID.

At the end of the loop, we check whether any sensor is currently failing (by which I mean it has gone down but has not yet returned $settings_failures failures). If so, the process sleeps for $settings_retry_minutes minutes before trying again.

Run a few tests with the script and make sure that it's properly reading the information from your spreadsheet, that it's conducting its tests, and that it's INSERTing the test results into the database. Finally, simulate one of your sensors going down (you can do this by simply renaming a sensor temporarily so the script can't find it and gets a 404 status code) and make sure you get the notifications as the script detects the error. It's possible that your tests will time out on your webserver because of the sleep commands. This won't be an issue later, as the cron will run the script outside the webserver, but the timeouts can make testing difficult. If you get webserver timeouts, try temporarily shortening the $settings_retry_minutes and/or $settings_failures values for your tests, or extend your server's timeout.

Important Note: If you edit this code, make sure you don't cause the script to go into an endless loop. Eventually the cron will launch a new instance of this script every 15 minutes, and if the script isn't terminating correctly, you could make your testing server very, very unhappy.

I'm using LunarPages for hosting and they have a very easy to use option in their control panel for creating cron jobs, but every shared hosting service we use has a similar facility. Usually you have to call the script by passing it to the PHP interpreter (e.g. php /path/to/script/testallsensors.php). I find running the script every 15 minutes is perfect; more often than that isn't much more useful, and it may get your hosting service upset with you. If you have issues creating a cron job, call your hosting service and they'll surely help you out.
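If your host lets you edit the crontab directly instead of using a control panel, the two entries might look something like this (the paths are placeholders, and the 6 a.m. time for the daily report described later is just an example):

*/15 * * * * php /path/to/script/testallsensors.php
0 6 * * * php /path/to/script/daily.php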

When you are satisfied the script is working correctly, we'll move on to displaying the current status for all the sensors in our spreadsheet.

Step 5: Current Server Status

To display the current server status, we first need to pull results from the database and populate a new worksheet with them. We do this by having the worksheet execute an importData function that calls a script; the script returns CSV values, which the function uses to populate the worksheet. We'll start by creating the script that returns the CSV data:

sensorsummary.php

PHP
<?php
require_once('database_connection.php');
require_once('csv.php');
// We get the list of sensors from the spreadsheet
$spreadsheet_sensorID = array();
$spreadsheet_name = array();

$handle = fopen("http://spreadsheets.google.com/pub?
key=XXXXXXXXXXXXXXXX&output=csv&gid=0&range=C3:C99", "r");
while(($results = readCSV($handle, 1000)) != FALSE) {
 array_push($spreadsheet_sensorID, intval($results[0]));
}

$sql = "SELECT sensorID, lag, status_code FROM sensor_log
	WHERE ID IN (SELECT MAX(ID) AS lastID FROM sensor_log
	WHERE sensorID IN (" . join(",", $spreadsheet_sensorID) . ")
	GROUP BY sensorID);";

$result = mysql_query($sql, $database_connection) or die(mysql_error());
$result_totalrows = mysql_num_rows($result);
for($i=0 ; $i<$result_totalrows ; $i++) {
 $row = mysql_fetch_assoc($result);
 echo "" . $row['sensorID'] . "," . $row['lag'] . "," .
	(($row['status_code']==200) ? (200) : (0)) . "\n";
}
mysql_free_result($result);
?>

Simply put, this script looks at all of the sensors on a sensor list and returns their latest status (again as CSV data, which I used because it was easiest).

If you have more than one spreadsheet/sensor list using your back end, you will either have to create one of these scripts for each of your spreadsheets, or pass something to this script that identifies the sensor list URL it should use.
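If you go the parameter route, a minimal sketch might look like the following; the list query string parameter and the second spreadsheet URL are hypothetical, and the rest of the script would continue exactly as in sensorsummary.php above:

PHP
<?php
// Hypothetical: choose which published sensor list to read via ?list=N
$sensor_lists = array(
 0 => "http://spreadsheets.google.com/pub?key=XXXXXXXXXXXXXXXX&output=csv&gid=0&range=C3:C99",
 1 => "http://spreadsheets.google.com/pub?key=YYYYYYYYYYYYYYYY&output=csv&gid=0&range=C3:C99"
);

$list_index = isset($_GET['list']) ? intval($_GET['list']) : 0;
if(!isset($sensor_lists[$list_index])) {
 echo "error";
 exit();
}

$handle = fopen($sensor_lists[$list_index], "r");
// ... read the sensor IDs with readCSV() and query the database exactly as in sensorsummary.php
?>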

Once you have tested the script and made sure it is indeed outputting the sensor status data correctly, you can go ahead and import the data into your spreadsheet. Go to your spreadsheet and create a worksheet called Sensor Status Data. The worksheet should look like this:

In the cell A2, insert the following function:

=importData("http://www.yourserver.com/sensors/sensorsummary.php?temp=" & INT(NOW()/TIME(0;10;0)))

The temp value appended to the end of the URL changes every 10 minutes, which helps keep the imported data fairly current (NOW() returns the current date and time as a number of days, TIME(0;10;0) is ten minutes expressed as a fraction of a day, so INT() of their quotient increases by one every ten minutes). Please understand that Google can't poll your script every few seconds, nor would that be a good idea anyway (that's a LOT of traffic). Normally the spreadsheet refreshes imported data at variable frequencies, presumably depending on how busy Google's servers are. There is a certain amount of lag time, especially during heavy traffic periods, but you can force an update if you really need the spreadsheet to be as current as possible, and you can call testallsensors.php yourself if you need to re-poll the servers being tested. (To force the data to reload manually, edit the cell containing the function and change its contents slightly; usually I just add a space to the end.)

I also made a second version of this script that pulls in some more details and arranges the data in columns rather than rows. It is very close to sensorsummary.php, but I'll include it here for convenience's sake; some of the graphs and gadgets available in Google Spreadsheets require the data to be organized like this:

sensorsummary_detailed.php

PHP
<?php
require_once('database_connection.php');
require_once('csv.php');
$current_ID = array();
$current_lag = array();
$current_code = array();
$historical_ID = array();
$historical_lag = array();
// We get the list of sensors from the spreadsheet
$spreadsheet_sensorID = array();
$spreadsheet_name = array();

$handle = fopen("http://spreadsheets.google.com/pub?
key=XXXXXXXXXXXXXXXXXXXX&output=csv&gid=0&range=C3:D99", "r");

while(($results = readCSV($handle, 1000)) != FALSE) {
 array_push($spreadsheet_sensorID, intval($results[0]));
 array_push($spreadsheet_name, str_replace("\n", "", $results[1]));
}

// We get the historical average so we have something to compare against
$sql = "SELECT sensorID, FLOOR(AVG(lag)) AS average_lag, AVG(status_code)
AS average_code, HOUR(created_date) AS hourgroup FROM `sensor_log`
WHERE HOUR(NOW())=HOUR(created_date) GROUP BY sensorID, hourgroup;";
$result = mysql_query($sql, $database_connection) or die(mysql_error());
$result_totalrows = mysql_num_rows($result);

for($i=0 ; $i<$result_totalrows ; $i++) {
 $row = mysql_fetch_assoc($result);
 array_push($historical_lag, $row['average_lag']);
 array_push($historical_ID, $row['sensorID']);
}

// We get the average for the last hour
$sql = "SELECT sensorID, FLOOR(AVG(lag)) AS average_lag, AVG(status_code)
AS average_code FROM `sensor_log` WHERE (TIMEDIFF(NOW(), created_date) <
TIME('1:00:00')) GROUP BY sensorID;";
$result = mysql_query($sql, $database_connection) or die(mysql_error());
$result_totalrows = mysql_num_rows($result);

for($i=0 ; $i<$result_totalrows ; $i++) {
 $row = mysql_fetch_assoc($result);
 // build arrays of the results, so we can turn them on their side
 array_push($current_ID, $row['sensorID']);
 array_push($current_lag, $row['average_lag']);
 array_push($current_code, (($row['average_code']==200) ? (200) : (0)));
}

// write out a row of sensor names
for($j=0 ; $j<count($spreadsheet_sensorID) ; $j++) {
 for($i=0 ; $i<count($current_ID) ; $i++)
  if($spreadsheet_sensorID[$j] == $current_ID[$i])
   echo "\"{$spreadsheet_name[$j]} ({$current_ID[$i]})\"";
 if($j<count($spreadsheet_sensorID)-1) echo ",";
}

echo "\n";
// write out a row of current lag times
for($j=0 ; $j<count($spreadsheet_sensorID) ; $j++) {
 for($i=0 ; $i<count($current_ID) ; $i++)
  if($spreadsheet_sensorID[$j] == $current_ID[$i])
   echo "\"{$current_lag[$j]}\"";
 if($j<count($spreadsheet_sensorID)-1) echo ",";
}

echo "\n";
// write out a row of average lag times for the same hour
for($j=0 ; $j<count($spreadsheet_sensorID) ; $j++) {
 for($i=0 ; $i<count($historical_ID) ; $i++)
  if($spreadsheet_sensorID[$j] == $historical_ID[$i])
   echo "\"{$historical_lag[$j]}\"";
 if($j<count($spreadsheet_sensorID)-1) echo ",";
}

echo "\n";

// write out a row of status codes (200 = entire hour is good, 0 = error)
for($j=0 ; $j<count($spreadsheet_sensorID) ; $j++) {
 for($i=0 ; $i<count($current_ID) ; $i++)
  if($spreadsheet_sensorID[$j] == $current_ID[$i])
   echo "\"{$current_code[$j]}\"";
 if($j<count($spreadsheet_sensorID)-1) echo ",";
}

echo "\n";
mysql_free_result($result);
?>

Colouring Our Data

This is useful stuff; there's nothing like having problems stand out in red. Google Docs Spreadsheet has a very easy mechanism for colouring your cells based on rules. In my spreadsheet, I took all of the values, including the values in the worksheets that contain the imported data, and made them so that they changed colour based on their value. This makes it really easy to spot trouble.

Colouring Lag Values

Ideally I'd like to base the colours on tolerances within what would be considered normal for a specific sensor, but in my example I used a coarse scale that I apply to all lag values (the lag is measured in milliseconds):

  1. < 500 ms = good (green)
  2. 500 - 1000 ms = medium (yellow)
  3. > 1000 ms = bad (red)

Please note that some kinds of sensors are going to naturally take longer to return a result than others. For instance, a "web sensor" doesn't have to have any server side code at all, whereas a "database sensor" needs to open a connection to the database server, run a query and inspect the results.

The diagram on the right shows the exact settings I used. Before you create your rules, make sure you select the entire column (or at least from row 2 to the bottom of the worksheet) that you intend to apply them to.

Colouring Errors

Errors are easier to colour because there are only two states: error (red) and no error (green). On the front worksheet (the sensor list), beside each sensor in the first column, I made a "light" by pulling in the error value from the worksheet that the sensor status is imported into (conveniently, the values come back in the same order that the sensors appear on the list). I then hide the value itself by making the colouring rule set the text colour to match the background colour, so all you see is a bright green or red cell.
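As a concrete example of one of these "lights" (the exact cell references are an assumption about your layout): if your first sensor sits on row 3 of the Sensor List worksheet, the imported status data starts on row 2 of the Sensor Status Data worksheet, and the status code is in its column C, then cell A3 of the Sensor List could simply contain:

='Sensor Status Data'!C2

with one colour rule making the cell green (text and background) when the value is 200, and another making it red otherwise.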

Okay! We've come a long way now. We have the sensors being tested, notifications being sent out, data being stored, and the results coloured with current status lights beside each of the sensors on our sensor list. Now we just need to create useful reports on individual sensors, daily sensor reports, and just to be thorough, we're going to archive/compress our old data.

Creating our Detailed History Report

History reports give us a bigger picture of a sensor's status and let us see in finer detail what went wrong and when. When I first started this project, I wasn't quite sure how I was going to create a history report (which requires multiple worksheets) for each sensor. It soon occurred to me that if I created an adaptable report, where I could change one setting and have the report populate itself with the data from any sensor, I would save the end user (in this case, me!) a lot of work (and if we ever need to send someone a copy of a sensor history report, we can always "hard wire" a copy of the report for that specific use).

The way I chose to do this was by creating a cell (which I colour green) on the Sensor and Report Settings worksheet where the user enters the sensor ID that they want to report on (if someone can think of a way to do this with some kind of select box or something, I'd like to hear from you). For convenience's sake, I also chose this page to display a list of the sensors and their hourly averages (the list shows the current hour compared to the same hour's recent historical average) so that the user has the sensor list handy.

When the sensor ID is changed, two other worksheets populate themselves by calling two scripts that return 24 hour and 10 day historical data for the sensor indicated in the green box. Usually the data appears within a few seconds of changing the sensor ID in this cell, because changing the cell value alters the URLs that the data is read from, and that typically triggers a [nearly] immediate update.

Then, finally, I have a fourth worksheet (titled the "Sensor Report") that shows two graphs based on the historical data for the sensor ID entered. The 24 hour graph shows actual readings, whereas the 10 day graph shows hourly averages for that period.

I pulled in the data by entering the following formulas in the A2 cells on the 24 Hour Error Trend Data and the 10 Day Error Trend Data worksheets:

For the 24 Hour Error Trend Data worksheet:
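(The screenshot of the original formula isn't reproduced here; the following is a sketch that follows the same importData pattern as the status worksheet, assuming the selected sensor ID is read from cell B1 of the Sensor and Report Settings worksheet and that you append the same cache-busting temp value.)

=importData("http://www.yourserver.com/sensors/24hours.php?sensorID=" & 'Sensor and Report Settings'!B1 & "&temp=" & INT(NOW()/TIME(0;10;0)))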

For the 10 Day Error Trend Data worksheet:
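(Again a sketch under the same assumptions, this time calling the ten-day script.)

=importData("http://www.yourserver.com/sensors/10days.php?sensorID=" & 'Sensor and Report Settings'!B1 & "&temp=" & INT(NOW()/TIME(0;10;0)))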

Graphing the Data

I used the Interactive Time Series graphs for this report. I found they were a good way of allowing the end user to examine any part of the data easily. Please note that these graphs will not be able to display anything until enough data has been collected. You can [patiently] wait until there's enough data, or you can generate some test data in the database if you are feeling particularly impatient.

Use the following for your Range in the graph's settings:

'24 Hour Error Trend Data'!A2:C130

and

'10 Day Error Trend Data'!A2:C250

(Note that the end of these two ranges can't go beyond the last row that actually exists in these two worksheets. I padded the worksheets with extra rows because, at least with the 24 hour data, you can't know exactly how many readings there will be: bad sensor readings generate extra follow-up readings to verify the trouble wasn't just a temporary network fluctuation, plus you may have triggered several test readings yourself.)

And the following are the two PHP scripts that are called to pull in the data:

24hours.php

PHP
<?php
require_once('database_connection.php');
$sensorID = 0;
if(isset($_GET['sensorID'])) {
 $sensorID = intval($_GET['sensorID']);
}
if(isset($_POST['sensorID'])) {
 $sensorID = intval($_POST['sensorID']);
}

if($sensorID == 0) {
 echo "error";
 exit();
}

$sql = "SELECT lag, YEAR(created_date) AS yeargroup, MONTH(created_date)
AS monthgroup, DAY(created_date) AS daygroup, HOUR(created_date) AS hourgroup,
MINUTE(created_date) AS minutegroup, status_code, TIMEDIFF(NOW(), created_date)
AS boo FROM sensor_log WHERE (TIMEDIFF(NOW(), created_date) < TIME('24:00:00'))
AND sensorID=$sensorID ORDER BY created_date;";
$result = mysql_query($sql, $database_connection) or die(mysql_error());
$result_totalrows = mysql_num_rows($result);

for($i=0 ; $i<$result_totalrows ; $i++) {
 $row = mysql_fetch_assoc($result);
 $minutes = sprintf("%02d", $row['minutegroup']);
 echo "" . $row['monthgroup'] . "/" . $row['daygroup'] . "/" .
$row['yeargroup'] . " " . $row['hourgroup'] . ":" . $minutes . "," .
$row['lag'] . "," . (($row['status_code']==200) ? (200) : (0)) . "\n";
}

mysql_free_result($result);
?>

10days.php

PHP
<?php

require_once('database_connection.php');

$sensorID = 0;
if(isset($_GET['sensorID'])) {
 $sensorID = intval($_GET['sensorID']);
} else if(isset($_POST['sensorID'])) {
 $sensorID = intval($_POST['sensorID']);
}

if($sensorID == 0) {
 echo "error";
 exit();
}

$sql = "SELECT FLOOR(AVG(lag)) AS average_lag, YEAR(created_date) AS yeargroup,
MONTH(created_date) AS monthgroup, DAY(created_date) AS daygroup,
HOUR(created_date) AS hourgroup, AVG(status_code) AS average_code
FROM sensor_log WHERE (TIMEDIFF(NOW(), created_date) < TIME('240:00:00'))
AND sensorID=$sensorID
GROUP BY yeargroup, monthgroup, daygroup, hourgroup ORDER BY created_date;";
$result = mysql_query($sql, $database_connection) or die(mysql_error());
$result_totalrows = mysql_num_rows($result);

for($i=0 ; $i<$result_totalrows ; $i++) {
 $row = mysql_fetch_assoc($result);
 echo "" . $row['monthgroup'] . "/" . $row['daygroup'] . "/" . $row['yeargroup'] . " " .
$row['hourgroup'] . ":00," . $row['average_lag'] . "," .
(($row['average_code']==200) ? (200) : (0)) . "\n";
}

mysql_free_result($result);

?>

Creating the Daily Report and Maintaining Our Database

We're going to take care of both of these tasks with one script that we'll have the cron call every 24 hours.

The Daily Report

The daily report is emailed to us first thing in the day so that we can see at a glance how our servers have been doing over the last 24 hours. The report I made is quite simple: it shows the sensor list, along with each sensor's uptime and average lag, for the most recent 24 hours. For each value shown, I use a small function that calculates an appropriate RGB colour.

"Archiving" Old Data

Also, as it would be unnecessary (or even excessive) to keep every reading in perpetuity, we take old data (I use anything more than two weeks old), average the values for each day being archived, and place the averaged lag/downtime values in another table (sensor_log_archive), deleting the detailed rows as we go. This makes that portion of your database nearly 100 times smaller than it would be otherwise.

Here is the PHP script that I created for the report. Remember that you will have to create a cronjob that will call the script once a day.

daily.php

PHP
<?php
/*
 This script does two things.
 
 1) sends out the daily email report
 2) archives old data
 
 This script must be called by the cron daily
*/
require_once('database_connection.php');
require_once('csv.php');
// Get the maintenance settings
$handle = fopen("http://spreadsheets.google.com/pub?
key=XXXXXXXXXXXXXXXXXXX&output=csv&gid=3&range=B3:B6", "r");
if(($results = readCSV($handle, 1000)) == FALSE) break;
$settings_recipients = str_replace("\n", "", $results[0]);

if(($results = readCSV($handle, 1000)) == FALSE) break;
$settings_failures = str_replace("\n", "", $results[0]);
if(($results = readCSV($handle, 1000)) == FALSE) break;
$settings_retry_minutes = str_replace("\n", "", $results[0]);
if(($results = readCSV($handle, 1000)) == FALSE) break;
$settings_archive_days = str_replace("\n", "", $results[0]);

// create a set of 2 dimensional arrays holding lag/down time
// for all sensors on all archivable dates
// (grouped by sensorID and date)
$compressed_sensor_readings = array();
$compressed_cummulativelag = array(); // Cumulative lag time
$compressed_downtime = array();

$first_bad = array();
$last_code = array();

$sql = "SELECT sensorID, lag, status_code, created_date, DATE(created_date)
AS group_date FROM sensor_log WHERE created_date < DATE_SUB(CURRENT_DATE(),
INTERVAL $settings_archive_days DAY) ORDER BY sensorID, created_date";
$result = mysql_query($sql, $database_connection) or die(mysql_error());
$result_totalrows = mysql_num_rows($result);

for($i=0 ; $i<$result_totalrows ; $i++) {
 $row = mysql_fetch_assoc($result);
 $compressed_sensor_readings[$row['sensorID']][$row['group_date']]++;
 $compressed_cummulativelag[$row['sensorID']][$row['group_date']]+=$row['lag'];

 // check to see if up or down; and calculate downtime
 if($row['status_code'] == 200) {
  $compressed_downtime[$row['sensorID']][$row['group_date']] +=
(isset($last_code[$row['sensorID']][$row['group_date']]) &&
$last_code[$row['sensorID']][$row['group_date']] != 200) ?
(strtotime($row['created_date']) - $first_bad[$row['sensorID']][$row['group_date']]) : 0;
  $last_code[$row['sensorID']][$row['group_date']] = 200;
  $first_bad[$row['sensorID']][$row['group_date']] = null;
 } else {
  $first_bad[$row['sensorID']][$row['group_date']] =
(!isset($last_code[$row['sensorID']][$row['group_date']]) ) ?
strtotime($row['created_date']) : $first_bad[$row['sensorID']][$row['group_date']];
  $first_bad[$row['sensorID']][$row['group_date']] =
($last_code[$row['sensorID']][$row['group_date']] == 200) ?
strtotime($row['created_date']) : $first_bad[$row['sensorID']][$row['group_date']];
  $last_code[$row['sensorID']][$row['group_date']] = 0;
 }
}

mysql_free_result($result);
// Insert archived data into archive table
while (list($current_sensorID, $value) = each($compressed_sensor_readings)) {
 while (list($current_sensor_date, $value2) = each($value)) {
  $sql = "INSERT INTO sensor_log_archive (sensorID, average_lag,
downtime, sensor_date) VALUES ($current_sensorID," . intval
($compressed_cummulativelag[$current_sensorID][$current_sensor_date]/
$compressed_sensor_readings[$current_sensorID][$current_sensor_date]) .
",{$compressed_downtime[$current_sensorID][$current_sensor_date]},
'$current_sensor_date');\n";
  $result = mysql_query($sql, $database_connection) or die(mysql_error());
  mysql_free_result($result);
 }
}

// Delete pre-archived data from log
$sql = "DELETE FROM sensor_log WHERE created_date < DATE_SUB(CURRENT_DATE(),
INTERVAL $settings_archive_days DAY);";
$result = mysql_query($sql, $database_connection) or die(mysql_error());
mysql_free_result($result);

function percentage_to_color($p){
 $red = $p<50 ? 255 : round(256 - ($p-50)*5.12);
 $green = $p>50 ? 255 : round(($p)*5.12);
 return sprintf("%02X%02X00", $red, $green);
}

function lag_to_color($l){
 $p = 100 - ($l/12.5);
 $p = ($p<0) ? 0 : $p;

 return percentage_to_color($p);
}

// We get the list of sensors from the spreadsheet
$spreadsheet_sensorID = array();
$spreadsheet_name = array();
$spreadsheet_url = array();

$handle = fopen("http://spreadsheets.google.com/pub?
key=XXXXXXXXXXXXXXXXXX&output=csv&gid=0&range=C3:E99", "r");

while(($results = readCSV($handle, 1000)) != FALSE) {

 array_push($spreadsheet_sensorID, intval($results[0]));
 array_push($spreadsheet_name, str_replace("\n", "", $results[1]));
 array_push($spreadsheet_url, str_replace("\n", "", $results[2]));
}

// Summarize the data from yesterday
$summary_sensor_readings = array();
$summary_lag = array();
$summary_good_codes = array();

$sql = "SELECT sensorID, lag, status_code FROM sensor_log
WHERE DAY(created_date) = DAY(DATE_SUB(CURRENT_DATE(), INTERVAL 1 DAY))
ORDER BY sensorID, created_date;";

$result = mysql_query($sql, $database_connection) or die(mysql_error());
$result_totalrows = mysql_num_rows($result);

for($i=0 ; $i<$result_totalrows ; $i++) {
 $row = mysql_fetch_assoc($result);
 if(intval($row['sensorID']) > 0) {
  $summary_sensor_readings[intval($row['sensorID'])]++;
  $summary_lag[intval($row['sensorID'])] += $row['lag'];
  if($row['status_code'] == 200) $summary_good_codes[intval($row['sensorID'])]++;
 }
}

mysql_free_result($result);
// create the daily email report.
$emailcontent = "";
$yesterday=date('l jS \of F, Y', time()-86400);
$emailcontent .= "<p>Monitoring Report for: $yesterday</p>";
$emailcontent .= "<table>";
$emailcontent .= "<tr><th>Sensor Tested</th><th>Average Lag</th><th>Uptime</th></tr>";
for($i=0 ; $i<count($spreadsheet_sensorID) ; $i++) {
  $average_lag = intval($summary_lag[$spreadsheet_sensorID[$i]] /
$summary_sensor_readings[$spreadsheet_sensorID[$i]]);
  $percentage_up = sprintf("%.2f", ($summary_good_codes[$spreadsheet_sensorID[$i]] /
$summary_sensor_readings[$spreadsheet_sensorID[$i]])*100 );
  

 $emailcontent .= "<tr><td><a href=\"{$spreadsheet_url[$i]}\">
{$spreadsheet_name[$i]} ({$spreadsheet_sensorID[$i]})</a></td><td style=\"background: #"
. lag_to_color(intval($average_lag)) . ";\">{$average_lag}</td><td style=\"background: #"
. percentage_to_color(intval($percentage_up)) . ";\">{$percentage_up}%</td></tr>\n";
}
$emailcontent .= "</table>";

$email = "<html><body>\n" . $emailcontent . "\n</body></html>";

ini_set('sendmail_from', 'nopreply@yourdomain.com');
   
$mailheaders = "From: Server Monitor <noreply@yourdomain.com> \n";
$mailheaders .= "Content-type: text/html; charset=iso-8859-1\n";
   
mail($settings_recipients, "Server Monitor Report", $email, $mailheaders);

?>

Remember, to add a new sensor all you have to do is upload a sensor script to the server you are monitoring and add one line to the sensor list. The application will see the published list and start calling the new sensor.

I have several little extra features I've added to my spreadsheet (e.g. a call that tells the user the next unused sensorID) that I'd be happy to share with people (I just don't want to turn this article into a book - grin).

Happy Days are Here Again!

Oh, Happy Customers! Now you are the first to know when a website goes down. What's more, you've got charts and graphs to show your customers the great service they are getting and to demonstrate the diligence you show on their behalf. And you have historical data that will give you a better idea of how well your servers are performing, as well as data you can use when working with your providers to diagnose issues, identify bottlenecks and improve service [where needed].

If you like this post, and you'd like to see some more projects along these lines, drop me a line (especially if you have ideas you'd like to contribute). If there is enough interest, I will consider doing a series on using Google Spreadsheets as front ends to other types of reports and monitoring webapps.


  1. By "live data", I refer to data that is connected to an external data source, and contains the most current information available.
  2. A worksheet is like a page within your spreadsheet. You can select between, and create, worksheets using the tabs at the bottom of your spreadsheet.

License

This article, along with any associated source code and files, is licensed under The Code Project Open License (CPOL)