Backing Up Files from a Webpage

With so many hosting companies now reselling other resellers' services, hosting plans offer less and less functionality, and few allow SSH or Telnet access. This makes backing up directories difficult or impossible.

Recently I sold a website and offered to transfer the 6,000-page site to the new host. But without shell access I could not unzip the archive on the new server, so I was forced to unzip it on my own server and FTP the files one by one to the new site. After 8 hours of being cut off every 5 minutes by the server's FTP timeout, the files were finally transferred. When I sold the site I expected the transfer to be a 15-minute job.

So how do you back up and save files when you cannot run the tar command from any command-line interface?

You can use a simple Perl program to execute system commands like tar and gzip directly from a webpage. This allows complete website backups and compression of large data files.
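
The article's script is Perl, but the underlying idea is just a CGI program that sends a header and then runs tar. As a minimal sketch of that idea in shell form (all paths here are illustrative placeholders, and the sketch creates its own demo data so it can run anywhere):

```shell
#!/bin/sh
# Minimal sketch of a "run tar from a webpage" CGI, in shell form.
# All paths are illustrative; the real script described here is Perl.
echo "Content-Type: text/html"
echo ""
# Demo data so the sketch is self-contained:
mkdir -p /tmp/demo_site
echo "<html>hi</html>" > /tmp/demo_site/index.html
# Archive the directory; the output file sits outside the tree being tarred.
/bin/tar -cpzf /tmp/demo_backup.tar.gz -C /tmp demo_site
echo "Backup written to /tmp/demo_backup.tar.gz"
```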

You will need access to one directory that is not included in the backup, and that directory must be chmod 0777 (web-writable). You will also need to know the complete paths to your files.

The targz.cgi script can live in the directory being zipped, but the backup file you are creating must be outside that directory.
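
Keeping the archive outside the directory matters because otherwise tar would encounter its own growing output file while reading the tree. A quick sketch with illustrative paths:

```shell
# Sketch: the archive lands one level up, outside the html/ tree being
# tarred, so tar never tries to archive its own output file.
mkdir -p /tmp/site_b/html
echo "page" > /tmp/site_b/html/index.html
tar -cpzf /tmp/site_b/backup.tar.gz -C /tmp/site_b html
tar -tzf /tmp/site_b/backup.tar.gz   # lists html/ and html/index.html
```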

The program does not need any configuration: just upload it in text mode, chmod it 0755, and access it with any web browser.

Get the targz.cgi script.

Save the text file and change the extension to .cgi or .pl, depending on your server.

Upload it to a directory on your website.

chmod the script 0755 (the directory the backup is written to must be 0777, as noted above).

Then just access the program with your browser and zip up your entire website.

The program is simple. There is really just one line of code you need to understand to write your own.

open (TAR, "/bin/tar -cpzf $FORM{'backupfile'} $FORM{'backupdir'}/* |");

Opening the command with a trailing | runs it through a pipe and reads its output, instead of using the usual system `tar file1 file2` call. The system command will not work from a webpage on most servers, so this simple trick makes zipping and unzipping files with Perl very easy.
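
One subtlety worth knowing about the command string itself: the shell glob in $FORM{'backupdir'}/* quietly skips dot-files such as .htaccess, while tarring the directory name itself catches everything. A sketch with illustrative paths:

```shell
# The glob dir/* expands only to non-hidden entries, so dot-files are
# silently left out of the archive; naming the directory includes them.
mkdir -p /tmp/glob_demo/site
echo "deny" > /tmp/glob_demo/site/.htaccess
echo "home" > /tmp/glob_demo/site/index.html
tar -cpzf /tmp/glob_demo/glob.tar.gz /tmp/glob_demo/site/* 2>/dev/null  # misses .htaccess
tar -cpzf /tmp/glob_demo/dir.tar.gz -C /tmp/glob_demo site              # includes it
```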

By changing the program line to:

open (TAR, "/bin/tar -zxf $FORM{'backupfile'} |");

You can unzip the file just as easily as you zipped it up. Keep in mind that the file will unzip into the same directory the Perl program is running from.

Since you are only running the tar command, you can use all the switches and file parameters in your code just as you would with any command-line tar invocation.
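
For reference, here are the switches used in this article run directly from the shell, exactly as you would place them in the open() command string (paths are illustrative). The -C switch is the one to reach for when you want the files to land somewhere other than the current directory:

```shell
# A quick tour of the tar switches used in this article.
mkdir -p /tmp/tour/src
echo "a" > /tmp/tour/src/a.txt
echo "b" > /tmp/tour/src/b.txt
tar -cpzf /tmp/tour/backup.tar.gz -C /tmp/tour src       # -c create, -z gzip, -p keep permissions
tar -tzf  /tmp/tour/backup.tar.gz                        # -t list the archive contents
mkdir -p /tmp/tour/restore
tar -xzf  /tmp/tour/backup.tar.gz -C /tmp/tour/restore   # -x extract, -C choose target directory
ls /tmp/tour/restore/src
```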

A simple unzip script would be something like this:

#!/usr/bin/perl
print "Content-Type: text/html\n\n";
open (TAR, "/bin/tar -zxf $file_to_unzip |");
close (TAR);
print "$file_to_unzip has been unzipped";

You can expand the script, as I did with targz.cgi, by using a form to supply $file_to_unzip, or just hard-code it in the script when you need it.

When a program can be written in 5 lines, it is usually easier to write it when needed than to develop a more elaborate application with hundreds of lines, form parsing, and every variable you can imagine you might need.

I have included a basic unzip program if you just need a quick and easy script to unzip a file you have backed up.

Get the unzip script unziptar.cgi.

The script is simple: change the extension from .txt to .cgi or .pl, upload it to the same directory you want to unzip the file into, chmod it 0755, and access it with any browser.

Type in the path to the file you want to unzip.

Since this site is more about teaching people how to do things, I have separated the zip and unzip into two different programs so you can see exactly what is happening. But you could combine them into one utility that manages all your zipping and unzipping needs.