Archive for 'Perl scripts'

I've made a simple Perl backup script. Here are the features:

  • It will tar a directory you specify in a variable.
  • The filename of each archive it creates will start with a prefix you specify in a variable, followed by the date the archive was created. A sample filename is "Michael.Balcos-2013-06-23.tar.gz".
  • The destination directory for the archives can also be set.
  • Backup archives older than a number of days specified in the script will be deleted.

To download the script, click this link: backupScript.txt

Be very careful when setting the $backupDir variable (it sets the destination directory of your backup archives). Any files in that directory which are older than the maximum age specified in the script will be deleted.

If you'd like to use this script, rename the backupScript.txt file to backupScript.pl and do a "chmod 710 backupScript.pl". I recommend running the script as root, so also change its ownership by issuing a "chown root:root backupScript.pl". You can run this script from a cron job, as shown below. So far I've tested this script on Slackware64 14.0, but it should work on other distributions.
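For example, a root crontab entry along these lines would run the backup nightly (the 2:30 AM schedule and the /root/scripts path are just placeholders):

# Run the backup script every day at 2:30 AM (hypothetical schedule and path).
30 2 * * * /usr/bin/perl /root/scripts/backupScript.pl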

#!/usr/bin/perl
@timeStampTmp=localtime();
$timeStampFinal=sprintf("%04d-%02d-%02d", $timeStampTmp[5]+1900,$timeStampTmp[4]+1,$timeStampTmp[3]);
chomp $timeStampFinal;
$maxAgeInDays=2; # Backups older than this value (in days) will be deleted.
$backupDir='/backup'; # This is where your backup archives/tarballs will go.
$dirToBackup='/directory/To/Backup'; # This is the directory to be backed up.
$backupFilePrefix='Michael.Balcos'; # Format of backup filename is "<$backupFilePrefix>-<$timeStampFinal>.tar.gz". Note that the whole absolute path ( except for the leading "/" ) will be stored in your archive/tarball.
$command='tar czpf '.$backupDir.'/'.$backupFilePrefix.'-'.$timeStampFinal.'.tar.gz '.$dirToBackup.' > /dev/null 2>&1';
system($command);
$command='find '.$backupDir.' -maxdepth 1 -type f -mtime +'.$maxAgeInDays.' -exec rm -f {} \;';
system($command);
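If you'd prefer not to shell out to find for the cleanup step, the same age-based deletion can also be done in Perl itself. Here's a minimal sketch of that idea (the directory and age values simply mirror the variables above):

#!/usr/bin/perl
# Sketch of a pure-Perl alternative to the find/rm cleanup step above:
# delete regular files in the backup directory older than the maximum age.
use strict;
use warnings;

my $backupDir    = '/backup';
my $maxAgeInDays = 2;

opendir(my $dh, $backupDir) or die "Cannot open $backupDir: $!";
for my $entry (readdir $dh) {
    my $path = "$backupDir/$entry";
    next unless -f $path;              # regular files only, like "find -type f"
    if (-M $path > $maxAgeInDays) {    # -M gives the file's age in days
        unlink $path or warn "Could not delete $path: $!";
    }
}
closedir($dh);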

Here's a Perl script that runs fping in a loop and generates mtr and traceroute output files whenever a user-specified packet loss threshold in the script is exceeded. Take note that this script requires fping. Here's a link to the script: icmp_check

#!/usr/bin/perl
# To stop this script, do a "touch /tmp/stopFping"
$hostToPing='www.google.com';
$packetLossThreshold=10; # Value is in percent.
$command='touch /tmp/fping.tmp';
system($command);
$command='chmod 666 /tmp/fping.tmp';
system($command);
while(!(-e '/tmp/stopFping')){
    $command='fping -c 20 -q '.$hostToPing.' > /tmp/fping.tmp 2>&1';
    system($command);
    open(INPUTFILE, '/tmp/fping.tmp');
    $packetLoss = <INPUTFILE>; # Read the fping summary line.
    close(INPUTFILE);
    chomp($packetLoss);
    $packetLoss=~s/^.*\ \=\ [0-9]+\/[0-9]+\/([0-9]+)\%\,\ min.*$/$1/; # Extract the %loss value.
    if($packetLoss >= $packetLossThreshold){
        @timeStampTmp=localtime();
        $timeStampFinal=sprintf("%04d-%02d-%02d-%02d%02d", $timeStampTmp[5]+1900,$timeStampTmp[4]+1,$timeStampTmp[3],$timeStampTmp[2],$timeStampTmp[1]);
        chomp $timeStampFinal;
        print "We have excessive packet loss at the moment. mtr and traceroute outputs are being generated\n";
        $command='mtr -c 1 -r '.$hostToPing.' > /tmp/mtr-'.$timeStampFinal.'.txt';
        system($command);
        $command='traceroute '.$hostToPing.' > /tmp/traceroute-'.$timeStampFinal.'.txt';
        system($command);
    }
}
$command='rm /tmp/stopFping';
system($command);
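For reference, the substitution in the loop above expects fping's "-c ... -q" summary line in the usual "xmt/rcv/%loss" form. Here's a quick standalone illustration (the sample line below is made up, and the exact format can vary between fping versions):

#!/usr/bin/perl
# Illustration of the %loss extraction against a made-up fping summary line.
use strict;
use warnings;

my $sample = 'www.google.com : xmt/rcv/%loss = 20/19/5%, min/avg/max = 10.2/11.4/14.8';
(my $loss = $sample) =~ s/^.*\ \=\ [0-9]+\/[0-9]+\/([0-9]+)\%\,\ min.*$/$1/;
print "Parsed packet loss: $loss%\n";   # prints "Parsed packet loss: 5%"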

I think it's a good idea to send MySQL backups to a remote machine because:
1] it is a good safeguard against catastrophic data loss if the MySQL server crashes, and
2] backups are ideally stored on machines designed for backup storage.

So here's my approach to this idea. 🙂
Continue reading...
