Galera Cluster – Simpler way to view GRA file content

Galera Cluster (MySQL from Codership, Percona XtraDB Cluster, MariaDB Galera Cluster) generates GRA log files if it fails to apply a writeset on the target node. These files exist in the MySQL data directory. You can get an overview of the files (if any exist) by listing your MySQL data directory (in my case, the data directory is at /var/lib/mysql):

$ ls -1 /var/lib/mysql | grep GRA

The MySQL Performance Blog has covered this topic in a well-explained post. I'm going to make it simpler. Download the script here and copy it to your /usr/bin directory:

wget -P /usr/bin/
chmod 755 /usr/bin/grareader

Just run the following command to convert a GRA log file to human-readable output:

grareader [gra_log_file]

Here is the example output:

/*!40019 SET @@session.max_insert_delayed_threads=0*/;
/*!50003 SET @[email protected]@COMPLETION_TYPE,COMPLETION_TYPE=0*/;
# at 4
#140114 3:12:42 server id 3 end_log_pos 120 Start: binlog v 4, server v 5.6.15-log created 140114 3:12:42 at startup
# at 120
#140114 3:12:43 server id 3 end_log_pos 143 Stop
# at 143
#140507 14:55:42 server id 4 end_log_pos 126 Query thread_id=3173489 exec_time=0 error_code=0
use `test_shop`/*!*/;
SET TIMESTAMP=1399445742/*!*/;
SET @@session.pseudo_thread_id=3173489/*!*/;
SET @@session.foreign_key_checks=1, @@session.sql_auto_is_null=0, @@session.unique_checks=1, @@session.autocommit=1/*!*/;
SET @@session.sql_mode=0/*!*/;
SET @@session.auto_increment_increment=1, @@session.auto_increment_offset=1/*!*/;
/*!\C utf8 *//*!*/;
SET @@session.character_set_client=33,@@session.collation_connection=33,@@session.collation_server=8/*!*/;
SET @@session.lc_time_names=0/*!*/;
SET @@session.collation_database=DEFAULT/*!*/;
ALTER TABLE `tblreshipment_header` DROP `ShipmentStatus`
# End of log file
ROLLBACK /* added by mysqlbinlog */;
/*!50003 SET [email protected]_COMPLETION_TYPE*/;

You can download the script here or copy and paste the code:

#!/bin/bash
# Convert Galera GRA_* files to human readable output
# Usage: grareader <gra_log_file>
# Example: grareader /var/lib/mysql/GRA_1_1.log

## GRA header file path
path=/tmp
gra_header_path=$path/GRA-Header
download_url=''		# URL of the GRA header file (fill in before use)
input=$1
tmp_path=`mktemp`

get_gra_header() {
        wget_bin=`command -v wget`
        [ -z "$wget_bin" ] && echo 'Error: Unable to locate wget. Please install it first' && exit 1
        echo "Downloading GRA-Header file into $path"
        $wget_bin --quiet $download_url -P $path
        [ $? -ne 0 ] && echo 'Error: Download failed' && exit 1
}

[ -z "$input" ] && echo "Usage: grareader <gra_log_file>" && exit 1
[ ! -e $input ] && echo 'Error: File does not exist' && exit 1

mysqlbinlog_bin=`command -v mysqlbinlog`
[ -z "$mysqlbinlog_bin" ] && echo 'Error: Unable to locate mysqlbinlog binary. Please install it first' && exit 1
[ ! -e $gra_header_path ] && echo 'Error: Unable to locate GRA header file' && get_gra_header

# Prepend the binlog header so mysqlbinlog can parse the GRA file
cat $gra_header_path >> $tmp_path
cat $input >> $tmp_path
echo ''
$mysqlbinlog_bin -v -v -v $tmp_path
echo ''
rm -f $tmp_path


Hope this helps make your Galera administrative tasks simpler!

Convert CSV to JSON using BASH

I have been assigned a task to generate random data in JSON format. I have a big data set ready in CSV (comma-separated values) and would love to convert it to JSON using just BASH. You can copy the following code and save it as an executable script file.

#!/bin/bash
# CSV to JSON converter using BASH
# Usage: ./csv2json input.csv > output.json

[ -z "$1" ] && echo "No CSV input file specified" && exit 1
input=$1
[ ! -e $input ] && echo "Unable to locate $1" && exit 1

read first_line < $input
headings=`echo $first_line | awk -F, {'print NF'}`
lines=`cat $input | wc -l`

# Read the header row into an array of field names
a=0
while [ $a -lt $headings ]
do
        head_array[$a]=$(echo $first_line | awk -v x=$(($a + 1)) -F"," '{print $x}')
        a=$(($a + 1))
done

c=0
echo "{"
while [ $c -lt $lines ]
do
        read each_line
        if [ $c -ne 0 ]; then
                d=0
                echo -n "{"
                while [ $d -lt $headings ]
                do
                        each_element=$(echo $each_line | awk -v y=$(($d + 1)) -F"," '{print $y}')
                        if [ $d -ne $(($headings-1)) ]; then
                                echo -n ${head_array[$d]}":"$each_element","
                        else
                                echo -n ${head_array[$d]}":"$each_element
                        fi
                        d=$(($d + 1))
                done
                if [ $c -eq $(($lines-1)) ]; then
                        echo "}"
                else
                        echo "},"
                fi
        fi
        c=$(($c + 1))
done < $input
echo "}"

To perform the conversion, run the script with the CSV file to convert as the first argument and redirect the output to an output file. Make sure the CSV file contains field names as the header, similar to the example below:

"Farrah Walters","208-72-8449","1386670785"
"Shay Warner","539-53-2690","1386644172"
"Maxine Norton","231-61-5065","1386658663"
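For reference, if the fields contain no embedded commas, the same conversion can be sketched as a single awk program that emits a JSON array. The function name, sample file and field names below are illustrative, not part of the original script:

```shell
# CSV -> JSON array in one awk program (sketch; assumes a header row and
# no commas inside quoted fields)
csv2json() {
    awk -F, '
    NR == 1 { n = split($0, h, ","); next }     # header row becomes the key names
    {
        row = "  {"
        for (i = 1; i <= n; i++)
            row = row "\"" h[i] "\": \"" $i "\"" (i < n ? ", " : "")
        rows[++m] = row "}"
    }
    END {
        print "["
        for (r = 1; r <= m; r++) print rows[r] (r < m ? "," : "")
        print "]"
    }' "$@"
}

# Example run on a two-line sample file:
printf 'name,ssn,epoch\nFarrah Walters,208-72-8449,1386670785\n' > /tmp/sample.csv
csv2json /tmp/sample.csv
# [
#   {"name": "Farrah Walters", "ssn": "208-72-8449", "epoch": "1386670785"}
# ]
```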

Hope this helps others out there! You can download the script here.

CentOS: Install Nagios – The Simple Way

Nagios is the most popular open-source infrastructure monitoring tool. Nagios offers monitoring and alerting for servers, switches, applications, and services. It alerts users when things go wrong and alerts them again when the problem has been resolved.

I have created a script to install Nagios and the Nagios Plugins on RHEL/CentOS:

#!/bin/bash
# Install nagios and nagios plugins in RHEL/CentOS/Fedora

# Installation source (fill in the latest tarball URLs)
installdir=/usr/local/src/nagios
nagios_latest_url=''
nagios_plugin_latest_url=''

# Disable SElinux
sed -i.bak 's#SELINUX=enforcing#SELINUX=disabled#g' /etc/selinux/config
setenforce 0

# Nagios requirements
yum install gd gd-devel httpd php gcc glibc glibc-common make openssl openssl-devel -y

# Installation directory
[ ! -d $installdir ] && mkdir -p $installdir
rm -Rf $installdir/*
cd $installdir
wget $nagios_latest_url
wget $nagios_plugin_latest_url

# Nagios
nagios_package=`ls -1 | grep nagios | grep -v plugin`
tar -xzf $nagios_package
cd nagios
echo "Installing Nagios.."
useradd nagios
./configure
make all
make install
make install-init
make install-commandmode
make install-config
make install-webconf
echo "Create .htpasswd for nagios"
htpasswd -c /usr/local/nagios/etc/htpasswd.users nagiosadmin
cd $installdir

# Nagios Plugins
nagios_plugin_package=`ls -1 | grep nagios-plugin`
tar -xzf $nagios_plugin_package
cd nagios-plugin*
echo "Installing Nagios Plugins.."
./configure --with-nagios-user=nagios --with-nagios-group=nagios
make
make install

echo "Starting Nagios.."
chkconfig nagios on
service nagios start
echo "Starting Apache.."
service httpd restart
chkconfig httpd on

# Configure IPtables
iptables -I INPUT -m tcp -p tcp --dport 80 -j ACCEPT
service iptables save

ip_add=`hostname -I | awk '{print $1}'`
echo "Installation done.."
echo "Connect using browser http://$ip_add/nagios/"
echo "username: nagiosadmin"
echo "password: (nagios password)"

You can download the script directly here:

$ wget

Change the script permission and run the script:

$ chmod +x
$ ./

Once completed, you can open the Nagios page directly at http://<your_ip_address>/nagios and log in with username nagiosadmin and the password you entered during the installation. You should see the Nagios page similar to the screenshot below:




Installation done!


MailMe: Simple Bash to Notify Your Command Status via Email

I always forget to check on my copy or download progress in the server. This gave me the idea to create a script that notifies me via email once the command has executed and completed.

For example, I usually download big installer files, which makes me constantly check the download progress. I just need an alert to be sent to me once the respective command completes, whether it fails or succeeds. Another case is when I am running a big migration: I need to copy the whole /home directory to an external hard disk, and this will take days to complete. Using MailMe will definitely increase my work efficiency. I just need to run the respective command and wait for the notification email. That's all.

1. Install sendmail and mailx using yum. mailx is required; you can use Postfix or any other SMTP server instead of sendmail:

$ yum install sendmail mailx -y

2. Start the sendmail service:

$ chkconfig sendmail on
$ service sendmail start

3. Download the script and integrate it into the environment by placing it under the /usr/local/bin directory:

$ wget -P /usr/local/bin
$ chmod 755 /usr/local/bin/mailme

4. Change the MAILTO value so the script sends the notification to your email automatically. Open the script using a text editor:

$ vim /usr/local/bin/mailme

And change the following line:

Done. Now you can wrap your commands with mailme. Examples as below:

– Download the CentOS 6.3 64bit ISO:

$ mailme 'wget'

– Rsync the whole backup directory to another server:

$ mailme 'rsync -avzP /backup/*.tar.gz root@<server_ip>:/backup'

Once the command has executed successfully, you will get a simple email notification like below:

Subject: MailMe Command Notification:
Command: wget
Date/Time: Mon Oct 1 11:14:54 MYT 2012
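The script itself is essentially a thin wrapper around the mail command. Since the download link is not shown above, here is a minimal sketch of what such a wrapper could look like; this is not the original script, and the MAILTO default, the message format and the fakemail stand-in are assumptions:

```shell
# mailme - run a command, then mail a notification when it finishes (sketch).
# MAILER is overridable so the sketch can be exercised without a working MTA.
MAILTO="${MAILTO:-root@localhost}"
MAILER="${MAILER:-mail -s}"

mailme() {
    [ $# -eq 0 ] && echo "Usage: mailme 'command'" && return 1
    sh -c "$*"
    status=$?
    [ $status -eq 0 ] && result="completed successfully" || result="FAILED (exit code $status)"
    printf 'Command: %s\nDate/Time: %s\nStatus: %s\n' "$*" "$(date)" "$result" |
        $MAILER "MailMe Command Notification" "$MAILTO"
    return $status
}

# Demo: use a stand-in mailer that just prints the message body
fakemail() { cat; }
MAILER=fakemail
mailme 'true'
```

With a real MTA in place, leaving MAILER at its default sends the same message body through mail -s instead of printing it.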

BASH: Some of My Looping Command Collections

Here are several of my BASH command collections related to looping which I use frequently. This list will be updated from time to time, for reference and knowledge base.

1. Copy .htaccess file under /home/website1/public_html to all directories and sub-directories under /home/website2/public_html excluding .svn directories:

cd /home/website2/public_html
for i in $(find -type d | egrep -v '\.svn'); do cp /home/website1/public_html/.htaccess $i; done

2. Rename all files and directories in current path to .bak:

for i in *; do mv $i $i.bak; done

3. Remove .bak extension in all files and directories in current path (undo for command #2):

for i in *; do mv $i $(basename $i .bak); done
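Commands #2 and #3 are inverses of each other; a quick round-trip check in a scratch directory (the file names here are arbitrary):

```shell
# Round-trip: add .bak to every name, then strip it again
dir=$(mktemp -d) && cd "$dir"
touch alpha beta

for i in *; do mv $i $i.bak; done                 # command #2
ls                                                # now: alpha.bak beta.bak

for i in *; do mv $i $(basename $i .bak); done    # command #3
ls                                                # back to: alpha beta
```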

4. Return number of files in each directory and sub-directory:

find -type f -execdir pwd \; | sort | uniq -c

5. Generate 24 files with 10 MB in size under current directory:

for i in $(seq 1 1 24); do dd bs=1024 count=10000 if=/dev/zero of=file.$i; done

6. Generate some random data for database foo and table bar in 3 fields (val1,val2,val3):

mysql foo -e "INSERT INTO bar (val1, val2, val3) VALUES ((SELECT floor(rand() * 10) as randNum), (SELECT floor(rand() * 10) as randNum), (SELECT floor(rand() * 10) as randNum));"


Your shares and opinions are welcome!

cPanel: Auto Backup and Remote Transfer using API + PHP

The good thing about cPanel is that you can generate your own backups automatically using the cPanel API. In this case, we will use PHP running on a schedule to automatically generate a backup and transfer it via FTP or SCP to another server. This can be implemented at the user level, without needing to log in to the cPanel login page.

To integrate with the cPanel API, we need the file xmlapi.php, which we can get from the cPanel GitHub repository. This post is about creating a full backup and transferring it to a remote location via FTP automatically, under user privileges rather than root privileges. Variables as below:

cPanel: WHM 11.30.6 (build 6)
cPanel user: mycp123
cPanel password: Pas$r12cP
Home directory: /home/mycp123

1. Download the PHP API. I will download the zip format to my local desktop and unzip it. The folder contains several files and folders; we just need to upload xmlapi.php into the public_html folder using an FTP client.

2. Log in to cPanel and create the PHP script using File Manager. Go to cPanel > File Manager > Web Root > Go > New File > File Name: cpbackup.php > Create New File.

3. Open the file in text editor mode by right-clicking on it (cpbackup.php) and selecting Code Edit > Edit. The cPanel code editor should open. Copy the following lines and save:

<?php
// Must include cPanel API
include "xmlapi.php";
// Credentials for cPanel account
$source_server_ip = ""; // Server IP or domain name eg: cpanel.domain.tld
$cpanel_account = ""; // cPanel username
$cpanel_password = ""; // cPanel password
// Credentials for FTP remote site
$ftphost = ""; // FTP host IP or domain name
$ftpacct = ""; // FTP account
$ftppass = ""; // FTP password
$email_notify = ''; // Email address for backup notification

$xmlapi = new xmlapi($source_server_ip);
$xmlapi->set_port(2083); // cPanel user-level SSL port
$xmlapi->password_auth($cpanel_account, $cpanel_password);

// Delete any other backup before creating a new backup
$conn_id = ftp_connect($ftphost);
$login_result = ftp_login($conn_id, $ftpacct, $ftppass);
$logs_dir = "/";
ftp_chdir($conn_id, $logs_dir);
$files = ftp_nlist($conn_id, ".");
foreach ($files as $file) {
    ftp_delete($conn_id, $file);
}
ftp_close($conn_id);

// Create the full backup and send it to the FTP server using passive FTP
// (argument order follows the API1 Fileman::fullbackup documentation)
$api_args = array(
    'passiveftp',   // backup destination type
    $ftphost,       // remote server
    $ftpacct,       // remote user
    $ftppass,       // remote password
    $email_notify,  // notification email
    21,             // FTP port
    '/'             // remote directory
);
print $xmlapi->api1_query($cpanel_account, 'Fileman', 'fullbackup', $api_args);

4. Update the credential details in the configuration section at the top of the script and save it.

The script first checks whether any backups exist on the destination server and deletes all of them. Then, through the cPanel API arguments, it tells the backup job to use passive FTP as the transfer mode once the backup is ready.

5. We can execute this task manually by accessing the PHP file via browser. You should see JSON output returned as below if successful:


6. To automate this task, we can simply create a cron job to run it weekly. Go to cPanel > Advanced > Cron jobs and use the following command:

php -q /home/mycp123/public_html/cpbackup.php

Screen shot as below:

Done! You can create many scripts using the cPanel API to automate your repeated tasks.

cPanel: Create Backup and Transfer to Another Server

If you are familiar with administering cPanel, you should know a script/tool called pkgacct. This is the backup tool used by cPanel to create and manage cPanel account backups. Using this tool, we can build a centralized backup server where cPanel account backups are sent over to another server via FTP on a weekly basis.

The following picture shows the architecture of the centralized backup I made:


FTP Server

1. I will store the cPanel backups on a Windows 2008 R2 server and use FileZilla as the FTP server. Download the installer from here and follow the installation wizard, accepting all default values during the installation process.

2. Add the required FTP ports to the Windows Firewall:

Start > Administrative Tools > Windows Firewall with Advanced Security > Inbound Rules > New Rule > Port > Next > under Specific local ports enter this value: 20, 21 > Next > Allow the connection > Next > tick all for Domain, Private, Public > Next > put a name like FileZilla FTP > Finish.

3. Create an FTP user and assign a directory called centralized_backup under the C:\ partition:

FileZilla > Users > Add > enter username and password for respective user. Then go to Shared folders > Add Shared folders > C:\centralized_backup and tick all permissions on files and directories.

4. Make sure you can telnet to port 21 from the cPanel servers:

$ telnet <ftp_server_ip> 21
Connected to <ftp_server_ip>.
Escape character is '^]'.
220-FileZilla FTP server
220 version 0.9.40 beta

5. Set up a scheduled task to delete files older than 30 days from the centralized backup directory:

Start > All Programs > Accessories > System Tools > Task Scheduler > Create Task. Enter following information:

General > Name: Delete_old_backup
General > Security options: SYSTEM
Triggers: Weekly, every Sunday
Actions: Start a program
Actions > Program/scripts: forfiles
Actions > Add arguments (optional):

/p "C:\centralized_backup" /s /m *.tar.gz /d -30 /c "cmd /c del @file"

Click OK to complete the setup. Screenshot of the Actions setup is as below:

cPanel Servers

1. I will use a BASH script to automate the process. On each cPanel server, create a file under the /root/scripts directory:

$ mkdir -p /root/scripts
$ touch /root/scripts/centralbackup

Copy and paste the following contents:

#!/bin/bash
# Generate backups using pkgacct and then transfer them to a central server
# Author: SecaGuy @

# Local server configuration
LHOSTNAME=`hostname`
TEMPPATH=/root/tempbackup

# FTP server configuration (fill in your own values)
CHOST=''		# FTP server IP or hostname
CUSERNAME=''		# FTP username
CPASSWORD=''		# FTP password

# Dont change lines below
FTP=`which ftp`
if [ ! -d /var/cpanel/users ]
then
        echo "cPanel users not found. Aborted!"
        exit 1
else
        eof=`ls /var/cpanel/users | egrep -v '^\..$' | egrep -v '^\...$' | wc -l`
        echo "$eof cPanel user(s) found in this server"
	[ ! -d $TEMPPATH ] && mkdir -p $TEMPPATH || :
        for (( i=1; i<=$eof; i++ ))
        do
                CPUSER=`ls /var/cpanel/users | egrep -v '^\..$' | egrep -v '^\...$' | head -$i | tail -1`
                echo "Creating backup for user $CPUSER.."
                /usr/local/cpanel/scripts/pkgacct $CPUSER $TEMPPATH userbackup
                echo "Backup done. Transferring backup to FTP server.."
                FILENAME=`ls $TEMPPATH | grep tar.gz`
                $FTP -n $CHOST <<END_SCRIPT
quote USER $CUSERNAME
quote PASS $CPASSWORD
mkdir $LHOSTNAME
cd $LHOSTNAME
lcd $TEMPPATH
put $FILENAME
bye
END_SCRIPT
                echo "Removing temporary files.."
                rm -Rf $TEMPPATH/backup-*
                USERDIR=`cat /var/cpanel/users/$CPUSER | grep HOMEDIRPATHS | sed 's/HOMEDIRPATHS=//g'`
                rm -Rf $USERDIR/backup-*
                echo "Backup for $CPUSER complete!"
        done
	echo "Process complete!"
        exit 0
fi

2. Change the permission to executable:

$ chmod 755 /root/scripts/centralbackup

Schedule the Backup

Once all cPanel servers have been set up, I can schedule the backups using a cron job on a weekly basis (every Sunday at 12:00 AM), so I will add the following line to the cron job list:

$ crontab -e

Add following line:

0 0 * * 0 /root/scripts/centralbackup

Save the file and restart the cron daemon:

$ service crond restart

Notes: You can use cpbackup-exclude.conf, as described on the cPanel documentation page, to exclude certain files or directories from this backup.

Create MySQL Database Backup Every Half an Hour

Our company has launched an online contest for our dedicated clients, and we are collecting some really important information from them in order for them to join the contest. My boss wants me to create a database backup every half an hour to reduce the chance of data loss to the minimum possible.

I have created a simple BASH script to accomplish this. The script creates a MySQL database backup and stores it in a folder named by date. Only the latest 3 days of backups are kept; older backup folders are removed.

1. Create the script in the desired directory. In this case, I will use /home/scripts :

$ mkdir /home/scripts
$ touch /home/scripts/mysqlbackup_30min

2. Using your favourite text editor, paste the following into the script /home/scripts/mysqlbackup_30min :

#!/bin/bash
# Script to create a MySQL backup every half an hour
# Configuration values
mysql_host='localhost'
mysql_username=''		#fill in your MySQL credentials
mysql_password=''
mysql_database='contest_db'
backup_path='/backup/mysql'
expired=3			#how many days before the backup directory will be removed
today=`date +%Y-%m-%d`
if [ ! -d $backup_path/$today ]; then
        mkdir -p $backup_path/$today
fi
/usr/bin/mysqldump -h $mysql_host -u $mysql_username -p$mysql_password $mysql_database > $backup_path/$today/$mysql_database-`date +%H%M`.sql
# Remove folders which are more than $expired days old
find $backup_path -type d -mtime +$expired | xargs rm -Rf
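The pruning step can be exercised in isolation before trusting it with real backups. A sketch using a throwaway directory (the old folder name is made up, and -mindepth 1 plus xargs -r are added here as safety nets for the demo):

```shell
# Simulate one old and one current backup folder, then prune
backup_path=$(mktemp -d)
expired=3

mkdir -p "$backup_path/2011-11-10" "$backup_path/$(date +%Y-%m-%d)"
touch -d '5 days ago' "$backup_path/2011-11-10"   # pretend this folder is 5 days old

# Same idea as the find command in the script above
find "$backup_path" -mindepth 1 -type d -mtime +$expired | xargs -r rm -Rf

ls "$backup_path"   # only today's folder remains
```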

3. Change the configuration values at the top of the script to suit your environment.


4. Create a cron job to execute this task every half an hour. Open /var/spool/cron/root and add the following line:

*/30 * * * * /bin/sh /home/scripts/mysqlbackup_30min

5. Restart cron daemon:

$ service crond restart

Done. You should see something like below after one day:

$ tree /backup/mysql
/backup/mysql
|-- 2011-11-14
|   |-- contest_db-1930.sql
|   |-- contest_db-2000.sql
|   |-- contest_db-2030.sql
|   |-- contest_db-2100.sql
|   |-- contest_db-2130.sql
|   |-- contest_db-2200.sql
|   |-- contest_db-2230.sql
|   |-- contest_db-2300.sql
|   `-- contest_db-2330.sql
`-- 2011-11-15
    |-- contest_db-0030.sql
    |-- contest_db-0100.sql
    |-- contest_db-0130.sql
    |-- contest_db-0200.sql
    |-- contest_db-0230.sql
    |-- contest_db-0300.sql
    |-- contest_db-0330.sql
    |-- contest_db-0400.sql
    |-- contest_db-0430.sql
    |-- contest_db-0500.sql
    |-- contest_db-0530.sql
    |-- contest_db-0600.sql
    |-- contest_db-0630.sql
    |-- contest_db-0700.sql
    |-- contest_db-0730.sql
    |-- contest_db-0800.sql
    |-- contest_db-0830.sql
    |-- contest_db-0900.sql
    |-- contest_db-0930.sql
    |-- contest_db-1000.sql
    |-- contest_db-1030.sql
    |-- contest_db-1100.sql
    |-- contest_db-1130.sql
    `-- contest_db-1200.sql

Bash Script – Delete Comments from a C program

I wrote a bash script to delete comments from a C program. In C, a comment is enclosed between /* and */. Example as below:

MQLONG  Reason;      /* Qualifying reason      */
MQOD    ObjDesc = {MQOD_DEFAULT}; /* Object descriptor      */
MQLONG  OpenOptions; /* Options control MQOPEN *//*----------------------------------------- */
   /* Initialize the Object Descriptor (MQOD)  */
   /* control block.  (The remaining fields    */
   /* are already initialized.)                */
   strncpy( ObjDesc.ObjectName,
            MQ_Q_NAME_LENGTH );

This bash script clears out whatever characters appear between these two comment delimiters. I am using sed, a stream editor, which performs basic text transformations on an input stream (a file or input from a pipeline), with some help from regex. Script as below:

#!/bin/bash
# Bash script to delete comments from a C program

Usage="Usage: {script name} {target file}"

if [ $# -eq 0 ]; then                   # if no argument specified
        echo $Usage                     # print Usage string value
        exit 1
fi

until [ $# -eq 0 ]
do
        case $1 in
                -h) echo $Usage         # print Usage string value if argument is -h
                    exit 0;;
                *) FILE=$1;;            # declare the next argument as `FILE`
        esac
        shift
done

if [ ! -f $FILE ]; then                         # if the file does not exist
        echo "'$FILE' does not exist"           # print the missing file name
        exit 1                                  # terminate the program
else
        sed -i '/\*/s/\/\*.*\*\/$//' $FILE              # remove the comments using sed
        echo "Comments for $FILE have been removed"     # print the status
fi
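The sed expression at the heart of the script can be tried on a single line first: for lines containing an asterisk, it strips a /* ... */ comment anchored at the end of the line (the sample input is taken from the example above):

```shell
# Strip a trailing C comment from one line of input
echo 'MQLONG  Reason; /* Qualifying reason */' | sed '/\*/s/\/\*.*\*\/$//'
# prints: MQLONG  Reason;  (plus the whitespace that preceded the comment)
```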

If the script is saved as comment_remover under the root directory and the target file is /home/user1/program.c , you can execute the script as follows (make sure the script is executable):

$ /root/comment_remover /home/user1/program.c
Comments for /home/user1/program.c have been removed

It will turn the example above into:

MQLONG  Reason;
MQLONG  OpenOptions;
   strncpy( ObjDesc.ObjectName,
            MQ_Q_NAME_LENGTH );

I hope this will help some other people out there. Happy scripting!

Linux: Kill Process based on Start Time (STIME)

One of the servers I am working with has some infinitely running PHP processes. Due to an incorrect cron setup by the development team, the processes hang and never end properly. According to them, these processes can be killed if they still hang after 12 hours.

Every process running on the server has a start time (STIME). You can check this using the ps command. In this case, the following result appears:

$ ps aux | grep php
root      1399  0.0  0.0  61188   740 pts/2    S+   10:10   0:00 grep php
user1  2697  0.0  0.0 100664  8340 ?        Ss   Jul04   0:00 /usr/local/bin/php /home/user1/cron/sync2server.php
user1  5551  0.0  0.4 171052 78832 ?        Ss   Jun25   0:00 /usr/local/bin/php /home/user1/cron/sync2server.php
user1  9913  0.0  0.5 174636 82392 ?        Ss   Jun22   0:00 /usr/local/bin/php /home/user1/cron/sync2server.php
user1 11961  0.0  0.7 223276 131060 ?       Ss   May25   0:00 /usr/local/bin/php /home/user1/cron/sync2server.php
user1 16455  0.0  0.4 171564 79420 ?        Ss   Jun24   0:01 /usr/local/bin/php /home/user1/cron/sync2server.php
user1 17474  0.0  0.5 182060 90016 ?        Ss   Jun18   0:00 /usr/local/bin/php /home/user1/cron/sync2server.php
user1 20094  0.0  0.6 206636 114588 ?       Ss   Jun03   0:00 /usr/local/bin/php /home/user1/cron/sync2server.php
user1 22555  0.0  0.7 213548 121476 ?       Ss   May30   0:00 /usr/local/bin/php /home/user1/cron/sync2server.php
user1 24670  0.0  0.7 214572 122320 ?       Ss   May30   0:00 /usr/local/bin/php /home/user1/cron/sync2server.php
user1 28200  0.0  0.7 220204 127988 ?       Ss   May26   0:00 /usr/local/bin/php /home/user1/cron/sync2server.php
user1 30832  0.0  0.4 170284 78168 ?        Ss   Jun25   0:00 /usr/local/bin/php /home/user1/cron/sync2server.php
user1 30837  0.0  0.4 170114 88508 ?        Ss   23:20   0:00 /usr/local/bin/php /home/user1/cron/sync2server.php
user1 30848  0.0  0.4 120439 80770 ?        Ss   12:20   0:00 /usr/local/bin/php /home/user1/cron/sync2server.php
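The post continues with the kill logic; the idea can be sketched using ps's elapsed-time field (etimes, in seconds), which is easier to compare than STIME, whose format changes after 24 hours. The helper below and its sample input are illustrative; the 12-hour threshold comes from the development team's rule:

```shell
# Find PIDs of sync2server.php processes running longer than 12 hours
threshold=$((12 * 3600))

old_pids() {
    # reads "PID ETIMES COMMAND" lines on stdin, prints PIDs over the threshold
    while read pid etimes args; do
        case "$args" in
            *sync2server.php*)
                [ "$etimes" -gt "$threshold" ] && echo "$pid" ;;
        esac
    done
    return 0
}

# Live usage would be:  ps -eo pid=,etimes=,args= | old_pids | xargs -r kill
# Demo on captured sample lines:
printf '%s\n' \
  '2697 50000 /usr/local/bin/php /home/user1/cron/sync2server.php' \
  '1399 10 grep php' \
  '5551 99999 /usr/local/bin/php /home/user1/cron/sync2server.php' | old_pids
# prints: 2697 and 5551
```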

Continue reading “Linux: Kill Process based on Start Time (STIME)” »

cPanel – Remove FrontPage for All Accounts

The FrontPage Extension in cPanel is considered deprecated, and many security holes have been reported in it. Microsoft discontinued FrontPage extension support for the Unix platform at the end of 2006. It is a good idea to remove this extension, which is sometimes installed without your knowledge.

There are many ways to remove the FrontPage extension, but cPanel already has a built-in script for this: /scripts/unsetupfp4. The following BASH script has been tested on cPanel 11.28.93 running on CentOS 5.5. It detects users from the /var/cpanel/users directory and searches for the _vti_pvt directory; if found, cPanel's FrontPage uninstaller is executed against the domain name found in .htaccess.

Let's do this. 1. First, create a new file using a text editor; I will use nano:

[root@server ~]# nano /root/removefp

2. Copy and paste the following script:

#!/bin/bash
# Remove Frontpage Extension for all accounts in cPanel server

USERDIR=/var/cpanel/users

read -p "Are you sure you want to remove the FP extension for all domains? <y/N> " prompt
if [[ $prompt == "y" || $prompt == "Y" || $prompt == "yes" || $prompt == "Yes" ]]
then
        u=`ls -l $USERDIR | egrep '^-' | wc -l`
        for (( i=1; i<=$u; i++ ))
        do
                user=`ls -l $USERDIR | egrep '^-' | awk {'print $9'} | head -$i | tail -1`
                echo "Checking user $user FrontPage status.."
                homedir=`cat $USERDIR/$user | grep HOMEDIRPATHS | sed 's/HOMEDIRPATHS=//'`
                if [ -d $homedir/public_html/_vti_pvt ]; then
                        domain=`cat $homedir/public_html/.htaccess | grep AuthName | awk {'print $2'}`
                        echo "FrontPage found. Removing FrontPage for $domain.."
                        /scripts/unsetupfp4 $domain
                else
                        echo "FrontPage not found for $user"
                fi
        done
        echo "Process completed"
fi
exit 0

(Press ‘Ctrl-X’ then ‘Y’ then ‘Enter’ to save and exit from editor)

Continue reading “cPanel – Remove FrontPage for All Accounts” »

MySQL Daily Backup and Transfer to other Server

When you have 2 MySQL servers which are not running in replication or a cluster, it is recommended to have a MySQL backup running daily. This helps with fast restoration, reliable data backup and disaster recovery.

I have created a bash script to run daily and make sure the data is saved to local disk with another copy transferred to another server via rsync. You can use the following script and change the values to suit your environment. I am using the following variables:

OS: CentOS 5.6 64bit
Backup user: mysql_backup
Backup user password: l3tsb4ckup
Backup path: /backup

Before we use the script, it is good to have a dedicated user to run the backup. Now let's start configuring Server1:

1. Create specific user to run the backup:

[root@centos ~] useradd mysql_backup
[root@centos ~] passwd mysql_backup

(note: enter the password twice as prompted)

2. Assign that user to a specific location inside your server:

[root@centos ~] usermod -d /backup mysql_backup

3. Test the user environment using su and create an SSH key so this user can have a password-less connection to the other server:

[root@centos ~] su - mysql_backup
-bash-3.2$ mkdir ~/.ssh/
-bash-3.2$ ssh-keygen -t dsa

(note: just press Enter at all prompts)

4. Now log in to Server2 as root and repeat steps 1 to 3. Once done, continue with the step below on both servers to transfer the SSH key for the mysql_backup user. This allows password-less connections between them:

-bash-3.2$ cat ~/.ssh/id_dsa.pub | ssh mysql_backup@<server_ip> "cat -- >> ~/.ssh/authorized_keys"

(note: enter mysql_backup's password, which is l3tsb4ckup)

Continue reading “MySQL Daily Backup and Transfer to other Server” »