This post is about using rsync to copy a Linux/Unix system to another host, or to back it up. rsync is more effective than a tool like dd, since we can select exactly which directories and attributes we want to back up or move.

Sometimes ignorance is a blessing

The one who is not so wise

rsync can also be used while the system is running, but do so with caution and make sure you understand the state of the data: changes that have not yet been committed at the file level may not be transferred.

This method works for migrating the system to another host. Be aware that the target should be a freshly installed operating system of the same version as the source.

Rsync full backup

# rsync -aAXHv --numeric-ids --info=progress2 --exclude={"/dev/*","/proc/*","/sys/*","/tmp/*","/run/*","/mnt/*","/media/*","/lost+found"} / /path/to/backup
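
For the migration case described above, the same command can push the system to the target host over SSH. This is a minimal sketch; target-host is a placeholder for the freshly installed destination, reachable as root:

# rsync -aAXHv --numeric-ids --info=progress2 -e ssh --exclude={"/dev/*","/proc/*","/sys/*","/tmp/*","/run/*","/mnt/*","/media/*","/lost+found"} / root@target-host:/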

Rsync Clone

rsync -qaHAXS [SOURCE_DIR] [DESTINATION_DIR]
  • --numeric-ids disables mapping of user and group names; numeric user and group IDs are transferred instead. This is useful when backing up over SSH or when using a live system to back up a different system's disk.
  • --info=progress2 shows the overall progress and transfer speed instead of the list of files being transferred.
  • -x/--one-file-system avoids crossing filesystem boundaries when recursing. This prevents backing up any mount point inside the hierarchy.
  • -n (--dry-run) simulates the file transfers without changing anything, as shown in the sketch after this list.
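
A minimal dry-run sketch combining the options above with the full-backup command; the destination path is the same placeholder as before, so nothing is written until -n is removed:

# rsync -aAXHvn --one-file-system --numeric-ids --info=progress2 --exclude={"/dev/*","/proc/*","/sys/*","/tmp/*","/run/*","/mnt/*","/media/*","/lost+found"} / /path/to/backup

Once the listed transfers look right, drop the -n and run the real backup.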


Automated Cloud SQL export to Cloud Storage (Updated)

One of my customers has a production environment in GCP; we helped them migrate from on-premises to GCP and set up automated backups for their Cloud SQL for MySQL 5.7 instances. Since the Cloud SQL backup uses instance snapshots and only stores/rotates seven days of recovery points, they needed another method to store MySQL backups with longer retention.

We then used the components below to create an automated SQL export from the Cloud SQL instances and store it in Cloud Storage.

  • Cloud Scheduler
    The scheduler that runs the task
  • Cloud Pub/Sub
    Payload source that triggers the automated export process
  • Cloud Functions
    The function that runs the export process using the Cloud SQL Admin API
  • Cloud Storage
    The storage target for the files exported by the automated process
  • Cloud IAM
    Permission and service account management for the related process/tools
  • Cloud SQL Admin API
    Make sure this is enabled so that the Cloud Function can run the export

How to

  • Enable the Cloud SQL Admin API.
  • Create a Cloud Storage bucket. In this case we use the Nearline storage class, since the purpose is to store backups. You can also set a lifecycle policy on the objects in the bucket, e.g. all objects older than three months are automatically archived or deleted.
  • Get the Cloud SQL instance's service account and grant it write access to the bucket (Storage/Bucket Writer).
  • Create a new IAM service account with a role that grants access to the Cloud SQL Admin API.
  • Create a new Pub/Sub topic, e.g. named "Backup-Payload".
  • Create a Cloud Function
    • Name: the function name
    • Region: the geographic location where you want the Cloud Function to run
    • Memory: the amount of memory to allocate to the Cloud Function. I chose the smallest one.
    • Trigger: the method that triggers the Cloud Function, in this case the newly created Pub/Sub topic
    • Runtime: the runtime; in this example the function targets Node.js 10
    • Source code: use the inline editor and paste the function from the code below
    • Function to execute: the function name from the code (initiateBackup)
    • Service account: the service account created before
  • Create a Cloud Scheduler job
    • Name: the schedule name
    • Frequency: the schedule in cron format. In this case we set the backup to run at 01:00 in the morning (0 1 * * *).
    • Target: we chose Pub/Sub
    • Topic: the Pub/Sub topic we created before
    • Payload: the JSON describing the project, the database instance to export, and the storage target. Find the payload at the end; a gcloud equivalent of these steps is sketched after the payload.

Payload for Scheduler

{"project": "PROJECT_ID", "database": "DB_INSTANCE_NAME", "bucket": "gs://bucket-names"}

The Function (Updated)

The timestamp is converted to DD-MM-YYYY format and appended to the name of the exported file.

const { google } = require('googleapis')
const { auth } = require('google-auth-library')
const sqladmin = google.sqladmin('v1beta4')

/**
 * Triggered from a Pub/Sub topic.
 * 
 * The input must be as follows:
 * {
 *   "project": "PROJECT_ID",
 *   "database": "DATABASE_NAME",
 *   "bucket": "BUCKET_NAME_WITH_OPTIONAL_PATH_WITHOUT_TRAILING_SLASH"
 * }
 *
 * @param {!Object} event Event payload
 * @param {!Object} context Metadata for the event
 */

exports.initiateBackup = async (event, context) => {
        // Build the DD-MM-YYYY date string at invocation time (not at module load),
        // so warm function instances do not reuse a stale date.
        const today = new Date()
        const dd = String(today.getDate()).padStart(2, '0')
        const mm = String(today.getMonth() + 1).padStart(2, '0')
        const yyyy = today.getFullYear()

        const pubsubMessage = JSON.parse(Buffer.from(event.data, 'base64').toString())
        const authRes = await auth.getApplicationDefault()
        const request = {
                auth: authRes.credential,
                project: pubsubMessage['project'],
                instance: pubsubMessage['database'],
                resource: {
                        exportContext: {
                                kind: 'sql#exportContext',
                                fileType: 'SQL',
                                uri: pubsubMessage['bucket'] + '/backup-' +dd+'-'+mm+'-'+yyyy + '.gz'
                        }
                }
        }
        sqladmin.instances.export(request, (err, res) => {
                if (err) console.error(err)
                if (res) console.info(res)
        })
}

package.json
{
        "name": "cloudsql-backups",
        "version": "1.0.0",
        "dependencies": {
                "googleapis": "^45.0.0",
                "google-auth-library": "3.1.2"
        }
}
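
If you prefer deploying from the command line rather than the inline editor, a rough sketch with gcloud follows; REGION and SA_EMAIL are placeholders, the deploy is run from the directory containing index.js and package.json, and the last command simply publishes a test payload to trigger one export:

gcloud functions deploy initiateBackup \
    --runtime=nodejs10 \
    --trigger-topic=Backup-Payload \
    --region=REGION \
    --service-account=SA_EMAIL

gcloud pubsub topics publish Backup-Payload \
    --message='{"project": "PROJECT_ID", "database": "DB_INSTANCE_NAME", "bucket": "gs://bucket-names"}'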

Reference
https://revolgy.com/blog/how-to-automated-long-term-cloud-sql-backups-step-by-step-guide/


Postfix Command Cheatsheet

A list of helpful commands that I use when maintaining a Postfix mail server.

Hemat Pangkal Kaya (thrift is the root of wealth)

Insignificant one

List the mail queue and MAIL_IDs
postqueue -p
mailq

Check the postfix version
postconf mail_version

Get default value set on the postfix config
postconf -d

Get non default value set on the postfix config
postconf -n

Flush the mail queue
postfix flush

Attempt to deliver all queued mail
postqueue -f

Requeue all emails stuck in the queue and attempt delivery
postsuper -r ALL && postqueue -f

Read email from mail queue
postcat -q MAIL_ID

Remove certain mail from mail queue
postsuper -d MAIL_ID

Remove all mail from the queue
postsuper -d ALL

Fast method to remove all deferred mail from the queue
find /var/spool/postfix/deferred/ -type f | xargs -n1 basename | xargs -n1 postsuper -d

Remove all deferred mail from the queue
postsuper -d ALL deferred

Count queued emails grouped by sender ("from") address
postqueue -p | awk '/^[0-9,A-F]/ {print $7}' | sort | uniq -c | sort -n

Remove all email sent to user@dom.ain
postqueue -p|grep '^[A-Z0-9]'|grep user@dom.ain|cut -f1 -d' '|tr -d \*|postsuper -d -

Remove all email sent from user@dom.ain
postqueue -p|awk '/^[0-9,A-F].*user@dom.ain / {print $1}'|cut -d '!' -f 1|postsuper -d -

Remove all email sent from the dom.ain
postqueue -p | grep '^[A-Z0-9]'|grep @dom.ain|cut -f1 -d' ' |tr -d \*|postsuper -d -

Short mail queue summary
postqueue -p | tail -n 1

Number of emails in the mail queue
postqueue -p | grep -c "^[A-Z0-9]"

Fast count of deferred emails in the mail queue
find /var/spool/postfix/deferred -type f | wc -l
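
A small addition of mine, not from the original list: wrap the one-line summary in watch to refresh it every ten seconds
watch -n 10 'postqueue -p | tail -n 1'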

Watch Log Live
tail -f /var/log/maillog

Count and sort successful POP3/IMAP logins (Dovecot)

grep "\-login"  /var/log/dovecot-info.log |grep "Login:"|awk {'print $7'}|sort|uniq -c|sort -n

Count and sort successful Postfix SMTP (SASL) logins. This is useful when you want to track down a compromised account.

grep -i "sasl_username"  /var/log/maillog |awk {'print $9'}|sort|uniq -c|sort -n

Count and sort successful Postfix SMTP logins on an exact date, "May 18"

grep -i "sasl_username"  /var/log/maillog |grep "May 18"|awk {'print $9'}|sort|uniq -c|sort -n

Analyze Postfix Logs

pflogsumm /var/log/maillog | less

You may need to install the pflogsumm package first.
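
Package names vary by distribution; on Debian/Ubuntu it is usually pflogsumm, while on RHEL/CentOS it typically ships in postfix-perl-scripts:

apt-get install pflogsumm
yum install postfix-perl-scripts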