One of my customers has a production environment in GCP. We helped them migrate from on-premises to GCP, and we set up automated backups for their Cloud SQL for MySQL 5.7 instances. Since the Cloud SQL backup uses instance snapshots and only stores/rotates 7 days of recovery points, they needed another method to keep MySQL backups with a longer retention.
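
You can see the rotating recovery points that Cloud SQL currently keeps for an instance from the Cloud SDK. A quick sketch, assuming your instance is named DB_INSTANCE_NAME:

gcloud sql backups list --instance=DB_INSTANCE_NAME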

We then used the components below to create an automated SQL-file export from the Cloud SQL instances and store it on Cloud Storage.

  • Cloud Scheduler
    The scheduler that runs the task
  • Cloud Pub/Sub
    The payload source that triggers the automated export process
  • Cloud Functions
    The function that runs the export process using the Cloud SQL Admin API
  • Cloud Storage
    The storage target for the exported files from the automated process
  • Cloud IAM
    Permission and service account management for the related processes/tools
  • Cloud SQL Admin API
    Make sure this is enabled, so the Cloud Function is allowed to run the export

How to

  • Enable the Cloud SQL Admin API
  • Create a Cloud Storage bucket. In this case, we use the Nearline storage class, since the purpose is to store backups. You can also set a lifecycle policy on the objects in the bucket, e.g. automatically archive or delete all objects older than 3 months.
  • Get the Cloud SQL instance's service account and grant it write access to the bucket (Storage Legacy Bucket Writer).
  • Create a new IAM service account with the Cloud SQL Admin role
  • Create a new Pub/Sub topic, e.g. named "Backup-Payload"
  • Create a Cloud Function
    • Name: The function name
    • Region: The geographic location where you want the Cloud Function to run
    • Memory: The amount of memory you want to allocate for the Cloud Function. I chose the smallest one
    • Trigger: The method that triggers the Cloud Function; in this case, the newly created Pub/Sub topic
    • Runtime: The runtime for the function; in this example we use Node.js 10
    • Source code: Use the inline editor and paste in the code below
    • Function to execute: The exported function name from the code (initiateBackup)
    • Service account: The service account for the Cloud Function that was created before
  • Create a Cloud Scheduler job
    • Name: The schedule name
    • Frequency: The schedule in cron format. In this case, we set the backup to run at 01:00 in the morning (0 1 * * *)
    • Target: We chose Pub/Sub
    • Topic: The Pub/Sub topic that we created before
    • Payload: The JSON content describing the project, the database instance to export, and the storage target. Find the payload below. The same steps can also be done from the CLI; see the gcloud sketch after this list.
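
For reference, here is a rough CLI sketch of the same setup using gcloud and gsutil. This is a minimal outline, not the exact setup we ran; the bucket, service account, and file names (my-sql-backup-bucket, sql-backup-sa, lifecycle.json) are placeholders I made up for this sketch, so adjust them to your environment:

# Enable the Cloud SQL Admin API
gcloud services enable sqladmin.googleapis.com

# Create a Nearline bucket and a lifecycle rule that deletes
# objects older than 90 days (~3 months)
gsutil mb -c nearline gs://my-sql-backup-bucket
cat > lifecycle.json <<'EOF'
{"rule": [{"action": {"type": "Delete"}, "condition": {"age": 90}}]}
EOF
gsutil lifecycle set lifecycle.json gs://my-sql-backup-bucket

# Find the Cloud SQL instance's service account and let it write to the bucket
SQL_SA=$(gcloud sql instances describe DB_INSTANCE_NAME \
  --format='value(serviceAccountEmailAddress)')
gsutil iam ch serviceAccount:${SQL_SA}:roles/storage.legacyBucketWriter \
  gs://my-sql-backup-bucket

# Service account for the Cloud Function, with the Cloud SQL Admin role
gcloud iam service-accounts create sql-backup-sa
gcloud projects add-iam-policy-binding PROJECT_ID \
  --member="serviceAccount:sql-backup-sa@PROJECT_ID.iam.gserviceaccount.com" \
  --role="roles/cloudsql.admin"

# Pub/Sub topic that Cloud Scheduler will publish to
gcloud pubsub topics create Backup-Payload

# Deploy the function (index.js and package.json in the current directory)
gcloud functions deploy initiateBackup \
  --runtime nodejs10 \
  --memory 128MB \
  --trigger-topic Backup-Payload \
  --service-account sql-backup-sa@PROJECT_ID.iam.gserviceaccount.com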

Payload for Scheduler

{"project": "PROJECT_ID", "database": "DB_INSTANCE_NAME", "bucket": "gs://bucket-names"}

The Function (Updated)

The function converts the current timestamp to DD-MM-YYYY format and uses it in the export file name.

const { google } = require('googleapis')
const { auth } = require('google-auth-library')
const sqladmin = google.sqladmin('v1beta4')

/**
 * Triggered from a Pub/Sub topic.
 * 
 * The input must be as follows:
 * {
 *   "project": "PROJECT_ID",
 *   "database": "DATABASE_NAME",
 *   "bucket": "BUCKET_NAME_WITH_OPTIONAL_PATH_WITHOUT_TRAILING_SLASH"
 * }
 *
 * @param {!Object} event Event payload
 * @param {!Object} context Metadata for the event
 */

exports.initiateBackup = async (event, context) => {
        // Build the date inside the handler so a warm (reused) function
        // instance does not keep the stale date computed at cold start
        const today = new Date()
        const dd = String(today.getDate()).padStart(2, '0')
        const mm = String(today.getMonth() + 1).padStart(2, '0')
        const yyyy = today.getFullYear()

        // Cloud Scheduler publishes the JSON payload base64-encoded in event.data
        const pubsubMessage = JSON.parse(Buffer.from(event.data, 'base64').toString())
        const authRes = await auth.getApplicationDefault()
        const request = {
                auth: authRes.credential,
                project: pubsubMessage['project'],
                instance: pubsubMessage['database'],
                resource: {
                        exportContext: {
                                kind: 'sql#exportContext',
                                fileType: 'SQL',
                                // the .gz suffix tells Cloud SQL to compress the dump
                                uri: pubsubMessage['bucket'] + '/backup-' + dd + '-' + mm + '-' + yyyy + '.gz'
                        }
                }
        }
        // Await the API call so the function does not finish (and possibly get
        // frozen) before the export request has actually been sent
        try {
                const res = await sqladmin.instances.export(request)
                console.info(res.data)
        } catch (err) {
                console.error(err)
        }
}

package.json
{
        "name": "cloudsql-backups",
        "version": "1.0.0",
        "dependencies": {
                "googleapis": "^45.0.0",
                "google-auth-library": "3.1.2"
        }
}
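
To test the whole chain without waiting for the schedule, you can publish the same payload to the topic by hand and then check the export operation and the bucket. A rough sketch:

gcloud pubsub topics publish Backup-Payload \
  --message='{"project": "PROJECT_ID", "database": "DB_INSTANCE_NAME", "bucket": "gs://bucket-names"}'

# Watch the export operation on the instance
gcloud sql operations list --instance=DB_INSTANCE_NAME --limit=5

# Confirm the compressed dump landed in the bucket
gsutil ls gs://bucket-names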

Reference
https://revolgy.com/blog/how-to-automated-long-term-cloud-sql-backups-step-by-step-guide/
