How To: Custom Backup and Restore using Oracle Flashback Database


Step 1 Configure the Database

These Oracle Database prerequisites are required for Flashback Database. Verification queries follow the list.

  • The REF database must be in ARCHIVELOG mode.
  • SYSDBA or SYSBACKUP privileges on the REF database are required to perform flashback operations.
    • The database must be shut down and restarted during flashback, which requires SYSDBA or SYSBACKUP.
    • Check with your Security team to determine an approved means of SYSDBA or SYSBACKUP database access.
    • Possible options:
      • Grant DATICAL_USER the SYSDBA or SYSBACKUP privilege on the REF database.
      • Use a third-party database management tool that already has the privileges required to perform flashback.
      • Access the database server from the CI server via ssh and execute a script as the oracle user.
  • The database listener must have a statically defined entry for the REF database in the listener.ora file.
    • This is required so that DATICAL_USER can connect to the database from the build server while the database is closed.
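
  To confirm the current state before making any changes, you can run a couple of standard data dictionary queries on the REF database (a minimal sketch; run as a privileged user):

    check current database state
    sqlplus / as sysdba
    -- LOG_MODE should read ARCHIVELOG once step 1 below is complete.
    select log_mode, flashback_on from v$database;
    -- List any restore points that already exist.
    select name, guarantee_flashback_database, time from v$restore_point;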


  1. Enable archive logging on the REF database.
    setup db for flashback
    -- One-time operation.  ARCHIVELOG mode is required before guaranteed restore points can be created.
    sqlplus / as sysdba
    shutdown immediate
    startup mount
    alter database archivelog;
    alter database open;
  2. Create a static listener registration for the REF database

    https://www.funoracleapps.com/2019/04/static-and-dynamic-listener-concepts-in.html

    setup listener.ora and reload the listener
    -- This is a one-time setup that allows a remote SQL client to flash back the database.  Dynamic listener registration does not work because the SQL client must connect to the DB while it is shut down in order to perform flashback.
    
    Login as oracle on the DB server.
    lsnrctl status
    -- Identify the location of the Listener Parameter File.  It is typically under the ORACLE_HOME or GRID_HOME and is named listener.ora.
    -- Edit the listener.ora file to add a static entry similar to the following:
    SID_LIST_LISTENER =
      (SID_LIST =
        (SID_DESC =
          (GLOBAL_DBNAME = mmb.datical.net)
          (SID_NAME = mmb)
          (ORACLE_HOME = /home/oracle/app/oracle/product/12.1.0/dbhome_2)
        )
      )
    -- Load the new information.
    lsnrctl reload
    
    -- Verify
    lsnrctl status
    -- There should be a listing for the DB instance with status UNKNOWN.
  3. Grant SYSDBA or SYSBACKUP to DATICAL_USER

    -- SYSDBA or SYSBACKUP is required to shut down and start up the database during the flashback operation.  The example scripts on this page connect as SYSDBA; adjust them if you grant SYSBACKUP instead.

    sqlplus / as sysdba
    grant sysdba to datical_user;
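
    Once the grant and the static listener entry are in place, you can sanity-check the remote SYSDBA connection that the flashback scripts will rely on. This is a minimal sketch: dbhost and 1521 are hypothetical placeholders and mmb.datical.net is the example service name from the listener entry above, so substitute your own values.

    verify remote sysdba access
    -- Run from the build server, not the database host.
    sqlplus datical_user/"password"@dbhost:1521/mmb.datical.net as sysdba
    -- The instance should be reachable.
    select status from v$instance;
    -- The grant should be recorded in the password file.
    select username, sysdba, sysbackup from v$pwfile_users where username = 'DATICAL_USER';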



Step 2 Configure Custom Backup/Restore

Reference this documentation: Custom Backup and Restore Methods

  1. Create scripts to execute Oracle commands via sqlplus to create a restore point and flashback the database.
    -- The scripts will be called by the custom BackupRestoreOracleFlashback.groovy script.
    -- Place the scripts in DaticalDB/repl/scripts/extensions
    -- Permissions should be 765.  The unix user running the Jenkins job must be able to execute the scripts.

    Bash shell option

    datical_create_restore_point.sh
    #!/bin/bash
    CONNECT_STRING=$1
    echo "Running datical_create_restore_point.sh"
    sqlplus /nolog << EOF
    whenever sqlerror exit failure
    connect $CONNECT_STRING as sysdba
    whenever sqlerror continue
    drop restore point datical_backup;
    whenever sqlerror exit failure;
    create restore point datical_backup guarantee flashback database;
    exit;
    EOF
    
    

    -- Do not drop the restore point after flashback.  The restore may be performed more than once by the deployPackager if errors are encountered during the deploy to REF.

    datical_restore_point_flashback.sh
    #!/bin/bash
    CONNECT_STRING=$1
    echo "Running datical_restore_point_flashback.sh"
    sqlplus /nolog << EOF
    whenever sqlerror exit failure
    connect $CONNECT_STRING as sysdba
    shutdown immediate;
    startup mount;
    flashback database to restore point datical_backup;
    alter database open resetlogs;
    exit;
    EOF
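
    The two scripts can be smoke-tested from the build server before they are wired into the custom groovy class below. This is a minimal sketch: the connect string uses the same service-name form the groovy class builds, and dbhost, 1521, and mmb.datical.net are placeholder/example values to replace with your own.

    manual smoke test
    cd /opt/datical/DaticalDB/repl/scripts/extensions
    ./datical_create_restore_point.sh 'datical_user/"password"@dbhost:1521/mmb.datical.net'
    ./datical_restore_point_flashback.sh 'datical_user/"password"@dbhost:1521/mmb.datical.net'
    -- Confirm the guaranteed restore point still exists after the flashback.
    sqlplus 'datical_user/"password"@dbhost:1521/mmb.datical.net' as sysdba
    select name, guarantee_flashback_database, time from v$restore_point;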
    
    
  2. Create a custom BackupRestore script to use flashback database.
    The script should be located in DaticalDB/repl/scripts/extensions.  Use the script below as an example.

    BackupRestoreOracleFlashback.groovy
    package com.datical.hammer.scripts.backup
    
    import com.datical.db.project.DatabaseDef
    import com.datical.db.project.Project
    import com.datical.hammer.repl.command.DaticalDBHelper
    import com.datical.hammer.scripts.util.PackagerProperties
    import com.datical.hammer.scripts.util.ScriptLogging
    import com.datical.hammer.scripts.util.ScriptUtils
    import groovy.sql.Sql
    import org.apache.commons.lang3.StringUtils
    
    import java.sql.Connection
    import java.text.SimpleDateFormat
    import java.util.regex.Pattern
    
    /*********************************************************************************************************************
     * Has methods to perform Oracle database backup and restore using Guaranteed Restore points
     * and flashback database.
     * SQLplus must be available on the OS Path. The database must be in archivelog mode, and the database account used to
     * connect to the database must have SYSDBA privileges.  A static listener must be configured for the DB.
     * SQL operations are performed by shell scripts. The connect string is passed as a parameter to the shell scripts.
     *
     ********************************************************************************************************************/
    class BackupRestoreCustomStub implements BackupRestore {
      private ScriptLogging logger = new ScriptLogging('BackupRestoreCustomStub')
      private ScriptUtils scriptUtils = new ScriptUtils()
      private DaticalDBHelper daticalDB
      String defaultExportFilename
      String defaultExportLogFilename
      String jobNameBase
    
      BackupRestoreCustomStub(DaticalDBHelper daticalDB) {
        this.daticalDB = daticalDB
        def startTime = new Date()
        def timestamp            = new SimpleDateFormat("yyyyMMdd_HHmmss_S").format(startTime)
        defaultExportFilename    = ""
        defaultExportLogFilename = ""
        def jobnameTimestamp     = new SimpleDateFormat("yyyyMMdd_HHmmss").format(startTime)
        // The jobNameBase cannot be longer than 30 characters on Oracle
        jobNameBase              = "datical_pkg_${jobnameTimestamp}"
        logger.debug "  using BackupRestoreCustomStub"
      }
    
    
      /***************************************************************************************************************
       * Create a Guaranteed Restore Point in dbDef. The restore point is named DATICAL_BACKUP.
       * Returns a list of [Integer, StringBuffer, StringBuffer], where the Integer is a return code (0 is success, non-zero is an error),
       * the first StringBuffer contains the stderr from any external process, and the second StringBuffer contains the stdout from any external process.
       *
       * @param dbDef               A DatabaseDef object that can be used to create a connection to a database server.
       *                            The dbDef will be automatically constructed based on the name supplied by the
       *                            user on the command line.
       * @param schemaList          NOT USED.  A List of Strings, where each String is the name of a schema to backup. This list
       *                            is created by examining the datical.project file, or can be supplied by the user
       *                            on the command line. It may also be calculated by deploy packager from entries in
       *                            various metadata.properties files.
       * @param filename            NOT USED.  This is a String used to name the backup file. This value is calculated in the
       *                            method getFilename(),
       * @return                    A List of [Integer, StringBuffer, StringBuffer]
       * *************************************************************************************************************/
      @Override
      List runBackup(DatabaseDef dbDef, List<String> schemaList, String filename) {
        def rc
        def stdout = new StringBuffer()
        def stderr = new StringBuffer()
    
        try {
          ProcessBuilder processBuilder = new ProcessBuilder("")
    
           String connectString
           String printableConnectString
           if (StringUtils.isNotBlank(dbDef.sid)) {
             connectString          = "${dbDef.username}/\"${dbDef.password}\":@${dbDef.hostname}:${dbDef.port}:${dbDef.sid}"
             printableConnectString = "${dbDef.username}/\"***********************\":@${dbDef.hostname}:${dbDef.port}:${dbDef.sid}"
           }
           else if (StringUtils.isNotBlank(dbDef.serviceName)) {
             connectString          = "${dbDef.username}/\"${dbDef.password}\"@${dbDef.hostname}:${dbDef.port}/${dbDef.serviceName}"
             printableConnectString = "${dbDef.username}/\"***********************\"@${dbDef.hostname}:${dbDef.port}/${dbDef.serviceName}"
           }
           else {
             logger.printError "No SID or Service was specified in the connection (${dbDef.name})."
             return [1,null,null]
           }
    
          String[] restorepointOptions = ["", ""];
    /****************
    * Put the appropriate restorepointCommand here!
    *****************/
          String restorepointCommand = "/opt/datical/DaticalDB/repl/scripts/extensions/datical_create_restore_point.sh"
        
          def commands = processBuilder.command()
          commands.clear()
          commands.add(restorepointCommand.toString())
          commands.add(connectString.toString())
          restorepointOptions.eachWithIndex { item, index -> commands.add(item.toString()) }
    
          logger.debug "executing ${restorepointCommand} command:\n " + "${restorepointCommand} ${restorepointOptions.join(" ")}"
    
          Process proc = processBuilder.start()
          proc.waitForProcessOutput(stdout, stderr)
          rc = proc.exitValue()
    
        }
    catch (Exception ex) {
      stderr.append(ex.toString())
      rc = 1
    }
    
        def message = """
           rc    : ${rc}
           stderr: ${stderr}
           stdout: ${stdout}
    ----------------------------- end of CREATE_RESTORE_POINT output ---------------------------------
    """
    
        if (rc != 0) {
          logger.printError message
        }
        else {
          logger.printInfo stdout.toString()
          logger.printInfo "  Successfully created restore point for '${dbDef.name}'"
        }
        return [rc,stderr,stdout]
      }
    
    
      /****************************************************************************************
       * Restore the database specified by the dbDef using flashback database.
       * Returns a list of [Integer, StringBuffer, StringBuffer], where the Integer is a return code
       * (0 is success, non-zero is an error), the first StringBuffer contains the stderr from any external process,
       * and the second StringBuffer contains the stdout from any external process.
       *
       * @param dbDef               A DatabaseDef object that can be used to create a connection to a database server.
       *                            The dbDef will be automatically constructed based on the name supplied by the
       *                            user on the command line.
       * @param schemaList          NOT USED.  A List of Strings, where each String is the name of a schema to backup. This list
       *                            is created by examining the datical.project file, or can be supplied by the user
       *                            on the command line. It may also be calculated by deploy packager from entries in
       *                            various metadata.properties files.
       * @param filename            NOT USED.  This is a string that should be used to find the file that will be restored. This value
       *                            is specified by the user on the command line with the option 'restore=filename'
       *
       * @return                    A List of [Integer, StringBuffer, StringBuffer]
       *
       * *************************************************************************************************************/
      @Override
      List runRestore(DatabaseDef dbDef, List<String> schemaList, String filename) {
        def rc
        def stdout = new StringBuffer()
        def stderr = new StringBuffer()
        def schemas = ""
    
        try {
          ProcessBuilder processBuilder = new ProcessBuilder("")
    
           String connectString
           String printableConnectString
           if (StringUtils.isNotBlank(dbDef.sid)) {
             connectString          = "${dbDef.username}/\"${dbDef.password}\":@${dbDef.hostname}:${dbDef.port}:${dbDef.sid}"
             printableConnectString = "${dbDef.username}/\"***********************\":@${dbDef.hostname}:${dbDef.port}:${dbDef.sid}"
           }
           else if (StringUtils.isNotBlank(dbDef.serviceName)) {
             connectString          = "${dbDef.username}/\"${dbDef.password}\"@${dbDef.hostname}:${dbDef.port}/${dbDef.serviceName}"
             printableConnectString = "${dbDef.username}/\"***********************\"@${dbDef.hostname}:${dbDef.port}/${dbDef.serviceName}"
           }
           else {
             logger.printError "No SID or Service was specified in the connection (${dbDef.name})."
             return [1,null,null]
           }
    
          String[] flashbackOptions = ["", ""];
    /****************
    * Put the appropriate flashbackCommand here!
    *****************/
          String flashbackCommand = "/opt/datical/DaticalDB/repl/scripts/extensions/datical_restore_point_flashback.sh"
    
          def commands = processBuilder.command()
          commands.clear()
          commands.add(flashbackCommand.toString())
          commands.add(connectString.toString())
          flashbackOptions.eachWithIndex { item, index -> commands.add(item.toString()) }
    
          logger.debug "executing ${flashbackCommand} command:\n " + "${flashbackCommand} ${flashbackOptions.join(" ")}"
    
          Process proc = processBuilder.start()
          proc.waitForProcessOutput(stdout, stderr)
          rc = proc.exitValue()
    
    
        }
    catch (Exception ex) {
      stderr.append(ex.toString())
      rc = 1
    }
    
        def message = """
          rc    : ${rc}
          stderr: ${stderr}
          stdout: ${stdout}
    ----------------------------- end of FLASHBACK output ---------------------------------
    """
    
        if (rc != 0) {
          logger.printError message
        }
        else {
          logger.printInfo stdout.toString()
          logger.printInfo "  Successful flashback for '${dbDef.name}'"
        }
    
        return [rc,stderr,stdout]
            // return [0,"stderr=Stubbed","stdout=Stubbed"]
      }
    
    
      /***************************************************************************************************************
       * This method is called just after the runRestore method. Does nothing in this implementation.
       *
       * @param project             This will be populated with the Project object that deployPackager/backup/restore
       *                            is currently working with.
       * @param dbDef               A DatabaseDef object that can be used to create a connection to a database server.
       *                            The dbDef will be automatically constructed based on the name supplied by the
       *                            user on the command line.
       *
       * returns an integer. Zero indicates success, non-zero is failure.
       * *************************************************************************************************************/
      @Override
      int postRestore(Project project, DatabaseDef databaseDef) {
        return 0
      }
    
      /***************************************************************************************************************
       * The DatabaseBackupRestore class calls this method to allow an engine to return the name of the file to be
       * used for backup, restore, and verify operations. This implementation does nothing.  An enhancement could be to look at
       * deployPackager.properties for the property databaseBackupRestoreLocation and use that value for the name
       * of the Guaranteed Restore point.
       *
       * @param project             This will be populated with the Project object that deployPackager/backup/restore
       *                            is currently working with.
       * @param dbDef               A DatabaseDef object that can be used to create a connection to a database server.
       *                            The dbDef will be automatically constructed based on the name supplied by the
       *                            user on the command line.
       * @param forBackup           Not used in this implementation.
       *
       * Returns an empty String.
       * *************************************************************************************************************/
      @Override
      String getFilename(Project project, DatabaseDef dbDef, boolean forBackup) {
            return ""
      }
    
      /***************************************************************************************************************
       * Verify that the backup restore class is able to perform the operations. Returns a list of [Integer, StringBuffer, StringBuffer], where
       * the Integer is a return code (0 is success, non-zero is an error), the first StringBuffer contains the
       * stderr from any external process, and the second StringBuffer contains the stdout from any external process.
       *
       * @param dbDef               A DatabaseDef object that can be used to create a connection to a database server.
       *                            The dbDef will be automatically constructed based on the name supplied by the
       *                            user on the command line.
       * @param forBackup           This will be true if the system is preparing to do a backup. It will be false if
       *                            the system is preparing to do a restore.
   * @param filename            A String that refers to the backup file. The value is specified by the user on the
   *                            command line with the option 'backup=filename' or 'restore=filename'.
   *
       * This implementation does nothing.  Could be enhanced to verify if the database restore point exists.
       * *************************************************************************************************************/
      @Override
      List verify(DatabaseDef dbDef, boolean forBackup, String filename) {
        int rc = 0
        StringBuffer messages = new StringBuffer()

        return [rc, messages.toString()]
      }
    
    
      /***************************************************************************************************************
       * This method is called just prior to the runRestore method. This implementation does nothing.  There is no need to
       * drop the schemas since we will flashback the entire database.
       *
       * @param project             This will be populated with the Project object that deployPackager/backup/restore
       *                            is currently working with.
       * @param dbDef               A DatabaseDef object that can be used to create a connection to a database server.
       *                            The dbDef will be automatically constructed based on the name supplied by the
       *                            user on the command line.
       * @param schemaList          A List of Strings, where each String is the name of a schema to backup. This list
       *                            is created by examining the datical.project file, or can be supplied by the user
       *                            on the command line. It may also be calculated by deploy packager from entries in
       *                            various metadata.properties files.
       * *************************************************************************************************************/
      @Override
      int preRestore(Project project, DatabaseDef dbDef, List<String> schemaList) {
          logger.printInfo "Skipping schema drop, will flashback the database."
          return 0
      }
    
    
    }
    
    
  3. Update deployPackager.properties to use the custom databaseBackupRestoreMethod (per this documentation: Custom Backup and Restore Methods)

    deployPackager.properties
    #databaseBackupRestoreMethod=<name of custom backup script without the .groovy extension>
    databaseBackupRestoreMethod=BackupRestoreOracleFlashback
    databaseBackupMode=always
    packageOrderStrategy = default
    
    ddbScmSystem=git
    ddbScmBranch=master
    ddbScmPath=.
    
    # Default values
    sqlScmSystem=git
    sqlScmPath=../FLASH_SQL
    sqlScmSQLBaseDir=.
    
    # Pipeline specific values
    CURRENT.sqlScmBranch=master
    CURRENT.sqlScmLastImportID=a25a98e
    validateRollback = none
    disableTraceFile = false
  4. Create a Jenkins test harness to test flashback without packaging.
    -- This is optional.  The Jenkins job executes hammer commands to back up and restore the database, allowing you to test flashback without running the full packaging process.
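    -- The same hammer commands can also be run by hand from a checkout of the Datical project to exercise
    -- the custom class before building the Jenkins job.  <dbdef> is the name of the REF database definition
    -- in datical.project (left as a placeholder here, as in the pipeline below).
    cd FLASH_DDB
    hammer groovy database_backup_restore.groovy <dbdef> backup
    hammer groovy database_backup_restore.groovy <dbdef> restore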

    Jenkins job
    #!/usr/bin/env groovy
    // Packager declarative pipeline
    //
    pipeline {
      agent {
        node {
          label 'datical'
          customWorkspace "/var/lib/jenkins/workspace/FLASH/TestFlashback-${BUILD_NUMBER}/"
        }
      }
     
      environment {
    	GITURL="git@github.com:datical-customersuccess"
    	GIT_DATICAL_REPO="FLASH_DDB"
    	PROJ_DDB="FLASH_DDB"
    	REPOSITORY_BASE="FLASH"
    	ORACLE_HOME="/opt/oracle/product/12.1/client"
        PATH="$PATH:/opt/datical/DaticalDB/repl:$ORACLE_HOME/bin"
     
      }
      stages {
    
        stage ('Precheck') {
    		steps {
    			sh '''
    				echo ORACLE_HOME=${ORACLE_HOME}
    				echo PATH=${PATH}
    				whoami
    				which git
    				which sqlplus
    				git --version
    				git config --global user.email "jenkins@datical.com"
    				git config --global user.name "jenkins"
    			'''
    		} // steps
    	} // stage 'precheck'
    
        stage ('Checkout') {
          steps {
            deleteDir()
     
            // checkout Datical project from DDB repo
            checkout([
                $class: 'GitSCM',
                branches: [[name: '*/master']],
                doGenerateSubmoduleConfigurations: false,
                extensions: [
                  [$class: 'RelativeTargetDirectory', relativeTargetDir: "${PROJ_DDB}"],
                  [$class: 'LocalBranch', localBranch: 'master']],
                submoduleCfg: [],
                userRemoteConfigs: [[url: "${GITURL}/${GIT_DATICAL_REPO}.git"]]
            ])
      
            
          } // steps for checkout stages
        } // stage 'checkout'
     
     
       stage ('Branches'){
          steps {
            sh '''
              #{ set +x; } 2>/dev/null
     
              cd ${PROJ_DDB}
              echo "Current Directory:" `pwd`
              git branch --set-upstream-to=origin/master master
              git status
     
             
            '''
          } // steps
        }   // Branches stage
      
        stage('Backup') {
          steps {
     
            // get BitBucket username and password
      //       withCredentials([[$class: 'UsernamePasswordMultiBinding', credentialsId: 'BitbucketJenkins',
       //                                  usernameVariable: 'SQL_SCM_USER', passwordVariable: 'SQL_SCM_PASS']]) {
    	//		withCredentials([[$class: 'UsernamePasswordMultiBinding', credentialsId: 'DDB_CREDENTIAL',
    	//									usernameVariable: 'DDB_USER', passwordVariable: 'DDB_PASS']]) {
    	//			withCredentials([[$class: 'UsernamePasswordMultiBinding', credentialsId: 'DDB_AUDIT_CREDENTIAL',
    	//										usernameVariable: 'DDB_AUDIT_USER', passwordVariable: 'DDB_AUDIT_PASS']]) {
    
    		 
    					sh '''
    					  { set +x; } 2>/dev/null
    											  
    					  cd ${PROJ_DDB}
    					  echo
    					  echo "==== Running - hammer version ===="
    					  hammer show version
    			 
    					  echo "==== Creating Database Restore Point ===="
    
                          hammer groovy database_backup_restore.groovy <dbdef> backup
                          
                          
    					  '''
    	//			} // with Credentials (AuditDB)
    	//		} // with Credentials (OracleDB)
        //    } // with Credentials (SCM)
          }   // steps
        }  // Backup stage
     
    stage('Restore') {
          steps {
     
            // get BitBucket username and password
      //       withCredentials([[$class: 'UsernamePasswordMultiBinding', credentialsId: 'BitbucketJenkins',
       //                                  usernameVariable: 'SQL_SCM_USER', passwordVariable: 'SQL_SCM_PASS']]) {
    	//		withCredentials([[$class: 'UsernamePasswordMultiBinding', credentialsId: 'DDB_CREDENTIAL',
    	//									usernameVariable: 'DDB_USER', passwordVariable: 'DDB_PASS']]) {
    	//			withCredentials([[$class: 'UsernamePasswordMultiBinding', credentialsId: 'DDB_AUDIT_CREDENTIAL',
    	//										usernameVariable: 'DDB_AUDIT_USER', passwordVariable: 'DDB_AUDIT_PASS']]) {
    
    		 
    					sh '''
    					  { set +x; } 2>/dev/null
    											  
    					  cd ${PROJ_DDB}
    					  echo
    					  echo "==== Running - hammer version ===="
    					  hammer show version
    			 
    					 
                          echo "==== Flashback Database ===="
                          hammer groovy database_backup_restore.groovy <dbdef> restore
                          
    
    					  '''
    	//			} // with Credentials (AuditDB)
    	//		} // with Credentials (OracleDB)
        //    } // with Credentials (SCM)
          }   // steps
        }  // Restore stage
     
    	  
      }   // stages
      
      post {
        always {
          // Jenkins Artifacts
          archiveArtifacts '**/daticaldb.log, **/Reports/**, **/Logs/**, **/Snapshots/** ,**/*.zip'
        }
      }
    }     // pipeline





Copyright © Datical 2012-2020 - Proprietary and Confidential