...

This document explains how to set up an Azure pipeline with an agent to execute Liquibase Enterprise operations. An Azure DevOps agent can be shared among multiple projects, so one agent can serve several teams. You can learn more about Azure agents here: https://learn.microsoft.com/en-us/azure/devops/pipelines/agents/agents?view=azure-devops&tabs=browser.

Additionally, we’ll show how the Liquibase Enterprise forecast and deploy commands can be run via an Azure Release using a Classic release pipeline: https://learn.microsoft.com/en-us/azure/devops/pipelines/release/?view=azure-devops

Instructions

Table of Contents

Prerequisites

...

Sample Repo

A sample public GitHub repo can be found here:

https://github.com/liquibase/cs_azure_cicd_pipelines

Prerequisites

Before following the steps in this document, set up your databases and create the Liquibase project.

...

  • A centralized Rules repository.

  • A centralized CI/CD repository.

...

    • When using a centralized Rules repository, be sure to include the following in the .gitignore of the Liquibase Project repository:

      Code Block
      Rules/

Push the Liquibase project configuration files to the project repository in Azure Repos.

Code Block
git init --initial-branch=main
git remote add origin git@ssh.dev.azure.com:v3/asmith0101/Liquibase/ABC_DDB
git add .
git commit -m "Initial commit"
git push -u origin main
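
If the project repository does not yet exist in Azure Repos, you can create it in the web UI or, as a rough sketch using the Azure DevOps CLI extension (the organization and project names below are the example values from the remote URL above):

Code Block
# Assumes the azure-devops CLI extension is installed and you are logged in.
az devops configure --defaults organization=https://dev.azure.com/asmith0101 project=Liquibase
az repos create --name ABC_DDB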

...

Make sure that git is installed on the agent, as it is needed by the Liquibase Enterprise Deploy Packager. See SCM Requirements and Features.
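
A quick sanity check on the agent host (assumes git and the hammer CLI are already on the agent's PATH):

Code Block
# Verify the executables the packager needs are available to the agent.
git --version
hammer show version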

...

Use an Azure Library to group credentials from your Key Vault into a Variable Group. This grouping makes it easy to associate credentials with Azure Pipelines and Releases.
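
The Key Vault link itself is configured in the Library UI. As a hedged sketch, the variable group can also be created with the Azure DevOps CLI; the group name Liquibase_Variables matches the group referenced by the pipeline YAML below, and the starter variable is a placeholder:

Code Block
# Sketch only: creates the group; Key Vault-backed secrets are linked in the Library UI.
az pipelines variable-group create \
  --name Liquibase_Variables \
  --variables Liquibase-abc-ref1-User=<ref1-db-user>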

...

Step 7:

...

Deploy Packager jobs will be triggered automatically by commits to the packaging branch of the SQL repo. Forecast and Deploy jobs will be triggered manually from the Azure DevOps web interface. The Logging job will always run to generate a .zip of files to be used for debugging errors.

...

Create a Pipeline to run Packager

Create an Azure Pipeline to run the Deploy Packager (Build) job. Azure Pipelines are created from an azure_pipelines.yml file.
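
You can create the pipeline from the web UI or, as a hedged sketch, with the Azure DevOps CLI (the pipeline name ABC-Packager is hypothetical; the repository and YAML path match this document's examples):

Code Block
# Hypothetical pipeline name; registers the YAML file below as a new pipeline.
az pipelines create --name ABC-Packager \
  --repository ABC_DDB --repository-type tfsgit \
  --branch main --yml-path azure_pipelines.yml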

azure_pipelines.yml file

Code Block
trigger: none

name: $(Application.Name)-$(Build.BuildId)

variables:
- template: ./variables.yml 
- group: Liquibase_Variables

pool:
  name: $(Agent.PoolName)

workspace:
  clean: all

resources:         
  repositories:

  - repository: DDB_REPOSITORY
    name: ABC_DDB
    type: git
    connection: AzureRepos
    source: ABC_DDB
    ref: 'refs/heads/main'

  - repository: SQL_REPOSITORY
    name: ABC_SQL
    type: git
    connection: AzureRepos
    source: ABC_SQL
    ref: 'refs/heads/current'

  - repository: CICD_REPOSITORY
    name: Liquibase/DB_CICD_PIPELINES
    type: git
    connection: AzureRepos
    source: Liquibase/DB_CICD_PIPELINES
    ref: 'refs/heads/main'
 
  - repository: CENTRALIZED_RULES_REPOSITORY
    name: Liquibase/CENTRALIZED_RULES
    type: git
    connection: AzureRepos
    source: Liquibase/CENTRALIZED_RULES
    ref: 'refs/heads/main'

steps:
- checkout: DDB_REPOSITORY
  persistCredentials: true
  clean: true
- checkout: SQL_REPOSITORY
  clean: true
  fetchDepth: 100
  persistCredentials: true
- checkout: CENTRALIZED_RULES_REPOSITORY
  clean: true
  persistCredentials: true
  
- script: |
    whoami
    hammer show version
    echo $(System.DefaultWorkingDirectory)
    export PATH="$PATH:/opt/mssql-tools/bin"
  displayName: 'Run Prechecks'

# Ensure the PATH includes the necessary executables, e.g. hammer and sqlplus/sqlcmd/clpplus/psql.
# Also copy the Rules from the centralized Rules repository to the DDB folder.
- powershell: |
    $Env:Path += ";C:\Users\Administrator\DaticalDB\repl"
    $Env:Path += ";C:\Program Files\Microsoft SQL Server\Client SDK\ODBC\170\Tools\Binn\"
    cd $(Git.Repository)
    xcopy /s/e/y ..\CENTRALIZED_RULES\Rules\. .\Rules\*
  condition: eq( variables['Agent.OS'], 'Windows_NT' )
  displayName: 'Update PATH (Windows) and set Rules'
  
- script: |
    export PATH="$PATH:/opt/datical/DaticalDB/repl"
    export PATH="$PATH:/opt/mssql-tools/bin"
    cd $(Git.Repository)
    cp -R ../CENTRALIZED_RULES/Rules .
  condition: eq( variables['Agent.OS'], 'Linux' )
  displayName: 'Update PATH (Linux) and set Rules'

- script: |
    cd $(Git.Repository)
    printenv
    hammer groovy deployPackager.groovy pipeline=current commitPrefix="[skip ci]" scm=true labels=$(Build.BuildId),current
    if [ $? -ne 0 ]; then exit 1; fi
  displayName: 'Run Liquibase packager'
  env:
    DDB_USER: $(Liquibase-abc-ref1-User)
    DDB_PASS: $(Liquibase-abc-ref1-Pass)
    DDB_DMCDB_USER: $(Liquibase-abc-dmc-User)
    DDB_DMCDB_PASS: $(Liquibase-abc-dmc-Pass)

# Compress files into .zip and stage it for upload
- script: |
    pwd
    cd $(Git.Repository)
    zip -q -r $(Application.Name)-$(Build.BuildId).zip *
    mkdir -p $(System.DefaultWorkingDirectory)/artifacts
    ls -l $(Application.Name)-$(Build.BuildId).zip
    cp $(Application.Name)-$(Build.BuildId).zip $(System.DefaultWorkingDirectory)/artifacts
  displayName: 'Copy zip to Azure Artifacts'

- upload: $(System.DefaultWorkingDirectory)/artifacts/$(Application.Name)-$(Build.BuildId).zip
  artifact: drop

- task: ArtifactoryGenericUpload@2
  inputs:
    artifactoryService: 'ArtifactoryCS'
    specSource: 'taskConfiguration'
    fileSpec: |
      {
        "files": [
          {
            "pattern": "$(Git.Repository)/$(Application.Name)-$(Build.BuildId).zip",
            "target": "$(Project.Name)/$(Application.Name)/"
          }
        ]
      }
    replaceSpecVars: true
    specVars: '$(Application.Name),$(Build.BuildId)'
    collectBuildInfo: true
    buildName: '$(Build.DefinitionName)'
    buildNumber: '$(Build.BuildNumber)'
    module: '$(Project.Name)'
    failNoOp: true
  displayName: 'Upload zip to artifactory'

- task: ArtifactoryPublishBuildInfo@1
  inputs:
    artifactoryService: 'ArtifactoryCS'
    buildName: '$(Build.DefinitionName)'
    buildNumber: '$(Build.BuildNumber)'
  displayName: 'Publish Build Info to artifactory'

- task: ArtifactoryBuildPromotion@1
  inputs:
    artifactoryService: 'ArtifactoryCS'
    buildName: '$(Build.DefinitionName)'
    buildNumber: '$(Build.BuildNumber)'
    targetRepo: '$(Project.Name)'
    status: 'Released'
    sourceRepo: '$(Project.Name)'
    includeDependencies: false
    copy: false
    dryRun: false
  displayName: 'Promote Build to artifactory'

- task: AzureKeyVault@2
  inputs:
    azureSubscription: 'CustomerSuccessPayAsGo'
    KeyVaultName: 'cs-key-vault1'
    SecretsFilter: '*'
    RunAsPreJob: true

Note: You may need to manually initialize the git configuration and SSH remote for both of the repos.

Code Block
git config --global user.name "Martha Bohls"
git config --global user.email "mbohls@liquibase.com"
git config --global init.defaultBranch main
git init
git remote add origin git@ssh.dev.azure.com:v3/asmith0101/Liquibase/$ddb_repo
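
If SSH host verification blocks the first fetch, add the Azure DevOps host key to the agent's known_hosts (run as the agent user):

Code Block
# Trust ssh.dev.azure.com so non-interactive git operations do not prompt.
ssh-keyscan ssh.dev.azure.com >> ~/.ssh/known_hosts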

Step 6: Configure variables for the pipeline.

Variables are a good way to store database credentials. They also allow the same script to be reused for different projects.

  1. Log in to Azure DevOps.

  2. Navigate to your project.

  3. Go to Pipelines > Library.

  4. Open the Variable Group and add the variables.

...

The sample script requires the variables shown below.

appname: Name of your Liquibase project

pipeline: Name of the pipeline in the Liquibase project

branch: Branch in the SQL repo that is used for packaging

ddb_repo: Name of the Liquibase project repository

sql_repo: Name of the SQL repository

DDB_USER: Database user name

DDB_PASS: Database password

DDB_DMCDB_USER: Database user for DMCDB

DDB_DMCDB_PASS: Database password for DMCDB
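
Credentials such as DDB_PASS should be stored as secret variables. A hedged sketch with the Azure DevOps CLI (the group ID 1 is a placeholder; find yours with az pipelines variable-group list):

Code Block
# Placeholder group ID; --secret true masks the value in logs and the UI.
az pipelines variable-group variable create \
  --group-id 1 --name DDB_PASS --secret true --value '<password>'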

Step 7: Test your pipeline.

...


As an alternative to zipping with a script step, the ArchiveFiles@2 task can be used to create the artifact zip. Note that the Artifactory upload file spec pattern must then reference the artifacts/ staging folder rather than the repository folder.

Code Block
# Compress files into .zip
- task: ArchiveFiles@2
  inputs:
    rootFolderOrFile: '$(System.DefaultWorkingDirectory)/$(Git.Repository)'
    includeRootFolder: true
    archiveType: 'zip'
    archiveFile: '$(System.DefaultWorkingDirectory)/artifacts/$(Application.Name)-$(Build.BuildId).zip'


variables.yml file

Code Block
variables:
   Project.Name: ABC
   Application.Name: ABC-App
   Git.Repository: ABC_DDB
   Git.Repository2: ABC_SQL
   Agent.PoolName: 'CS'
   JFrog.EndPoint: 'JFrog-DevOps'
   JFrog.Repository: 'abc'
   App.Version: $(Build.BuildId)
   Source.Branch: 'current'

Step 8: Create a Release to run Forecast and Deploy Jobs

Instructions: https://learn.microsoft.com/en-us/azure/devops/pipelines/release/?view=azure-devops

Azure Releases can be used to run the Liquibase Enterprise Forecast and Deploy jobs. These jobs will pull the repository from an Artifact.

Under Releases select + NEW to create a New Release Pipeline. Add an artifact. This example uses a JFrog Artifactory artifact. For JFrog Artifactory you will need to have set up an Artifactory Service; see https://www.jfrog.com/confluence/display/JFROG/Artifactory+Azure+DevOps+Extension.

...

For the Build Name, select the build pipeline you created in Step 7.

...

Step 9: Configure your Release Pipeline

In this example we have setup the following workflow:

Deploy to DEV → Forecast to Test → Deploy to Test → Forecast to Prod → Deploy to Prod

...

Under the Variables section, make sure to Link the Variable Group created in Step 6.

...

Each step represents a Stage in the Pipeline.

Configure a Forecast Stage

For each Forecast Stage, select the Agent that will be used to run the job. Be sure to select the Agent Pool configured in Step 1.

...

Each Azure Stage is composed of Tasks. Use a Command line task to run the Liquibase Enterprise hammer commands. (Please note, Windows agents may require slightly different task types or script code; e.g., PowerShell may be required.)

  1. Be sure to unzip the artifact prior to running the hammer commands.

  2. You’ll need to specify the Working Directory under the Advanced section.

  3. You’ll also want to set DDB_USER, DDB_PASS, DDB_DMCDB_USER, and DDB_DMCDB_PASS under the Environment Variables section. Set these to the names of the Key Vault secrets, e.g. $(Liquibase-abc-dev-User), $(Liquibase-abc-dev-Pass), $(Liquibase-abc-dmc-User), $(Liquibase-abc-dmc-Pass).

Script Code:

Code Block
whoami
hammer show version
unzip -o *.zip
# Depending on how you have zipped the artifact you may need to cd into the DDB repo.

hammer forecast --labels=current --pipeline=current TEST

...

Configure a Deploy Stage

A Deploy Stage will be identical to a Forecast stage with the exception of the Script Code.

For Deploys use the following:

Code Block
whoami
hammer show version
unzip -o *.zip
# Depending on how you have zipped the artifact you may need to cd into the DDB repo.

hammer deploy --labels=current --pipeline=current TEST

Step 10: Test your Pipeline

Commit a change to the packaging branch in your SQL repo. Go to your Pipeline in Azure and select “Run pipeline”.
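
The run can also be queued from the command line (a sketch; ABC-Packager is the hypothetical pipeline name used earlier):

Code Block
# Equivalent to clicking "Run pipeline" in the web UI.
az pipelines run --name ABC-Packager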

...

After all your pipeline steps have completed successfully, your artifact will be ready to run a Release.

...

  1. Go to Releases and select “Create release”.

  2. Select all the stages to run.

  3. Specify the Artifact Version to run. Select the most recent.

  4. Click on each Stage to trigger it, then click “Deploy”. (Please note this “Deploy” terminology is from Azure; even the Liquibase Enterprise Forecast is triggered via a “Deploy” button.)

...

Troubleshooting

ERROR: Error getting list of commits; Invalid revision range; unknown revision or path not in the working tree

Problem: One of the following messages appears when executing the pipeline:

ERROR: Error getting list of commits
ERROR: fatal: Invalid revision range

ERROR: fatal: ambiguous argument '158902d..HEAD': unknown revision or path not in the working tree.

Resolution

Azure DevOps has a default git fetch depth of 1 commit. This can prevent the Deploy Packager from finding the commit recorded as sqlScmLastImportID in deployPackager.properties. Updating the sqlScmLastImportID in the latest commit results only in a temporary, one-off success.

A large fetchDepth can affect pipeline performance over time, especially if large commits are made to the SQL repo. For example, a fetchDepth of '0' fetches all commits in the branch, but doing this on every run wastes resources as history grows or when commits are large (https://learn.microsoft.com/en-us/azure/devops/pipelines/repos/pipeline-options-for-git?view=azure-devops&tabs=yaml#shallow-fetch).

This parameter might require tuning if the error reappears. We recommend starting with a relatively low value, say 100, to maintain a rolling window of 100 commits. While this setting can be changed in the UI for basic pipelines with no YAML definition, it is best made directly in the pipeline YAML where the checkout steps are defined.

Below is an example of the usage of fetchDepth:

Code Block
- checkout: SQL_REPOSITORY
  clean: true
  persistCredentials: true
  fetchDepth: 100