...
This document explains how to set up an Azure pipeline with an agent to execute Liquibase Enterprise operations. An Azure DevOps agent can be shared among multiple projects, so one agent can serve several teams. This example uses a Linux server for the agent. You can learn more about Azure agents here: https://learn.microsoft.com/en-us/azure/devops/pipelines/agents/agents?view=azure-devops&tabs=browser.
Prerequisites
Before following the steps in this document, set up your databases and create the Liquibase project.
Liquibase Enterprise requires two repositories, which are referred to as Repos in Azure DevOps. Azure Pipelines can be built from code stored in Azure Repos, Bitbucket, GitHub, and other Git or Subversion sources. For this example we will be using Azure Repos.
In Azure Repos create the following two projects:
The Liquibase Project repository.
The SQL code repository.
Push the Liquibase project configuration files to the project repository in Azure Repos.
Code Block:
git init --initial-branch=main
git remote add origin git@ssh.dev.azure.com:v3/asmith0101/Liquibase/ABC_DDB
git add .
git commit -m "Initial commit"
git push -u origin main
Step 1: Install Liquibase Enterprise on an Agent
The agent can be a VM on Azure or another cloud platform. Instructions to set up an Azure VM can be found at https://learn.microsoft.com/en-us/azure/devops/pipelines/process/environments-virtual-machines?view=azure-devops. Make sure to use a supported version of Windows or Linux. See Hardware, Software, and Database Requirements.
Follow these instructions to install Liquibase Enterprise:
Windows: Installing Liquibase Enterprise Clients on Windows Systems
Linux: Using the CLI and the Composite Repository to Install Liquibase Enterprise on Linux Systems
Make sure that git is installed on the agent, as it will be needed by the Liquibase Enterprise Deploy Packager. See SCM Requirements and Features.
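As a quick sanity check before building the pipeline, you can confirm the tools on the agent; this sketch assumes the default /opt/datical/DaticalDB install path used elsewhere in this document.
Code Block:
# Confirm git is available for the Deploy Packager
git --version
# Put the Liquibase Enterprise CLI on the PATH and confirm it runs
export PATH="$PATH:/opt/datical/DaticalDB/repl"
hammer show version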
Step 2: Install GitLab Runner on the Linux Server
https://docs.gitlab.com/runner/install/linux-repository.html
Log in to the Linux VM and run the following commands.
Code Block:
curl -L "https://packages.gitlab.com/install/repositories/runner/gitlab-runner/script.rpm.sh" | sudo bash
sudo yum install gitlab-runner
This example uses SSH keys for git authentication. You can find information on using SSH keys with GitLab here: https://docs.gitlab.com/ee/user/ssh.html. Copy the SSH private key to /home/gitlab-runner/.ssh.
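For example, assuming a private key named id_ed25519, the copy and permission steps might look like the following; the key name is illustrative.
Code Block:
# Install the private key for the gitlab-runner user and lock down permissions
sudo mkdir -p /home/gitlab-runner/.ssh
sudo cp id_ed25519 /home/gitlab-runner/.ssh/
sudo chown -R gitlab-runner:gitlab-runner /home/gitlab-runner/.ssh
sudo chmod 700 /home/gitlab-runner/.ssh
sudo chmod 600 /home/gitlab-runner/.ssh/id_ed25519
# Verify authentication to GitLab as the gitlab-runner user
sudo -u gitlab-runner ssh -T git@gitlab.com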
Step 3: Register the Runner
https://docs.gitlab.com/runner/register/index.html
Login to GitLab.
Navigate to the SQL repo.
Go to Settings > CI/CD
Expand the Runners section.
Copy the registration token.
Start a terminal session on the GitLab runner and run the “gitlab-runner register” command.
You will be prompted for the required information. We recommend creating a tag for the runner so that GitLab CI/CD jobs for Liquibase can be associated with this runner. The executor for the runner should be “shell”.
Code Block:
sudo gitlab-runner register
Runtime platform  arch=amd64 os=linux pid=11051 revision=5316d4ac version=14.6.0
Running in system-mode.

Enter the GitLab instance URL (for example, https://gitlab.com/):
https://gitlab.com
Enter the registration token:
<the token that was copied above>
Enter a description for the runner:
[ip-172-30-3-12.ec2.internal]: myrunner.mydomain.com
Enter tags for the runner (comma-separated):
liquibase
Registering runner... succeeded  runner=q8McuSyX
Enter an executor: docker, parallels, virtualbox, docker+machine, kubernetes, custom, shell, ssh, docker-ssh+machine, docker-ssh:
shell
Runner registered successfully. Feel free to start it, but if it's running already the config should be automatically reloaded!
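To confirm the runner registered correctly and can reach GitLab, the standard gitlab-runner subcommands make a quick check:
Code Block:
# List runners configured on this host
sudo gitlab-runner list
# Confirm each registered runner can authenticate to GitLab
sudo gitlab-runner verify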
Step 4: Create an Access Token
A personal access token (PAT) is needed to download artifacts.
To create a personal access token:
Login to GitLab.
In the top-right corner, select your avatar.
Select Edit Profile.
On the left sidebar, select Access Tokens.
Enter a name.
Leave the expiry date blank (never expires).
Select the api access token scope.
Select Create personal access token.
Save the token; it cannot be retrieved later.
If you would prefer to use a Project Access Token, the instructions are here: https://docs.gitlab.com/ee/user/project/settings/project_access_tokens.html.
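Either way, you can confirm the token works before wiring it into the pipeline; as a quick check, the GitLab API's /user endpoint returns the account the token authenticates as:
Code Block:
# A valid token returns a JSON description of the token's account
curl --header "PRIVATE-TOKEN: <your PAT>" "https://gitlab.com/api/v4/user"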
Step 5: Configure the .gitlab-ci.yml file
Deploy Packager jobs will be triggered automatically by commits to the packaging branch of the SQL repo. Forecast and Deploy jobs will be triggered manually from the GitLab web interface. The Logging job will always run to generate a .zip of files to be used for debugging errors.
The example .gitlab-ci.yml file should be placed in the top level of the SQL repo. Update the GitLab URLs for your Repos.
Code Block:
# Liquibase Sample gitlab-ci.yml
stages:
- package
- forecast
- deploy
- logging
packager-job:
stage: package
tags:
- liquibase
only:
refs:
- develop
variables:
- $jobtype == "PACKAGER"
except:
variables:
# Don't execute when Liquibase commits back to the Repo.
- $CI_COMMIT_AUTHOR == "Liquibase <support@liquibase.com>"
script:
- |
echo "Packaging Stage"
echo "Checking Environment"
echo Project directory: $CI_PROJECT_DIR
echo Project ID: $CI_PROJECT_ID
echo Commit Author: $CI_COMMIT_AUTHOR
whoami
# Variables needed for Oracle projects.
export PATH="$PATH:/opt/datical/DaticalDB/repl:/opt/datical/DaticalDB/instantclient"
export LD_LIBRARY_PATH=/opt/datical/DaticalDB/instantclient
hammer show version
- |
echo "Getting SQL repo"
cd ..
rm src -r -f
mkdir -p src
cd src
git config --global user.name "Liquibase"
git config --global user.email "support@liquibase.com"
git config --global init.defaultBranch main
git init
git remote add origin git@gitlab.com:mbohls/$sql_repo.git
git fetch origin
git checkout -b $branch --track origin/$branch
git branch --set-upstream-to=origin/$branch $branch
- |
echo "Getting DDB repo"
cd ..
rm ddb -r -f
mkdir -p ddb
cd ddb
git config --global user.name "Liquibase"
git config --global user.email "support@liquibase.com"
git config --global init.defaultBranch main
git init
git remote add origin git@gitlab.com:mbohls/$ddb_repo.git
git fetch origin
git checkout -b main --track origin/main
git branch --set-upstream-to=origin/main main
- |
echo "Packaging"
hammer groovy deployPackager.groovy pipeline=$pipeline scm=true labels=$CI_JOB_ID
zip -r $appname-$CI_JOB_ID.zip * -x "Reports/*" -x "Logs/*" -x "Snapshots/*" -x "Profiles/*" -x "daticaldb.log" -x "deployPackager.properties";
mv $appname-$CI_JOB_ID.zip $CI_PROJECT_DIR
artifacts:
paths:
- $appname-$CI_JOB_ID.zip
forecast-job:
stage: forecast
tags:
- liquibase
only:
refs:
- web
variables:
- $jobtype == "FORECAST"
script:
- |
echo "Forecast Stage"
echo Project directory: $CI_PROJECT_DIR
echo Artifact ID: $packagenumber
export PATH="$PATH:/opt/datical/DaticalDB/repl"
export LD_LIBRARY_PATH=/opt/datical/DaticalDB/instantclient
cd ../ddb
mkdir -p forecast
cd forecast
# Get GitLab Artifact
curl --output $appname-$packagenumber.zip --header "PRIVATE-TOKEN: <your PAT>" "https://gitlab.com/api/v4/projects/$CI_PROJECT_ID/jobs/$packagenumber/artifacts/$appname-$packagenumber.zip"
# Unzip and forecast
unzip $appname-$packagenumber.zip
if [ -n "${labels}" ]
then
hammer forecast $environment --labels="${labels}"
else
hammer forecast $environment
fi
deploy-job:
stage: deploy
tags:
- liquibase
only:
refs:
- web
variables:
- $jobtype == "DEPLOY"
script:
- |
echo "Deploy Stage"
echo Project directory: $CI_PROJECT_DIR
echo Artifact ID: $packagenumber
export PATH="$PATH:/opt/datical/DaticalDB/repl"
export LD_LIBRARY_PATH=/opt/datical/DaticalDB/instantclient
cd ../ddb
mkdir -p deploy
cd deploy
# Get GitLab Artifact
curl --output $appname-$packagenumber.zip --header "PRIVATE-TOKEN: <your PAT>" "https://gitlab.com/api/v4/projects/$CI_PROJECT_ID/jobs/$packagenumber/artifacts/$appname-$packagenumber.zip"
# Unzip and deploy
unzip $appname-$packagenumber.zip
if [ -n "${labels}" ]
then
hammer deploy $environment --labels="${labels}"
else
hammer deploy $environment
fi
logging-job:
stage: logging
tags:
- liquibase
except:
variables:
# Don't execute when Liquibase commits to the repo.
- $CI_COMMIT_AUTHOR == "Liquibase <support@liquibase.com>" && $jobtype == "PACKAGER"
script:
- |
# Generate log files for debugging
echo "Post Stage"
echo Project directory: $CI_PROJECT_DIR
export PATH="$PATH:/opt/datical/DaticalDB/repl:/opt/datical/DaticalDB/instantclient"
export LD_LIBRARY_PATH=/opt/datical/DaticalDB/instantclient
# Switch to the DDB directory
cd ../ddb
hammer debug export --include="datical.project,changelog.xml,daticaldb*.log,*.html,deployPackager.properties,packager.log" --report=scrubbed_debug_files-$CI_JOB_ID.zip
mv scrubbed_debug_files-$CI_JOB_ID.zip $CI_PROJECT_DIR
# Run even if there was a job failure earlier so that we capture the log files.
when: always
artifacts:
paths:
- scrubbed_debug_files-$CI_JOB_ID.zip
variables:
#These are runtime variables. Global variables are set at the project level.
environment:
value: "ref"
description: "DBDEF name. Valid values are: ref, test, prod"
jobtype:
value: "PACKAGER"
description: "valid values are: PACKAGER, FORECAST, DEPLOY"
packagenumber:
value: "changeme"
description: "FORECAST and DEPLOY JOBS ONLY. Unique number of artifact"
labels:
description: "FORECAST and DEPLOY JOBS ONLY. Label expression" |
Note: You may need to manually initialize git SSH access for both of the Repos.
Code Block:
git config --global user.name "Martha Bohls"
git config --global user.email "mbohls@liquibase.com"
git config --global init.defaultBranch main
git init
git remote add origin git@gitlab.com:mbohls/$ddb_repo.git
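Part of that initialization is accepting the Git server's SSH host key; otherwise the pipeline's first fetch can stall on an interactive prompt. A minimal sketch for pre-populating known_hosts for the gitlab-runner user:
Code Block:
# Pre-accept gitlab.com's host key so non-interactive fetches do not prompt
sudo -u gitlab-runner sh -c 'ssh-keyscan gitlab.com >> /home/gitlab-runner/.ssh/known_hosts'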
Step 6: Configure variables for the pipeline.
Variables are a good way to store database credentials. They also allow the same script to be configured to work for different projects.
Login to GitLab.
Navigate to the SQL repo for the project.
Go to Settings > CI/CD
Expand the Variables section.
...
The sample script requires the variables shown below.
appname: Name of your Liquibase project
pipeline: Name of the pipeline in the Liquibase project
branch: Branch in the SQL repo that is used for packaging
ddb_repo: Name of the Liquibase project repository
sql_repo: Name of the SQL repository
DDB_USER: Database user name
DDB_PASS: Database password
DDB_DMCDB_USER: Database user for DMCDB
DDB_DMCDB_PASS: Database password for DMCDB
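Variables can be entered in the UI as described above, or scripted via the GitLab API. A sketch using the project variables endpoint (the project ID and values are placeholders):
Code Block:
# Create one CI/CD variable on the project; repeat per variable
curl --request POST --header "PRIVATE-TOKEN: <your PAT>" \
  "https://gitlab.com/api/v4/projects/<project id>/variables" \
  --form "key=appname" --form "value=<your project name>"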
Step 7: Test your pipeline.
Commit a change to the packaging branch in your SQL repo. GitLab should run the pipeline Packaging job using your runner.
Additionally, we’ll show how Liquibase Enterprise forecast and deploy commands can be run via an Azure Release using a Classic release pipeline: https://learn.microsoft.com/en-us/azure/devops/pipelines/release/?view=azure-devops
Instructions
Sample Repo
A sample public GitHub repo can be found here:
https://github.com/liquibase/cs_azure_cicd_pipelines
Prerequisites
Before following the steps in this document, set up your databases and create the Liquibase project.
Liquibase Enterprise requires two repositories which are referred to as Repos in Azure DevOps. Azure Pipelines can be built from code stored in Azure Repos, Bitbucket, GitHub, and other Git or Subversion sources. For this example we will be using Azure Repos.
In Azure Repos create the following two projects:
The Liquibase Project repository.
The SQL code repository.
This example also uses two optional repos:
A centralized Rules repository.
When using a centralized Rules repository be sure to include the following in the .gitignore of the Liquibase Project repository.
Code Block:
Rules/
A centralized CI/CD repository.
Push the Liquibase project configuration files to the project repository in Azure Repos.
Code Block:
git init --initial-branch=main
git remote add origin git@ssh.dev.azure.com:v3/asmith0101/Liquibase/ABC_DDB
git add .
git commit -m "Initial commit"
git push -u origin main
Step 1: Set up an Azure Agent Pool
Instructions: https://learn.microsoft.com/en-us/azure/devops/pipelines/agents/agents?view=azure-devops
An Azure Agent Pool can be created under Project Settings. Agent pools are shared across an organization.
...
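If you use the Azure DevOps CLI extension, you can also confirm the pool is visible from a terminal; this is an optional convenience check, and the organization URL is a placeholder:
Code Block:
# List agent pools visible to your organization (requires the azure-devops CLI extension)
az pipelines pool list --organization https://dev.azure.com/<your org> --output table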
Step 2: Create an Azure Agent
Instructions: https://learn.microsoft.com/en-us/azure/devops/pipelines/agents/agents?view=azure-devops
After setting up an Agent Pool you can create a New agent. Instructions will be provided for the type of agent (Windows or Linux) that you wish to use; note there are also Detailed Instructions for each type. Make sure to use a supported version of Windows or Linux. See Hardware, Software, and Database Requirements.
...
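For a Linux agent, the download-and-configure flow that Azure displays generally follows the sketch below. The agent version is illustrative, the organization URL and pool name are placeholders, and the PAT needs the Agent Pools (read, manage) scope.
Code Block:
# Download and extract the agent (version is illustrative; use the one Azure shows you)
mkdir myagent && cd myagent
curl -LO https://vstsagentpackage.azureedge.net/agent/3.232.0/vsts-agent-linux-x64-3.232.0.tar.gz
tar zxvf vsts-agent-linux-x64-3.232.0.tar.gz
# Register the agent with your organization and pool, then run it as a service
./config.sh --url https://dev.azure.com/<your org> --auth pat --token <your PAT> --pool <your pool>
sudo ./svc.sh install
sudo ./svc.sh start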
Step 3: Set up authentication from the Agent to the Repositories
This example uses SSH keys for git authentication. You can find information on using SSH keys with Azure at https://learn.microsoft.com/en-us/azure/devops/repos/git/use-ssh-keys-to-authenticate?view=azure-devops.
If you wish to use HTTPS for git authentication, you will need to set up Git Credential Manager. Instructions can be found at https://learn.microsoft.com/en-us/azure/devops/repos/git/set-up-credential-managers?view=azure-devops.
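Whichever method you choose, verify it from the agent before running the pipeline. For SSH, a test connection looks like this (assuming the public key is already uploaded to your Azure DevOps profile):
Code Block:
# Successful key authentication responds with a message, even though shell access itself is refused
ssh -T git@ssh.dev.azure.com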
Step 4: Install Liquibase Enterprise on the Agent
Follow these instructions to install Liquibase Enterprise:
Windows: Installing Liquibase Enterprise Clients on Windows Systems
Linux: Using the CLI and the Composite Repository to Install Liquibase Enterprise on Linux Systems
Make sure that git is installed on the agent, as it will be needed by the Liquibase Enterprise Deploy Packager. See SCM Requirements and Features.
Step 5: Set up an Azure Key Vault with the database credentials
Instructions: https://learn.microsoft.com/en-us/azure/devops/pipelines/release/key-vault-in-own-project?view=azure-devops&tabs=portal
The Azure Key Vault is a secure method for storing database credentials. It is recommended that you add Key Vault secrets for all databases in your pipeline including the DMC database.
In this example, Secrets added in the Key Vault will be referenced by the pipeline using Variable Groups.
...
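If you prefer the Azure CLI to the portal, secrets can be added as in this sketch; the vault and secret names follow the naming used later in this example and should be adjusted to yours:
Code Block:
# Store the target database credentials as Key Vault secrets
az keyvault secret set --vault-name cs-key-vault1 --name Liquibase-abc-ref1-User --value <db user>
az keyvault secret set --vault-name cs-key-vault1 --name Liquibase-abc-ref1-Pass --value <db password>
# Repeat for the DMC database credentials
az keyvault secret set --vault-name cs-key-vault1 --name Liquibase-abc-dmc-User --value <dmc user>
az keyvault secret set --vault-name cs-key-vault1 --name Liquibase-abc-dmc-Pass --value <dmc password>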
Step 6: Set up an Azure Library Variable Group
Instructions: https://learn.microsoft.com/en-us/azure/devops/pipelines/library/?view=azure-devops and https://learn.microsoft.com/en-us/azure/devops/pipelines/library/variable-groups?view=azure-devops&tabs=yaml
Use an Azure Library to group credentials from your Key Vault into a Variable Group. This grouping makes it easy to associate credentials to Azure Pipelines and Releases.
...
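Linking Key Vault secrets into the group is done in the Library UI, but for reference a plain variable group can also be created from the CLI; the group name matches the one the pipeline YAML below expects:
Code Block:
# Create the variable group the pipeline references (requires the azure-devops CLI extension)
az pipelines variable-group create --name Liquibase_Variables \
  --variables placeholder=true \
  --organization https://dev.azure.com/<your org> --project <your project>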
Step 7: Create a Pipeline to run Packager
Create an Azure Pipeline to run the Deploy Packager (Build) job. Azure Pipelines are created from an azure_pipelines.yml file.
azure_pipelines.yml file
Code Block:
trigger: none
name: $(Application.Name)-$(Build.BuildId)
variables:
- template: ./variables.yml
- group: Liquibase_Variables
pool:
name: $(Agent.PoolName)
workspace:
clean: all
resources:
repositories:
- repository: DDB_REPOSITORY
name: ABC_DDB
type: git
connection: AzureRepos
source: ABC_DDB
ref: 'refs/heads/main'
- repository: SQL_REPOSITORY
name: ABC_SQL
type: git
connection: AzureRepos
source: ABC_SQL
ref: 'refs/heads/current'
- repository: CICD_REPOSITORY
name: Liquibase/DB_CICD_PIPELINES
type: git
connection: AzureRepos
source: Liquibase/DB_CICD_PIPELINES
ref: 'refs/heads/main'
- repository: CENTRALIZED_RULES_REPOSITORY
name: Liquibase/CENTRALIZED_RULES
type: git
connection: AzureRepos
source: Liquibase/CENTRALIZED_RULES
ref: 'refs/heads/main'
steps:
- checkout: DDB_REPOSITORY
persistCredentials: true
clean: true
- checkout: SQL_REPOSITORY
clean: true
fetchDepth: 100
persistCredentials: true
- checkout: CENTRALIZED_RULES_REPOSITORY
clean: true
persistCredentials: true
- script: |
whoami
hammer show version
echo $(System.DefaultWorkingDirectory)
displayName: 'Run Prechecks'
# Ensure the PATH includes the necessary executables, eg. hammer and sqlplus/sqlcmd/clpplus/psql
# Also copy the Rules from the Centralized rules folder to the DDB folder
- powershell: |
$Env:Path += ";C:\Users\Administrator\DaticalDB\repl"
$Env:Path += ";C:\Program Files\Microsoft SQL Server\Client SDK\ODBC\170\Tools\Binn\"
cd $(Git.Repository)
xcopy /s/e/y ..\CENTRALIZED_RULES\Rules\. .\Rules\*
condition: eq( variables['Agent.OS'], 'Windows_NT' )
displayName: 'Update PATH (Windows) and set Rules'
- script: |
export PATH="$PATH:/opt/datical/DaticalDB/repl"
export PATH="$PATH:/opt/mssql-tools/bin"
cd $(Git.Repository)
cp -R ../CENTRALIZED_RULES/Rules .
condition: eq( variables['Agent.OS'], 'Linux' )
displayName: 'Update PATH (Linux) and set Rules'
- script: |
cd $(Git.Repository)
hammer groovy deployPackager.groovy pipeline=current commitPrefix="[skip ci]" scm=true labels=$(Build.BuildId),current
if [ $? -ne 0 ]; then exit 1; fi
displayName: 'Run Liquibase packager'
env:
DDB_USER: $(Liquibase-abc-ref1-User)
DDB_PASS: $(Liquibase-abc-ref1-Pass)
DDB_DMCDB_USER: $(Liquibase-abc-dmc-User)
DDB_DMCDB_PASS: $(Liquibase-abc-dmc-Pass)
# Compress files into .zip
- task: ArchiveFiles@2
inputs:
rootFolderOrFile: '$(System.DefaultWorkingDirectory)/$(Git.Repository)'
includeRootFolder: true
archiveType: 'zip'
archiveFile: '$(System.DefaultWorkingDirectory)/artifacts/$(Application.Name)-$(Build.BuildId).zip'
- upload: $(System.DefaultWorkingDirectory)/artifacts/$(Application.Name)-$(Build.BuildId).zip
artifact: drop
- task: ArtifactoryGenericUpload@2
inputs:
artifactoryService: 'ArtifactoryCS'
specSource: 'taskConfiguration'
fileSpec: |
{
"files": [
{
"pattern": "artifacts/$(Application.Name)-$(Build.BuildId).zip",
"target": "$(Project.Name)/$(Application.Name)/"
}
]
}
replaceSpecVars: true
specVars: '$(Application.Name),$(Build.BuildId)'
collectBuildInfo: true
buildName: '$(Build.DefinitionName)'
buildNumber: '$(Build.BuildNumber)'
module: '$(Project.Name)'
failNoOp: true
displayName: 'Upload zip to artifactory'
- task: ArtifactoryPublishBuildInfo@1
inputs:
artifactoryService: 'ArtifactoryCS'
buildName: '$(Build.DefinitionName)'
buildNumber: '$(Build.BuildNumber)'
displayName: 'Publish Build Info to artifactory'
- task: ArtifactoryBuildPromotion@1
inputs:
artifactoryService: 'ArtifactoryCS'
buildName: '$(Build.DefinitionName)'
buildNumber: '$(Build.BuildNumber)'
targetRepo: '$(Project.Name)'
status: 'Released'
sourceRepo: '$(Project.Name)'
includeDependencies: false
copy: false
dryRun: false
displayName: 'Promote Build to artifactory'
- task: AzureKeyVault@2
inputs:
azureSubscription: 'CustomerSuccessPayAsGo'
KeyVaultName: 'cs-key-vault1'
SecretsFilter: '*'
RunAsPreJob: true
variables.yml file
Code Block:
variables:
Project.Name: ABC
Application.Name: ABC-App
Git.Repository: ABC_DDB
Git.Repository2: ABC_SQL
Agent.PoolName: 'CS'
JFrog.EndPoint: 'JFrog-DevOps'
JFrog.Repository: 'abc'
App.Version: $(Build.BuildId)
Source.Branch: 'current'
Step 8: Create a Release to run Forecast and Deploy Jobs
Instructions: https://learn.microsoft.com/en-us/azure/devops/pipelines/release/?view=azure-devops
Azure Releases can be used to run the Liquibase Enterprise Forecast and Deploy jobs. These jobs will pull the repository from an Artifact.
Under Releases select + NEW to create a New Release Pipeline. Add an artifact. This example uses a JFrog Artifactory artifact. For JFrog Artifactory you will need to have set up an Artifactory Service; see https://www.jfrog.com/confluence/display/JFROG/Artifactory+Azure+DevOps+Extension.
...
For the Build Name, select the build pipeline you created in Step 7.
...
Step 9: Configure your Release Pipeline
In this example we have setup the following workflow:
Deploy to DEV → Forecast to Test → Deploy to Test → Forecast to Prod → Deploy to Prod
...
Under the Variables section, make sure to Link the Variable Group created in Step 6.
...
Each step represents a Stage in the Pipeline.
Configure a Forecast Stage
For each Forecast Stage, select the Agent that will be used to run the job. Be sure to select the Agent Pool configured in Step 1.
...
Each Azure Stage is composed of Tasks. Use a Command line task to run the Liquibase Enterprise hammer commands. (Please note, Windows agents may require slightly different task types or script code; e.g., PowerShell may be required.)
Be sure to unzip the artifact prior to running the hammer commands.
You’ll need to specify the Working Directory under the Advanced section.
You’ll also want to set DDB_USER, DDB_PASS, DDB_DMCDB_USER, and DDB_DMCDB_PASS under the Environment Variables section. These values will be set to the names of the vault values, e.g. $(Liquibase-abc-dev-User), $(Liquibase-abc-dev-Pass), $(Liquibase-abc-dmc-User), $(Liquibase-abc-dmc-Pass).
Script Code:
Code Block:
whoami
hammer show version
unzip -o *.zip
# Depending on how you have zipped the artifact you may need to cd into the DDB repo.
hammer forecast --labels=current --pipeline=current TEST
...
Configure a Deploy Stage
A Deploy Stage will be identical to a Forecast stage with the exception of the Script Code.
For Deploys use the following:
Code Block:
whoami
hammer show version
unzip -o *.zip
# Depending on how you have zipped the artifact you may need to cd into the DDB repo.
hammer deploy --labels=current --pipeline=current TEST
Step 10: Test your Pipeline
Commit a change to the packaging branch in your SQL repo. Go to your Pipeline in Azure and select “Run pipeline”.
...
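The same run can be queued from the CLI if that is more convenient; the pipeline name is whatever you called the pipeline in Step 7:
Code Block:
# Queue the packager pipeline by name (requires the azure-devops CLI extension)
az pipelines run --name <your pipeline name> --organization https://dev.azure.com/<your org> --project <your project>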
After all your pipeline steps have completed successfully, your artifact will be ready to run a Release.
...
Go to Releases and select “Create release”.
Select all the stages to run.
Specify the Artifact Version to run. Select the most recent.
You’ll need to click on each Stage and then click “Deploy” to trigger it. (Please note this “Deploy” terminology is from Azure; even the Liquibase Enterprise Forecast is triggered via a “Deploy” button.)
...
Troubleshooting
ERROR: Error getting list of commits; Invalid revision range; unknown revision or path not in the working tree
Problem: One of the following messages appears when executing the pipeline:
ERROR: Error getting list of commits
ERROR: fatal: Invalid revision range
ERROR: fatal: ambiguous argument '158902d..HEAD': unknown revision or path not in the working tree.
Resolution
Azure DevOps has a default git fetch depth of 1 commit. With such a shallow clone, the Deploy Packager cannot find the sqlScmLastImportID specified in deployPackager.properties. Updating the sqlScmLastImportID in the latest commit results only in a temporary, one-off success.
The fix is to increase fetchDepth on the checkout step. A large fetchDepth can affect pipeline performance over time, especially if large commits are made to the SQL repo. For example, a fetchDepth of '0' fetches all commits in the branch, but doing this on every run can waste resources as the repository grows (https://learn.microsoft.com/en-us/azure/devops/pipelines/repos/pipeline-options-for-git?view=azure-devops&tabs=yaml#shallow-fetch).
This parameter might require tuning if the error appears again. We recommend starting with a relatively low value, such as 100, to maintain a consistent rolling window of 100 commits. While this setting can be made in the UI for basic pipelines with no YAML definition, it should be made directly in the pipeline YAML where the checkout steps are defined.
Below is an example of the usage of fetchDepth:
Code Block:
- checkout: SQL_REPOSITORY
clean: true
persistCredentials: true
fetchDepth: 100