...
This document explains how to set up an Azure pipeline with an agent to execute Liquibase Enterprise operations. An Azure DevOps agent can be shared among multiple projects, so one agent can serve several teams. You can learn more about Azure agents here: https://learn.microsoft.com/en-us/azure/devops/pipelines/agents/agents?view=azure-devops&tabs=browser.
Additionally, we’ll show how the Liquibase Enterprise forecast and deploy commands can be run via an Azure Release using a Classic release pipeline: https://learn.microsoft.com/en-us/azure/devops/pipelines/release/?view=azure-devops
Instructions
Table of Contents |
---|
Prerequisites
...
Sample Repo
A sample public GitHub repo can be found here:
https://github.com/liquibase/cs_azure_cicd_pipelines
Prerequisites
Before following the steps in this document, set up your databases and create the Liquibase project.
...
A centralized Rules repository.
A centralized CI/CD repository.
Push the Liquibase project configuration files to the project repository in Azure Repos.
...
When using a centralized Rules repository be sure to include the following in the .gitignore of the Liquibase Project repository.
Code Block |
---|
Rules/ |
Push the Liquibase project configuration files to the project repository in Azure Repos.
Code Block |
---|
git init --initial-branch=main
git remote add origin git@ssh.dev.azure.com:v3/asmith0101/Liquibase/ABC_DDB
git add .
git commit -m "Initial commit"
git push -u origin main |
...
Windows: Installing Liquibase Enterprise Clients on Windows Systems
Linux: Using the CLI and the Composite Repository to Install Liquibase Enterprise on Linux Systems
Make sure that git is installed on the agent, as it is needed by the Liquibase Enterprise Deploy Packager. See SCM Requirements and Features.
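Before wiring up the pipeline, it can be worth verifying the prerequisites from a shell on the agent machine. A minimal sketch (exact output will vary by installation):
Code Block |
---|
# Confirm the Liquibase Enterprise CLI and git are on the agent's PATH
hammer show version
git --version |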
...
Step 7: Create a Pipeline to run Packager
Create an Azure Pipeline to run the Deploy Packager (Build) job. Azure Pipelines are created from an azure_pipelines.yml file.
...
Code Block |
---|
trigger: none

name: $(Application.Name)-$(Build.BuildId)

variables:
- template: ./variables.yml
- group: Liquibase_Variables

pool:
  name: $(Agent.PoolName)

workspace:
  clean: all

resources:
  repositories:
  - repository: DDB_REPOSITORY
    name: ABC_DDB
    type: git
    connection: AzureRepos
    source: ABC_DDB
    ref: 'refs/heads/main'
  - repository: SQL_REPOSITORY
    name: ABC_SQL
    type: git
    connection: AzureRepos
    source: ABC_SQL
    ref: 'refs/heads/current'
  - repository: CICD_REPOSITORY
    name: Liquibase/DB_CICD_PIPELINES
    type: git
    connection: AzureRepos
    source: Liquibase/DB_CICD_PIPELINES
    ref: 'refs/heads/main'
  - repository: CENTRALIZED_RULES_REPOSITORY
    name: Liquibase/CENTRALIZED_RULES
    type: git
    connection: AzureRepos
    source: Liquibase/CENTRALIZED_RULES
    ref: 'refs/heads/main'

steps:
- checkout: DDB_REPOSITORY
  persistCredentials: true
  clean: true
- checkout: SQL_REPOSITORY
  clean: true
  fetchDepth: 100
  persistCredentials: true
- checkout: CENTRALIZED_RULES_REPOSITORY
  clean: true
  persistCredentials: true

- script: |
    whoami
    hammer show version
    echo $(System.DefaultWorkingDirectory)
  displayName: 'Run Prechecks'

# Ensure the PATH includes the necessary executables, e.g. hammer and sqlplus/sqlcmd/clpplus/psql.
# Also copy the Rules from the Centralized Rules folder to the DDB folder.
- powershell: |
    $Env:Path += ";C:\Users\Administrator\DaticalDB\repl"
    $Env:Path += ";C:\Program Files\Microsoft SQL Server\Client SDK\ODBC\170\Tools\Binn\"
    cd $(Git.Repository)
    xcopy /s/e/y ..\CENTRALIZED_RULES\Rules\. .\Rules\*
  condition: eq( variables['Agent.OS'], 'Windows_NT' )
  displayName: 'Update PATH (Windows) and set Rules'

- script: |
    export PATH="$PATH:/opt/datical/DaticalDB/repl"
    export PATH="$PATH:/opt/mssql-tools/bin"
    cd $(Git.Repository)
    cp -R ../CENTRALIZED_RULES/Rules .
  condition: eq( variables['Agent.OS'], 'Linux' )
  displayName: 'Update PATH (Linux) and set Rules'

- script: |
    cd $(Git.Repository)
    hammer groovy deployPackager.groovy pipeline=current commitPrefix="[skip ci]" scm=true labels=$(Build.BuildId),current
    if [ $? -ne 0 ]; then exit 1; fi
  displayName: 'Run Liquibase packager'
  env:
    DDB_USER: $(Liquibase-abc-ref1-User)
    DDB_PASS: $(Liquibase-abc-ref1-Pass)
    DDB_DMCDB_USER: $(Liquibase-abc-dmc-User)
    DDB_DMCDB_PASS: $(Liquibase-abc-dmc-Pass)

# Compress files into .zip
- task: ArchiveFiles@2
  inputs:
    rootFolderOrFile: '$(System.DefaultWorkingDirectory)/$(Git.Repository)'
    includeRootFolder: true
    archiveType: 'zip'
    archiveFile: '$(System.DefaultWorkingDirectory)/artifacts/$(Application.Name)-$(Build.BuildId).zip'

- upload: $(System.DefaultWorkingDirectory)/artifacts/$(Application.Name)-$(Build.BuildId).zip
  artifact: drop

- task: ArtifactoryGenericUpload@2
  inputs:
    artifactoryService: 'ArtifactoryCS'
    specSource: 'taskConfiguration'
    fileSpec: |
      {
        "files": [
          {
            "pattern": "artifacts/$(Application.Name)-$(Build.BuildId).zip",
            "target": "$(Project.Name)/$(Application.Name)/"
          }
        ]
      }
    replaceSpecVars: true
    specVars: '$(Application.Name),$(Build.BuildId)'
    collectBuildInfo: true
    buildName: '$(Build.DefinitionName)'
    buildNumber: '$(Build.BuildNumber)'
    module: '$(Project.Name)'
    failNoOp: true
  displayName: 'Upload zip to artifactory'

- task: ArtifactoryPublishBuildInfo@1
  inputs:
    artifactoryService: 'ArtifactoryCS'
    buildName: '$(Build.DefinitionName)'
    buildNumber: '$(Build.BuildNumber)'
  displayName: 'Publish Build Info to artifactory'

- task: ArtifactoryBuildPromotion@1
  inputs:
    artifactoryService: 'ArtifactoryCS'
    buildName: '$(Build.DefinitionName)'
    buildNumber: '$(Build.BuildNumber)'
    targetRepo: '$(Project.Name)'
    status: 'Released'
    sourceRepo: '$(Project.Name)'
    includeDependencies: false
    copy: false
    dryRun: false
  displayName: 'Promote Build to artifactory'

- task: AzureKeyVault@2
  inputs:
    azureSubscription: 'CustomerSuccessPayAsGo'
    KeyVaultName: 'cs-key-vault1'
    SecretsFilter: '*'
    RunAsPreJob: true
|
variables.yml file
Code Block |
---|
variables:
  Project.Name: ABC
  Application.Name: ABC-App
  Git.Repository: ABC_DDB
  Git.Repository2: ABC_SQL
  Agent.PoolName: 'CS'
  JFrog.EndPoint: 'JFrog-DevOps'
  JFrog.Repository: 'abc'
  App.Version: $(Build.BuildId)
  Source.Branch: 'current' |
Step 8: Create a Release to run Forecast and Deploy Jobs
Instructions: https://learn.microsoft.com/en-us/azure/devops/pipelines/release/?view=azure-devops
Azure Releases can be used to run the Liquibase Enterprise Forecast and Deploy jobs. These jobs will pull the repository from an Artifact.
Under Releases select + NEW to create a New Release Pipeline. Add an artifact. This example uses a JFrog Artifactory artifact. For JFrog Artifactory you will need to have set up an Artifactory Service; see https://www.jfrog.com/confluence/display/JFROG/Artifactory+Azure+DevOps+Extension.
...
For the Build Name, select the build pipeline you created in Step 7.
...
Step 9: Configure your Release Pipeline
In this example we have set up the following workflow:
Deploy to DEV → Forecast to Test → Deploy to Test → Forecast to Prod → Deploy to Prod
...
Under the Variables section, make sure to Link the Variable Group created in Step 6.
...
Each step represents a Stage in the Pipeline.
Configure a Forecast Stage
For each Forecast Stage, select the Agent that will be used to run the job. Be sure to select the Agent Pool configured in Step 1.
...
Each Azure Stage is composed of Tasks. Use a Command line task to run the Liquibase Enterprise hammer commands. (Please note, Windows agents may require slightly different task types or script code; e.g. PowerShell may be required.)
Be sure to unzip the artifact prior to running the hammer commands.
You’ll need to specify the Working Directory under the Advanced section.
You’ll also want to set DDB_USER, DDB_PASS, DDB_DMCDB_USER, and DDB_DMCDB_PASS under the Environment Variables section. These values should be set to the names of the vault values, e.g. $(Liquibase-abc-dev-User), $(Liquibase-abc-dev-Pass), $(Liquibase-abc-dmc-User), and $(Liquibase-abc-dmc-Pass).
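For example, the Environment Variables section of the task would map each name to the linked vault variable. A sketch using the example secret names above, written as NAME=value pairs:
Code Block |
---|
DDB_USER=$(Liquibase-abc-dev-User)
DDB_PASS=$(Liquibase-abc-dev-Pass)
DDB_DMCDB_USER=$(Liquibase-abc-dmc-User)
DDB_DMCDB_PASS=$(Liquibase-abc-dmc-Pass) |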
Script Code:
Code Block |
---|
whoami
hammer show version
unzip -o *.zip
# Depending on how you have zipped the artifact you may need to cd into the DDB repo.
hammer forecast --labels=current --pipeline=current TEST |
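On a Windows agent, a PowerShell task can perform the same steps. A minimal sketch, assuming the artifact zip is in the task's working directory and hammer is on the PATH:
Code Block |
---|
whoami
hammer show version
# Expand each zip found in the working directory
Get-ChildItem *.zip | ForEach-Object { Expand-Archive -Path $_.FullName -DestinationPath . -Force }
# Depending on how you have zipped the artifact you may need to cd into the DDB repo.
hammer forecast --labels=current --pipeline=current TEST |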
...
Configure a Deploy Stage
A Deploy Stage will be identical to a Forecast stage with the exception of the Script Code.
For Deploys use the following:
Code Block |
---|
whoami
hammer show version
unzip -o *.zip
# Depending on how you have zipped the artifact you may need to cd into the DDB repo.
hammer deploy --labels=current --pipeline=current TEST |
Step 10: Test your Pipeline
Commit a change to the packaging branch in your SQL repo. Go to your Pipeline in Azure and select “Run pipeline”.
...
After all your pipeline steps have completed successfully, your artifact will be ready to run a Release.
...
Go to Releases and select “Create release”.
Select all the stages to run.
Specify the Artifact Version to run. Select the most recent.
You’ll need to click on each Stage to trigger it, then click “Deploy”. (Please note the “Deploy” terminology comes from Azure; even a Liquibase Enterprise Forecast is triggered via the “Deploy” button.)
...
Troubleshooting
ERROR: Error getting list of commits; Invalid revision range; unknown revision or path not in the working tree
Problem: One of the following messages appears when executing the pipeline:
ERROR: Error getting list of commits
ERROR: fatal: Invalid revision range
ERROR: fatal: ambiguous argument '158902d..HEAD': unknown revision or path not in the working tree.
Resolution
Azure DevOps has a default Git fetch depth of 1 commit. With such a shallow clone, the Deploy Packager cannot find the sqlScmLastImportID specified in deployPackager.properties. Updating the sqlScmLastImportID to the latest commit results only in a temporary, one-off success.
A large fetchDepth can affect pipeline performance over time, especially if large commits are made to the SQL repo. For example, a fetchDepth of '0' fetches all commits in the branch, but doing this on every run wastes resources as the repository grows (https://learn.microsoft.com/en-us/azure/devops/pipelines/repos/pipeline-options-for-git?view=azure-devops&tabs=yaml#shallow-fetch).
This parameter may require tuning if the error reappears. We recommend starting with a relatively low value, say 100, to maintain a consistent rolling window of 100 commits. While this setting can be changed in the UI for basic pipelines with no YAML definition, it should be set directly in the pipeline YAML where the checkout steps are defined.
Below is an example of the usage of fetchDepth:
Code Block |
---|
- checkout: SQL_REPOSITORY
clean: true
persistCredentials: true
fetchDepth: 100 |
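If you need a one-time full-history fetch to recover from the error above, a fetchDepth of 0 disables shallow fetch entirely, as noted in the Microsoft documentation linked above; consider returning to a bounded depth once packaging succeeds again:
Code Block |
---|
- checkout: SQL_REPOSITORY
  clean: true
  persistCredentials: true
  fetchDepth: 0 |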