Liquibase Enterprise was formerly known as Datical DB.

Overview of the Project Creation Script

You can use the GUI to create projects one at a time. However, if you have several projects to set up, that can take a long time. Using the project_creator.groovy script can be faster. The project_creator script reads information from a set of input files and generates a project directory in your workspace containing the required configuration files. You can then import the new project into your SCM repository to begin using it with your CI/CD tooling.

Example Input Files

Example input files are delivered with Datical DB. They are located in the following directory:

 <DaticalDB-Installation-Directory>/repl/scripts/examples


Select the example files for your database platform.

  • DB2 Example
    • ProjectDB2.xlsx
    • ProjectDB2.dbdefs.tsv.txt
    • ProjectDB2.pipelines.tsv.txt
    • ProjectDB2.project.tsv.txt
  • Oracle Example
    • ProjectNameOracle.xlsx
    • ProjectNameOracle.dbdefs.tsv.txt
    • ProjectNameOracle.pipelines.tsv.txt
    • ProjectNameOracle.project.tsv.txt
  • EDB Postgres Example
    • ProjectPostgres.xlsx
    • ProjectPostgres.dbdefs.tsv.txt
    • ProjectPostgres.pipelines.tsv.txt
    • ProjectPostgres.project.tsv.txt
  • SQL Server Example
    • ProjectNameMSSQL.xlsx
    • ProjectNameMSSQL.dbdefs.tsv.txt
    • ProjectNameMSSQL.pipelines.tsv.txt
    • ProjectNameMSSQL.project.tsv.txt
    • MultiCatalogMSSQL.xlsx
    • MultiCatalogMSSQL.dbdefs.tsv
    • MultiCatalogMSSQL.pipelines.tsv
    • MultiCatalogMSSQL.project.tsv
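
To use an example set as a starting point, copy its files into your provisioning files directory and rename them so that they begin with your project name. For instance, on Linux (the destination directory below is only a placeholder):

 cp <DaticalDB-Installation-Directory>/repl/scripts/examples/ProjectNameOracle.* /opt/datical/provisioning_files/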

How To Use project_creator.groovy

The project_creator.groovy script is executed using the hammer command.

hammer groovy project_creator.groovy [<datical_install_dir>/plugins | ignore] <provisioning_files_dir> <datical_workspace_dir> <project_name> [replace] [testConnections] 


Required Parameters (in this order)

 

<datical_install_dir>/plugins
or
ignore

The JDBC drivers directory, or ignore if the database drivers have already been installed for the CLI

<provisioning_files_dir>

The directory that contains the input files: <project_name>.project.tsv.txt, <project_name>.dbdefs.tsv.txt and <project_name>.pipelines.tsv.txt

<datical_workspace_dir>

The directory that contains the Datical projects

<project_name>

The name of the Datical project you want to create


Optional Parameters

 

replace

Flag to re-create and replace the existing project

testConnections

Flag to test connectivity for all database definitions specified in <project_name>.dbdefs.tsv.txt (connections are tested after they have been set up in the Datical project)
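
For example, a run that re-creates an existing project and verifies every connection defined in its dbdefs file might look like this (the directories and project name are placeholders, and ignore assumes the database drivers are already installed for the CLI):

hammer groovy project_creator.groovy ignore /opt/datical/provisioning_files /opt/datical/workspace "existing_project" replace testConnections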

Prerequisites

  1. Ensure that the Datical DB CLI directory is in your PATH. Otherwise, you will need to specify the full path to the CLI tool, hammer.
  2. Configure the following tab-delimited input files and place them in the <provisioning_files_dir> directory.

Tips:

  • The filenames must begin with your <project_name>. This is how Datical determines which files it should process.
  • The first line of the file must contain the field headers.
  • Column order does not matter.
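
For example, for a project named new_project, the <provisioning_files_dir> directory would contain:

new_project.project.tsv.txt
new_project.dbdefs.tsv.txt
new_project.pipelines.tsv.txt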


<project_name>.project.tsv.txt

Field Headers (case-sensitive)

Description

autoGenSQL

A TRUE or FALSE value that determines if SQL is automatically generated from Datical functions

deployThreshold

Pick one of the 3 available options:

  • stopOnError - Deployment will not be performed when errors are present in Pre-Deployment validation
  • stopOnWarn - Deployment will not be performed when errors or warnings are present in Pre-Deployment validation
  • deployAlways - Deploy will occur regardless of validation results. NOT RECOMMENDED

externalStoredLogic

A TRUE or FALSE value that determines if stored logic is kept inline or in external files

invalidsCheck

Oracle, SQL Server, or DB2 only

Determines the type of validity check for compiled objects:

  • disabled - No check
  • limited - Only check objects targeted by a deployment and their dependencies
  • local - Check objects in all schemas specified in the deployment plan
  • global - Check across the entire database

invalidsAction

Oracle, SQL Server, or DB2 only

Determines what to do when the Stored Logic Validity Check fails. 

Both existing objects already in the database and new objects being deployed are checked.  

  • warn - print WARN message and continue
    • New objects that fail - print WARN message
    • Existing objects that fail - print WARN message
  • fail - print WARN or ERROR message, depending on objects
    • New objects that fail - print ERROR message
    • Existing objects that fail that were not targeted by a changeset in the deploy operation - print WARN message
    • (7.7 & later) Existing objects that fail that were targeted by a changeset in the deploy operation - print ERROR message

requireOptions

A TRUE or FALSE value that determines if Forecast and Deploy Options are always required

deployMode

Datical DB can optionally perform a full set of rules validations and deployment simulation prior to deploying the changes to the database. This is called a "Full" deploy. If you typically do a Forecast and then a deploy, you can set the deployMode to "quick" to skip the forecast and rules validation steps that happen during a Full deploy.

Valid values for this setting are:

  • quick
  • full

dbSchemaDefinition

The value of this setting determines whether the schemas managed by this Plan are defined once for all Deployment Steps in the Plan (value: 'project') or specified independently with each Deployment Step (value: 'dbDef'). For multi-schema projects, this value should always be 'project'.

multiCatalog

SQL Server only

A TRUE or FALSE value to indicate whether this Deployment Plan manages more than one catalog/database.

multiSchema

A TRUE or FALSE value to indicate whether this Deployment Plan manages more than one schema. 

trackingSchema

The schema that should host the Datical Tracking Tables for the Deployment Steps in this Plan.

managedSchema

A comma-separated list of the schemas to be managed by the Deployment Plan. They must be valid and available for the database specified by schemaSelectionStep.

schemaSelectionStep

The dbDef to use for the list of managed schemas in managedSchema.

limitForecast

A TRUE or FALSE value that optimizes the forecast by limiting the objects under consideration to the ones that are getting deployed (instead of the entire database)

enableRowCount

Pick one of the 3 available options:

  • exact - Use a table scan to collect the row count. 
  • approximate - Use statistics to estimate the row count.
  • disabled - Do not collect row counts.

Before v5.7, the semantics were true|false.  When read in v5.7 or later, true is mapped to exact and false is mapped to disabled. 

forecastDML
Oracle only

A TRUE or FALSE value that enables DML forecasting.

enableDdlLockTimeout

Oracle only

A TRUE or FALSE value that enables the Oracle DDL_LOCK_TIMEOUT setting. The actual timeout value is specified with ddlLockTimeoutValue.

ddlLockTimeoutValue

Oracle only

DDL commands require exclusive locks on internal structures. If these locks are not available, some commands fail with an "ORA-00054: resource busy" error, which can be frustrating when trying to modify objects that are accessed frequently. To get around this, Oracle 11g introduced the DDL_LOCK_TIMEOUT parameter, which can be set at the instance or session level using the ALTER SYSTEM and ALTER SESSION commands respectively.

The DDL_LOCK_TIMEOUT parameter indicates the number of seconds a DDL command should wait for the locks to become available before throwing the resource busy error. The default value is zero.

runtimeCredentials

Do not use this field; it has no effect.
Instead, set DbDefClass to one of the classes with the DelayedCredential prefix.

enableSqlParser

Oracle only

Enable use of the SQL Parser for scripts in the ddl_direct, sql, and sql_direct folders. For eligible databases and restrictions on usage, see Using SQL Parser.
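
For reference, a minimal <project_name>.project.tsv.txt might begin like this. The columns are shown spread out for readability but must be separated by tab characters in the actual file; only a few of the fields above are included, and the values are illustrative rather than recommended defaults (see the delivered example files for complete versions).

autoGenSQL    deployThreshold    deployMode    externalStoredLogic    multiSchema
TRUE          stopOnError        full          TRUE                   FALSE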


<project_name>.dbdefs.tsv.txt

Field Headers (case-sensitive)

Description

name

The name for the dbDef. Used as an alias for this connection.

DbDefClass

  • Oracle
    • Inline Credentials:  OracleDbDef
    • Run-Time Supplied Credentials:  DelayedCredentialOracleDbDef
  • SQL Server
    • Inline Credentials:  SqlServerDbDef
    • Run-Time Supplied Credentials:  DelayedCredentialSqlServerDbDef
  • EDB Postgres
    • Inline Credentials:  PostgresqlDbDef
    • Run-Time Supplied Credentials:  DelayedCredentialPostgresDbDef
  • DB2
    • Inline Credentials:  Db2DbDef
    • Run-Time Supplied Credentials:  DelayedCredentialDb2DbDef
  • (deprecated) MySQL Inline Credentials:  MysqlDbDef

hostname

The hostname/ip of the target database server.

port

The port number for JDBC connections to the target database server.

username

The database user name to use for the connection.

password

The password for the specified database user. We strongly recommend using runtime credentials because they are more secure than stored credentials (a stored password can potentially be decoded).

integratedSecurity
SQL Server only

Enable Active Directory for SQL Server. When this is set, the username and password fields should be empty.

database
DB2 or Postgres only

The name of the database you wish to manage.

databaseName
SQL Server only

The name of the database you wish to manage.

instanceName
SQL Server only

The name of the instance you wish to manage.

applicationName
SQL Server only

The name of the database application you wish to manage.

azureClientId

Azure SQL Managed Instance only

Reference DB only

The Microsoft Azure Client ID associated with the account with which you wish to manage the Azure SQL Managed Instance.

azureTenantId

Azure SQL Managed Instance only

Reference DB only

The Microsoft Azure Tenant ID associated with the account for the Azure SQL Managed Instance you wish to manage.

azureClientSecret

Azure SQL Managed Instance only


Reference DB only

The Microsoft Azure Client Secret used to authenticate with the Azure REST API.

The REST API Client Secret is a sensitive credential that should be treated with the same caution as database login credentials. It is strongly recommended that this credential be provided at runtime in an environment variable and not stored in the GUI. For more information, see product documentation about Runtime Credentials.

azureResourceGroup

Azure SQL Managed Instance only

Reference DB only

The Microsoft Azure Resource Group associated with the Azure SQL Managed Instance you wish to manage.

azureSubscriptionId

Azure SQL Managed Instance only

Reference DB only

The Microsoft Azure Subscription ID associated with the Azure SQL Managed Instance you wish to manage.

platformType
DB2 only

Platform type, one of LUW or ZOS. LUW is the default. 

sid
Oracle only

The name of the Oracle SID to which you wish to connect.

serviceName
Oracle only

The name of the Oracle Service to which you wish to connect.

useWallet

Oracle only

Use Oracle Wallet to connect to the Oracle database through an SSL connection. See Using Oracle Wallet for SSL Connections to Oracle Databases.

tnsName

Oracle only

The directory that contains the configuration files for Oracle Wallet.

defaultSchemaName

The name of the schema/catalog you wish to manage. Only used with single schema project configurations.

contexts

A comma-separated list of contexts to associate with the new dbDef.

labels

A comma-separated list of labels to associate with the new dbDef.

storageOptions

A TRUE or FALSE value that determines if Datical collects Storage Options.

dbDefType

Set to standard for a managed database. Set to dmcdb for a Deployment Monitoring Console database.
NOTE: If specifying a dmcdb in your project, project_creator.groovy will attempt to register the project with the dmcdb.

scriptExecutionTimeout

Limit the time to wait for a script to execute, in seconds. The default value is 0, which means there is no limit.
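
As an illustration, a <project_name>.dbdefs.tsv.txt describing two Oracle databases with run-time supplied credentials might look like this. The columns are tab-separated in the actual file; the hostnames, port, service name, schema, and labels are placeholder assumptions, and only a subset of the fields above is shown.

name    DbDefClass                      hostname              port    serviceName    defaultSchemaName    labels       dbDefType
REF     DelayedCredentialOracleDbDef    ref-db.example.com    1521    ORCLPDB1       MYAPP                reference    standard
DEV     DelayedCredentialOracleDbDef    dev-db.example.com    1521    ORCLPDB1       MYAPP                dev          standard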


<project_name>.pipelines.tsv.txt 

Field Headers (case-sensitive)

Description

pipeline

Name of the pipeline

description

Description for the pipeline

dbDef

Name of a dbDef from <project_name>.dbdefs.tsv.txt

visible

A TRUE or FALSE value that indicates whether this step is visible in the pipeline
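
For example, a single pipeline that promotes changes through the two dbDefs sketched earlier might be described like this (tab-separated in the actual file; the pipeline name and descriptions are illustrative):

pipeline    description              dbDef    visible
Current     Main release pipeline    DEV      TRUE
Current     Main release pipeline    REF      TRUE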

Example: Running the Script on Windows Systems

% hammer groovy project_creator.groovy "F:\drivers\jdbc_drivers" "F:\provisioning_files" "C:\Users\TestUser1\datical" "new_project" testConnections

Example: Running the Script on Linux Systems

% hammer groovy project_creator.groovy ignore /opt/datical/provisioning_files /opt/datical/workspace "new_project" testConnections

Results

A <project_name> directory containing a datical.project file is created in your Datical workspace (<datical_workspace_dir>).

