Overview of the Project Creation Script
You can use the GUI to create projects one at a time, but if you have several projects to set up, that can take a long time. Using the project_creator.groovy script can be faster. The project_creator script reads information from a set of input files and generates a project directory in your workspace containing the required configuration files. You can then import the new project into your SCM repository to begin using it with your CI/CD tooling.
Example Input Files
Example input files are delivered with Datical DB. They are located in the following directory.
<DaticalDB-Installation-Directory>/repl/scripts/examples
Select the example files for your database platform.
- DB2 Example
  - ProjectDB2.dbdefs.tsv.txt
  - ProjectDB2.pipelines.tsv.txt
  - ProjectDB2.project.tsv.txt
  - ProjectDB2.xlsx
- Oracle Example
  - ProjectNameOracle.dbdefs.tsv.txt
  - ProjectNameOracle.pipelines.tsv.txt
  - ProjectNameOracle.project.tsv.txt
  - ProjectNameOracle.xlsx
- EDB Postgres Example
  - ProjectPostgres.dbdefs.tsv.txt
  - ProjectPostgres.pipelines.tsv.txt
  - ProjectPostgres.project.tsv.txt
  - ProjectPostgres.xlsx
- SQL Server Example
  - ProjectNameMSSQL.dbdefs.tsv.txt
  - ProjectNameMSSQL.pipelines.tsv.txt
  - ProjectNameMSSQL.project.tsv.txt
  - ProjectNameMSSQL.xlsx
  - MultiCatalogMSSQL.dbdefs.tsv
  - MultiCatalogMSSQL.pipelines.tsv
  - MultiCatalogMSSQL.project.tsv
  - MultiCatalogMSSQL.xlsx
How To Use project_creator.groovy
The project_creator.groovy script is executed using the hammer command.
hammer groovy project_creator.groovy [<datical_install_dir>/plugins | ignore] <provisioning_files_dir> <datical_workspace_dir> <project_name> [replace] [testConnections]
Required Parameters (in this order) | Description
---|---
<datical_install_dir>/plugins | The JDBC drivers directory. Pass the literal value `ignore` to skip it (see the Linux example below).
<provisioning_files_dir> | The directory that contains the input files
<datical_workspace_dir> | The directory that contains the Datical projects
<project_name> | The name of the Datical project you want to create

Optional Parameters | Description
---|---
replace | Flag to re-create and replace the existing project
testConnections | Flag to test connectivity for all database definitions specified in <project_name>.dbdefs.tsv.txt (connections are tested after they have been set up in the Datical project)
Prerequisites
- Ensure that the Datical DB CLI directory is in your path. Otherwise, you need to specify the full path to the CLI tool, hammer.
- Configure and place the following input files (tab-delimited files) in the <provisioning_files_dir> directory.
Tips:
- The filenames must begin with your <project_name>. This is how Datical determines which files to process.
- The first line of each file must contain the field headers.
- Column order does not matter; you can place the columns in any order.
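The naming convention can be sketched as a small pre-flight check. This is a hypothetical helper, not part of Datical DB; the suffix list simply reflects the three required input files described below:

```python
from pathlib import Path

# The three input files project_creator.groovy processes; each filename
# must begin with the project name.
REQUIRED_SUFFIXES = (
    ".project.tsv.txt",
    ".dbdefs.tsv.txt",
    ".pipelines.tsv.txt",
)

def expected_input_files(project_name):
    """Return the input filenames expected for a given project."""
    return [project_name + suffix for suffix in REQUIRED_SUFFIXES]

def missing_input_files(provisioning_dir, project_name):
    """Return the expected files that are absent from provisioning_dir."""
    d = Path(provisioning_dir)
    return [name for name in expected_input_files(project_name)
            if not (d / name).is_file()]
```

Running `missing_input_files` against your <provisioning_files_dir> before invoking the script is a quick way to catch misnamed files.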
<project_name>.project.tsv.txt

Field Headers (case-sensitive) | Description
---|---
autoGenSQL | A TRUE or FALSE value that determines whether SQL is automatically generated from Datical functions
deployThreshold | Pick one of the 3 available options.
externalStoredLogic | A TRUE or FALSE value that determines whether stored logic is kept inline or in external files
invalidsCheck | Determines the scope of the validity check for compiled objects: `limited` checks only objects targeted by a deployment and their dependencies; `local` checks objects in all schemas specified in the deployment plan
invalidsAction | Determines what to do when the Stored Logic Validity Check fails. Both existing objects already in the database and new objects being deployed are checked.
requireOptions | A TRUE or FALSE value that determines whether Forecast and Deploy options are always required
deployMode | Datical DB can optionally perform a full set of rules validations and a deployment simulation before deploying the changes to the database. This is called a "full" deploy. If you typically run a Forecast and then a Deploy, you can set deployMode to "quick" to skip the forecast and rules-validation steps that happen during a full deploy. Valid values for this setting are `full` and `quick`.
dbSchemaDefinition | Determines whether the schemas managed by this plan are defined once for all deployment steps in the plan (value: `project`) or specified independently with each deployment step (value: `dbDef`). For multi-schema projects, this value should always be `project`.
multiCatalog | A TRUE or FALSE value that indicates whether this deployment plan manages more than one catalog (SQL Server only)
multiSchema | A TRUE or FALSE value that indicates whether this deployment plan manages more than one schema
trackingSchema | The schema that hosts the Datical tracking tables for the deployment steps in this plan
managedSchema | A comma-separated list of the schemas to be managed by the deployment plan. They must be valid and available for the database specified by schemaSelectionStep.
schemaSelectionStep | The dbDef to use for the list of managed schemas in managedSchema
limitForecast | A TRUE or FALSE value that optimizes the forecast by limiting the objects under consideration to the ones being deployed (instead of the entire database)
enableRowCount | Pick one of the 3 available options. Before v5.7, the semantics were TRUE or FALSE. When read in v5.7 or later, TRUE is mapped to `exact` and FALSE is mapped to `disabled`.
forecastDML | A TRUE or FALSE value that enables DML forecasting
enableDdlLockTimeout (Oracle only) | A TRUE or FALSE value that enables the Oracle DDL_LOCK_TIMEOUT; the timeout value itself is set with ddlLockTimeoutValue
ddlLockTimeoutValue (Oracle only) | DDL commands require exclusive locks on internal structures. If these locks are not available, some commands fail with an "ORA-00054: resource busy" error, which can be frustrating when you are trying to modify objects that are accessed frequently. To get around this, Oracle 11g introduced the DDL_LOCK_TIMEOUT parameter, which specifies how long (in seconds) a DDL command waits for the locks before failing.
runtimeCredentials | Do not use this field; it has no effect.
enableSqlParser | Enables use of the SQL parser for sql_direct scripts. For eligible databases and restrictions on usage, see Using SQL Parser.
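As a sketch of what a minimal <project_name>.project.tsv.txt might look like, the file can be produced by any tool that writes tab-delimited text; headers go on the first line and column order is free. The field subset and values here are illustrative assumptions, not recommendations:

```python
import csv

# Illustrative subset of project.tsv.txt fields; values are assumptions.
fields = ["autoGenSQL", "externalStoredLogic", "multiSchema", "trackingSchema"]
row = {
    "autoGenSQL": "TRUE",
    "externalStoredLogic": "FALSE",
    "multiSchema": "FALSE",
    "trackingSchema": "DATICAL",
}

with open("new_project.project.tsv.txt", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=fields, delimiter="\t")
    writer.writeheader()   # first line: field headers
    writer.writerow(row)   # one value row, tab-delimited
```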
<project_name>.dbdefs.tsv.txt

Field Headers (case-sensitive) | Description
---|---
name | The name for the dbDef. Used as an alias for this connection.
DbDefClass |
hostname | The hostname/IP of the target database server
port | The port number for JDBC connections to the target database server
username | The database user name to use for the connection
password | The password for the specified database user. We strongly recommend using runtime credentials instead, because stored credentials are less secure (a stored password can be decoded).
integratedSecurity | Enables Active Directory authentication for SQL Server. When this is set, the username and password fields should be empty.
database | The type of the database you wish to manage
databaseName | The name of the database you wish to manage
instanceName | The name of the instance you wish to manage
applicationName | The name of the database application you wish to manage
platformType | The platform type, either LUW or ZOS. LUW is the default.
sid | The name of the Oracle SID to which you wish to connect
serviceName | The name of the Oracle service to which you wish to connect
useWallet (Oracle only) | Use Oracle Wallet to connect to the Oracle database through an SSL connection. See Using Oracle Wallet for SSL Connections to Oracle Databases.
tnsName (Oracle only) | The directory that contains configuration files for Oracle Wallet
defaultSchemaName | The name of the schema/catalog you wish to manage
contexts | A comma-separated list of contexts to associate with the new dbDef
labels | A comma-separated list of labels to associate with the new dbDef
storageOptions | A TRUE or FALSE value that determines whether Datical collects storage options
dbDefType | Set to
scriptExecutionTimeout | Limits the time to wait for a script to execute, in seconds. The default value is 0, which means there is no limit.
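A <project_name>.dbdefs.tsv.txt sketch follows the same tab-delimited shape, with one row per database definition. The dbDef names, hosts, ports, and schema below are assumptions for illustration, not real servers:

```python
import csv

# Two illustrative database definitions; all values are assumptions.
fields = ["name", "hostname", "port", "username", "database", "defaultSchemaName"]
rows = [
    {"name": "REF",  "hostname": "ref-db.example.com",  "port": "1521",
     "username": "datical", "database": "Oracle", "defaultSchemaName": "APP"},
    {"name": "PROD", "hostname": "prod-db.example.com", "port": "1521",
     "username": "datical", "database": "Oracle", "defaultSchemaName": "APP"},
]

with open("new_project.dbdefs.tsv.txt", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=fields, delimiter="\t")
    writer.writeheader()
    writer.writerows(rows)
```

Note that passwords are omitted here in favor of runtime credentials, per the recommendation above.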
<project_name>.pipelines.tsv.txt

Field Headers (case-sensitive) | Description
---|---
pipeline | The name of the pipeline
description | A description for the pipeline
dbDef | The name of a dbDef from <project_name>.dbdefs.tsv.txt
visible | A TRUE or FALSE value that determines whether this step is visible in the pipeline
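A <project_name>.pipelines.tsv.txt sketch assigns dbDefs to pipeline steps; each dbDef value must match a name defined in <project_name>.dbdefs.tsv.txt. The pipeline name and step values here are assumptions for illustration:

```python
import csv

# Illustrative pipeline steps; dbDef values must match names defined in
# <project_name>.dbdefs.tsv.txt. All values are assumptions.
fields = ["pipeline", "description", "dbDef", "visible"]
rows = [
    {"pipeline": "Release", "description": "Main release pipeline",
     "dbDef": "REF",  "visible": "TRUE"},
    {"pipeline": "Release", "description": "Main release pipeline",
     "dbDef": "PROD", "visible": "TRUE"},
]

with open("new_project.pipelines.tsv.txt", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=fields, delimiter="\t")
    writer.writeheader()
    writer.writerows(rows)
```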
Example: Running the Script on Windows Systems
% hammer groovy project_creator.groovy "F:\drivers\jdbc_drivers" "F:\provisioning_files" "C:\Users\TestUser1\datical" "new_project" testConnections
Example: Running the Script on Linux Systems
% hammer groovy project_creator.groovy ignore /opt/datical/provisioning_files /opt/datical/workspace "new_project" testConnections
Results
A <project_name> directory containing a datical.project file is placed in your datical workspace.