...

  1. Check out the SQL code from the appropriate branch.
  2. Check out the DaticalDB project from the main branch.
  3. Run the packager through the Command Line Interface (CLI): "hammer groovy deployPackager.groovy".
  4. Create an artifact (consisting of the DaticalDB project directory).
  5. After the artifact has been created, the process is the same in each environment (see the sketch after this list):
    1. download the artifact
    2. run the CLI to do a forecast
    3. run the CLI to execute the deployment itself
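
As a rough sketch of those per-environment steps (the artifact name, the "DEV" environment name, and the exact hammer arguments are assumptions; check them against your Datical DB version):

Code Block: Per-environment deployment (sketch)
# download/unpack the artifact produced by the packager job
# (build number 42 is hypothetical)
unzip -q ddb-packager-42.zip -d portal_ddb
cd portal_ddb

# preview what would be deployed, then deploy it
hammer forecast DEV
hammer deploy DEV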

...

The file structure is typically as follows:

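A sketch of such a layout, assuming the fixed folder names described below (your directory names may differ):

Code Block
portal_sql/
└── sql_code/
    ├── ddl/
    ├── ddl_direct/
    ├── view/
    ├── function/
    ├── procedure/
    ├── package/
    ├── packagebody/
    ├── trigger/
    ├── sql/
    ├── sql_direct/
    └── data_dml/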

where:

  • portal_sql is the name of your repository,
  • sql_code is the name of the directory where the packager will look for new/modified .sql files to package; this directory is referenced in the deployPackager.properties file in your DaticalDB project,
  • the sub-directories represent the different kinds of SQL scripts to package,
  • the process order for a commit when using fixed folder names is the following: ddl, ddl_direct, view, function, procedure, package, packagebody, trigger, sql, sql_direct, and finally data_dml,
  • the process order for a commit when using flexible folders (the packageMethod attribute) is: convert, ddl_direct, storedlogic, sqlfile, direct, and finally data_dml,
  • more details can be found at Placing Files in the SCM Repository.

metadata.properties

In each directory under the top-level one (sql_code in this example), you can create a configuration file named metadata.properties, where you can pass additional directives to the packager (such as an extra label or context).
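
A minimal sketch of such a file (the labels and contexts keys and their values are illustrative; confirm the supported keys against the Datical DB documentation):

Code Block: metadata.properties (sketch)
# applied to every change set packaged from this directory
labels=2017-july
contexts=dev,qa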

...

Below you will find more specific instructions on how to set up your packager job in each major build system. If yours is not explicitly covered, feel free to adapt these instructions to create your own process, or contact us to get help from a team of seasoned professionals.

...

That step should look like:

(screenshot)

DaticalDB project repo

...

  1. Check out to a sub-directory: portal_ddb
  2. Don't trigger a build on commit notification (if you have a CI server, you do not want to start a new build when the DaticalDB project is modified)
  3. Check out to a specific local branch: master (as the changes are cumulative, there is no need for a branch)

That step should look like:

(screenshot)
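
For reference, a command-line equivalent of that checkout step (the repository URL is hypothetical):

Code Block
# clone the DaticalDB project into the portal_ddb sub-directory,
# on a local master branch
git clone --branch master https://scm.example.com/portal_ddb.git portal_ddb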

Build Triggers

...

Check "Delete workspace before build starts", repo is usually small so the cost of cloning it the whole project is negligible compare t to the benefit of a clean environment

(screenshot)

Build

This is where the real work is done for the packager.

...

Code Block: Packager
# specify path to "hammer" - Datical DB's CLI tool
# specify path to the Oracle client
export PATH=$PATH:/opt/datical/DaticalDB/repl:/home/datical/app/datical/product/11.2.0/client_1/bin
export ORACLE_HOME=/home/datical/app/datical/product/11.2.0/client_1/
cd portal_ddb

# invoke Datical DB's Deployment Packager
echo
echo ==== Invoke Deployment Packager ====
hammer groovy deployPackager.groovy pipeline=Pipeline1 labels="2017-july" scm=true

# Pipeline Status to refresh AuditDB
echo
echo ==== Refresh AuditDB ====
hammer --pipeline=Pipeline1 status

echo ==== Creating ddb-packager-${BUILD_NUMBER}.zip ====
zip -q -r ddb-packager-${BUILD_NUMBER}.zip *
mv *.zip ..

echo
echo ==== FINISHED ====



Post-build Actions

...

Log Access

In Jenkins, to be able to access the logs and reports, use the "Archive the artifacts" post-build action:

(screenshot)
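
The "Files to archive" pattern below is an assumption based on the packager script above (the zip is moved to the workspace root) and on the Logs and Reports directories of a typical Datical DB project; adjust it to what your job actually produces:

Code Block
*.zip, portal_ddb/Logs/**, portal_ddb/Reports/**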


Artifact Creation

As for a CI build, you may want to create an artifact, either in Jenkins or in an external artifact repository (ElectricCloud Artifact Repository, IBM CodeStation, ...), that you will use later on for deployment.

IBM uBuild

Atlassian Bamboo

...

IBM CodeStation

JFrog Artifactory

Jenkins 

  1. First install the Artifactory plugin
    1. Set up the connection in the "Manage Jenkins/Configure System" section
  2. Create a generic repository on the Artifactory server
  3. Select the "Generic-Artifactory integration" option in your Jenkins build
    1. Select your Artifactory server
    2. Insert a spec that looks like:

      Code Block
      {
        "files": [
          {
            "pattern": "*.zip",
            "target": "${PROJECT}/${Pipeline}/",
            "flat" : "false"
          }
        ]
      }
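
With "flat": "false", the uploaded zip keeps its relative path under the ${PROJECT}/${Pipeline}/ target in the generic repository; ${PROJECT} and ${Pipeline} are placeholders resolved from your Jenkins build parameters or environment.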


ElectricCloud Artifact Repository

...