Overview
Below are suggestions that can improve packager performance. Which suggestions are most relevant depends on which stages of packaging, and which packaging folders and script types, show the longest times in packagerReport.html. The first step is to look in your packager reports to see which stage(s) of packaging are taking the most time.
...
First, run deployPackager and obtain timing data on the complete workflow from packagerReport.html. You may want to run packager multiple times, with different types of scripts committed in different folders. Timing will likely differ between script types because packager uses a different process for each: for example, the ddl folder (when using the default ddl packageMethod=convert), versus the sql_direct or ddl_direct folder (or any folder with any name that uses packageMethod=direct or packageMethod=ddl_direct), versus the function/procedure/etc. stored logic folders (which use packageMethod=storedlogic).
...
- In the desktop client/Eclipse GUI, set it to approximate. See Collect Row Counts During Forecast & Deploy here → Configuring Project Settings
- Set it using the hammer CLI command (example: "hammer set enableRowCount approximate"). See enableRowCount here → CLI Commands#set
- If you use the optional project creator script, see enableRowCount here → Creating a Datical Project Using the Project Creation Script (project_creator.groovy)
6. Use the optional Limited Forecast, which is typically faster than Full Forecast. Please see this page → Limited Forecast
- Caveat: This could have an impact on Rules enforcement if using Rules.
7. Run packager as a different user than the schema owner (so packager drops the schema and re-creates it, instead of clearing out each object).
8. For Oracle/SQL Server/DB2, change the Stored Logic Validity Check setting to "local" (the default), "limited", or "disabled", depending on which features you actively use. The stored logic validity check runs during the deploy section of packager.
- Please see the notes in these pages:
- Performance recommendations:
- If you do not review or use the information in the Stored Logic Validity Check section of your deploy reports, set storedLogicValidityCheck="disabled" to avoid a possible performance slowdown from a feature you are not actively using.
- If you review and use the Stored Logic Validity Check information in your deploy reports but do not use the storedLogicValidityAction=FAIL option, we recommend setting storedLogicValidityCheck="limited".
- If you review and use the Stored Logic Validity Check information in your deploy reports and have also enabled the storedLogicValidityAction=FAIL option, we recommend setting storedLogicValidityCheck="local".
- Although storedLogicValidityCheck="global" is an available setting and the most comprehensive, if performance timing is an important consideration it may be better to use a smaller scope such as "local" or "limited". Using "limited" is faster than using "local" or "global".
- For Oracle, we recommend running Datical DB 6.12 or higher due to a performance improvement with the "limited" option for Oracle only.
- If you use the "limited", "local", or "global" setting and you package multiple scripts in the ddl folder (or with the convert packaging method) in the same packaging run, we recommend running Datical DB 6.12 or higher because the stored logic validity check is no longer repeated redundantly for each script.
- There are different ways to set the stored logic validity check level; use the method you prefer:
- In the desktop client/Eclipse GUI, set it to limited. See Stored Logic Validity Check here → Configuring Project Settings
- Set it using the hammer CLI command (example: "hammer set invalidsCheck limited"). See invalidsCheck here → CLI Commands#set
- If you use the optional project creator script, see invalidsCheck here → Creating a Datical Project Using the Project Creation Script (project_creator.groovy)
- In the datical.project file, this is represented as storedLogicValidityCheck="disabled", storedLogicValidityCheck="limited", or storedLogicValidityCheck="local".
9. Packaging ddl scripts from the sql_direct folder (packageMethod=direct) or from the ddl_direct folder (packageMethod=ddl_direct) is typically faster than from the ddl folder (packageMethod=convert). You could also opt to set packageMethod=ddl_direct for your ddl folder for better performance.
- Caveat: If you are not using SQL Parser for Oracle, then only sqlrules apply in the sql_direct and ddl_direct folders, or any folder with any name that uses packageMethod=ddl_direct or packageMethod=direct. (Other types of rules and forecast modeling do NOT apply to changes in ddl_direct/direct/sql_direct folders if you are not using SQL Parser for Oracle.)
- If you are using Oracle with a recent version of Liquibase Enterprise/Datical DB (7.6 or higher), you could consider using SQL Parser for Oracle to add forecast modeling and forecast rules.
- When you enable the SQL Parser for Oracle project setting, SQL Parser applies by default to the data_dml folder (packageMethod=data_dml), the sql_direct folder (packageMethod=direct), and the sql folder (packageMethod=sqlfile).
- You could also opt to set packageMethod=ddl_direct for your ddl folder using flexible folder configuration so that folder also uses SQL Parser. Using SQL Parser with packageMethod=ddl_direct or packageMethod=direct for ddl is faster than using packageMethod=convert for ddl (convert is the default for ddl). You can change the packageMethod for the ddl folder in the metadata.properties file for that folder. Using packageMethod=ddl_direct makes the folder be processed near the beginning of the folder order, whereas packageMethod=direct makes it be processed near the end of the folder order.
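As a sketch, the packageMethod override described above lives in the metadata.properties file of the folder itself (the key name comes from this page; any other contents of the file are not shown here):

```properties
# metadata.properties in the ddl folder
# Switch from the default convert method to ddl_direct for faster packaging.
# Note: ddl_direct also moves this folder near the beginning of the folder order.
packageMethod=ddl_direct
```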
- If your DML scripts are quite large, for performance reasons you could disable SQL Parser for any folder where large DML scripts may be packaged. You can disable SQL Parser at the folder level by setting disableSqlParser=true in the metadata.properties file for that folder. Note that disableSqlParser=true is already set by default for the DATA_DML folder in newer Datical versions 7.6 and higher.
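A minimal metadata.properties sketch for turning the parser off in a folder that packages large DML scripts (key name from this page; on 7.6 and higher this is already the default for the DATA_DML folder):

```properties
# metadata.properties in the folder containing large DML scripts
# Skip SQL Parser for this folder to avoid parsing very large scripts.
disableSqlParser=true
```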
- There were improvements to SQL Parser for Oracle in versions 7.8 and 7.12. We recommend upgrading to a recent 7.x version if you are using SQL Parser for Oracle.
- Please see these pages:
10. Having the build agent and the database in close proximity can help performance.
...
14. Consider increasing the amount of RAM used by Datical using the Xmx setting. See the instructions here: Increase the amount of RAM used by Liquibase Enterprise
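For illustration only: if your installation reads standard JVM options from an .ini file, raising the heap limit is typically a one-line -Xmx change after the -vmargs marker. The file location and a sensible value depend on your install, so follow the linked instructions rather than this sketch:

```
-vmargs
-Xmx4096m
```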
15. Upgrade to a current version of Liquibase Enterprise/Datical:
- There were performance improvements during the forecast stage for those who run deployPackager on Windows clients or Windows agents in Datical DB version 6.8 (and higher).
- There were performance improvements for those who use the Stored Logic Validity Check project setting in Datical DB version 6.12 (and higher).
- There were performance improvements for several operations (Status, statusDetails, and Pipeline Status) in Datical DB version 6.14 (and higher). Areas where you may notice performance improvements:
- Status/Pipeline Status operations in the Datical DB GUI
- 'status' & 'statusDetails' commands in the CLI
- Complex operations which run status implicitly (CLI & GUI), such as deploy, rollback, deployPackager, convert SQL, and changeLogSync
- There were performance improvements specifically for multi-database/multi-catalog configurations of SQL Server projects in Datical DB version 6.16 (and higher).
- There is a fix for the DATAPUMP API Oracle backup and restore in Datical DB version 6.16 (and higher) to better handle running multiple packager jobs concurrently.
- There is a new cleanup command for packager in Datical DB versions 7.3 (and higher). The cleanup command can be run any time you need to manually interrupt a packager build job midway. It unblocks subsequent packager jobs after a manual interruption by clearing the locks on the DATABASECHANGELOGLOCK and DATICAL_SPERRORLOG tables and also restoring REF. Please see this page for more details: How To: Use ReleaseLocks Command and Packager with Cleanup Option
- With Datical DB version 7.6 (and higher), there is a new feature that prevents continuing to run packager jobs after a backup error or restore error. Please see the "Recovering from a Backup or Restore Failure" section here for more details: https://datical-cs.atlassian.net/wiki/spaces/DDOC/pages/896570174/Managing+Database+Backup+and+Restore#ManagingDatabaseBackupandRestore-RecoveringfromaBackuporRestoreFailure
- There were improvements for SQL Parser for Oracle in versions 6.15, 7.6, and 7.8. If you are using SQL Parser for Oracle, we recommend running a recent 7.x version.
- There were improvements for memory utilization of SQL scripts that produce a high-volume output in 7.11 (and higher).
- There were improvements for SQL Parser for Oracle in version 7.12 (and higher).
- There were improvements for Limited Forecast in version 7.13 (and higher):
- Limited Forecast will only profile tables impacted by the changesets to be forecasted or deployed
- Limited Forecast will only profile the schema impacted by the changesets to be forecasted or deployed in multi-schema projects
- Significant performance improvements for Forecast profiling in version 7.14 (and higher):
- Faster forecasting of Views and Materialized Views
- Faster profiling for tables, columns, and views in multi-schema projects
- Use multiple connections (maximum of 10 connections) to profile schemas simultaneously in Oracle multi-schema projects. Note that with 7.14 (and higher) in Oracle projects with multiple schemas, you may notice higher CPU utilization due to multiple connections being used for Oracle forecast profiling.
16. Although not specifically about packager, it may also be useful to check the items on these pages, which can improve Deploy performance:
...