
What's new?

This section describes the new and enhanced features in Compose November 2023 and Compose November 2023 SR1.

Information note: In addition to these release notes, customers who are not upgrading from the latest GA version are advised to review the release notes for all versions released since their current version.

Customers should also review the Replicate release notes in Qlik Community for information about the following:

  • Migration and upgrade
  • End of life/support features
  • Newly supported versions and third-party software
  • Resolved issues
  • Known issues

Features and enhancements introduced in Compose November 2023 Service Release 1

Support for bulk generation of data warehouse and workflow tasks

In previous versions, there was no way to generate multiple data warehouse tasks or workflow tasks in a single operation. Now, you can select the tasks you want to generate in the new Bulk Generate dialog, as well as the validation level (basic or all) for all selected tasks.

Generating data warehouse tasks

Generating workflow tasks

Support for basic validations when generating data mart tasks

In the past, data mart tasks were generated with All Validations by default, and there was no option to choose Basic Validations. Because All Validations access the database to verify the existence of columns used in expressions and lookups, they could take a long time to complete and are often not required. From this version, the default task generation level is Basic Validations, with an option to choose All Validations.

Adding data marts and star schemas

Support for editing data warehouse tasks

From this version, it is possible to edit the task type (Full Load or Change Processing) as well as other task properties.

Adding, editing, and duplicating tasks

Other enhancements

Compose CLI data mart processing enhancement

Support for the --timeout -1 parameter was added to the Compose CLI mark_reload_datamart_on_next_run command. This parameter overrides the server call's default timeout in seconds and can be used to prevent timeouts when processing very large data marts.
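A minimal sketch of such a call is shown below. The executable name and the project/data mart parameters are assumptions for illustration only; only the mark_reload_datamart_on_next_run command and the --timeout -1 parameter are taken from this release note, so check the Compose CLI help for the exact syntax in your environment.

    rem Illustrative sketch only: the executable name and the --project/--datamart parameters are assumed.
    ComposeCli.exe mark_reload_datamart_on_next_run --project MyProject --datamart MyDataMart --timeout -1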

UI enhancement

You can now sort columns in the Manage Data Storage Tasks and Monitor Details windows.

Snowflake enhancements

  • To align with the updated behavior of auto-increment columns in Snowflake on AWS, newly added auto-increment columns will use the new ORDERED modifier, as needed.
  • It is now possible to limit the number of data warehouse task runs checked by the data mart task. To do this, set the following Compose environment variable:

    qlk__MissingSatIDsLatestRuns

    This might improve performance in certain scenarios.
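A minimal sketch of setting this variable on the Compose server, assuming a system-level (machine) variable is what Compose reads and using an illustrative value of 10 runs:

    rem Run from an elevated command prompt; the value 10 is an example only.
    setx qlk__MissingSatIDsLatestRuns 10 /M
    rem Environment variable changes typically take effect only after restarting the Qlik Compose service.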

Using inner joins with Transactional data marts

When working with Transactional data marts, it is now possible to use inner joins for dimensions instead of sub-queries.

To turn on this feature, set the following environment variables to "true":

qlk__PersistDenormForFctT

qlk__PersistPreselForFctT
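For example, a sketch of setting both variables on the Compose server (the use of setx and a machine-level variable is an assumption):

    rem Run from an elevated command prompt.
    setx qlk__PersistDenormForFctT true /M
    setx qlk__PersistPreselForFctT true /M
    rem Restart the Qlik Compose service for the change to take effect.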

Features and enhancements introduced in Compose November 2023 Initial Release

Support for choosing the task mode for new data warehouse tasks

In previous versions, users needed to duplicate the Full Load task in order to configure a Change Processing task. From this version, you can now choose whether to run the data warehouse task in Full Load mode or in Change Processing mode.

See: Adding, editing, and duplicating tasks

Support for using a separate schema for data mart tables in Amazon Redshift

The option to configure a separate schema for data mart tables has been extended to support Amazon Redshift.

Azure Synapse Analytics enhancements

Warning note: Some of the enhancements described below require setting a Windows environment variable. If you set or unset an environment variable, the change will only take effect after you restart the Qlik Compose service.
  • HEAP staging tables support: Two environment variables have been added: "qlk__FullLoadStagingTablesAsHeap" and "qlk__CDC_StagingTablesAsHeap". Set these variables to 'true' or '1' to create the staging tables as HEAP tables for Full Load or CDC tasks, respectively (see the example after this list).
  • Added the ability to set the statistics threshold for data mart ETL: There are now two statistics thresholds for Synapse, which can be set using the following system environment variables:

    1. For the data warehouse ETL, use "qlk__UpdateStatisticsPercentageDwh"

      This is used for updating the statistics of the Hub and satellite tables.

    2. For the data mart ETL, use "qlk__UpdateStatisticsPercentageDma"

      This is used for updating the statistics for the fact and dimension tables.

    Notes:

    • Values should be between 0 and 100. A value less than 0 will be converted to 0; in this case, the command to update the statistics will be skipped.
    • A value exceeding 100 will be converted to 100.
    • If a value cannot be interpreted as an integer, the default value (20) will be used.
    • If this variable is not present, then the default value (20) will be used.
    • The "UpdateStatisticsPercentage" system environment variable is no longer supported.
  • The JDBC and ODBC additional properties will no longer be overridden: On the first deployment, Compose copies all the connection parameters including JDBC and ODBC additional properties. On subsequent deployments, the parameters will not be overridden in the target environment.
  • Improved performance: Revised ELT statements to reduce the number of statements and improve performance when running against Synapse, including:

    • Skipping statements when not needed (based on run-time metadata)
    • Combining multiple statements into a single one
    • Managing staging tables (create/insert/index) based on runtime metadata
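As a sketch, enabling HEAP staging tables and setting the statistics thresholds described above might look as follows on the Compose server. Enabling HEAP for both task types and the 20 percent threshold are illustrative choices, not recommendations:

    rem Run from an elevated command prompt; the values shown are examples only.
    setx qlk__FullLoadStagingTablesAsHeap true /M
    setx qlk__CDC_StagingTablesAsHeap true /M
    setx qlk__UpdateStatisticsPercentageDwh 20 /M
    setx qlk__UpdateStatisticsPercentageDma 20 /M
    rem Restart the Qlik Compose service for the changes to take effect.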

Snowflake enhancements

Warning note: Some of the enhancements described below require setting a Windows environment variable. If you set or unset an environment variable, the change will only take effect after you restart the Qlik Compose service.
  • Data mart performance improvement: Each SELECT is replaced by SELECT DISTINCT to improve Snowflake's performance with data mart tasks.

    Information note: In some environments, using the 'DISTINCT' keyword for Snowflake might cause performance degradation. If this is the case, you can suppress the 'DISTINCT' keyword by setting the environment variable "qlk__DisableCteDistinct" to either '1' or 'true' (see the example after this list).
  • Reduced Snowflake storage costs by adding support for Transient Tables: In previous versions, Compose would create TSTG and TTMP objects in Snowflake during ELT processes, which would increase customers' data storage costs. From this version, Compose will create Snowflake Transient Tables for temporary data storage during ETL processes, thereby significantly reducing costs.

  • Key pair authentication: Snowflake key pair authentication is now supported.

    Information note: Key pair authentication is supported in both standard and advanced mode, and with both JDBC and ODBC.

    See: Defining the connection parameters
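If you do need to suppress the DISTINCT keyword as described above, a sketch of the change (assuming a system-level variable) is:

    rem Run from an elevated command prompt.
    setx qlk__DisableCteDistinct true /M
    rem Restart the Qlik Compose service for the change to take effect.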

Other enhancements to Data Warehouse projects

Warning note: Some of the enhancements described below require setting a Windows environment variable. If you set or unset an environment variable, the change will only take effect after you restart the Qlik Compose service.
  • Data mart obsolete indication: Optimized implementation of the data mart obsolete indication.
  • Transactional data mart performance: Performance improvements were made to transactional data marts.
  • Optimized the method for updating Type 2 dimensions: Before generating the ETL for this, you first need to set the system environment variable 'qlk__NewPreselectDim' to either '1' or 'true' (see the example after this list).

  • Expressions: Added the option to evaluate NULL when testing an expression.

  • Migration performance: Improved performance with Qlik Compose migration operations.

  • Data mart export/import: Exporting and importing data marts now includes the "Table Creation Modifiers" column. This will enable you to customize the fact or dimension table creation modifiers.

    Notes:

    • If the column value is empty, the project default will be used.
    • The project default value is not included in the export/import.
  • Optimization of dropping and creating tables in an empty schema: From this version, when a schema does not exist, Compose will try to create it (and return an error if it fails). Additionally, if the new schema is empty, Compose will not try to drop tables from the previous schema.
  • Mappings for target columns not mapped to source: A new option, When a data warehouse column is unassigned, has been added to the Task Settings. The new option enables you to set unassigned columns to NULL or to use the previous column value.

    Information note: This setting will be ignored if "backdating" is used.

    See: Modifying task settings

  • CLI task generation: Added the ability to generate tasks using the Compose CLI at project, task, data warehouse, and data mart level.

    See: Generating tasks using the CLI

  • Logging: The logging (for the data warehouse and data marts) can now be controlled by the following system environment variable:

    qlk__LoggingType

    The following options are possible:

    • None - No logging at all
    • Deferred - All logging information is stored in runtime variables and then written in a single statement at the end of the task

    When this variable is not set, or when it has any other value, logging behaves as before. See the example after this list.
  • Compose CLI in Data Warehouse projects: Added the ability to update custom ETLs in Data Warehouse projects using the Compose CLI. This functionality can be incorporated into a script to easily update Custom ETLs.

    See: Creating and managing custom ETLs
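As a sketch, enabling the optimized Type 2 dimension update and deferred logging described in this list might look as follows; whether to enable either of them is a per-environment decision, and the values shown are examples only:

    rem Run from an elevated command prompt; values are examples only.
    setx qlk__NewPreselectDim true /M
    setx qlk__LoggingType Deferred /M
    rem Restart the Qlik Compose service for the changes to take effect.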

Enhancements to Data Lake projects

  • Apache Impala views in Data Lake projects: The header__batch_modified column will now be cast as varchar(32) for the outbound Apache Impala views. To leverage this enhancement, you need to set an environment variable.

  • Databricks: Added support for Unity Catalog.
