Modifying data mart settings
You can modify the data mart settings according to your needs.
To modify data mart settings:
- In the Manage Data Marts window, select a data mart and click Settings.
The Setting - <Data Mart Name> window opens. In the General tab, the following settings are available:
- Log level: Select the log level granularity, which can be any of the following:
- INFO (default) - Logs informational messages that highlight the progress of the ETL process at a coarse-grained level.
- VERBOSE - Logs fine-grained informational events that are most useful to debug the ETL process.
- TRACE - Logs finer-grained informational events than the VERBOSE level.
The log levels VERBOSE and TRACE impact performance. Therefore, you should only select them for troubleshooting if advised by Qlik Support.
- Load Type: Select Full rebuild to build the data mart from scratch each time, or Incremental loading (default) to only load changes.
Information note: Incremental loading is not available for Aggregated or State Oriented fact tables.
- Create tables in database - By default, data mart tables are created in the database specified in the data warehouse connection settings. Optionally, click the browse button and select a different database.
Information note: This option is only available for Microsoft SQL Server.
- Create tables in schema - By default, data mart tables are created in the schema specified in the data warehouse connection settings. Optionally, specify a different schema, either by typing the schema name or by clicking the browse button and selecting one of the existing schemas. If you specify the name of a non-existing schema, Compose will create the schema automatically.
Information note: This option is only available for Microsoft SQL Server, Amazon Redshift, Snowflake, and Microsoft Azure Synapse Analytics.
- Create views in schema - By default, data mart views are created in the schema specified in the data warehouse connection settings. Optionally, specify a different schema, either by typing the schema name or by clicking the browse button and selecting one of the existing schemas. If you specify the name of a non-existing schema, Compose will create the schema automatically, unless the database is Oracle.
Information note: If the view schema is different from the data mart schema, you need to grant the following permission:
GRANT SELECT ON DM_SCHEMA TO DM_VIEW_SCHEMA WITH GRANT OPTION
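For example, assuming a data mart schema named dm_sales and a view schema named dm_sales_views (both names are illustrative, not Compose defaults), the template above would be applied as follows:
GRANT SELECT ON dm_sales TO dm_sales_views WITH GRANT OPTION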
See also Data mart views.
- In the Advanced tab, the following settings are available:
- Sequential Processing: Select this option if you want to run all the data mart tasks sequentially, even if they can be run in parallel. This is useful for debugging or profiling.
- Maximum number of database connections: Enter the maximum number of connections allowed. The default is 10.
For more information, see Determining the required number of database connections.
- JVM memory settings: Edit the memory allocated to the Java Virtual Machine (JVM) if you experience performance issues. Xms is the minimum memory and Xmx is the maximum memory. The JVM starts with the Xms value and can use memory up to the Xmx value.
Information note: Only the following characters are supported (shown as a regular expression):
/^[-a-zA-Z0-9:]*$/
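For example, assuming the fields accept standard JVM heap notation, -Xms512m would set a 512 MB minimum and -Xmx2048m a 2 GB maximum. These values are illustrative only; suitable sizes depend on your environment.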
- Position in Default Workflow: Set the position in which you want the data mart to appear in the default workflow. For more information on workflows, see Workflows.
- Optimize for initial load: This option is not applicable to the State Oriented and Aggregated fact types. If the Incremental loading option is enabled (the default), clear the Optimize for initial load option after the initial load task completes and regenerate the data mart task. If the Full rebuild option is enabled, selecting Optimize for initial load may accelerate the loading process.
- Write task statement duration to the TLOG_PROCLOG table in the data warehouse: This option is useful for troubleshooting performance issues with task statements as it records the duration of each task statement in a special table (named TLOG_PROCLOG) in the data warehouse. You can then use this information to locate task statements with abnormal duration times and modify them accordingly.
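As an illustration only, a query along the following lines could help surface the slowest statements. The column names used here (statement and duration) are hypothetical and may differ from the actual structure of the TLOG_PROCLOG table in your data warehouse:
SELECT statement, duration FROM TLOG_PROCLOG ORDER BY duration DESC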
- Do not drop temporary tables: Select this option if you want to keep the temporary tables created during the task. Use this option only for debugging.
- Enable table logging: This option is available for Oracle only. When enabled, the data mart tables will be created with the Oracle LOGGING option enabled. Leaving this option unselected (the default) should improve performance, but in some cases DML operations will not be recorded in the redo log file. For more information on this option, refer to the Oracle online documentation.
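For reference, the snippet below illustrates the standard Oracle LOGGING/NOLOGGING table attribute that this setting presumably toggles; the table name my_table is purely illustrative and is not an object that Compose creates:
CREATE TABLE my_table (id NUMBER) NOLOGGING;  -- in some cases, DML is not recorded in the redo log (matches the default behavior described above)
ALTER TABLE my_table LOGGING;                 -- redo logging enabled, as when Enable table logging is selected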
- Click OK.