Migrating tasks
You can migrate data warehouse tasks, data mart tasks, and custom ETLs (tasks) from one environment to another, while preserving custom objects in the target environment. This is especially useful for customers who want to incrementally update production environments with new versions from the test environment.
The following files will be exported:
For the data warehouse:
- <specified export folder>/customEtl.csv - Contains details of any (enabled or disabled) custom ETLs defined for the task.
- <specified export folder>/taskCustomEtl.csv - Lists any enabled custom ETLs defined for the data warehouse task.
- <specified export folder>/taskSettings.csv - Contains details of the task settings defined for each of the data warehouse tasks.
- <specified export folder>/tasks.csv - Lists the data warehouse tasks.
- <specified export folder>/taskDataWarehouseTables.csv - Lists the data warehouse tables and properties.
- <specified export folder>/taskMappings.csv - Lists the mappings used in the task.
- <specified export folder>/SQL/DW_Custom_ETL_<custom ETL name>.SQL - One SQL file for each custom ETL.
For each data mart:
- <specified export folder>/<data mart name>/customEtl.csv - Contains details of any custom pre-loading or post-loading ETLs defined for the data mart task.
- <specified export folder>/<data mart name>/taskSettings.csv - Contains details of the task settings.
- <specified export folder>/<data mart name>/SQL/DM_Custom_ETL_<custom ETL name>.SQL - One SQL file for each custom ETL.
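Putting the lists above together, a completed export folder looks roughly like this (an illustrative sketch; the angle-bracketed names are placeholders filled in at export time):

```
<specified export folder>/
├── tasks.csv
├── taskSettings.csv
├── taskDataWarehouseTables.csv
├── taskMappings.csv
├── customEtl.csv
├── taskCustomEtl.csv
├── SQL/
│   └── DW_Custom_ETL_<custom ETL name>.SQL
└── <data mart name>/
    ├── customEtl.csv
    ├── taskSettings.csv
    └── SQL/
        └── DM_Custom_ETL_<custom ETL name>.SQL
```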
Considerations
Export considerations
- Parameters will be written to CSV files in alphabetical order (as they appear in the web console).
Import considerations
- Importing tasks or custom ETLs will overwrite any existing objects with the same names.
- Importing logical entities or mappings that do not exist in the target project will result in failure.
Valid CSV file formats
The CSV files must be in a valid format, as described in the sections below.
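Before importing, you can pre-check a file against the Mandatory/Reject rules described in the following sections. The sketch below is illustrative only (the function name and return strings are invented; Compose performs its own validation): it flags a missing required column or an empty required value.

```python
import csv
import io

def check_required_columns(csv_text: str, required: list) -> str:
    """Reject-style pre-check: every required header must be present,
    and no row may leave a required value empty.
    (Illustrative helper only; not part of Compose.)"""
    reader = csv.DictReader(io.StringIO(csv_text))
    headers = reader.fieldnames or []
    missing = [c for c in required if c not in headers]
    if missing:
        return "Reject: missing column(s) %s" % ", ".join(missing)
    for line_no, row in enumerate(reader, start=2):  # header is line 1
        empty = [c for c in required if not (row.get(c) or "").strip()]
        if empty:
            return "Reject: empty value(s) in %s on line %d" % (
                ", ".join(empty), line_no)
    return "OK"
```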
Data warehouse CSV file formats
Tasks file
The tasks.csv file consists of one row per task.
Header name | Mandatory | If column is missing | If column exists but value is empty | Comments |
---|---|---|---|---|
Task name | Yes | Reject | Reject | Case insensitive |
Description | No | Empty | Empty | - |
Type | No | Full Load Only | Reject | Full Load Only or Change Tables Only |
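Given these columns, a minimal tasks.csv might look like the following (the task names and description are invented for illustration):

```csv
Task name,Description,Type
Load_Orders,Loads the orders subject area,Full Load Only
CDC_Orders,,Change Tables Only
```

Note that Description may be left empty, but a Type value must be supplied whenever the Type column is present.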
Task settings file
The taskSettings.csv file consists of one row per setting.
Header name | Mandatory | If column is missing | If column exists but value is empty | Comments |
---|---|---|---|---|
Task Name | Yes | Reject | Reject | Case insensitive |
Setting Name | Yes | Reject | Reject | Case insensitive |
Setting Value | Yes | Reject | Reject | - |
Task entities file
The taskDataWarehouseTables.csv file consists of one row per entity.
Header name | Mandatory | If column is missing | If column exists but value is empty | Comments |
---|---|---|---|---|
Task Name | Yes | Reject | Reject | Case insensitive |
Entity Name | Yes | Reject | Reject | Case insensitive |
Handle duplicates | Yes | Reject | Reject | Boolean |
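An illustrative taskDataWarehouseTables.csv (the task and entity names are invented, and True/False are assumed here as the Boolean literals):

```csv
Task Name,Entity Name,Handle duplicates
Load_Orders,Orders,True
Load_Orders,Customers,False
```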
Task mappings file
The taskMappings.csv file consists of one row for each mapping used in the task.
Header name | Mandatory | If column is missing | If column exists but value is empty | Comments |
---|---|---|---|---|
Task Name | Yes | Reject | Reject | Case insensitive |
Mapping Name | Yes | Reject | Reject | Case insensitive |
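For example (invented names), a taskMappings.csv pairing one task with two mappings:

```csv
Task Name,Mapping Name
Load_Orders,Map_Orders
Load_Orders,Map_Customers
```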
Custom ETL file
The customEtl.csv file consists of one row for each custom ETL defined for the task (Pre-Loading, Multi-Table, Single Table, or Post-Loading), regardless of whether the ETL is enabled.
Header name | Mandatory | If column is missing | If column exists but value is empty | Comments |
---|---|---|---|---|
Name | Yes | Reject | Reject | Case insensitive |
Description | No | Empty | Empty | - |
Type | Yes | Reject | Reject | Pre Loading ETL, Multi Table ETL, Single Table ETL or Post Loading ETL |
Entity | Yes | Reject | Reject | Relevant only for single table ETL |
Sequence Number | Yes | Reject | Reject | Valid values are positive integer numbers |
Execute as Stored Procedure | Yes | Reject | Reject | Boolean |
- For each custom ETL, Compose will export/import an SQL file to <specified export folder>/SQL. The file name will be DW_Custom_ETL_<custom ETL name>.SQL.
- If you wish to edit the file name, make sure that it contains only the following characters: A-Z, 0-9, underscore (_), or space. On import, any other character will be replaced with an underscore.
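The character rule above can be expressed as a one-line substitution (a sketch; the function name is hypothetical, and lowercase letters are assumed to be allowed alongside A-Z):

```python
import re

def sanitize_etl_filename(name: str) -> str:
    # Replace anything other than letters, digits, underscore, or space
    # with an underscore, mirroring the documented import behavior.
    return re.sub(r"[^A-Za-z0-9_ ]", "_", name)
```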
Task custom ETL file
The taskCustomEtl.csv file consists of one row for each enabled custom ETL used in the data warehouse task.
Header name | Mandatory | If column is missing | If column exists but value is empty | Comments |
---|---|---|---|---|
DWH Task Name | Yes | Reject | Reject | Case insensitive |
Custom ETL Name | Yes | Reject | Reject | Case insensitive |
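An illustrative taskCustomEtl.csv linking an invented task to an invented custom ETL:

```csv
DWH Task Name,Custom ETL Name
Load_Orders,Rebuild_Aggregates
```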
Data mart CSV file formats
Task settings file
The taskSettings.csv file consists of one row per setting.
Header name | Mandatory | If column is missing | If column exists but value is empty | Comments |
---|---|---|---|---|
Setting Name | Yes | Reject | Reject | Case insensitive |
Setting Value | Yes | Reject | Reject | - |
Custom ETL file
The customEtl.csv file consists of one row per custom ETL (Pre Loading ETL or Post Loading ETL).
Header name | Mandatory | If column is missing | If column exists but value is empty | Comments |
---|---|---|---|---|
Name | Yes | Reject | Reject | Case insensitive |
Description | No | Empty | Empty | - |
Type | Yes | Reject | Reject | Pre Loading ETL or Post Loading ETL |
Active | Yes | False | False | Boolean |
Sequence Number | Yes | Reject | Reject | Valid values are positive integer numbers |
Execute as Stored Procedure | Yes | Reject | Reject | Boolean |
- For each custom ETL, Compose will export/import an SQL file to <specified export folder>/<data mart name>/SQL. The file name will be DM_Custom_ETL_<custom ETL name>.SQL.
- If you wish to edit the file name, make sure that it contains only the following characters: A-Z, 0-9, underscore (_), or space. On import, any other character will be replaced with an underscore.