Exporting and importing data pipelines

You can export a data pipeline project to a JSON file that contains everything required to reconstruct the data project. The exported JSON file can be imported to the same tenant, or to another tenant. You can use this, for example, to move data projects from one tenant to another, or to make backup copies of data projects. You can also update a data project from a JSON export file.

Exporting a data project

  • In Data Integration home, click on the data project to export, and select Export.

The data project is exported to a JSON file with a file name that consists of the data project name, the data platform, and a timestamp.

Warning note: Do not edit the exported JSON file. Doing so can result in a data project file that cannot be imported.
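Because an edited export may no longer be importable, it can be worth confirming that a file you are about to import still parses as JSON before uploading it. Below is a minimal sketch in Python; the file name is made up for illustration only, and a successful parse does not guarantee the project will import, it only catches files corrupted by editing.

```python
import json
import sys
from pathlib import Path

# Hypothetical export file name following the "project name, data platform,
# timestamp" pattern described above; replace with your actual file.
export_file = Path("SalesPipeline_Snowflake_20240115T103000.json")

try:
    with export_file.open(encoding="utf-8") as f:
        json.load(f)  # parse only; the project content is treated as opaque
except (OSError, json.JSONDecodeError) as err:
    sys.exit(f"{export_file} is not readable, well-formed JSON: {err}")

print(f"{export_file} parses as JSON.")
```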

Importing a cloud data warehouse data project

Information note: This section covers importing a data project with a cloud data warehouse as data platform. For information about importing a data project with Qlik Cloud (via Amazon S3) as data platform, see Importing a data project with Qlik Cloud as data platform.

You can import an exported cloud data warehouse data project to the same tenant it was exported from, or to another tenant. If you import the data project to a different tenant than the one it was exported from, you need to define new data connections for the data project, the staging area, and all data sources.

You can change which data platform to use, but it is not possible to change data platform from a cloud data warehouse to Qlik Cloud.

  1. Click Add new in Data Integration home, and select Import data project.

  2. Add the data project JSON file. You can either drop it on the dialog, or browse to select the file.

  3. Name

    Change the name of the data project. The default name is the original data project name prefixed with Imported_.

  4. Space
    Select which space to add the data project to.

  5. Description
    Add or edit the description of the data project.

  6. Data platform

    You can change the data platform of the data project.

  7. Data connection

    You can change the data connection to the data platform.

    This is required if you imported a data project from another tenant, or if you changed the data platform in the previous step.

  8. Connection to staging area

    You can change the connection to the staging area.

    This is required if you imported a data project from another tenant, or in some cases if you changed the data platform in the previous step.

    Information note: This is not required if the data platform is Snowflake.
  9. Replace imported source connections
    You can replace the imported source connections.

    This is required if you imported a data project from another tenant.

  10. Data schemas prefix

    You can add a prefix to the data schemas that are created in the data project. This is useful when the imported data project uses the same cloud data warehouse as the exported data project, as the prefix avoids name conflicts between the two sets of schemas.

  11. Replace imported source databases and schemas

    You can replace the source schema for landing tasks, and the source database and schema for registered data.

    Select a task and replace the values in New schema and New database.

  12. Default database names

    If the data platform is Snowflake or Microsoft Azure Synapse Analytics, you can change default database names.

  13. Default warehouse names

    If the data platform is Snowflake, you can change default warehouse names.

  14. When you are ready, click Upload.

The data project is added to Data Integration home.

Importing a data project with Qlik Cloud as data platform

You can import an exported Qlik Cloud (via Amazon S3) data project to the same tenant it was exported from, or to another tenant. If you import the data project to a different tenant than the one it was exported from, you need to define new data connections for the data project, the staging area, and all data sources.

It is not possible to change data platform from Qlik Cloud to a cloud data warehouse, such as Snowflake.

  1. Click Add new in Data Integration home, and select Import data project.

  2. Add the data project JSON file. You can either drop it on the dialog, or browse to select the file.

  3. Name

    Change the name of the data project. The default name is the original data project name prefixed with Imported_.

  4. Space
    Select which space to add the data project to.

  5. Description
    Add or edit the description of the data project.

  6. Store QVD files in:

    Select where to generate QVD files.

    • Qlik managed storage

    • Customer managed storage

      Amazon S3 storage managed by you.

  7. Data connection

    If you selected Customer managed storage, you can change the data connection to the Amazon S3 storage area.

    This is required if you imported a data project from another tenant.

  8. Connection to staging area

    You can change the connection to the Amazon S3 staging area.

    This is required if you imported a data project from another tenant.

  9. Replace imported source connections
    You can replace the imported source connections.

    This is required if you imported a data project from another tenant.

  10. When you are ready, click Upload.

The data project is added to Data Integration home.

Updating a data project

You can update a data project from a JSON export file. This will replace all tasks in the data pipeline, but connections and settings will not be replaced. Data tasks that are not included in the imported data project will be removed.

For example, you can import a data project exported from the development data space into a data project in the production data space to update the production data project.

Before you start updating the data project:

  • If you want a backup of the data project before you update, export it by clicking the project menu, and then Export. A small scripted example of filing the exported JSON away follows this list.

  • You must stop all tasks that will be removed from the data pipeline before you update the data project.

  • If the data project uses SaaS application connections that do not exist yet, you must create the connections and generate metadata before you start importing.

  • Make sure that the imported data project uses the same cloud data platform as the data project you are updating, for example, Snowflake.
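If you keep the backup export mentioned in the first bullet above, a small script can copy it untouched into a dated folder. Below is a minimal sketch in Python; the folder layout and file name are assumptions, and the file is copied as-is so the export itself is never edited.

```python
import shutil
from datetime import datetime
from pathlib import Path

# All paths and file names below are assumptions for illustration only.
export_file = Path("SalesPipeline_Snowflake_20240115T103000.json")
backup_dir = Path("data-project-backups") / datetime.now().strftime("%Y-%m-%d")
backup_dir.mkdir(parents=True, exist_ok=True)

# Copy the export byte for byte so the backup is never edited
# (an edited export may no longer be importable).
backup_copy = backup_dir / export_file.name
shutil.copy2(export_file, backup_copy)
print(f"Backed up {export_file} to {backup_copy}")
```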

To update a data project:

  1. Open the data project that you want to update.

  2. Click the project menu, and then click Import.

  3. Select or drop the JSON file that you want to import.

  4. Make any required changes for mapping connections that are different between the data project and the imported data project.

    For example, the imported data project could be using a source connection named SQL1, while this data project uses a similar connection named SQL2. In this case, map the imported connection to SQL2 in Replace imported source connections.

    Information note: When selecting a connection to map, you can create a new database connection, but not a SaaS application connection.

    Click Import when you are ready.

The data project is now updated according to the imported JSON file. You may need to validate and sync data tasks that have been updated through the import.
