Exporting and importing data pipelines
You can export a data pipeline project to a file that contains everything required to reconstruct the project. The export file can be imported to the same tenant or to another tenant, for example to move projects between tenants or to keep backup copies. You can also update a data project from an export file.
The export file format has changed from a single JSON file to a ZIP file containing several JSON files. The old JSON format will be supported for import until January 30, 2026. To convert an old JSON export to the new format, import it and then export it again; the new export is created as a ZIP file. Projects in the old JSON format cannot be imported via the API.
It is recommended to re-export your existing projects in the new format before January 30, 2026.
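If you are not sure whether an existing export file is in the old or the new format, you can check it before importing. The following Python sketch is illustrative only: it assumes nothing beyond a new-format export being a ZIP archive of JSON files and an old-format export being a single JSON file, and it simply lists whatever entries it finds (the internal file names are not documented here).

```python
import json
import sys
import zipfile
from pathlib import Path

def describe_export(path: str) -> None:
    """Report whether an export file looks like the old (single JSON)
    or the new (ZIP of JSON files) format, and list its contents."""
    export = Path(path)
    if zipfile.is_zipfile(export):
        with zipfile.ZipFile(export) as archive:
            entries = archive.namelist()
        print(f"{export.name}: new ZIP format with {len(entries)} entries")
        for name in entries:
            print(f"  - {name}")
    else:
        # Old format: a single JSON document. Parse it only to confirm
        # that it is valid JSON; the structure itself is not inspected.
        try:
            json.loads(export.read_text(encoding="utf-8"))
            print(f"{export.name}: old single-JSON format "
                  "(supported for import until January 30, 2026)")
        except (ValueError, UnicodeDecodeError):
            print(f"{export.name}: not a recognized export format")

if __name__ == "__main__":
    describe_export(sys.argv[1])
```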
Exporting a data project
To export a project, do one of the following:
- In Data Integration > Projects, click the menu button on the project that you want to export, and select Export.
- Open the project, click the menu button in the top right, and select Export.
The project is exported to a ZIP file with a file name that consists of the project name, the data platform, and a timestamp.
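Because the file name includes a timestamp, repeated exports never overwrite each other, so a folder of backup copies can grow over time. The following sketch is a hypothetical housekeeping example, assuming only that your export ZIP files are collected in one folder; it lists the files that fall outside the most recent ones (by modification time) so you can review them before deleting anything.

```python
from pathlib import Path

def list_stale_exports(folder: str, keep: int = 5) -> list[Path]:
    """Return export ZIP files beyond the 'keep' most recent ones,
    ordered oldest first, so they can be reviewed before deletion."""
    exports = sorted(
        Path(folder).glob("*.zip"),
        key=lambda p: p.stat().st_mtime,
        reverse=True,
    )
    return list(reversed(exports[keep:]))

if __name__ == "__main__":
    # "project-exports" is a placeholder folder name for this example.
    for stale in list_stale_exports("project-exports", keep=5):
        print(f"candidate for cleanup: {stale.name}")
```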
Importing projects
You can import a cloud data warehouse project or a Qlik Cloud (via Amazon S3) project.
You can change which data platform to use with the following limitations:
- It is not possible to change the data platform from a cloud data warehouse to Qlik Cloud, or the other way around.
- It is not possible to change a Snowflake project that uses landing to cloud file storage to another data platform, or the other way around.
Before importing a project
Before you start importing a project, consider the following:
- Create all new connections that you will need if you are importing to a new tenant or space.
- If the project uses SaaS application connections that do not exist yet, you must create the connections and generate metadata before you start importing.
- If you are importing a cross-project pipeline, best practice is to import the upstream projects first (see the sketch after this list).
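For cross-project pipelines, a simple way to get the import order right is to list which upstream projects each project consumes and sort them so that every upstream project comes before the projects that depend on it. The sketch below is purely illustrative and does not call any Qlik API; the project names in the dependency map are hypothetical placeholders.

```python
from graphlib import TopologicalSorter

# Hypothetical example: each project maps to the upstream projects it
# consumes tasks from. Replace with your own project names.
dependencies = {
    "sales_marts": {"sales_landing"},
    "finance_marts": {"finance_landing", "sales_landing"},
    "sales_landing": set(),
    "finance_landing": set(),
}

# static_order() yields upstream projects before their consumers,
# which is the recommended import order.
import_order = list(TopologicalSorter(dependencies).static_order())
print("Import projects in this order:", " -> ".join(import_order))
```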
Importing a cloud data warehouse project
You can import an exported cloud data warehouse project to the same tenant it was exported from, or to another tenant. When you import the project into a different tenant than the one it was exported from, you must define new connections for the project, the staging area, and all data sources.
If the project consumes tasks from other projects, you must map projects and tasks unless the space and project names are identical.
- In Data Integration > Projects, click Create new and select Import project.
- Add the export file. You can either drop it on the dialog, or browse to select the file.
Click Next.
- Set Project properties for the new project.
You must select which space to add the project to in Space.
In Data platform you can change the data platform of the project, and the connection to the data platform.
Changing Connection is required if you imported a project from another tenant, or if you changed the data platform in the previous step.
You can change the connection to the staging area. This is required if you imported a project from another tenant, or in some cases if you changed the data platform in the previous step.
Click Next.
- Set Default project settings for the new project.
In Prefix for all schemas, you can add a prefix to the data schemas that are created in the project. This is useful when the imported project is in the same cloud data warehouse as the exported project.
You can also set a default name in Database name. For Snowflake projects you can set a default Data warehouse name, and for Databricks projects you can set a default Catalog name. You can use the project default name for all task types, or set either the default or a custom name for each task type.
Click Next.
- Set Connections and task settings.
You can replace the imported source connections or cross-project sources. This is required if you imported a project from another tenant.
In Optional task settings you can also change task settings that were overridden in the original project.
- When you are ready, click Import.
The project is added to Data Integration home.
Importing a project with Qlik Cloud as data platform
You can import an exported Qlik Cloud (via Amazon S3) project to the same tenant it was exported from, or to another tenant. When you import the project into a different tenant than the one it was exported from, you must define new connections for the project, the staging area, and all data sources.
It is not possible to change data platform from Qlik Cloud to a cloud data warehouse, such as Snowflake.
- In Data Integration > Projects, click Create new and select Import project.
- Add the export file. You can either drop it on the dialog, or browse to select the file.
Click Next.
- Set Project properties for the new project.
  - Name: Change the name of the project. The default name is the original project name prefixed with Imported_.
  - Space: Select which space to add the project to.
  - Description: Add or edit the description of the project.
  - Store QVD files in: Select where to generate QVD files.
    - Qlik managed storage
    - Customer managed storage: Amazon S3 storage managed by you.
  - Data connection: If you selected Customer managed storage, you can change the connection to the Amazon S3 storage area. This is required if you imported a project from another tenant.
  - Connection to staging area: You can change the connection to the Amazon S3 staging area. This is required if you imported a project from another tenant, or in some cases if you changed the data platform in the previous step.
Click Next.
- Set connections for the tasks added in the original version.
You can replace the imported source connections. This is required if you imported a project from another tenant.
- In Optional task settings you can change task settings that were overridden in the original project.
- When you are ready, click Import.
The project is added to Data Integration home.
Updating a project
You can update a project from an export file. This will replace all tasks in the data pipeline, but connections and settings will not be replaced. Data tasks that are not included in the imported project will be removed.
For example, you can import a project exported from the development data space into a project in the production data space to update the production project.
Before you start updating the project:
- If you want a backup of the project before you update it, export it by clicking the menu button and then Export.
- You must stop all tasks that will be removed from the data pipeline before you update the project.
- If the project uses SaaS application connections that do not exist yet, you must create the connections and generate metadata before you start importing.
- Make sure that the imported project uses the same cloud data platform, for example, Snowflake (see the sketch below for a quick way to check).
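If you are not sure which data platform an export file was created for, you can scan the JSON files inside the ZIP for the platform name before you start the update. This is a rough sketch based on the assumption that the platform name (for example snowflake or databricks) appears somewhere in the exported JSON; the internal structure of the export is not documented here, so treat the result as a hint rather than a guarantee.

```python
import sys
import zipfile

# Hypothetical platform keywords to look for inside the exported JSON files.
PLATFORMS = ("snowflake", "databricks", "bigquery", "redshift", "fabric")

def guess_platforms(export_path: str) -> set[str]:
    """Return the platform keywords found in the JSON entries of an export ZIP."""
    found = set()
    with zipfile.ZipFile(export_path) as archive:
        for name in archive.namelist():
            if not name.endswith(".json"):
                continue
            text = archive.read(name).decode("utf-8", errors="ignore").lower()
            found.update(p for p in PLATFORMS if p in text)
    return found

if __name__ == "__main__":
    print("Platform hints found:", guess_platforms(sys.argv[1]) or "none")
```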
To update a project:
- Open the project that you want to update.
- Click the menu button, and then click Import.
- Select or drop the file that you want to import.
- Make any required changes for mapping connections that are different between the project and the imported project.
For example, the imported project could be using a source connection named SQL1, while this project uses a similar connection named SQL2. In this case, map the imported connection to SQL2 in Set connections for the tasks added in the original version.
Information note: When selecting a connection to map, you can create a new database connection, but not a SaaS application connection.
- Click Import when you are ready.
The project is now updated according to the imported file. You may need to validate and sync data tasks that have been updated through the import.
Limitations
- Notifications are not included when you export a pipeline. You need to set up new notifications in the imported pipeline.