
Controlling data warehouse ETL tasks

Once the data warehouse tables have been created and the ETL Set has been generated, you can proceed to run the data warehouse ETL task. The data warehouse ETL task extracts data from the staging tables and loads it into the data warehouse tables.
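The extract-and-load step described above can be pictured as copying keyed rows from a staging area into a warehouse table. This is only an illustrative sketch; the table layout, the `id` key, and the `run_etl` function are hypothetical, not part of the product:

```python
def run_etl(staging_rows, warehouse):
    """Copy rows from a staging table into a warehouse table.

    `staging_rows` is a list of dicts and `warehouse` is a dict keyed
    by primary key; both are hypothetical stand-ins for real tables.
    """
    for row in staging_rows:
        warehouse[row["id"]] = row  # insert new rows, update existing ones
    return warehouse

staging = [{"id": 1, "city": "Boston"}, {"id": 2, "city": "Chicago"}]
warehouse = run_etl(staging, {})
```

Running the same sketch again with updated staging rows overwrites the matching warehouse rows by key, which mirrors the insert/update counts reported in the task details.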


Ingesting a historical record deletes any history that is later than the ingested record. For example, if a data warehouse contains the following historical records:

2012 - Boston

2014 - Chicago

2015 - New Jersey

Ingesting the record 2013 - New York will delete the 2014 and 2015 records.
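The deletion rule above can be sketched as follows. This is a minimal illustration of the documented behavior, not the product's implementation; the `ingest_record` function and the `(date, value)` representation are assumptions:

```python
from datetime import date

def ingest_record(history, new_record):
    """Ingest a historical record, deleting any history later than it.

    `history` is a list of (effective_date, value) tuples. Per the
    documented rule, records dated after the ingested record are removed.
    """
    cutoff, _ = new_record
    kept = [(d, v) for (d, v) in history if d <= cutoff]
    kept.append(new_record)
    kept.sort()
    return kept

history = [
    (date(2012, 1, 1), "Boston"),
    (date(2014, 1, 1), "Chicago"),
    (date(2015, 1, 1), "New Jersey"),
]

# Ingesting 2013 - New York removes the 2014 and 2015 records.
updated = ingest_record(history, (date(2013, 1, 1), "New York"))
```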

Data warehouse ETL tasks can be run manually, scheduled to run periodically, or run as part of a workflow. The section below describes how to run a data warehouse task manually. For information on scheduling data warehouse tasks or including them in a workflow, see Controlling and monitoring tasks and workflows.


Data warehouse ETL tasks cannot run in parallel with data mart ETL tasks. Likewise, data warehouse ETL tasks that update the same tables cannot run in parallel with each other.
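One way to picture the constraint above is as per-table locking: a task acquires a lock for each table it updates, so two tasks touching the same table are serialized. This is purely an illustrative sketch; Compose enforces the rule internally, and the names below are hypothetical:

```python
import threading

# Hypothetical sketch: serialize tasks that update the same tables
# by taking one lock per target table before running.
_table_locks = {}
_registry_lock = threading.Lock()

def run_serialized(task, tables):
    """Run `task` after acquiring a lock for each table it updates."""
    with _registry_lock:
        locks = [_table_locks.setdefault(t, threading.Lock())
                 for t in sorted(tables)]  # sorted order avoids deadlock
    for lock in locks:
        lock.acquire()
    try:
        return task()
    finally:
        for lock in reversed(locks):
            lock.release()
```

Two tasks with disjoint table sets take disjoint locks and may overlap; tasks sharing a table queue behind the same lock and run one at a time.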

  1. Click the Manage button in the bottom right of the Data Warehouse panel. The Manage ETL Set dialog box opens.

  2. If you have more than one ETL Set, in the left pane, select the ETL Set that you want to run.
  3. Click the Run toolbar button. The dialog box switches to Monitor view and a progress bar shows the task's progress as a percentage.

    You can stop the task at any time by clicking the Abort toolbar button. This may be necessary if, for example, you need to edit the task settings urgently. After editing the task settings, simply click the Run button again to restart the task.


    Aborting a task may leave the data warehouse tables in an inconsistent state. Consistency will be restored the next time the task is run.

  4. When progress reaches 100%, close the Manage ETL Set dialog box.

Other monitoring information, such as the task details (for example, the number of rows inserted or updated) and the task log files, can be accessed by clicking the Run Details and Log buttons respectively.

Once the data has been successfully loaded into the data warehouse tables, you can proceed to the final part of the Compose for Data Warehouses workflow - defining and populating data marts. For more information, see Creating and managing data marts.