
Configuring an HDFS connection to run on Spark

Using the tHDFSConfiguration component, you can connect your Spark Jobs to your HDFS file system.

Before you begin

Procedure

  1. In the Repository, expand Metadata > Hadoop Cluster, then expand the Hadoop cluster metadata of your choice.
    1. Expand the HDFS folder of your Hadoop cluster metadata.
    2. Drag and drop the HDFS metadata onto the Designer.
    3. Select the tHDFSConfiguration component.
      The Hadoop Configuration Update Confirmation window opens.
  2. Click OK.

Results

Talend Studio updates the Spark configuration so that it corresponds to your cluster metadata.

What to do next

In the Run view, click Spark Configuration. The execution is configured with the HDFS connection metadata.
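Conceptually, the HDFS connection metadata picked up here boils down to a set of Spark/Hadoop properties. The sketch below is a hypothetical illustration of that mapping, not Talend Studio's actual implementation; the host name, port, and user name are placeholders, and it relies only on the standard Spark convention that `spark.hadoop.*` properties are passed through to the Hadoop configuration.

```python
# Hypothetical sketch of the Spark properties an HDFS connection
# typically corresponds to. Values below are placeholders.

def hdfs_spark_conf(namenode_host: str, namenode_port: int, user: str) -> dict:
    """Build the Spark-side properties for an HDFS connection."""
    return {
        # Default file system URI pointing at the HDFS NameNode;
        # the spark.hadoop. prefix forwards it to the Hadoop configuration.
        "spark.hadoop.fs.defaultFS": f"hdfs://{namenode_host}:{namenode_port}",
        # User name the Spark Job presents to HDFS (simple authentication).
        "HADOOP_USER_NAME": user,
    }

conf = hdfs_spark_conf("namenode.example.com", 8020, "talend")
print(conf["spark.hadoop.fs.defaultFS"])
# hdfs://namenode.example.com:8020
```

When the cluster uses Kerberos rather than simple authentication, additional security properties would be needed; the cluster metadata in the Repository carries those details as well.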
