Prerequisites

Before you begin to work with a Hadoop cluster as a target in Qlik Replicate, make sure that the following prerequisites have been met:

  • General:

    • The Hadoop WebHDFS must be accessible from the Qlik Replicate machine.
    • The Hadoop Data Nodes must be accessible from the Qlik Replicate machine.
    • The Hadoop WebHDFS service must be running (a connectivity check is sketched after this list).
  • ODBC Access:

    When accessing Hive using ODBC, the following ODBC drivers are supported (a connection check is sketched after this list):

    • Hortonworks: Cloudera ODBC Driver for Apache Hive 2.06.09.1009 only.

    • Cloudera: Cloudera ODBC Driver for Apache Hive 2.06.09.1009 only.

      Information note

      Cloudera ODBC drivers 2.5.20 or later do not support the Snappy compression method.

  • SSL: Before you can use SSL, you first need to perform the following tasks:

    • Configure each NameNode and each DataNode with an SSL certificate (issued by the same CA).
    • Place the CA certificate on the Replicate Server machine. The certificate should be a base64-encoded PEM (OpenSSL) file.
  • Permissions: The user specified in the Hadoop target settings must have write permission for the specified HDFS target directory.
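The sketch below shows one way to pre-check the connectivity, SSL, and permissions prerequisites from the Replicate Server machine using the standard WebHDFS REST API. It is an illustration only: the NameNode address and port, CA certificate path, HDFS target directory, and user name are assumed values that must be replaced with those of your cluster and endpoint settings.

    # Pre-flight check (illustrative): WebHDFS reachability, SSL trust, and
    # write permission on the HDFS target directory. All values are placeholders.
    import requests

    NAMENODE = "https://namenode.example.com:9871"   # assumed WebHDFS HTTPS address
    CA_CERT = "/etc/pki/hadoop/ca.pem"               # base64-encoded PEM CA certificate
    HDFS_DIR = "/user/replicate/target"              # HDFS target directory from the endpoint settings
    HDFS_USER = "replicate"                          # user specified in the Hadoop target settings

    def webhdfs(method, path, op, **params):
        """Issue a single WebHDFS REST call against the NameNode."""
        params.update({"op": op, "user.name": HDFS_USER})
        return requests.request(method, f"{NAMENODE}/webhdfs/v1{path}", params=params,
                                verify=CA_CERT, allow_redirects=False, timeout=30)

    # 1. WebHDFS service is running, reachable, and the CA certificate is trusted.
    assert webhdfs("GET", HDFS_DIR, "LISTSTATUS").status_code == 200

    # 2. Write permission on the target directory. CREATE returns a 307 redirect
    #    to a DataNode, which also confirms that the Data Nodes are reachable.
    probe = f"{HDFS_DIR}/_replicate_write_probe"
    redirect = webhdfs("PUT", probe, "CREATE", overwrite="true")
    assert redirect.status_code == 307
    assert requests.put(redirect.headers["Location"], data=b"ok",
                        verify=CA_CERT, timeout=30).status_code == 201

    # 3. Remove the probe file.
    webhdfs("DELETE", probe, "DELETE")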

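ODBC access to Hive can be verified in a similarly minimal way once the Cloudera ODBC Driver for Apache Hive is installed on the Replicate machine. The driver name, HiveServer2 host and port, authentication mechanism, and credentials below are assumptions for illustration only; use the values that match your driver installation and cluster security setup.

    # Illustrative ODBC connectivity check against HiveServer2.
    # Connection-string values are placeholders, not Replicate settings.
    import pyodbc

    conn_str = (
        "Driver={Cloudera ODBC Driver for Apache Hive};"
        "Host=hiveserver2.example.com;"   # assumed HiveServer2 host
        "Port=10000;"                     # default HiveServer2 port
        "AuthMech=3;"                     # user name / password authentication
        "UID=replicate;"
        "PWD=change_me;"
    )

    conn = pyodbc.connect(conn_str, autocommit=True)
    cursor = conn.cursor()
    cursor.execute("SHOW DATABASES")      # any lightweight statement proves the round trip
    print([row[0] for row in cursor.fetchall()])
    conn.close()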