Create a Databricks connection

You need to know the Databricks server and database name to create a connection. You must also have access credentials. Once you have created a connection to your Databricks database, you can select data from the available tables and load that data into your app.

In Qlik Sense and Qlik Cloud Analytics Services, you connect to Databricks through the Data manager or the Data load editor.

In QlikView, you connect to a Databricks database through the Edit Script dialog.
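
Once a connection exists, you load data through it with a load script. The following is a minimal sketch of such a script in the Data load editor; the connection, database, and table names are placeholders.

  // Reference the Databricks connection by the name given when it was created
  LIB CONNECT TO 'Databricks_MyConnection';

  // Load all fields from a table in the connected database (placeholder names)
  [MyTable]:
  LOAD *;
  SELECT *
  FROM mydatabase.mytable;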

Setting up the connection properties

Qlik Sense and Qlik Cloud Analytics Services database properties

Connection properties that can be configured
Database property Description Required
Host name The IP address or host name of the Databricks server. Yes
Port Server port for the Databricks connection. Yes
Catalog

If your Databricks host supports Unity Catalog, specify the catalog name. Otherwise, leave this field empty.

Information note: The catalog name is case sensitive.
Information note: In some cases, the default catalog value may be set to hive_metastore, even in workspaces where Unity Catalog is not enabled. In this scenario, the catalog name must be specified as hive_metastore; otherwise, the connection will fail.

See Set up and manage Unity Catalog to check whether the workspace is enabled for Unity Catalog.
See What are catalogs in Azure Databricks to identify the catalog name that must be specified in the connection dialog.
Only if your Databricks host supports Unity Catalog.
Database The name of the Databricks database. If you specified a catalog, you must specify a database in that catalog. Yes
HTTP Path The HTTP path to the Databricks compute resource. Yes
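
As an illustration only, a connection to an Azure Databricks SQL warehouse might use values along the following lines. All values are placeholders; copy the exact host name and HTTP path from the compute resource's connection details in your Databricks workspace.

  Host name: adb-1234567890123456.7.azuredatabricks.net
  Port: 443
  Catalog: main
  Database: default
  HTTP Path: /sql/1.0/warehouses/1234567890abcdef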

QlikView database properties

Database properties that can be configured
Database property Description Required
Spark Server Type The type of Databricks server can be Shark Server, Shark Server 2, or Spark Thrift Server. Yes
Host name The IP address or host name of the Databricks server. Yes
Port Server port for the Databricks database. Yes
Schema The name of the Databricks database schema. Yes
Thrift Transport Can be set to Binary, SASL, or HTTP. Default = HTTP Yes
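
In the Edit Script dialog, the connection wizard generates the connect statement and select statements for you. A minimal sketch of the general shape is shown below; the provider name, the connection-string keys, and the schema and table names are placeholders rather than values to type by hand.

  // Connect statement as generated by the connection wizard (illustrative only;
  // the provider name and connection-string keys are placeholders)
  CUSTOM CONNECT TO "Provider=QvDatabricksConnector;host=example.cloud.databricks.com;port=443;";

  // Select data through the connection (placeholder schema and table names)
  SQL SELECT * FROM myschema.mytable;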

Authenticating the driver

Databricks connectors have the following authentication mechanisms:

  • User name

  • User name and password

  • No authentication

  • Azure OAuth

    Information note: When using Direct Access gateway, Azure OAuth authentication requires Direct Access gateway 1.6.6 or later.

Qlik Sense and Qlik Cloud Analytics Services: Databricks authentication properties

These properties apply to authentication with:

  • User name

  • User name and password

  • No authentication

Authentication properties that can be configured
Property Description
Mechanism If the Databricks Server Type is Shark Server, you must select No Authentication. If the Databricks Server Type is Databricks Thrift Server, most configurations require User Name authentication. Selecting User Name or User Name And Password gives you the option to set up Account properties.
User Name User name for the Databricks connection.
Password Password for the Databricks connection.
Name

Name of the Databricks connection. The default name will be used if you do not enter a name.

Databricks configuration for OAuth

Your Databricks workspace must be configured to use OAuth.

Do the following:

  1. Have a Databricks service hosted in Azure.

  2. Register an OAuth app in Azure. For more information, see Configure an OpenID Connect OAuth application from Azure AD app gallery.

  3. Assign the following API to the OAuth app you created: https://azuredatabricks.net//user_impersonation.

QlikView: Databricks authentication properties

Authentication properties that can be configured
Property Description
Mechanism Authentication with user name only, with user name and password, or with no authentication. If the Spark Server Type is Shark Server, you must select No Authentication. If the Spark Server Type is Spark Thrift Server, most configurations require User Name authentication.
User name Username for the Databricks connection.
Password Password for the Databricks connection.

Account properties

Credentials

Credentials are used to prove that a user is allowed to access the data in a connection.

There are two types of credentials that can be used when making a connection in Qlik Sense SaaS. If you leave the User defined credentials check box deselected, then only one set of credentials will be used for the connection. These credentials belong to the connection and will be used by anyone who can access it. For example, if the connection is in a shared space, every user in the space will be able to use these credentials. This one-to-one mapping is the default setting.

If you select User defined credentials, then every user who wants to access this connection will need to input their own credentials before selecting tables or loading data. These credentials belong to a user, not a connection. User defined credentials can be saved and used in multiple connections of the same connector type.

In the Data load editor, you can click the key icon underneath the connection to edit your credentials. In spaces or Data manager, you can edit credentials by right-clicking on the connection and selecting Edit Credentials.

See which authentication type applies on each connector's page.

Account properties that can be configured
Account property Description
User defined credentials Select this check box if you want each user who accesses this connection to enter their own credentials. Deselect this check box if credentials can be shared with anyone who has access to this connection.
New credentials Drop-down menu item that appears if User defined credentials is selected.
Existing credentials Drop-down menu item that appears if User defined credentials is selected.
User User name for the connection.
Password Password for the connection.
Credentials name Name given to a set of user defined credentials.

Setting SSL options

SSL options that can be configured
Property Description Required
Trusted Certificate The full path to the SSL certificate if it is not stored in the standard system location. Yes, if the certificate is not stored in the standard system location.
Allow Self-signed Server Certificate Accept an SSL certificate from the server that is self-signed and not verified by a trusted authority. No
Allow Common Name Host Name Mismatch Allow a mismatch between the SSL certificate's common name and the name provided in the Host name field. No
Information note: SSL is enabled by default.

Miscellaneous properties

Miscellaneous properties and options that can be configured
Property Description
Query timeout Amount of time before a data load query times out. Can be set from 30 seconds to 65535 seconds. Default is 30 seconds.

Load optimization settings

Load properties that can be configured
Property Description Required
Max String Length

Maximum length of string fields. This can be set from 256 to 16384 characters. The default value is 4096. Setting this value close to the actual maximum string length of your data may improve load times, as it limits the need to allocate unnecessary resources. If a string is longer than the set value, it is truncated and the exceeding characters are not loaded.

No

Advanced options

Information note: This section is for advanced users who want to add additional connection parameters that are not displayed above.
Advanced options that can be configured
Option Description Required

Name

Name of the property. You can add additional properties by clicking the plus sign.

No
Value

Value of the property.

No
Information note: When you connect to a Databricks database in the Data load editor or the Edit Script dialog, you can click Test Connection before you create the connection.
