Azure Storage Metadata
The Qlik Azure Storage Metadata connector uses the Azure Blob API to access your Azure Storage metadata, like the names of files and subfolders in your Azure Storage account.
This connector is available in the following Qlik products:
- Qlik Sense Business
- Qlik Sense Enterprise SaaS
- Qlik Sense Enterprise on Windows
- Qlik Sense Desktop
If you are using this connector on Qlik Sense Enterprise on Windows or Qlik Sense Desktop, you will need to set an encryption key.
For more information, see Setting an encryption key.
Ways to access your data
To access your Azure Storage metadata, you need to authenticate the connector with your Azure Storage account credentials. After you create a connection, select Azure Storage Metadata as your data source, enter the blob container path, and select Authenticate. You are redirected to an Azure Storage login page. Log in with your credentials to retrieve an authentication token, and enter the token into the connector.
Loading data from tables
After you authenticate the connector with your account credentials, use the tables to fetch your data. Below are use cases for some of the available tables:
- Use this to list the files and subfolders of a blob repository located in your Azure Storage account.
- Use this to list the metadata and properties for a blob repository.
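Under the hood, listing operations like these map to the Blob service List Blobs REST operation, which returns an XML body enumerating blobs and (when a delimiter is used) folder-like prefixes. The sketch below parses an illustrative sample of that response shape; the sample data, the `parse_listing` helper name, and the exact account and container are assumptions for illustration, not part of the connector itself.

```python
import xml.etree.ElementTree as ET

# Illustrative response body in the shape returned by the Blob service
# "List Blobs" operation, e.g.
# GET https://<account>.blob.core.windows.net/<container>?restype=container&comp=list&delimiter=/
# (sample data only; a real response comes from the API).
SAMPLE_RESPONSE = """<?xml version="1.0" encoding="utf-8"?>
<EnumerationResults ContainerName="mycontainer">
  <Blobs>
    <Blob>
      <Name>report.csv</Name>
      <Properties><Content-Length>2048</Content-Length></Properties>
    </Blob>
    <BlobPrefix>
      <Name>archive/</Name>
    </BlobPrefix>
  </Blobs>
</EnumerationResults>"""

def parse_listing(xml_text):
    """Return (files, subfolders) from a List Blobs XML response.

    <Blob> elements are individual files; <BlobPrefix> elements are
    the folder-like prefixes produced by a delimiter-based listing.
    """
    root = ET.fromstring(xml_text)
    files = [n.text for n in root.findall("./Blobs/Blob/Name")]
    folders = [n.text for n in root.findall("./Blobs/BlobPrefix/Name")]
    return files, folders

files, folders = parse_listing(SAMPLE_RESPONSE)
print(files)    # ['report.csv']
print(folders)  # ['archive/']
```

This is only a sketch of the response format; the connector handles the request, authentication, and parsing for you.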
To select and load data from a table, enter the required table parameters and click Preview data. Required parameters are marked with an asterisk (*) in the dialog. The table fields are displayed under the Data preview tab. You can select fields individually by selecting the box beside each field name. Select Insert script after you have made your selection.
Reference - Azure Storage documentation
You can refer to the Microsoft Azure Blob service REST API documentation to learn more about the requirements and restrictions imposed by the API.
You receive an error message that you have reached the API rate limit
You have exceeded the API limits that are imposed by the Azure Blob API.
To reduce the impact of reaching API rate limits, develop your app with the following in mind:
- Extract only the data you need.
- Reload one Azure Storage-based application at a time.
- Ensure that loops in your script that make API calls terminate and do not run indefinitely.
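If you are calling the Azure Blob API directly (outside the connector), a common way to soften rate-limit errors is to retry with exponential backoff. The sketch below shows the pattern; `RateLimitError`, `call_with_backoff`, and the flaky request function are hypothetical names for illustration, not part of the Azure API or the connector.

```python
import random
import time

class RateLimitError(Exception):
    """Raised when the API reports that the rate limit was reached (hypothetical)."""

def call_with_backoff(request_fn, max_retries=5, base_delay=1.0):
    """Retry request_fn with exponential backoff and jitter on rate-limit errors."""
    for attempt in range(max_retries):
        try:
            return request_fn()
        except RateLimitError:
            if attempt == max_retries - 1:
                raise  # give up after the last attempt
            # Wait 2^attempt * base_delay seconds, plus a little jitter,
            # so retries spread out instead of hammering the API.
            time.sleep(base_delay * (2 ** attempt) + random.uniform(0, 0.1))

# Illustrative usage: the first two calls hit the limit, the third succeeds.
attempts = {"n": 0}
def flaky_request():
    attempts["n"] += 1
    if attempts["n"] < 3:
        raise RateLimitError()
    return "ok"

print(call_with_backoff(flaky_request, base_delay=0.01))  # ok
```

The same idea applies at the script level: spacing out reloads of Azure Storage-based apps has the effect of a coarse-grained backoff.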