Google Cloud Storage Metadata
The Qlik Google Cloud Storage Metadata Connector uses the Google Cloud API to access your Google Cloud Storage metadata, such as the names of files and subfolders in your Google Cloud Storage bucket.
Supported in:
- Qlik Sense Business
- Qlik Sense Enterprise SaaS
Connection to your data
You need to authenticate the connector with your Google Cloud Storage account credentials:
- Service account key file: You need one or more service accounts with appropriate roles and permissions, configured in the IAM settings of the Google Cloud console. The simplest option is a service account with the Storage Admin role and no conditions. Create a JSON key for each service account you want to use, then download and store the key file securely. P12 keys are not supported, as they are a legacy format. To authenticate, you need the key file and the name of the bucket you want to connect to. These details are stored securely in the connection.
- Bucket name: You can find bucket names in the Google Cloud Storage console storage page.
After you authenticate the connector with your account credentials, use the tables to fetch your data. Some tables are preconfigured to access a specific set of data while others let you create custom queries.
Do the following:
1. Select a table from the Tables column.
2. Enter table parameters, if required. Required parameters are marked with a *.
3. Click Data preview to see a sample of your data and to select the table fields that you want to load.
The table below outlines use cases for some of the available tables.
|Table|Description|
|---|---|
|ListObjects|Lists the objects in a bucket. You must have bucket READ permission.|
|Metadata|Returns metadata for the object.|
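As an illustration, a load script that fetches from the ListObjects table might look like the following sketch. The connection name and field names are assumptions for illustration and will vary with your configuration and the connector's actual schema:

```
// Hedged sketch: connection and field names are illustrative
LIB CONNECT TO 'Google_Cloud_Storage_Metadata';

Objects:
LOAD
    Name,
    Size,
    Updated;
SELECT Name, Size, Updated
FROM ListObjects;
```

In practice, selecting fields in Data preview generates the LOAD and SELECT statements for you, so you rarely need to write them by hand.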
Loading data into your Qlik Sense app
Once you have finished selecting data, you can load your data into your Qlik Sense app. How you proceed will depend on whether you load data with Add data or the Data load editor.
To load data using Add data, click Add data. This opens the data in the Associations view of Data manager, where you can continue to add data sources, transform the data, and associate the tables.
Data profiling is enabled by default when you click Add data. Data profiling does the following:
- Recommends data associations.
- Auto-qualifies common fields between tables. This adds a unique prefix based on table name.
- Maps date and time fields to autoCalendar.
Tables are not associated on common field names automatically. You can associate tables in the Associations view.
If you want to load the data directly into your app, click the button beside Add data and then disable data profiling.
When you add data with data profiling disabled, all existing data from data sources will be reloaded when you add the data. Tables will be associated on common field names automatically. Date and time fields will not be created.
Data load editor
To load data with the Data load editor, click Insert script once you are finished selecting the data. A load script is inserted into the script editor of the Data load editor. You can continue to edit the script in the script editor or you can click Load data to run the data load script.
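For example, an inserted script for the Metadata table might resemble this sketch. The connection name and field names are illustrative assumptions, not the connector's actual schema:

```
LIB CONNECT TO 'Google_Cloud_Storage_Metadata';

// Hedged sketch: returns metadata for a single object
ObjectMetadata:
LOAD
    Name,
    ContentType,
    TimeCreated;
SELECT Name, ContentType, TimeCreated
FROM Metadata;
```

You can edit the generated statements in the script editor, for example renaming fields with AS in the LOAD statement, before clicking Load data.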
Reference - Google Cloud documentation
You can refer to the Google Cloud API documentation to learn more about the requirements and restrictions imposed by the Google Cloud API.
You receive generic access error messages
Several generic access error messages can occur, for example "cannot open file" or "no fields to load".
You have entered your Google Cloud Storage credentials or file/bucket information incorrectly.
Ensure that your Google Cloud Storage account credentials have been correctly entered:
- Check that your Google Cloud Storage key and bucket name are accurately entered in your connector configuration. Check that your load script correctly references your connection name, file name, file path, and file formatting information.
- Ensure that the file is available in your Google Cloud Storage instance at the specified file path.
You have exceeded the API limits that are imposed by the Google Cloud API.
To reduce the impact of reaching the API rate limits, develop your app with the following in mind:
- Extract only the data you need.
- Reload one Google Cloud Storage-based application at a time.
- Ensure that loops in your script that make API calls cannot run indefinitely.
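As a sketch of the last point, prefer a bounded For loop over an open-ended Do While when each iteration makes an API call. The table and field names here are illustrative assumptions:

```
// Bounded: at most 10 iterations, so at most 10 API calls
For i = 1 to 10
    // In practice, vary a table parameter (such as a prefix)
    // per iteration; names here are illustrative
    Objects:
    LOAD Name;
    SELECT Name FROM ListObjects;
Next i
```

A fixed upper bound keeps a reload from consuming your API quota even if the loop's exit condition is never met.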