
Building access to Cloud Storage

Procedure

  1. Double-click tBigQueryOutput to open its Component view.
  2. Click Sync columns to retrieve the schema from its preceding component.
  3. In the Local filename field, enter the path to the local file to be created and then transferred to BigQuery.
  4. Navigate to the Google APIs Console in your web browser to access the Google project hosting the BigQuery and Cloud Storage services you need to use.
  5. Click Google Cloud Storage > Interoperable Access to open its view.
  6. In the Google storage configuration area of the Component view, paste the Access key and Access secret values from the Interoperable Access tab view into the corresponding fields.
  7. In the Bucket field, enter the path to the bucket you want to store the transferred data in. In this example, it is talend/documentation.
    This bucket must already exist in Cloud Storage.
  8. In the File field, enter the path in Google Cloud Storage where the file to be transferred to BigQuery is received and created. In this example, it is gs://talend/documentation/biquery_UScustomer.csv. The file name must be the same as the one you defined in the Local filename field.

    Troubleshooting: if you encounter issues such as Unable to read source URI of the file stored in Google Cloud Storage, check whether you entered the same file name in these two fields; see the sketch after this procedure.

  9. Enter 0 in the Header field so that no rows of the transferred data are ignored.
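
The sketch below illustrates, outside of Talend, the relationship between the Bucket, File, and Local filename values configured above: the File field must point into the bucket entered in the Bucket field, and its file name must match the one set in the Local filename field. It is plain Java and is not part of the code generated by the Talend Job; the class name and the local path /tmp/biquery_UScustomer.csv are hypothetical, while the bucket and gs:// values are the example values from this procedure.

```java
import java.nio.file.Paths;

// Standalone sketch: checks that the example Bucket, File, and Local filename
// values are consistent with each other. Not generated by the Talend Job.
public class CheckBigQueryOutputPaths {

    public static void main(String[] args) {
        String bucket = "talend/documentation";                              // Bucket field
        String gcsFile = "gs://talend/documentation/biquery_UScustomer.csv"; // File field
        String localFilename = "/tmp/biquery_UScustomer.csv";                // Local filename field (hypothetical path)

        // The object named in the File field must live under the path entered in the Bucket field.
        if (!gcsFile.startsWith("gs://" + bucket + "/")) {
            System.out.println("File field does not point into the bucket: " + bucket);
        }

        // The file name must be identical in the Local filename and File fields;
        // a mismatch leads to errors such as "Unable to read source URI".
        String localName = Paths.get(localFilename).getFileName().toString();
        String gcsName = gcsFile.substring(gcsFile.lastIndexOf('/') + 1);
        if (localName.equals(gcsName)) {
            System.out.println("File names match: " + localName);
        } else {
            System.out.println("File name mismatch: local=" + localName + ", Cloud Storage=" + gcsName);
        }
    }
}
```

Running the sketch with mismatched names prints the discrepancy that underlies the Unable to read source URI error mentioned in the troubleshooting note.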
