
Publishing a message to an Azure Event Hub instance

This scenario aims to help you set up and use connectors in a pipeline. You are advised to adapt it to your environment and use case.

Example of a pipeline created from the instructions below.

Before you begin

You have downloaded the local_file-to-azure_event_hubs.csv file used in this scenario, and you have an Azure Event Hubs namespace with an Event Hub instance (here, baltimore-restaurants) whose endpoint and shared access signature keys you can retrieve.

Procedure

  1. Click Datasets > Drop a file or browse.
  2. Browse to the local_file-to-azure_event_hubs.csv file and select it.
  3. Rename it, for example Baltimore restaurants.
  4. Click Connections > Add connection.
  5. In the panel that opens, select the type of connection you want to create.

    Example

    Azure EventHubs
  6. Select your engine in the Engine list.
    Information noteNote:
    • Using a Remote Engine Gen2 rather than the Cloud Engine for Design is recommended for advanced data processing.
    • If no Remote Engine Gen2 has been created from Talend Management Console, or if it exists but appears as unavailable (meaning it is not up and running), you cannot select a connection type in the list or save the new connection.
    • The list of available connection types depends on the engine you have selected.
  7. Select the type of connection you want to create.
    Here, select Azure Event Hubs.
  8. Fill in the connection properties to safely access your Azure service resources (endpoint, shared access signature keys) as described in Azure Event Hubs properties, check the connection and click Add dataset. A minimal code sketch of these connection properties follows this procedure.
  9. In the Add a new dataset panel, configure your dataset. In this example, the baltimore-restaurants Event Hub, which is currently empty, will be used to publish the data about Baltimore restaurants.

    Example

    Configuration of a new Azure Event Hubs dataset.
  10. Name your dataset, Restaurant Event Hub for example.
  11. Click Validate to save your dataset.
  12. Click Add pipeline on the Pipelines page. Your new pipeline opens.
  13. Give the pipeline a meaningful name.

    Example

    From local file to Azure Event Hubs - publish an event
  14. Click ADD SOURCE and, in the panel that opens, select your source dataset, Baltimore restaurants.
  15. Click Add processor and add a Strings processor to the pipeline to change the case of some records. The configuration panel opens.
  16. Give a meaningful name to the processor.

    Example

    change case of police district records
  17. Configure the processor:
    1. Select Change to title case in the Function name list, as you want to change the case of the records from upper case to title case (a minimal sketch of this transformation follows this procedure).
    2. Select .police_district in the Fields to process list, as you want to apply this change to the values of these specific records.
  18. Click Save to save your configuration.
  19. (Optional) Look at the preview of the processor to see the data after the case change.
    In the Output data preview, the police district values have changed from all capital letters to an initial capital on each word.
  20. Click the ADD DESTINATION item on the pipeline to open the panel where you can select the Azure Event Hub in which your output data will be loaded, Restaurant Event Hub.
  21. In the Configuration tab of the destination, Round-Robin is the default Partition type used when publishing an event, but you can specify a partition key or a partition ID instead, depending on your use case (see the partitioning sketch after this procedure).
  22. On the top toolbar of Talend Cloud Pipeline Designer, click the Run button to open the panel allowing you to select your run profile.
  23. Select your run profile in the list (for more information, see Run profiles), then click Run to run your pipeline.
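
The connection properties from step 8 map onto a standard Event Hubs connection string. The following is a minimal sketch using the azure-eventhub Python package, not what Pipeline Designer runs internally; the bracketed namespace, key name, and key value are placeholders to replace with your own:

    from azure.eventhub import EventHubProducerClient

    # Endpoint and shared access signature key, as entered in the connection
    # properties. All bracketed values are placeholders, not values from this
    # scenario.
    CONN_STR = (
        "Endpoint=sb://<your-namespace>.servicebus.windows.net/;"
        "SharedAccessKeyName=<your-key-name>;"
        "SharedAccessKey=<your-key>"
    )

    # The client that the partitioning sketch below reuses.
    producer = EventHubProducerClient.from_connection_string(
        conn_str=CONN_STR,
        eventhub_name="baltimore-restaurants",
    )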
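
The Change to title case function from step 17 amounts to a per-field string transformation. Here is a minimal Python sketch of the same idea; the sample record is hypothetical and only illustrates the assumed shape of the data:

    # Hypothetical record; the real data comes from local_file-to-azure_event_hubs.csv.
    record = {"name": "SOME DINER", "police_district": "NORTHEASTERN"}

    # Change to title case: "NORTHEASTERN" -> "Northeastern".
    record["police_district"] = record["police_district"].title()

    print(record)  # {'name': 'SOME DINER', 'police_district': 'Northeastern'}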
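
The partitioning choice from step 21 mirrors how the Event Hubs client APIs publish events. The sketch below reuses the producer from the connection sketch above; it illustrates the Event Hubs publishing model, not Pipeline Designer's internals:

    from azure.eventhub import EventData

    with producer:
        # Round-Robin (default): no partition key or ID, so the service
        # spreads events across partitions.
        batch = producer.create_batch()
        batch.add(EventData('{"police_district": "Northeastern"}'))
        producer.send_batch(batch)

        # Partition key: all events sharing a key land on the same partition.
        keyed = producer.create_batch(partition_key="Northeastern")
        keyed.add(EventData('{"police_district": "Northeastern"}'))
        producer.send_batch(keyed)

        # Partition ID: events are pinned to one explicit partition.
        pinned = producer.create_batch(partition_id="0")
        pinned.add(EventData('{"police_district": "Northeastern"}'))
        producer.send_batch(pinned)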

Results

Your pipeline is being executed; the Baltimore restaurant data from your local file has been processed, and the output flow is sent to the baltimore-restaurants Azure Event Hub you defined:

In the Azure event hub, 300 messages were received in the last hour, representing 25.78 incoming kilobytes.

What to do next

Once the event is published, you can consume the messages in another pipeline that uses the Azure Event Hub as its source:

A new pipeline where the source is the restaurant message data from the previous destination pipeline.
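
To check the published events outside Pipeline Designer, a minimal consumer sketch with the azure-eventhub Python package could look like this; the connection string placeholder is the same assumption as in the publishing sketch above, and $Default is the default consumer group:

    from azure.eventhub import EventHubConsumerClient

    consumer = EventHubConsumerClient.from_connection_string(
        conn_str=CONN_STR,  # same placeholder connection string as above
        consumer_group="$Default",
        eventhub_name="baltimore-restaurants",
    )

    def on_event(partition_context, event):
        # Print each restaurant message and the partition it was read from.
        print(partition_context.partition_id, event.body_as_str())

    with consumer:
        # starting_position="-1" reads from the beginning of each partition.
        consumer.receive(on_event=on_event, starting_position="-1")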
