tKafkaOutput Standard properties
These properties are used to configure tKafkaOutput running in the Standard Job framework.
The Standard tKafkaOutput component belongs to the Internet family.
The component in this framework is available in all Talend products with Big Data and in Talend Data Fabric.
Basic settings
Input type |
Select the type of messages to be sent to Kafka from the drop-down list: Byte[] or ProducerRecord.
This property is available since Kafka 2.2.1. |
Schema and Edit schema |
A schema is a row description. It defines the number of fields (columns) to be processed and passed on to the next component. When you create a Spark Job, avoid the reserved word line when naming the fields. Note that the schema of this component is read-only. It stores the messages to be published. |
Use an existing connection |
Select this check box and in the Component List drop-down list, select the desired connection component to reuse the connection details you already defined. |
Version |
Select the version of the Kafka cluster to be used. If you have installed the 8.0.1-R2024-02 Talend Studio Monthly update or a later one delivered by Talend, the Kafka 2.4.x version and previous versions are deprecated. |
Broker list |
Enter the addresses of the broker nodes of the Kafka cluster to be used. Each address takes the form hostname:port, that is, the host name and the listening port of a broker node in this Kafka cluster. If you need to specify several addresses, separate them using a comma (,). |
Topic name |
Enter the name of the topic you want to publish messages to. This topic must already exist. This property is only available when you select Byte[] from the Input type drop-down list. |
Compress the data |
Select the Compress the data check box to compress the output data. |
Use SSL/TLS |
Select this check box to enable the SSL or TLS encrypted connection. This check box is available since Kafka 0.9.0.1. |
Set keystore |
Select this check box to enable the SSL or TLS encrypted connection via a tSetKeystore component. Then you need to use the tSetKeystore component in the same Job to specify the encryption information. This check box is available when you select the Use SSL/TLS check box.
Information note: This option is available when you have installed the 8.0.1-R2022-05 Talend Studio Monthly update or a later one delivered by Talend. For more information, check with your administrator. |
Use Kerberos authentication |
If the Kafka cluster to be used is secured with Kerberos, select this check box to display the related parameters to be defined.
For further information about how a Kafka cluster is secured with Kerberos, see Authenticating using SASL. This check box is available since Kafka 0.9.0.1. |
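For reference, the connection settings above correspond roughly to standard Kafka producer client properties. The following is a minimal sketch, not the component's actual implementation: the property names come from the Kafka producer configuration, and the host names and values are placeholders.

```java
import java.util.Properties;

public class KafkaConnectionSketch {
    public static void main(String[] args) {
        Properties props = new Properties();
        // Broker list: comma-separated hostname:port pairs (placeholder hosts)
        props.put("bootstrap.servers", "broker1.example.com:9092,broker2.example.com:9092");
        // Use SSL/TLS (available from Kafka 0.9.0.1 onwards)
        props.put("security.protocol", "SSL");
        // With Kerberos authentication, a SASL protocol would be used instead,
        // together with the Kerberos service name, for example:
        // props.put("security.protocol", "SASL_SSL");
        // props.put("sasl.kerberos.service.name", "kafka");
        System.out.println(props.getProperty("security.protocol")); // prints SSL
    }
}
```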
Advanced settings
Kafka properties |
Add the Kafka new producer properties you need to customize to this table. For more information about the new producer properties you can define in this table, see the section describing the new producer configuration from the official Kafka documentation. |
Set Headers |
Select this check box to add headers to messages to be sent. This feature is available from Kafka 1.1.0 onwards. |
Use schema registry |
Select this check box to use Confluent Schema Registry and to display the related parameters to be defined.
For more information about Schema Registry, see the Confluent documentation. This option is only available when you select ProducerRecord from the Input type drop-down list in the Basic settings view.
Information note: This option is available when you have installed the 8.0.1-R2022-01 Talend Studio Monthly update or a later one delivered by Talend. For more information, check with your administrator. |
tStatCatcher Statistics |
Select this check box to gather the processing metadata at the Job level as well as at each component level. |
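The entries you add to the Kafka properties table map to keys of the Kafka new producer configuration. A hedged sketch of the kind of customization involved follows; the key names are real Kafka producer configuration keys, but the values are illustrative examples, not defaults used by the component.

```java
import java.util.Properties;

public class ProducerTuningSketch {
    public static void main(String[] args) {
        Properties props = new Properties();
        // Example new producer properties you might add in the table:
        props.put("acks", "all");      // wait for full acknowledgment from replicas
        props.put("linger.ms", "5");   // let the producer batch messages for up to 5 ms
        // "Compress the data" roughly corresponds to setting a compression codec:
        props.put("compression.type", "gzip");
        System.out.println(props.getProperty("compression.type")); // prints gzip
    }
}
```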
Global Variables
ERROR_MESSAGE |
The error message generated by the component when an error occurs. This is an After variable and it returns a string. This variable functions only if the Die on error check box is cleared. |
NB_LINE |
The number of rows processed. This is an After variable and it returns an integer. |
NB_ERRORS |
The number of rows processed with errors. This is an After variable and it returns an integer. |
NB_SUCCESS |
The number of rows successfully processed. This is an After variable and it returns an integer. |
Usage
Usage rule |
This component is an end component. It requires a tJavaRow or tJava component to transform the incoming data into serialized byte arrays. The following sample shows how to construct a statement to perform this transformation:
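The transformation described below can be sketched as follows. The RowStruct-style classes here are hypothetical stand-ins for the row classes that Talend Studio generates from the schemas; only the statement marked in the comment goes into the tJavaRow component itself.

```java
// Sketch of the tJavaRow serialization step; the nested classes are
// simplified stand-ins for the Studio-generated row classes.
public class SerializeSketch {
    static class InputRow { String users; }            // incoming schema with a "users" column
    static class OutputRow { byte[] serializedValue; } // tKafkaOutput's single read-only column

    public static void main(String[] args) {
        InputRow input_row = new InputRow();
        input_row.users = "user42";

        OutputRow output_row = new OutputRow();
        // The statement to place in tJavaRow:
        output_row.serializedValue = input_row.users.getBytes();

        System.out.println(output_row.serializedValue.length); // prints 6
    }
}
```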
In this code, the output_row variable represents the schema of the data to be output to tKafkaOutput, and output_row.serializedValue represents the single read-only column of that schema; the input_row variable represents the schema of the incoming data, and input_row.users represents the input column called users to be transformed into byte arrays by the getBytes() method. |