Reading and writing Avro data from Kafka using Standard Jobs
This scenario explains how to configure a schema registry with an Avro deserializer, and how to consume and produce Avro data through the ConsumerRecord and ProducerRecord objects handled by the Kafka components in your Standard Jobs.


This scenario applies only to Talend products with Big Data and Talend Data Fabric.
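The Kafka components in a Standard Job are built on the Apache Kafka Java client, where schema-registry-aware deserialization is driven by consumer configuration properties. As a rough sketch of that configuration outside the Studio (the endpoints localhost:9092 and http://localhost:8081, the group id, and the class name AvroConsumerConfig are illustrative assumptions; the deserializer shown is Confluent's KafkaAvroDeserializer):

```java
import java.util.Properties;

public class AvroConsumerConfig {

    // Builds the consumer configuration used for schema-registry-backed
    // Avro deserialization. Endpoint values here are placeholders.
    static Properties consumerProps() {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");   // assumed broker address
        props.put("group.id", "avro-demo-group");           // assumed consumer group
        // Keys arrive as plain strings; values are Avro-encoded and are
        // decoded against the schema fetched from the registry.
        props.put("key.deserializer",
                  "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer",
                  "io.confluent.kafka.serializers.KafkaAvroDeserializer");
        props.put("schema.registry.url", "http://localhost:8081"); // assumed registry address
        // false: deserialized values are returned as GenericRecord instances
        // rather than generated SpecificRecord classes.
        props.put("specific.avro.reader", "false");
        return props;
    }

    public static void main(String[] args) {
        System.out.println(consumerProps().getProperty("value.deserializer"));
    }
}
```

With this configuration, a KafkaConsumer<String, GenericRecord> built from these properties returns ConsumerRecord objects whose value() is the decoded Avro GenericRecord; the mirror-image producer side uses KafkaAvroSerializer and wraps each message in a ProducerRecord before sending.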