Analyzing people's activities using a Storm topology (deprecated)
For more technologies supported by Talend, see Talend components.
This scenario applies only to Talend Real Time Big Data Platform and Talend Data Fabric.
In this scenario, a four-component Storm Job (a topology) is created to receive messages about the activities of a given group of people and analyze the popularity of those activities.
This Job subscribes to the related topic created by the Kafka topic producer, which means you need to install a Kafka cluster in your messaging system to maintain the feeds of messages. For further information about the Kafka messaging service, see Apache's documentation about Kafka.
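For reference, the following is a minimal sketch of how such a topic could be created with the Kafka command-line tools of the Kafka 0.8.x series used at that time; the ZooKeeper address and the topic name activities are assumptions and must match your own cluster and the topic used by the producer.

    # Assumed ZooKeeper host and topic name; adjust to your environment
    bin/kafka-topics.sh --create \
      --zookeeper localhost:2181 \
      --replication-factor 1 \
      --partitions 1 \
      --topic activities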
In addition, since this Job runs on top of Storm, you need to ensure that your Storm system is ready for use. For further information about Storm, see Apache's documentation about Storm.
Note that when you use the Storm system installed in the Hortonworks Data Platform 2.1 (HDP 2.1), ensure that the names of the Storm DRPC (distributed remote procedure call) servers have been properly defined in the Custom storm.yaml section of the Config tab of Storm in Ambari's web console. For example, if you need to use two Storm DRPC servers, Server1 and Server2, you must define them in the Custom storm.yaml section as follows: [Server1,Server2].
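If you edit the storm.yaml file directly instead of going through Ambari, the equivalent entry would look like the following sketch, assuming the standard Storm drpc.servers property and the example host names Server1 and Server2 used above.

    # Example storm.yaml entry listing the DRPC server host names
    drpc.servers:
      - "Server1"
      - "Server2"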
To replicate this scenario, proceed as follows.