In the Integration perspective of the Studio, create an empty Map/Reduce Job from the Job Designs node in the Repository tree view. For further information about how to create a Map/Reduce Job, see the Talend Big Data Getting Started Guide.
Drop a tHDFSInput component, a tJavaMR component, and a tHDFSOutput component in the workspace.
The tHDFSInput component reads data from the Hadoop distribution to be used, and the tHDFSOutput component writes processed data into that distribution.
Connect these components using the Row > Main link.
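The tJavaMR component sits between the input and output components and holds the custom Java logic applied to each incoming row. As a rough illustration only (this is plain Java, not the code Talend generates, and the method names are hypothetical), the per-row transformation you might write in such a step could look like this:

```java
import java.util.ArrayList;
import java.util.List;

public class RowTransform {
    // Hypothetical per-row logic of the kind one might place in a
    // tJavaMR component: trim and uppercase each incoming record.
    static String transform(String row) {
        return row.trim().toUpperCase();
    }

    public static void main(String[] args) {
        // Simulate rows read by tHDFSInput and written by tHDFSOutput.
        List<String> input = List.of(" alice ", "bob");
        List<String> output = new ArrayList<>();
        for (String row : input) {
            output.add(transform(row));
        }
        System.out.println(output);
    }
}
```

In the actual Job, the rows are delivered by the Row > Main connections rather than a local list; this sketch only shows the shape of the row-level logic.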