Logging and checkpointing the activities of your Apache Spark Job
It is recommended that you enable Spark logging and checkpointing in the Spark configuration tab of the Run view of your Spark Job, so that you can debug the Job and resume it from a checkpoint when issues arise.
The information in this section is only for users who have subscribed to Talend Data Fabric or to any Talend product with Big Data.
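Behind the Studio fields, these options correspond to standard Spark settings. As a rough sketch (the application name and HDFS paths below are placeholder assumptions, not values from this document), enabling event logging and setting a checkpoint directory in plain Spark code looks like:

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Enable the Spark event log so completed Jobs can be inspected later,
// e.g. through the Spark History Server.
val conf = new SparkConf()
  .setAppName("MyTalendSparkJob")                       // hypothetical Job name
  .set("spark.eventLog.enabled", "true")                // activate Spark logging
  .set("spark.eventLog.dir", "hdfs:///tmp/spark-events") // placeholder path

val sc = new SparkContext(conf)

// Checkpointing writes RDD data to reliable storage so the Job can be
// resumed after a failure instead of recomputing the full lineage.
sc.setCheckpointDir("hdfs:///tmp/spark-checkpoints")     // placeholder path
```

This is a configuration sketch rather than a runnable standalone program; in a Talend Spark Job these values are set through the Spark configuration tab rather than in hand-written code.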