Switching between modes, distributions, or environments with Spark Universal
The Spark Universal mechanism allows you to quickly switch between different Spark modes, distributions, or environments by changing the Hadoop configuration JAR file while keeping the same Job configuration; the general principle is sketched after the list below. The switch operation can be performed on:
- Spark mode: you can switch between the Local and Yarn cluster modes, for example to first test your Job on your local machine before submitting it to a cluster.
- Distribution: you can switch between the different big data distributions available for a given Spark version.
- Environment: you can switch between your development, integration, or production environment.
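
Talend Studio performs this switch through the Job's Spark configuration, but the underlying principle can be illustrated with plain Spark code: the Job logic hard-codes no master, distribution, or cluster endpoint, so only external configuration changes between runs. The following Scala sketch is a hypothetical minimal Job (the object name SampleJob and the sample data are illustrative, not part of Spark Universal):

```scala
import org.apache.spark.sql.SparkSession

object SampleJob {
  def main(args: Array[String]): Unit = {
    // No master, distribution, or cluster address is hard-coded here;
    // the launcher (spark-submit, or the Job's Spark configuration in
    // Talend Studio) supplies them at run time.
    val spark = SparkSession.builder()
      .appName("SampleJob")
      .getOrCreate()

    import spark.implicits._

    // Trivial workload so the sketch is runnable end to end.
    val counts = Seq("spark", "universal", "spark")
      .toDF("word")
      .groupBy("word")
      .count()
    counts.show()

    spark.stop()
  }
}
```

With this shape, `spark-submit --master local[*]` runs the same artifact locally, while `--master yarn` together with a `HADOOP_CONF_DIR` pointing at a given cluster's configuration files sends it to that environment, which parallels swapping the Hadoop configuration JAR file in Spark Universal.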