Added on September 16, 2016 3:59PM
DSS only supports one Spark installation (i.e., a directory containing the Spark configuration and libraries for a given Spark version); see here for the details.
However, by using different Spark configurations you can have your Spark jobs run either locally (Spark master "local[*]") or remotely (e.g., Spark master "yarn-client" to run on the Hadoop cluster, or spawn a Spark standalone cluster on your machines and provide its URL as the Spark master).
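For illustration, here is a minimal PySpark sketch of how the master URL decides where a job runs. Note this is a generic Spark example, not DSS-specific: in DSS you would normally pick one of the named Spark configurations rather than hard-code the master in a recipe, and the standalone-cluster URL (spark://master-host:7077) is a hypothetical placeholder.

```python
from pyspark import SparkConf, SparkContext

# Choose where the job runs by setting the Spark master:
#   "local[*]"                  - run locally, using all available cores
#   "yarn-client"               - run on the Hadoop cluster via YARN
#   "spark://master-host:7077"  - run on a standalone cluster you spawned
#                                 (hypothetical URL; substitute your own)
conf = SparkConf().setAppName("master-example").setMaster("local[*]")
sc = SparkContext(conf=conf)

# Trivial job to confirm the context works: sum the integers 0..9.
print(sc.parallelize(range(10)).sum())  # prints 45

sc.stop()
```

Swapping the string passed to setMaster() is the only change needed to move this job from local execution to the Hadoop cluster or a standalone cluster; the job code itself stays the same.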