Re: Spark on local machine (where DSS is installed) + Spark on another cluster
DSS only supports one Spark installation (i.e. a directory with the Spark configuration and libraries for a given version of Spark); see here for the details.
However, by defining different Spark configurations, you can have your Spark job run either locally (Spark master "local[*]") or remotely (e.g. Spark master "yarn-client" to run on the Hadoop cluster, or spawn a Spark standalone cluster on your cluster and provide its URL as the Spark master).
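To make the distinction concrete, here is a minimal PySpark sketch (not DSS-specific; in DSS you would set these values in the named Spark configurations rather than in code) showing the three kinds of master values mentioned above. The standalone master host and port are placeholders, and note that on Spark 2.x+ the "yarn-client" master string is replaced by master "yarn" with deploy mode "client":

from pyspark.sql import SparkSession

# Run locally on the machine where DSS is installed,
# using all available cores:
spark = SparkSession.builder \
    .master("local[*]") \
    .appName("demo") \
    .getOrCreate()

# ...or run on the Hadoop cluster via YARN in client mode
# (older Spark versions; on Spark 2.x+ use "yarn" plus
# spark.submit.deployMode=client):
# spark = SparkSession.builder.master("yarn-client").appName("demo").getOrCreate()

# ...or run on a Spark standalone cluster, using its master URL
# (host and port below are placeholders):
# spark = SparkSession.builder.master("spark://master-host:7077").appName("demo").getOrCreate()

In all three cases the job code itself is unchanged; only the master setting in the Spark configuration decides where the executors actually run.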