Spark and HDFS Integration

Bader
Bader Registered Posts: 46 ✭✭✭✭✭

Hi All,

Thanks for great support.

I would like to install DSS on a different server where Spark and HDFS do not exist. Is it possible to integrate DSS with a remote server where Spark and HDFS do exist? And is it possible to submit Spark jobs remotely?

Thanks

Kind regards

Bader

Answers

  • Clément_Stenac
    Clément_Stenac Dataiker, Dataiku DSS Core Designer, Registered Posts: 753 Dataiker

    Hi,

    I'm assuming you're talking about "regular Hadoop" here (i.e. Cloudera / Hortonworks / MapR / EMR / Dataproc).

    Even if your machine itself is not part of the cluster, it will still need client libraries, binaries, and configuration files installed locally in order to talk to the cluster. It is not possible to submit jobs to a "completely separated" cluster without anything installed locally.

  • Bader
    Bader Registered Posts: 46 ✭✭✭✭✭

    Could you please list the configuration files and libraries required in order to separate the DSS server from the cluster?
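    As an illustration of what "client libraries and configuration installed locally" typically means, here is a minimal sketch for a YARN-managed cluster. The paths are hypothetical and will vary by distribution; the key points are that the cluster's `core-site.xml`, `hdfs-site.xml`, and `yarn-site.xml` must be copied to the DSS machine, and that a Spark client distribution matching the cluster's version must be present there.

    ```shell
    # Hypothetical paths -- adjust to wherever the vendor's client packages
    # and the cluster's configuration files were installed on the DSS machine.
    export HADOOP_CONF_DIR=/etc/hadoop/conf   # copies of core-site.xml, hdfs-site.xml, yarn-site.xml from the cluster
    export SPARK_HOME=/opt/spark              # Spark client distribution matching the cluster's Spark version
    export PATH="$SPARK_HOME/bin:$PATH"

    # Sanity check: this command runs locally but talks to the remote HDFS NameNode,
    # so it verifies both the client install and the network connectivity.
    hdfs dfs -ls /

    # Submitting a job "remotely": spark-submit runs on the DSS machine, but with
    # --master yarn the actual work executes on the cluster's YARN NodeManagers.
    spark-submit --master yarn --deploy-mode cluster \
      "$SPARK_HOME/examples/src/main/python/pi.py" 10
    ```

    Note that this assumes network access from the DSS machine to the cluster's NameNode and ResourceManager ports, and (on a secured cluster) a valid Kerberos ticket.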
