Write artifacts to a folder from a PySpark job - Access issue

skandagn Registered Posts: 8

Hi, I am trying to write a pyspark.ml pipeline model (PipelineModel) from a PySpark recipe to a managed folder located on the server's filesystem, and I am getting an access-denied error. How can I write to a folder from a PySpark recipe?

import dataiku
from pyspark.ml import Pipeline

pipeline = Pipeline(stages=[<listofencoders>])

# Fit the pipeline on the data
pipeline_model = pipeline.fit(data)

# Save the fitted model under the managed folder's local path
pipeline_model.save(dataiku.Folder("model").get_path() + "/model")
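A possible workaround sketch (not verified against this setup): save the fitted model to a driver-local temporary directory first, then push the resulting files into the managed folder through the Dataiku folder API (`dataiku.Folder.upload_file`), which avoids writing directly into the folder's filesystem path. The helper `list_model_files` below is hypothetical, not part of the Dataiku API.

```python
import os
import tempfile


def list_model_files(local_dir):
    """Yield (path_in_folder, local_path) pairs for every file under local_dir."""
    for root, _, files in os.walk(local_dir):
        for name in files:
            local_path = os.path.join(root, name)
            yield os.path.relpath(local_path, local_dir), local_path


# Usage inside the recipe (requires the dataiku and pyspark packages):
#
# tmp_dir = tempfile.mkdtemp()
# pipeline_model.save(os.path.join(tmp_dir, "model"))
#
# folder = dataiku.Folder("model")  # assumes the folder id is "model"
# for target_path, local_path in list_model_files(tmp_dir):
#     folder.upload_file(target_path, local_path)
```

Whether this helps depends on why the direct write is denied; if the Spark executors (rather than the driver) lack access to the path, collecting the model on the driver side like this is one way around it.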


Operating system used: Linux
