Writing artifacts to a folder from a PySpark job - Access issue
Skanda Gurunathan
Registered Posts: 8 ✭✭✭
Hi, I am trying to write a pyspark.ml model (Pipeline model) from a PySpark recipe to a folder located in the server's filesystem storage, and I am getting an access denied error. How can I write to a folder from a PySpark recipe?
# Imports assumed by this snippet
import dataiku
from pyspark.ml import Pipeline

pipeline = Pipeline(stages=[<listofencoders>])
# Fit the pipeline on the data
pipeline_model = pipeline.fit(data)
# Save the fitted model into the managed folder's local path
pipeline_model.save(dataiku.Folder("model").get_path() + "/model")
Operating system used: Linux
Answers
Turribeach Dataiku DSS Core Designer, Neuron, Dataiku DSS Adv Designer, Registered, Neuron 2023 Posts: 2,166 Neuron
Hi, you can't write to Dataiku managed folders directly through the filesystem like that. You need to use the upload_stream() method instead:
https://knowledge.dataiku.com/latest/code/managed-folders/concept-managed-folders.html
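A complication here is that a Spark ML PipelineModel saves as a *directory* of files, while upload_stream() takes a single file-like object. One workaround (a sketch under assumptions, not confirmed in this thread) is to save the model to a local temporary directory, zip it, and then stream the archive into the managed folder. The stdlib part of that pattern looks like this; the `dataiku.Folder` call at the end is shown commented out since it only runs inside a recipe:

```python
import os
import shutil
import tempfile


def archive_model_dir(model_dir: str) -> str:
    """Zip a saved-model directory and return the path to the .zip archive."""
    # make_archive writes <model_dir>.zip containing everything under model_dir
    return shutil.make_archive(model_dir, "zip", model_dir)


# Simulate a saved pipeline model directory
# (stand-in for pipeline_model.save(local_model_dir))
tmp = tempfile.mkdtemp()
model_dir = os.path.join(tmp, "model")
os.makedirs(model_dir)
with open(os.path.join(model_dir, "metadata"), "w") as f:
    f.write("{}")

zip_path = archive_model_dir(model_dir)

# Inside a Dataiku recipe you would then stream the archive into the folder:
# folder = dataiku.Folder("model")
# with open(zip_path, "rb") as stream:
#     folder.upload_stream("model.zip", stream)
```

To load the model back later, you would download the archive from the folder, unzip it locally, and call PipelineModel.load() on the extracted directory.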