Write artifacts to a folder from a PySpark job - access issue
Hi, I am trying to write a PySpark ML PipelineModel from a PySpark recipe to a folder on the server's filesystem storage, but I am getting an "access denied" error. How can I write to a folder from a PySpark recipe?
from pyspark.ml import Pipeline

pipeline = Pipeline(stages=[<listofencoders>])

# Fit the pipeline on the data
pipeline_model = pipeline.fit(data)