
Deleting File from Amazon S3 bucket in R and Python

tamvap
Level 2

Hello,

In my current project I am trying to delete files located in an Amazon S3 bucket from a Dataiku recipe in a flow. These S3 files have been brought into managed folders that point to Amazon S3 folders so they can be accessed in the flow.
I have gone through the documentation but wasn't able to find anything on this.
I am using both R and Python in this project and will need to do this in both languages.


Let me know how I can delete Amazon S3 files (inside managed folders in the flow) with R and Python from a recipe.

Thanks

 

1 Reply
tomas
Neuron

You can use the dataiku.Folder.delete_path method on the Folder object: https://doc.dataiku.com/dss/latest/python-api/managed_folders.html#dataiku.Folder.delete_path

 


 

 

 

import dataiku

# Point to the managed folder (here named 'test') backed by the S3 bucket
fold = dataiku.Folder('test')

# Delete a single file by its path inside the folder
fold.delete_path('/dumm4.csv')

# Verify the file is gone by listing the remaining contents
fold.list_paths_in_partition('')
# Returns ['/dummy.txt', '/dummy2.txt', '/dummy3.txt']
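If you need to clear several S3 files at once (for example, everything with a given extension), you can combine list_paths_in_partition with delete_path. A minimal sketch, assuming a managed folder named 'test'; the paths_to_delete helper and the delete_by_suffix wrapper are hypothetical names added here just to keep the selection logic separate from the DSS calls:

```python
def paths_to_delete(paths, suffix=".csv"):
    """Hypothetical helper: pick the paths ending in `suffix`."""
    return [p for p in paths if p.endswith(suffix)]

def delete_by_suffix(folder_name="test", suffix=".csv"):
    """Hypothetical wrapper: delete every matching file in the managed folder."""
    import dataiku  # only available when running inside a DSS recipe/notebook
    fold = dataiku.Folder(folder_name)
    for p in paths_to_delete(fold.list_paths_in_partition(""), suffix):
        fold.delete_path(p)
```

Since the managed folder is backed by S3, delete_path removes the underlying S3 object as well, so there is no need to talk to S3 directly.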

 

 
