Deleting files from an Amazon S3 bucket in R and Python
tamvap
Dataiku DSS Core Designer, Registered Posts: 10 ✭✭✭✭
Hello,
In my current project I am trying to delete files that are located in an Amazon S3 bucket from a Dataiku recipe in a flow. These S3 files have been added to managed folders that point to Amazon S3 folders so that they can be accessed in the flow.
I have gone through the documentation but I wasn't able to find anything similar.
I am using R and Python in this project and I will need to do this in both languages.
Let me know how I can delete Amazon S3 files (inside managed folders in the flow) with R and Python from a recipe.
Thanks
Answers
-
You can just use the delete_path method on Folder: https://doc.dataiku.com/dss/latest/python-api/managed_folders.html#dataiku.Folder.delete_path
import dataiku

# Open the managed folder and delete a file by its path inside the folder
fold = dataiku.Folder('test')
fold.delete_path('/dumm4.csv')

# List what remains in the folder
fold.list_paths_in_partition('')  # Returns ['/dummy.txt', '/dummy2.txt', '/dummy3.txt']
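If you need to remove several files at once from the recipe, a minimal sketch along these lines should work. It assumes a managed folder named 'test' pointing at your S3 location (replace with your folder's name) and only uses the same Folder.list_paths_in_partition and Folder.delete_path calls shown above:

import dataiku
import fnmatch

# Assumed folder name; replace 'test' with the name of your S3-backed managed folder
folder = dataiku.Folder('test')

# Delete every path matching a pattern (here: all CSVs); adjust the pattern to your files
for path in folder.list_paths_in_partition(''):
    if fnmatch.fnmatch(path, '*.csv'):
        folder.delete_path(path)

# Check what is left in the folder
print(folder.list_paths_in_partition(''))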