Can you help me read all the files present in an HDFS managed folder, based on certain criteria/patterns, and write the files into a different HDFS managed folder?
Attaching the problem statement. Please help.
Thanks in advance.
One option would be to use a Python recipe to read in the inputs of this HDFS managed folder, filter on the file names (using regex), and then copy the files over to the appropriate output managed folders using the read/write APIs.
import dataiku
import re

# Read the input and output managed folders. Make sure to use the appropriate managed folder IDs.
input_folder = dataiku.Folder("INPUT_MANAGED_FOLDER_ID")
paths = input_folder.list_paths_in_partition()
output_folder1 = dataiku.Folder("OUTPUT_MANAGED_FOLDER1_ID")
output_folder2 = dataiku.Folder("OUTPUT_MANAGED_FOLDER2_ID")

# Iterate through the files, check whether they match a regex, and write them
# to the corresponding output managed folder.
for path in paths:
    # If the file starts with "/File_" or "/file_", copy it to the first output
    # managed folder. Replace with the appropriate regex as needed.
    if re.match(r"/[Ff]ile_\d+", path):
        with input_folder.get_download_stream(path) as f:
            data = f.read()
        with output_folder1.get_writer(path) as w:
            w.write(data)
    # If the file starts with "/Input_file_" or "/input_file_", copy it to the
    # second output managed folder. Replace with the appropriate regex as needed.
    if re.match(r"/[Ii]nput_file_\d+", path):
        with input_folder.get_download_stream(path) as f:
            data = f.read()
        with output_folder2.get_writer(path) as w:
            w.write(data)
I hope that this helps!
Hi @ATsao ,
I have a similar problem, but I want to zip a file to another folder instead of just copying it. Do you know how I can do this?
I was trying this to find the zip file for a specific month, taken from a defined variable:
for paths[x] in paths:
if fnmatch.fnmatch(paths[x], '*' + (dataiku.get_custom_variables()["v_mth"]) + '.zip'):
with input_folder.get_download_stream(paths[x]) as f:
with zipfile.ZipFile(f.read(), "r") as zip_ref:
but I get the following error:
Thank you very much!
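Since the error text isn't shown, a likely culprit in the snippet above is passing raw bytes (`f.read()`) to `zipfile.ZipFile`, which expects a filename or a seekable file-like object, not a bytes value; buffering the bytes in an `io.BytesIO` fixes this. Below is a rough sketch that opens the matched archive that way and writes its members into a second managed folder. The `open_zip` helper and `extract_to_folder` function are hypothetical names, and the folder handles are assumed to be `dataiku.Folder` objects as in the earlier snippets.

```python
import io
import zipfile


def open_zip(stream):
    # zipfile.ZipFile needs a filename or a seekable file object, so passing
    # the raw bytes from stream.read() raises an error. Wrapping the bytes in
    # a BytesIO buffer gives ZipFile the seekable object it expects.
    return zipfile.ZipFile(io.BytesIO(stream.read()), "r")


def extract_to_folder(input_folder, output_folder, matched_path):
    # Dataiku-specific part (assumes dataiku.Folder handles as in the snippets
    # above): open the matched archive from the input managed folder and write
    # each member into the output managed folder.
    with input_folder.get_download_stream(matched_path) as f:
        with open_zip(f) as zip_ref:
            for member in zip_ref.namelist():
                with output_folder.get_writer("/" + member) as w:
                    w.write(zip_ref.read(member))
```

If instead you want to create a zip in the output folder, the same `BytesIO` idea works in reverse: build the archive in a buffer with `zipfile.ZipFile(buf, "w")` and write `buf.getvalue()` through `output_folder.get_writer()`.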
I have created a Python API endpoint, and within it I am trying to access a folder created on the local file system.
I am getting the following error:
folder = dataiku.Folder('xyz')
path = folder.list_paths_in_partition()
It is unable to get the list of files present within the folder.
Please note that endpoints are meant to be deployed to API nodes, so you should generally not use internal dataiku APIs from your endpoint, as these endpoints are independent from your design node. If you wish to call a local managed folder, please follow the steps defined here to "package" your managed folder so that it can be referenced from your Python API endpoint:
As an alternative, you can use the internal "files from folder" dataset to filter your files with regex.
Such an approach would not use a managed folder as output, though. Generally speaking, it is not as flexible as Python code, but it could be useful if you prefer visual recipes over code recipes.