Invoking an external utility that needs HDFS folders for input & output from within a Recipe
Hello, I am very new to Dataiku, and my use case might not be typical. I want to build an NLP pipeline in which each stage reads one file (say a PDF) and produces another (say an XML file). Each stage invokes a third-party program that takes two parameters: an HDFS input folder and an HDFS output folder.

I see that a Recipe also requires an input and an output, for which I can define Managed Folders. In my case, however, the actual reading from and writing to the HDFS folders is performed by the third-party program itself, without using any DSS APIs. Given that, will I end up hard-coding the HDFS folder paths within my Recipe, or is there a way to resolve them from the Managed Folders?
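To make my question concrete, here is a rough sketch of what I imagine the Python recipe would look like. All of the names (the tool path, the folder names) are hypothetical, and I am not certain that `dataiku.Folder(...).get_info()["path"]` is the right way to resolve an HDFS-backed managed folder's path, so please correct me if there is a better approach:

```python
import subprocess

def build_command(tool, hdfs_in, hdfs_out):
    # Assemble the argument list for the third-party program.
    # (Hypothetical CLI: tool <input_folder> <output_folder> --
    # the real program's arguments may differ.)
    return [tool, hdfs_in, hdfs_out]

# Inside a DSS Python recipe, I imagine the HDFS paths could be
# resolved from the managed folders instead of hard-coded, e.g.:
#
#   import dataiku
#   in_path  = dataiku.Folder("pdf_input").get_info()["path"]
#   out_path = dataiku.Folder("xml_output").get_info()["path"]
#
# Hard-coded placeholders here, just to illustrate the call:
cmd = build_command("/opt/nlp/pdf2xml", "/data/nlp/in", "/data/nlp/out")
print(cmd)
# subprocess.run(cmd, check=True)  # would actually invoke the tool
```

Is that roughly the intended pattern, with DSS tracking the managed folders as the recipe's inputs/outputs while the external tool does the actual HDFS I/O?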