Running punctual jobs on DSS

fmonari Registered Posts: 18 ✭✭✭✭
edited July 16 in Using Dataiku


I would like to address the following problem. I have prepared a tar (or zip) archive containing a script and some data. I would like to upload the archive to DSS and run the script from it through a shell recipe.

I am doing the following, with the data in the managed folder $DSS_FOLDER on the instance filesystem.


tar -xvf archive.tar.gz

cd archive


The first line fails because "archive.tar.gz" is not found. Is there a way to use DSS in this non-canonical way?



Best Answer

  • Alexandru
    Alexandru Dataiker, Dataiku DSS Core Designer, Dataiku DSS ML Practitioner, Dataiku DSS Adv Designer, Registered Posts: 1,209 Dataiker
    Answer ✓

    Hi Filipo,

If I understand correctly what you are trying to do, you should be able to use the folder path variables that shell recipes expose, listed under Variables in the recipe editor.

So you could use cd $DKU_INPUT_0_FOLDER_PATH instead, but note this will only work for local managed folders.
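As a sketch, the recipe body would change into the managed folder before extracting, so relative paths like archive.tar.gz resolve. DKU_INPUT_0_FOLDER_PATH is set by DSS when the recipe's first input is a local managed folder; the archive name, file contents, and the local simulation setup below are illustrative assumptions, not part of the original question.

```shell
# Simulate the managed folder locally; in a real DSS shell recipe,
# DKU_INPUT_0_FOLDER_PATH is provided by DSS for local managed folders.
DKU_INPUT_0_FOLDER_PATH=$(mktemp -d)

# Build a sample archive.tar.gz (stand-in for the uploaded archive).
workdir=$(mktemp -d)
mkdir "$workdir/archive"
echo "hello" > "$workdir/archive/data.txt"
tar -czf "$DKU_INPUT_0_FOLDER_PATH/archive.tar.gz" -C "$workdir" archive

# Recipe body proper: cd into the managed folder, then extract.
cd "$DKU_INPUT_0_FOLDER_PATH"
tar -xzf archive.tar.gz
cd archive
# ./script.sh   # hypothetical name of the script shipped inside the archive
```

The key point is the cd into $DKU_INPUT_0_FOLDER_PATH: the original error came from running tar in the recipe's default working directory, where the archive does not exist.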


    Let me know if this helps.
