Running one-off jobs on DSS

fmonari
edited July 16 in Using Dataiku

Hi,

I would like to address the following problem: I have prepared a tar (or zip) archive with a script and some data. I would like to upload the archive to DSS and run the script from it through a shell recipe.

I am doing the following, with the data in the managed folder $DSS_FOLDER on the instance filesystem:

cd $DSS_FOLDER
tar -xvf archive.tar.gz
cd archive
python3 script.py

I get an error at the first line because "archive.tar.gz" is not found. Is there a way to use DSS in this non-canonical way?

Thanks,

Filippo

Best Answer

  • Alexandru (Dataiker)
    Answer ✓

    Hi Filippo,

    If I understand correctly what you are trying to do, you should be able to use the variables that DSS exposes to shell recipes, listed under Variables in the recipe editor. For more information, see:

    https://doc.dataiku.com/dss/latest/code_recipes/shell.html

    So you could use cd $DKU_INPUT_0_FOLDER_PATH instead, but note that this will work only for local managed folders.
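
    For example, here is a minimal sketch of the recipe body, assuming the managed folder containing archive.tar.gz is declared as the first input of the shell recipe and is a local folder:

    # assumes archive.tar.gz sits at the root of the recipe's first input folder
    cd "$DKU_INPUT_0_FOLDER_PATH"
    tar -xvf archive.tar.gz
    cd archive
    python3 script.py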

    [screenshot of the shell recipe's Variables panel]
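
    If in doubt about which variables are available, you can list everything injected into the recipe's environment with a plain shell command (not DSS-specific):

    env | grep DKU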

    Let me know if this helps.
