Running one-off jobs on DSS

Solved!
fmonari
Level 2

Hi,

I would like to address the following problem. I have prepared a tar or zip archive containing a script and some data. I would like to upload the archive to DSS and run the script through a shell recipe.

I am doing the following, with the data in the managed folder $DSS_FOLDER on the instance filesystem.

cd $DSS_FOLDER
tar -xvf archive.tar.gz
cd archive
python3 script.py

I get errors at the first step because "archive.tar.gz" is not found. Is there a way to use DSS in this non-canonical way?

Thanks,

Filippo

1 Solution
AlexT
Dataiker

Hi Filippo,

If I understand correctly what you are trying to do, you should be able to use the variables listed under Variables. For more information, see:

https://doc.dataiku.com/dss/latest/code_recipes/shell.html

So you could use cd $DKU_INPUT_0_FOLDER_PATH instead, but this would only work for local managed folders.
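For example, a minimal sketch of the shell recipe, assuming the archive is uploaded to the recipe's first input managed folder and is named archive.tar.gz as in your post:

# Move into the recipe's first input managed folder (local folders only)
cd "$DKU_INPUT_0_FOLDER_PATH"
# Unpack the uploaded archive and run the bundled script
tar -xzf archive.tar.gz
cd archive
python3 script.py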

[Screenshot: the Variables panel of the shell recipe]

Let me know if this helps.