
Kickstart python script execution from Dataiku


Can we use Dataiku to execute and kickstart Python scripts that are located on another server (i.e., the Python scripts are not on the same server where Dataiku is installed)?

I tried creating some Python recipes in one of my Dataiku projects, but it looks like those recipes are intended mainly for reading datasets, performing some actions, and creating an output dataset.

We have a data pipeline built from a few Python scripts that run one after the other, and we are currently looking at what Dataiku can offer GUI-savvy folks for executing that pipeline.
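For context, a chain of scripts run one after the other can be wrapped in a single Python entry point, which is the kind of thing a recipe could then invoke. A minimal sketch, where the step commands are hypothetical placeholders:

```python
import subprocess
import sys

def run_pipeline(commands):
    """Run each command in order, stopping at the first failure."""
    results = []
    for cmd in commands:
        # check=True raises CalledProcessError if a step fails,
        # so a broken step halts the rest of the pipeline.
        result = subprocess.run(cmd, check=True, capture_output=True, text=True)
        results.append(result)
    return results

if __name__ == "__main__":
    # Hypothetical steps; in practice these would be the real script paths,
    # e.g. [sys.executable, "step1_extract.py"].
    steps = [
        [sys.executable, "-c", "print('extract')"],
        [sys.executable, "-c", "print('transform')"],
    ]
    for r in run_pipeline(steps):
        print(r.stdout.strip())
```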

2 Replies
Dataiker
Hi Nitesh, 
 
You have the option to remotely execute a shell command as part of a DSS job by using a Shell recipe. For example, you can log in to a remote server and run Python code like this:
ssh -t -i ~/.ssh/my_key centos@ec2-address.compute.amazonaws.com "python test_print.py"
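The same remote call can also be issued from a Python recipe via subprocess. A sketch assuming key-based SSH is already configured; the user, host, key path, and remote command are the placeholders from the ssh example above:

```python
import subprocess

def build_ssh_command(user, host, key_path, remote_command):
    """Build the argv list for a non-interactive ssh invocation."""
    return [
        "ssh",
        "-i", key_path,          # private key, e.g. ~/.ssh/my_key
        "-o", "BatchMode=yes",   # fail instead of prompting for a password
        f"{user}@{host}",
        remote_command,
    ]

# Placeholders taken from the ssh example above.
cmd = build_ssh_command(
    "centos",
    "ec2-address.compute.amazonaws.com",
    "~/.ssh/my_key",
    "python test_print.py",
)
# Actually executing this requires network access and a valid key:
# subprocess.run(cmd, check=True)
print(cmd)
```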
 
Alternatively, to run your Python code locally as part of a DSS project, you can import it using Git and project libraries, then import and use it in a Python recipe or Jupyter notebook.
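For the local route, code placed in the project library can be imported like any Python module. A sketch where the module name `pipeline_utils` and the function `clean_rows` are hypothetical names for illustration:

```python
# Imagine this function lives in a project library module,
# e.g. pipeline_utils.py (names are hypothetical):
def clean_rows(rows):
    """Drop empty rows and strip whitespace from string values."""
    cleaned = []
    for row in rows:
        stripped = {k: v.strip() if isinstance(v, str) else v
                    for k, v in row.items()}
        # Keep the row only if at least one value is non-empty.
        if any(v not in ("", None) for v in stripped.values()):
            cleaned.append(stripped)
    return cleaned

# In a Python recipe or notebook you would then write:
# from pipeline_utils import clean_rows
rows = [{"name": " Ada "}, {"name": ""}]
print(clean_rows(rows))  # → [{'name': 'Ada'}]
```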

Hi @NiteshK. This is an interesting topic. We had a similar use case here, where there was a need to trigger a script execution on a different machine, then wait for the results and import them into Dataiku.

Our take in the end was not to support that kind of use case, because it looked more like automation of data acquisition (which can be solved with other, more focused tools) than a data science project that fits the Dataiku model.

I'm interested in knowing whether, with the solution @dima_naboka suggests, you will go ahead with this use case, and if you succeed, what your final implementation was.
