Can we use Dataiku to execute and kick off Python scripts that live on another server (i.e., the scripts are not on the same server where Dataiku is installed)?
I tried creating some Python recipes in one of my Dataiku projects, but those recipes seem intended mainly for reading datasets, performing some actions, and writing an output dataset.
We have a data pipeline built from a few Python scripts that run one after the other, and we are looking at what goodies Dataiku can offer GUI-savvy folks for executing that pipeline.
ssh -t -i ~/.ssh/my_key email@example.com "python test_print.py"
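For reference, one way to wrap that ssh call inside a Dataiku Python recipe is to shell out with `subprocess`. This is a minimal sketch, not an official Dataiku feature: it assumes key-based SSH access is already set up, and the host, key path, and remote command below are placeholders for illustration.

```python
import subprocess

def build_remote_command(host, key_path, remote_cmd):
    """Assemble the ssh invocation that runs remote_cmd on host."""
    return ["ssh", "-i", key_path, host, remote_cmd]

def run_remote_script(host, key_path, remote_cmd):
    """Run a script on the remote host and return its stdout.

    Raises subprocess.CalledProcessError if the remote command
    exits with a non-zero status, so a scenario step can fail fast.
    """
    result = subprocess.run(
        build_remote_command(host, key_path, remote_cmd),
        capture_output=True,
        text=True,
        check=True,
    )
    return result.stdout

# Hypothetical usage from inside a Python recipe:
# output = run_remote_script("email@example.com", "~/.ssh/my_key",
#                            "python test_print.py")
```

Running the pipeline scripts one after the other would then just be a loop of `run_remote_script` calls; because `check=True` is set, the loop stops at the first failing script.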
Hi @NiteshK. This is an interesting topic. We had a similar use case here: we needed to trigger a script execution on a different machine, wait for the results, and then import them into Dataiku.
In the end we chose not to support that kind of use case, because it looked more like automation of data acquisition (which other, more focused tools can solve) than a data science project that fits the Dataiku model.
I'm interested to know whether you will go ahead with this use case using the solution @dima_naboka suggests, and, if it succeeds, what your final implementation looks like.