Added on October 17, 2022 4:58PM
Likes: 2
Replies: 3
I recently bumped into an issue where my Python code was not executed the same way depending on whether it was run from a notebook or from its corresponding recipe*.
I eventually used the following function, but I was wondering if there is a native function in Dataiku to detect the running environment?
def in_ipynb():
    try:
        get_ipython()
        return True
    except NameError:
        return False
*For those interested in this issue: it was related to writing to partitions, which "is not allowed in Python recipes, where write is controlled by the Flow". In practice, you cannot use dataiku.Dataset.set_write_partition() in Python recipes.
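For anyone hitting the same thing, here is a minimal sketch of how the in_ipynb() check above can be used to guard that call. The dataset name and partition value are made up for illustration; adapt them to your project.

import dataiku
import pandas as pd

# Hypothetical dataset name and partition value, for illustration only
dataset = dataiku.Dataset("my_partitioned_dataset")

if in_ipynb():
    # In a notebook, the target partition must be selected explicitly
    dataset.set_write_partition("2022-10-17")
# In a recipe this call is skipped: the Flow decides which partition is written

dataset.write_with_schema(pd.DataFrame({"value": [1, 2, 3]}))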
Operating system used: Windows 10
Hi @tanguy,
Indeed, there are some things that only make sense in the context of the Flow, and some that only make sense outside of it.
The dataiku module has the dictionary dku_flow_variables available when a Python recipe is run from the Flow itself. If the same code is run outside of the Flow, dku_flow_variables won't be available. So you could do a check like this:
if hasattr(dataiku, 'dku_flow_variables'):
    print('in recipe')
else:
    print('in notebook')
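If you need this check in several places, one option is to wrap it in a small helper. The function name below is just a convenience wrapper for this thread, not part of the dataiku API:

import dataiku

def running_in_recipe():
    # True when executed by the Flow (recipe), False when run interactively
    # in a notebook; relies on the dku_flow_variables check described above
    return hasattr(dataiku, 'dku_flow_variables')

print('in recipe' if running_in_recipe() else 'in notebook')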
Let me know if you have any questions about this.
Thanks,
Sarina
This worked well for me. Thanks.
You can use a built-in function to check if you are running the code in the notebook or in the recipe:
dataiku.in_ipython()
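Assuming that function is available in your Dataiku version and behaves as described in this thread (True in a notebook, False in a recipe), usage would look like:

import dataiku

if dataiku.in_ipython():
    print('in notebook')
else:
    print('in recipe')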