PySpark exit recipe with Warning status

jacksonisaac
Level 1

I have a PySpark recipe that reads a dataset and extracts a column value from the first row.

When the input dataset partition is empty, this throws an IndexError: 'list index out of range'.

To handle this I wrapped the call in a try/except block, and I want to end the recipe in the except branch. I tried sys.exit(1), which marks the job as failed, and sys.exit(0), which marks it as successful.

I wanted to know if there is any way to signal the PySpark recipe to flag this partition's job as a warning (so that partition shows yellow instead of green or red).

Here's a code snippet I have now:

import logging
import sys

logger = logging.getLogger(__name__)

try:
    var_x = input_df.select('var_x').drop_duplicates().take(1)[0]['var_x']
except IndexError:  # take(1) returns an empty list on an empty partition
    logger.warning(f"Empty input partition for {var_a}|{var_b}")
    sys.exit(0)
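
A variant that avoids the exception entirely would be to test for an empty result first. A minimal sketch, assuming the same input_df, var_a, and var_b as above:

import logging
import sys

logger = logging.getLogger(__name__)

# head(1) returns an empty list (instead of raising) when the partition has no rows
first_rows = input_df.select('var_x').drop_duplicates().head(1)
if not first_rows:
    logger.warning(f"Empty input partition for {var_a}|{var_b}")
    sys.exit(0)  # still reported as successful; this does not produce a warning status
var_x = first_rows[0]['var_x']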

2 Replies
Clément_Stenac

Hi,

It is not currently possible for user code to flag a recipe as a warning. We'll make sure to take your request into account for future developments.

emher
Level 3

Are there any updates? I would also like this feature 🙂
