"schema aware" custom recipes? - make propagate schema changes work

UserBird Dataiker, Alpha Tester Posts: 535 Dataiker
Is there a way to make the "propagate schema changes" work with a custom recipe?

This would be very handy, especially when there are long chains of custom recipes and something at the very beginning changes.

Best Answer

  • UserBird Dataiker, Alpha Tester Posts: 535 Dataiker

    Not possible: https://doc.dataiku.com/dss/latest/schemas/recipes.html#python-r-pyspark-sparkr

    It would be a great productivity boost during flow development to be able to register a schema method that computes the output schemas from the input schemas whenever schema propagation runs, for example:


    def get_output_schemas():
        result = {}
        schema1 = ds_input1.get_schema()
        schema2 = ds_input2.get_schema()
        schema_1, schema_2 = do_something(schema1, schema2)
        result['output_name_1'] = schema_1
        result['output_name_2'] = schema_2
        return result

    dataiku.get_schemas(get_output_schemas)
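    To make the idea concrete, here is a minimal, self-contained sketch of what such a schema-computing function could look like. It assumes DSS's convention of representing a schema as a list of `{"name": ..., "type": ...}` dicts; `propagate_join_schema` and the sample schemas are hypothetical illustrations, not part of the dataiku API, and `dataiku.get_schemas` is the proposed hook, which does not exist today.

    ```python
    # Hypothetical "schema method" for a custom recipe that joins two inputs.
    # A DSS schema is modeled here as a list of {"name": ..., "type": ...} dicts.

    def propagate_join_schema(schema_left, schema_right, join_key):
        """Output schema of a join: all left columns, plus the right
        columns except the join key (which would be duplicated)."""
        right_cols = [c for c in schema_right if c["name"] != join_key]
        return schema_left + right_cols

    # Example input schemas, as schema propagation would supply them:
    schema1 = [{"name": "id", "type": "int"},
               {"name": "amount", "type": "double"}]
    schema2 = [{"name": "id", "type": "int"},
               {"name": "country", "type": "string"}]

    output_schema = propagate_join_schema(schema1, schema2, "id")
    # output_schema columns: id, amount, country
    ```

    With a hook like the proposed `dataiku.get_schemas`, a function of this shape could be evaluated at propagation time, so downstream datasets pick up upstream schema changes without running the recipe.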
