"schema aware" custom recipes? - make propagate schema changes work

UserBird
Is there a way to make the "propagate schema changes" action work with a custom recipe?

This would be very handy, especially when there is a long chain of custom recipes and something changes at the very beginning.

Best Answer

  • UserBird

    Not possible: https://doc.dataiku.com/dss/latest/schemas/recipes.html#python-r-pyspark-sparkr

    It would be a great productivity boost throughout flow development to be able to register a method that constructs the output schemas from the input schemas, and have schema propagation call it, along these lines:


    # Proposed API sketch -- this registration hook does not exist today
    import dataiku

    ds_input1 = dataiku.Dataset("input_name_1")
    ds_input2 = dataiku.Dataset("input_name_2")

    def get_output_schemas():
        result = {}
        schema1 = ds_input1.get_schema()
        schema2 = ds_input2.get_schema()
        schema_1, schema_2 = do_something(schema1, schema2)  # recipe-specific logic
        result['output_name_1'] = schema_1
        result['output_name_2'] = schema_2
        return result

    dataiku.get_schemas(get_output_schemas)
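
    In the meantime, the closest approximation with the current Python API (as far as I know) is to derive and write the output schema inside the recipe code itself, so it is at least refreshed every time the recipe runs, even though "propagate schema changes" still won't trigger it. A minimal sketch; the dataset names, the extra column, and the derivation logic are placeholders:

    import dataiku

    # Placeholder dataset names for illustration
    ds_input1 = dataiku.Dataset("input_name_1")
    ds_output1 = dataiku.Dataset("output_name_1")

    # read_schema() returns a list of column descriptors,
    # e.g. [{"name": "customer_id", "type": "string"}, ...]
    input_schema = ds_input1.read_schema()

    # Derive the output schema (placeholder logic: keep all input columns
    # and add one computed column)
    output_schema = list(input_schema) + [{"name": "computed_col", "type": "double"}]
    ds_output1.write_schema(output_schema)

    # Write the data without touching the schema again
    df = ds_input1.get_dataframe()
    df["computed_col"] = 0.0  # placeholder transformation
    ds_output1.write_dataframe(df)

    If the output is produced as a pandas dataframe anyway, ds_output1.write_with_schema(df) infers and writes the schema from the dataframe in one call, which is usually the simpler route.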
