Create a SparkSQL Recipe With API

selchheb Registered Posts: 1

Hello,

I'm trying to create a Spark SQL recipe with the code recipe creator. I saw in the library that only

"type in ["python", "r", "sql_script", "pyspark", "sparkr", "spark_scala", "shell"]:"

is accepted. I tried sql_builder = project.new_recipe("spark_sql_script", recipe_name), but I get an error because sql_builder is None.

Does anyone know if this is possible, and if so, what is the keyword? Thanks

Best Answer

  • Sarina
    Sarina Dataiker, Dataiku DSS Core Designer, Dataiku DSS Adv Designer, Registered Posts: 317
    edited July 17 Answer ✓

    Hi @selchheb,

    You can use the type spark_sql_query for Spark SQL recipes!

    For example:

    from dataikuapi.dss.recipe import DSSRecipeCreator

    # "client" is an authenticated DSS API client, e.g. client = dataiku.api_client() when running inside DSS
    recipe_creator = DSSRecipeCreator("spark_sql_query", "test_recipe_creation", client.get_default_project())

    recipe_creator.with_input("Orders")
    recipe_creator.with_output("output_dataset_that_exists")
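
    To actually create the recipe and give it a query, something along these lines should work (a rough sketch: the SELECT statement is a placeholder, and this assumes the Spark SQL query is stored in the recipe payload, as it is for other code recipes):

    # create the recipe, then set its Spark SQL query via the recipe payload
    recipe = recipe_creator.create()
    settings = recipe.get_settings()
    settings.set_payload("SELECT * FROM Orders")  # placeholder query
    settings.save()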


    As a hint if you run into this again: you can always use project.get_recipe() on an existing recipe (e.g. an existing Spark SQL recipe) and then get the recipe settings, which show the type of the recipe.
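
    For instance, a minimal sketch of that approach (the recipe name here is just a placeholder for any existing Spark SQL recipe in your project):

    project = client.get_default_project()
    existing_recipe = project.get_recipe("compute_existing_sparksql_output")  # placeholder name
    settings = existing_recipe.get_settings()
    print(settings.type)  # prints the recipe type string, e.g. "spark_sql_query"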

    Thanks,
    Sarina
