Best Practices For Updating and Renaming Spark and Container Configurations
Hello Dataiku Community,
Hope all is well!
Our team is looking to implement new Spark and container configuration settings on our instances, and we are curious what the best practices are for updating the existing configurations. For context, we have existing Spark configurations already in use by end users; however, we would like to replace these existing settings with net-new settings and naming conventions.
As a test, we created a net-new Spark configuration on one of our dedicated "Dev" instances (a design node) and checked what happens when the configuration is renamed. We saw that after the rename, everything that explicitly referenced the configuration is reset to "Nothing Selected". Please see the attached files for before and after pictures. Our config was originally named "Large_9GBMem_11Exec_new", and we renamed it to "Large_9GBMem_11Exec". In the after picture, the selection is now "Nothing Selected". Is there a way to have the selection default to the new name ("Large_9GBMem_11Exec" in this example), or is this behavior expected?
I found the documentation and community discussions below about systematically checking/updating Spark settings via a Python script. I wanted to confirm whether leveraging the API is the best practice for systematically updating the Spark configs, or whether there is another way to update the configs/config names automatically through the UI.
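For reference, here is a rough sketch of the API-driven approach we are considering. The helper name `remap_spark_config` and the exact JSON path (`params` / `sparkConfig` / `inheritConf`) are assumptions on our part, not confirmed Dataiku internals, so the real structure should be verified by inspecting a recipe's raw settings on the instance first:

```python
def remap_spark_config(raw_settings, old_name, new_name):
    """Rewrite a reference to a renamed Spark config in a recipe's raw
    settings dict. The params/sparkConfig/inheritConf path is an
    assumption -- inspect the raw recipe JSON on your instance to
    confirm where the config reference actually lives."""
    spark = raw_settings.get("params", {}).get("sparkConfig", {})
    if spark.get("inheritConf") == old_name:
        spark["inheritConf"] = new_name
        return True  # a change was made; the caller should save the settings
    return False


# Sketch of driving the helper through the Dataiku public API client
# (requires a live instance and an API key, so it is only outlined here):
#
# import dataikuapi
# client = dataikuapi.DSSClient("https://dss.example.com", "API_KEY")
# for project_key in client.list_project_keys():
#     project = client.get_project(project_key)
#     for recipe_item in project.list_recipes():
#         settings = project.get_recipe(recipe_item["name"]).get_settings()
#         if remap_spark_config(settings.get_recipe_raw_definition(),
#                               "Large_9GBMem_11Exec_new",
#                               "Large_9GBMem_11Exec"):
#             settings.save()
```

If renaming in the UI cannot preserve the selections, a sweep like this after the rename seems like the cleanest way to repoint every recipe, but we would appreciate confirmation that this is the supported route.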