Spark Configuration for Managed K8s cluster
Hi all,
I am working through setting up Spark on managed K8s, and it is not clear to me what my configuration under Admin > Settings > Spark should be.
My cluster has been attached to DSS, and I have created the standalone Spark image and pushed it to my repository, but I have no idea what to do next.
When I attempt to execute a job, I get a "Job failed: Spark Config not found: default" error.
Can anyone help me with a boilerplate config for a managed K8s setup? Or point me in the right direction to find these config parameters? I have not been able to find them in the documentation.
Operating system used: Linux
Best Answer
Grixis
Hello @jonhli
In the instance configuration you found under Admin > Settings > Spark, there is a section for Spark below the computing and scaling options. However, your error message does not seem to be directly linked to K8s: it simply indicates that the job could not find the Spark configuration selected in your recipe's settings, which appears to be named "default" and does not exist on the instance.
Check, as in the attached photo, that there is a configuration named "default" with its configuration keys set, and that its type is Managed Spark-on-K8S. Alternatively, you may have pre-existing configurations other than "default" that you can select in the recipe instead.
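To give a rough idea, a minimal "default" configuration for Managed Spark-on-K8S often only needs executor sizing, since DSS manages the K8s master itself. The keys below are standard Spark properties; the values and the image name are placeholder assumptions you would adapt to your setup, not values DSS requires verbatim:

    spark.executor.instances           2      (number of executor pods to launch)
    spark.executor.memory              2g     (memory per executor pod)
    spark.executor.cores               1      (CPU cores per executor pod)
    spark.kubernetes.container.image   my-registry.example.com/dss-spark:latest   (only if the image is not already set at the cluster level; placeholder name)

Whatever name you give that configuration must exactly match the Spark configuration selected in the recipe's advanced settings, which is why the "default" in your error message matters.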