Hi,
I have a JAR file that I use as a Spark data source to read data from a URL.
This is how I do it locally:
from pyspark import SparkConf

conf = SparkConf().setAll([("spark.jars", "path-to-jar/<jar_name>.jar")])
Hi @DataikuNewUser1,
You can include additional Spark jar files under Administration > Settings > Spark > Advanced Settings > Additional jars:
Can you see if this works for you?
Thank you,
Sarina
Currently I am using a demo Community account. Is this feature available there? If yes, can you please help me with the steps so I can access this page?
Hi,
If you don't have access to the Administration section, you can't use this feature.
I've added 2 more screenshots: