Using Jar based spark data source in Dataiku Recipe
DataikuNewUser1
Registered Posts: 2 ✭
Hi,
I have a JAR file that I use as a Spark data source to read data from a URL.
This is how I do it on my local machine:
from pyspark import SparkConf
from pyspark.sql import SparkSession

# Register the JAR with Spark, then build the session with that config
conf = SparkConf().setAll([("spark.jars", "path-to-jar/<jar_name>.jar")])
spark = SparkSession.builder.config(conf=conf).getOrCreate()

df = spark.read.format("cloud.alpha.spark.providers.appobject.AppObjectTableProvider") \
    .option("appName", "<name>") \
    .option("appGroupName", "<name>") \
    .load()
But I am unable to replicate this in a Dataiku recipe. I have put the JAR in a managed folder, and when I set the path to the JAR in the Spark config of the recipe, the error says it is unable to find the specified class: cloud.alpha.spark.providers.appobject.AppObjectTableProvider.
Can someone provide some help with this?
Answers
-
Sarina Dataiker, Dataiku DSS Core Designer, Dataiku DSS Adv Designer, Registered Posts: 317 Dataiker
Hi @DataikuNewUser1,
You can include additional Spark jar files under Administration > Settings > Spark > Advanced Settings > Additional jars.
Can you see if this works for you?
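Once the JAR is registered there, it is placed on the classpath for all Spark jobs, so a PySpark recipe should be able to use the data source directly without setting spark.jars itself. A minimal sketch of what the recipe could look like, reusing the provider class and options from your example (the output dataset name is just a placeholder):

import dataiku
from dataiku import spark as dkuspark
from pyspark import SparkContext
from pyspark.sql import SQLContext

# Standard Dataiku PySpark recipe setup: reuse the Spark context DSS provides
sc = SparkContext.getOrCreate()
sqlContext = SQLContext(sc)

# The custom data source is resolved from the classpath once the JAR is
# registered under Additional jars; no spark.jars setting is needed here
df = sqlContext.read.format("cloud.alpha.spark.providers.appobject.AppObjectTableProvider") \
    .option("appName", "<name>") \
    .option("appGroupName", "<name>") \
    .load()

# Write the result to a DSS output dataset ("output_dataset" is a placeholder)
output = dataiku.Dataset("output_dataset")
dkuspark.write_with_schema(output, df)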
Thank you,
Sarina
-
Currently I am using a demo Community account. Is this feature available there? If yes, can you please help me with the steps to access this page?
-
Hi,
If you don't have access to the administration settings, you can't access this feature. I've added 2 more screenshots: