PySpark and Python error
Rushil09
Partner, Dataiku DSS Core Designer, Dataiku DSS ML Practitioner, Dataiku DSS Adv Designer, Registered, Frontrunner 2022 Participant Posts: 17 Partner
I was trying to execute a PySpark script and encountered a py4j error.
Can someone help me with this? I have already checked all the version compatibilities.
I am attaching a screenshot of the error.
Operating system used: Ubuntu
Best Answers
-
Hi,
It looks like the Spark integration has not been done, or was not done properly. Either you are using Spark in a version newer than 3.1.2, or you have not run the "<datadir>/bin/dssadmin install-spark-integration ..." command.
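Before re-running the integration, it can help to confirm whether the installed Spark build is actually past the supported ceiling. A minimal sketch, assuming the 3.1.2 maximum mentioned above; in practice the version string would come from `pyspark.__version__` or the output of `spark-submit --version`:

```python
# Hedged sketch: compare a Spark version string against the maximum
# version supported (3.1.2, per the answer above).

def version_tuple(version: str) -> tuple:
    """Turn a version string like '3.1.2' into (3, 1, 2) for ordered comparison."""
    return tuple(int(part) for part in version.split(".")[:3])

MAX_SUPPORTED = version_tuple("3.1.2")

def is_supported(spark_version: str) -> bool:
    """True if this Spark build is at or below the supported ceiling."""
    return version_tuple(spark_version) <= MAX_SUPPORTED

# spark_version here is a placeholder; read it from pyspark.__version__
# or `spark-submit --version` on your machine.
print(is_supported("3.1.2"))  # True
print(is_supported("3.2.1"))  # False
```

If the check comes back unsupported, downgrade Spark or move to a DSS release that supports your Spark version, then re-run the integration command.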
-
Rushil09
Yes, that solved the error.
Thank you so much!