Added on June 28, 2022 8:09AM
Likes: 0
Replies: 2
I was trying to execute a PySpark script and encountered a py4j error.
Can someone help me with this? I have already checked all the version compatibilities.
I am attaching a screenshot of the error.
Operating system used: Ubuntu
Hi,
It looks like the Spark integration has not been done, or not done properly. Either you are using a Spark version > 3.1.2, or you haven't run the "<datadir>/bin/dssadmin install-spark-integration ..." command.
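As a first sanity check before rerunning the integration command, you can verify that the Spark environment is visible to the Python process at all. This is a minimal sketch; the `check_spark_env` helper is hypothetical and only inspects the `SPARK_HOME` environment variable, which a working Spark setup normally defines.

```python
import os

def check_spark_env(env=None):
    """Return a short diagnosis of the Spark environment (sketch).

    Assumption: a configured Spark install exports SPARK_HOME;
    if it is missing, the integration step likely never ran.
    """
    if env is None:
        env = os.environ
    spark_home = env.get("SPARK_HOME")
    if spark_home is None:
        return "SPARK_HOME is not set; Spark integration is likely missing"
    return f"SPARK_HOME = {spark_home}"

print(check_spark_env())
```

If this reports that `SPARK_HOME` is unset, run the install-spark-integration command above and retry the script.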
Yes, it did solve the error.
Thank you so much!