PySpark and Python error

Solved!
Rushil09
Level 3

I was trying to execute a PySpark script and encountered a py4j error.

Can someone help me with this? I have checked all the version compatibilities as well.

I am attaching a screenshot of the error.

[Screenshot: image (1).png]


Operating system used: Ubuntu



2 Solutions
fchataigner2
Dataiker

Hi

It looks like the Spark integration hasn't been done, or wasn't done properly. Either you're using a Spark version newer than 3.1.2, or you haven't run the "<datadir>/bin/dssadmin install-spark-integration ..." command.
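To illustrate the version constraint mentioned above, here is a minimal sketch (my own helper, not part of DSS or PySpark) that checks whether a locally installed PySpark version is no newer than 3.1.2:

```python
# Minimal sketch (not part of DSS): check whether a Spark version
# string falls within the supported range (<= 3.1.2, per the answer above).
MAX_SUPPORTED = "3.1.2"  # assumption based on the answer above

def version_tuple(version: str) -> tuple:
    """Convert a version string like '3.1.2' to (3, 1, 2) for numeric comparison."""
    return tuple(int(part) for part in version.split("."))

def is_supported(version: str, max_supported: str = MAX_SUPPORTED) -> bool:
    """Return True if `version` is no newer than `max_supported`."""
    return version_tuple(version) <= version_tuple(max_supported)

# Query the installed PySpark version, if the package is available.
try:
    import pyspark
    print(pyspark.__version__, "supported:", is_supported(pyspark.__version__))
except ImportError:
    print("PySpark is not installed in this environment")
```

If the check fails, downgrading Spark or re-running the integration command from the DSS data directory should resolve the py4j error.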


Rushil09
Level 3
Author

Yes, it did solve the error.
Thank you so much!



Labels: Setup info