PySpark and Python error

Rushil09 (Partner, Dataiku DSS Core Designer, Dataiku DSS ML Practitioner, Dataiku DSS Adv Designer, Posts: 17)

I was trying to execute a PySpark script and encountered a py4j error.

Can someone help me with this? I have already checked the version compatibilities.

I am attaching a screenshot of the error.

[Attachment: image (1).png]


Operating system used: Ubuntu


Best Answers

  • fchataigner2 (Dataiker, Posts: 355)
    Answer ✓

    Hi,

    It looks like the Spark integration has not been done, or not done properly. Either you're using Spark in a version >3.1.2, or you haven't run the "<datadir>/bin/dssadmin install-spark-integration ..." command.

  • Rushil09 (Partner, Posts: 17)
    Answer ✓

    Yes, that solved the error. Thank you so much!
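
For anyone hitting the same issue: the accepted answer implies a version ceiling, so it can help to sanity-check the installed Spark version before running the integration command. Below is a minimal sketch; the `spark_version_supported` helper and the 3.1.2 ceiling are illustrative assumptions based on the answer above, not a Dataiku API.

```python
# Hypothetical helper: check whether a Spark version string falls
# within an assumed supported range (3.1.2 ceiling, per the answer above).
def spark_version_supported(version, max_supported=(3, 1, 2)):
    # Compare the first three numeric components as a tuple.
    parts = tuple(int(p) for p in version.split(".")[:3])
    return parts <= max_supported

print(spark_version_supported("3.1.2"))  # True  -> integration should work
print(spark_version_supported("3.2.0"))  # False -> too new for this DSS setup
```

You can get the installed version string from `spark-submit --version` (or `pyspark.__version__` in an environment where PySpark is importable) and feed it to the check.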
