PySpark and Python error


I was trying to execute a PySpark script and encountered a Py4J error.

Can someone help me with this? I have checked all the version compatibilities as well.

I am attaching a screenshot of the error.

[Attachment: image (1).png]


Operating system used: Ubuntu


Best Answers

  • Dataiker Posts: 355 Dataiker
    Answer ✓

    Hi,

    It looks like the Spark integration is not done, or not done properly. Either you're using a Spark version > 3.1.2, or you haven't run the "<datadir>/bin/dssadmin install-spark-integration ..." command.

  • Partner, Dataiku DSS Core Designer, Dataiku DSS ML Practitioner, Dataiku DSS Adv Designer, Registered, Frontrunner 2022 Participant Posts: 17 Partner
    Answer ✓

    Yes, that solved the error. Thank you so much!
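
As a quick sanity check before re-running the integration, you can compare the Spark version your environment reports (e.g. from `spark-submit --version`) against the 3.1.2 ceiling mentioned in the accepted answer. This is a minimal standard-library sketch; the helper names are illustrative and not part of DSS or Spark.

```python
def version_tuple(v):
    # Turn a dotted version string like "3.1.2" into a comparable tuple (3, 1, 2).
    return tuple(int(part) for part in v.split("."))

def spark_supported(version, ceiling="3.1.2"):
    # Spark versions above the ceiling are not covered by this DSS Spark
    # integration, per the accepted answer above.
    return version_tuple(version) <= version_tuple(ceiling)

print(spark_supported("3.1.2"))  # supported
print(spark_supported("3.2.0"))  # too new for this integration
```

If the check fails, install a supported Spark build and then re-run `<datadir>/bin/dssadmin install-spark-integration ...` as described above.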
