Cannot use SparkContext.getOrCreate()

pillsy
edited July 16 in Using Dataiku

I have been using the default PySpark notebook on a Dataiku instance which does have Spark set up, and cannot get past the very first step after the imports (which are successful modulo some apparently harmless DeprecationWarnings about docstrings).

Evaluating this cell in the Jupyter notebook hangs indefinitely: the cell just shows an asterisk ([*]) and never completes:

# Load PySpark and get (or create) the shared SparkContext
import pyspark

sc = pyspark.SparkContext.getOrCreate()

Any ideas on how to better diagnose or even resolve this issue?

Thanks!

Best Answer

  • Alexandru (Dataiker)

    Hi,

    You could try running the same code, or the sample PySpark code, as a recipe instead of a notebook, and then check the job log for any errors. A minimal test recipe is sketched below.
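
    For reference, here is a rough sketch of what such a test recipe could look like. It assumes the standard Dataiku PySpark recipe pattern (dataiku.spark.get_dataframe with a SQLContext), and "your_input_dataset" is a placeholder name for any small dataset in your project; the point is only to force Spark to start so that any startup problem shows up in the job log.

    import dataiku
    from dataiku import spark as dkuspark
    from pyspark import SparkContext
    from pyspark.sql import SQLContext

    # This is the step that hangs in the notebook; as a recipe,
    # the Spark driver output ends up in the job log.
    sc = SparkContext.getOrCreate()
    sqlContext = SQLContext(sc)

    # Read a small input dataset as a Spark DataFrame
    # ("your_input_dataset" is a placeholder name)
    input_dataset = dataiku.Dataset("your_input_dataset")
    df = dkuspark.get_dataframe(sqlContext, input_dataset)

    # Force an action so Spark actually executes something
    print(df.count())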

    If you are unable to find anything that could explain the issue, I would suggest opening a support ticket with the job diagnostics attached.

    Kind Regards,
