Cannot use SparkContext.getOrCreate()

Solved!
pillsy
Level 2

I have been using the default PySpark notebook on a Dataiku instance that does have Spark set up, and I cannot get past the very first step after the imports (the imports themselves succeed, apart from some apparently harmless DeprecationWarnings about docstrings).

 

Evaluating this cell in a Jupyter notebook hangs indefinitely, with the cell stuck showing the in-progress asterisk:

# Load PySpark
import pyspark

sc = pyspark.SparkContext.getOrCreate()
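One generic way to confirm that it is the getOrCreate() call itself that blocks (rather than, say, the kernel) is to wrap the blocking call in a timeout helper. This is a plain-Python sketch, nothing Dataiku- or Spark-specific; run_with_timeout is just an illustrative helper name:

```python
import threading

def run_with_timeout(fn, timeout_s):
    """Run fn in a daemon thread; return (result, finished_within_timeout)."""
    result = {}

    def worker():
        result["value"] = fn()

    t = threading.Thread(target=worker, daemon=True)
    t.start()
    t.join(timeout_s)
    # If the worker is still alive after the join, fn did not finish in time
    return result.get("value"), not t.is_alive()

# Substitute pyspark.SparkContext.getOrCreate for the lambda to probe the hang
value, finished = run_with_timeout(lambda: 42, timeout_s=5)
print(value, finished)  # → 42 True
```

If finished comes back False for the real getOrCreate() call, the context creation is blocking, which points at the Spark/cluster side rather than the notebook.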

Any ideas how to better diagnose or even resolve this issue?

Thanks!

1 Solution
AlexT
Dataiker

Hi,

You could try running the same code, or the sample PySpark code, as a recipe instead of a notebook, and check the job log for any errors.

If you cannot find anything that explains the issue, I would suggest opening a support ticket and attaching the job diagnostics.
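For reference, a minimal PySpark recipe in Dataiku usually follows the standard template below. This is a sketch of the common pattern, not your exact job: the dataset name is a placeholder, and it assumes the usual dataiku/dataiku.spark helpers available in a Dataiku Spark environment.

```python
import dataiku
import dataiku.spark as dkuspark
from pyspark import SparkContext
from pyspark.sql import SQLContext

# The same call that hangs in the notebook; as a recipe,
# its output and any Spark errors end up in the job log
sc = SparkContext.getOrCreate()
sqlContext = SQLContext(sc)

# "my_input_dataset" is a placeholder -- point it at a dataset in your project
dataset = dataiku.Dataset("my_input_dataset")
df = dkuspark.get_dataframe(sqlContext, dataset)
print(df.count())
```

Running this as a recipe makes the failure mode visible: if context creation stalls, the job log should show where (e.g. waiting on cluster resources).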

Kind Regards,

 

