I have been using the default PySpark notebook on a Dataiku instance that does have Spark set up, and I cannot get past the very first step after the imports (which succeed, apart from some apparently harmless DeprecationWarnings about docstrings).
Evaluating this cell in a Jupyter notebook hangs indefinitely, with the cell stuck on the [*] busy indicator:
# Load PySpark
import pyspark

sc = pyspark.SparkContext.getOrCreate()
Any ideas how to better diagnose or even resolve this issue?
Thanks!
Hi,
You could try running the same code (or the sample PySpark code) as a recipe instead of a notebook and check the job log for any errors.
If you are unable to find anything that could explain the issue, I would suggest opening a support ticket and attaching the job diagnostics.
Kind Regards,