Kernel error when starting a Python notebook

longhowlam (Partner, Registered Posts: 24)

Hi,



I installed DSS on a GCP VM. When I start a Python Jupyter notebook, I get the following kernel error. Any ideas?

Regards, Longhow

Traceback (most recent call last):
  File "/home/longhowlam/dataiku-dss-6.0.2/python.packages/tornado/web.py", line 1592, in _execute
    result = yield result
  File "/home/longhowlam/dataiku-dss-6.0.2/python.packages/tornado/gen.py", line 1133, in run
    value = future.result()
  File "/home/longhowlam/dataiku-dss-6.0.2/python.packages/tornado/concurrent.py", line 261, in result
    raise_exc_info(self._exc_info)
  File "/home/longhowlam/dataiku-dss-6.0.2/python.packages/tornado/gen.py", line 1141, in run
    yielded = self.gen.throw(*exc_info)
  File "/home/longhowlam/dataiku-dss-6.0.2/dku-jupyter/packages/notebook/services/sessions/handlers.py", line 81, in post
    user=self.get_current_user(),
  File "/home/longhowlam/dataiku-dss-6.0.2/python.packages/tornado/gen.py", line 1133, in run
    value = future.result()
  File "/home/longhowlam/dataiku-dss-6.0.2/python.packages/tornado/concurrent.py", line 261, in result
    raise_exc_info(self._exc_info)
  File "/home/longhowlam/dataiku-dss-6.0.2/python.packages/tornado/gen.py", line 326, in wrapper
    yielded = next(result)
  File "/home/longhowlam/dataiku-dss-6.0.2/dku-jupyter/packages/notebook/dataiku/sessionmanager.py", line 55, in create_session
    kernel_context = self.new_kernel_context(head, tail, user, session_id)
  File "/home/longhowlam/dataiku-dss-6.0.2/dku-jupyter/packages/notebook/dataiku/sessionmanager.py", line 32, in new_kernel_context
    "sessionId" : session_id
  File "/home/longhowlam/dataiku-dss-6.0.2/dku-jupyter/packages/notebook/dataiku/__init__.py", line 27, in pintercom_json_post
    raise Exception("Failed privileged call (%s): %s" % (path, _get_error_message(err_data).encode("utf8")))
Exception: Failed privileged call (jupyter/new-kernel-context): Spark Config not found: default

Answers

  • Clément_Stenac (Dataiker, Dataiku DSS Core Designer, Registered Posts: 753)
    Hi,

    You removed the named Spark config called "default", which is configured as the Spark config to use for your notebooks. We'd advise going to Admin > Settings > Spark and either selecting another config as the default for notebooks, or creating a new config named "default" with no settings in it. (For a scripted way to check and restore the config, see the sketch after the replies below.)
  • longhowlam (Partner, Registered Posts: 24)
    Hi Clement,

    Thanks, this worked!

    Note that the removal of the default config silently happened when I set up Dataiku to run notebooks on GKE. Other users may run into the same behaviour.

    Regards, Longhow
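
For anyone who wants to verify this from a script rather than the UI: below is a minimal sketch using the public dataikuapi client to list the named Spark configs and re-add an empty "default" one if it is missing. The host URL, API key, and the exact keys holding the named configs in the general settings ("sparkSettings" / "executionConfigs") are assumptions and may differ between DSS versions, so check them against your instance before saving anything.

# Sketch: check for a named Spark config called "default" and recreate an
# empty one if it has been removed (e.g. after switching notebooks to GKE).
import dataikuapi

HOST = "http://localhost:11200"   # hypothetical DSS base URL
API_KEY = "YOUR_ADMIN_API_KEY"    # placeholder admin-level API key

client = dataikuapi.DSSClient(HOST, API_KEY)
settings = client.get_general_settings()
raw = settings.get_raw()

# Assumed location of the named Spark configs inside the raw settings dict;
# key names may vary by DSS version.
spark_settings = raw.setdefault("sparkSettings", {})
spark_configs = spark_settings.setdefault("executionConfigs", [])
names = [c.get("name") for c in spark_configs]
print("Named Spark configs:", names)

if "default" not in names:
    # Recreate an empty config named "default", as advised in the first reply
    spark_configs.append({"name": "default", "conf": []})
    settings.save()
    print("Re-added an empty Spark config named 'default'")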