I have set up containerized execution and am running Spark through a Kubernetes Spark cluster. This works well for all kinds of recipes and for Spark-related Jupyter notebooks. However, when I use plain Python notebooks, the kernels aren't launched in Kubernetes pods.
What I tried:
- Adding all JupyterHub-related packages to both the DSS image and the container image.
- Setting up a new code env that forces containerized execution.
When I try to add notebook support to that code env, it tries to download the JupyterHub packages from the internet via pip, even though the packages are already installed on the DSS image.
Is there any way to make this work?
You should get in touch with your customer success manager in order to devise a way to get this to work.
The image build for code env images runs pip commands inside the container, so you need a pip repository accessible from the machine, either a private repository or access via a proxy.
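As a rough illustration of the "accessible pip repository" requirement: if wheel files for all required packages were baked into the base image at some directory (the path `/opt/local-wheels` below is purely an assumption, not a DSS convention), a pip configuration like the following would make pip resolve from that directory instead of the internet. This is a generic pip sketch, not a documented DSS setting:

```ini
; pip.conf (placed e.g. at /etc/pip.conf inside the container image)
[global]
; never contact a remote package index
no-index = true
; resolve packages from a local directory of pre-downloaded wheels
; (hypothetical path; adjust to wherever the wheels are baked in)
find-links = /opt/local-wheels
```

The wheels themselves can be collected ahead of time on a machine with internet access using `pip download -d /opt/local-wheels <packages>`.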
Hi, thank you for your reply.
I have already prepared container images that include everything pip would need to download. Isn't there any way to make DSS recognize that all the packages are already there?
It would be much easier for me to go with this approach than to set up an offline pip repository.