When I try to create a custom code env for my Python notebooks through Dataiku, it tries to run the `docker build` command to do so. I'm running Dataiku inside a Kubernetes cluster managed by Azure (AKS), so docker isn't available. What is Dataiku's recommended way to build such an image? And why doesn't Dataiku use something like kaniko to build and push the image to a configured registry?
The recommended way is to avoid running Dataiku itself in a Docker container. It is possible to use it as such, but it comes with a lot of limitations, as detailed in our reference documentation: https://doc.dataiku.com/dss/latest/installation/other/docker.html
In particular, if you run Dataiku in a container, you completely forfeit the ability to leverage containerized execution.
Hope this helps!
Hello, that is odd, because this page is about running DSS in AKS: https://doc.dataiku.com/dss/latest/containers/aks/managed.html#create-a-new-containerized-execution-....
The final step there is clicking "Push base images", which uses docker. However, AKS doesn't have docker. Maybe I missed something; could you shed some light on this?
That page is about running DSS workloads in AKS, not DSS itself. The goal of containerized execution is to offload heavy computation from recipes or notebooks to Kubernetes pods, where they have access to a bigger pool of resources. In that configuration, the DSS instance itself still runs on a classical VM, not on Kubernetes.
The base image that you push is not the image of DSS itself but of the "framework" required to execute notebooks and recipes.
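To make the flow concrete, here is a rough sketch of what pushing a base image to an Azure Container Registry looks like when done by hand with plain docker and the Azure CLI. This is an illustration only, not the exact commands DSS runs: the registry name and image tag below are hypothetical, and it assumes docker and the Azure CLI are installed on the VM hosting DSS (not inside AKS).

```shell
# Hypothetical registry/image names; run from the DSS host VM, where docker is available.
az acr login --name myregistry                      # authenticate docker against the ACR instance

docker build -t dss-container-exec-base .           # build the base execution image locally
docker tag dss-container-exec-base \
    myregistry.azurecr.io/dss-container-exec-base   # tag it for the registry
docker push myregistry.azurecr.io/dss-container-exec-base   # push so AKS pods can pull it
```

The AKS nodes then pull this image from the registry when DSS schedules a containerized recipe or notebook, which is why docker is only needed where the image is built, not inside the cluster.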
You can familiarise yourself with the basic concepts by reading this section of the documentation: https://doc.dataiku.com/dss/latest/containers/concepts.html#interaction-between-dss-and-containers
Hope this helps.