Mismatch in random ports for notebook kernels
When I use notebook kernels in containerized environments, the ports stored in DSS differ from the ones in the Kubernetes pod. Is that normal?
The file jupyter-run/jupyter/runtime/kernel-c234fbb1-84dc-44ad-a18e-7ad0aff81702.json shows different ports:
{
"shell_port": 51875,
"iopub_port": 38011,
"stdin_port": 35689,
"control_port": 41523,
"hb_port": 57653,
"ip": "127.0.0.1",
"key": "f9ffba4e-cd5d99a382680a9e8421cd8a",
"transport": "tcp",
"signature_scheme": "hmac-sha256",
"kernel_name": "py-dku-containerized-venv--k8s-container"
}
from those shown in the pod's logs:
Setup forwarding
Connect with timeout 30000
Use connection file {"shell_port": 35489, "iopub_port": 39419, "stdin_port": 43865, "control_port": 42353, "hb_port": 51247, "ip": "0.0.0.0", "key": "71d9b24a-10aaa34669dd53b810e7c50f", "transport": "tcp", "signature_scheme": "hmac-sha256", "kernel_name": "", "remoteHost": "dataiku-exec-remote-notebook-3wmu7vfe"}
Failed to setup forwarder
In forwarder.log, stored with the notebook files on the DSS instance, the process of picking random ports is logged, and it matches the JSON file above. There is no place in DSS where the ports shown in the kernel pod's logs are stored.
How is that possible? Are random ports chosen twice for notebooks?
It works fine for visual recipes, for example, where the randomly chosen ports match correctly.
Operating system used: Ubuntu 24
Answers
Alexandru (Dataiker):
Hi,
This is expected for notebooks. The ports used by DSS (the ones recorded in the connection file) and the ports used by the kernel inside the container are separate. A "forwarder" relays traffic between them and keeps the mapping in memory only, which is why the pod's ports are not stored anywhere in DSS.
Thanks
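To illustrate the idea (this is a minimal sketch, not Dataiku's actual implementation), here is what such a forwarder does in principle: it listens on one locally chosen port and relays each connection to a different remote port, so the two sides legitimately see different port numbers, and the local-to-remote mapping exists only in the forwarder process's memory. All names below are illustrative.

```python
import socket
import threading

def _relay(src, dst):
    """Copy bytes from src to dst until EOF, then half-close dst."""
    try:
        while True:
            data = src.recv(4096)
            if not data:
                break
            dst.sendall(data)
    finally:
        try:
            dst.shutdown(socket.SHUT_WR)
        except OSError:
            pass

def start_forwarder(local_port, remote_host, remote_port):
    """Listen on local_port and relay every connection to remote_host:remote_port.

    Pass local_port=0 to let the OS pick a random free port, analogous to the
    random ports DSS records in the kernel connection file. The mapping
    (local port -> remote port) is held only in this process's memory; it is
    never written anywhere.
    """
    listener = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    listener.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    listener.bind(("127.0.0.1", local_port))
    listener.listen()

    def accept_loop():
        while True:
            try:
                client, _ = listener.accept()
            except OSError:
                break  # listener was closed; stop accepting
            upstream = socket.create_connection((remote_host, remote_port))
            # One thread per direction so traffic flows both ways.
            threading.Thread(target=_relay, args=(client, upstream), daemon=True).start()
            threading.Thread(target=_relay, args=(upstream, client), daemon=True).start()

    threading.Thread(target=accept_loop, daemon=True).start()
    return listener  # caller reads the chosen port via listener.getsockname()
```

In this sketch, a Jupyter client would connect to the listener's local port, while the kernel in the pod only ever sees the remote port, which is why the two connection files can legitimately show different numbers.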