Train Model Error - com.dataiku.dip.io.SocketBlockLink$SecretKernelTimeoutException
Metal_Horse
Registered Posts: 2 ✭✭✭✭
While going through the tutorials, I am getting the following error when training the model:
[2018/06/17-18:42:07.394] [MRT-164] [ERROR] [dku.analysis.ml.python] - Processing failed
com.dataiku.dip.io.SocketBlockLink$SecretKernelTimeoutException: Subprocess failed to connect, it probably crashed at startup. Check the logs.
at com.dataiku.dip.io.SocketBlockLink.waitForConnection(SocketBlockLink.java:82)
at com.dataiku.dip.io.SecretProtectedKernelLink.waitForProcess(SecretProtectedKernelLink.java:38)
at com.dataiku.dip.io.PythonSecretProtectedKernel.start(PythonSecretProtectedKernel.java:84)
at com.dataiku.dip.analysis.ml.shared.PRNSTrainThread.run(PRNSTrainThread.java:87)
Caused by: java.net.SocketException: Socket closed
at java.net.PlainSocketImpl.socketAccept(Native Method)
at java.net.AbstractPlainSocketImpl.accept(AbstractPlainSocketImpl.java:409)
at java.net.ServerSocket.implAccept(ServerSocket.java:545)
at java.net.ServerSocket.accept(ServerSocket.java:513)
at com.dataiku.dip.io.SocketBlockLink.waitForConnection(SocketBlockLink.java:78)
... 3 more
Best Answer
-
A bit above in the logs, you have a "Missing required dependencies ['numpy']" message, which means that the DSS builtin python environment has an issue.
Possibly a package was installed that changed some core dependency. The recommended way of installing packages is to use code environments; see Installing Python packages for more details.
To fix your current issue, try rebuilding DSS's builtin python environment.
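As a quick sanity check (DATA_DIR being your DSS data directory), you can test whether the builtin interpreter is able to import numpy before and after the rebuild:
DATA_DIR/bin/python -c "import numpy; print(numpy.__version__)"
If this prints a version number, the builtin environment is healthy; if it fails with an import error, the rebuild is needed.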
Answers
-
Thank you so much! So the problem was missing packages. Whenever I face the same problem, I need to go to Settings --> Administration --> Code Env --> Packages to Install and add all the required packages.
Thank you once again. -
You're very welcome
-
Hello,
I am facing the same issue on Arch Linux. I can't manage to solve this "numpy" dependency error... I tried to build the environment in Settings > Administration > Code Env. numpy is there, in my system installation and in the DSS code env, but for some reason the training of my algorithm is failing and the log shows: xxx/DATA_DIR/bin/python: Missing required dependencies ['numpy']
For information, my code env page was empty before I created a python code env manually; I don't know if that is a normal situation.
Any idea how I could solve this issue to make DSS run on Arch Linux?
Thanks for your support,
Salvatore -
Have you tried the solution mentioned above about rebuilding DSS's **builtin** python environment?
Code Envs are specific environments that you can use in your Python/R recipes (in Advanced) and ML trainings (under "Python environment" in the Design part of the ML task), when you want to use specific packages.
By default and if you don't need to use additional packages, the builtin environment is the one that gets used. If it is lacking numpy, it means it's broken, and you should rebuild it.
If you need to use additional packages, then you want to create a Code Env and use that code env in your ML task or recipe. -
Hi,
thanks for the reply. Here is how I solved my issue:
1) in administration, create an environment as explained above
2) install the following additional packages in the custom env (a quick check of the result is sketched after these steps):
scipy
sklearn
statsmodels
xgboost
jinja2
3) in your model, select the created environment
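For anyone who wants to double-check that the new environment really provides these packages, you can try importing them with the environment's own interpreter. The path below is only an assumption about where DSS keeps code envs under the data directory, so adjust it to your install and environment name:
DATA_DIR/code-envs/python/<env-name>/bin/python -c "import scipy, sklearn, statsmodels, xgboost, jinja2; print('ok')"
If this prints ok, the packages are available to the recipes and ML tasks that use this code env.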
Thanks again for the help, the journey with DSS can continue.
Salvatore -
Yes, that also works, but you will then need to use this environment with all your Python and ML recipes.
I would still advise also fixing the builtin DSS python environment. -
Thanks Adrien.
I didn't understand what you mean and what I can do to "fix the built-in environment". It refuses to run because of missing numpy, and I have no idea how to fix numpy without creating my own environment. As a reminder, with a fresh install of DSS there is no environment in the admin tab, so there is nothing to rebuild.
I am missing something.
Salvatore -
The Code Env tab only shows the additional environments created by the DSS administrator/user.
On a fresh DSS install, there is also the built-in python environment, which is used by default for everything that runs on Python.
To fix it, check the link that I referenced in my first answer above, i.e.
https://doc.dataiku.com/dss/latest/installation/python.html#rebuilding-the-builtin-python-environment -
OK, I'm starting to understand the logic... I performed the steps of the documentation, but dss-local-packages.txt was empty. I added the list of packages from my custom env inside it to "install" them in the built-in environment. The training is still crashing because numpy is not found. When I run "DATA_DIR/bin/pip list", I get a full list of packages that includes "numpy".
Thanks for your help.
Salvatore
PS: I am on Arch Linux -
dss-local-packages.txt can be empty; it lists the _additional_ packages that you installed on top of the base packages. The DSS installer takes care of installing the base packages under the hood.
You should not manually specify the packages from your repository, as that will override the base packages with versions that may not be expected by the built-in python env.
Can you try with just the instructions from the documentation?
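Also note that "DATA_DIR/bin/pip list" only shows that the package metadata is present; it does not guarantee that the interpreter can actually load numpy's compiled extension. Running the import directly usually reveals the real error (for example a missing shared library):
DATA_DIR/bin/python -c "import numpy"
The traceback from that command would help narrow down what is missing. -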
Hi,
I tried following the steps of the documentation: learning step crashed and the logs show that numpy is missing.
Salvatore -
ArchLinux is not officially supported, but quick tests seem to show that we can't reproduce your issue. With a base ArchLinux from AWS, DSS starts normally and Pandas is functional after:
- installing the packages extra/jdk8-openjdk, extra/nginx, extra/freetype2, community/gcc54
- adding /usr/lib/gcc/x86_64-pc-linux-gnu/5.4.1 to DSS' LD_LIBRARY_PATH
If you can't solve this, you can continue with your code env as long as it's not blocking you, or try installing the builtin env using conda (https://doc.dataiku.com/dss/latest/installation/python.html#using-anaconda-python) or a fully custom python environment (https://doc.dataiku.com/dss/latest/installation/python.html#advanced-using-a-fully-custom-python-environment).
Also, providing full installation logs may help us have other ideas. -
Thanks Adrien for your investigations. After installing gcc54 and adding /usr/lib/gcc/x86_64-pc-linux-gnu/5.4.1 to LD_LIBRARY_PATH, I don't have the issue anymore.
For the record for people reading this later, just run:
export LD_LIBRARY_PATH="/usr/lib/gcc/x86_64-pc-linux-gnu/5.4.1"
before launching the installation.
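If LD_LIBRARY_PATH is already set on your system, you may prefer to append to it rather than overwrite it, for example:
export LD_LIBRARY_PATH="/usr/lib/gcc/x86_64-pc-linux-gnu/5.4.1${LD_LIBRARY_PATH:+:$LD_LIBRARY_PATH}"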
Thanks again.
Salvatore -
Glad to hear that. Thanks for the update
-
ibn-mohey Dataiku DSS Core Designer, Dataiku DSS ML Practitioner, Dataiku DSS Adv Designer, Registered Posts: 4 ✭
Any suggestion for a solution on Windows?