According to the 'llama-cpp-python' documentation:
pip install llama-cpp-python
The above command will attempt to install the package and build llama.cpp from source. This is the recommended installation method as it ensures that llama.cpp is built with the available optimizations for your system.
so you should not need to compile llama.cpp as a separate step.
If you experience any issues installing 'llama-cpp-python' in DSS, we'll need a code env diagnostic in order to start our investigation. Please note that this package has no wheels, so among other things you'll need certain OS packages installed on your server in order to build from source, for instance CMake.
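Since pip compiles llama.cpp during installation, one quick sanity check before running the install is to confirm that CMake and a C/C++ compiler are visible on the PATH of the machine (or container image) that will do the build. A minimal sketch — the tool names and the `missing_build_tools` helper are illustrative assumptions, not part of DSS or the package:

```python
import shutil

def missing_build_tools(tools=("cmake", "gcc", "g++")):
    """Return the required build tools that are not found on PATH.

    llama-cpp-python ships no prebuilt wheels, so pip compiles
    llama.cpp from source; that needs CMake and a C/C++ toolchain
    available on the host or inside the container image.
    """
    return [tool for tool in tools if shutil.which(tool) is None]

missing = missing_build_tools()
if missing:
    print("Missing build tools:", ", ".join(missing))
else:
    print("Build toolchain looks OK")
```

Running this inside a notebook in the target code env (or inside the container) shows immediately whether a source build has any chance of succeeding there.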
Thanks for your response,
Yeah, this is the roadblock: compiling llama.cpp in containers (C/C++ compilation).
During a call with Dataiku Support, I was informed that:
So it is locked; there is no way to make it work. Perhaps in the future the Dataiku team can add a way to precompile GCC libraries before installing Python packages.