How to "send" a knowledge bank with a get_llm?
Hello all,
Here is our context:
- A headless custom web app on Dataiku that receives a user question through an API and must give an answer based on internal procedures
- As of today, the procedures are sent in Markdown along with the prompt, but that is really heavy and the response time is more than 20 seconds
- I have played with an embedded vector database (Chroma) and a retrieval LLM. It's not that easy to work with the embedding model when you have a lot of text, but in the end it worked.
However, I would now like to use my knowledge bank in my webapp through get_llm. I'm not even sure it's possible; the documentation doesn't seem to address this case.
Best regards,
Simon
Operating system used: Windows 11
Answers
Alexandru (Dataiker)
Hi,
You should be able to use the Knowledge Bank from the LLM Mesh by following the example here:
https://developer.dataiku.com/latest/tutorials/genai/nlp/llm-mesh-rag/index.html#running-an-enriched-llm-query
Thanks
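Following the pattern in that tutorial, a minimal sketch of what the webapp code could look like. The idea is that once a Retrieval-augmented LLM backed by your knowledge bank exists in the project, it gets its own LLM id and can be queried through `project.get_llm()` like any other LLM Mesh model. The function name `ask_procedure_bot` and the id `"retrieval-augmented-llm-id"` are placeholders, not real identifiers; the real id can be found by iterating over `project.list_llms()`.

```python
def ask_procedure_bot(question, llm_id="retrieval-augmented-llm-id"):
    """Query a Retrieval-augmented LLM backed by a knowledge bank.

    Sketch only: `llm_id` is a placeholder; list the real ids with
    project.list_llms(). Runnable only inside a Dataiku DSS code
    environment, where the `dataiku` package is available.
    """
    import dataiku  # provided by the DSS runtime

    project = dataiku.api_client().get_default_project()
    llm = project.get_llm(llm_id)

    # Build and run a completion query; the retrieval step against the
    # knowledge bank happens server-side, inside the LLM Mesh.
    completion = llm.new_completion()
    completion.with_message(question)
    resp = completion.execute()

    if not resp.success:
        raise RuntimeError("LLM query failed")
    return resp.text
```

Because the retrieval happens inside the LLM Mesh, the webapp no longer has to ship the whole Markdown procedure set with every prompt, which should help with the 20-second response times you mention.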