Run Dynamic Prompt via LLM AI recipe
Hi Everyone,
We are building a Dataiku webapp. We want to collect a prompt from the user through the UI; when the user submits it, the prompt should be processed by the LLM through an AI recipe.
We are facing an issue providing this dynamic input to the recipe. Any suggestions for doing this are highly appreciated.
Operating system used: Windows
Answers
-
tgb417 Dataiku DSS Core Designer, Dataiku DSS & SQL, Dataiku DSS ML Practitioner, Dataiku DSS Core Concepts, Neuron 2020, Neuron, Registered, Dataiku Frontrunner Awards 2021 Finalist, Neuron 2021, Neuron 2022, Frontrunner 2022 Finalist, Frontrunner 2022 Winner, Dataiku Frontrunner Awards 2021 Participant, Frontrunner 2022 Participant, Neuron 2023 Posts: 1,598 Neuron
Welcome to the Dataiku community we are so glad you have joined us.
Reading your post led me to think about the Dataiku webapps documentation:
https://doc.dataiku.com/dss/latest/webapps/index.html
and this set of tutorial documents
https://knowledge.dataiku.com/latest/data-viz/webapps/index.html
and this
https://knowledge.dataiku.com/latest/data-viz/webapps/concept-webapps.html
Here is a tutorial that may help:
https://developer.dataiku.com/latest/tutorials/webapps/standard/form-to-submit-values/index.html
This is somewhat of an advanced topic, requiring knowledge of both Dataiku and coding. Depending on your use case, there may be simpler ways of providing input to a flow. For example, I've used a Google Sheet, via the Google Sheets plugin, connected to a flow to allow input. Editable datasets can also help sometimes. That said, others may be of more help; however, they may want a bit more information about your specific use case in order to provide more targeted suggestions.
Do you have a webapp started? What type of webapp is it, and are you getting any specific error messages?
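To make the "form to submit values" pattern from the last tutorial concrete, here is a minimal, hedged sketch of the backend side. The route name, JSON field name, and wiring are illustrative assumptions, not Dataiku specifics; in a DSS standard webapp, Dataiku provides the Flask `app` object for you.

```python
import json

def handle_submit(raw_body):
    """Validate the JSON body the front-end's Ajax call would send.
    Returns a dict the route handler can serialize back to the browser."""
    data = json.loads(raw_body)
    prompt = (data.get("prompt") or "").strip()
    if not prompt:
        return {"ok": False, "error": "empty prompt"}
    return {"ok": True, "prompt": prompt}

# Inside a DSS standard webapp, the wiring would look roughly like
# (hypothetical route name, `app` provided by Dataiku):
#
# @app.route("/submit", methods=["POST"])
# def submit():
#     from flask import request, jsonify
#     return jsonify(handle_submit(request.get_data()))
```

Keeping the validation in a plain function like `handle_submit` makes it easy to test outside of DSS.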
-
Thank you very much Tom for providing the information.
We have built the backend using Flask. When the user enters a prompt in the UI and submits it, the request is sent to the backend via Ajax. The only issue is on the backend: if we run a scenario to process the prompt via the AI recipe, we don't see how the LLM recipe inside the scenario can receive the dynamic input entered by the user.
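One common pattern for passing dynamic input into a scenario (a hedged sketch, not something confirmed in this thread) is to write the user's prompt into a project variable before triggering the scenario, so that recipes run by the scenario can reference it as `${user_prompt}`. The variable name and scenario id below are hypothetical, and this assumes the code runs where the `dataiku` package is available.

```python
def inject_prompt(variables, prompt):
    """Pure helper: set the prompt under the project's 'standard' variables."""
    variables.setdefault("standard", {})["user_prompt"] = prompt
    return variables

def run_scenario_with_prompt(prompt, scenario_id="PROCESS_PROMPT"):
    """Hedged sketch: store the prompt as a project variable, then run the
    scenario. Assumes a DSS environment; scenario_id is hypothetical."""
    import dataiku  # only available inside DSS
    project = dataiku.api_client().get_default_project()
    project.set_variables(inject_prompt(project.get_variables(), prompt))
    return project.get_scenario(scenario_id).run()
```

Note that with concurrent users a single shared variable can be overwritten mid-run, which is one reason the direct-LLM-call approach discussed later in this thread can be simpler.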
-
tgb417
Two things come to mind.
1. Dataiku DSS will allow you to set up an API. Your Flask application could consume the API provided by DSS. (I'm not an expert on this.)
2. Reach out to your Customer Success Manager at Dataiku. They should be able to find you the right resource inside Dataiku to get a good start on this process.
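For option 1, consuming a DSS-hosted API from Flask is ordinary HTTP. Here is a hedged sketch using only the standard library; the endpoint URL, payload shape, and auth scheme are all assumptions that depend on how the API service is designed in DSS.

```python
import json
import urllib.request

def build_payload(prompt):
    """Pure helper: JSON body for the (hypothetical) API endpoint."""
    return json.dumps({"prompt": prompt}).encode("utf-8")

def call_dss_api(prompt, endpoint_url, api_key=None):
    """Hedged sketch: POST the prompt to an endpoint deployed on a DSS
    API node. URL, payload shape, and auth are all assumptions."""
    req = urllib.request.Request(
        endpoint_url,
        data=build_payload(prompt),
        headers={"Content-Type": "application/json"},
    )
    if api_key:
        req.add_header("Authorization", "Bearer " + api_key)
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read().decode("utf-8"))
```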
-
Thank you tom for the suggestions.
We were able to figure out another way. Instead of using the AI recipe and then calling the scenario, we can call the LLM directly in code, as described in the link below:
https://developer.dataiku.com/latest/concepts-and-examples/llm-mesh.html
-
tgb417
Cool, so you are using the Dataiku LLM Mesh, accessed via the Dataiku API from your Flask-based app. I can imagine that others in the community might want to learn more about how this goes for you. Is it performant enough, reliable enough, etc.?