
Run Dynamic Prompt via LLM AI recipe

karan_25
Level 2

Hi Everyone,

We are building a Dataiku webapp. We want to take dynamic input from the user through the UI, and when the user submits the prompt, have it processed by an LLM through the AI recipe.

We are facing an issue in providing dynamic input to the recipe. Any suggestions for doing this are highly appreciated.


Operating system used: Windows



5 Replies
tgb417

@karan_25 

Welcome to the Dataiku community; we are so glad you have joined us.

Reading your post led me to think of the Dataiku webapps documentation:

https://doc.dataiku.com/dss/latest/webapps/index.html

and this set of tutorial documents

https://knowledge.dataiku.com/latest/data-viz/webapps/index.html 

and this

https://knowledge.dataiku.com/latest/data-viz/webapps/concept-webapps.html 

Here is a tutorial that may help:

https://developer.dataiku.com/latest/tutorials/webapps/standard/form-to-submit-values/index.html 

This is a somewhat advanced topic, requiring knowledge of both Dataiku and coding. Depending on your use case, there may be simpler ways of providing input to a flow. For example, I've used a Google Sheet via the Google Sheets plugin to feed input into a flow, and editable datasets can sometimes help as well.

That said, others may be of more help; however, they may want a bit more information about your specific use case in order to provide more targeted suggestions.

Do you have a webapp started? What type of webapp is it, and are you getting any specific error messages?

--Tom
karan_25
Level 2
Author

Thank you very much Tom for providing the information.

We have built the backend using Flask. When the user enters a prompt in the UI and submits it, the request is sent to the backend via Ajax. The only issue is on the backend: if we run a scenario to process the prompt via the AI recipe, we don't see how the LLM recipe inside the scenario can receive the dynamic input entered by the user.
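One common pattern (a sketch under assumptions, not official guidance) is to pass the user's prompt as scenario run parameters: the Flask backend triggers the scenario through the Dataiku public API client with `params`, and a Python step inside the scenario reads them back via `get_trigger_params()`. The host URL, API key, project key, scenario id, and the `user_prompt` parameter name below are all placeholders.

```python
def build_run_params(prompt):
    """Pure helper: validate the user's prompt and wrap it as scenario run params."""
    prompt = (prompt or "").strip()
    if not prompt:
        raise ValueError("empty prompt")
    return {"user_prompt": prompt}

def trigger_scenario(prompt):
    """Trigger the scenario from the Flask backend, passing the prompt along."""
    # dataikuapi is only available where the client library is installed,
    # so it is imported lazily here; host/key/ids are placeholders.
    import dataikuapi
    client = dataikuapi.DSSClient("https://dss.example.com:11200", "API_KEY")
    scenario = client.get_project("MY_PROJECT").get_scenario("PROCESS_PROMPT")
    # run_and_wait blocks until the scenario run finishes
    return scenario.run_and_wait(params=build_run_params(prompt))

# Inside the scenario, an "Execute Python code" step can read the value back:
#
#   from dataiku.scenario import Scenario
#   prompt = Scenario().get_trigger_params().get("user_prompt")
#
# and, for example, write it into a project variable or dataset that the
# LLM recipe's input reads from.
```

The key design point is that the recipe itself stays static; only the data (or variable) it reads changes per run.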


@karan_25 

Two things come to mind.

1. Dataiku DSS will allow you to set up an API. Your Flask application could consume the API provided by DSS. (I'm not an expert on this.)

2.  Reach out to your Customer Success Manager at Dataiku.  They should be able to find you the right resource inside Dataiku to get a good start on this process.

--Tom
karan_25
Level 2
Author

Thank you, Tom, for the suggestions.

We were able to figure out another way. Instead of using the AI recipe and then calling the scenario, we can call the LLM directly in code, as described at the link below:

https://developer.dataiku.com/latest/concepts-and-examples/llm-mesh.html
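For readers landing here later, the direct-call pattern from that LLM Mesh page looks roughly like this sketch. The `llm_id` value is a placeholder (inside DSS you can list the available ids with `project.list_llms()`), and `clean_prompt` is a hypothetical validation helper, not part of the Dataiku API.

```python
def clean_prompt(raw):
    """Hypothetical helper: basic validation before the prompt reaches the LLM."""
    raw = (raw or "").strip()
    if not raw:
        raise ValueError("empty prompt")
    return raw

def ask_llm(prompt, llm_id="your-llm-id"):
    """Send a single user prompt through the LLM Mesh and return the reply text."""
    # `dataiku` is only importable inside a DSS environment, hence the lazy import
    import dataiku
    project = dataiku.api_client().get_default_project()
    llm = project.get_llm(llm_id)  # llm_id is a placeholder
    completion = llm.new_completion()
    completion.with_message(clean_prompt(prompt))
    resp = completion.execute()
    if not resp.success:
        raise RuntimeError("LLM call failed")
    return resp.text
```

Called from the Flask Ajax handler, this avoids a scenario round-trip entirely: the backend sends the prompt straight through the LLM Mesh and returns the response to the UI.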

Cool, so you are using the Dataiku LLM Mesh, accessed from your Flask-based app via the Dataiku API. I can imagine that others in the community might want to learn more about how this goes for you: is it performant enough, reliable enough, etc.?

--Tom