Reusing few-shot classification prompts in prompt studio?

ron_dk
Level 2

Hi Dataiku'ers,

Playing around with the few-shot classification prompts in prompt studio using OpenAI GPT, it looks like the entire prompt is sent to the LLM for each row of data to be classified. With zero-shot labels that might make sense, but for few-shot classification prompts that include example texts, definitions, and so on, this seems uneconomical, since all text sent back and forth counts towards usage.
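To illustrate what I mean, here's a rough sketch of what I assume is happening (my guess, not Dataiku's actual code): the same few-shot preamble gets resent for every single row.

```python
# Hypothetical illustration of per-row prompting: the few-shot preamble
# (examples, definitions, instructions) is paid for again on every row.

FEW_SHOT_PREAMBLE = """Classify the text as POSITIVE or NEGATIVE.

Example: "Great product, works perfectly." -> POSITIVE
Example: "Broke after two days." -> NEGATIVE
"""

def build_prompt(row_text):
    """One full prompt per row: preamble + the single row to classify."""
    return FEW_SHOT_PREAMBLE + f'\nText: "{row_text}" ->'

rows = ["Fast shipping, very happy.", "Terrible support.", "Does the job."]
prompts = [build_prompt(r) for r in rows]

# Word count as a crude token proxy: the preamble cost scales with row count.
preamble_words = len(FEW_SHOT_PREAMBLE.split())
total_words = sum(len(p.split()) for p in prompts)
print(preamble_words, total_words)
```

With real few-shot prompts the preamble is usually much longer than the row text, so most of the billed input tokens are repeats.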

Is there any way to just upload the classification instructions once and have the LLM reuse it for each row in the dataset? Either on the Dataiku setup side or in the prompt itself? On the Dataiku side, is there a no-code batching function that can be switched on somewhere? Or in the prompt, is there a way of using prompt chaining to ensure that GPT only has to read the prompt once?
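One idea I've been toying with (purely hypothetical, not a built-in prompt studio feature as far as I can tell) is packing several rows into a single prompt, so the instructions and examples are only sent once per batch:

```python
# Sketch of a prompt-level batching workaround (hypothetical): send the
# instructions and few-shot examples once, followed by N numbered rows,
# and ask the model to answer one line per number.

INSTRUCTIONS = """Classify each numbered text as POSITIVE or NEGATIVE.
Answer with one line per number, e.g. "1: POSITIVE".

Example: "Great product." -> POSITIVE
Example: "Broke after two days." -> NEGATIVE
"""

def build_batch_prompt(rows):
    """One prompt for a whole batch: instructions appear only once."""
    numbered = "\n".join(f'{i}: "{text}"' for i, text in enumerate(rows, 1))
    return INSTRUCTIONS + "\nTexts:\n" + numbered

rows = ["Fast shipping.", "Terrible support.", "Does the job."]
print(build_batch_prompt(rows))
```

The obvious downsides: you have to parse the numbered answers back out yourself, large batches risk the model skipping or mislabeling rows, and the batch has to fit in the context window.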

Any ideas much appreciated!


Operating system used: Windows

2 Replies
AdrienL
Dataiker

Hello,

AFAIK, OpenAI does not yet provide a way to do such batching directly. Which seems understandable, since under the hood the model still needs to process the full prompt for each row it scores.

There is an Assistants beta that seems like it could do such a thing, but its API is different. Maybe one could use its threads for batch scoring, but I don't know how well they keep the overall instructions and context across messages. Or maybe one could create an Assistant per task and put the few-shot examples in its base prompt, but since the model still needs to evaluate that prompt internally, I'm not sure it makes a difference in pricing.

I guess one could also try to leverage their fine-tuning capabilities, to fine-tune a model for a specific task, but be mindful that the pricing is not the same (fine-tuned GPT-3.5 seems about three times as expensive as base GPT-3.5), so the difference in tokens has to be worth it.
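A quick back-of-the-envelope on that trade-off, with made-up token counts (it also ignores output tokens and the one-time fine-tuning cost): dropping the few-shot preamble only pays off if the preamble is a big enough share of each request.

```python
# Illustrative break-even arithmetic (invented numbers, not real pricing).
# Base model: preamble resent with every row, at 1 price unit per token.
# Fine-tuned model: no preamble needed, but ~3x the price per token.

def base_cost(preamble_tokens, row_tokens):
    """Per-row input cost on the base model: preamble + row, 1 unit/token."""
    return preamble_tokens + row_tokens

def finetuned_cost(row_tokens, multiplier=3):
    """Per-row input cost on the fine-tuned model: row only, 3x unit price."""
    return multiplier * row_tokens

# Fine-tuning wins per row when preamble + row > 3 * row, i.e. when the
# preamble is more than twice as long as the row text itself.
preamble, row = 500, 100
print(base_cost(preamble, row), finetuned_cost(row))  # base: 600, fine-tuned: 300
```

So with a 500-token preamble and 100-token rows, fine-tuning halves the per-row input cost in this toy example; with short preambles it would be a loss.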

ron_dk
Level 2
Author

Thanks for your answer, Adrien, very helpful.

Disappointing to hear that there is no batching on OpenAI's end, but good to have the clarification. I'm reassured that at least I'm not wasting resources simply because I didn't tick some box in the setup.

Grateful to be able to use the free Dataiku version!

