LLM Batch requests

toppe
Registered Posts: 1 ✭✭
I am experiencing rate limit errors for my LLM requests.
Is there a simple method to batch my requests?
My thanks in advance.
Tom
Operating system used: Mac OS
Answers
Alexandru Dataiker, Dataiku DSS Core Designer, Dataiku DSS ML Practitioner, Dataiku DSS Adv Designer, Registered Posts: 1,269 Dataiker
Hi,
Currently, there is no direct way to batch requests; for example, in the prompt recipe, each row is sent to the LLM as a separate request.
You can adjust the connection settings to increase the number of retries, reduce parallelism, or try to obtain higher rate limits on the LLM provider's side.
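If you are calling the LLM from Python code rather than the prompt recipe, a common workaround is to throttle requests on the client side. The sketch below is only an illustration of that idea, not a Dataiku feature: `call_llm` is a hypothetical placeholder for whatever per-row call you make (for example, through the LLM Mesh Python API), and the retry counts, delays, and worker counts are arbitrary example values you would tune to your own rate limits.

```python
import time
import random
from concurrent.futures import ThreadPoolExecutor

# Placeholder: replace with your actual per-row LLM call
# (e.g. a completion sent through the LLM Mesh Python API).
def call_llm(prompt):
    raise NotImplementedError("plug in your own LLM call here")

def call_with_retries(prompt, max_retries=5, base_delay=2.0):
    """Retry a single request with exponential backoff on failures."""
    for attempt in range(max_retries):
        try:
            return call_llm(prompt)
        except Exception:  # narrow this to your provider's rate-limit error
            if attempt == max_retries - 1:
                raise
            # Exponential backoff with a little jitter so retries spread out
            delay = base_delay * (2 ** attempt) + random.uniform(0, 1)
            time.sleep(delay)

def run_prompts(prompts, max_workers=2):
    """Send prompts with limited parallelism instead of all at once."""
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        return list(pool.map(call_with_retries, prompts))
```

Lowering `max_workers` and raising the backoff delays trades throughput for fewer rate-limit errors, which mirrors the "reduce parallelism / increase retries" settings mentioned above.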
Thanks