Cut costs, not performance
An open-source, crowdsourced batch AI inference API optimized for massive AI workloads
Cheapest inference for...
Processing large amounts of data
Optimizing inference cost
Doing large-scale offline evaluations
Experimenting with offline workflows
Batch Inference
Upload File
Upload or drag and drop a .jsonl file, up to 30B.

Completion Window
The time frame within which the batch should be processed.
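The batch input is a .jsonl file with one request per line. A minimal Python sketch for assembling such a file is shown below; the field names (custom_id, model, messages) follow a common batch-request convention and are assumptions, since this page does not document the exact request schema.

import json

# Hypothetical batch requests; the exact schema expected by the API is an assumption.
batch_requests = [
    {"custom_id": "task-1", "model": "GPT-4o-mini",
     "messages": [{"role": "user", "content": "Summarize the following report."}]},
    {"custom_id": "task-2", "model": "Llama 3.3 70B Instruct",
     "messages": [{"role": "user", "content": "Translate this sentence to French."}]},
]

# JSONL: write one JSON object per line.
with open("batchinput.jsonl", "w") as f:
    for req in batch_requests:
        f.write(json.dumps(req) + "\n")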
API Key
Request Code
curl https://r.dria.co/https://example.com \
  -H "Authorization: Bearer dria_4e3f64d4586949e1945132f070732365lC82uQoa6TOMkshQVFE-fUah-bS0" \
  -F file="@batchinput.jsonl"
Coming soon
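For reference, the same upload can be sketched in Python with the requests library; the endpoint URL, header, and file field mirror the curl example above, while the placeholder key and the response handling are assumptions.

import requests

API_KEY = "dria_..."  # placeholder; substitute your own key

# Mirrors the curl example: POST the batch file with the Authorization header.
with open("batchinput.jsonl", "rb") as f:
    response = requests.post(
        "https://r.dria.co/https://example.com",  # URL as written in the curl example
        headers={"Authorization": f"Bearer {API_KEY}"},
        files={"file": f},
    )
print(response.status_code, response.text)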
Supported Models
Claude 3.7 Sonnet
Claude 3.5 Sonnet
Gemini 2.5 Pro Experimental
Gemini 2.0 Flash
Gemma 3 4B
Gemma 3 12B
Gemma 3 27B
GPT-4o-mini
GPT-4o
Llama 3.3 70B Instruct
Llama 3.1 8B Instruct
Llama 3.2 1B Instruct
Mistral NeMo
Join the community