Cut costs, not performance
An open-source, crowdsourced batch AI inference API optimized for massive AI workloads
Cheapest inference for:
Processing large amounts of data
Optimizing inference cost
Doing large-scale offline evaluations
Experimenting with offline workflows
Batch Inference
Upload File
Select a file or drag and drop (.jsonl, up to 30MB)
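The exact schema of the batch input file isn't shown on this page, so the sketch below is an illustration only: it writes a hypothetical .jsonl file in which each line is one request with assumed "model" and "messages" fields. The model names are copied from the Supported Models list further down; the identifiers the API actually expects may differ, so check the docs for the real format.

import json

# Hypothetical batch input: the field names ("model", "messages") are
# assumptions for illustration, not the documented Dria schema.
requests_batch = [
    {"model": "gemma3 4b", "messages": [{"role": "user", "content": "Summarize this paragraph."}]},
    {"model": "GPT-4o-mini", "messages": [{"role": "user", "content": "Translate to French: hello"}]},
]

# JSONL means one JSON object per line; keep the resulting file under the 30MB limit.
with open("batch_input.jsonl", "w") as f:
    for req in requests_batch:
        f.write(json.dumps(req) + "\n")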

API Key
Request Code
curl --location --request GET 'https://mainnet.dkn.dria.co/api/v0/batch/complete_upload' \
--header 'Content-Type: application/json' \
--header 'x-api-key: dria_4d6245da4f3e9bb07f8585c4a9882f94DkZgL8meQSzeaV-r6T3H6JKhcQVfgJyA' \
--data '{
  "id": "put-file-id-here"
}'
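For reference, here is a minimal Python sketch of the same request using the requests library. The endpoint URL, headers, and JSON body are taken directly from the curl example above; the API key and file id are placeholders you would replace with your own values.

import requests

# Same endpoint as the curl example above.
URL = "https://mainnet.dkn.dria.co/api/v0/batch/complete_upload"
API_KEY = "dria_xxxxxxxxxxxxxxxxxxxxxxxxxxxx"  # placeholder: your Dria API key

response = requests.get(
    URL,
    headers={
        "Content-Type": "application/json",
        "x-api-key": API_KEY,
    },
    # Placeholder id, presumably the id of the uploaded .jsonl file.
    json={"id": "put-file-id-here"},
)
response.raise_for_status()
print(response.json())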
View Docs
Supported Models
Claude 3.7 Sonnet
Claude 3.5 Sonnet
Gemini 2.5 Pro Experimental
Gemini 2.0 Flash
Gemma 3 4B
Gemma 3 12B
Gemma 3 27B
GPT-4o-mini
GPT-4o
Llama 3.3 70B Instruct
Llama 3.1 8B Instruct
Llama 3.2 1B Instruct
Mistral Nemo
Join the community