Create batch
post https://api.openai.com/v1/batches
Creates and executes a batch from an uploaded file of requests.
Request body
input_file_id (string, Required)
The ID of an uploaded file that contains requests for the new batch.
See upload file for how to upload a file.
Your input file must be formatted as a JSONL file, and must be uploaded with the purpose batch. The file can contain up to 50,000 requests, and can be up to 200 MB in size.
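As a hedged sketch (the file name, custom_id, and model below are illustrative placeholders, not part of this reference), each line of the input file is one self-contained request, and the file is uploaded through the Files endpoint with purpose batch before the batch is created:
# One line of batch_input.jsonl (each line is a complete request):
{"custom_id": "request-1", "method": "POST", "url": "/v1/chat/completions", "body": {"model": "gpt-4o-mini", "messages": [{"role": "user", "content": "Hello world!"}]}}
# Upload the JSONL file with purpose "batch"; the returned file ID becomes input_file_id:
curl https://api.openai.com/v1/files \
  -H "Authorization: Bearer $OPENAI_API_KEY" \
  -F purpose="batch" \
  -F file="@batch_input.jsonl"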
endpoint (string, Required)
The endpoint to be used for all requests in the batch. Currently /v1/chat/completions, /v1/embeddings, and /v1/completions are supported. Note that /v1/embeddings batches are also restricted to a maximum of 50,000 embedding inputs across all requests in the batch.
completion_window (string, Required)
The time frame within which the batch should be processed. Currently only 24h is supported.
Returns
The created Batch object.
Example request
curl https://api.openai.com/v1/batches \
  -H "Authorization: Bearer $OPENAI_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "input_file_id": "file-abc123",
    "endpoint": "/v1/chat/completions",
    "completion_window": "24h"
  }'
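The example response below includes a metadata object. As a hedged sketch (the keys simply mirror that response and are assumptions, not required request fields), optional metadata can be attached when the batch is created:
curl https://api.openai.com/v1/batches \
  -H "Authorization: Bearer $OPENAI_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "input_file_id": "file-abc123",
    "endpoint": "/v1/chat/completions",
    "completion_window": "24h",
    "metadata": {
      "customer_id": "user_123456789",
      "batch_description": "Nightly eval job"
    }
  }'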
Response
{
  "id": "batch_abc123",
  "object": "batch",
  "endpoint": "/v1/chat/completions",
  "errors": null,
  "input_file_id": "file-abc123",
  "completion_window": "24h",
  "status": "validating",
  "output_file_id": null,
  "error_file_id": null,
  "created_at": 1711471533,
  "in_progress_at": null,
  "expires_at": null,
  "finalizing_at": null,
  "completed_at": null,
  "failed_at": null,
  "expired_at": null,
  "cancelling_at": null,
  "cancelled_at": null,
  "request_counts": {
    "total": 0,
    "completed": 0,
    "failed": 0
  },
  "metadata": {
    "customer_id": "user_123456789",
    "batch_description": "Nightly eval job"
  }
}