Batch operations
Modules concerning the batch operations of an LM invoker.
AnthropicBatchOperations(invoker)
Bases: BatchOperations
Handles batch operations for the AnthropicLMInvoker class.
Examples:

Invoke the language model in batch mode:

`results = await lm_invoker.batch.invoke(...)`

Standalone batch operations:

1. Create a batch job: `batch_id = await lm_invoker.batch.create(...)`
2. Get the status of a batch job: `status = await lm_invoker.batch.status(batch_id)`
3. Retrieve the results of a batch job: `results = await lm_invoker.batch.retrieve(batch_id)`
4. List the batch jobs: `batch_jobs = await lm_invoker.batch.list()`
5. Cancel a batch job: `await lm_invoker.batch.cancel(batch_id)`
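The standalone operations above can be combined into a simple submit, poll, retrieve loop. The sketch below is illustrative, not the library's own `invoke` implementation: it assumes an object exposing the async `create`/`status`/`retrieve` methods documented here, and the terminal-state check is a placeholder that should be adjusted to the real `BatchStatus` values.

```python
import asyncio


async def run_batch(batch, requests, poll_seconds=30.0):
    """Submit requests, poll until the job leaves a pending state, then fetch results.

    `batch` is assumed to expose the async create/status/retrieve methods
    documented here; `requests` maps request IDs to language-model inputs.
    """
    batch_id = await batch.create(requests)
    while True:
        status = await batch.status(batch_id)
        # The set of non-terminal states below is an assumption; compare
        # against the real BatchStatus values in your installation.
        if str(status).lower() not in ("pending", "validating", "in_progress"):
            break
        await asyncio.sleep(poll_seconds)
    return await batch.retrieve(batch_id)
```

In practice, prefer `lm_invoker.batch.invoke(...)`, which performs this orchestration for you with retry logic.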
Initializes the batch operations.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `invoker` | `AnthropicLMInvoker` | The `AnthropicLMInvoker` to use for the batch operations. | *required* |
cancel(batch_id)
async
Cancels a batch job.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `batch_id` | `str` | The ID of the batch job to cancel. | *required* |
create(requests, hyperparameters=None)
async
Creates a new batch job.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `requests` | `dict[str, LMInput]` | The dictionary of requests, mapping each request ID to a request. Each request must be a valid input for the language model: a list of `Message` objects is used as is; a list of `MessageContent` objects or a string is converted into a user message. | *required* |
| `hyperparameters` | `dict[str, Any] \| None` | A dictionary of hyperparameters for the language model. Defaults to `None`, in which case the default hyperparameters are used. | `None` |

Returns:

| Type | Description |
|---|---|
| `str` | The ID of the batch job. |
list()
async
Lists the batch jobs.
Returns:

| Type | Description |
|---|---|
| `list[dict[str, Any]]` | The list of batch jobs. |
retrieve(batch_id, **kwargs)
async
Retrieves the results of a batch job.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `batch_id` | `str` | The ID of the batch job to get the results of. | *required* |
| `**kwargs` | `Any` | Additional keyword arguments. | `{}` |

Returns:

| Type | Description |
|---|---|
| `dict[str, LMOutput]` | The results of the batch job. |
status(batch_id)
async
Gets the status of a batch job.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `batch_id` | `str` | The ID of the batch job to get the status of. | *required* |

Returns:

| Type | Description |
|---|---|
| `BatchStatus` | The status of the batch job. |
BatchOperations(invoker)
Handles batch operations for an LM invoker.
Examples:

Invoke the language model in batch mode:

`results = await lm_invoker.batch.invoke(...)`

Standalone batch operations:

1. Create a batch job: `batch_id = await lm_invoker.batch.create(...)`
2. Get the status of a batch job: `status = await lm_invoker.batch.status(batch_id)`
3. Retrieve the results of a batch job: `results = await lm_invoker.batch.retrieve(batch_id)`
4. List the batch jobs: `batch_jobs = await lm_invoker.batch.list()`
5. Cancel a batch job: `await lm_invoker.batch.cancel(batch_id)`
Initializes the batch operations.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `invoker` | `BaseLMInvoker` | The LM invoker to use for the batch operations. | *required* |
cancel(batch_id)
async
Cancels a batch job.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `batch_id` | `str` | The ID of the batch job to cancel. | *required* |

Raises:

| Type | Description |
|---|---|
| `NotImplementedError` | The batch operation is not supported. |
create(requests, hyperparameters=None)
async
Creates a new batch job.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `requests` | `dict[str, LMInput]` | The dictionary of requests, mapping each request ID to a request. Each request must be a valid input for the language model: a list of `Message` objects is used as is; a list of `MessageContent` objects or a string is converted into a user message. | *required* |
| `hyperparameters` | `dict[str, Any] \| None` | A dictionary of hyperparameters for the language model. Defaults to `None`, in which case the default hyperparameters are used. | `None` |

Returns:

| Type | Description |
|---|---|
| `str` | The ID of the batch job. |

Raises:

| Type | Description |
|---|---|
| `NotImplementedError` | The batch operation is not supported. |
invoke(requests, hyperparameters=None, status_check_interval=DEFAULT_STATUS_CHECK_INTERVAL, max_iterations=None)
async
Invokes the language model in batch mode.
This method orchestrates the entire batch invocation process:

1. Creating a batch job.
2. Iteratively checking the status of the batch job until it finishes.
3. Retrieving the results of the batch job.

The method includes retry logic with exponential backoff for transient failures.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `requests` | `dict[str, LMInput]` | The dictionary of requests, mapping each request ID to a request. Each request must be a valid input for the language model: a list of `Message` objects is used as is; a list of `MessageContent` objects or a string is converted into a user message. | *required* |
| `hyperparameters` | `dict[str, Any] \| None` | A dictionary of hyperparameters for the language model. Defaults to `None`, in which case the default hyperparameters are used. | `None` |
| `status_check_interval` | `float` | The interval in seconds between status checks of the batch job. Defaults to `DEFAULT_STATUS_CHECK_INTERVAL`. | `DEFAULT_STATUS_CHECK_INTERVAL` |
| `max_iterations` | `int \| None` | The maximum number of status checks for the batch job. Defaults to `None`, in which case the number of checks is unlimited. | `None` |

Returns:

| Type | Description |
|---|---|
| `dict[str, LMOutput]` | The results of the batch job. |

Raises:

| Type | Description |
|---|---|
| `CancelledError` | If the invocation is cancelled. |
| `TimeoutError` | If the invocation times out. |
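A bound on the worst-case polling time follows from the two knobs: the loop waits `status_check_interval` seconds between checks and stops after `max_iterations` checks, so the maximum time spent polling is roughly their product. A hedged usage sketch under that assumption (the invoker construction and the helper name are illustrative, not part of the library):

```python
async def invoke_with_budget(lm_invoker, requests, budget_seconds=3600.0, interval=60.0):
    """Call batch.invoke with max_iterations derived from a wall-clock budget.

    max_iterations * status_check_interval approximates the longest time the
    status-polling loop may run before timing out.
    """
    max_iterations = max(1, int(budget_seconds // interval))
    return await lm_invoker.batch.invoke(
        requests,
        status_check_interval=interval,
        max_iterations=max_iterations,
    )
```

With a one-hour budget and a 60-second interval, this allows at most 60 status checks before the invocation times out.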
list()
async
Lists the batch jobs.
Returns:

| Type | Description |
|---|---|
| `list[dict[str, Any]]` | The list of batch jobs. |

Raises:

| Type | Description |
|---|---|
| `NotImplementedError` | The batch operation is not supported. |
retrieve(batch_id, **kwargs)
async
Retrieves the results of a batch job.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `batch_id` | `str` | The ID of the batch job to get the results of. | *required* |
| `**kwargs` | `Any` | Additional keyword arguments. | `{}` |

Returns:

| Type | Description |
|---|---|
| `dict[str, LMOutput]` | The results of the batch job. |

Raises:

| Type | Description |
|---|---|
| `NotImplementedError` | The batch operation is not supported. |
status(batch_id)
async
Gets the status of a batch job.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `batch_id` | `str` | The ID of the batch job to get the status of. | *required* |

Returns:

| Type | Description |
|---|---|
| `BatchStatus` | The status of the batch job. |

Raises:

| Type | Description |
|---|---|
| `NotImplementedError` | The batch operation is not supported. |
GoogleBatchOperations(invoker)
Bases: BatchOperations
Handles batch operations for the GoogleLMInvoker class.
Examples:

Invoke the language model in batch mode:

`results = await lm_invoker.batch.invoke(...)`

Standalone batch operations:

1. Create a batch job: `batch_id = await lm_invoker.batch.create(...)`
2. Get the status of a batch job: `status = await lm_invoker.batch.status(batch_id)`
3. Retrieve the results of a batch job: `results = await lm_invoker.batch.retrieve(batch_id)`
4. List the batch jobs: `batch_jobs = await lm_invoker.batch.list()`
5. Cancel a batch job: `await lm_invoker.batch.cancel(batch_id)`
Initializes the batch operations.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `invoker` | `GoogleLMInvoker` | The `GoogleLMInvoker` to use for the batch operations. | *required* |
cancel(batch_id)
async
Cancels a batch job.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `batch_id` | `str` | The ID of the batch job to cancel. | *required* |
create(requests, hyperparameters=None)
async
Creates a new batch job.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `requests` | `dict[str, LMInput]` | The dictionary of requests, mapping each request ID to a request. Each request must be a valid input for the language model: a list of `Message` objects is used as is; a list of `MessageContent` objects or a string is converted into a user message. | *required* |
| `hyperparameters` | `dict[str, Any] \| None` | A dictionary of hyperparameters for the language model. Defaults to `None`, in which case the default hyperparameters are used. | `None` |

Returns:

| Type | Description |
|---|---|
| `str` | The ID of the batch job. |
list()
async
Lists the batch jobs.
Returns:

| Type | Description |
|---|---|
| `list[dict[str, Any]]` | The list of batch jobs. |
retrieve(batch_id, **kwargs)
async
Retrieves the results of a batch job.
Note that due to Google SDK limitations with inline batch requests, the original request IDs are not preserved in the results. Instead, the results are keyed by request index ("1", "2", etc.), with the order preserved from the original requests. To use custom request IDs, pass them as a list of strings via the `custom_request_ids` keyword argument.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `batch_id` | `str` | The ID of the batch job to get the results of. | *required* |
| `**kwargs` | `Any` | Additional keyword arguments to retrieve batch results. | `{}` |

Returns:

| Type | Description |
|---|---|
| `dict[str, LMOutput]` | The results of the batch job. Keyed by the provided `custom_request_ids`; if not provided, keyed by numeric string indices ("1", "2", etc.) in the original request order. |
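Since inline Google batch results come back keyed by positional indices when `custom_request_ids` is not supplied, a small helper can restore the caller's own IDs by order. This sketch assumes, as described in the note above, that the index keys count from "1" in the original request order; the helper name is illustrative:

```python
def remap_results(results, request_ids):
    """Re-key index-keyed batch results ("1", "2", ...) to caller-chosen IDs.

    `request_ids` must list the IDs in the original request order.
    """
    if len(results) != len(request_ids):
        raise ValueError("ID count does not match result count")
    return {
        request_ids[int(index) - 1]: output
        for index, output in results.items()
    }
```

For example, `remap_results({"1": out_a, "2": out_b}, ["greeting", "farewell"])` returns the outputs keyed by `"greeting"` and `"farewell"`.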
status(batch_id)
async
Gets the status of a batch job.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `batch_id` | `str` | The ID of the batch job to get the status of. | *required* |

Returns:

| Type | Description |
|---|---|
| `BatchStatus` | The status of the batch job. |
OpenAIBatchOperations(invoker)
Bases: BatchOperations
Handles batch operations for the OpenAILMInvoker class.
Examples:

Invoke the language model in batch mode:

`results = await lm_invoker.batch.invoke(...)`

Standalone batch operations:

1. Create a batch job: `batch_id = await lm_invoker.batch.create(...)`
2. Get the status of a batch job: `status = await lm_invoker.batch.status(batch_id)`
3. Retrieve the results of a batch job: `results = await lm_invoker.batch.retrieve(batch_id)`
4. List the batch jobs: `batch_jobs = await lm_invoker.batch.list()`
5. Cancel a batch job: `await lm_invoker.batch.cancel(batch_id)`
Initializes the batch operations.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `invoker` | `OpenAILMInvoker` | The `OpenAILMInvoker` to use for the batch operations. | *required* |
cancel(batch_id)
async
Cancels a batch job.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `batch_id` | `str` | The ID of the batch job to cancel. | *required* |
create(requests, hyperparameters=None)
async
Creates a new batch job.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `requests` | `dict[str, LMInput]` | The dictionary of requests, mapping each request ID to a request. Each request must be a valid input for the language model: a list of `Message` objects is used as is; a list of `MessageContent` objects or a string is converted into a user message. | *required* |
| `hyperparameters` | `dict[str, Any] \| None` | A dictionary of hyperparameters for the language model. Defaults to `None`, in which case the default hyperparameters are used. | `None` |

Returns:

| Type | Description |
|---|---|
| `str` | The ID of the batch job. |
list()
async
Lists the batch jobs.
Returns:

| Type | Description |
|---|---|
| `list[dict[str, Any]]` | The list of batch jobs. |
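Because `list()` returns plain dictionaries (`list[dict[str, Any]]`) rather than typed objects, callers typically filter them by a status field. The sketch below is a minimal example; the `"status"` key it reads is an assumption, so check the actual payload returned by `lm_invoker.batch.list()` for the real field name:

```python
def jobs_with_status(batch_jobs, wanted):
    """Filter listed batch jobs by a status field.

    The "status" key used here is an assumption about the job-dict shape;
    inspect a real payload from lm_invoker.batch.list() before relying on it.
    """
    return [job for job in batch_jobs if job.get("status") == wanted]
```

This is useful, for instance, to find in-progress jobs that should be cancelled before shutting down.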
retrieve(batch_id, **kwargs)
async
Retrieves the results of a batch job.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `batch_id` | `str` | The ID of the batch job to get the results of. | *required* |
| `**kwargs` | `Any` | Additional keyword arguments. | `{}` |

Returns:

| Type | Description |
|---|---|
| `dict[str, LMOutput]` | The results of the batch job. |
status(batch_id)
async
Gets the status of a batch job.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `batch_id` | `str` | The ID of the batch job to get the status of. | *required* |

Returns:

| Type | Description |
|---|---|
| `BatchStatus` | The status of the batch job. |