
Request Processor

Modules for the request processor that performs model inference in Gen AI applications.

LMRequestProcessor(prompt_builder, lm_invoker)

A request processor that performs language model inference.

The LMRequestProcessor class handles the process of building a prompt and invoking a language model. It combines a prompt builder and a language model invoker to manage the inference process in Gen AI applications.

Attributes:

prompt_builder (PromptBuilder): The prompt builder used to format the prompt.

lm_invoker (BaseLMInvoker): The language model invoker that handles the model inference.

tool_dict (dict[str, Tool]): A dictionary of tools provided to the language model to enable tool calling, if any. Maps each tool's name to the tool itself.

Initializes a new instance of the LMRequestProcessor class.

Parameters:

prompt_builder (PromptBuilder, required): The prompt builder used to format the prompt.

lm_invoker (BaseLMInvoker, required): The language model invoker that handles the model inference.
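How the two components cooperate can be pictured with minimal stand-ins. The stub classes below (and their method names such as `format` and `invoke`) are hypothetical simplifications for illustration, not the library's real `PromptBuilder` or `BaseLMInvoker` APIs:

```python
# Illustrative stubs only: the real PromptBuilder and BaseLMInvoker
# interfaces may differ from these simplified stand-ins.

class StubPromptBuilder:
    def __init__(self, template: str):
        self.template = template

    def format(self, **kwargs) -> str:
        # Fill the template with the caller's keyword arguments.
        return self.template.format(**kwargs)


class StubLMInvoker:
    def invoke(self, prompt: str) -> str:
        # A fake model that just echoes the prompt back.
        return f"echo: {prompt}"


class StubRequestProcessor:
    def __init__(self, prompt_builder, lm_invoker):
        self.prompt_builder = prompt_builder
        self.lm_invoker = lm_invoker

    def process(self, **kwargs) -> str:
        prompt = self.prompt_builder.format(**kwargs)  # step 1: build the prompt
        return self.lm_invoker.invoke(prompt)          # step 2: invoke the model


processor = StubRequestProcessor(StubPromptBuilder("Summarize: {text}"), StubLMInvoker())
print(processor.process(text="hello"))  # → echo: Summarize: hello
```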

clear_response_schema()

Clears any response schema previously set on the LM invoker.

clear_tools()

Clears any tools previously set on the LM invoker.

process(history=None, extra_contents=None, hyperparameters=None, event_emitter=None, auto_execute_tools=True, max_lm_calls=5, **kwargs) async

Processes a language model inference request.

This method processes the language model inference request as follows:

1. Assemble the prompt using the provided keyword arguments.
2. Invoke the language model with the assembled prompt and optional hyperparameters.
3. If auto_execute_tools is True, automatically execute any tools requested by tool calls in the LM output.

Parameters:

history (list[Message] | None): A list of conversation history messages to be included in the prompt. Defaults to None.

extra_contents (list[MessageContent] | None): A list of extra contents to be included in the user message. Defaults to None.

hyperparameters (dict[str, Any] | None): A dictionary of hyperparameters for the model invocation. Defaults to None.

event_emitter (EventEmitter | None): An event emitter for streaming model outputs. Defaults to None.

auto_execute_tools (bool): Whether to automatically execute tools when the LM output includes tool calls. Defaults to True.

max_lm_calls (int): The maximum number of times the language model can be invoked when auto_execute_tools is True. Defaults to 5.

**kwargs (Any): Keyword arguments passed to format the prompt builder. Values must be either a string or an object that can be serialized to a string. The following reserved keyword arguments cannot be passed to the prompt builder: history, extra_contents, hyperparameters, event_emitter, auto_execute_tools, and max_lm_calls.

Returns:

LMOutput: The result of the language model invocation.
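The interaction between auto_execute_tools and max_lm_calls can be sketched as a loop. Everything below (`run_with_tools`, the message shape, the fake LM) is a hypothetical stand-in to illustrate the control flow, not the library's actual implementation:

```python
# Sketch of a tool-calling loop bounded by max_lm_calls. All names and
# data shapes here are illustrative assumptions, not the real API.

def run_with_tools(invoke_lm, tool_dict, prompt, max_lm_calls=5):
    """Invoke the LM, executing requested tools between calls, until the
    output contains no tool calls or the call budget is exhausted."""
    messages = [prompt]
    output = None
    for _ in range(max_lm_calls):
        output = invoke_lm(messages)                  # one LM invocation
        tool_calls = output.get("tool_calls", [])
        if not tool_calls:                            # plain answer: done
            return output
        for call in tool_calls:                       # execute each requested tool
            result = tool_dict[call["name"]](**call["args"])
            messages.append({"tool": call["name"], "result": result})
    return output                                     # budget exhausted


# Fake LM: requests one tool call, then answers with the tool's result.
def fake_lm(messages):
    if len(messages) == 1:
        return {"tool_calls": [{"name": "add", "args": {"a": 2, "b": 3}}]}
    return {"text": f"sum is {messages[-1]['result']}", "tool_calls": []}


out = run_with_tools(fake_lm, {"add": lambda a, b: a + b}, "what is 2+3?")
print(out["text"])  # → sum is 5
```

Note that two LM invocations were spent here: one that produced the tool call, and one that produced the final answer after the tool result was appended.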

set_response_schema(response_schema)

Sets the response schema for the LM invoker.

This method sets the response schema for the LM invoker. Any existing response schema will be replaced.

Parameters:

response_schema (ResponseSchema | None, required): The response schema to be used.

set_tools(tools)

Sets the tools for the LM invoker.

This method sets the tools for the LM invoker. Any existing tools will be replaced.

Parameters:

tools (list[Tool], required): The list of tools to be used.
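The replace semantics of set_tools, and the name-to-tool mapping kept in tool_dict, can be sketched with stand-ins. The Tool shape (a `name` attribute plus a callable) is an assumption for illustration:

```python
# Sketch of how set_tools / clear_tools could maintain tool_dict.
# StubTool and StubInvoker are hypothetical, not the library's classes.
from dataclasses import dataclass
from typing import Callable


@dataclass
class StubTool:
    name: str
    func: Callable


class StubInvoker:
    def __init__(self):
        self.tool_dict = {}

    def set_tools(self, tools):
        # Existing tools are replaced, per the documented behavior.
        self.tool_dict = {tool.name: tool for tool in tools}

    def clear_tools(self):
        self.tool_dict = {}


inv = StubInvoker()
inv.set_tools([StubTool("search", lambda q: q)])
inv.set_tools([StubTool("add", lambda a, b: a + b)])  # replaces, not appends
print(list(inv.tool_dict))  # → ['add']
```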

UsesLM

A mixin to be extended by components that use LMRequestProcessor.

Components that extend this mixin must have a constructor that accepts an LMRequestProcessor instance as its first argument.

LM-based components can be categorized into two types:

1. Components that do not utilize structured output.
2. Components that utilize structured output.

Building a component without structured output

As defined above, the component must accept an LMRequestProcessor instance as its first constructor argument, e.g.:

class LMBasedComponent(Component, UsesLM):
    def __init__(self, lm_request_processor: LMRequestProcessor, custom_kwarg: str):
        self.lm_request_processor = lm_request_processor
        self.custom_kwarg = custom_kwarg

Using the from_lm_components method provided by this mixin, the component can be instantiated as follows:

component = LMBasedComponent.from_lm_components(
    prompt_builder,
    lm_invoker,
    custom_kwarg="custom_value",
)
Building a component with structured output

When the component utilizes structured output, the _parse_structured_output method can be used to simplify the process of extracting the structured output in the component's runtime methods, e.g.:

class LMBasedComponent(Component, UsesLM):
    def __init__(self, lm_request_processor: LMRequestProcessor, custom_kwarg: str):
        self.lm_request_processor = lm_request_processor
        self.custom_kwarg = custom_kwarg

    async def runtime_method(self, param1: str, param2: str) -> str:
        lm_output = await self.lm_request_processor.process(param1=param1, param2=param2)
        return self._parse_structured_output(lm_output, "target_key", "fallback_output")

Notice that in the above example, the LMRequestProcessor is configured to take param1 and param2 as keyword arguments and output a structured output that contains the target_key key. Hence, these conditions must be fulfilled when instantiating the component.
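The guarded extraction that _parse_structured_output performs might look roughly like the sketch below. This is a hypothetical reconstruction of the helper's behavior (including the `structured_output` attribute name); the real implementation may differ:

```python
# Hypothetical sketch: return target_key from the structured output when
# present, otherwise fall back. Not the library's actual helper.

def parse_structured_output(lm_output, target_key, fallback):
    structured = getattr(lm_output, "structured_output", None) or {}
    if isinstance(structured, dict) and target_key in structured:
        return structured[target_key]
    return fallback  # schema missing or key absent


class FakeOutput:
    structured_output = {"target_key": "parsed value"}


print(parse_structured_output(FakeOutput(), "target_key", "fallback"))  # → parsed value
print(parse_structured_output(object(), "target_key", "fallback"))     # → fallback
```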

This mixin also provides the with_structured_output method to simplify the process of instantiating the component with structured output. Let's take a look at an example that meets the above conditions:

class Schema(BaseModel):
    target_key: str

component = LMBasedComponent.with_structured_output(
    model_id="openai/gpt-4.1-mini",
    response_schema=Schema,
    system_template="system_template {param1}",
    user_template="user_template {param2}",
    custom_kwarg="custom_value",
)
Building a structured output preset

If desired, the component can also define a quick preset. This can be done by providing default prompts and a response schema. Here's an example:

class Schema(BaseModel):
    target_key: str

class LMBasedComponent(Component, UsesLM):
    @classmethod
    def from_preset(cls, model_id: str, custom_kwarg: str) -> "LMBasedComponent":
        return cls.with_structured_output(
            model_id=model_id,
            response_schema=Schema,
            system_template=PRESET_SYSTEM_TEMPLATE,
            user_template=PRESET_USER_TEMPLATE,
            custom_kwarg=custom_kwarg,
        )

Then, the preset can be instantiated as follows:

component = LMBasedComponent.from_preset(
    model_id="openai/gpt-4.1-mini",
    custom_kwarg="custom_value",
)

from_lm_components(prompt_builder, lm_invoker, **kwargs) classmethod

Creates an instance from LMRequestProcessor components directly.

This method is a shortcut to initialize the class by providing the LMRequestProcessor components directly.

Parameters:

prompt_builder (PromptBuilder, required): The prompt builder used to format the prompt.

lm_invoker (BaseLMInvoker, required): The language model invoker that handles the model inference.

**kwargs (Any): Additional keyword arguments to be passed to the class constructor.

Returns:

UsesLM: An instance of the class that mixes in this mixin.

with_structured_output(model_id, response_schema, system_template='', user_template='', **kwargs) classmethod

Creates an instance with structured output configuration.

This method is a shortcut to initialize the class with structured output configuration.

Parameters:

model_id (str, required): The model ID of the language model.

response_schema (type[BaseModel], required): The response schema of the language model.

system_template (str): The system template of the language model. Defaults to an empty string.

user_template (str): The user template of the language model. Defaults to an empty string.

**kwargs (Any): Additional keyword arguments to be passed to the class constructor.

Returns:

UsesLM: An instance of the class that mixes in this mixin, with structured output configured.