Terminator step
A step that connects previous steps to the END node.
`TerminatorStep(name, retry_config=None, error_handler=None, cache_store=None, cache_config=None)`

Bases: `BasePipelineStep`
A step that connects previous steps to the END node.
This step is useful when you want to explicitly terminate a branch or the entire pipeline. It has no processing logic and simply acts as a connection point to the END node.
Example:

```python
pipeline = (
    step_a
    | ConditionalStep(
        name="branch",
        branches={
            "terminate": TerminatorStep("early_end"),
            "continue": step_b,
        },
        condition=lambda x: "terminate" if x["should_stop"] else "continue",
    )
    | step_c
)
```
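The `condition` argument above is an ordinary callable that maps the current state to a branch name. A minimal sketch of that routing decision in plain Python (the state-dict shape is an assumption for illustration):

```python
# Hypothetical routing condition mirroring the example above:
# it inspects the pipeline state and returns a branch name.
def route(state: dict) -> str:
    return "terminate" if state["should_stop"] else "continue"

branch_a = route({"should_stop": True})   # routes to TerminatorStep("early_end")
branch_b = route({"should_stop": False})  # routes to step_b
```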
Attributes:

| Name | Type | Description |
|---|---|---|
| `name` | `str` | A unique identifier for this pipeline step. |
| `retry_policy` | `RetryPolicy \| None` | Configuration for retry behavior using LangGraph's `RetryPolicy`. |
Initializes a new pipeline step.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `name` | `str` | A unique identifier for the pipeline step. | *required* |
| `retry_config` | `RetryConfig \| None` | Configuration for retry behavior using GLLM Core's `RetryConfig`. Defaults to None, in which case no retry config is applied. The `RetryConfig` is automatically converted to LangGraph's `RetryPolicy` for internal use. | `None` |
| `error_handler` | `BaseStepErrorHandler \| None` | Strategy to handle errors during execution. Defaults to None, in which case the `RaiseStepErrorHandler` is used. | `None` |
| `cache_store` | `BaseCache \| None` | The cache store to use for caching step results. Defaults to None, in which case no caching is used. | `None` |
| `cache_config` | `dict[str, Any] \| None` | Configuration for the cache store:<br>1. `key_func`: A function to generate cache keys. If None, the cache instance uses its own key function.<br>2. `name`: The name of the cache. If None, the name defaults to `"step_{step_name}"`.<br>3. `ttl`: The time-to-live for cached entries. If None, the cache has no TTL.<br>4. `matching_strategy`: The strategy for matching cache keys. If None, `"exact"` is used.<br>5. `matching_config`: Configuration for the matching strategy. If None, the strategy's default configuration is used. | `None` |
Caching Mechanism

When a `cache_store` is provided, the step's execution method is automatically wrapped with a cache decorator. This means:

1. Before execution, the cache is checked for existing results based on input parameters.
2. If a cached result exists and is valid, it is returned immediately.
3. If no cached result exists, the step executes normally and the result is cached.
4. Cache keys are generated from the step's input state and configuration.
5. The cache name defaults to `"step_{step_name}"` if not specified.
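The wrapping described above can be sketched in pure Python. This is not the library's implementation; `SimpleCache` and `cached_execute` are hypothetical names, and the key derivation is a simplified stand-in:

```python
import functools

class SimpleCache:
    """Hypothetical in-memory cache standing in for a real cache store."""
    def __init__(self):
        self._data = {}
    def get(self, key):
        return self._data.get(key)
    def set(self, key, value):
        self._data[key] = value

def cached_execute(cache, name=None, key_func=None):
    """Sketch of the decorator: check the cache first, else execute and store."""
    def decorator(func):
        # Default cache name follows the documented "step_{step_name}" pattern.
        cache_name = name or f"step_{func.__name__}"
        @functools.wraps(func)
        def wrapper(state):
            # Key derived from the input state (custom key_func takes precedence).
            key = (cache_name, key_func(state) if key_func else repr(sorted(state.items())))
            hit = cache.get(key)
            if hit is not None:       # cached result exists: return immediately
                return hit
            result = func(state)      # otherwise execute normally ...
            cache.set(key, result)    # ... and cache the result
            return result
        return wrapper
    return decorator

cache = SimpleCache()
calls = []

@cached_execute(cache)
def my_step(state):
    calls.append(state)
    return {**state, "done": True}

first = my_step({"x": 1})
second = my_step({"x": 1})  # served from cache; the body runs only once
```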
`add_to_graph(graph, previous_endpoints, retry_policy=None)`
Adds this step to the graph and connects it to the END node.
This method is used by Pipeline to manage the pipeline's execution flow and should not be called directly by users.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `graph` | `StateGraph` | The graph to add this step to. | *required* |
| `previous_endpoints` | `list[str]` | The endpoints from previous steps to connect to. | *required* |
| `retry_policy` | `RetryPolicy \| None` | Configuration for retry behavior using LangGraph's `RetryPolicy`. If None, the step's own retry policy is used. If the step is not a retryable step, this parameter is ignored. | `None` |
Returns:

| Type | Description |
|---|---|
| `list[str]` | An empty list, as this step has no endpoints (it terminates the flow). |
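As a rough illustration of that wiring, the sketch below uses a toy graph class (not LangGraph's actual `StateGraph`; `ToyGraph`, `terminator_add_to_graph`, and the `END` stand-in are all hypothetical): every previous endpoint is connected to the terminator node, the terminator is connected to END, and no endpoints are returned.

```python
END = "__end__"  # stand-in for LangGraph's END sentinel

class ToyGraph:
    """Hypothetical minimal graph: just records nodes and edges."""
    def __init__(self):
        self.nodes, self.edges = [], []
    def add_node(self, name):
        self.nodes.append(name)
    def add_edge(self, src, dst):
        self.edges.append((src, dst))

def terminator_add_to_graph(graph, name, previous_endpoints):
    """Sketch of the terminator's wiring: connect previous endpoints to END."""
    graph.add_node(name)                  # the pass-through node itself
    for endpoint in previous_endpoints:
        graph.add_edge(endpoint, name)    # connect incoming branches
    graph.add_edge(name, END)             # terminate the flow
    return []                             # no endpoints: nothing continues past END

g = ToyGraph()
endpoints = terminator_add_to_graph(g, "early_end", ["branch"])
```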
`execute(state, runtime)` *async*

Executes this step, which does nothing but pass the state through unchanged.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `state` | `PipelineState` | The current pipeline state. | *required* |
| `runtime` | `Runtime[dict[str, Any] \| BaseModel]` | The runtime information. | *required* |
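The pass-through behavior can be sketched as a standalone coroutine (a hypothetical illustration, not the library's actual method body; the state dict is an assumption):

```python
import asyncio

async def execute(state, runtime):
    """Sketch: a terminator does no processing; it returns the state unchanged."""
    return state

state = {"result": 42}
out = asyncio.run(execute(state, runtime=None))
```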