
GLLM Inference

This documentation covers the GLLM Inference library, which is part of the GLLM Monorepo.

API Reference

The following modules are available in this library (see the usage sketch after the list):

  • Builder: Modules concerning the builder utilities of GLLM Inference modules.
  • Catalog: Modules concerning the catalog for managing and loading prompt builders used in Gen AI applications.
  • Constants: Contains constants used throughout the gllm_inference package.
  • Em Invoker: Modules concerning the embedding model invokers used in Gen AI applications.
  • Exceptions: Provides custom exception classes, error handling, and parsing utilities.
  • Lm Invoker: Modules concerning the language model invokers used in Gen AI applications.
  • Model: Defines the model name constants.
  • Output Parser: Modules concerning the output parsers used in Gen AI applications.
  • Prompt Builder: Modules concerning the prompt builders used in Gen AI applications.
  • Prompt Formatter: Modules concerning the prompt formatters used in Gen AI applications.
  • Realtime Chat: [BETA] Modules concerning realtime chat, used to interact with realtime chat models.
  • Request Processor: Modules concerning the request processors used to perform model inference in Gen AI applications.
  • Schema: Modules concerning the schema of GLLM Inference modules.
  • Utils: Defines utilities for gllm_inference.
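
As an orientation to how these modules can compose, here is a minimal sketch that wires a prompt builder, a language model invoker, and an output parser into a single inference call. The class names (PromptBuilder, OpenAILMInvoker, JSONOutputParser), their methods, and their parameters are illustrative assumptions rather than the library's confirmed API; consult the module pages above for the actual names and signatures.

    # Illustrative sketch only: the class names, methods, and parameters below
    # are assumptions for orientation, not the confirmed gllm_inference API.
    from gllm_inference.prompt_builder import PromptBuilder    # assumed class name
    from gllm_inference.lm_invoker import OpenAILMInvoker      # assumed class name
    from gllm_inference.output_parser import JSONOutputParser  # assumed class name

    # Build a prompt from templates and runtime variables.
    prompt_builder = PromptBuilder(
        system_template="You are a concise assistant.",
        user_template="Summarize the following text:\n{text}",
    )

    # Invoke a language model, then parse the raw completion into structured output.
    lm_invoker = OpenAILMInvoker(model_name="gpt-4o-mini")  # assumed parameter
    output_parser = JSONOutputParser()

    async def summarize(text: str) -> dict:
        prompt = prompt_builder.format(text=text)       # assumed method
        raw_response = await lm_invoker.invoke(prompt)  # assumed method
        return output_parser.parse(raw_response)        # assumed method

In this layout the prompt builder, invoker, and parser stay independent, so a different model provider or output format can be swapped in by replacing only the corresponding module.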