# Model id

Defines the model id related schemas for the `gllm_inference` package.
## ModelId

Bases: `BaseModel`

Defines a representation of a valid model id.
Attributes:

| Name | Type | Description |
|---|---|---|
| `provider` | `ModelProvider` | The provider of the model. |
| `name` | `str \| None` | The name of the model. |
| `path` | `str \| None` | The path of the model. |
### Provider-specific examples

```python
# Anthropic
model_id = ModelId.from_string("anthropic/claude-sonnet-4-20250514")

# Bedrock
model_id = ModelId.from_string("bedrock/us.anthropic.claude-sonnet-4-20250514-v1:0")

# Cohere
model_id = ModelId.from_string("cohere/embed-english-v3.0")

# Cohere with a custom endpoint
model_id = ModelId.from_string("cohere/https://my-cohere-url:8000/v1:my-model-name")

# Datasaur
model_id = ModelId.from_string("datasaur/https://deployment.datasaur.ai/api/deployment/teamId/deploymentId/")

# Google
model_id = ModelId.from_string("google/gemini-2.5-flash-lite")

# Jina
model_id = ModelId.from_string("jina/jina-embeddings-v2-large")

# Jina with a custom endpoint
model_id = ModelId.from_string("jina/https://my-jina-url:8000/v1:my-model-name")

# OpenAI
model_id = ModelId.from_string("openai/gpt-5-nano")

# OpenAI with the Chat Completions API
model_id = ModelId.from_string("openai-chat-completions/gpt-5-nano")

# OpenAI Responses API-compatible endpoints (e.g. SGLang)
model_id = ModelId.from_string("openai/https://my-sglang-url:8000/v1:my-model-name")

# OpenAI Chat Completions API-compatible endpoints (e.g. Groq)
model_id = ModelId.from_string("openai-chat-completions/https://api.groq.com/openai/v1:llama3-8b-8192")

# Azure OpenAI
model_id = ModelId.from_string("azure-openai/https://my-resource.openai.azure.com/openai/v1:my-deployment")

# Voyage
model_id = ModelId.from_string("voyage/voyage-3.5-lite")

# TwelveLabs
model_id = ModelId.from_string("twelvelabs/Marengo-retrieval-2.7")

# LangChain (supported providers: https://python.langchain.com/docs/integrations/chat/#featured-providers)
model_id = ModelId.from_string("langchain/langchain_openai.ChatOpenAI:gpt-4o-mini")

# LiteLLM (supported providers: https://docs.litellm.ai/docs/providers/)
model_id = ModelId.from_string("litellm/openai/gpt-4o-mini")

# xAI (supported models: https://docs.x.ai/docs/models)
model_id = ModelId.from_string("xai/grok-4-0709")
```
### Custom model name validation example

```python
validation_map = {
    ModelProvider.ANTHROPIC: {"claude-sonnet-4-20250514"},
    ModelProvider.GOOGLE: {"gemini-2.5-flash-lite"},
    ModelProvider.OPENAI: {"gpt-4.1-nano", "gpt-5-nano"},
}
model_id = ModelId.from_string("...", validation_map)
```
### from_string(model_id, validation_map=None) *(classmethod)*

Parse a model id string into a `ModelId` object.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `model_id` | `str` | The model id to parse. Must be in the format defined in the following page: https://gdplabs.gitbook.io/sdk/resources/supported-models | *required* |
| `validation_map` | `dict[str, set[str]] \| None` | An optional dictionary that maps provider names to sets of valid model names. For providers defined in the map, model names are validated against the corresponding set; for undefined providers, the model name is not validated. Defaults to None. | `None` |
Returns:

| Name | Type | Description |
|---|---|---|
| `ModelId` | `ModelId` | The parsed `ModelId` object. |
Raises:

| Type | Description |
|---|---|
| `ValueError` | If the provided model id is invalid or if the model name is not valid for the provider. |
### to_string()

Convert the `ModelId` object to a string.
Returns:

| Name | Type | Description |
|---|---|---|
| `str` | `str` | The string representation of the `ModelId` object. The format is defined in the following page: https://gdplabs.gitbook.io/sdk/resources/supported-models |
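For plain `provider/name` ids, serializing back to a string is the inverse of parsing. A minimal string-level sketch of that round trip (the helper names below are hypothetical, not the package API):

```python
def split_model_id(model_id: str) -> tuple[str, str]:
    # Split a plain "provider/name" id on the first "/".
    provider, _, name = model_id.partition("/")
    return provider, name


def join_model_id(provider: str, name: str) -> str:
    # Rebuild the "provider/name" string form.
    return f"{provider}/{name}"


provider, name = split_model_id("voyage/voyage-3.5-lite")
round_tripped = join_model_id(provider, name)
print(round_tripped)  # voyage/voyage-3.5-lite
```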
## ModelProvider

Bases: `StrEnum`

Defines the supported model providers.