azLLM Package

azLLM Class

class azllm.core.azLLM(config_file='config.yaml', custom=False)

Bases: object

Main interface to interact with multiple LLM clients via unified configuration and execution methods.

config_file

Path to the configuration file.

Type: str

custom

Whether to use custom configuration.

Type: bool

config

Loaded configuration from the file when custom is True.

Type: dict

clients

Mapping of client names to their respective classes.

Type: dict

get_model_config(client_name, model_name, version='default')

Retrieves the configuration of a specific model from custom configurations.

Parameters:
  • client_name (str) – Name of the client.

  • model_name (str) – Name of the model.

  • version (str) – Model version.

Returns: Configuration of the model.

Return type: dict

Raises: ValueError – If client or model config is not found.

Example

>>> azllm = azLLM(custom=True)
>>> model_config = azllm.get_model_config('openai', 'gpt-3', 'v1')
>>> isinstance(model_config, dict)
True

generate_text(client_model_version, prompt, kwargs=None, parse=False)

Generates text using a specific client and model for a given prompt.

Parameters:
  • client_model_version (str) – Format ‘client:model::version’.

  • prompt (str) – Text prompt.

  • kwargs (dict, optional) – Additional generation parameters.

  • parse (bool) – Whether to parse the output.

Returns: Generated text.

Return type: str

Example

>>> azllm = azLLM()
>>> result = azllm.generate_text("openai:gpt-4o-mini::v1", "Hello, how are you?")
>>> isinstance(result, str)
True

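The identifier passed to generate_text follows the ‘client:model::version’ format. As a minimal illustration of that format (the helper below is a hypothetical sketch, not part of azllm's internals):

```python
# Hypothetical helper illustrating the 'client:model::version' identifier
# format used by generate_text and batch_generate. azllm parses these
# strings internally; this sketch only shows the format's structure.

def split_identifier(identifier: str) -> tuple[str, str, str]:
    """Split 'client:model::version' into (client, model, version)."""
    head, version = identifier.split("::")   # 'client:model' / 'version'
    client, model = head.split(":", 1)       # 'client' / 'model'
    return client, model, version

print(split_identifier("openai:gpt-4o-mini::v1"))
# -> ('openai', 'gpt-4o-mini', 'v1')
```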
batch_generate(client_model_version, prompts, kwargs=None, parse=None)

Generates text for multiple prompts using the specified client and model.

Parameters:
  • client_model_version (str) – Format ‘client:model::version’.

  • prompts (List[str]) – List of prompts.

  • kwargs (List[dict], optional) – Parameters per prompt.

  • parse (List[bool], optional) – Parse flag per prompt.

Returns: List of generated texts.

Return type: List[str]

Example

>>> azllm = azLLM()
>>> results = azllm.batch_generate("openai:gpt-4o-mini::v1", ["How are you?", "What's the weather?"])
>>> isinstance(results, list)
True

generate_parallel(prompt, clients_models_versions, kwargs=None, parse=None)

Generate text in parallel using different clients and models for the same prompt.

Parameters:
  • prompt (str) – Input prompt.

  • clients_models_versions (List[str]) – List of ‘client:model::version’ strings.

  • kwargs (List[dict], optional) – Additional parameters per client.

  • parse (List[bool], optional) – Parse flag per client.

Returns: Mapping of ‘client:model::version:index’ to generated text or error message.

Return type: dict

Example

>>> azllm = azLLM()
>>> results = azllm.generate_parallel("Hello!", ["openai:gpt-4o-mini::v1", "grok:default::default"])
>>> isinstance(results, dict)
True
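The shape of the returned mapping can be sketched with placeholder data (the texts below are illustrative, not real model output):

```python
# Hypothetical generate_parallel result: keys follow the
# 'client:model::version:index' pattern; values are placeholder texts.
results = {
    "openai:gpt-4o-mini::v1:0": "Hello! How can I help?",
    "grok:default::default:1": "Hi there!",
}

for key, text in results.items():
    spec, index = key.rsplit(":", 1)  # split off the trailing index
    print(f"[{index}] {spec}: {text}")
```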

azLLMConfigs Class

class azllm.configmanager.azLLMConfigs(config_file='config.yaml', custom=False)

Bases: object

A configuration manager for handling custom and default configurations of supported LLM clients.

config_file

Name of the configuration file.

Type: str

custom

Flag to determine if custom configurations are used.

Type: bool

custom_configs

Dictionary holding custom configuration data.

Type: dict

Usage:

To use default configurations:

>>> cfg = azLLMConfigs()
>>> cfg.get_default_configs('openai')

To retrieve the default configurations for all supported clients:

>>> cfg.get_default_configs('all')

Working with Custom Configurations:

When custom=True, the class will:

  • Create a local configuration file if it doesn’t exist at: custom_configs/config.yaml

  • Load existing configurations from that file.

  • Allow updating or adding new model configurations per client.

Example of internal structure of a custom config (custom_configs/config.yaml):

deepseek:
    models:
        - model: deepseek-chat
          version: v2
          parameters:
              frequency_penalty: 0
              max_tokens: 1024
              presence_penalty: 0
              system_message: You are an advanced AI assistant.
              temperature: 0.7

Raises: ValueError – If a client is unsupported or the input format is incorrect.
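The YAML structure above maps directly onto nested dictionaries. A minimal lookup over that structure can be sketched as follows (find_model is a hypothetical illustration, not the package's implementation of get_model_config):

```python
# A Python dict mirroring the custom_configs/config.yaml example above
# (field names and values taken from that example).
custom_configs = {
    "deepseek": {
        "models": [
            {
                "model": "deepseek-chat",
                "version": "v2",
                "parameters": {
                    "frequency_penalty": 0,
                    "max_tokens": 1024,
                    "presence_penalty": 0,
                    "system_message": "You are an advanced AI assistant.",
                    "temperature": 0.7,
                },
            }
        ]
    }
}

# Hypothetical lookup: return the parameters for a client/model/version,
# raising ValueError when nothing matches.
def find_model(configs: dict, client: str, model: str, version: str) -> dict:
    for entry in configs.get(client, {}).get("models", []):
        if entry["model"] == model and entry["version"] == version:
            return entry["parameters"]
    raise ValueError(f"No config for {client}:{model}::{version}")

params = find_model(custom_configs, "deepseek", "deepseek-chat", "v2")
print(params["max_tokens"])  # -> 1024
```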

get_default_configs(client='all')

Retrieves the default configuration(s) for one or all LLM clients.

Parameters:

client (str) – Client name (e.g., ‘openai’) or ‘all’ for all clients.

Returns: Dictionary of default configurations.

Return type: dict

Raises: ValueError – If the specified client is unsupported.

update_custom_configs(client_type, models_to_update_or_add)

Updates or adds custom configurations for models under a specific client.

If the client or model doesn’t exist in the configuration, it is added. If the model exists, its configuration is updated.

Parameters:
  • client_type (str) – The LLM client identifier (e.g., ‘openai’).

  • models_to_update_or_add (dict) – A dictionary where keys are model names and values are dictionaries with ‘version’ and ‘parameters’.

Raises: ValueError – If the client type is not supported.

Return type: None
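As an illustration of the expected input shape (the model name and parameter values below are placeholders; the parameter keys follow the deepseek example above):

```python
# Hypothetical models_to_update_or_add payload: model names map to dicts
# with 'version' and 'parameters' keys, as described above.
models_to_update_or_add = {
    "gpt-4o-mini": {
        "version": "v1",
        "parameters": {
            "temperature": 0.5,
            "max_tokens": 512,
            "system_message": "You are a concise assistant.",
        },
    }
}

# With a custom-config manager, the call would then be:
#   cfg = azLLMConfigs(custom=True)
#   cfg.update_custom_configs('openai', models_to_update_or_add)
```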