LLM

class pipecat.services.sambanova.llm.SambaNovaLLMService(*, api_key, model='Llama-4-Maverick-17B-128E-Instruct', base_url='https://api.sambanova.ai/v1', **kwargs)[source]

Bases: OpenAILLMService

A service for interacting with SambaNova using the OpenAI-compatible interface.

This service extends OpenAILLMService to connect to SambaNova's API endpoint while maintaining full compatibility with OpenAI's interface and functionality.

Parameters:
  • api_key (str) – The API key for accessing the SambaNova API.

  • model (str) – The model identifier to use. Defaults to "Llama-4-Maverick-17B-128E-Instruct".

  • base_url (str) – The base URL for the SambaNova API. Defaults to "https://api.sambanova.ai/v1".

  • kwargs (Dict[Any, Any]) – Additional keyword arguments passed to OpenAILLMService.
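A minimal instantiation sketch, assuming pipecat is installed and an API key is available in the SAMBANOVA_API_KEY environment variable (the key name is an illustrative choice, not mandated by the library). The keyword arguments mirror the signature documented above:

```python
import os

from pipecat.services.sambanova.llm import SambaNovaLLMService

# Construct the service. Both model and base_url are optional and
# fall back to the defaults shown in the signature above.
llm = SambaNovaLLMService(
    api_key=os.environ["SAMBANOVA_API_KEY"],
    model="Llama-4-Maverick-17B-128E-Instruct",
    base_url="https://api.sambanova.ai/v1",
)
```

Because the service subclasses OpenAILLMService, the resulting `llm` processor can be placed in a Pipecat pipeline anywhere an OpenAI-backed LLM service would be used.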

create_client(api_key=None, base_url=None, **kwargs)[source]

Create OpenAI-compatible client for SambaNova API endpoint.

Parameters:
  • api_key (str | None)

  • base_url (str | None)

  • kwargs (Dict[Any, Any])

Return type:

Any

async get_chat_completions(context, messages)[source]

Get chat completions from SambaNova API endpoint.

Parameters:
  • context (OpenAILLMContext)

  • messages (List[ChatCompletionDeveloperMessageParam | ChatCompletionSystemMessageParam | ChatCompletionUserMessageParam | ChatCompletionAssistantMessageParam | ChatCompletionToolMessageParam | ChatCompletionFunctionMessageParam])

Return type:

Any