LlmVertex

class pipecat.services.google.llm_vertex.GoogleVertexLLMService(*, credentials=None, credentials_path=None, model='google/gemini-2.0-flash-001', params=None, **kwargs)

Bases: OpenAILLMService

Implements inference with Google’s AI models via Vertex AI while maintaining OpenAI API compatibility.

Reference: https://cloud.google.com/vertex-ai/generative-ai/docs/multimodal/call-vertex-using-openai-library

Parameters:
  • credentials (str | None) – Google service account credentials as a JSON string.

  • credentials_path (str | None) – Path to a Google service account JSON file.

  • model (str) – Model identifier. Defaults to 'google/gemini-2.0-flash-001'.

  • params (InputParams | None) – Vertex-specific input parameters (see InputParams below).
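A minimal usage sketch, assuming pipecat is installed with its Google extras; the project ID, region, and credentials path below are placeholder values, not defaults from the library:

```python
# Hypothetical usage sketch; project, region, and credentials values are placeholders.
from pipecat.services.google.llm_vertex import GoogleVertexLLMService

llm = GoogleVertexLLMService(
    # Either pass `credentials` as a JSON string or point at the key file.
    credentials_path="/path/to/service-account.json",
    model="google/gemini-2.0-flash-001",
    params=GoogleVertexLLMService.InputParams(
        project_id="my-gcp-project",  # required; no default
        location="us-east4",          # default region per the signature below
        temperature=0.7,
    ),
)
```

Because the class derives from OpenAILLMService, the resulting service slots into a pipeline anywhere an OpenAI-compatible LLM service is expected.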

class InputParams(*, frequency_penalty=<factory>, presence_penalty=<factory>, seed=<factory>, temperature=<factory>, top_k=None, top_p=<factory>, max_tokens=<factory>, max_completion_tokens=<factory>, extra=<factory>, location='us-east4', project_id)

Bases: InputParams

Input parameters specific to Vertex AI. Extends the OpenAI input parameters with the project_id and location fields used to address a Vertex AI endpoint.

Parameters:
  • frequency_penalty (float | None) – Penalizes tokens in proportion to how often they already appear in the output.

  • presence_penalty (float | None) – Penalizes tokens that have appeared at all, encouraging new topics.

  • seed (int | None) – Seed for best-effort deterministic sampling.

  • temperature (float | None) – Sampling temperature; higher values produce more random output.

  • top_k (int | None) – Restricts sampling to the k most likely tokens.

  • top_p (float | None) – Nucleus sampling; restricts sampling to the smallest token set whose cumulative probability exceeds top_p.

  • max_tokens (int | None) – Maximum number of tokens to generate.

  • max_completion_tokens (int | None) – Maximum number of completion tokens to generate.

  • extra (Dict[str, Any] | None) – Additional parameters passed through to the API.

  • location (str) – Google Cloud region of the Vertex AI endpoint. Defaults to 'us-east4'.

  • project_id (str) – Google Cloud project ID. Required.

location: str
project_id: str
model_config: ClassVar[ConfigDict] = {}

Configuration for the model; should be a dictionary conforming to Pydantic's ConfigDict.
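Under the hood, project_id and location select the OpenAI-compatible Vertex AI endpoint described in the Google guide linked above. The sketch below shows how those two fields could map onto a base URL; the exact path segments are an assumption based on that guide, so verify against it before relying on this:

```python
def vertex_openai_base_url(project_id: str, location: str) -> str:
    """Build an OpenAI-compatible Vertex AI base URL from InputParams fields.

    The path layout is an assumption drawn from Google's "call Vertex AI
    using the OpenAI library" guide; confirm before depending on it.
    """
    return (
        f"https://{location}-aiplatform.googleapis.com/v1beta1/"
        f"projects/{project_id}/locations/{location}/endpoints/openapi"
    )

# Both fields appear in the URL: the region picks the API host, and the
# project/location pair scopes the endpoint path.
url = vertex_openai_base_url("my-gcp-project", "us-east4")
print(url)
```

This is why location has a concrete default ('us-east4') while project_id is required: the region can be sensibly defaulted, but there is no meaningful default project.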