LLM

class pipecat.services.ollama.llm.OLLamaLLMService(*, model='llama2', base_url='http://localhost:11434/v1')

Bases: OpenAILLMService

Parameters:
  • model (str) – Name of the Ollama model to use. Defaults to 'llama2'.

  • base_url (str) – Base URL of the Ollama server's OpenAI-compatible API. Defaults to 'http://localhost:11434/v1'.
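
A minimal usage sketch, assuming an Ollama server is running locally with its default settings and that the llama2 model has already been pulled (for example via ollama pull llama2). The keyword arguments simply mirror the defaults shown in the class signature above:

    # Instantiate the Ollama-backed LLM service. Because it subclasses
    # OpenAILLMService, it talks to Ollama through its OpenAI-compatible endpoint.
    from pipecat.services.ollama.llm import OLLamaLLMService

    llm = OLLamaLLMService(
        model="llama2",                        # any model available to the local Ollama install
        base_url="http://localhost:11434/v1",  # Ollama's default OpenAI-compatible base URL
    )

Since the class inherits from OpenAILLMService, it can be dropped into a pipeline anywhere an OpenAI-style LLM service is expected, with requests routed to the local Ollama server instead of a hosted API.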