Bug description
I am trying to use MetaGPT via a third-party URL wrapper. It works for me with OpenAI and Claude:
llm:
  api_type: "openai"  # or azure / ollama / groq etc.
  model: "gpt-4-turbo"  # or gpt-3.5-turbo
  base_url: "http://localhost:8989/openai"
  api_key: "xxxx"
But when I configure Ollama, I have a problem:
llm:
  api_type: "ollama"  # or azure / ollama / groq etc.
  model: "llama2"  # or gpt-3.5-turbo
  base_url: "http://localhost:8989/ollama"
It fails with a 404 Not Found: Ollama expects the path /api/chat, but the client only appends /chat to the base URL:
@property
def api_suffix(self) -> str:
    return "/chat"
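For illustration, here is a minimal sketch of how the final request URL appears to be assembled; only api_suffix is taken from the MetaGPT source, the surrounding class and method names are hypothetical:

# Hypothetical sketch of the URL assembly; only api_suffix mirrors the
# MetaGPT source, the rest is illustrative.
class OllamaClientSketch:
    def __init__(self, base_url: str):
        self.base_url = base_url.rstrip("/")

    @property
    def api_suffix(self) -> str:
        return "/chat"

    def chat_url(self) -> str:
        # The suffix is appended verbatim, so "/api" has to be part of
        # base_url already for the request to reach "/api/chat".
        return self.base_url + self.api_suffix

print(OllamaClientSketch("http://localhost:8989/ollama").chat_url())
# http://localhost:8989/ollama/chat  -> 404 on a stock Ollama server
print(OllamaClientSketch("http://localhost:11434/api").chat_url())
# http://localhost:11434/api/chat    -> the documented Ollama endpoint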
I also tried to set up a proxy:
llm:
  api_type: "ollama"  # or azure / ollama / groq etc.
  model: "llama2"  # or gpt-3.5-turbo
  base_url: "http://localhost:11434/api"  # or forward url / other llm url
  proxy: "http://localhost:8989"
But the proxy setting doesn't even seem to be picked up. How can I configure Ollama via this wrapper URL?
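For comparison, if the proxy key were honored, the HTTP layer would have to pass it on every request, roughly like this (a hypothetical sketch using aiohttp; whether MetaGPT's Ollama path does anything equivalent is exactly what is in question):

import asyncio
import aiohttp

async def ollama_chat(base_url: str, proxy: str | None, payload: dict) -> dict:
    # Hypothetical sketch: aiohttp is an assumption about the underlying
    # client; the point is that the configured proxy must be handed to
    # each request for the "proxy" config key to have any effect.
    async with aiohttp.ClientSession() as session:
        async with session.post(f"{base_url}/chat", json=payload, proxy=proxy) as resp:
            resp.raise_for_status()
            return await resp.json()

payload = {
    "model": "llama2",
    "messages": [{"role": "user", "content": "hello"}],
    "stream": False,
}
print(asyncio.run(ollama_chat("http://localhost:11434/api", "http://localhost:8989", payload)))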
Environment information
macOS (Apple M2)
LLM type: ollama