Add Ollama to Bicep #491

@justinyoo

Description

Certain connectors that run local LLMs, such as Ollama, Hugging Face, and LG, rely on an Ollama server when the app is deployed to the cloud. However, the current Bicep template doesn't cover this scenario.

Add another ACA (Azure Container Apps) instance to the template to deploy the Ollama server with a serverless GPU workload profile.

Reference: https://techcommunity.microsoft.com/blog/appsonazureblog/open-ais-gpt-oss-models-on-azure-container-apps-serverless-gpus/4440836
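For illustration, here is a minimal Bicep sketch of what this could look like. The resource names (`cae-ollama-sample`, `aca-ollama`, `gpu-t4`), the API version, and the choice of the T4 profile are assumptions for the example, not values from the existing template; adjust them to match the repo's conventions and available GPU quota.

```bicep
// Minimal sketch only: names, API version, and the T4 profile choice
// are illustrative assumptions, not part of the existing template.
param location string = resourceGroup().location

resource env 'Microsoft.App/managedEnvironments@2024-03-01' = {
  name: 'cae-ollama-sample'
  location: location
  properties: {
    workloadProfiles: [
      {
        // Regular consumption profile for the existing apps.
        name: 'Consumption'
        workloadProfileType: 'Consumption'
      }
      {
        // Serverless GPU profile (NVIDIA T4); A100 is also available
        // as 'Consumption-GPU-NC24-A100' where quota allows.
        name: 'gpu-t4'
        workloadProfileType: 'Consumption-GPU-NC8as-T4'
      }
    ]
  }
}

resource ollama 'Microsoft.App/containerApps@2024-03-01' = {
  name: 'aca-ollama'
  location: location
  properties: {
    managedEnvironmentId: env.id
    workloadProfileName: 'gpu-t4'
    configuration: {
      ingress: {
        // Internal ingress: only the connectors inside the
        // environment need to reach the Ollama endpoint.
        external: false
        targetPort: 11434 // default Ollama API port
        transport: 'http'
      }
    }
    template: {
      containers: [
        {
          name: 'ollama'
          image: 'docker.io/ollama/ollama:latest'
          resources: {
            cpu: json('8')
            memory: '56Gi'
          }
        }
      ]
      scale: {
        // Scale to zero when idle so the GPU is only billed while in use.
        minReplicas: 0
        maxReplicas: 1
      }
    }
  }
}
```

With internal ingress, the connectors would reach the server at the app's internal FQDN on port 11434.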
