1 change: 1 addition & 0 deletions docs/README.md
@@ -2,4 +2,5 @@

- [GitHub Models](github-models.md)
- [Hugging Face](hugging-face.md)
- [Ollama](ollama.md)
- [OpenAI](openai.md)
179 changes: 179 additions & 0 deletions docs/ollama.md
@@ -0,0 +1,179 @@
# OpenChat Playground with Ollama

This page describes how to run OpenChat Playground (OCP) with Ollama integration.

## Get the repository root

1. Get the repository root.

```bash
# bash/zsh
REPOSITORY_ROOT=$(git rev-parse --show-toplevel)
```

```powershell
# PowerShell
$REPOSITORY_ROOT = git rev-parse --show-toplevel
```

## Run on local machine

1. Make sure you are at the repository root.

```bash
cd $REPOSITORY_ROOT
```

1. Make sure Ollama is installed and running on your local machine. If not, install Ollama from [ollama.com](https://ollama.com/) and start the service.

```bash
# Start Ollama service
ollama serve
```

1. Pull the model you want to use. The examples below use `llama3.2`; replace it with whichever model you prefer.

```bash
# Example: Pull llama3.2 model
ollama pull llama3.2
# Or pull other models
ollama pull mistral
ollama pull phi3
ollama pull qwen
```

1. Run the app.

```bash
dotnet run --project $REPOSITORY_ROOT/src/OpenChat.PlaygroundApp -- --connector-type Ollama --model llama3.2
```
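
   Alternatively, the same settings can be supplied through environment variables. This is a sketch that assumes the configuration keys shown in the container example later on this page also apply when running with `dotnet run`:

```bash
# Alternative: environment variables instead of command-line arguments
export ConnectorType=Ollama
export Ollama__Model=llama3.2
dotnet run --project $REPOSITORY_ROOT/src/OpenChat.PlaygroundApp
```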

1. Open your web browser, navigate to `http://localhost:5280`, and enter prompts.

## Run in local container

This approach runs OpenChat Playground in a container while connecting to Ollama running on the host machine.

1. Make sure you are at the repository root.

```bash
cd $REPOSITORY_ROOT
```

1. Configure Ollama to accept connections from containers.

```powershell
# PowerShell (Windows)
$env:OLLAMA_HOST = "0.0.0.0:11434"
# Start Ollama service
ollama serve
```

```bash
# bash/zsh (Linux/macOS)
export OLLAMA_HOST=0.0.0.0:11434
ollama serve
```

1. Pull the model you want to use.

```bash
# Pull llama3.2 model (recommended)
ollama pull llama3.2
# Verify Ollama is accessible
curl http://localhost:11434/api/version
```

> **Review comment (Contributor):** The `curl` command does not work in PowerShell (Windows), so this should also be runnable with the PowerShell `Invoke-RestMethod` command. The other docs separate these cases into `# bash/zsh` and `# PowerShell` blocks.

> **Review comment (Member):** This part should be changed to `Invoke-RestMethod`. PowerShell does automatically run `curl` as `Invoke-RestMethod`, but the way CLI arguments are passed differs, so there is no need to use `curl` here. It runs, but it is not the recommended approach.

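Following the reviewers' suggestion, a PowerShell equivalent of the check above might look like this (a minimal sketch; the model pull command is identical in both shells):

```powershell
# PowerShell
# Pull llama3.2 model (recommended)
ollama pull llama3.2
# Verify Ollama is accessible
Invoke-RestMethod -Uri "http://localhost:11434/api/version"
```
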
1. Build a container.

```bash
docker build -f Dockerfile -t openchat-playground:latest .
```

1. Run the app.

```bash
# Using command-line arguments
docker run -i --rm -p 8080:8080 \
openchat-playground:latest \
--connector-type Ollama \
--base-url http://host.docker.internal:11434 \
--model llama3.2
```

```bash
# Alternative: Using environment variables
docker run -i --rm -p 8080:8080 \
-e ConnectorType=Ollama \
-e Ollama__BaseUrl=http://host.docker.internal:11434 \
-e Ollama__Model=llama3.2 \
openchat-playground:latest
```

> **NOTE**: Use `host.docker.internal:11434` to connect to Ollama running on the host machine from inside the container. Make sure `OLLAMA_HOST=0.0.0.0:11434` is set on the host.
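
   If the container cannot reach Ollama, a quick connectivity check from a throwaway container may help. This sketch assumes the `curlimages/curl` image is acceptable; on Linux, `host.docker.internal` usually also requires the `--add-host` flag shown below:

```bash
# Verify the host's Ollama endpoint is reachable from inside a container
docker run --rm --add-host=host.docker.internal:host-gateway \
    curlimages/curl http://host.docker.internal:11434/api/version
```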

1. Open your web browser, navigate to `http://localhost:8080`, and enter prompts.

## Run on Azure

1. Make sure you are at the repository root.

```bash
cd $REPOSITORY_ROOT
```

1. Login to Azure.

```bash
# Login to Azure Dev CLI
azd auth login
```

1. Check login status.

```bash
# Azure Dev CLI
azd auth login --check-status
```

1. Initialize `azd` template.

```bash
azd init
```

> **NOTE**: You will be asked to provide an environment name for provisioning.

1. Set the Ollama configuration as azd environment variables.

**Azure-hosted Ollama**

```bash
# Set connector type to Ollama
azd env set CONNECTOR_TYPE "Ollama"
# Use a placeholder URL that will be replaced when Ollama is deployed in Azure
azd env set OLLAMA_BASE_URL "https://{{OLLAMA_URL}}:11434"
# Set a specific model
azd env set OLLAMA_MODEL "llama3.2"
```

> **NOTE**: The `{{OLLAMA_URL}}` will be replaced with the actual Ollama service URL when you deploy Ollama in Azure. This allows you to prepare the configuration before the Ollama service is available.
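
Once Ollama is actually deployed and its endpoint is known, the placeholder can be replaced and the app redeployed. For example (the URL below is purely illustrative):

```bash
# Replace the placeholder with the real Ollama endpoint once it exists
azd env set OLLAMA_BASE_URL "https://my-ollama.example.com:11434"
# Provision/deploy again so the app picks up the new value
azd up
```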

1. Run the following command to provision and deploy the app.

```bash
azd up
```

> **NOTE**: You will be asked to provide an Azure subscription and location for deployment.

1. Clean up all the resources.

```bash
azd down --force --purge
```
4 changes: 4 additions & 0 deletions infra/main.bicep
@@ -24,6 +24,8 @@ param githubModelsToken string = ''
// Hugging Face
param huggingFaceModel string = ''
// Ollama
param ollamaModel string = ''
param ollamaBaseUrl string = ''
// Anthropic
// LG
// Naver
@@ -69,6 +71,8 @@ module resources 'resources.bicep' = {
githubModelsModel: githubModelsModel
githubModelsToken: githubModelsToken
huggingFaceModel: huggingFaceModel
ollamaModel: ollamaModel
ollamaBaseUrl: ollamaBaseUrl
openAIModel: openAIModel
openAIApiKey: openAIApiKey
openchatPlaygroundappExists: openchatPlaygroundappExists
6 changes: 6 additions & 0 deletions infra/main.parameters.json
@@ -20,6 +20,12 @@
"huggingFaceModel": {
"value": "${HUGGING_FACE_MODEL}"
},
"ollamaModel": {
"value": "${OLLAMA_MODEL}"
},
"ollamaBaseUrl": {
"value": "${OLLAMA_BASE_URL}"
},
"openAIModel": {
"value": "${OPENAI_MODEL}"
},
19 changes: 18 additions & 1 deletion infra/resources.bicep
@@ -18,6 +18,8 @@ param githubModelsToken string = ''
// Hugging Face
param huggingFaceModel string = ''
// Ollama
param ollamaModel string = ''
param ollamaBaseUrl string = ''
// Anthropic
// LG
// Naver
@@ -130,6 +132,20 @@ var envHuggingFace = connectorType == 'HuggingFace' ? concat(huggingFaceModel !=
}
] : []) : []
// Ollama
var envOllama = connectorType == 'Ollama' ? concat(
ollamaModel != '' ? [
{
name: 'Ollama__Model'
value: ollamaModel
}
] : [],
ollamaBaseUrl != '' ? [
{
name: 'Ollama__BaseUrl'
value: ollamaBaseUrl
}
] : []
) : []
// Anthropic
// LG
// Naver
@@ -191,6 +207,7 @@ module openchatPlaygroundapp 'br/public:avm/res/app/container-app:0.18.1' = {
envConnectorType,
envGitHubModels,
envHuggingFace,
envOllama,
envOpenAI)
}
]
@@ -211,4 +228,4 @@ module openchatPlaygroundapp 'br/public:avm/res/app/container-app:0.18.1' = {
}

output AZURE_CONTAINER_REGISTRY_ENDPOINT string = containerRegistry.outputs.loginServer
output AZURE_RESOURCE_OPENCHAT_PLAYGROUNDAPP_ID string = openchatPlaygroundapp.outputs.resourceId
output AZURE_RESOURCE_OPENCHAT_PLAYGROUNDAPP_ID string = openchatPlaygroundapp.outputs.resourceId
@@ -38,6 +38,7 @@ public static async Task<IChatClient> CreateChatClientAsync(AppSettings settings
{
ConnectorType.GitHubModels => new GitHubModelsConnector(settings),
ConnectorType.HuggingFace => new HuggingFaceConnector(settings),
ConnectorType.Ollama => new OllamaConnector(settings),
ConnectorType.OpenAI => new OpenAIConnector(settings),
_ => throw new NotSupportedException($"Connector type '{settings.ConnectorType}' is not supported.")
};
54 changes: 54 additions & 0 deletions src/OpenChat.PlaygroundApp/Connectors/OllamaConnector.cs
@@ -0,0 +1,54 @@
using Microsoft.Extensions.AI;

using OllamaSharp;

using OpenChat.PlaygroundApp.Abstractions;
using OpenChat.PlaygroundApp.Configurations;

namespace OpenChat.PlaygroundApp.Connectors;

/// <summary>
/// This represents the connector entity for Ollama.
/// </summary>
public class OllamaConnector(AppSettings settings) : LanguageModelConnector(settings.Ollama)
{
/// <inheritdoc/>
public override bool EnsureLanguageModelSettingsValid()
{
var settings = this.Settings as OllamaSettings;
if (settings is null)
{
throw new InvalidOperationException("Missing configuration: Ollama.");
}

if (string.IsNullOrWhiteSpace(settings.BaseUrl) == true)
{
throw new InvalidOperationException("Missing configuration: Ollama:BaseUrl.");
}

if (string.IsNullOrWhiteSpace(settings.Model) == true)
{
throw new InvalidOperationException("Missing configuration: Ollama:Model.");
}

return true;
}

/// <inheritdoc/>
public override async Task<IChatClient> GetChatClientAsync()
{
var settings = this.Settings as OllamaSettings;
var baseUrl = settings!.BaseUrl!;
var model = settings!.Model!;

var config = new OllamaApiClient.Configuration
{
Uri = new Uri(baseUrl),
Model = model,
};

var chatClient = new OllamaApiClient(config);

return await Task.FromResult(chatClient).ConfigureAwait(false);
}
}
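
For reference, a minimal sketch of how this connector might be exercised. It assumes `AppSettings` and `OllamaSettings` expose settable properties, which may not match the actual types in the repository:

```csharp
// Illustrative only: property shapes are assumed, not taken from the repository.
var settings = new AppSettings
{
    ConnectorType = ConnectorType.Ollama,
    Ollama = new OllamaSettings
    {
        BaseUrl = "http://localhost:11434",
        Model = "llama3.2"
    }
};

var connector = new OllamaConnector(settings);
connector.EnsureLanguageModelSettingsValid();

IChatClient chatClient = await connector.GetChatClientAsync();
```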