5 changes: 4 additions & 1 deletion README.md
@@ -13,7 +13,7 @@ Open Chat Playground (OCP) is a web UI that is able to connect virtually any LLM
- [ ] [Docker Model Runner](https://docs.docker.com/ai/model-runner)
- [ ] [Foundry Local](https://learn.microsoft.com/azure/ai-foundry/foundry-local/what-is-foundry-local)
- [x] [Hugging Face](https://huggingface.co/docs)
- [ ] [Ollama](https://github.com/ollama/ollama/tree/main/docs)
- [x] [Ollama](https://github.com/ollama/ollama/tree/main/docs)
- [ ] [Anthropic](https://docs.anthropic.com)
- [ ] [Naver](https://api.ncloud-docs.com/docs/ai-naver-clovastudio-summary)
- [x] [LG](https://github.com/LG-AI-EXAONE)
@@ -63,6 +63,7 @@ Open Chat Playground (OCP) is a web UI that is able to connect virtually any LLM
- [Use Azure AI Foundry](./docs/azure-ai-foundry.md#run-on-local-machine)
- [Use GitHub Models](./docs/github-models.md#run-on-local-machine)
- [Use Hugging Face](./docs/hugging-face.md#run-on-local-machine)
- [Use Ollama](./docs/ollama.md#run-on-local-machine)
- [Use LG](./docs/lg.md#run-on-local-machine)
- [Use OpenAI](./docs/openai.md#run-on-local-machine)

@@ -71,6 +72,7 @@ Open Chat Playground (OCP) is a web UI that is able to connect virtually any LLM
- [Use Azure AI Foundry](./docs/azure-ai-foundry.md#run-in-local-container)
- [Use GitHub Models](./docs/github-models.md#run-in-local-container)
- [Use Hugging Face](./docs/hugging-face.md#run-in-local-container)
- [Use Ollama](./docs/ollama.md#run-in-local-container)
- [Use LG](./docs/lg.md#run-in-local-container)
- [Use OpenAI](./docs/openai.md#run-in-local-container)

@@ -79,6 +81,7 @@ Open Chat Playground (OCP) is a web UI that is able to connect virtually any LLM
- [Use Azure AI Foundry](./docs/azure-ai-foundry.md#run-on-azure)
- [Use GitHub Models](./docs/github-models.md#run-on-azure)
- [Use Hugging Face](./docs/hugging-face.md#run-on-azure)
- [Use Ollama](./docs/ollama.md#run-on-azure)
- [Use LG](./docs/lg.md#run-on-azure)
- [Use OpenAI](./docs/openai.md#run-on-azure)

1 change: 1 addition & 0 deletions docs/README.md
Contributor comment: Not only this README.md — the README.md at the repository root also needs to be updated.

@@ -3,5 +3,6 @@
- [Azure AI Foundry](azure-ai-foundry.md)
- [GitHub Models](github-models.md)
- [Hugging Face](hugging-face.md)
- [Ollama](ollama.md)
- [LG](lg.md)
- [OpenAI](openai.md)
218 changes: 218 additions & 0 deletions docs/ollama.md
@@ -0,0 +1,218 @@
# OpenChat Playground with Ollama

This page describes how to run OpenChat Playground (OCP) with Ollama integration.

## Get the repository root

1. Get the repository root.

```bash
# bash/zsh
REPOSITORY_ROOT=$(git rev-parse --show-toplevel)
```

```powershell
# PowerShell
$REPOSITORY_ROOT = git rev-parse --show-toplevel
```

## Run on local machine

1. Make sure you are at the repository root.

```bash
cd $REPOSITORY_ROOT
```

1. Make sure Ollama is installed and running on your local machine. If not, install Ollama from [ollama.com](https://ollama.com/) and start the service.

```bash
# Start Ollama service
ollama serve
```
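
   To confirm the service is reachable before moving on, the version endpoint is a quick check (the same endpoint the container section below uses):

```bash
# bash/zsh
curl http://localhost:11434/api/version
```

```powershell
# PowerShell
Invoke-RestMethod -Uri http://localhost:11434/api/version
```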

1. Pull the model you want to use. The default model OCP uses is `llama3.2`.

```bash
# Example: Pull llama3.2 model
ollama pull llama3.2

# Or pull other models
ollama pull mistral
ollama pull phi3
ollama pull qwen
```
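
   To see what is already available locally, `ollama list` prints the pulled models (standard Ollama CLI; the same command works in PowerShell):

```bash
# List models that have been pulled onto this machine
ollama list
```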

1. Run the app.

```bash
# bash/zsh
dotnet run --project $REPOSITORY_ROOT/src/OpenChat.PlaygroundApp -- \
--connector-type Ollama
```

```powershell
# PowerShell
dotnet run --project $REPOSITORY_ROOT\src\OpenChat.PlaygroundApp -- `
--connector-type Ollama
```
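
   If you want to try a model other than the default, the `--model` switch used in the container examples later in this document should work for the local run as well — a sketch, assuming the switch is honored here and the model has already been pulled:

```bash
# bash/zsh - run with a different model (hypothetical example using phi3)
dotnet run --project $REPOSITORY_ROOT/src/OpenChat.PlaygroundApp -- \
    --connector-type Ollama \
    --model phi3
```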


1. Open your web browser, navigate to `http://localhost:5280`, and enter prompts.

## Run in local container

This approach runs OpenChat Playground in a container while connecting to Ollama running on the host machine.

1. Make sure you are at the repository root.

```bash
cd $REPOSITORY_ROOT
```

1. Configure Ollama to accept connections from containers.

```bash
ollama serve
```
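
   By default, Ollama listens on `localhost` only. If the containerized app cannot reach it, exposing the API on all interfaces is one option — a sketch using the `OLLAMA_HOST` environment variable; the exact setting depends on your Ollama version and how it was installed (the desktop app exposes a similar toggle in its settings):

```bash
# bash/zsh - bind the Ollama API to all interfaces instead of localhost only
OLLAMA_HOST=0.0.0.0 ollama serve
```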
Member comment: I ran the project in a clean environment after reformatting my home PC. Since the Ollama app was updated, the option shown in the screenshot below has to be enabled before external access works. Please include this in the ollama.md document as well.

(Screenshot attached in the original comment, showing the relevant Ollama app setting.)


1. Pull the model you want to use.

```bash
# bash/zsh
ollama pull llama3.2

# Verify Ollama is accessible
curl http://localhost:11434/api/version
```

Contributor comment: The `curl` command does not work in PowerShell (Windows), so provide a PowerShell `Invoke-RestMethod` equivalent as well. Other documents handle this case by separating `# bash/zsh` and `# PowerShell` blocks.

```powershell
# PowerShell
ollama pull llama3.2

# Verify Ollama is accessible
Invoke-RestMethod -Uri http://localhost:11434/api/version
```


1. Build a container.

```bash
docker build -f Dockerfile -t openchat-playground:latest .
```

1. Run the app.

```bash
# bash/zsh - from locally built container
docker run -i --rm -p 8080:8080 openchat-playground:latest \
--connector-type Ollama \
--base-url http://host.docker.internal:11434
```

```powershell
# PowerShell - from locally built container
docker run -i --rm -p 8080:8080 openchat-playground:latest `
--connector-type Ollama `
--base-url http://host.docker.internal:11434
```

```bash
# bash/zsh - from GitHub Container Registry
docker run -i --rm -p 8080:8080 ghcr.io/aliencube/open-chat-playground/openchat-playground:latest \
--connector-type Ollama \
--base-url http://host.docker.internal:11434
```

```powershell
# PowerShell - from GitHub Container Registry
docker run -i --rm -p 8080:8080 ghcr.io/aliencube/open-chat-playground/openchat-playground:latest `
--connector-type Ollama `
--base-url http://host.docker.internal:11434
```

Alternatively, if you want to run with a different model, say `qwen`, make sure you've already downloaded the model by running the `ollama pull qwen` command.

```bash
ollama pull qwen
```

```bash
# bash/zsh - from locally built container
docker run -i --rm -p 8080:8080 openchat-playground:latest \
--connector-type Ollama \
--base-url http://host.docker.internal:11434 \
--model qwen
```

```powershell
# PowerShell - from locally built container (with a different model)
docker run -i --rm -p 8080:8080 openchat-playground:latest `
--connector-type Ollama `
--base-url http://host.docker.internal:11434 `
--model qwen
```

> **NOTE**: Use `host.docker.internal:11434` to connect to Ollama running on the host machine from inside the container.
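
On Linux hosts, `host.docker.internal` may not resolve by default; mapping it to the host gateway is a common workaround — a sketch, assuming Docker 20.10 or later:

```bash
# bash/zsh - map host.docker.internal to the host gateway on a Linux host
docker run -i --rm -p 8080:8080 \
    --add-host=host.docker.internal:host-gateway \
    openchat-playground:latest \
    --connector-type Ollama \
    --base-url http://host.docker.internal:11434
```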

1. Open your web browser, navigate to `http://localhost:8080`, and enter prompts.

## Run on Azure

1. Make sure you are at the repository root.

```bash
cd $REPOSITORY_ROOT
```

1. Login to Azure.

```bash
# Login to Azure Dev CLI
azd auth login
```

1. Check login status.

```bash
# Azure Dev CLI
azd auth login --check-status
```

1. Initialize `azd` template.

```bash
azd init
```

> **NOTE**: You will be asked to provide an environment name for provisioning.

1. Set Ollama configuration to azd environment variables.

**Azure-hosted Ollama (Automatic Deployment)**

```bash
# Set connector type to Ollama
azd env set CONNECTOR_TYPE "Ollama"

# Set a specific model
azd env set OLLAMA_MODEL "llama3.2"

# BaseUrl is automatically configured - no need to set OLLAMA_BASE_URL
```

> **NOTE**: When deploying to Azure, the Ollama server will be automatically provisioned and deployed as a container with GPU support. The BaseUrl will be automatically configured to connect to the deployed Ollama instance.
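
If you want to double-check what was recorded before provisioning, `azd env get-values` prints the current environment; look for the `CONNECTOR_TYPE` and `OLLAMA_MODEL` values set above:

```bash
# Azure Dev CLI - confirm the Ollama-related values are set as expected
azd env get-values
```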

1. Run the following command to provision and deploy the app.

```bash
azd up
```

> **NOTE**: You will be asked to provide an Azure subscription and location for deployment.

1. Clean up all the resources.

```bash
azd down --force --purge
```
3 changes: 3 additions & 0 deletions infra/main.parameters.json
@@ -29,6 +29,9 @@
"huggingFaceModel": {
"value": "${HUGGING_FACE_MODEL=hf.co/Qwen/Qwen3-0.6B-GGUF}"
},
"ollamaModel": {
"value": "${OLLAMA_MODEL=llama3.2}"
},
"lgModel": {
"value": "${LG_MODEL=hf.co/LGAI-EXAONE/EXAONE-4.0-1.2B-GGUF}"
},
2 changes: 1 addition & 1 deletion infra/resources.bicep
@@ -392,4 +392,4 @@ module ollama 'br/public:avm/res/app/container-app:0.18.1' = if (useOllama == tr
}

output AZURE_CONTAINER_REGISTRY_ENDPOINT string = containerRegistry.outputs.loginServer
output AZURE_RESOURCE_OPENCHAT_PLAYGROUNDAPP_ID string = openchatPlaygroundApp.outputs.resourceId
output AZURE_RESOURCE_OPENCHAT_PLAYGROUNDAPP_ID string = openchatPlaygroundApp.outputs.resourceId
Expand Up @@ -39,6 +39,7 @@ public static async Task<IChatClient> CreateChatClientAsync(AppSettings settings
ConnectorType.AzureAIFoundry => new AzureAIFoundryConnector(settings),
ConnectorType.GitHubModels => new GitHubModelsConnector(settings),
ConnectorType.HuggingFace => new HuggingFaceConnector(settings),
ConnectorType.Ollama => new OllamaConnector(settings),
ConnectorType.LG => new LGConnector(settings),
ConnectorType.OpenAI => new OpenAIConnector(settings),
_ => throw new NotSupportedException($"Connector type '{settings.ConnectorType}' is not supported.")
85 changes: 85 additions & 0 deletions src/OpenChat.PlaygroundApp/Connectors/OllamaConnector.cs
@@ -0,0 +1,85 @@
using Microsoft.Extensions.AI;

using OllamaSharp;

using OpenChat.PlaygroundApp.Abstractions;
using OpenChat.PlaygroundApp.Configurations;

using System.Linq;

namespace OpenChat.PlaygroundApp.Connectors;

/// <summary>
/// This represents the connector entity for Ollama.
/// </summary>
public class OllamaConnector(AppSettings settings) : LanguageModelConnector(settings.Ollama)
{
private readonly AppSettings _appSettings = settings ?? throw new ArgumentNullException(nameof(settings));
/// <inheritdoc/>
public override bool EnsureLanguageModelSettingsValid()
{
var settings = this.Settings as OllamaSettings;
if (settings is null)
{
throw new InvalidOperationException("Missing configuration: Ollama.");
}

if (string.IsNullOrWhiteSpace(settings.BaseUrl) == true)
{
throw new InvalidOperationException("Missing configuration: Ollama:BaseUrl.");
}

if (string.IsNullOrWhiteSpace(settings.Model) == true)
{
throw new InvalidOperationException("Missing configuration: Ollama:Model.");
}

return true;
}

/// <inheritdoc/>
public override async Task<IChatClient> GetChatClientAsync()
{
var settings = this.Settings as OllamaSettings;
var baseUrl = settings!.BaseUrl!;
var model = settings!.Model!;

var config = new OllamaApiClient.Configuration
{
Uri = new Uri(baseUrl),
Model = model,
};

var chatClient = new OllamaApiClient(config);

// Only attempt to pull model if not in test environment
if (!IsTestEnvironment())
{
try
{
var pulls = chatClient.PullModelAsync(model);
await foreach (var pull in pulls)
{
Console.WriteLine($"Pull status: {pull!.Status}");
}
}
catch (Exception ex)
{
Console.WriteLine($"Warning: Could not pull model {model}: {ex.Message}");
// Continue anyway - model might already exist
}
}

Contributor comment: The model-pull loop shown above — `chatClient.PullModelAsync(model)` iterated with `await foreach`, logging each pull status — needs to be inserted here so that the model is pulled automatically later.

Console.WriteLine($"The {this._appSettings.ConnectorType} connector created with model: {settings.Model}");
return await Task.FromResult(chatClient).ConfigureAwait(false);
}

private static bool IsTestEnvironment()
{
// Check if running in test environment
return Environment.GetEnvironmentVariable("DOTNET_RUNNING_IN_CONTAINER") == "true" ||
Environment.GetEnvironmentVariable("ASPNETCORE_ENVIRONMENT") == "Testing" ||
System.Diagnostics.Debugger.IsAttached ||
AppDomain.CurrentDomain.GetAssemblies().Any(assembly => assembly.FullName?.Contains("xunit") == true);
}
}