1 change: 1 addition & 0 deletions docs/README.md
Contributor

You need to update the README.md file at the repository root as well, not just this docs/README.md.

@@ -2,4 +2,5 @@

- [GitHub Models](github-models.md)
- [Hugging Face](hugging-face.md)
- [Ollama](ollama.md)
- [OpenAI](openai.md)
179 changes: 179 additions & 0 deletions docs/ollama.md
@@ -0,0 +1,179 @@
# OpenChat Playground with Ollama

This page describes how to run OpenChat Playground (OCP) with Ollama integration.

## Get the repository root

1. Get the repository root.

```bash
# bash/zsh
REPOSITORY_ROOT=$(git rev-parse --show-toplevel)
```

```powershell
# PowerShell
$REPOSITORY_ROOT = git rev-parse --show-toplevel
```

## Run on local machine

1. Make sure you are at the repository root.

```bash
cd $REPOSITORY_ROOT
```

1. Make sure Ollama is installed and running on your local machine. If not, install Ollama from [ollama.com](https://ollama.com/) and start the service.

```bash
# Start Ollama service
ollama serve
```

1. Pull the model you want to use. Replace `{{MODEL_NAME}}` with your desired model.

```bash
# Example: Pull llama3.2 model
ollama pull llama3.2
# Or pull other models
ollama pull mistral
ollama pull phi3
ollama pull qwen
```
Contributor

There is no `{{MODEL_NAME}}` anywhere in this block.
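A minimal sketch of how this step could be made consistent with the sentence above (assuming the `{{MODEL_NAME}}` wording is kept rather than removed):

```bash
# Replace {{MODEL_NAME}} with the model you want, e.g. llama3.2, mistral, phi3 or qwen
ollama pull {{MODEL_NAME}}
```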


1. Run the app.

```bash
dotnet run --project $REPOSITORY_ROOT/src/OpenChat.PlaygroundApp -- --connector-type Ollama --model llama3.2
```
Comment on lines 48 to 58
Contributor

Please keep this consistent with the other documents: docs/github-models.md, docs/hugging-face.md, and docs/openai.md.


1. Open your web browser, navigate to `http://localhost:5280`, and enter prompts.

## Run in local container

This approach runs OpenChat Playground in a container while connecting to Ollama running on the host machine.

1. Make sure you are at the repository root.

```bash
cd $REPOSITORY_ROOT
```

1. Configure Ollama to accept connections from containers.

```powershell
# PowerShell (Windows)
$env:OLLAMA_HOST = "0.0.0.0:11434"
# Start Ollama service
ollama serve
```

Contributor

PowerShell also runs outside of Windows environments, so "(Windows)" is unnecessary.

```bash
# bash/zsh (Linux/macOS)
export OLLAMA_HOST=0.0.0.0:11434
ollama serve
```
Member

I reformatted my home PC recently, so I took the chance to run our project in a clean environment! Since the Ollama app was updated, the option shown below has to be enabled before external access is possible. Please include this part in the ollama.md document as well.

[Image]


1. Pull the model you want to use.

```bash
# Pull llama3.2 model (recommended)
ollama pull llama3.2
# Verify Ollama is accessible
curl http://localhost:11434/api/version
```

Contributor

The curl command does not work in PowerShell (on Windows OS). You should therefore make this work with the PowerShell `Invoke-RestMethod` command as well.

If you look at the other documents, you will see that `# bash/zsh` and `# PowerShell` are split out for cases like this.
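A minimal sketch of the split the reviewer describes might look like the following; the bash/zsh variant mirrors the block above, and `Invoke-RestMethod` is PowerShell's built-in HTTP client:

```powershell
# PowerShell
# Pull llama3.2 model (recommended)
ollama pull llama3.2
# Verify Ollama is accessible
Invoke-RestMethod -Uri "http://localhost:11434/api/version"
```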

1. Build a container.

```bash
docker build -f Dockerfile -t openchat-playground:latest .
```

1. Run the app.

```bash
# Using command-line arguments
docker run -i --rm -p 8080:8080 \
openchat-playground:latest \
--connector-type Ollama \
--base-url http://host.docker.internal:11434 \
--model llama3.2
```

```bash
# Alternative: Using environment variables
docker run -i --rm -p 8080:8080 \
-e ConnectorType=Ollama \
-e Ollama__BaseUrl=http://host.docker.internal:11434 \
-e Ollama__Model=llama3.2 \
openchat-playground:latest
```

> **NOTE**: Use `host.docker.internal:11434` to connect to Ollama running on the host machine from inside the container. Make sure `OLLAMA_HOST=0.0.0.0:11434` is set on the host.
Contributor

Does OLLAMA_HOST=0.0.0.0:11434 really have to be set on the host machine? What happens if it isn't? I don't remember anything like this coming up when running Hugging Face.

Member

If it isn't set, external apps can't reach Ollama.

This is a long-standing Ollama issue: the app itself sits on 127.0.0.1, so it can't hear requests coming from other networks. The official docs even guide you, per operating system, to serve on 0.0.0.0 to work around the networking issue.
https://docs.ollama.com/faq#how-do-i-configure-ollama-server%3F

More recently, since the app was updated, you also have to enable "expose ollama to the network" in the GUI settings menu.

Since the way these network connectivity issues get resolved will likely keep changing on Ollama's side, it would be fine to leave this out of our docs and just note that readers should refer to the official Ollama documentation.

Contributor

In that case, for the OLLAMA_HOST=0.0.0.0:11434 details and the Ollama UI setting, it would be better for our docs to link there instead. Mentioning host.docker.internal:11434 should be enough on our side.

@name-of-okja did you happen to configure this as well when setting up Hugging Face?

Contributor

@justinyoo

I did not set OLLAMA_HOST separately!

(I installed Ollama on WSL.)

Contributor

Additionally, installing Ollama on Windows and running it without setting OLLAMA_HOST separately also worked without any problems!
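As a quick way to see the behavior discussed in this thread, a minimal check for whether a container can reach the host's Ollama endpoint (assuming Docker 20.10+ and the public curlimages/curl image, both of which are assumptions here):

```bash
# Works from the host even when Ollama only listens on 127.0.0.1
curl http://localhost:11434/api/version

# Works from a container only when Ollama is exposed to the network (e.g. OLLAMA_HOST=0.0.0.0:11434)
docker run --rm --add-host=host.docker.internal:host-gateway \
  curlimages/curl:latest -s http://host.docker.internal:11434/api/version
```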


1. Open your web browser, navigate to `http://localhost:8080`, and enter prompts.

## Run on Azure

1. Make sure you are at the repository root.

```bash
cd $REPOSITORY_ROOT
```

1. Login to Azure.

```bash
# Login to Azure Dev CLI
azd auth login
```

1. Check login status.

```bash
# Azure Dev CLI
azd auth login --check-status
```

1. Initialize `azd` template.

```bash
azd init
```

> **NOTE**: You will be asked to provide environment name for provisioning.

1. Set Ollama configuration to azd environment variables.

**Azure-hosted Ollama**

```bash
# Set connector type to Ollama
azd env set CONNECTOR_TYPE "Ollama"
# Use placeholder URL that will be replaced when Ollama is deployed in Azure
azd env set OLLAMA_BASE_URL "https://{{OLLAMA_PLACEHOLDER}}:11434"
# Set a specific model
azd env set OLLAMA_MODEL "llama3.2"
```

> **NOTE**: The `{{OLLAMA_PLACEHOLDER}}` will be replaced with the actual Ollama service URL when you deploy Ollama in Azure. This allows you to prepare the configuration before the Ollama service is available.
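For instance, once the Azure-hosted Ollama endpoint is known, the same variable can simply be overwritten before provisioning; the URL below is purely hypothetical:

```bash
# Hypothetical endpoint; replace it with the actual Ollama service URL once deployed
azd env set OLLAMA_BASE_URL "https://my-ollama.azurecontainerapps.io:11434"
```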

1. Run the following commands in order to provision and deploy the app.

```bash
azd up
```

> **NOTE**: You will be asked to provide Azure subscription and location for deployment.

1. Clean up all the resources.

```bash
azd down --force --purge
```
4 changes: 4 additions & 0 deletions infra/main.bicep
@@ -24,6 +24,8 @@ param githubModelsToken string = ''
// Hugging Face
param huggingFaceModel string = ''
// Ollama
param ollamaModel string = ''
param ollamaBaseUrl string = ''
Contributor

Same here.

// Anthropic
// LG
// Naver
@@ -69,6 +71,8 @@ module resources 'resources.bicep' = {
githubModelsModel: githubModelsModel
githubModelsToken: githubModelsToken
huggingFaceModel: huggingFaceModel
ollamaModel: ollamaModel
ollamaBaseUrl: ollamaBaseUrl
Contributor

Same here.

openAIModel: openAIModel
openAIApiKey: openAIApiKey
openchatPlaygroundappExists: openchatPlaygroundappExists
6 changes: 6 additions & 0 deletions infra/main.parameters.json
@@ -20,6 +20,12 @@
"huggingFaceModel": {
"value": "${HUGGING_FACE_MODEL}"
},
"ollamaModel": {
"value": "${OLLAMA_MODEL}"
Contributor

"value": "${OLLAMA_MODEL=llama3.2}" 라고 해 줘야 나중에 azd up 실행시킬 때 기본값으로 지정한 모델이 올라갑니다.

},
"ollamaBaseUrl": {
"value": "${OLLAMA_BASE_URL}"
},
Comment on lines 32 to 37
Contributor

Same applies here.

"openAIModel": {
"value": "${OPENAI_MODEL}"
},
66 changes: 44 additions & 22 deletions infra/resources.bicep
@@ -19,6 +19,8 @@ param githubModelsToken string = ''
// Hugging Face
param huggingFaceModel string = ''
// Ollama
param ollamaModel string = ''
param ollamaBaseUrl string = ''
Contributor

Same here.

// Anthropic
// LG
// Naver
@@ -107,16 +109,18 @@ var envConnectorType = connectorType != '' ? [
// Azure AI Foundry
// GitHub Models
var envGitHubModels = (connectorType == '' || connectorType == 'GitHubModels') ? concat(githubModelsModel != '' ? [
{
name: 'GitHubModels__Model'
value: githubModelsModel
}
] : [], [
{
name: 'GitHubModels__Token'
secretRef: 'github-models-token'
}
]) : []
{
name: 'GitHubModels__Model'
value: githubModelsModel
}
] : [],
githubModelsToken != '' ? [
{
name: 'GitHubModels__Token'
secretRef: 'github-models-token'
}
] : []
) : []
// Google Vertex AI
// Docker Model Runner
// Foundry Local
@@ -128,6 +132,20 @@ var envHuggingFace = connectorType == 'HuggingFace' && huggingFaceModel != '' ?
}
] : []
// Ollama
var envOllama = connectorType == 'Ollama' ? concat(
ollamaModel != '' ? [
{
name: 'Ollama__Model'
value: ollamaModel
}
] : [],
ollamaBaseUrl != '' ? [
{
name: 'Ollama__BaseUrl'
value: ollamaBaseUrl
}
] : []
Contributor

It's a minor thing, but since we usually go in the order of base URL first and then model, shall we match that order here?

) : []
// Anthropic
// LG
// Naver
@@ -154,17 +172,20 @@ module openchatPlaygroundapp 'br/public:avm/res/app/container-app:0.18.1' = {
minReplicas: 1
maxReplicas: 10
}
secrets: concat([
{
name: 'github-models-token'
value: githubModelsToken
}
], openAIApiKey != '' ? [
{
name: 'openai-api-key'
value: openAIApiKey
}
] : [])
secrets: concat([],
githubModelsToken != '' ? [
{
name: 'github-models-token'
value: githubModelsToken
}
] : [],
openAIApiKey != '' ? [
{
name: 'openai-api-key'
value: openAIApiKey
}
] : []
)
containers: [
{
image: openchatPlaygroundappFetchLatestImage.outputs.?containers[?0].?image ?? 'mcr.microsoft.com/azuredocs/containerapps-helloworld:latest'
@@ -189,6 +210,7 @@ module openchatPlaygroundapp 'br/public:avm/res/app/container-app:0.18.1' = {
envConnectorType,
envGitHubModels,
envHuggingFace,
envOllama,
envOpenAI)
}
]
@@ -209,4 +231,4 @@ module openchatPlaygroundapp 'br/public:avm/res/app/container-app:0.18.1' = {
}

output AZURE_CONTAINER_REGISTRY_ENDPOINT string = containerRegistry.outputs.loginServer
output AZURE_RESOURCE_OPENCHAT_PLAYGROUNDAPP_ID string = openchatPlaygroundapp.outputs.resourceId
output AZURE_RESOURCE_OPENCHAT_PLAYGROUNDAPP_ID string = openchatPlaygroundapp.outputs.resourceId
@@ -38,6 +38,7 @@ public static async Task<IChatClient> CreateChatClientAsync(AppSettings settings
{
ConnectorType.GitHubModels => new GitHubModelsConnector(settings),
ConnectorType.HuggingFace => new HuggingFaceConnector(settings),
ConnectorType.Ollama => new OllamaConnector(settings),
ConnectorType.OpenAI => new OpenAIConnector(settings),
_ => throw new NotSupportedException($"Connector type '{settings.ConnectorType}' is not supported.")
};
53 changes: 53 additions & 0 deletions src/OpenChat.PlaygroundApp/Connectors/OllamaConnector.cs
@@ -0,0 +1,53 @@
using Microsoft.Extensions.AI;
using OllamaSharp;

using OpenChat.PlaygroundApp.Abstractions;
using OpenChat.PlaygroundApp.Configurations;

namespace OpenChat.PlaygroundApp.Connectors;

/// <summary>
/// This represents the connector entity for Ollama.
/// </summary>
public class OllamaConnector(AppSettings settings) : LanguageModelConnector(settings.Ollama)
{
/// <inheritdoc/>
public override bool EnsureLanguageModelSettingsValid()
{
var settings = this.Settings as OllamaSettings;
if (settings is null)
{
throw new InvalidOperationException("Missing configuration: Ollama.");
}

if (string.IsNullOrWhiteSpace(settings.BaseUrl!.Trim()) == true)
{
throw new InvalidOperationException("Missing configuration: Ollama:BaseUrl.");
}

if (string.IsNullOrWhiteSpace(settings.Model!.Trim()) == true)
{
throw new InvalidOperationException("Missing configuration: Ollama:Model.");
}

return true;
}

/// <inheritdoc/>
public override async Task<IChatClient> GetChatClientAsync()
{
var settings = this.Settings as OllamaSettings;
var baseUrl = settings!.BaseUrl!;
var model = settings!.Model!;

var config = new OllamaApiClient.Configuration
{
Uri = new Uri(baseUrl),
Model = model,
};

var chatClient = new OllamaApiClient(config);

Contributor

You need to insert the following in between here so that the model gets pulled automatically later:

var pulls = chatClient.PullModelAsync(model);
await foreach (var pull in pulls)
{
Console.WriteLine($"Pull status: {pull!.Status}");
}
Console.WriteLine($"The {this._appSettings.ConnectorType} connector created with model: {settings.Model}");

return await Task.FromResult(chatClient).ConfigureAwait(false);
}
}