Connector Implementation & Inheritance Ollama #456
Changes from 8 commits
@@ -2,4 +2,5 @@

 - [GitHub Models](github-models.md)
 - [Hugging Face](hugging-face.md)
+- [Ollama](ollama.md)
 - [OpenAI](openai.md)
@@ -0,0 +1,179 @@

# OpenChat Playground with Ollama

This page describes how to run OpenChat Playground (OCP) with Ollama integration.

## Get the repository root

1. Get the repository root.

    ```bash
    # bash/zsh
    REPOSITORY_ROOT=$(git rev-parse --show-toplevel)
    ```

    ```powershell
    # PowerShell
    $REPOSITORY_ROOT = git rev-parse --show-toplevel
    ```
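If the command above is run outside a git checkout, `git rev-parse` fails with an error. A defensive variant (a sketch, not part of the project's documented steps) falls back to the current directory:

```shell
# Resolve the repository root; fall back to the current directory
# when not inside a git repository.
REPOSITORY_ROOT=$(git rev-parse --show-toplevel 2>/dev/null || pwd)
echo "$REPOSITORY_ROOT"
```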
## Run on local machine

1. Make sure you are at the repository root.

    ```bash
    cd $REPOSITORY_ROOT
    ```

1. Make sure Ollama is installed and running on your local machine. If not, install Ollama from [ollama.com](https://ollama.com/) and start the service.

    ```bash
    # Start Ollama service
    ollama serve
    ```

1. Pull the model you want to use. Replace `{{MODEL_NAME}}` with your desired model.

    ```bash
    # Example: pull the llama3.2 model
    ollama pull llama3.2

    # Or pull other models
    ollama pull mistral
    ollama pull phi3
    ollama pull qwen
    ```
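Re-pulling an existing model is harmless, but you can skip the network round-trip by checking whether a model is already present. This sketch assumes `ollama list` prints the model name in the first column of each line; the sample output below is illustrative:

```shell
# Succeeds when a model name appears in the first column of
# `ollama list`-style output (NAME  ID  SIZE ...).
model_present() {
  printf '%s\n' "$1" | awk '{print $1}' | grep -qx "$2"
}

# Illustrative sample; in practice capture: ollama list
sample="NAME ID SIZE
llama3.2:latest a80c4f17acd5 2.0GB
mistral:latest 2ae6f6dd7a3d 4.1GB"

if model_present "$sample" "llama3.2:latest"; then
  echo "already pulled"
else
  ollama pull llama3.2
fi
```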
1. Run the app.

    ```bash
    dotnet run --project $REPOSITORY_ROOT/src/OpenChat.PlaygroundApp -- --connector-type Ollama --model llama3.2
    ```

1. Open your web browser, navigate to `http://localhost:5280`, and enter prompts.

## Run in local container

This approach runs OpenChat Playground in a container while connecting to Ollama running on the host machine.

1. Make sure you are at the repository root.

    ```bash
    cd $REPOSITORY_ROOT
    ```

1. Configure Ollama to accept connections from containers.

    ```powershell
    # PowerShell (Windows)
    $env:OLLAMA_HOST = "0.0.0.0:11434"

    # Start Ollama service
    ollama serve
    ```

    ```bash
    # bash/zsh (Linux/macOS)
    export OLLAMA_HOST=0.0.0.0:11434
    ollama serve
    ```

1. Pull the model you want to use.

    ```bash
    # Pull llama3.2 model (recommended)
    ollama pull llama3.2

    # Verify Ollama is accessible
    curl http://localhost:11434/api/version
    ```

1. Build a container.

    ```bash
    docker build -f Dockerfile -t openchat-playground:latest .
    ```

1. Run the app.

    ```bash
    # Using command-line arguments
    docker run -i --rm -p 8080:8080 \
        openchat-playground:latest \
        --connector-type Ollama \
        --base-url http://host.docker.internal:11434 \
        --model llama3.2
    ```

    ```bash
    # Alternative: using environment variables
    docker run -i --rm -p 8080:8080 \
        -e ConnectorType=Ollama \
        -e Ollama__BaseUrl=http://host.docker.internal:11434 \
        -e Ollama__Model=llama3.2 \
        openchat-playground:latest
    ```
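The two `docker run` variants above are equivalent: .NET configuration maps the `:` separator in keys such as `Ollama:BaseUrl` to a double underscore in environment variable names, which is why the variables are spelled `Ollama__BaseUrl` and `Ollama__Model`. The mapping can be sketched as:

```shell
# Convert a .NET configuration key to its environment-variable
# name: each ':' section separator becomes '__'.
to_env_key() {
  printf '%s\n' "$1" | sed 's/:/__/g'
}

to_env_key "Ollama:BaseUrl"   # prints: Ollama__BaseUrl
to_env_key "Ollama:Model"     # prints: Ollama__Model
```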
> **NOTE**: Use `host.docker.internal:11434` to connect to Ollama running on the host machine from inside the container. Make sure `OLLAMA_HOST=0.0.0.0:11434` is set on the host.

1. Open your web browser, navigate to `http://localhost:8080`, and enter prompts.

## Run on Azure

1. Make sure you are at the repository root.

    ```bash
    cd $REPOSITORY_ROOT
    ```

1. Log in to Azure.

    ```bash
    # Log in with Azure Dev CLI
    azd auth login
    ```

1. Check login status.

    ```bash
    # Azure Dev CLI
    azd auth login --check-status
    ```

1. Initialize the `azd` template.

    ```bash
    azd init
    ```

    > **NOTE**: You will be asked to provide an environment name for provisioning.

1. Set the Ollama configuration as `azd` environment variables.

    **Azure-hosted Ollama**

    ```bash
    # Set connector type to Ollama
    azd env set CONNECTOR_TYPE "Ollama"

    # Use a placeholder URL that will be replaced when Ollama is deployed in Azure
    azd env set OLLAMA_BASE_URL "https://{{OLLAMA_PLACEHOLDER}}:11434"

    # Set a specific model
    azd env set OLLAMA_MODEL "llama3.2"
    ```

    > **NOTE**: The `{{OLLAMA_PLACEHOLDER}}` will be replaced with the actual Ollama service URL when you deploy Ollama in Azure. This allows you to prepare the configuration before the Ollama service is available.
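To double-check what was stored, `azd env get-values` prints the current environment as `KEY="value"` lines, which can be filtered for the Ollama settings. The sample output below is illustrative, not from a real deployment:

```shell
# Illustrative `azd env get-values` output; in practice pipe the
# real command into the same grep:
#   azd env get-values | grep -E '^(CONNECTOR_TYPE|OLLAMA_)'
values='AZURE_ENV_NAME="dev"
CONNECTOR_TYPE="Ollama"
OLLAMA_BASE_URL="https://{{OLLAMA_PLACEHOLDER}}:11434"
OLLAMA_MODEL="llama3.2"'

printf '%s\n' "$values" | grep -E '^(CONNECTOR_TYPE|OLLAMA_)'
```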
1. Run the following commands in order to provision and deploy the app.

    ```bash
    azd up
    ```

    > **NOTE**: You will be asked to provide an Azure subscription and location for deployment.

1. Clean up all the resources.

    ```bash
    azd down --force --purge
    ```
@@ -24,6 +24,8 @@ param githubModelsToken string = ''

 // Hugging Face
 param huggingFaceModel string = ''
 // Ollama
+param ollamaModel string = ''
+param ollamaBaseUrl string = ''

 // Anthropic
 // LG
 // Naver

@@ -69,6 +71,8 @@ module resources 'resources.bicep' = {
     githubModelsModel: githubModelsModel
     githubModelsToken: githubModelsToken
     huggingFaceModel: huggingFaceModel
+    ollamaModel: ollamaModel
+    ollamaBaseUrl: ollamaBaseUrl

     openAIModel: openAIModel
     openAIApiKey: openAIApiKey
     openchatPlaygroundappExists: openchatPlaygroundappExists
@@ -20,6 +20,12 @@

     "huggingFaceModel": {
       "value": "${HUGGING_FACE_MODEL}"
     },
+    "ollamaModel": {
+      "value": "${OLLAMA_MODEL}"
+    },
+    "ollamaBaseUrl": {
+      "value": "${OLLAMA_BASE_URL}"
+    },
     "openAIModel": {
       "value": "${OPENAI_MODEL}"
     },
@@ -19,6 +19,8 @@ param githubModelsToken string = ''

 // Hugging Face
 param huggingFaceModel string = ''
 // Ollama
+param ollamaModel string = ''
+param ollamaBaseUrl string = ''

 // Anthropic
 // LG
 // Naver

@@ -107,16 +109,18 @@ var envConnectorType = connectorType != '' ? [

 // Azure AI Foundry
 // GitHub Models
 var envGitHubModels = (connectorType == '' || connectorType == 'GitHubModels') ? concat(githubModelsModel != '' ? [
-  {
-    name: 'GitHubModels__Model'
-    value: githubModelsModel
-  }
-] : [], [
-  {
-    name: 'GitHubModels__Token'
-    secretRef: 'github-models-token'
-  }
-]) : []
+  {
+    name: 'GitHubModels__Model'
+    value: githubModelsModel
+  }
+] : [],
+githubModelsToken != '' ? [
+  {
+    name: 'GitHubModels__Token'
+    secretRef: 'github-models-token'
+  }
+] : []
+) : []
 // Google Vertex AI
 // Docker Model Runner
 // Foundry Local

@@ -128,6 +132,20 @@ var envHuggingFace = connectorType == 'HuggingFace' && huggingFaceModel != '' ?

   }
 ] : []
 // Ollama
+var envOllama = connectorType == 'Ollama' ? concat(
+  ollamaModel != '' ? [
+    {
+      name: 'Ollama__Model'
+      value: ollamaModel
+    }
+  ] : [],
+  ollamaBaseUrl != '' ? [
+    {
+      name: 'Ollama__BaseUrl'
+      value: ollamaBaseUrl
+    }
+  ] : []
+) : []
 // Anthropic
 // LG
 // Naver

@@ -154,17 +172,20 @@ module openchatPlaygroundapp 'br/public:avm/res/app/container-app:0.18.1' = {

       minReplicas: 1
       maxReplicas: 10
     }
-    secrets: concat([
-      {
-        name: 'github-models-token'
-        value: githubModelsToken
-      }
-    ], openAIApiKey != '' ? [
-      {
-        name: 'openai-api-key'
-        value: openAIApiKey
-      }
-    ] : [])
+    secrets: concat([],
+      githubModelsToken != '' ? [
+        {
+          name: 'github-models-token'
+          value: githubModelsToken
+        }
+      ] : [],
+      openAIApiKey != '' ? [
+        {
+          name: 'openai-api-key'
+          value: openAIApiKey
+        }
+      ] : []
+    )
     containers: [
       {
         image: openchatPlaygroundappFetchLatestImage.outputs.?containers[?0].?image ?? 'mcr.microsoft.com/azuredocs/containerapps-helloworld:latest'

@@ -189,6 +210,7 @@ module openchatPlaygroundapp 'br/public:avm/res/app/container-app:0.18.1' = {

           envConnectorType,
           envGitHubModels,
           envHuggingFace,
+          envOllama,
           envOpenAI)
       }
     ]

@@ -209,4 +231,4 @@ module openchatPlaygroundapp 'br/public:avm/res/app/container-app:0.18.1' = {

 }

 output AZURE_CONTAINER_REGISTRY_ENDPOINT string = containerRegistry.outputs.loginServer
-output AZURE_RESOURCE_OPENCHAT_PLAYGROUNDAPP_ID string = openchatPlaygroundapp.outputs.resourceId
+output AZURE_RESOURCE_OPENCHAT_PLAYGROUNDAPP_ID string = openchatPlaygroundapp.outputs.resourceId
@@ -0,0 +1,53 @@

using Microsoft.Extensions.AI;
using OllamaSharp;

using OpenChat.PlaygroundApp.Abstractions;
using OpenChat.PlaygroundApp.Configurations;

namespace OpenChat.PlaygroundApp.Connectors;

/// <summary>
/// This represents the connector entity for Ollama.
/// </summary>
public class OllamaConnector(AppSettings settings) : LanguageModelConnector(settings.Ollama)
{
    /// <inheritdoc/>
    public override bool EnsureLanguageModelSettingsValid()
    {
        var settings = this.Settings as OllamaSettings;
        if (settings is null)
        {
            throw new InvalidOperationException("Missing configuration: Ollama.");
        }

        // IsNullOrWhiteSpace covers null, empty and whitespace-only values,
        // so no null-forgiving Trim() call is needed (calling Trim() on a
        // null BaseUrl would throw NullReferenceException before this check).
        if (string.IsNullOrWhiteSpace(settings.BaseUrl))
        {
            throw new InvalidOperationException("Missing configuration: Ollama:BaseUrl.");
        }

        if (string.IsNullOrWhiteSpace(settings.Model))
        {
            throw new InvalidOperationException("Missing configuration: Ollama:Model.");
        }

        return true;
    }

    /// <inheritdoc/>
    public override async Task<IChatClient> GetChatClientAsync()
    {
        var settings = this.Settings as OllamaSettings;
        var baseUrl = settings!.BaseUrl!;
        var model = settings!.Model!;

        var config = new OllamaApiClient.Configuration
        {
            Uri = new Uri(baseUrl),
            Model = model,
        };

        var chatClient = new OllamaApiClient(config);

        return await Task.FromResult(chatClient).ConfigureAwait(false);
    }
}

Review comment: an automatic model-pull step should be added between creating the client and returning it, so the model gets pulled automatically later. See `HuggingFaceConnector.cs`, lines 64 to 70, in open-chat-playground for the pattern.
Review comment: Besides this `README.md` file, you also need to update the `README.md` at the repository root.