# Connector Implementation & Inheritance Ollama #456

@@ -2,4 +2,5 @@
- [GitHub Models](github-models.md)
- [Hugging Face](hugging-face.md)
- [Ollama](ollama.md)
- [OpenAI](openai.md)

@@ -0,0 +1,179 @@

# OpenChat Playground with Ollama

This page describes how to run OpenChat Playground (OCP) with Ollama integration.

## Get the repository root

1. Get the repository root.

```bash
# bash/zsh
REPOSITORY_ROOT=$(git rev-parse --show-toplevel)
```

```powershell
# PowerShell
$REPOSITORY_ROOT = git rev-parse --show-toplevel
```

## Run on local machine

1. Make sure you are at the repository root.

```bash
cd $REPOSITORY_ROOT
```

1. Make sure Ollama is installed and running on your local machine. If not, install Ollama from [ollama.com](https://ollama.com/) and start the service.

```bash
# Start Ollama service
ollama serve
```

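A quick optional check, not part of the original steps: asking the CLI for its version confirms that Ollama is installed, and it also reports the server version when the service is reachable.

```bash
# Prints the client version; includes the server version if the service is running
ollama --version
```
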
1. Pull the model you want to use. The examples below use `llama3.2`; substitute any model name that Ollama supports.

```bash
# Example: pull the llama3.2 model
ollama pull llama3.2

# Or pull other models
ollama pull mistral
ollama pull phi3
ollama pull qwen
```

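To double-check which models are available locally before running the app, you can list them (an optional step, not in the original instructions):

```bash
# Show models that have been pulled to this machine
ollama list
```
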
1. Run the app.

```bash
dotnet run --project $REPOSITORY_ROOT/src/OpenChat.PlaygroundApp -- --connector-type Ollama --model llama3.2
```

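As an alternative to command-line arguments, the same settings can likely be supplied through .NET configuration environment variables; the names below are inferred from the container example later on this page, so treat this as a sketch rather than documented behaviour.

```bash
# Assumed equivalents of --connector-type, --base-url and --model
export ConnectorType=Ollama
export Ollama__BaseUrl=http://localhost:11434
export Ollama__Model=llama3.2

dotnet run --project $REPOSITORY_ROOT/src/OpenChat.PlaygroundApp
```
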
1. Open your web browser, navigate to `http://localhost:5280`, and enter prompts.

## Run in local container

This approach runs OpenChat Playground in a container while connecting to Ollama running on the host machine.

1. Make sure you are at the repository root.

```bash
cd $REPOSITORY_ROOT
```

1. Configure Ollama to accept connections from containers.

```powershell
# PowerShell (Windows)
$env:OLLAMA_HOST = "0.0.0.0:11434"

# Start Ollama service
ollama serve
```

```bash
# bash/zsh (Linux/macOS)
export OLLAMA_HOST=0.0.0.0:11434
ollama serve
```

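To confirm the server is now bound to all interfaces rather than only loopback, you can inspect the listening socket (Linux tooling shown; on macOS, `lsof -i :11434` gives similar information):

```bash
# The local address should show 0.0.0.0:11434 (or *:11434), not 127.0.0.1:11434
ss -ltn 'sport = :11434'
```
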
1. Pull the model you want to use.

```bash
# Pull llama3.2 model (recommended)
ollama pull llama3.2

# Verify Ollama is accessible
curl http://localhost:11434/api/version
```

> **Review comment (tae0y, translated from Korean):** Looking at the other documents, in cases like this… this part should be changed to `Invoke-RestMethod`. PowerShell does automatically substitute `Invoke-RestMethod` when you run `curl`, but…

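Following the review comment above, a PowerShell-native version of the verification step might look like the following; this is a sketch of the reviewer's suggestion rather than part of the original document.

```powershell
# Verify Ollama is accessible without relying on PowerShell's curl alias
Invoke-RestMethod -Uri http://localhost:11434/api/version
```
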
1. Build a container.

```bash
docker build -f Dockerfile -t openchat-playground:latest .
```

1. Run the app.

```bash
# Using command-line arguments
docker run -i --rm -p 8080:8080 \
    openchat-playground:latest \
    --connector-type Ollama \
    --base-url http://host.docker.internal:11434 \
    --model llama3.2
```

```bash
# Alternative: Using environment variables
docker run -i --rm -p 8080:8080 \
    -e ConnectorType=Ollama \
    -e Ollama__BaseUrl=http://host.docker.internal:11434 \
    -e Ollama__Model=llama3.2 \
    openchat-playground:latest
```

> **NOTE**: Use `host.docker.internal:11434` to connect to Ollama running on the host machine from inside the container. Make sure `OLLAMA_HOST=0.0.0.0:11434` is set on the host.

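One caveat worth noting: on Linux, `host.docker.internal` is not defined by default. Recent Docker releases let you map it to the host gateway explicitly, so the same run command would become (a sketch, assuming Docker 20.10 or later):

```bash
# Linux: make host.docker.internal resolve to the host's gateway address
docker run -i --rm -p 8080:8080 \
    --add-host=host.docker.internal:host-gateway \
    openchat-playground:latest \
    --connector-type Ollama \
    --base-url http://host.docker.internal:11434 \
    --model llama3.2
```
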
1. Open your web browser, navigate to `http://localhost:8080`, and enter prompts.

## Run on Azure

1. Make sure you are at the repository root.

```bash
cd $REPOSITORY_ROOT
```

1. Login to Azure.

```bash
# Login to Azure Dev CLI
azd auth login
```

1. Check login status.

```bash
# Azure Dev CLI
azd auth login --check-status
```

1. Initialize `azd` template.

```bash
azd init
```

> **NOTE**: You will be asked to provide an environment name for provisioning.

1. Set the Ollama configuration as azd environment variables.

**Azure-hosted Ollama**

```bash
# Set connector type to Ollama
azd env set CONNECTOR_TYPE "Ollama"

# Use placeholder URL that will be replaced when Ollama is deployed in Azure
azd env set OLLAMA_BASE_URL "https://{{OLLAMA_URL}}:11434"

# Set a specific model
azd env set OLLAMA_MODEL "llama3.2"
```

> **NOTE**: The `{{OLLAMA_URL}}` placeholder will be replaced with the actual Ollama service URL when you deploy Ollama in Azure. This allows you to prepare the configuration before the Ollama service is available.

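Before provisioning, you can confirm the values were stored as expected by printing the current azd environment:

```bash
# Show all values in the current azd environment
azd env get-values
```
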
1. Run the following command to provision and deploy the app.

```bash
azd up
```

> **NOTE**: You will be asked to provide an Azure subscription and location for deployment.

1. Clean up all the resources.

```bash
azd down --force --purge
```

@@ -0,0 +1,54 @@

```csharp
using Microsoft.Extensions.AI;

using OllamaSharp;

using OpenChat.PlaygroundApp.Abstractions;
using OpenChat.PlaygroundApp.Configurations;

namespace OpenChat.PlaygroundApp.Connectors;

/// <summary>
/// This represents the connector entity for Ollama.
/// </summary>
public class OllamaConnector(AppSettings settings) : LanguageModelConnector(settings.Ollama)
{
    /// <inheritdoc/>
    public override bool EnsureLanguageModelSettingsValid()
    {
        var settings = this.Settings as OllamaSettings;
        if (settings is null)
        {
            throw new InvalidOperationException("Missing configuration: Ollama.");
        }

        // IsNullOrWhiteSpace already covers null, empty and whitespace-only values;
        // calling .Trim() on the value first would throw NullReferenceException when it is null.
        if (string.IsNullOrWhiteSpace(settings.BaseUrl) == true)
        {
            throw new InvalidOperationException("Missing configuration: Ollama:BaseUrl.");
        }

        if (string.IsNullOrWhiteSpace(settings.Model) == true)
        {
            throw new InvalidOperationException("Missing configuration: Ollama:Model.");
        }

        return true;
    }

    /// <inheritdoc/>
    public override async Task<IChatClient> GetChatClientAsync()
    {
        var settings = this.Settings as OllamaSettings;
        var baseUrl = settings!.BaseUrl!;
        var model = settings!.Model!;

        var config = new OllamaApiClient.Configuration
        {
            Uri = new Uri(baseUrl),
            Model = model,
        };

        var chatClient = new OllamaApiClient(config);

        return await Task.FromResult(chatClient).ConfigureAwait(false);
    }
}
```

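For context, here is a minimal sketch of how this connector might be exercised end to end. The object initializer assumes `AppSettings.Ollama` is a settable `OllamaSettings` with `BaseUrl` and `Model` properties (implied by the connector code but not shown in this PR), and the final call assumes the `GetResponseAsync` extension method from Microsoft.Extensions.AI.

```csharp
using Microsoft.Extensions.AI;

using OpenChat.PlaygroundApp.Configurations;
using OpenChat.PlaygroundApp.Connectors;

// Hypothetical wiring - the property shapes are assumptions for illustration.
var settings = new AppSettings
{
    Ollama = new OllamaSettings
    {
        BaseUrl = "http://localhost:11434",
        Model = "llama3.2",
    },
};

var connector = new OllamaConnector(settings);

// Throws InvalidOperationException if BaseUrl or Model is missing.
connector.EnsureLanguageModelSettingsValid();

// OllamaApiClient implements IChatClient, so it plugs into Microsoft.Extensions.AI.
IChatClient chatClient = await connector.GetChatClientAsync();
var response = await chatClient.GetResponseAsync("Why is the sky blue?");
Console.WriteLine(response.Text);
```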