129 changes: 129 additions & 0 deletions docs/ollama.md
@@ -0,0 +1,129 @@
# OpenChat Playground with Ollama

This page describes how to run OpenChat Playground (OCP) with Ollama integration.

## Get the repository root

1. Get the repository root.

```bash
# bash/zsh
REPOSITORY_ROOT=$(git rev-parse --show-toplevel)
```

```powershell
# PowerShell
$REPOSITORY_ROOT = git rev-parse --show-toplevel
```

## Run on local machine

1. Make sure you are at the repository root.

```bash
cd $REPOSITORY_ROOT
```

1. Make sure Ollama is installed and running on your local machine. If not, install Ollama from [ollama.com](https://ollama.com/) and start the service.

```bash
# Start Ollama service
ollama serve
```
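
   To confirm the Ollama service is reachable before continuing, you can query its local API. This is a quick sanity check assuming Ollama listens on its default port, `11434`.

```bash
# List the models currently available to the local Ollama instance
curl http://localhost:11434/api/tags
```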

1. Pull the model you want to use. Replace `llama3.2` in the examples below with your desired model.

```bash
# Example: Pull llama3.2 model
ollama pull llama3.2

# Or pull other models
ollama pull mistral
ollama pull phi3
```
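
   To verify the model was downloaded and is now available locally, list the models Ollama knows about:

```bash
# Show all models stored by the local Ollama instance
ollama list
```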

1. Run the app.

```bash
dotnet run --project $REPOSITORY_ROOT/src/OpenChat.PlaygroundApp -- --connector-type Ollama
```
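
   If your Ollama instance listens on a non-default endpoint, or you want to pin a specific model, the `--base-url` and `--model` options shown in the container section below can presumably be passed here as well; the values in this sketch are illustrative.

```bash
# Example with an explicit Ollama endpoint and model (adjust to your setup)
dotnet run --project $REPOSITORY_ROOT/src/OpenChat.PlaygroundApp -- \
    --connector-type Ollama \
    --base-url http://localhost:11434 \
    --model llama3.2
```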

1. Open your web browser, navigate to `http://localhost:5280`, and enter prompts.

## Run in local container

1. Make sure you are at the repository root.

```bash
cd $REPOSITORY_ROOT
```

1. Build a container.

```bash
docker build -f Dockerfile -t openchat-playground:latest .
```

1. Run the app.

```bash
# From locally built container
docker run -i --rm -p 8080:8080 openchat-playground:latest --connector-type Ollama --base-url http://host.docker.internal:11434 --model llama3.2
```

> **NOTE**: Use `host.docker.internal:11434` to connect to Ollama running on the host machine from inside the container.
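
   On Linux, `host.docker.internal` may not resolve by default. One option, assuming Docker 20.10 or later, is to map it to the host gateway explicitly; the `--add-host` flag is the only change from the command above.

```bash
# Map host.docker.internal to the host gateway (Linux hosts)
docker run -i --rm -p 8080:8080 --add-host=host.docker.internal:host-gateway openchat-playground:latest --connector-type Ollama --base-url http://host.docker.internal:11434 --model llama3.2
```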

1. Open your web browser, navigate to `http://localhost:8080`, and enter prompts.

## Run on Azure

1. Make sure you are at the repository root.

```bash
cd $REPOSITORY_ROOT
```

1. Login to Azure.

```bash
# Login to Azure Dev CLI
azd auth login
```

1. Check login status.

```bash
# Azure Dev CLI
azd auth login --check-status
```

1. Initialize `azd` template.

```bash
azd init
```

> **NOTE**: You will be asked to provide an environment name for provisioning.

1. Set the Ollama configuration as azd environment variables.

```bash
# Set connector type to Ollama
azd env set CONNECTOR_TYPE "Ollama"

# Optionally, set a specific model (default is llama3.2)
azd env set OLLAMA_MODEL "llama3.2"
```
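
   You can check the values stored in the azd environment before provisioning:

```bash
# Print all key/value pairs in the current azd environment
azd env get-values
```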

1. Run the following command to provision and deploy the app.

```bash
azd up
```

> **NOTE**: You will be asked to provide an Azure subscription and location for deployment.
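
   If you prefer to run the two phases separately, `azd up` is roughly equivalent to provisioning the infrastructure first and then deploying the app:

```bash
# Provision the Azure resources, then deploy the app to them
azd provision
azd deploy
```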

1. Clean up all the resources.

```bash
azd down --force --purge
```
2 changes: 2 additions & 0 deletions infra/main.bicep
@@ -23,6 +23,7 @@ param githubModelsToken string = ''
// Foundry Local
// Hugging Face
// Ollama
param ollamaModel string = ''
// Anthropic
// LG
// Naver
@@ -64,6 +65,7 @@ module resources 'resources.bicep' = {
connectorType: connectorType
githubModelsModel: githubModelsModel
githubModelsToken: githubModelsToken
ollamaModel: ollamaModel
openchatPlaygroundappExists: openchatPlaygroundappExists
}
}
3 changes: 3 additions & 0 deletions infra/main.parameters.json
@@ -17,6 +17,9 @@
"githubModelsToken": {
"value": "${GH_MODELS_TOKEN}"
},
"ollamaModel": {
"value": "${OLLAMA_MODEL}"
> **Reviewer comment (Contributor):** This should be `"value": "${OLLAMA_MODEL=llama3.2}"` so that the model specified as the default is deployed when `azd up` runs later.

},
"openchatPlaygroundappExists": {
"value": "${SERVICE_OPENCHAT_PLAYGROUNDAPP_RESOURCE_EXISTS=false}"
},
12 changes: 10 additions & 2 deletions infra/resources.bicep
@@ -18,6 +18,7 @@ param githubModelsToken string = ''
// Foundry Local
// Hugging Face
// Ollama
param ollamaModel string = ''
// Anthropic
// LG
// Naver
@@ -118,6 +119,12 @@ var envGitHubModels = (connectorType == '' || connectorType == 'GitHubModels') ?
// Foundry Local
// Hugging Face
// Ollama
var envOllama = (connectorType == '' || connectorType == 'Ollama') ? (ollamaModel != '' ? [
{
name: 'Ollama__Model'
value: ollamaModel
}
] : []) : []
// Anthropic
// LG
// Naver
@@ -161,7 +168,8 @@ module openchatPlaygroundapp 'br/public:avm/res/app/container-app:0.18.1' = {
value: '8080'
}],
envConnectorType,
envGitHubModels)
envGitHubModels,
envOllama)
}
]
managedIdentities:{
@@ -181,4 +189,4 @@ module openchatPlaygroundapp 'br/public:avm/res/app/container-app:0.18.1' = {
}

output AZURE_CONTAINER_REGISTRY_ENDPOINT string = containerRegistry.outputs.loginServer
output AZURE_RESOURCE_OPENCHAT_PLAYGROUNDAPP_ID string = openchatPlaygroundapp.outputs.resourceId
output AZURE_RESOURCE_OPENCHAT_PLAYGROUNDAPP_ID string = openchatPlaygroundapp.outputs.resourceId
@@ -37,6 +37,7 @@ public static async Task<IChatClient> CreateChatClientAsync(AppSettings settings
LanguageModelConnector connector = settings.ConnectorType switch
{
ConnectorType.GitHubModels => new GitHubModelsConnector(settings),
ConnectorType.Ollama => new OllamaConnector(settings),
ConnectorType.OpenAI => new OpenAIConnector(settings),
_ => throw new NotSupportedException($"Connector type '{settings.ConnectorType}' is not supported.")
};
53 changes: 53 additions & 0 deletions src/OpenChat.PlaygroundApp/Connectors/OllamaConnector.cs
@@ -0,0 +1,53 @@
using Microsoft.Extensions.AI;
using OllamaSharp;

using OpenChat.PlaygroundApp.Abstractions;
using OpenChat.PlaygroundApp.Configurations;

namespace OpenChat.PlaygroundApp.Connectors;

/// <summary>
/// This represents the connector entity for Ollama.
/// </summary>
public class OllamaConnector(AppSettings settings) : LanguageModelConnector(settings.Ollama)
{
/// <inheritdoc/>
public override bool EnsureLanguageModelSettingsValid()
{
var settings = this.Settings as OllamaSettings;
if (settings is null)
{
throw new InvalidOperationException("Missing configuration: Ollama.");
}

if (string.IsNullOrWhiteSpace(settings.BaseUrl))
{
throw new InvalidOperationException("Missing configuration: Ollama:BaseUrl.");
}

if (string.IsNullOrWhiteSpace(settings.Model))
{
throw new InvalidOperationException("Missing configuration: Ollama:Model.");
}

return true;
}

/// <inheritdoc/>
public override async Task<IChatClient> GetChatClientAsync()
{
var settings = this.Settings as OllamaSettings;
var baseUrl = settings!.BaseUrl!;
var model = settings!.Model!;

var config = new OllamaApiClient.Configuration
{
Uri = new Uri(baseUrl),
Model = model,
};

var chatClient = new OllamaApiClient(config);

> **Reviewer comment (Contributor):** Insert the following between these two statements so that the model is pulled automatically later:

var pulls = chatClient.PullModelAsync(model);
await foreach (var pull in pulls)
{
Console.WriteLine($"Pull status: {pull!.Status}");
}
Console.WriteLine($"The {this._appSettings.ConnectorType} connector created with model: {settings.Model}");

return await Task.FromResult(chatClient).ConfigureAwait(false);
}
}
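
For reference, the connector above validates the `Ollama:BaseUrl` and `Ollama:Model` settings named in its exception messages. Below is a minimal sketch of supplying them as environment variables, assuming standard .NET configuration binding where `__` maps to the `:` separator (consistent with the `Ollama__Model` variable set in `resources.bicep`); the values are illustrative.

```bash
# Illustrative values only; adjust to your environment
export Ollama__BaseUrl="http://localhost:11434"
export Ollama__Model="llama3.2"
```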