Connector Implementation & Inheritance Ollama #456 (Open)

donghyeon639 wants to merge 20 commits into aliencube:main from donghyeon639:feat/269-ollama-connector-implementation-clean (base: main)
Commits (20)
- 9598325 Connector Implementation & Inheritance (donghyeon639)
- a1f9683 Update OllamaConnectorTests.cs with latest improvements (donghyeon639)
- 8add602 fix OllamaconnetcorTests.cs (donghyeon639)
- 22eafbd fix local container ollama (donghyeon639)
- ae1f2a5 fix: ollama bicep, ollama.md (donghyeon639)
- 0d42437 conflicts fix and README.MD ollama (donghyeon639)
- 8bb60fa fix resuorces.bicep, mainparemeter.json (donghyeon639)
- 61b73b0 comfilct fix (donghyeon639)
- 791da59 fix conflict and Ollama Tests,ollama.md (donghyeon639)
- f7c1d84 ollama test fix (donghyeon639)
- eebb2cf fix ollama.md Ollamatests (donghyeon639)
- a3a109a conflict fix (donghyeon639)
- e78460d confilct fix (donghyeon639)
- 52417c2 fix ollama test (donghyeon639)
- 226b5fc fix ollama bicep (donghyeon639)
- 68302e9 fix root README.md (donghyeon639)
- a03cb5c fix ollamatest , ollama.md (donghyeon639)
- c1abb53 Merge latest changes from upstream/main and fix OllamaConnectorTests (donghyeon639)
- bf43b02 fix ollamatest, ollama.md (donghyeon639)
- 45ff64b ollama test languagemodel unit test (donghyeon639)
# OpenChat Playground with Ollama

This page describes how to run OpenChat Playground (OCP) with Ollama integration.
## Get the repository root

1. Get the repository root.

    ```bash
    # bash/zsh
    REPOSITORY_ROOT=$(git rev-parse --show-toplevel)
    ```

    ```powershell
    # PowerShell
    $REPOSITORY_ROOT = git rev-parse --show-toplevel
    ```
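If you run this command outside a git clone, `git rev-parse` fails with an error. A minimal sketch of a guard, assuming bash; the fallback to `pwd` is an illustrative assumption, not something the repository prescribes:

```shell
# Hypothetical guard: fall back to the current directory when not inside a git repository
REPOSITORY_ROOT=$(git rev-parse --show-toplevel 2>/dev/null || pwd)
echo "$REPOSITORY_ROOT"
```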
## Run on local machine

1. Make sure you are at the repository root.

    ```bash
    cd $REPOSITORY_ROOT
    ```

1. Make sure Ollama is installed and running on your local machine. If not, install Ollama from [ollama.com](https://ollama.com/) and start the service.

    ```bash
    # Start Ollama service
    ollama serve
    ```

1. Pull the model you want to use. Replace `{{MODEL_NAME}}` with your desired model.

    ```bash
    # Example: Pull llama3.2 model
    ollama pull llama3.2

    # Or pull other models
    ollama pull mistral
    ollama pull phi3
    ```

1. Run the app.

    ```bash
    dotnet run --project $REPOSITORY_ROOT/src/OpenChat.PlaygroundApp -- --connector-type Ollama
    ```

1. Open your web browser, navigate to `http://localhost:5280`, and enter prompts.
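Before starting the app, you can sanity-check that Ollama is actually listening. A quick sketch, assuming Ollama's default port `11434` and that `curl` is available; the fallback message is illustrative only:

```shell
# Query Ollama's model-list endpoint; print a hint if the service is not reachable
status=$(curl -s --max-time 2 http://localhost:11434/api/tags || echo "Ollama not reachable on port 11434")
echo "$status"
```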
## Run in local container

1. Make sure you are at the repository root.

    ```bash
    cd $REPOSITORY_ROOT
    ```

1. Build a container.

    ```bash
    docker build -f Dockerfile -t openchat-playground:latest .
    ```

1. Run the app.

    ```bash
    # From locally built container
    docker run -i --rm -p 8080:8080 openchat-playground:latest --connector-type Ollama --base-url http://host.docker.internal:11434 --model llama3.2
    ```

    > **NOTE**: Use `host.docker.internal:11434` to connect to Ollama running on the host machine from inside the container.

1. Open your web browser, navigate to `http://localhost:8080`, and enter prompts.
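The base URL differs between the two setups because `localhost` inside a container does not refer to the host. A hypothetical helper that picks the right base URL; the `/.dockerenv` probe is a common heuristic for detecting Docker, not something OCP ships:

```shell
# Pick the Ollama base URL: containers must use host.docker.internal to reach the host
if [ -f /.dockerenv ]; then
  OLLAMA_BASE_URL="http://host.docker.internal:11434"
else
  OLLAMA_BASE_URL="http://localhost:11434"
fi
echo "$OLLAMA_BASE_URL"
```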
## Run on Azure

1. Make sure you are at the repository root.

    ```bash
    cd $REPOSITORY_ROOT
    ```

1. Login to Azure.

    ```bash
    # Login to Azure Dev CLI
    azd auth login
    ```

1. Check login status.

    ```bash
    # Azure Dev CLI
    azd auth login --check-status
    ```

1. Initialize the `azd` template.

    ```bash
    azd init
    ```

    > **NOTE**: You will be asked to provide an environment name for provisioning.

1. Set the Ollama configuration as azd environment variables.

    ```bash
    # Set connector type to Ollama
    azd env set CONNECTOR_TYPE "Ollama"

    # Optionally, set a specific model (default is llama3.2)
    azd env set OLLAMA_MODEL "llama3.2"
    ```

1. Run the following commands in order to provision and deploy the app.

    ```bash
    azd up
    ```

    > **NOTE**: You will be asked to provide an Azure subscription and location for deployment.

1. Clean up all the resources.

    ```bash
    azd down --force --purge
    ```
using Microsoft.Extensions.AI;
using OllamaSharp;

using OpenChat.PlaygroundApp.Abstractions;
using OpenChat.PlaygroundApp.Configurations;

namespace OpenChat.PlaygroundApp.Connectors;

/// <summary>
/// This represents the connector entity for Ollama.
/// </summary>
public class OllamaConnector(AppSettings settings) : LanguageModelConnector(settings.Ollama)
{
    /// <inheritdoc/>
    public override bool EnsureLanguageModelSettingsValid()
    {
        var settings = this.Settings as OllamaSettings;
        if (settings is null)
        {
            throw new InvalidOperationException("Missing configuration: Ollama.");
        }

        // IsNullOrWhiteSpace already covers null, empty, and whitespace-only values;
        // calling .Trim() on the value first would throw if the value were null.
        if (string.IsNullOrWhiteSpace(settings.BaseUrl))
        {
            throw new InvalidOperationException("Missing configuration: Ollama:BaseUrl.");
        }

        if (string.IsNullOrWhiteSpace(settings.Model))
        {
            throw new InvalidOperationException("Missing configuration: Ollama:Model.");
        }

        return true;
    }

    /// <inheritdoc/>
    public override async Task<IChatClient> GetChatClientAsync()
    {
        var settings = this.Settings as OllamaSettings;
        var baseUrl = settings!.BaseUrl!;
        var model = settings!.Model!;

        var config = new OllamaApiClient.Configuration
        {
            Uri = new Uri(baseUrl),
            Model = model,
        };

        var chatClient = new OllamaApiClient(config);

        return await Task.FromResult(chatClient).ConfigureAwait(false);
    }
}
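The connector validates settings in a fixed order: the settings object itself, then `BaseUrl`, then `Model`. That order can be mirrored outside .NET for quick manual checks of a configuration. A hypothetical bash sketch; the function name and messages only echo the connector's behaviour:

```shell
# Hypothetical mirror of OllamaConnector's settings validation, for manual checks
validate_ollama_settings() {
    base_url="$1"
    model="$2"
    # Treat empty or whitespace-only values as missing, like string.IsNullOrWhiteSpace
    if [ -z "$(echo "$base_url" | tr -d '[:space:]')" ]; then
        echo "Missing configuration: Ollama:BaseUrl"; return 1
    fi
    if [ -z "$(echo "$model" | tr -d '[:space:]')" ]; then
        echo "Missing configuration: Ollama:Model"; return 1
    fi
    echo "OK"
}

result=$(validate_ollama_settings "http://localhost:11434" "llama3.2")
echo "$result"
```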