Local LLM support #624
its-pixel-poet started this conversation in Ideas
Replies: 1 comment
-
I am not sure, but is it possible to use a locally running LLM model with this plugin?
-
Yes, it supports Ollama and any other locally run LLM that exposes an OpenAI-compatible API. For the suggestion service, you need to use the CustomSuggestionServiceForCopilotForXcode extension. A quick way to sanity-check the local endpoint is shown in the sketch below.
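Here is a minimal sketch, not from the thread itself, of how you might verify that a local Ollama server is reachable over its OpenAI-compatible API before pointing the extension at it. It assumes Ollama is running on its default port (11434, with the OpenAI-compatible API under `/v1`) and that a model named `llama3` has already been pulled; substitute whatever model you actually use.

```swift
import Foundation

// Smoke-test a local Ollama server over its OpenAI-compatible API.
// Assumption: Ollama is running locally on its default port 11434.
let url = URL(string: "http://localhost:11434/v1/chat/completions")!
var request = URLRequest(url: url)
request.httpMethod = "POST"
request.setValue("application/json", forHTTPHeaderField: "Content-Type")

// "llama3" is an assumed model name; use whatever `ollama pull` fetched.
let body: [String: Any] = [
    "model": "llama3",
    "messages": [["role": "user", "content": "Reply with one word."]]
]
request.httpBody = try! JSONSerialization.data(withJSONObject: body)

// Block until the request completes so this can run as a plain Swift script.
let done = DispatchSemaphore(value: 0)
URLSession.shared.dataTask(with: request) { data, _, error in
    defer { done.signal() }
    if let error = error {
        print("Request failed: \(error)")  // e.g. Ollama is not running
    } else if let data = data, let text = String(data: data, encoding: .utf8) {
        print(text)  // raw OpenAI-style JSON chat completion from Ollama
    }
}.resume()
done.wait()
```

If this prints a JSON chat completion, the server speaks the OpenAI-compatible protocol and the same base URL can be used when configuring the extension.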