Description
Hello,
I am using Open WebUI in an environment isolated from the Internet.
By default, Open WebUI includes the "sentence-transformers/all-MiniLM-L6-v2" embedding model in its Docker image. This allows me to run and use the application without requiring Internet access.
However, after installing the confluence_search tool, I encountered the following error: "Error loading embedding model: We couldn't connect to https://huggingface.co to load the files, and couldn't find the...".
Additionally, the Docker logs show this message: "No sentence-transformers model found with name sentence-transformers/all-MiniLM-L6-v2".
The tool’s configuration includes an environment variable called "Embedding Model Save Path" with a default value of "/tmp/cache/sentence_transformers".
I have two questions:
- Why doesn’t the confluence_search tool use the embedding model already included in Open WebUI?
- Is it possible to configure the confluence_search tool to use the "sentence-transformers/all-MiniLM-L6-v2" model from the Open WebUI application files?
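For context on what a local-only setup could look like, here is a minimal sketch. It assumes (a) the model files for all-MiniLM-L6-v2 already exist somewhere on disk inside the container, and (b) the tool would accept a filesystem path instead of a Hub model name — neither is confirmed by the tool's documentation. The path and the helper function are hypothetical; the `HF_HUB_OFFLINE` / `TRANSFORMERS_OFFLINE` environment variables are the standard way to stop Hugging Face libraries from contacting https://huggingface.co.

```python
import os

def configure_offline_embedding(model_dir: str) -> str:
    """Force Hugging Face libraries into offline mode and return the
    local directory that would be passed to SentenceTransformer(...).
    (Hypothetical helper for illustration, not part of the tool.)"""
    # Honored by huggingface_hub / transformers: disables all network
    # lookups, so only files already on disk are used.
    os.environ["HF_HUB_OFFLINE"] = "1"
    os.environ["TRANSFORMERS_OFFLINE"] = "1"
    return model_dir

# The tool's "Embedding Model Save Path" defaults to this directory;
# whether the Open WebUI image's bundled copy of the model lives there
# is an assumption to verify inside the container.
path = configure_offline_embedding("/tmp/cache/sentence_transformers")

# The actual load would then stay fully offline, e.g.:
#   from sentence_transformers import SentenceTransformer
#   model = SentenceTransformer(path)  # reads from disk, no download
```

If the tool only accepts a Hub model name, another option may be to pre-populate its cache directory with the model files so the offline lookup succeeds.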
Thank you in advance for your help.