Embedding vector length issue in Copilot #1826

@arvigilg

Description

Hi,
First of all, thank you for your great work on the Copilot plugin — it has been really useful in my Obsidian workflow!
I’ve run into a small issue with embeddings. I’m using a local llama.cpp proxy (exposing the OpenAI-compatible format) that returns vectors of length 768 for nomic-embed-v1.5-768d. However, in the Obsidian developer console, Copilot always detects this model as having a vector length of 192 and builds the semantic index accordingly.
This mismatch leads to problems with indexing, since the plugin doesn’t seem to pick up the vectorLength field that my server provides in /v1/models.
Would it be possible for Copilot to respect the vector length reported by the backend, instead of defaulting to 192? That would make it easier to use custom embeddings without running into conflicts.
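To illustrate what I mean, here is a minimal sketch of the kind of check I have in mind. It assumes an OpenAI-compatible /v1/models response in which my proxy adds a custom vectorLength field (that field is not part of the standard OpenAI schema, and the payload below is a hypothetical abridged example, not actual Copilot code):

```python
import json


def detect_vector_length(models_payload: str, model_id: str):
    """Return the vectorLength reported by /v1/models for model_id, if present.

    `vectorLength` is a custom field my llama.cpp proxy adds; a backend that
    does not report it returns None here, and a default could then apply.
    """
    data = json.loads(models_payload)
    for model in data.get("data", []):
        if model.get("id") == model_id:
            return model.get("vectorLength")
    return None


# Hypothetical abridged /v1/models response from my proxy:
payload = json.dumps({
    "object": "list",
    "data": [
        {"id": "nomic-embed-v1.5-768d", "object": "model", "vectorLength": 768}
    ],
})

reported = detect_vector_length(payload, "nomic-embed-v1.5-768d")
actual_embedding = [0.0] * 768  # stand-in for a real /v1/embeddings vector
assert reported == len(actual_embedding)  # 768 on my setup, not 192
```

If the backend reports a length, using it (instead of a hard-coded 192) would keep the semantic index consistent with the vectors the server actually returns.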
Happy to share logs if that would help. Thanks again for all the effort you’ve put into making this plugin so powerful!
Best regards,
Alicia

Metadata

Assignees

No one assigned

    Labels

    question (Further information is requested)

    Projects

    No projects

    Milestone

    No milestone

    Relationships

    None yet

    Development

    No branches or pull requests
