
[Bug]: help mentions models.json, but contents are ignored. #73

@rcfa

Bug Description

When issuing /help, I get, among other things, the following lines:

    Model Configuration:
      Edit ~/.grok/models.json to add custom models (Claude, GPT, Gemini, etc.)

While user-settings.json may be useful for other settings, I should be able to configure all model providers in models.json.
If I do that, I get an error on startup, because grok won't find its API key.
If I just move the API key to user-settings.json, grok will start up, but it still won't see any of the models defined in models.json.

E.g., with the following contents in models.json I'd expect:
a) the CLI to successfully start up (without the need to dump the Grok API key into user-settings.json)
b) the model provided locally by LMStudio to be listed when I issue the /models command
c) the CLI to work, even without any Grok API key or model information, for fully local/offline operation

$ cat ~/.grok/models.json
[
  {
    "provider": "xai",
    "baseURL": "https://api.x.ai/v1",
    "apiKey": "xai-blahblahblahblah",
    "defaultModel": "grok-4-latest",
    "models": [
      "grok-4-latest",
      "grok-3-latest",
      "grok-3-fast",
      "grok-3-mini-fast"
    ]
  },
  {
    "provider": "LMStudio",
    "apiKey": "",
    "baseURL": "http://localhost:1234/v1",
    "defaultModel": "openai/gpt-oss-20b",
    "models": ["openai/gpt-oss-20b"]
  }
]
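
For reference, the workaround mentioned above (moving the API key into user-settings.json) looks roughly like the sketch below. I'm assuming a top-level apiKey field here, since the expected schema of user-settings.json doesn't seem to be documented:

$ cat ~/.grok/user-settings.json
{
  "apiKey": "xai-blahblahblahblah"
}

With that in place the CLI starts, but the providers and models from models.json are still not picked up.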

I can't seem to get grok-CLI to work with anything but Grok. Even though that will often be the way I plan to use it, it's not what is "advertised": the whole killer aspect of this CLI tool is the ability to switch among models or to operate fully locally.

Steps to reproduce

No response
