@twaugh twaugh commented Nov 22, 2025

When switching to Ollama in logsqueak init, the wizard now:

  1. Always prompts for endpoint confirmation (even if auto-test succeeds)

    • Tests default endpoint and shows connection status
    • Gives users a chance to review/change before proceeding
    • Reuses test result if user keeps default and it succeeded
  2. Pairs endpoint and model from the same provider

    • Tracks which provider (ollama_remote/ollama_local/active) the default endpoint came from
    • Uses model from SAME provider when user accepts default endpoint
    • Clears model default if user changes endpoint (no invalid pairing)
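The flow described above can be sketched roughly as follows. This is an illustrative sketch only; the function, parameter, and return names (`choose_endpoint`, `test_connection`, `prompt`) are hypothetical, not the actual wizard.py API:

```python
def choose_endpoint(default, test_connection, prompt):
    """Always confirm the endpoint, even when the auto-test succeeds.

    `default` is a (url, provider_key) pair; `test_connection` and `prompt`
    are injected callables (illustrative names, not the real wizard API).
    Returns (endpoint, provider_key_or_None, connection_ok).
    """
    default_url, provider_key = default
    # Test the default endpoint first so the user sees connection status
    # before being asked to confirm.
    tested_ok = test_connection(default_url)
    # Always prompt, even if the test succeeded; empty input keeps the default.
    chosen = prompt(f"Ollama endpoint [{default_url}]: ") or default_url
    if chosen == default_url and tested_ok:
        # Reuse the earlier test result instead of testing again.
        return chosen, provider_key, True
    # A changed endpoint invalidates the provider pairing, so the model
    # default from that provider must not be offered later.
    return chosen, None, test_connection(chosen)
```

The key property is that a changed endpoint returns `None` for the provider key, which downstream code can use to clear the model default.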

Before this fix:

  • Switching from custom → Ollama would skip endpoint prompt if connection test succeeded
  • Model default could come from wrong provider (e.g., endpoint from ollama_remote but model from ollama_local)

After this fix:

  • User ALWAYS sees endpoint prompt with correct default
  • Endpoint and model stay paired from their source provider
  • Changing endpoint clears model default (prevents stale pairing)
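The endpoint/model pairing rule can be illustrated with a small sketch. The `PROVIDERS` dict and `model_default` helper are assumptions for illustration, not the real logsqueak config schema:

```python
# Hypothetical per-provider defaults (illustrative shape, not the actual
# logsqueak configuration schema).
PROVIDERS = {
    "ollama_remote": {"endpoint": "http://gpu-box:11434", "model": "llama3:70b"},
    "ollama_local": {"endpoint": "http://localhost:11434", "model": "llama3:8b"},
}

def model_default(provider_key, chosen_endpoint):
    """Offer a model default only when it comes from the same provider
    as the accepted endpoint; otherwise return None (no default)."""
    if provider_key is None:
        # The user changed the endpoint: any remembered model would be a
        # stale pairing, so clear the default.
        return None
    entry = PROVIDERS[provider_key]
    if entry["endpoint"] != chosen_endpoint:
        # Endpoint no longer matches its source provider: don't pair.
        return None
    return entry["model"]
```

This prevents the mismatch described above, where the endpoint came from `ollama_remote` but the model default from `ollama_local`.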

Changes:

  • wizard.py:224-303: Track provider key, always prompt, pair model
  • test_config_wizard.py:139-183: Mock endpoint prompt, verify pairing

Fixes an issue where remembered Ollama endpoints were auto-accepted without user confirmation, and models were not properly paired with their endpoints.

Assisted-by: Claude Code

@codecov-commenter codecov-commenter commented Nov 22, 2025

Codecov Report

❌ Patch coverage is 72.91667% with 13 lines in your changes missing coverage. Please review.
✅ Project coverage is 85.06%. Comparing base (c0dff0a) to head (ced08dc).

Files with missing lines         Patch %   Lines
src/logsqueak/wizard/wizard.py   72.91%    13 Missing ⚠️
Additional details and impacted files
@@            Coverage Diff             @@
##             main      #32      +/-   ##
==========================================
- Coverage   85.39%   85.06%   -0.34%     
==========================================
  Files          48       48              
  Lines        5082     5083       +1     
==========================================
- Hits         4340     4324      -16     
- Misses        742      759      +17     


@twaugh twaugh force-pushed the fix/wizard-ollama-endpoint-prompt branch from 8dbadd0 to ced08dc on November 22, 2025 15:22
@twaugh twaugh merged commit 2e22f41 into main Nov 22, 2025
1 check passed
@twaugh twaugh deleted the fix/wizard-ollama-endpoint-prompt branch November 22, 2025 15:28
