@BibekBhusal0 commented Aug 23, 2025

Todos:

  • Support for all AI providers.
  • Use generateObject for citations (schema validation reduces the chance of malformed output).
  • Add a stop button for AI responses.
  • URL bar integration.
  • Improve tool calls and add more tools for YouTube, workspaces, and tabs.
  • Pseudo background.
  • Better styling and animations.

Bugs:

  • Pseudo background is not dynamic enough.
  • Streaming stops abruptly.
  • Multiple tool calls in a row don't work.
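
The "multiple tool calls in a row" bug usually comes down to the send loop returning after the first tool result instead of feeding it back to the model. A minimal sketch of the loop shape that handles back-to-back tool calls (the `model` callable, `toolCall` field, and message shapes here are illustrative stand-ins, not the project's real API):

```javascript
// Keep looping until the model answers with plain text; each tool result is
// appended to the history so the model can immediately request another tool.
async function runWithTools(model, tools, prompt) {
  const messages = [{ role: "user", content: prompt }];
  for (let step = 0; step < 5; step++) { // cap steps to avoid infinite loops
    const reply = await model(messages);
    if (reply.toolCall) {
      const { name, args } = reply.toolCall;
      const result = await tools[name](args);
      messages.push({ role: "tool", content: JSON.stringify(result) });
      continue; // back-to-back tool calls just go around again
    }
    return reply.text; // plain text ends the loop
  }
  throw new Error("too many tool steps");
}
```

The step cap mirrors the `maxSteps`-style guard most agent loops use so a confused model cannot spin forever.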

BibekBhusal0 and others added 30 commits July 24, 2025 17:32
- Migrate to the Vercel AI SDK for streaming responses.
- Implement tool calling with user confirmation.
- Update message handling to support string or array content.
- Improve error handling and message persistence.
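
Supporting "string or array content" means every consumer has to normalize before reading. A small sketch of that normalization, assuming AI SDK-style message parts like `{ type: "text", text: "…" }` (helper names are illustrative):

```javascript
// Normalize a message's content to an array of parts, since content may be
// either a plain string or an array of typed parts.
function normalizeContent(content) {
  if (typeof content === "string") {
    return [{ type: "text", text: content }];
  }
  return Array.isArray(content) ? content : [];
}

// Flatten back to plain text, e.g. for persistence or display.
function contentToText(content) {
  return normalizeContent(content)
    .filter((part) => part.type === "text")
    .map((part) => part.text)
    .join("");
}

console.log(contentToText("hello"));                        // "hello"
console.log(contentToText([{ type: "text", text: "hi" }])); // "hi"
```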
- Add build step with npm to bundle the project using Vercel AI SDK
- Document technologies used: Vercel AI SDK and Zod
- Update import path to the bundled file `browse-bot.uc.js`
- Add detailed debug logs for LLM interactions, including system prompt, history, model details, tool calls, stream chunks, and errors.
- Improve visibility into LLM behavior for debugging and troubleshooting.
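
A sketch of the kind of gated debug logging this describes, assuming a single boolean pref controls verbosity (the `DEBUG` flag and `browse-bot` tag are illustrative):

```javascript
// Log one tagged line per LLM event (prompt, tool call, stream chunk, error)
// so interactions can be traced without wading through unrelated console noise.
const DEBUG = true;

function debugLog(stage, payload) {
  if (!DEBUG) return null;
  const line = `[browse-bot:${stage}] ${JSON.stringify(payload)}`;
  console.log(line);
  return line;
}

debugLog("tool-call", { name: "searchTabs", args: { query: "docs" } });
```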
- Refactor sendMessage to handle streaming and citations separately
- Add citation support via generateObject with schema validation
- Update UI to display citations and handle historical messages
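
The point of routing citations through generateObject is that a schema rejects malformed output before it reaches the UI. The project validates with Zod; this plain-JS check only illustrates the shape being enforced (the `url`/`title`/`snippet` fields are assumptions about the citation schema):

```javascript
// Accept a citation only when every expected field is a string.
function isValidCitation(obj) {
  return (
    obj !== null &&
    typeof obj === "object" &&
    typeof obj.url === "string" &&
    typeof obj.title === "string" &&
    typeof obj.snippet === "string"
  );
}

// Drop malformed entries instead of failing the whole response.
function filterCitations(candidates) {
  return Array.isArray(candidates) ? candidates.filter(isValidCitation) : [];
}
```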
- Add chokidar-cli for file watching during development
- Update build script to include a `dev` script for watching file changes and rebuilding
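
A `dev` script of this kind typically wraps chokidar-cli's `-c` flag, which reruns a command whenever a watched path changes. A sketch of the relevant package.json fragment (the `build.mjs` entry point and glob are assumptions, not the project's actual config):

```json
{
  "scripts": {
    "build": "node build.mjs",
    "dev": "chokidar \"src/**/*\" -c \"npm run build\""
  }
}
```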
- Implement streaming message responses from the LLM, improving UI responsiveness.
- Add a "Stop Generation" button so users can abort ongoing streaming requests.
- Refactor message sending to handle both streaming and non-streaming modes with a unified try/catch/finally structure.
- Add an abort signal to LLM sendMessage calls.
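
The "Stop Generation" wiring comes down to one AbortController per in-flight request, aborted on click; the real call forwards the signal to the SDK's stream call, while the fake chunk loop below stands in for the LLM stream (names are illustrative):

```javascript
// Consume "stream chunks" until done or aborted; partial text survives an abort.
async function sendMessage(prompt, { signal } = {}) {
  let text = "";
  for (const chunk of ["thinking", " about ", prompt]) {
    if (signal?.aborted) break; // stop consuming the stream mid-way
    text += chunk;
    await new Promise((resolve) => setTimeout(resolve, 5)); // simulated latency
  }
  return text;
}

const controller = new AbortController();
const pending = sendMessage("tabs", { signal: controller.signal });
controller.abort(); // what the Stop button would call
pending.then((partial) => console.log("partial:", partial));
```

Checking `signal.aborted` between chunks (rather than only before the request) is what lets a long stream stop promptly instead of running to completion.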
- Move Gemini and Mistral providers to a single "providers.js" file
- Remove individual provider files for better organization
- Update available models for Mistral and Gemini providers
- Add new Mistral and Gemini models with their labels
- Add OpenAI, Claude, Grok, Perplexity, and Ollama providers
- Update preferences and provider handling
- Remove redundant getters/setters in PREFS
- Use PREFS.getPref/setPref in LLM provider files
- Use provider prototype to share logic
- Apply prototype to each provider object
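
Sharing logic through a provider prototype can be sketched like this (the field names and model entries are illustrative, not the project's actual provider definitions):

```javascript
// Methods common to every provider live on one prototype object.
const providerPrototype = {
  modelLabel(id) {
    const match = this.models.find((model) => model.id === id);
    return match ? match.label : id; // fall back to the raw id
  },
};

// Each provider is a plain object whose prototype is the shared one.
function makeProvider(definition) {
  return Object.assign(Object.create(providerPrototype), definition);
}

const gemini = makeProvider({
  name: "gemini",
  models: [{ id: "gemini-flash", label: "Gemini Flash" }],
});

console.log(gemini.modelLabel("gemini-flash")); // "Gemini Flash"
```

Because the shared methods live on the prototype rather than being copied, fixing one of them fixes it for every provider at once.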
- Update Ollama model list and add descriptive labels for each model.
- Refactor findbar-ai.uc.js to use `export const` for `browseBotFindbar`
- Update references from `browserBotfindbar` to `browseBotFindbar` throughout the project
- Add Claude, Grok, and Perplexity AI as LLM providers
- Update model options for Gemini and Mistral
- Add API key settings for Claude, Grok and Perplexity
BibekBhusal0 and others added 30 commits October 9, 2025 10:24