AIChatBot is a local and online AI-powered chatbot built on open-source language models. The project supports both locally hosted models (via Ollama) and cloud-based models (via OpenRouter), and demonstrates AI integration with a .NET 8 API and an Angular 20 frontend.
Transform your chatbot experience with Knowledge-Based AI! Upload your documents and get contextually-aware responses based on your own content.
Key Highlights:
- **Multi-format Document Support**: Upload `.txt`, `.md`, and `.pdf` files
- **Intelligent Document Search**: Advanced retrieval algorithms find relevant content
- **Context-Aware Responses**: AI answers based on your uploaded documents
- **Source Attribution**: See exactly which documents informed each response
- **Document Management**: Easy upload, view, and delete capabilities
- **User-Specific Collections**: Each user maintains their own document library

Experience AI that truly understands YOUR content!
Watch the AIChatBot in action on YouTube:
📺 AIChatBot Demo
To run this project locally, ensure the following:
- Windows 10/11 with WSL support
- Ubuntu 20.04.6 LTS installed (via Microsoft Store)
- .NET 8 SDK
- Node.js (v20+)
- Angular CLI (`npm install -g @angular/cli`)
- Ollama installed in Ubuntu for running local AI models
- Optional: an account on https://openrouter.ai
- Go to the Microsoft Store → search for Ubuntu 20.04.6 LTS → Install.
- Open Ubuntu and create your UNIX user account.
```bash
curl -fsSL https://ollama.com/install.sh | sh
```
To pull and run the desired models:

```bash
# Pull models
ollama pull phi3:latest
ollama pull mistral:latest
ollama pull gemma:2b
ollama pull llama3:latest

# Run models
ollama run phi3
ollama run mistral
ollama run gemma:2b
ollama run llama3

# List all pulled models
ollama list

# Stop a running model
ollama stop phi3

# View running models
ps aux | grep ollama
```
From the Ubuntu terminal:

```bash
shutdown now
```

Or simply close the terminal window if you don't need a full shutdown.
1. Go to https://openrouter.ai and sign up.
2. Navigate to API Keys in your profile and generate an API key.
3. Set this key as an environment variable in your API project:

   ```bash
   export OPENROUTER_API_KEY=your_key_here
   ```

Models used:
- `google/gemma-3-27b-it:free`
- `deepseek/deepseek-chat-v3-0324:free`

API requests are routed through OpenRouter using this key, enabling seamless AI chat.
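As a sketch, a request to OpenRouter's OpenAI-compatible chat completions endpoint can be built as below. The `buildChatRequest` helper is a hypothetical name, but the endpoint URL, `Authorization: Bearer` header, and body shape follow OpenRouter's documented API:

```typescript
// Builds a fetch()-ready request for OpenRouter's chat completions endpoint.
// Hypothetical helper; only the endpoint, header, and body shape are per the API.

const OPENROUTER_URL = "https://openrouter.ai/api/v1/chat/completions";

function buildChatRequest(apiKey: string, model: string, userMessage: string) {
  return {
    url: OPENROUTER_URL,
    options: {
      method: "POST",
      headers: {
        "Authorization": `Bearer ${apiKey}`, // key from OPENROUTER_API_KEY
        "Content-Type": "application/json",
      },
      body: JSON.stringify({
        model, // e.g. "google/gemma-3-27b-it:free"
        messages: [{ role: "user", content: userMessage }],
      }),
    },
  };
}
```

Usage would be `const { url, options } = buildChatRequest(process.env.OPENROUTER_API_KEY!, "google/gemma-3-27b-it:free", "Hello!");` followed by `await fetch(url, options)`.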
- Navigate to `AIChatBot.API/`
- Run the following commands:

  ```bash
  dotnet restore
  dotnet build
  dotnet run
  ```

- Ensure the `appsettings.json` file includes:

  ```
  ApiKey=YOUR_KEY_HERE
  ```
- Navigate to `AIChatBot.UI/`
- Run:

  ```bash
  npm install
  ng serve
  ```

- Access the chatbot UI at http://localhost:4200/
The AIChatBot includes advanced RAG capabilities that allow AI models to answer questions based on your uploaded documents. This feature significantly enhances the AI's ability to provide contextually relevant and accurate responses.
The RAG system consists of several key components:

1. **Document Processing Pipeline**
   - File upload handling for multiple formats
   - Text extraction from PDF, TXT, and MD files
   - Content chunking for efficient retrieval
   - In-memory indexing with similarity search

2. **Retrieval System**
   - Semantic search across document chunks
   - Top-K retrieval (configurable, default: 3 chunks)
   - Relevance scoring and ranking
   - Source attribution and metadata tracking

3. **Generation Enhancement**
   - Context-aware prompt construction
   - Integration with all supported AI models
   - Source citation in responses
   - Fallback to general knowledge when needed
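The chunking and top-K retrieval steps above can be sketched in a few lines. This is an illustrative TypeScript sketch, not the project's actual implementation: the helper names (`chunkText`, `termFreq`, `topK`) are hypothetical, and the real store may use embeddings rather than term-frequency vectors:

```typescript
// Minimal sketch: split text into fixed-size word chunks, then rank chunks
// against a query by cosine similarity of term-frequency vectors.

function chunkText(text: string, size = 200): string[] {
  const words = text.split(/\s+/).filter(Boolean);
  const chunks: string[] = [];
  for (let i = 0; i < words.length; i += size) {
    chunks.push(words.slice(i, i + size).join(" "));
  }
  return chunks;
}

function termFreq(text: string): Map<string, number> {
  const tf = new Map<string, number>();
  for (const w of text.toLowerCase().match(/[a-z0-9]+/g) ?? []) {
    tf.set(w, (tf.get(w) ?? 0) + 1);
  }
  return tf;
}

function cosine(a: Map<string, number>, b: Map<string, number>): number {
  let dot = 0, na = 0, nb = 0;
  for (const [w, v] of a) { dot += v * (b.get(w) ?? 0); na += v * v; }
  for (const v of b.values()) nb += v * v;
  return na && nb ? dot / Math.sqrt(na * nb) : 0;
}

// Rank all chunks against the query and keep the top K (default 3,
// matching the configurable default mentioned above).
function topK(query: string, chunks: string[], k = 3): { chunk: string; score: number }[] {
  const q = termFreq(query);
  return chunks
    .map(chunk => ({ chunk, score: cosine(q, termFreq(chunk)) }))
    .sort((x, y) => y.score - x.score)
    .slice(0, k);
}
```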
The RAG system supports various AI models with different levels of effectiveness:
| Model Category | Models | RAG Performance |
|---|---|---|
| Best RAG Support | GPT-3.5 Turbo, Gemini Flash 2.0 (Unlimited), Gemini Flash 2.0 (Limited) | ⭐⭐⭐⭐⭐ |
| Good RAG Support | DeepSeek v3, Gemma 3 27B, LLaMA 3 | ⭐⭐⭐⭐ |
| Basic RAG Support | PHI-3, Mistral 7B | ⭐⭐⭐ |
1. **Upload Documents** (supported formats):
   - Plain text files (`.txt`)
   - Markdown files (`.md`)
   - PDF documents (`.pdf`)

2. **Document Management**
   - View all uploaded documents
   - Delete individual documents
   - User-specific document collections
   - Automatic indexing upon upload

3. **RAG-Enhanced Chat**
   - Select "Knowledge-Based (RAG)" mode
   - Ask questions about your uploaded content
   - Receive responses with source attribution
   - Contextual answers based on document content
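The context-aware prompt construction behind RAG-enhanced chat can be sketched as follows; the template wording and the `buildRagPrompt` name are illustrative assumptions, not the project's exact prompt:

```typescript
// Sketch of context-aware prompt construction: retrieved chunks are
// prepended to the user's question so the model answers from the documents,
// with numbered source tags enabling attribution in the response.

function buildRagPrompt(
  question: string,
  chunks: { source: string; text: string }[]
): string {
  const context = chunks
    .map((c, i) => `[${i + 1}] (from ${c.source}) ${c.text}`)
    .join("\n");
  return [
    "Answer the question using ONLY the context below.",
    "If the context is insufficient, say so and fall back to general knowledge.",
    "",
    "Context:",
    context,
    "",
    `Question: ${question}`,
  ].join("\n");
}
```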
Currently, the RAG system uses in-memory storage (`InMemoryRagStore`), which provides:
- Fast retrieval performance
- Simple deployment setup
- Automatic cleanup on application restart
- User-isolated document collections

Note: For production deployments, consider implementing a persistent storage solution.
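A minimal sketch of such a user-isolated in-memory store might look like this; the class and method names are hypothetical, and `InMemoryRagStore.cs` in the API project is the real implementation:

```typescript
// Illustrative user-isolated in-memory document store: one document map
// per user ID, so collections never leak across users and everything is
// discarded when the process restarts.

interface StoredDoc {
  id: string;
  name: string;
  chunks: string[];
}

class SimpleRagStore {
  private byUser = new Map<string, Map<string, StoredDoc>>();
  private nextId = 1;

  add(userId: string, name: string, chunks: string[]): string {
    const id = String(this.nextId++);
    if (!this.byUser.has(userId)) this.byUser.set(userId, new Map());
    this.byUser.get(userId)!.set(id, { id, name, chunks });
    return id;
  }

  list(userId: string): StoredDoc[] {
    return [...(this.byUser.get(userId)?.values() ?? [])];
  }

  delete(userId: string, id: string): boolean {
    return this.byUser.get(userId)?.delete(id) ?? false;
  }
}
```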
| Model | Type | Source | Access | RAG Support |
|---|---|---|---|---|
| PHI-3:latest | Local | Ollama | `ollama run` | ⭐⭐⭐ |
| Mistral:latest | Local | Ollama | `ollama run` | ⭐⭐⭐ |
| Gemma:2b | Local | Ollama | `ollama run` | ⭐⭐⭐ |
| Llama3:latest | Local | Ollama | `ollama run` | ⭐⭐⭐⭐ |
| google/gemma-3-27b-it:free | Online | OpenRouter.ai | API Key | ⭐⭐⭐⭐ |
| deepseek/deepseek-chat-v3-0324 | Online | OpenRouter.ai | API Key | ⭐⭐⭐⭐ |
| google/gemini-2.0-flash-exp | Online | OpenRouter.ai | API Key | ⭐⭐⭐⭐⭐ |
| openai/gpt-3.5-turbo-0613 | Online | OpenRouter.ai | API Key | ⭐⭐⭐⭐⭐ |
| google/gemini-2.0-flash-001 | Online | OpenRouter.ai | API Key | ⭐⭐⭐⭐⭐ |
```
AIChatBot/
│
├── AIChatBot.API/                   # .NET 8 API for chatbot
│   ├── Controllers/                 # API endpoints
│   │   └── DocumentsController.cs   # RAG document upload/management
│   ├── Services/                    # Business logic services
│   │   ├── RagChatService.cs        # RAG-enabled chat functionality
│   │   ├── InMemoryRagStore.cs      # Document indexing and search
│   │   ├── AgentService.cs          # AI tool integration
│   │   └── ChatService.cs           # Standard chat functionality
│   ├── Interfaces/                  # Service contracts
│   │   └── IRagStore.cs             # RAG storage interface
│   └── Migrations/                  # Database schema updates for RAG
├── AIChatBot.UI/                    # Angular 20 UI frontend
│   └── src/app/components/
│       ├── document-upload/         # RAG document upload component
│       ├── chat/                    # Main chat interface
│       └── model-selector/          # AI model and mode selection
└── README.md                        # Project documentation
```
The AIChatBot supports three advanced operation modes beyond simple chat:
In this mode, the AI can recognize specific tasks in user prompts and use internal tools (functions) to perform actions. Integrated tools include:
| Tool Function | Description | Example Prompt |
|---|---|---|
| `CreateFile` | Creates a text file with given content | "Create a file called report.txt with the text Hello world." |
| `FetchWebData` | Fetches the HTML/content of a public URL | "Fetch the content of https://example.com" |
| `SendEmail` | Simulates sending an email (console-logged) | "Send an email to john@example.com with subject Hello." |
These functions are executed server-side in .NET, with input parsed from natural-language prompts.
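A simplified sketch of this prompt-to-tool routing is shown below. The real parsing happens server-side in .NET and is more robust; the regex patterns and the `routePrompt` name here are illustrative assumptions:

```typescript
// Illustrative routing of a natural-language prompt to one of the tools
// from the table above. Patterns are deliberately simple sketches.

type ToolCall =
  | { tool: "CreateFile"; fileName: string }
  | { tool: "FetchWebData"; url: string }
  | { tool: "SendEmail"; to: string }
  | { tool: "None" };

function routePrompt(prompt: string): ToolCall {
  const file = prompt.match(/create a file (?:called|named) (\S+)/i);
  if (file) return { tool: "CreateFile", fileName: file[1] };

  const url = prompt.match(/fetch (?:the content of )?(https?:\/\/\S+)/i);
  if (url) return { tool: "FetchWebData", url: url[1] };

  const mail = prompt.match(/send an email to (\S+@\S+)/i);
  if (mail) return { tool: "SendEmail", to: mail[1] };

  return { tool: "None" }; // fall through to plain chat
}
```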
NEW FEATURE: The RAG (Retrieval-Augmented Generation) mode enables the AI to answer questions based on your uploaded documents. This powerful feature allows you to:
- **Upload Documents**: Support for `.txt`, `.md`, and `.pdf` files
- **Intelligent Retrieval**: Automatically finds relevant content from your documents
- **Source Attribution**: AI responses include references to source documents
- **Document Management**: View, organize, and delete uploaded documents
- **User-Specific Storage**: Each user has their own document collection
- **Text Files** (`.txt`): Plain text documents
- **Markdown Files** (`.md`): Formatted markdown documents
- **PDF Documents** (`.pdf`): Portable Document Format
- Upload Documents: Use the drag-and-drop interface or browse to upload documents
- Select RAG Mode: Choose "Knowledge-Based (RAG)" from the chat mode dropdown
- Ask Questions: Query your documents using natural language
- Get Contextual Answers: Receive AI responses enriched with your document content
> **User:** "What are the key findings in the latest market report?"
>
> **AI:** Based on the provided context from "market_analysis_2025.pdf", the key findings include:
> 1. Consumer spending increased by 15% in Q4
> 2. Digital transformation investments rose by 23%
> 3. Supply chain disruptions decreased significantly
>
> *Sources: 1 document(s) referenced*
The AI agent is capable of:
- Understanding high-level tasks
- Selecting and invoking appropriate tools
- Providing intelligent responses based on the outcome
This is powered by an `AgentService` that works with both local LLMs (via Ollama) and cloud models (via OpenRouter) to determine the right function to execute and handle the response.
You can toggle between AI modes via the UI:
- Chat-Only Mode
- AI + Tools Mode
- Knowledge-Based (RAG) Mode
- Agent Mode (multi-step planning, coming soon)
- Choose your preferred model type (local or online)
- Start the backend using .NET 8
- Start the frontend using the Angular CLI
- Access AIChatBot at http://localhost:4200/
1. **Upload Documents**
   - Click the "Documents for RAG" section in the sidebar
   - Drag & drop or browse to upload `.txt`, `.md`, or `.pdf` files
   - Wait for the successful-upload confirmation

2. **Enable Knowledge-Based Mode**
   - Select "Knowledge-Based (RAG)" from the chat mode dropdown
   - Choose an AI model with good RAG support (⭐⭐⭐⭐ or ⭐⭐⭐⭐⭐)

3. **Start Asking Questions** (examples):
   - "Summarize the key points from my uploaded documents"
   - "What does the report say about market trends?"
   - "Find information about [specific topic] in my files"

4. **Review Source Attribution**
   - AI responses will include "Sources: X document(s) referenced"
   - Responses are enriched with content from your uploaded documents
Pull requests and suggestions are welcome! Feel free to fork the repo and enhance it.
This project is open-source and available under the MIT License.