A local, privacy-preserving question-answering system for financial documents using FastAPI, sentence-transformers, and Ollama Mistral. The system extracts content from raw PDFs, performs semantic search over chunked embeddings, and returns precise answers with a clean HTML UI.
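Before embedding, the extracted PDF text has to be split into overlapping chunks. A minimal sketch of such a chunker; the chunk size and overlap defaults are illustrative assumptions, not the project's actual settings:

```python
def chunk_text(text: str, chunk_size: int = 500, overlap: int = 100) -> list[str]:
    """Split text into overlapping character-based chunks.

    chunk_size and overlap are illustrative defaults, not this
    project's actual configuration.
    """
    chunks = []
    step = chunk_size - overlap
    for start in range(0, len(text), step):
        chunk = text[start:start + chunk_size]
        if chunk.strip():
            chunks.append(chunk)
        if start + chunk_size >= len(text):
            break  # the last chunk already covers the end of the text
    return chunks

# 1200 characters with size 500 / overlap 100 -> chunks at offsets 0, 400, 800
print(len(chunk_text("x" * 1200)))  # -> 3
```

The overlap keeps sentences that straddle a chunk boundary retrievable from at least one chunk.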
- FastAPI-based backend for real-time inference
- PDF ingestion using `pdfplumber`
- Semantic search using `sentence-transformers`
- LLM integration with Ollama Mistral for local answer generation
- Simple chat-style UI built with HTML/CSS/JS
- Fully offline and privacy-respecting
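The retrieval step reduces to cosine similarity between a query embedding and the stored chunk embeddings. A sketch of that core with NumPy; in the real project the vectors would come from a `sentence-transformers` model, while the toy 3-dimensional vectors below are purely illustrative:

```python
import numpy as np

def top_k_chunks(query_vec: np.ndarray, chunk_vecs: np.ndarray, k: int = 3) -> list[int]:
    """Return indices of the k chunk embeddings most similar to the query.

    Cosine similarity over L2-normalised vectors; in the project the
    embeddings come from a sentence-transformers model.
    """
    q = query_vec / np.linalg.norm(query_vec)
    c = chunk_vecs / np.linalg.norm(chunk_vecs, axis=1, keepdims=True)
    sims = c @ q                      # cosine similarity of each chunk vs. the query
    return np.argsort(-sims)[:k].tolist()

# Toy example with 3-dimensional "embeddings"
chunks = np.array([[1.0, 0.0, 0.0],
                   [0.0, 1.0, 0.0],
                   [0.9, 0.1, 0.0]])
query = np.array([1.0, 0.0, 0.0])
print(top_k_chunks(query, chunks, k=2))  # -> [0, 2]
```

The selected chunks are then passed as context to the LLM for answer generation.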
```bash
git clone https://github.com/Amaan-developpeur/FinancialQA-Assistant.git
cd FinancialQA-Assistant
python -m venv .venv
.\.venv\Scripts\activate        # Windows; on macOS/Linux: source .venv/bin/activate
pip install -r requirements.txt

# In a separate terminal, start the Mistral model with Ollama
ollama run mistral

# Start the FastAPI server
uvicorn app.main1:app --reload
```
Then open your browser and visit `http://127.0.0.1:8000`.
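Answer generation talks to the local Ollama server over its HTTP API. A minimal sketch using only the standard library; `/api/generate` with `"stream": false` is Ollama's standard endpoint, but the prompt wording and helper names here are illustrative assumptions, not the project's actual code:

```python
import json
import urllib.request

OLLAMA_URL = "http://127.0.0.1:11434/api/generate"  # Ollama's default local endpoint

def build_prompt(context: str, question: str) -> str:
    """Assemble a grounded prompt (wording is illustrative, not the
    project's actual system prompt)."""
    return (
        "Answer the question using only the context below.\n\n"
        f"Context:\n{context}\n\nQuestion: {question}\nAnswer:"
    )

def generate_answer(context: str, question: str, model: str = "mistral") -> str:
    """Call the local Ollama server (requires `ollama run mistral` to be up)."""
    payload = json.dumps({
        "model": model,
        "prompt": build_prompt(context, question),
        "stream": False,  # return one JSON object instead of a token stream
    }).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL, data=payload, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

# Usage (with the Ollama server running):
# print(generate_answer("Revenue grew 12% in FY2023.", "How much did revenue grow?"))
```

Because everything runs against `127.0.0.1`, no document content ever leaves the machine.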
```text
financial-qa/
├── app/
│   ├── main.py                # Alternate entrypoint (Swagger UI can be opened here)
│   ├── main1.py               # FastAPI app entrypoint
│   ├── query_engine.py        # Embedding search logic
│   ├── templates/chat.html    # Jinja2-based UI
│   ├── static/                # CSS & JS files
│   └── utils/
│       ├── local_generate.py  # LLM inference using Ollama
│       └── system_prompt.py   # System prompt builder
├── data/
├── scripts/
├── Images/                    # Screenshots for documentation
├── requirements.txt
├── .gitignore
└── README.md
```
This project is licensed under the MIT License.