A smart AI-powered document assistant for contextual Q&A and challenge-based learning. Upload a document, ask questions, and test your understanding—all with justifications and references.
## Prerequisites

- Python 3.9+
- Ollama installed and running
- Gemma3:4b model downloaded (`ollama pull gemma3:4b`)
## Setup

1. Clone the repo

   ```bash
   git clone <repository-url>
   cd Rag_1-main
   ```

2. Create and activate a virtual environment

   ```bash
   python -m venv rag_env
   source rag_env/bin/activate  # Windows: rag_env\Scripts\activate
   ```

3. Install dependencies

   ```bash
   pip install -r requirements.txt
   ```

4. Start Ollama and pull the model

   ```bash
   ollama serve
   ollama pull gemma3:4b
   ```

5. Start the backend

   ```bash
   cd backend
   python main.py  # Backend: http://localhost:8000
   ```

6. Start the frontend (in a new terminal)

   ```bash
   streamlit run streamlit_app.py  # Frontend: http://localhost:8501
   ```
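Before opening the frontend, you can sanity-check that Ollama is reachable and the model is pulled. This standalone helper is not part of the repo; it uses only the standard library and Ollama's `/api/tags` endpoint (which lists locally available models on the default port 11434):

```python
# Quick health check for the local Ollama server (default port 11434).
# The /api/tags endpoint returns the locally pulled models as JSON.
import json
import urllib.error
import urllib.request


def ollama_models(host: str = "http://localhost:11434") -> list[str]:
    """Return the names of locally pulled models, or [] if Ollama is unreachable."""
    try:
        with urllib.request.urlopen(f"{host}/api/tags", timeout=2) as resp:
            data = json.load(resp)
        return [m["name"] for m in data.get("models", [])]
    except (urllib.error.URLError, OSError):
        return []


if __name__ == "__main__":
    models = ollama_models()
    if any(name.startswith("gemma3:4b") for name in models):
        print("Ollama is up and gemma3:4b is available")
    else:
        print("Ollama unreachable or gemma3:4b not pulled - run: ollama pull gemma3:4b")
```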
## Tech Stack

- Frontend: Streamlit (Python)
- Backend: FastAPI (Python)
- LLM: Ollama (Gemma3:4b)
- Document Parsing: PyPDF2, LangChain
## How It Works

- Document Upload:
  - User uploads a PDF/TXT via the Streamlit UI
  - Backend extracts the text and generates a summary
- Q&A Chat:
  - User asks a question
  - Backend retrieves relevant context and sends it to the LLM
  - LLM generates an answer with a justification and highlights the source text
  - Frontend displays the answer, justification, and highlights
- Challenge Mode:
  - Backend generates comprehension questions from the document
  - User answers; backend evaluates, scores, and provides feedback with reasoning
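The retrieval step in the Q&A flow can be sketched as a keyword-overlap ranker over fixed-size chunks. This is an illustrative stand-in only: the actual backend splits documents with LangChain, and `split_into_chunks`, `retrieve`, `chunk_size`, and `top_k` here are made-up names, not the repo's API.

```python
# Illustrative retrieval: split a document into word-based chunks and rank
# them by keyword overlap with the question. A toy stand-in for the real
# LangChain-based retrieval in the backend.

def split_into_chunks(text: str, chunk_size: int = 200) -> list[str]:
    """Split text into chunks of roughly chunk_size words each."""
    words = text.split()
    return [" ".join(words[i:i + chunk_size])
            for i in range(0, len(words), chunk_size)]


def retrieve(question: str, chunks: list[str], top_k: int = 2) -> list[str]:
    """Return the top_k chunks sharing the most words with the question."""
    q_words = set(question.lower().split())
    ranked = sorted(chunks,
                    key=lambda c: len(q_words & set(c.lower().split())),
                    reverse=True)
    return ranked[:top_k]


document = ("Transformers use self-attention. Attention weighs tokens by "
            "relevance. RNNs process tokens sequentially.")
chunks = split_into_chunks(document, chunk_size=5)
context = retrieve("How does attention weigh tokens?", chunks, top_k=1)
```

In the real app, the retrieved context is packed into a prompt and sent to the LLM, which answers and cites the supporting passage.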
## Usage

- Upload a PDF/TXT on the Document Upload page and process it
- Q&A Chat: ask questions and get answers with justifications and source text
- Challenge Mode: generate questions, answer them, and receive feedback and scores
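Challenge Mode's evaluation step can be approximated with a token-overlap score against a reference answer. The real backend delegates grading to the LLM (which also explains its reasoning); `grade_answer` below is a hypothetical helper, not part of the repo.

```python
# Hypothetical scorer: fraction of reference-answer tokens present in the
# user's answer. The real app asks the LLM to evaluate and explain instead.

def grade_answer(user_answer: str, reference: str) -> float:
    """Score in [0, 1]: share of reference tokens the user's answer covers."""
    ref_tokens = set(reference.lower().split())
    user_tokens = set(user_answer.lower().split())
    if not ref_tokens:
        return 0.0
    return len(ref_tokens & user_tokens) / len(ref_tokens)


score = grade_answer(
    "attention assigns weights to tokens by relevance",
    "attention weighs tokens by relevance",
)
```

An LLM-based grader is preferable in practice because it credits paraphrases that share no tokens with the reference.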
## Project Structure

```
Rag_1-main/
├── backend/
│   ├── main.py
│   ├── models.py
│   ├── utils.py
│   └── prompt.py
├── streamlit_app.py
├── requirements.txt
├── activate_env.sh
├── rag_env/           # virtual environment
└── README.md
```
## Troubleshooting

- Ollama not running: start it with `ollama serve`
- Model not found: run `ollama pull gemma3:4b`
- Virtualenv issues: delete and recreate `rag_env`
- Port conflicts: change the ports in `main.py`/`streamlit_app.py`
- Import errors: run `pip install -r requirements.txt --force-reinstall`
## License

MIT License. See the LICENSE file.

## Contact

For questions or issues, please open an issue in the repository.