A smart and responsive Gen AI app that answers your questions based on a given paragraph using Hugging Face Transformers. Built with Gradio, it is well suited to real-time, document-based QA!
- Pipeline: question-answering (see the usage sketch after this list)
- Model: deepset/roberta-base-squad2
- Task: Extractive Question Answering (answers are pulled directly from the context)
- Framework: Hugging Face Transformers
- Real-time answer extraction from user-provided context
- Context-aware: only answers if the information exists in the paragraph
- Clean, interactive Gradio UI for instant feedback
- Ideal for education, legal/medical docs, support tools & more
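A minimal sketch of how such a pipeline is loaded and called in plain Python, assuming transformers and a PyTorch backend are installed (the example context below is illustrative, not part of the app):

```python
from transformers import pipeline

# Extractive QA: the answer is a span copied verbatim from the context
qa = pipeline("question-answering", model="deepset/roberta-base-squad2")

result = qa(
    question="What does Gradio let you build?",
    context="Gradio lets you build interactive machine learning demos in Python.",
)
# The result is a dict with the extracted span and a confidence score,
# e.g. {'score': ..., 'start': ..., 'end': ..., 'answer': 'interactive machine learning demos'}
print(result["answer"])
```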
This model only answers questions directly related to the input context.
If your question is unrelated or slightly off-topic, the model may give wrong or biased answers.
Example:
- Context: "The Amazon River flows through Brazil, Peru, and Colombia."
- Question: "Which countries does the Amazon River flow through?"
- Answer: Brazil, Peru, and Colombia

Asking general questions like "What is AI?" won't work unless the context contains the answer.
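To reproduce this behaviour and screen out off-topic questions programmatically, one possible approach (the 0.3 threshold is an arbitrary assumption, not necessarily the app's exact logic) is to let the SQuAD2-tuned model flag unanswerable questions and filter on the confidence score:

```python
from transformers import pipeline

qa = pipeline("question-answering", model="deepset/roberta-base-squad2")

context = "The Amazon River flows through Brazil, Peru, and Colombia."

# handle_impossible_answer lets the SQuAD2-tuned model return an empty answer
# when the context does not contain the requested information.
for question in ["Which countries does the Amazon River flow through?", "What is AI?"]:
    result = qa(question=question, context=context, handle_impossible_answer=True)
    if not result["answer"].strip() or result["score"] < 0.3:  # example threshold
        print(f"{question} -> not answerable from this context")
    else:
        print(f"{question} -> {result['answer']} (score={result['score']:.2f})")
```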
Try the live demo here: Hugging Face Space
Install dependencies:

```bash
pip install -r requirements.txt
```
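The requirements file is expected to include at least transformers, torch, and gradio (an assumption; check requirements.txt in the repo). A rough sketch of how the Gradio UI can wrap the pipeline, using assumed function and component names rather than the repo's exact code:

```python
import gradio as gr
from transformers import pipeline

qa = pipeline("question-answering", model="deepset/roberta-base-squad2")

def answer_question(context: str, question: str) -> str:
    # Return only the extracted span; the full result also carries a confidence score
    result = qa(question=question, context=context)
    return result["answer"]

demo = gr.Interface(
    fn=answer_question,
    inputs=[
        gr.Textbox(lines=8, label="Context"),
        gr.Textbox(label="Question"),
    ],
    outputs=gr.Textbox(label="Answer"),
    title="Context-Based Question Answering",
)

if __name__ == "__main__":
    demo.launch()
```

Running the script starts a local server and prints a URL for the interactive interface.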