This project is a Blog Generation System that uses Streamlit and Ollama to generate blog content from a user-supplied topic. It runs the gemma:2b model locally and enriches the output with research gathered from Wikipedia and DuckDuckGo search.
- Blog Generation: Generates a blog with the following sections:
  - Heading
  - Introduction
  - Content
  - Summary
- Research Tools: Uses DuckDuckGo Search and Wikipedia to gather supporting information (a minimal wiring sketch follows this list).
- Streamlit Interface: Provides a simple web interface for users to input a topic and view the generated blog.
- Download Option: Allows users to download the generated blog as a .txt file.
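Under the hood, this kind of pipeline can be wired with LangChain's Ollama integration. The snippet below is a minimal, hypothetical sketch, not the project's exact `app.py` code: the `generate_blog` function name and the prompt wording are illustrative, and only the packages installed in the steps below are assumed.

```python
# Hypothetical sketch of the generation pipeline (not the exact app.py code).
from langchain_community.llms import Ollama
from langchain_community.tools import DuckDuckGoSearchRun

llm = Ollama(model="gemma:2b")      # assumes `ollama pull gemma:2b` has been run
search = DuckDuckGoSearchRun()      # DuckDuckGo research tool

def generate_blog(topic: str) -> str:
    """Gather web context, then ask the model for the four blog sections."""
    context = search.run(topic)     # raw search snippets used as grounding
    prompt = (
        f"Write a blog post about '{topic}' using this research:\n{context}\n\n"
        "Structure the post with these sections: Heading, Introduction, Content, Summary."
    )
    return llm.invoke(prompt)

if __name__ == "__main__":
    print(generate_blog("Edge computing"))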
To run this project, you need the following:
- Python 3.8 or later
- Ollama: A lightweight framework for running large language models locally.
- Streamlit: For the web interface.
- LangChain: For integrating the language model and tools.
- Additional Libraries: duckduckgo-search and wikipedia-api (see requirements.txt).
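These dependencies can also be kept in the requirements.txt listed in the project structure below. A minimal, unpinned version mirroring the install command used later in the setup might look like this:

```
streamlit
langchain
langchain-community
duckduckgo-search
wikipedia-api
```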
Download and install Ollama from the official website: https://ollama.com.
Run the following command to download the gemma:2b model:
```bash
ollama pull gemma:2b
```
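To confirm the model was downloaded, list the models available locally; gemma:2b should appear in the output:

```bash
ollama list
```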
Install the necessary Python dependencies using pip:
```bash
pip install streamlit langchain langchain-community duckduckgo-search wikipedia-api
```
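If you prefer an isolated environment, you can create a virtual environment before installing (standard Python tooling, optional and not specific to this project):

```bash
python -m venv .venv
source .venv/bin/activate   # on Windows: .venv\Scripts\activate
pip install streamlit langchain langchain-community duckduckgo-search wikipedia-api
```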
Ensure the Ollama server is running in the background before launching the application:
```bash
ollama serve
```
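You can verify the server is reachable; by default Ollama listens on port 11434 and answers a plain GET request:

```bash
curl http://localhost:11434
# expected response: "Ollama is running"
```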
Navigate to the project directory and run the Streamlit app using:
```bash
streamlit run app.py
```
- Open the URL displayed in the terminal (usually http://localhost:8501).
- Enter a blog topic in the input field and click Generate Blog.
- View the generated blog and download it using the Download Blog button (a minimal UI sketch follows this list).
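The front end can be as small as a text input, a generate button, and st.download_button for the .txt export. The sketch below is hypothetical, not the project's exact `app.py`: the widget labels and the imported `generate_blog` helper (here assumed to live in a module named `blog_chain`) are illustrative.

```python
# Hypothetical sketch of the Streamlit front end (not the exact app.py code).
import streamlit as st
from blog_chain import generate_blog   # hypothetical module wrapping the LLM call

st.title("Blog Generation System")
topic = st.text_input("Enter a blog topic")

if st.button("Generate Blog") and topic:
    blog = generate_blog(topic)          # full blog returned as one string
    st.markdown(blog)                    # render the generated sections
    st.download_button(
        label="Download Blog",
        data=blog,
        file_name=f"{topic}.txt",
        mime="text/plain",
    )
```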
```
Blog-Generation-using-ollama/
├── app.py              # Main Streamlit app code
├── README.md           # Project documentation
└── requirements.txt    # List of dependencies
```
This project is open-source and available under the MIT License.
Feel free to contribute by submitting pull requests or reporting issues.