Add FastAPI integration example #189
Merged
Changes from 1 commit (6 commits in total):

- e8e7728 Add FastAPI integration example (KennyVaneetvelde)
- f67ab94 Fix import paths and add required docstrings (KennyVaneetvelde)
- d63b5fc Fix schema type parameters and update setup docs (KennyVaneetvelde)
- 398d5c6 Update model names, finish FastAPI implementation (KennyVaneetvelde)
- 7236108 Run Black & Flake8 (KennyVaneetvelde)
- 71c8376 Remove hardoded windows path (KennyVaneetvelde)
**fastapi-memory/README.md** (new file, 92 lines added)
# FastAPI with Atomic Agents

A simple example demonstrating how to integrate Atomic Agents with FastAPI for building stateful conversational APIs.

## Features

- Session-based conversation management
- RESTful API endpoints for chat interactions
- Automatic session creation and cleanup
- Environment-based configuration

## Setup

1. Install dependencies:

   ```bash
   poetry install
   ```

2. Create a `.env` file with your OpenAI API key:

   ```bash
   cp .env.example .env
   # Edit .env and add your OpenAI API key
   ```
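To confirm the key is actually being picked up before you start the server, you can run a quick check from the project directory. This snippet is not part of the example; it only relies on `python-dotenv` (already a dependency) loading `.env` from the current working directory:

```python
# Quick sanity check that the .env file is found and OPENAI_API_KEY is set.
# Run this from the fastapi-memory/ directory so load_dotenv() sees .env.
import os

from dotenv import load_dotenv

load_dotenv()
print("OPENAI_API_KEY loaded:", bool(os.getenv("OPENAI_API_KEY")))
```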
## Running the Example

Start the FastAPI server:

```bash
poetry run python fastapi_memory/main.py
```

The API will be available at `http://localhost:8000`.

## API Documentation

Once running, visit:

- Interactive API docs: `http://localhost:8000/docs`
- Alternative docs: `http://localhost:8000/redoc`

## Usage Examples

### Send a message (creates new session automatically):

```bash
curl -X POST "http://localhost:8000/chat" \
  -H "Content-Type: application/json" \
  -d '{"message": "Hello, what can you help me with?"}'
```

### Continue a conversation with session ID:

```bash
curl -X POST "http://localhost:8000/chat" \
  -H "Content-Type: application/json" \
  -d '{"message": "Tell me more about that", "session_id": "user123"}'
```

### List active sessions:

```bash
curl "http://localhost:8000/sessions"
```

### Clear a specific session:

```bash
curl -X DELETE "http://localhost:8000/sessions/user123"
```
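The same flow can be driven from Python instead of curl. The sketch below is not part of this example; it assumes the server is running locally and that `httpx` is installed (it is not listed in the example's dependencies):

```python
# Hypothetical Python client for the example API; mirrors the curl calls above.
import httpx

with httpx.Client(base_url="http://localhost:8000") as client:
    # No session_id: the server falls back to its shared "default" session.
    first = client.post("/chat", json={"message": "Hello, what can you help me with?"})
    print(first.json()["response"])

    # Reuse an explicit session_id so follow-up messages keep their context.
    follow_up = client.post(
        "/chat",
        json={"message": "Tell me more about that", "session_id": "user123"},
    )
    print(follow_up.json()["response"])

    # Inspect the active sessions, then clean up the one we created.
    print(client.get("/sessions").json())
    client.delete("/sessions/user123")
```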
## How It Works

The example demonstrates several key patterns:

1. **Session Management**: Each session maintains its own agent instance with independent conversation history.
2. **Lazy Initialization**: Agent instances are created on-demand when a session is first accessed.
3. **Automatic Cleanup**: The lifespan context manager ensures proper cleanup when the application shuts down.
4. **Type Safety**: Uses Pydantic schemas for request/response validation.
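Patterns 1-3 reduce to a small lazy registry plus a lifespan hook. The following is a stripped-down sketch of that pattern, not the example itself: the per-session agent is replaced by a plain dictionary so the snippet runs without an API key or the Atomic Agents package.

```python
# Stripped-down sketch of the session-registry pattern used by this example.
# The per-session agent is replaced by a plain dict so it runs standalone.
from contextlib import asynccontextmanager

from fastapi import FastAPI

sessions: dict[str, dict] = {}  # session_id -> per-session state


def get_or_create_session(session_id: str) -> dict:
    # Lazy initialization: state is only created the first time a session is seen.
    if session_id not in sessions:
        sessions[session_id] = {"history": []}
    return sessions[session_id]


@asynccontextmanager
async def lifespan(app: FastAPI):
    yield  # the application serves requests between startup and shutdown
    sessions.clear()  # automatic cleanup on shutdown


app = FastAPI(lifespan=lifespan)


@app.post("/echo/{session_id}")
async def echo(session_id: str, message: str):
    # `message` is taken as a query parameter to keep the sketch short.
    state = get_or_create_session(session_id)
    state["history"].append(message)
    return {"session_id": session_id, "history": state["history"]}
```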
## Project Structure

```
fastapi-memory/
├── pyproject.toml        # Project dependencies
├── .env.example          # Environment variable template
├── README.md             # This file
└── fastapi_memory/
    └── main.py           # FastAPI application
```

## Related Examples

For more advanced usage, check out:

- `mcp-agent/example-client/example_client/main_fastapi.py` - Advanced example with MCP protocol integration
**fastapi-memory/fastapi_memory/main.py** (new file, 94 lines added)
```python
import os
from contextlib import asynccontextmanager
from typing import Optional

import instructor
import openai
from atomic_agents.agents.atomic_agent import AtomicAgent
from atomic_agents.lib.base.base_io_schema import BaseIOSchema
from atomic_agents.lib.components.agent_config import AgentConfig
from atomic_agents.lib.components.system_prompt_generator import SystemPromptGenerator
from dotenv import load_dotenv
from fastapi import FastAPI, HTTPException
from pydantic import Field

load_dotenv()


class ChatRequest(BaseIOSchema):
    message: str = Field(..., description="User message")
    session_id: Optional[str] = Field(None, description="Session identifier for conversation continuity")


class ChatResponse(BaseIOSchema):
    response: str = Field(..., description="Agent response")
    session_id: str = Field(..., description="Session identifier")


# In-memory session store: maps session_id -> AtomicAgent instance.
sessions = {}


def get_or_create_agent(session_id: str) -> AtomicAgent:
    # Lazily build an agent the first time a session is seen.
    if session_id not in sessions:
        client = instructor.from_openai(openai.OpenAI(api_key=os.getenv("OPENAI_API_KEY")))

        system_prompt = SystemPromptGenerator(
            background=["You are a helpful AI assistant that maintains conversation context."],
            steps=["Understand the user's message", "Provide a clear and helpful response"],
            output_instructions=["Be concise and friendly", "Reference previous context when relevant"],
        )

        config = AgentConfig(
            client=client,
            model="gpt-4o-mini",
            system_prompt_generator=system_prompt,
        )

        sessions[session_id] = AtomicAgent(config=config)

    return sessions[session_id]


@asynccontextmanager
async def lifespan(app: FastAPI):
    yield
    # Drop all per-session agents when the application shuts down.
    sessions.clear()


app = FastAPI(
    title="Atomic Agents FastAPI Example",
    description="Simple example showing FastAPI integration with Atomic Agents",
    lifespan=lifespan,
)


@app.post("/chat", response_model=ChatResponse)
async def chat(request: ChatRequest):
    try:
        # Fall back to a shared "default" session when no session_id is provided.
        session_id = request.session_id or "default"
        agent = get_or_create_agent(session_id)

        result = agent.run(ChatRequest(message=request.message))
        return ChatResponse(response=result.response, session_id=session_id)

    except Exception as e:
        raise HTTPException(status_code=500, detail=str(e))


@app.delete("/sessions/{session_id}")
async def clear_session(session_id: str):
    if session_id in sessions:
        del sessions[session_id]
        return {"message": f"Session {session_id} cleared"}
    raise HTTPException(status_code=404, detail="Session not found")


@app.get("/sessions")
async def list_sessions():
    return {"active_sessions": list(sessions.keys())}


if __name__ == "__main__":
    import uvicorn

    uvicorn.run(app, host="0.0.0.0", port=8000)
```
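The session endpoints never call OpenAI, so they can be exercised without an API key. A hypothetical smoke test using FastAPI's `TestClient` might look like the sketch below; it is not part of this PR and assumes the test module sits next to the package so that `fastapi_memory.main` is importable, and that `httpx` (required by `TestClient`) is available:

```python
# Hypothetical smoke test for the session endpoints (not part of this PR).
from fastapi.testclient import TestClient

from fastapi_memory.main import app, sessions

client = TestClient(app)


def test_session_listing_and_cleanup():
    # Pretend an agent already exists for this session without calling OpenAI.
    sessions["user123"] = object()

    assert client.get("/sessions").json() == {"active_sessions": ["user123"]}

    assert client.delete("/sessions/user123").status_code == 200
    assert client.delete("/sessions/user123").status_code == 404  # already removed
```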
**fastapi-memory/pyproject.toml** (new file, 20 lines added)
```toml
[tool.poetry]
name = "fastapi-memory"
version = "0.1.0"
description = "Simple FastAPI integration example with Atomic Agents"
authors = ["BrainBlend AI"]
readme = "README.md"

[tool.poetry.dependencies]
python = ">=3.12,<4.0"
atomic-agents = {path = "../..", develop = true}
fastapi = "^0.115.14"
uvicorn = "^0.32.1"
instructor = "==1.9.2"
openai = ">=1.0.0"
pydantic = ">=2.10.3,<3.0.0"
python-dotenv = ">=1.0.1,<2.0.0"

[build-system]
requires = ["poetry-core"]
build-backend = "poetry.core.masonry.api"
```