Commit 183e653: Merge pull request #5 from Cognitive-Stacks/write_docs (Write docs)
2 parents: 5702304 + a6cfb31

README.md (278 additions, 35 deletions)
# MCPHub

MCPHub is an embeddable Model Context Protocol (MCP) solution for AI services. It enables seamless integration of MCP servers into any AI framework, allowing developers to easily configure, set up, and manage MCP servers within their applications. Whether you're using OpenAI Agents, LangChain, or Autogen, MCPHub provides a unified way to connect your AI services with MCP tools and resources.

## Quick Start

### Prerequisites

Ensure you have the following tools installed:

```bash
# Install uv (Python package manager)
curl -LsSf https://astral.sh/uv/install.sh | sh

# Install git (for repository cloning)
sudo apt-get install git   # Ubuntu/Debian
brew install git           # macOS

# Install npx (comes with Node.js)
npm install -g npx

# Install MCPHub
pip install mcphub
```
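As a quick sanity check, you can verify that these tools are on your `PATH` from Python. This is a hypothetical helper for illustration, not part of MCPHub:

```python
import shutil

def missing_tools(tools):
    """Return the command names from `tools` that are not found on PATH."""
    return [tool for tool in tools if shutil.which(tool) is None]

# Example: check the prerequisites listed above
absent = missing_tools(["uv", "git", "npx"])
if absent:
    print("Missing prerequisites:", ", ".join(absent))
else:
    print("All prerequisites found.")
```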

### Configuration

Create a `.mcphub.json` file in your project root:

```json
{
    "mcpServers": {
        "sequential-thinking-mcp": {
            "package_name": "smithery-ai/server-sequential-thinking",
            "command": "npx",
            "args": [
                "-y",
                "@smithery/cli@latest",
                "run",
                "@smithery-ai/server-sequential-thinking"
            ]
        }
    }
}
```
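Since the file is plain JSON, its shape can be inspected with the standard library alone. A minimal sketch, independent of MCPHub's own loader, assuming the layout shown above:

```python
import json
from pathlib import Path

def load_mcphub_config(path=".mcphub.json"):
    """Parse the config file and return the server-name -> settings mapping."""
    config = json.loads(Path(path).read_text())
    return config.get("mcpServers", {})

# Each entry carries at least the command to launch and its arguments:
# for name, spec in load_mcphub_config().items():
#     print(name, spec["command"], spec["args"])
```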

### Usage with OpenAI Agents

```python
import asyncio
import json

from agents import Agent, Runner
from mcphub import MCPHub


async def main():
    """
    Example of using MCPHub to integrate MCP servers with OpenAI Agents.

    This example demonstrates:
    1. Initializing MCPHub
    2. Fetching and using an MCP server
    3. Listing available tools
    4. Creating and running an agent with MCP tools
    """
    # Step 1: Initialize MCPHub.
    # MCPHub will automatically:
    # - Find .mcphub.json in your project
    # - Load server configurations
    # - Set up servers (clone repos, run setup scripts if needed)
    hub = MCPHub()

    # Step 2: Create an MCP server instance using an async context manager.
    # Parameters:
    # - mcp_name: the name of the server from your .mcphub.json
    # - cache_tools_list: cache the tools list for better performance
    async with hub.fetch_openai_mcp_server(
        mcp_name="sequential-thinking-mcp",
        cache_tools_list=True,
    ) as server:
        # Step 3: List the tools available from the MCP server.
        # This shows what capabilities are available to your agent.
        tools = await server.list_tools()

        # Pretty-print the tools for better readability
        tools_dict = [
            dict(tool) if hasattr(tool, "__dict__") else tool for tool in tools
        ]
        print("Available MCP Tools:")
        print(json.dumps(tools_dict, indent=2))

        # Step 4: Create an OpenAI Agent with the MCP server.
        # The agent can now use all tools provided by the server.
        agent = Agent(
            name="Assistant",
            instructions="Use the available tools to accomplish the given task",
            mcp_servers=[server],  # Provide the MCP server to the agent
        )

        # Step 5: Run the agent on a complex task.
        # The agent automatically has access to all MCP tools.
        complex_task = """Please help me analyze the following complex problem:
        We need to design a new feature for our product that balances user privacy
        with data collection for improving the service. Consider the ethical implications,
        technical feasibility, and business impact. Break down your thinking process
        step by step, and provide a detailed recommendation with clear justification
        for each decision point."""

        # Execute the task and print the result
        result = await Runner.run(agent, complex_task)
        print("\nAgent Response:")
        print(result)


if __name__ == "__main__":
    # Run the async main function
    asyncio.run(main())
```

## Features and Guidelines

### Server Configuration

- **JSON-based Configuration**: Simple `.mcphub.json` configuration file
- **Environment Variable Support**: Use environment variables in the configuration
- **Predefined Servers**: Access to a growing list of pre-configured MCP servers
- **Custom Server Support**: Easy integration of custom MCP servers

Configure your MCP servers in `.mcphub.json`:
(The `//` comments below are annotations only; strict JSON does not allow comments, so remove them in your actual `.mcphub.json`.)

```jsonc
{
    "mcpServers": {
        // TypeScript-based MCP server using NPX
        "sequential-thinking-mcp": {
            "package_name": "smithery-ai/server-sequential-thinking", // NPM package name
            "command": "npx",                                         // Command to run the server
            "args": [                                                 // Command arguments
                "-y",
                "@smithery/cli@latest",
                "run",
                "@smithery-ai/server-sequential-thinking"
            ]
        },
        // Python-based MCP server from GitHub
        "azure-storage-mcp": {
            "package_name": "mashriram/azure_mcp_server",                // Package identifier
            "repo_url": "https://github.com/mashriram/azure_mcp_server", // GitHub repository
            "command": "uv",                                             // Python package manager
            "args": ["run", "mcp_server_azure_cmd"],                     // Run command
            "setup_script": "uv pip install -e .",                       // Installation script
            "env": {                                                     // Environment variables
                "AZURE_STORAGE_CONNECTION_STRING": "${AZURE_STORAGE_CONNECTION_STRING}",
                "AZURE_STORAGE_CONTAINER_NAME": "${AZURE_STORAGE_CONTAINER_NAME}",
                "AZURE_STORAGE_BLOB_NAME": "${AZURE_STORAGE_BLOB_NAME}"
            }
        }
    }
}
```
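The `${VAR}` placeholders in the `env` block follow common shell-style substitution against the current environment. A minimal sketch of how such placeholders can be expanded (a hypothetical helper, not MCPHub's actual implementation):

```python
import os
from string import Template

def expand_env(env_spec):
    """Replace ${VAR} placeholders with values from the current environment.

    Unknown variables are left untouched (safe_substitute), so missing
    settings are easy to spot rather than silently becoming empty strings.
    """
    return {
        key: Template(value).safe_substitute(os.environ)
        for key, value in env_spec.items()
    }

# Example (assuming AZURE_STORAGE_CONNECTION_STRING is set in your shell):
# resolved = expand_env(
#     {"AZURE_STORAGE_CONNECTION_STRING": "${AZURE_STORAGE_CONNECTION_STRING}"}
# )
```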
### MCP Server Installation and Management

- **Flexible Server Setup**: Supports both TypeScript- and Python-based MCP servers
- **Multiple Installation Sources**:
  - NPM packages via `npx`
  - Python packages via GitHub repository URLs
  - Local development servers
- **Automatic Setup**: Handles repository cloning, dependency installation, and server initialization

### Transport Support

- **stdio Transport**: Run MCP servers as local subprocesses
- **Automatic Path Management**: Manages server paths and working directories
- **Environment Variable Handling**: Configurable environment variables per server
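Conceptually, the stdio transport is just a subprocess whose stdin/stdout carry newline-delimited messages. A stripped-down illustration (not MCPHub's actual transport code; the echo "server" here stands in for a real MCP server process):

```python
import subprocess
import sys

def run_stdio_roundtrip(command, args, message):
    """Spawn a server process and exchange one line over its stdin/stdout."""
    proc = subprocess.Popen(
        [command, *args],
        stdin=subprocess.PIPE,
        stdout=subprocess.PIPE,
        text=True,
    )
    # Send one message and collect the reply; a real client would keep
    # the pipes open and stream JSON-RPC messages instead.
    out, _ = proc.communicate(message + "\n")
    return out.strip()

# Demo with a trivial echo "server" in place of a real MCP server:
reply = run_stdio_roundtrip(sys.executable, ["-c", "print(input())"], "ping")
print(reply)  # -> ping
```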
### Framework Integration

Provides adapters for popular AI frameworks:

- OpenAI Agents
- LangChain
- Autogen

```python
from agents import Agent  # OpenAI Agents SDK
from mcphub import MCPHub

async def framework_examples():
    hub = MCPHub()

    # 1. OpenAI Agents integration
    async with hub.fetch_openai_mcp_server(
        mcp_name="sequential-thinking-mcp",
        cache_tools_list=True,
    ) as server:
        # Use the server with OpenAI Agents
        agent = Agent(
            name="Assistant",
            mcp_servers=[server],
        )

    # 2. LangChain tools integration
    langchain_tools = await hub.fetch_langchain_mcp_tools(
        mcp_name="sequential-thinking-mcp",
        cache_tools_list=True,
    )
    # Use the tools with LangChain

    # 3. Autogen adapters integration
    autogen_adapters = await hub.fetch_autogen_mcp_adapters(
        mcp_name="sequential-thinking-mcp"
    )
    # Use the adapters with Autogen
```
### Tool Management

- **Tool Discovery**: Automatically list and manage the tools available from MCP servers
- **Tool Caching**: Optional caching of tool lists for improved performance
- **Framework-specific Adapters**: Convert MCP tools to framework-specific formats

Discover and manage MCP server tools:

```python
from mcphub import MCPHub

async def tool_management():
    hub = MCPHub()

    # List all tools from a specific MCP server
    tools = await hub.list_tools(mcp_name="sequential-thinking-mcp")

    # Print tool information
    for tool in tools:
        print(f"Tool Name: {tool.name}")
        print(f"Description: {tool.description}")
        print(f"Parameters: {tool.parameters}")
        print("---")

    # Tools can be:
    # - Cached for better performance with cache_tools_list=True
    # - Converted to framework-specific formats automatically
    # - Used directly with AI frameworks through adapters
```
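The effect of `cache_tools_list=True` can be pictured with a simple memoizing wrapper. This is a hypothetical stand-in, not MCPHub's internals:

```python
import asyncio

class CachingToolLister:
    """Cache the result of an async list_tools callable after the first call."""

    def __init__(self, list_tools):
        self._list_tools = list_tools
        self._cache = None

    async def list_tools(self):
        if self._cache is None:
            self._cache = await self._list_tools()
        return self._cache

async def demo():
    calls = 0

    async def slow_list_tools():
        # Stands in for a round trip to the MCP server
        nonlocal calls
        calls += 1
        return ["sequentialthinking"]

    lister = CachingToolLister(slow_list_tools)
    await lister.list_tools()
    await lister.list_tools()
    return calls  # the underlying server was queried only once

print(asyncio.run(demo()))  # -> 1
```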
## MCPHub: High-Level Overview

MCPHub simplifies the integration of Model Context Protocol (MCP) servers into AI applications through four main components:

![MCPHub Architecture](./docs/simple_mcphub_work.png)

### Core Components

1. **Params Hub**
   - Manages configurations from `.mcphub.json`
   - Defines which MCP servers to use and how to set them up
   - Stores server parameters such as commands, arguments, and environment variables

2. **MCP Servers Manager**
   - Handles server installation and setup
   - Supports two types of servers:
     * TypeScript-based servers (installed via npx)
     * Python-based servers (installed via uv from GitHub)
   - Manages server lifecycle and environment

3. **MCP Client**
   - Establishes communication with MCP servers
   - Uses stdio transport for server interaction
   - Handles two main operations:
     * `list_tools`: discovers available server tools
     * `call_tool`: executes server tools

4. **Framework Adapters**
   - Convert MCP tools to framework-specific formats
   - Support multiple AI frameworks:
     * OpenAI Agents
     * LangChain
     * Autogen
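The client's two operations can be pictured with a toy in-process server standing in for a real MCP subprocess. The method names `list_tools`/`call_tool` mirror the operations above; everything else here is invented for illustration:

```python
class ToyMCPServer:
    """In-process stand-in for an MCP server reachable over stdio."""

    def __init__(self):
        # One registered tool: integer addition
        self._tools = {"add": lambda a, b: a + b}

    def list_tools(self):
        # Discover the available server tools
        return sorted(self._tools)

    def call_tool(self, name, **kwargs):
        # Execute a server tool by name
        return self._tools[name](**kwargs)

server = ToyMCPServer()
print(server.list_tools())                # -> ['add']
print(server.call_tool("add", a=2, b=3))  # -> 5
```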
### Workflow

1. **Configuration & Setup**
   - Params Hub reads the configuration
   - Servers Manager sets up the required servers
   - Servers start and become available

2. **Communication**
   - MCP Client connects to the servers via stdio
   - Tools are discovered and made available
   - Requests and responses flow between client and server

3. **Integration**
   - Framework adapters convert MCP tools
   - AI applications use the adapted tools through their preferred framework
   - Tools are executed through the established communication channel

This architecture provides a seamless way to integrate MCP capabilities into any AI application while maintaining a clean separation of concerns and framework flexibility.

## Contributing

We welcome contributions! Please check out our [Contributing Guide](CONTRIBUTING.md) for guidelines on how to proceed.

## License

This project is licensed under the MIT License - see the [LICENSE](LICENSE) file for details.
