Commit 58e1378

constantinius authored and shellmayr committed
feat(python): add docs for Pydantic AI integration
1 parent 66d18c5

File tree

3 files changed: +255 −2 lines

docs/platforms/python/integrations/index.mdx

Lines changed: 3 additions & 2 deletions
```diff
@@ -37,7 +37,6 @@ The Sentry SDK uses integrations to hook into the functionality of popular libra
 | <LinkWithPlatformIcon platform="python.sqlalchemy" label="SQLAlchemy" url="/platforms/python/integrations/sqlalchemy" /> ||
 
 ### AI
-
 | | **Auto-enabled** |
 | --------------------------------------------------------------------------------------------------------------------------------- | :--------------: |
 | <LinkWithPlatformIcon platform="anthropic" label="Anthropic" url="/platforms/python/integrations/anthropic" /> ||
@@ -48,7 +47,9 @@ The Sentry SDK uses integrations to hook into the functionality of popular libra
 | <LinkWithPlatformIcon platform="langchain" label="LangChain" url="/platforms/python/integrations/langchain" /> ||
 | <LinkWithPlatformIcon platform="langgraph" label="LangGraph" url="/platforms/python/integrations/langgraph" /> ||
 | <LinkWithPlatformIcon platform="litellm" label="LiteLLM" url="/platforms/python/integrations/litellm" /> | |
-| <LinkWithPlatformIcon platform="mcp" label="MCP (Model Context Protocol)" url="/platforms/python/integrations/mcp" /> | |
+| <LinkWithPlatformIcon platform="pydantic-ai" label="Pydantic AI" url="/platforms/python/integrations/pydantic-ai" /> | |
+| <LinkWithPlatformIcon platform="mcp" label="MCP (Model Context Protocol)" url="/platforms/python/integrations/mcp" /> | |
+
 
 ### Data Processing
```

Lines changed: 251 additions & 0 deletions
---
title: Pydantic AI
description: "Learn about using Sentry for Pydantic AI."
---

This integration connects Sentry with the [Pydantic AI](https://ai.pydantic.dev/) library.
The integration has been confirmed to work with Pydantic AI version 1.0.0+.

Once you've installed this SDK, you can use [Sentry AI Agents Insights](https://sentry.io/orgredirect/organizations/:orgslug/insights/agents/), a Sentry dashboard that helps you understand what's going on with your AI agents.

Sentry AI Agents monitoring will automatically collect information about agents, tools, prompts, tokens, and models.

## Install

Install `sentry-sdk` from PyPI:

```bash {tabTitle:pip}
pip install "sentry-sdk"
```

```bash {tabTitle:uv}
uv add "sentry-sdk"
```

## Configure

Add `PydanticAIIntegration()` to your `integrations` list:

```python {tabTitle:OpenAI}
import sentry_sdk
from sentry_sdk.integrations.pydantic_ai import PydanticAIIntegration
from sentry_sdk.integrations.openai import OpenAIIntegration

sentry_sdk.init(
    dsn="___PUBLIC_DSN___",
    traces_sample_rate=1.0,
    # Add data like LLM and tool inputs/outputs;
    # see https://docs.sentry.io/platforms/python/data-management/data-collected/ for more info
    send_default_pii=True,
    integrations=[
        PydanticAIIntegration(),
    ],
    # Disable the OpenAI integration to avoid double reporting of chat spans
    disabled_integrations=[OpenAIIntegration()],
)
```

```python {tabTitle:Anthropic}
import sentry_sdk
from sentry_sdk.integrations.pydantic_ai import PydanticAIIntegration

sentry_sdk.init(
    dsn="___PUBLIC_DSN___",
    traces_sample_rate=1.0,
    # Add data like LLM and tool inputs/outputs;
    # see https://docs.sentry.io/platforms/python/data-management/data-collected/ for more info
    send_default_pii=True,
    integrations=[
        PydanticAIIntegration(),
    ],
)
```

<Alert level="warning">

When using Pydantic AI with OpenAI models, you must disable the OpenAI integration to avoid double reporting of chat spans. Add `disabled_integrations=[OpenAIIntegration()]` to your `sentry_sdk.init()` call as shown in the OpenAI tab above.

</Alert>

## Verify

Verify that the integration works by running an AI agent. The resulting data should show up in your AI Agents Insights dashboard. In this example, we're creating a customer support agent that analyzes customer inquiries and can optionally look up order information using a tool.

```python {tabTitle:OpenAI}
import asyncio

import sentry_sdk
from sentry_sdk.integrations.pydantic_ai import PydanticAIIntegration
from sentry_sdk.integrations.openai import OpenAIIntegration
from pydantic_ai import Agent, RunContext
from pydantic import BaseModel


class SupportResponse(BaseModel):
    message: str
    sentiment: str
    requires_escalation: bool


support_agent = Agent(
    'openai:gpt-4o-mini',
    name="Customer Support Agent",
    system_prompt=(
        "You are a helpful customer support agent. Analyze customer inquiries, "
        "provide helpful responses, and determine if escalation is needed. "
        "If the customer mentions an order number, use the lookup tool to get details."
    ),
    output_type=SupportResponse,
)


@support_agent.tool
async def lookup_order(ctx: RunContext[None], order_id: str) -> dict:
    """Look up order details by order ID.

    Args:
        ctx: The context object.
        order_id: The order identifier.

    Returns:
        Order details including status and tracking.
    """
    # In a real application, this would query a database
    return {
        "order_id": order_id,
        "status": "shipped",
        "tracking_number": "1Z999AA10123456784",
        "estimated_delivery": "2024-03-15",
    }


async def main() -> None:
    sentry_sdk.init(
        dsn="___PUBLIC_DSN___",
        traces_sample_rate=1.0,
        # Add data like LLM and tool inputs/outputs;
        # see https://docs.sentry.io/platforms/python/data-management/data-collected/ for more info
        send_default_pii=True,
        integrations=[
            PydanticAIIntegration(),
        ],
        # Disable the OpenAI integration to avoid double reporting of chat spans
        disabled_integrations=[OpenAIIntegration()],
    )

    result = await support_agent.run(
        "Hi, I'm wondering about my order #ORD-12345. When will it arrive?"
    )
    print(result.output)


if __name__ == "__main__":
    asyncio.run(main())
```

```python {tabTitle:Anthropic}
import asyncio

import sentry_sdk
from sentry_sdk.integrations.pydantic_ai import PydanticAIIntegration
from pydantic_ai import Agent, RunContext
from pydantic import BaseModel


class SupportResponse(BaseModel):
    message: str
    sentiment: str
    requires_escalation: bool


support_agent = Agent(
    'anthropic:claude-3-5-sonnet-latest',
    name="Customer Support Agent",
    system_prompt=(
        "You are a helpful customer support agent. Analyze customer inquiries, "
        "provide helpful responses, and determine if escalation is needed. "
        "If the customer mentions an order number, use the lookup tool to get details."
    ),
    output_type=SupportResponse,
)


@support_agent.tool
async def lookup_order(ctx: RunContext[None], order_id: str) -> dict:
    """Look up order details by order ID.

    Args:
        ctx: The context object.
        order_id: The order identifier.

    Returns:
        Order details including status and tracking.
    """
    # In a real application, this would query a database
    return {
        "order_id": order_id,
        "status": "shipped",
        "tracking_number": "1Z999AA10123456784",
        "estimated_delivery": "2024-03-15",
    }


async def main() -> None:
    sentry_sdk.init(
        dsn="___PUBLIC_DSN___",
        traces_sample_rate=1.0,
        # Add data like LLM and tool inputs/outputs;
        # see https://docs.sentry.io/platforms/python/data-management/data-collected/ for more info
        send_default_pii=True,
        integrations=[
            PydanticAIIntegration(),
        ],
    )

    result = await support_agent.run(
        "Hi, I'm wondering about my order #ORD-12345. When will it arrive?"
    )
    print(result.output)


if __name__ == "__main__":
    asyncio.run(main())
```

It may take a couple of moments for the data to appear in [sentry.io](https://sentry.io).

## Behavior

Data on the following will be collected:

- AI agent invocations
- tool executions
- the number of input and output tokens used
- LLM model usage
- model settings (temperature, max_tokens, etc.)

Sentry considers LLM and tool inputs/outputs PII and doesn't include this data by default. If you want to include it, set `send_default_pii=True` in the `sentry_sdk.init()` call. To explicitly exclude prompts and outputs despite `send_default_pii=True`, configure the integration with `include_prompts=False` as shown in the [Options section](#options) below.

## Options

By adding `PydanticAIIntegration` to your `sentry_sdk.init()` call explicitly, you can set options for `PydanticAIIntegration` to change its behavior:

```python
import sentry_sdk
from sentry_sdk.integrations.pydantic_ai import PydanticAIIntegration

sentry_sdk.init(
    # ...
    # Add data like inputs and responses;
    # see https://docs.sentry.io/platforms/python/data-management/data-collected/ for more info
    send_default_pii=True,
    integrations=[
        PydanticAIIntegration(
            # LLM and tool inputs/outputs will not be sent to Sentry, despite send_default_pii=True
            include_prompts=False,
        ),
    ],
)
```
You can pass the following keyword arguments to `PydanticAIIntegration()`:

- `include_prompts`:

  Whether LLM and tool inputs and outputs should be sent to Sentry. Sentry considers this data personally identifiable information (PII) by default. If you want to include the data, set `send_default_pii=True` in the `sentry_sdk.init()` call. To explicitly exclude prompts and outputs despite `send_default_pii=True`, configure the integration with `include_prompts=False`.

  The default is `True`.

## Supported Versions

- Pydantic AI: 1.0.0+
- Python: 3.9+

docs/platforms/python/tracing/instrumentation/custom-instrumentation/ai-agents-module.mdx

Lines changed: 1 addition & 0 deletions
```diff
@@ -19,6 +19,7 @@ The Python SDK supports automatic instrumentation for some AI libraries. We reco
 - <PlatformLink to="/integrations/langchain/">LangChain</PlatformLink>
 - <PlatformLink to="/integrations/langgraph/">LangGraph</PlatformLink>
 - <PlatformLink to="/integrations/litellm/">LiteLLM</PlatformLink>
+- <PlatformLink to="/integrations/pydantic-ai/">Pydantic AI</PlatformLink>
 
 ## Manual Instrumentation
```
0 commit comments