
Commit 4ac9e4c

Merge pull request #17345 from BerriAI/litellm_fix_jwt_auth_route_issue
Add other routes in jwt auth
2 parents 397acec + 6d296b1 commit 4ac9e4c

3 files changed: +67, -3 lines

docs/my-website/docs/proxy/pass_through.md

Lines changed: 14 additions & 0 deletions
@@ -275,6 +275,20 @@ In this video, we'll add the Azure OpenAI Assistants API as a pass through endpo
- Check LiteLLM proxy logs for error details
- Verify the target API's expected request format

### Allowing Team JWTs to use pass-through routes

If you are using pass-through provider routes (e.g., `/anthropic/*`) and want your JWT team tokens to access these routes, add `mapped_pass_through_routes` to `team_allowed_routes` in `litellm_jwtauth`, or explicitly add the relevant route(s).

Example (`proxy_server_config.yaml`):

```yaml
general_settings:
  enable_jwt_auth: True
  litellm_jwtauth:
    team_ids_jwt_field: "team_ids"
    team_allowed_routes: ["openai_routes","info_routes","mapped_pass_through_routes"]
```
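For reference, below is a minimal sketch of what a team-JWT call to a pass-through route can look like once the config above is in place. It assumes the proxy runs locally at `http://localhost:4000`, that the environment variable `TEAM_JWT` holds a team token issued by your identity provider, and that the Anthropic pass-through is mounted at `/anthropic`; the exact headers and body each provider's pass-through expects may differ.

```python
# Minimal sketch: call the Anthropic pass-through route on the LiteLLM proxy,
# authenticating to the proxy with a team JWT as the bearer token.
# PROXY_BASE_URL, TEAM_JWT, and the model name are illustrative placeholders.
import os

import requests

PROXY_BASE_URL = "http://localhost:4000"  # assumed local proxy address
TEAM_JWT = os.environ["TEAM_JWT"]         # team JWT issued by your identity provider

response = requests.post(
    f"{PROXY_BASE_URL}/anthropic/v1/messages",
    headers={
        "Authorization": f"Bearer {TEAM_JWT}",  # proxy-side JWT auth
        "anthropic-version": "2023-06-01",      # standard Anthropic API version header
        "content-type": "application/json",
    },
    json={
        "model": "claude-3-5-sonnet-20241022",
        "max_tokens": 256,
        "messages": [{"role": "user", "content": "Hello from a team JWT"}],
    },
    timeout=60,
)
response.raise_for_status()
print(response.json())
```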
### Getting Help

[Schedule Demo 👋](https://calendly.com/d/4mp-gd3-k5k/berriai-1-1-onboarding-litellm-hosted-version)

docs/my-website/docs/proxy/token_auth.md

Lines changed: 52 additions & 0 deletions
@@ -338,6 +338,58 @@ general_settings:
      team_allowed_routes: ["/v1/chat/completions"] # 👈 Set accepted routes

### Allowing other provider routes for Teams

To enable team JWT tokens to access Anthropic-style endpoints such as `/v1/messages`, update `team_allowed_routes` in your `litellm_jwtauth` configuration. `team_allowed_routes` supports the following values:

- Named route groups from `LiteLLMRoutes` (e.g., `openai_routes`, `anthropic_routes`, `info_routes`, `mapped_pass_through_routes`).
- Explicit route strings (e.g., `/v1/messages`).

Below is a quick reference for the route groups you can use, with representative routes from each group. For the exhaustive, authoritative list, see the `LiteLLMRoutes` enum in `litellm/proxy/_types.py`.

| Route Group | What it contains | Representative routes |
|-------------|------------------|-----------------------|
| `openai_routes` | OpenAI-compatible REST endpoints (chat, completions, embeddings, images, responses, models, etc.) | `/v1/chat/completions`, `/v1/completions`, `/v1/embeddings`, `/v1/images/generations`, `/v1/models` |
| `anthropic_routes` | Anthropic-style endpoints (`/v1/messages` and related) | `/v1/messages`, `/v1/messages/count_tokens`, `/v1/skills` |
| `mapped_pass_through_routes` | Provider-specific pass-through route prefixes (e.g., Anthropic when proxied via `/anthropic`) | `/anthropic/*`, `/vertex-ai/*`, `/bedrock/*` |
| `passthrough_routes_wildcard` | Precomputed wildcard list of provider pass-through routes used by the proxy | `/anthropic/*`, `/vllm/*` |
| `google_routes` | Google-specific endpoints (e.g., Vertex / batching) | `/v1beta/models/{model_name}:generateContent` |
| `mcp_routes` | Internal MCP management endpoints | `/mcp/tools`, `/mcp/tools/call` |
| `info_routes` | Read-only & info endpoints used by the UI | `/key/info`, `/team/info`, `/v1/models` |
| `management_routes` | Admin-only management endpoints (create/update/delete user/team/model) | `/team/new`, `/key/generate`, `/model/new` |
| `spend_tracking_routes` | Budget/spend related endpoints | `/spend/logs`, `/spend/keys` |
| `public_routes` | Public, unauthenticated endpoints | `/`, `/routes`, `/.well-known/litellm-ui-config` |

Note: `llm_api_routes` is the union of OpenAI, Anthropic, Google, pass-through, and other LLM routes (`openai_routes + anthropic_routes + google_routes + mapped_pass_through_routes + passthrough_routes_wildcard + apply_guardrail_routes + mcp_routes + litellm_native_routes`).

Defaults (what the proxy uses if you don't override them in `litellm_jwtauth`):

- `admin_jwt_scope`: `litellm_proxy_admin`
- `admin_allowed_routes`: `management_routes`, `spend_tracking_routes`, `global_spend_tracking_routes`, `info_routes`
- `team_allowed_routes`: `openai_routes`, `info_routes`
- `public_allowed_routes`: `public_routes`

Example: allow team JWTs to call Anthropic `/v1/messages`, either by route group or by explicit route string:

```yaml
general_settings:
  enable_jwt_auth: True
  litellm_jwtauth:
    team_ids_jwt_field: "team_ids"
    team_allowed_routes: ["openai_routes", "info_routes", "anthropic_routes"]
```

Or selectively allow only the exact Anthropic messages endpoint:

```yaml
general_settings:
  enable_jwt_auth: True
  litellm_jwtauth:
    team_ids_jwt_field: "team_ids"
    team_allowed_routes: ["/v1/messages", "info_routes"]
```
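To make the table above concrete, here is a simplified, self-contained sketch (not LiteLLM's actual auth code) of how `team_allowed_routes` entries can mix named route groups, explicit routes, and wildcard prefixes. The `ROUTE_GROUPS` mapping and both helper functions are hypothetical stand-ins for the real `LiteLLMRoutes` enum and route checks.

```python
# Simplified illustration (not LiteLLM's actual auth code) of how
# team_allowed_routes entries can mix named route groups, explicit routes,
# and wildcard prefixes. ROUTE_GROUPS is a trimmed, hypothetical stand-in
# for the LiteLLMRoutes enum referenced above.
from fnmatch import fnmatch
from typing import Dict, List

ROUTE_GROUPS: Dict[str, List[str]] = {
    "openai_routes": ["/v1/chat/completions", "/v1/embeddings", "/v1/models"],
    "anthropic_routes": ["/v1/messages", "/v1/messages/count_tokens"],
    "mapped_pass_through_routes": ["/anthropic/*", "/vertex-ai/*", "/bedrock/*"],
    "info_routes": ["/key/info", "/team/info", "/v1/models"],
}


def expand_allowed_routes(team_allowed_routes: List[str]) -> List[str]:
    """Expand group names into concrete routes; pass explicit routes through."""
    expanded: List[str] = []
    for entry in team_allowed_routes:
        expanded.extend(ROUTE_GROUPS.get(entry, [entry]))
    return expanded


def is_route_allowed(requested_path: str, team_allowed_routes: List[str]) -> bool:
    """Return True if the requested path matches an allowed route or wildcard."""
    return any(
        fnmatch(requested_path, route)
        for route in expand_allowed_routes(team_allowed_routes)
    )


# Example: a team configured with an info group, a wildcard group, and one explicit route.
allowed = ["info_routes", "mapped_pass_through_routes", "/v1/messages"]
assert is_route_allowed("/anthropic/v1/messages", allowed)  # wildcard group match
assert is_route_allowed("/v1/messages", allowed)            # explicit route match
assert not is_route_allowed("/team/new", allowed)           # management route not granted
```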
### Caching Public Keys

Control how long public keys are cached for (in seconds).

litellm/proxy/_types.py

Lines changed: 1 addition & 3 deletions
@@ -3440,9 +3440,7 @@ class LiteLLM_JWTAuth(LiteLLMPydanticObjectBase):
     team_id_upsert: bool = False
     team_ids_jwt_field: Optional[str] = None
     upsert_sso_user_to_team: bool = False
-    team_allowed_routes: List[
-        Literal["openai_routes", "info_routes", "management_routes"]
-    ] = ["openai_routes", "info_routes"]
+    team_allowed_routes: List[str] = ["openai_routes", "info_routes"]
     team_id_default: Optional[str] = Field(
         default=None,
         description="If no team_id given, default permissions/spend-tracking to this team.s",
