Feature Request
Problem
There is no way to configure a custom Anthropic base URL.
Users running local LLM gateways (e.g. Bifrost, LiteLLM) that expose
an Anthropic-compatible endpoint cannot use them as a drop-in replacement.
Requested Change
Add support for an ANTHROPIC_BASE_URL environment variable (similar to
how OPENAI_BASE_URL is supported) so users can point the Anthropic
provider to a custom endpoint.
Expected Behavior
Setting:
ANTHROPIC_BASE_URL=http://my-gateway:8080/anthropic
...should route all Anthropic API calls to that endpoint instead of
https://api.anthropic.com
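A minimal sketch of how the provider could resolve the base URL. The function name, the trailing-slash normalization, and the fallback constant are assumptions for illustration, not the project's actual implementation:

```typescript
// Hypothetical sketch: resolve the Anthropic base URL from the environment,
// falling back to the official endpoint when ANTHROPIC_BASE_URL is unset.
const DEFAULT_ANTHROPIC_BASE_URL = "https://api.anthropic.com";

function resolveAnthropicBaseUrl(
  env: Record<string, string | undefined> = process.env
): string {
  const raw = env.ANTHROPIC_BASE_URL?.trim();
  // Strip trailing slashes so path joining stays predictable.
  return raw ? raw.replace(/\/+$/, "") : DEFAULT_ANTHROPIC_BASE_URL;
}
```

With this in place, `resolveAnthropicBaseUrl({ ANTHROPIC_BASE_URL: "http://my-gateway:8080/anthropic" })` would return the gateway URL, while an empty environment falls back to the official endpoint, mirroring how OPENAI_BASE_URL overrides typically behave.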
Use Case
- Local LLM gateways (Bifrost, LiteLLM, etc.)
- Corporate proxies
- Self-hosted Anthropic-compatible endpoints
Workaround
The only current workaround is the Custom OpenAI provider, which suffers
from JSON parsing issues because Claude models wrap responses in markdown fences.