Anthropic / OpenAI
Bring your own LLM key. SeldonFrame never holds the LLM bill — your agents call your provider directly with your key, encrypted at rest.
Why BYOK
AI agents on SeldonFrame use your own Anthropic or OpenAI key. That means:
- You pay the LLM provider directly — no markup on tokens.
- You see usage in your provider's dashboard.
- Your data goes to your provider's account, not a SeldonFrame pooled account.
- SeldonFrame doesn't ration tokens by tier: you spend what you spend.
Anthropic setup
1. Get a key: go to console.anthropic.com, create an API key, and copy it.
2. Paste it into SeldonFrame: Settings → LLM keys → "Add Anthropic key." Paste the key. SeldonFrame encrypts it with your workspace's encryption key (set as an env var in your deployment) and stores it.
3. Verify: click "Test connection." We make a tiny call to Claude; if it succeeds, you're done.
OpenAI setup
Same flow. Get a key at platform.openai.com, paste into Settings → LLM keys.
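For OpenAI, a minimal sketch of an equivalent test call (hypothetical helper; listing models is a cheap way to confirm a key works):

```typescript
// OpenAI authenticates with a Bearer token instead of an x-api-key header.
function buildOpenAIPing(apiKey: string) {
  return {
    url: "https://api.openai.com/v1/models",
    method: "GET",
    headers: { Authorization: `Bearer ${apiKey}` },
  };
}
```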
Top up your account
Anthropic and OpenAI both require a positive balance to make calls. If your agent suddenly stops responding, the most common cause is a depleted balance. SeldonFrame surfaces the actual provider error ("credit balance too low") in the chat surface so you know what to fix.
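Surfacing the raw provider error can be sketched as follows. The error-body shape is an assumption based on the JSON errors both providers return (`{ "error": { "type": "...", "message": "..." } }`), and the function name is hypothetical:

```typescript
// Assumed provider error shape; both Anthropic and OpenAI return
// a JSON body of roughly this form on 4xx responses.
interface ProviderErrorBody {
  error?: { type?: string; message?: string };
}

function surfaceProviderError(status: number, body: ProviderErrorBody): string {
  // Pass the provider's own wording through, so a depleted balance
  // reads as "credit balance is too low" rather than a generic failure.
  const message = body.error?.message ?? `provider returned HTTP ${status}`;
  return `LLM call failed: ${message}`;
}
```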
Encryption at rest
Keys are encrypted with AES-256-GCM using a per-deployment ENCRYPTION_KEY env var. The encrypted key is stored in Postgres; the env var lives only in your Vercel project. Decrypting a stored key therefore requires compromising both the database and the deployment environment.
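The scheme above can be sketched with Node's built-in crypto module. This is a minimal illustration, not SeldonFrame's implementation: the exact storage layout is an assumption (here, IV + auth tag + ciphertext packed into one base64 string).

```typescript
import { createCipheriv, createDecipheriv, randomBytes } from "node:crypto";

// masterKey is the 32-byte value of the ENCRYPTION_KEY env var.
function encryptKey(plaintext: string, masterKey: Buffer): string {
  const iv = randomBytes(12); // standard 96-bit GCM nonce, fresh per encryption
  const cipher = createCipheriv("aes-256-gcm", masterKey, iv);
  const ciphertext = Buffer.concat([cipher.update(plaintext, "utf8"), cipher.final()]);
  // Pack iv + 16-byte auth tag + ciphertext into one column-friendly string.
  return Buffer.concat([iv, cipher.getAuthTag(), ciphertext]).toString("base64");
}

function decryptKey(stored: string, masterKey: Buffer): string {
  const raw = Buffer.from(stored, "base64");
  const iv = raw.subarray(0, 12);
  const tag = raw.subarray(12, 28);
  const ciphertext = raw.subarray(28);
  const decipher = createDecipheriv("aes-256-gcm", masterKey, iv);
  decipher.setAuthTag(tag); // GCM fails decryption if tag doesn't verify
  return Buffer.concat([decipher.update(ciphertext), decipher.final()]).toString("utf8");
}
```

GCM gives authenticated encryption: a tampered ciphertext or wrong master key makes decryption throw rather than return garbage.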
Switching providers per agent
Each agent can be configured independently: agent A can use Claude Sonnet 4 (Anthropic) while agent B uses GPT-4 (OpenAI). Set this in Agents → Settings → Brain.
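Conceptually, each agent carries its own provider and model, resolved independently at call time. A hypothetical sketch (the type names, agent IDs, and model strings are all illustrative):

```typescript
type Provider = "anthropic" | "openai";

interface BrainConfig {
  provider: Provider;
  model: string;
}

// Illustrative per-agent settings, as saved from Agents → Settings → Brain.
const brains: Record<string, BrainConfig> = {
  "agent-a": { provider: "anthropic", model: "claude-sonnet-4" },
  "agent-b": { provider: "openai", model: "gpt-4" },
};

function brainFor(agentId: string): BrainConfig {
  const brain = brains[agentId];
  if (!brain) throw new Error(`no brain configured for ${agentId}`);
  return brain;
}
```

Keeping the lookup per-agent means changing one agent's model never affects another's.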