LLM Proxy API Documentation
Use one base URL and one key to access multiple model providers with an OpenAI-compatible API.
Getting Started
Create a key and make your first request.
Authentication
Bearer auth format and key usage.
Endpoints
All OpenAI-compatible routes.
Claude Code
Connect Claude Code to this proxy.
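To make a first request against the proxy, send an OpenAI-style chat completions call with your key in a Bearer `Authorization` header. The sketch below builds such a request with the Python standard library; the base URL, key, and model name are placeholders (they do not come from this page), so substitute your own before sending.

```python
import json
import urllib.request

# Placeholder values -- replace with the proxy's real base URL and your key.
BASE_URL = "https://llm-proxy.example.com"
API_KEY = "sk-example-key"

def build_chat_request(model: str, messages: list[dict]) -> urllib.request.Request:
    """Build (but do not send) an OpenAI-compatible chat completions request."""
    body = json.dumps({"model": model, "messages": messages}).encode("utf-8")
    return urllib.request.Request(
        url=f"{BASE_URL}/v1/chat/completions",
        data=body,
        headers={
            # Bearer auth format, same as the OpenAI API
            "Authorization": f"Bearer {API_KEY}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_chat_request("example-model", [{"role": "user", "content": "Hello"}])
print(req.full_url)                     # https://llm-proxy.example.com/v1/chat/completions
print(req.get_header("Authorization"))  # Bearer sk-example-key
# To actually send it: urllib.request.urlopen(req)
```

Because the proxy is OpenAI-compatible, any existing OpenAI client should also work by pointing its base URL at the proxy and using your proxy key in place of an OpenAI key.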
Supported OpenAI-compatible endpoints
List Models
GET /v1/models
Returns all currently available logical models for your key.

Chat Completions
POST /v1/chat/completions
Primary text generation endpoint, compatible with OpenAI chat clients.

Completions
POST /v1/completions
Legacy text completion endpoint for clients that use prompt-based completions.

Embeddings
POST /v1/embeddings
Generates vector embeddings for semantic search and retrieval.
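All four routes hang off the same base URL, so a client only needs the base URL plus the path and method from the table above. A small sketch of that mapping, with a hypothetical base URL as the example input (only the paths and methods come from this page):

```python
# Method and path for each documented route, keyed by capability.
ENDPOINTS = {
    "models":           ("GET",  "/v1/models"),
    "chat_completions": ("POST", "/v1/chat/completions"),
    "completions":      ("POST", "/v1/completions"),
    "embeddings":       ("POST", "/v1/embeddings"),
}

def endpoint_url(base_url: str, name: str) -> str:
    """Join the proxy base URL with one of the documented route paths."""
    _method, path = ENDPOINTS[name]
    return base_url.rstrip("/") + path

print(endpoint_url("https://llm-proxy.example.com", "embeddings"))
# https://llm-proxy.example.com/v1/embeddings
```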