LLM Proxy API docs

Getting Started

Get a key, choose a model, and make your first API request.

Step 1: Create your user API key

Register and log in, then open the API key manager to create a key.


Copy your key immediately when created. It is only shown once.

Step 2: Make a quick chat request
```bash
curl -s https://your-domain.com/v1/chat/completions \
  -H "Authorization: Bearer $LLM_PROXY_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "claude-sonnet-4-20250514",
    "messages": [{"role":"user","content":"Say hello from LLM Proxy"}]
  }'
```
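The request above returns JSON in the OpenAI chat-completions shape. A minimal sketch of pulling the assistant's reply out of that payload (the interface covers only the fields used here, and the sample object is illustrative, not a real server response):

```typescript
// Minimal slice of an OpenAI-style chat completion response.
interface ChatCompletion {
  choices: { message: { role: string; content: string } }[];
}

// Extract the assistant's reply text, or null if no choices came back.
function replyText(completion: ChatCompletion): string | null {
  return completion.choices[0]?.message?.content ?? null;
}

// Illustrative payload shaped like the curl response above.
const sample: ChatCompletion = {
  choices: [
    { message: { role: "assistant", content: "Hello from LLM Proxy" } },
  ],
};

console.log(replyText(sample)); // → "Hello from LLM Proxy"
```

The same `choices[0].message.content` path is what the SDK example in Step 3 reads.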
Step 3: Use any OpenAI-compatible SDK
```typescript
import OpenAI from "openai";

const client = new OpenAI({
  apiKey: process.env.LLM_PROXY_API_KEY,
  baseURL: "https://your-domain.com/v1",
});

const response = await client.chat.completions.create({
  model: "claude-sonnet-4-20250514",
  messages: [{ role: "user", content: "Give me one deployment checklist item." }],
});

console.log(response.choices[0]?.message?.content);
```
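Proxied backends can return transient errors (rate limits, upstream timeouts), so production callers usually wrap requests in a retry. A minimal retry-with-backoff sketch you could wrap around the SDK call above; `maxAttempts` and `baseDelayMs` are illustrative defaults, not LLM Proxy settings:

```typescript
// Retry an async call with exponential backoff between attempts.
// maxAttempts and baseDelayMs are illustrative, not part of the API.
async function withRetry<T>(
  fn: () => Promise<T>,
  maxAttempts = 3,
  baseDelayMs = 500,
): Promise<T> {
  let lastError: unknown;
  for (let attempt = 0; attempt < maxAttempts; attempt++) {
    try {
      return await fn();
    } catch (err) {
      lastError = err;
      // Backoff doubles each attempt: 500 ms, 1000 ms, 2000 ms, ...
      await new Promise((r) => setTimeout(r, baseDelayMs * 2 ** attempt));
    }
  }
  throw lastError;
}
```

Usage: `await withRetry(() => client.chat.completions.create({ model, messages }))`. For finer control you may want to retry only on 429/5xx status codes rather than on every error.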