LLM Proxy
LLM Proxy API docs
Embeddings
Generate vector embeddings for semantic search and retrieval.
POST /v1/embeddings
cURL example

```bash
curl -s http://localhost:3000/v1/embeddings \
  -H "Authorization: Bearer $LLM_PROXY_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "text-embedding-3-small",
    "input": "Hello from LLM Proxy"
  }'
```

Request body example
```json
{
  "model": "text-embedding-3-small",
  "input": "LLM Proxy routes one OpenAI-compatible API across providers."
}
```

Response example
```json
{
  "object": "list",
  "data": [
    {
      "object": "embedding",
      "index": 0,
      "embedding": [0.0123, -0.0191, 0.0442]
    }
  ],
  "model": "text-embedding-3-small",
  "usage": {
    "prompt_tokens": 13,
    "total_tokens": 13
  }
}
```
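For semantic search, a common pattern is to embed your documents ahead of time, embed the query at request time, and rank documents by cosine similarity. A minimal sketch in Python, using the truncated 3-element vector from the response example as the query (the corpus vectors here are made up for illustration; real embeddings are much longer):

```python
import math

def cosine_similarity(a, b):
    # cos(theta) = (a . b) / (|a| * |b|)
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Hypothetical pre-computed document embeddings (same dimensionality as the query).
corpus = {
    "doc-a": [0.0120, -0.0200, 0.0440],
    "doc-b": [-0.0400, 0.0310, -0.0100],
}

# The (truncated) embedding vector from the response example above.
query = [0.0123, -0.0191, 0.0442]

# Rank documents by similarity to the query, highest first.
ranked = sorted(corpus, key=lambda k: cosine_similarity(query, corpus[k]), reverse=True)
print(ranked[0])  # doc-a points in nearly the same direction as the query
```

In practice the corpus embeddings would come from the same `/v1/embeddings` endpoint and the ranking would be delegated to a vector store once the corpus grows beyond a few thousand documents.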