Use any model, from any provider, with just one API.
Route, manage, and analyze your LLM requests across multiple providers with a unified API interface.
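The snippet below points the official OpenAI SDK at the gateway's OpenAI-compatible endpoint; only the API key and base URL change from a direct OpenAI integration.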
import OpenAI from "openai";

// Point the official OpenAI SDK at the gateway's OpenAI-compatible endpoint.
const client = new OpenAI({
  apiKey: process.env.LLM_GATEWAY_API_KEY,
  baseURL: "https://api.llmgateway.io/v1/",
});

// Requests use the standard Chat Completions shape; the gateway routes them
// to the requested model.
const response = await client.chat.completions.create({
  model: "gpt-4o",
  messages: [{ role: "user", content: "Hello, how are you?" }],
});

console.log(response.choices[0].message.content);
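Because the endpoint is OpenAI-compatible, targeting a model from another provider is, in principle, just a change to the model string. Below is a minimal sketch reusing the client above; the model identifier is an illustrative assumption, so check the gateway's model catalog for the exact names it accepts.

// Assumption: the gateway routes by model name and "claude-3-5-sonnet" is a
// valid identifier in its catalog; substitute any model the gateway lists.
const claudeResponse = await client.chat.completions.create({
  model: "claude-3-5-sonnet",
  messages: [{ role: "user", content: "Hello, how are you?" }],
});

console.log(claudeResponse.choices[0].message.content);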
