AI SDK Provider v2.0 Released
Released v2.0 of our @llmgateway/ai-sdk-provider npm package with improved Vercel AI SDK integration and simplified model access.

We're excited to announce the release of v2.0 of our @llmgateway/ai-sdk-provider npm package, making it even easier to integrate LLM Gateway with the Vercel AI SDK.
🚀 What's New in v2.0
Enhanced integration with the Vercel AI SDK gives you seamless access to every provider and model LLM Gateway supports through a single package.
📦 Installation
npm install @llmgateway/ai-sdk-provider
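The examples below also import from the ai package (the Vercel AI SDK itself), so make sure it is installed in your project as well:
npm install ai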
🔧 Quick Start
Simple and intuitive API for accessing any model through our unified gateway:
import { llmgateway } from "@llmgateway/ai-sdk-provider";
import { generateText } from "ai";

const { text } = await generateText({
  model: llmgateway("openai/gpt-4o"),
  prompt: `What's up?`,
});

console.log(`output: ${text}`);
✨ Key Features
- Unified Model Access: Use any of our 40+ models with the same simple interface
- Provider Agnostic: Switch between OpenAI, Anthropic, Groq, and other providers seamlessly
- Full AI SDK Compatibility: Works with all Vercel AI SDK functions including generateText, streamText, and generateObject (see the sketch after this list)
- TypeScript Support: Full type safety and IntelliSense support
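To illustrate the streamText and generateObject compatibility mentioned above, here is a minimal sketch. The model ids, prompts, and the zod schema are illustrative assumptions, not part of the package itself:

import { llmgateway } from "@llmgateway/ai-sdk-provider";
import { streamText, generateObject } from "ai";
import { z } from "zod";

// Stream tokens to stdout as they arrive.
const result = streamText({
  model: llmgateway("anthropic/claude-3-5-sonnet-20241022"),
  prompt: "Write a haiku about API gateways.",
});
for await (const chunk of result.textStream) {
  process.stdout.write(chunk);
}

// Generate a typed object validated against a zod schema (the schema here is hypothetical).
const { object } = await generateObject({
  model: llmgateway("openai/gpt-4o"),
  schema: z.object({
    title: z.string(),
    tags: z.array(z.string()),
  }),
  prompt: "Suggest a title and tags for a post about LLM routing.",
});
console.log(object);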
🎯 Supported Models
Access all models using the familiar provider/model format:
- openai/gpt-4o
- anthropic/claude-3-5-sonnet-20241022
- groq/llama-3.1-70b-versatile
- And 40+ more models across 14+ providers
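Switching providers only requires changing that model string; for example (the model id and prompt below are illustrative):

import { llmgateway } from "@llmgateway/ai-sdk-provider";
import { generateText } from "ai";

// Same call as the Quick Start; only the provider/model string differs.
const { text } = await generateText({
  model: llmgateway("groq/llama-3.1-70b-versatile"),
  prompt: "Summarize the benefits of a unified LLM gateway in one sentence.",
});
console.log(text);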
Check out the full documentation and explore the package on npm.