Grok-3 Mini

Compact Grok-3 for fast, cost-effective inference.

grok-3-mini
Status: Stable (model deactivated)
Context: 131,072 tokens
Starting at $0.30/M input tokens
Starting at $0.50/M output tokens
Streaming
Tools
JSON Output
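The three features above are typically exercised through an OpenAI-compatible chat-completions request. A minimal sketch of such a request body follows; the schema (`messages`, `stream`, `response_format`, `tools`) is the common OpenAI-style one and is an assumption here, since this page does not document the gateway's exact request format, and the `get_time` tool is a hypothetical example.

```python
import json

# Sketch of an OpenAI-compatible chat-completions request body for
# grok-3-mini. The field names follow the widely used OpenAI-style schema;
# the gateway's actual endpoint and schema are assumptions, not taken from
# this listing.
payload = {
    "model": "grok-3-mini",
    "messages": [
        {"role": "user", "content": 'Reply with the JSON object {"ok": true}.'}
    ],
    "stream": False,  # set True to receive streamed chunks (Streaming)
    "response_format": {"type": "json_object"},  # JSON Output
    "tools": [  # Tools: a hypothetical function the model may call
        {
            "type": "function",
            "function": {
                "name": "get_time",  # hypothetical example tool
                "description": "Return the current UTC time.",
                "parameters": {"type": "object", "properties": {}},
            },
        }
    ],
}

body = json.dumps(payload)
```

Sending `body` as the POST payload to the gateway's chat-completions endpoint (with your API key) would exercise all three features in one call.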


All Providers for Grok-3 Mini

LLM Gateway routes each request to the best provider able to handle your prompt size and parameters.

xAI
Context: 131.1k tokens
Deprecated since Jan 30, 2026; deactivated since Feb 28, 2026
Input: $0.30/M tokens
Cached input: no price listed
Output: $0.50/M tokens