MiniMax M2

MiniMax M2 model with reasoning and tool support.

Model ID: minimax-m2
Stability: STABLE
Context: 196,608 tokens
Pricing: starting at $0.06/M input tokens and $0.25/M output tokens (75% off)
Capabilities: Streaming, Tools, Reasoning, JSON Output
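As a sketch of how a model with these capabilities is typically invoked, the following builds an OpenAI-compatible chat-completions payload for `minimax-m2`. This is an illustration only: the base URL below is a placeholder, not a documented LLM Gateway endpoint, and the payload shape is an assumption based on the common OpenAI-style API.

```python
import json

# Placeholder endpoint -- NOT a documented LLM Gateway URL.
BASE_URL = "https://example-gateway.invalid/v1/chat/completions"

# Hypothetical OpenAI-style request exercising the listed capabilities.
payload = {
    "model": "minimax-m2",                        # model ID from the listing
    "stream": True,                               # Streaming capability
    "response_format": {"type": "json_object"},   # JSON Output capability
    "messages": [
        {
            "role": "user",
            "content": 'Reply with a JSON object {"ok": true}.',
        }
    ],
}

# The serialized body would be POSTed to BASE_URL with an API key header.
body = json.dumps(payload)
print(body[:40])
```

Tool calling and reasoning would be enabled the same way, via the provider's documented request fields for those features.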

Providers for MiniMax M2

LLM Gateway routes each request to the best provider able to handle your prompt size and parameters.

CanopyWave

Model ID: canopywave/minimax-m2
Context Size: 196.6k
Stability: STABLE
Pricing: Input $0.06/M (was $0.25/M), Output $0.25/M (was $1.00/M), 75% off; cached input pricing not listed
Capabilities: Streaming, Tools, Reasoning, JSON Output
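To make the per-token rates concrete, here is a small worked cost calculation using the discounted prices quoted above ($0.06 per million input tokens, $0.25 per million output tokens). The function name and token counts are illustrative, not part of any API.

```python
# Discounted rates from the listing, in USD per 1M tokens.
INPUT_PRICE_PER_M = 0.06
OUTPUT_PRICE_PER_M = 0.25


def request_cost(input_tokens: int, output_tokens: int) -> float:
    """Return the USD cost of one request at the discounted rates."""
    return (
        input_tokens * INPUT_PRICE_PER_M
        + output_tokens * OUTPUT_PRICE_PER_M
    ) / 1_000_000


# A 10,000-token prompt with a 2,000-token reply:
cost = request_cost(10_000, 2_000)
print(f"${cost:.6f}")  # -> $0.001100
```

At these rates, a million input tokens costs $0.06, so even long-context requests near the 196.6k limit stay around a cent of input cost.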