# CanopyWave Provider
CanopyWave is a platform for running large language models behind an OpenAI-compatible API.
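Because the API is OpenAI-compatible, an existing OpenAI client can usually be pointed at CanopyWave by overriding the base URL. The sketch below uses the official Python `openai` SDK; the base URL, the environment variable name, and whether the model ID needs the `canopywave/` prefix are assumptions here, so check CanopyWave's own documentation for the real values.

```python
import os

from openai import OpenAI

# Hypothetical endpoint and key variable; substitute the values from
# CanopyWave's documentation.
client = OpenAI(
    base_url="https://api.canopywave.com/v1",
    api_key=os.environ["CANOPYWAVE_API_KEY"],
)

response = client.chat.completions.create(
    model="deepseek-v3.1",  # listed below; some clients address it as canopywave/deepseek-v3.1
    messages=[
        {"role": "user", "content": "Briefly explain what an OpenAI-compatible API is."},
    ],
)
print(response.choices[0].message.content)
```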
## Available Models
All models below are served by CanopyWave, are marked stable, and are addressed as `canopywave/<model-id>` (for example `canopywave/deepseek-v3.1`). Prices are per 1M tokens; no cached-token price is listed for any model.

| Model | Model ID | Context | Input (per 1M) | Output (per 1M) | Capabilities |
|---|---|---|---|---|---|
| DeepSeek V3.1 | `deepseek-v3.1` | 128k | $0.27 / $0.03 | $1.00 / $0.10 | Streaming, Vision, Tools, JSON Output |
| MiniMax M2 | `minimax-m2` | 196.6k | $0.25 / $0.06 | $1.00 / $0.25 | Streaming, Tools, Reasoning, JSON Output |
| Kimi K2 Thinking | `kimi-k2-thinking` | 262.1k | $0.48 / $0.12 | $2.00 / $0.50 | Streaming, Tools, JSON Output |
| Qwen3 Coder | `qwen3-coder` | 262k | $0.22 / $0.06 | $0.95 / $0.24 | Streaming, Tools, JSON Output |
| GLM-4.6 | `glm-4.6` | 202.8k | $0.45 / $0.11 | $1.50 / $0.38 | Streaming, Tools, Reasoning, JSON Output |
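Every model in the table advertises the Streaming capability, so a minimal sketch of token-by-token output looks like the following. The model ID comes from the table above; the base URL, prefix handling, and environment variable remain the same assumptions as in the first example.

```python
import os

from openai import OpenAI

# Same assumed endpoint as above.
client = OpenAI(
    base_url="https://api.canopywave.com/v1",
    api_key=os.environ["CANOPYWAVE_API_KEY"],
)

# Stream the completion token by token instead of waiting for the full response.
stream = client.chat.completions.create(
    model="qwen3-coder",
    messages=[{"role": "user", "content": "Write a one-line Python hello world."}],
    stream=True,
)
for chunk in stream:
    delta = chunk.choices[0].delta.content if chunk.choices else None
    if delta:
        print(delta, end="", flush=True)
print()
```

For the models that list JSON Output, the usual OpenAI-compatible convention is to pass `response_format={"type": "json_object"}` on the same call; whether CanopyWave supports that exact parameter is an assumption beyond the capability flag shown above.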