DeepSeek R1 Distill Llama 70B

DeepSeek R1 distilled into the Llama 70B architecture.

Model ID: deepseek-r1-distill-llama-70b (beta)
Context: 131,072 tokens
Pricing: starting at $0.75/M input tokens, $0.99/M output tokens
Capabilities: Streaming, Tools, JSON Output

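These capabilities are exercised through the gateway's API. Below is a minimal sketch of a chat completion request against this model, assuming the gateway exposes an OpenAI-compatible endpoint; the base URL and the environment variable name for the API key are assumptions, not documented values.

# Minimal sketch: calling deepseek-r1-distill-llama-70b through the gateway.
# Assumes an OpenAI-compatible endpoint; the base URL and env var name below
# are placeholders, not confirmed values.
import os

from openai import OpenAI

client = OpenAI(
    base_url="https://api.llmgateway.io/v1",    # assumed gateway base URL
    api_key=os.environ["LLM_GATEWAY_API_KEY"],  # assumed env var name
)

response = client.chat.completions.create(
    model="deepseek-r1-distill-llama-70b",
    messages=[
        {"role": "user", "content": "Explain model distillation in two sentences."}
    ],
)

print(response.choices[0].message.content)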
Providers for DeepSeek R1 Distill Llama 70B

LLM Gateway routes each request to the best available provider that can handle your prompt size and parameters.

Groq (beta)
Model ID: groq/deepseek-r1-distill-llama-70b
Context Size: 131.1k
Stability: Stable
Pricing: $0.75/M input tokens, $0.99/M output tokens
Capabilities: Streaming, Tools, JSON Output
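
To pin a request to this provider specifically, the provider-prefixed model ID above can be used in place of the generic one. The sketch below streams a response through Groq; as in the earlier example, the client setup assumes an OpenAI-compatible gateway endpoint, and the base URL and environment variable name are assumptions.

# Minimal sketch: provider-pinned, streamed request via Groq.
# The client setup mirrors the earlier example and is an assumption about
# the gateway's OpenAI-compatible API.
import os

from openai import OpenAI

client = OpenAI(
    base_url="https://api.llmgateway.io/v1",    # assumed gateway base URL
    api_key=os.environ["LLM_GATEWAY_API_KEY"],  # assumed env var name
)

stream = client.chat.completions.create(
    model="groq/deepseek-r1-distill-llama-70b",  # provider-pinned model ID
    messages=[
        {"role": "user", "content": "List three uses for a distilled reasoning model."}
    ],
    stream=True,  # streaming is listed as a supported capability
)

for chunk in stream:
    # Print incremental tokens as they arrive; some chunks carry no content.
    if chunk.choices and chunk.choices[0].delta.content:
        print(chunk.choices[0].delta.content, end="", flush=True)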