DeepSeek: R1 Distill Llama 70B API

Access DeepSeek: R1 Distill Llama 70B from DeepSeek using the Puter.js AI API.

Get Started

Model Card

DeepSeek R1 Distill Llama 70B is a 70 billion parameter dense model fine-tuned from Llama 3.3-70B-Instruct using 800K reasoning samples generated by DeepSeek R1. It brings R1's reasoning capabilities to a more accessible size while maintaining strong performance on math and coding benchmarks.

Context Window: N/A

Max Output: 16K tokens

Input Cost: $0.70 per million tokens

Output Cost: $0.80 per million tokens

Release Date: Jan 20, 2025

API Usage Example

Add DeepSeek: R1 Distill Llama 70B to your app with just a few lines of code.
No API keys, no backend, no configuration required.

Using the npm package in a bundled web app:

// npm install @heyputer/puter.js
import { puter } from '@heyputer/puter.js';

puter.ai.chat("Explain quantum computing in simple terms", {
    model: "deepseek/deepseek-r1-distill-llama-70b"
}).then(response => {
    document.body.innerHTML = response.message.content;
});
Or directly in an HTML page via the CDN script:

<html>
<body>
    <script src="https://js.puter.com/v2/"></script>
    <script>
        puter.ai.chat("Explain quantum computing in simple terms", {
            model: "deepseek/deepseek-r1-distill-llama-70b"
        }).then(response => {
            document.body.innerHTML = response.message.content;
        });
    </script>
</body>
</html>
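For longer answers you may want to show text as it arrives rather than waiting for the full response. The sketch below assumes Puter.js's documented `stream: true` option, which yields response parts you can iterate with `for await`; `appendPart` and `streamChat` are illustrative helpers, not Puter.js APIs.

```javascript
// Collect each streamed part's incremental text into a running transcript.
function appendPart(transcript, part) {
    return part && part.text ? transcript + part.text : transcript;
}

// Minimal streaming sketch, assuming the `stream: true` option.
async function streamChat(prompt) {
    const response = await puter.ai.chat(prompt, {
        model: "deepseek/deepseek-r1-distill-llama-70b",
        stream: true
    });
    let transcript = "";
    for await (const part of response) {
        transcript = appendPart(transcript, part);
        document.body.textContent = transcript; // update the UI as tokens arrive
    }
}

// Only run where the Puter.js script is actually loaded.
if (typeof puter !== "undefined") {
    streamChat("Explain quantum computing in simple terms");
}
```

Using `textContent` instead of `innerHTML` here avoids interpreting model output as HTML.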

View full documentation →

Frequently Asked Questions

What is this DeepSeek: R1 Distill Llama 70B API about?

The DeepSeek: R1 Distill Llama 70B API gives you access to DeepSeek's chat model through Puter.js. With just a few lines of JavaScript, you can integrate DeepSeek: R1 Distill Llama 70B into any web app or Node.js project — no API keys, no backend, and no configuration required.

Who created DeepSeek: R1 Distill Llama 70B?

DeepSeek: R1 Distill Llama 70B was created by DeepSeek and released on Jan 20, 2025.

What is the max output length of DeepSeek: R1 Distill Llama 70B?

DeepSeek: R1 Distill Llama 70B can generate up to 16K tokens in a single response.

How much does it cost?

The DeepSeek: R1 Distill Llama 70B API is available through the User-Pays Model. As a developer, you can add the DeepSeek: R1 Distill Llama 70B API to your app for free — your users pay for their own AI costs directly.

Price per 1M tokens:
Input: $0.70
Output: $0.80
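As a rough illustration of the user-pays rates above, the cost of a single call can be estimated from its token counts. `estimateCostUSD` is a hypothetical helper for this page, not part of Puter.js:

```javascript
// Listed per-million-token rates for this model.
const INPUT_PER_M = 0.70;   // $ per 1M input tokens
const OUTPUT_PER_M = 0.80;  // $ per 1M output tokens

// Estimate the dollar cost of one request from its token counts.
function estimateCostUSD(inputTokens, outputTokens) {
    return (inputTokens * INPUT_PER_M + outputTokens * OUTPUT_PER_M) / 1_000_000;
}

// A 2,000-token prompt with a 1,000-token reply:
// 2000 * 0.70 / 1e6 + 1000 * 0.80 / 1e6 = $0.0022
```

Actual billing is handled by Puter's User-Pays Model, so this is only useful for back-of-the-envelope planning.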
How do I access the DeepSeek: R1 Distill Llama 70B API?

You can access the DeepSeek: R1 Distill Llama 70B API with just a few lines of JavaScript — no API keys, no backend, and no configuration required. Include the Puter.js library in your project and start making calls right away. For more details, check out our documentation.

Does the DeepSeek: R1 Distill Llama 70B API work with React / Vue / Vanilla JS / Node / etc.?

Yes — the DeepSeek: R1 Distill Llama 70B API works with any JavaScript framework, Node.js, or plain HTML through Puter.js. Just include the library and start building. See the documentation for more details.
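Whatever the framework, network or quota errors can reject the promise returned by `puter.ai.chat`, so it is worth wrapping calls defensively. This sketch assumes the same response shape as the examples above; `askModel` is an illustrative wrapper, not a Puter.js API:

```javascript
// Hypothetical wrapper: return the model's answer, or an error string
// instead of letting a rejected promise break the UI.
async function askModel(prompt) {
    try {
        const response = await puter.ai.chat(prompt, {
            model: "deepseek/deepseek-r1-distill-llama-70b"
        });
        return response.message.content;
    } catch (err) {
        return `Request failed: ${err && err.message ? err.message : err}`;
    }
}

// Only run where the Puter.js script is actually loaded.
if (typeof puter !== "undefined") {
    askModel("Explain quantum computing in simple terms")
        .then(answer => { document.body.textContent = answer; });
}
```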

Get started with Puter.js

Add DeepSeek: R1 Distill Llama 70B to your app without worrying about API keys or setup.

Read the Docs · View Tutorials