Meta: Llama 3.3 70B Instruct API

Access Meta: Llama 3.3 70B Instruct from Meta Llama using the Puter.js AI API.

Get Started

Model Card

Llama 3.3 70B Instruct is Meta's refined 70 billion parameter multilingual model with improved instruction following and tool use capabilities. It supports 8 languages and offers enhanced reasoning performance over previous versions.

Context Window: N/A tokens

Max Output: 16K tokens

Input Cost: $0.10 per million tokens

Output Cost: $0.32 per million tokens

Release Date: Dec 6, 2024


API Usage Example

Add Meta: Llama 3.3 70B Instruct to your app with just a few lines of code.
No API keys, no backend, no configuration required.

Using the npm package (Node.js or a bundler):

// npm install @heyputer/puter.js
import { puter } from '@heyputer/puter.js';

puter.ai.chat("Explain quantum computing in simple terms", {
    model: "meta-llama/llama-3.3-70b-instruct"
}).then(response => {
    console.log(response.message.content); // document.body is not available in Node
});
Or use it directly in the browser with a script tag, no install required:

<html>
<body>
    <script src="https://js.puter.com/v2/"></script>
    <script>
        puter.ai.chat("Explain quantum computing in simple terms", {
            model: "meta-llama/llama-3.3-70b-instruct"
        }).then(response => {
            document.body.innerHTML = response.message.content;
        });
    </script>
</body>
</html>
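If you call the model from several places in an app, it can help to wrap the call in a small helper. The sketch below assumes only the call shape shown above (`puter.ai.chat(prompt, options)` returning a response with `response.message.content`); the helper name `askLlama` is illustrative, not part of the Puter.js API:

```javascript
// Minimal wrapper around puter.ai.chat. Assumes the response shape
// shown in the examples above: response.message.content holds the reply.
async function askLlama(prompt, options = {}) {
    const response = await puter.ai.chat(prompt, {
        model: "meta-llama/llama-3.3-70b-instruct",
        ...options, // caller-supplied options (e.g. a different model) win
    });
    return response.message.content;
}

// Usage:
// const answer = await askLlama("Explain quantum computing in simple terms");
```

Centralizing the model id this way means switching models later is a one-line change.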

View full documentation →

Frequently Asked Questions

What is this Meta: Llama 3.3 70B Instruct API about?

The Meta: Llama 3.3 70B Instruct API gives you access to Meta Llama's chat model through Puter.js. With just a few lines of JavaScript, you can integrate Meta: Llama 3.3 70B Instruct into any web app or Node.js project — no API keys, no backend, and no configuration required.

Who created Meta: Llama 3.3 70B Instruct?

Meta: Llama 3.3 70B Instruct was created by Meta Llama and released on Dec 6, 2024.

What is the max output length of Meta: Llama 3.3 70B Instruct?

Meta: Llama 3.3 70B Instruct can generate up to 16K tokens in a single response.

How much does it cost?

The Meta: Llama 3.3 70B Instruct API is available through the User-Pays Model. As a developer, you can add the Meta: Llama 3.3 70B Instruct API to your app for free; your users pay for their own AI costs directly.

Price per 1M tokens:

Input: $0.10
Output: $0.32

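Per-request cost follows directly from these per-million-token rates. A small sketch of the arithmetic (rates taken from the pricing table above; the function name is illustrative):

```javascript
// Per-million-token rates from the pricing table above.
const INPUT_RATE = 0.10;   // USD per 1M input tokens
const OUTPUT_RATE = 0.32;  // USD per 1M output tokens

// Estimate the cost of a single request in USD.
function estimateCost(inputTokens, outputTokens) {
    return (inputTokens / 1e6) * INPUT_RATE
         + (outputTokens / 1e6) * OUTPUT_RATE;
}

// Example: a 2,000-token prompt with a 500-token reply costs about $0.00036.
```

Output tokens cost roughly three times as much as input tokens, so long generations dominate the bill for most workloads.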
How do I access the Meta: Llama 3.3 70B Instruct API?

You can access the Meta: Llama 3.3 70B Instruct API with just a few lines of JavaScript — no API keys, no backend, and no configuration required. Include the Puter.js library in your project and start making calls right away. For more details, check out our documentation.

Does the Meta: Llama 3.3 70B Instruct API work with React / Vue / Vanilla JS / Node / etc.?

Yes — the Meta: Llama 3.3 70B Instruct API works with any JavaScript framework, Node.js, or plain HTML through Puter.js. Just include the library and start building. See the documentation for more details.

Get started with Puter.js

Add Meta: Llama 3.3 70B Instruct to your app without worrying about API keys or setup.

Read the Docs View Tutorials