Mistral 7B API

Access Mistral 7B from Mistral AI using Puter.js AI API.

Get Started

Model Card

Mistral 7B is Mistral AI's foundational 7.3-billion-parameter open-source model, released under the Apache 2.0 license. It uses sliding-window attention and grouped-query attention, and it outperforms Llama 2 13B on all benchmarks while remaining efficient enough to run on consumer hardware.

Context Window: N/A

Max Output: 33K tokens

Input Cost: $0.25 per million tokens

Output Cost: $0.25 per million tokens

Input Modalities: text

Tool Use: Yes

Knowledge Cutoff: Dec 2023

Release Date: Sep 27, 2023

API Usage Example

Add Mistral 7B to your app with just a few lines of code.
No API keys, no backend, no configuration required.

Using the npm package:

// npm install @heyputer/puter.js
import { puter } from '@heyputer/puter.js';

puter.ai.chat("Explain quantum computing in simple terms", {
    model: "mistralai/open-mistral-7b"
}).then(response => {
    document.body.innerHTML = response.message.content;
});

Or directly in the browser via a script tag:

<html>
<body>
    <script src="https://js.puter.com/v2/"></script>
    <script>
        puter.ai.chat("Explain quantum computing in simple terms", {
            model: "mistralai/open-mistral-7b"
        }).then(response => {
            document.body.innerHTML = response.message.content;
        });
    </script>
</body>
</html>

View full documentation →

Frequently Asked Questions

What is this Mistral 7B API about?

The Mistral 7B API gives you access to Mistral AI's chat model through Puter.js. With just a few lines of JavaScript, you can integrate Mistral 7B into any web app or Node.js project — no API keys, no backend, and no configuration required.

Who created Mistral 7B?

Mistral 7B was created by Mistral AI and released on Sep 27, 2023.

What is the max output length of Mistral 7B?

Mistral 7B can generate up to 33K tokens in a single response.

What is the knowledge cutoff of Mistral 7B?

Mistral 7B has a knowledge cutoff date of Dec 2023. This means the model was trained on data available up to that date.

What types of input can Mistral 7B process?

Mistral 7B accepts text input and produces text output.

Does Mistral 7B support tool use (function calling)?

Yes, Mistral 7B supports tool use (function calling), allowing it to interact with external tools, APIs, and data sources as part of its response flow.
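As a sketch of what tool use can look like, the snippet below defines an OpenAI-style function schema and a local handler that an app would run when the model requests the tool. The get_weather tool, its schema, and the handler are illustrative assumptions for this example, not part of the Puter.js API itself; see the Puter.js documentation for the exact tool-calling interface.

```javascript
// Hypothetical tool definition in an OpenAI-style function schema
// (the tool name, description, and parameters are made up for illustration).
const tools = [{
    type: "function",
    function: {
        name: "get_weather",
        description: "Get the current weather for a city",
        parameters: {
            type: "object",
            properties: {
                city: { type: "string", description: "City name" }
            },
            required: ["city"]
        }
    }
}];

// Local handler the app runs when the model asks to call a tool.
// Here it returns a stubbed result instead of querying a real weather API.
function handleToolCall(toolCall) {
    const args = JSON.parse(toolCall.function.arguments);
    if (toolCall.function.name === "get_weather") {
        return `It is sunny in ${args.city}`; // stubbed result
    }
    throw new Error(`Unknown tool: ${toolCall.function.name}`);
}

// In the browser, the chat call would then pass the schema along,
// roughly like this (not executed here):
// puter.ai.chat("What's the weather in Paris?", {
//     model: "mistralai/open-mistral-7b",
//     tools: tools
// }).then(response => { /* inspect response for tool calls */ });
```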

How much does it cost?

The Mistral 7B API is available through the User-Pays Model. As a developer, you can add the Mistral 7B API to your app for free — your users pay for their own AI costs directly.

Price per 1M tokens: Input $0.25, Output $0.25.
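To get a feel for the listed rates, here is a back-of-envelope cost estimator based on the $0.25 per million tokens pricing above (the function name and token counts are illustrative, not part of any API):

```javascript
// Rate from the pricing table above: $0.25 per million tokens,
// applied to both input and output tokens.
const RATE_PER_MILLION_USD = 0.25;

// Estimate the cost in USD of a single request.
function estimateCostUSD(inputTokens, outputTokens) {
    return ((inputTokens + outputTokens) * RATE_PER_MILLION_USD) / 1_000_000;
}

// Example: a 2,000-token prompt with a 500-token reply
// costs 2,500 * $0.25 / 1,000,000 = $0.000625.
```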
How do I access the Mistral 7B API?

You can access the Mistral 7B API with just a few lines of JavaScript — no API keys, no backend, and no configuration required. Include the Puter.js library in your project and start making calls right away. For more details, check out our documentation.

Does the Mistral 7B API work with React / Vue / Vanilla JS / Node / etc.?

Yes — the Mistral 7B API works with any JavaScript framework, Node.js, or plain HTML through Puter.js. Just include the library and start building. See the documentation for more details.

Get started with Puter.js

Add Mistral 7B to your app without worrying about API keys or setup.

Read the Docs · View Tutorials