deepseek/deepseek-chat
Model Card
DeepSeek Chat is the general-purpose conversational alias that points to the latest DeepSeek V3 chat model, a 671B parameter Mixture-of-Experts LLM optimized for everyday conversations, coding assistance, and general tasks. It supports 128K context and provides fast, direct responses without explicit reasoning chains.
Context Window: 128K tokens
Max Output: 8K tokens
Input Cost: $0.56 per million tokens
Output Cost: $1.68 per million tokens
Modalities: text input, text output
Tool Use: Yes
Knowledge Cutoff: Jul 2024
Release Date: Dec 26, 2024
API Usage Example
Add DeepSeek Chat to your app with just a few lines of code.
No API keys, no backend, no configuration required.
Using npm:

// npm install @heyputer/puter.js
import { puter } from '@heyputer/puter.js';

puter.ai.chat("Explain quantum computing in simple terms", {
    model: "deepseek/deepseek-chat"
}).then(response => {
    document.body.innerHTML = response.message.content;
});

Or directly in the browser, with no build step:

<html>
<body>
    <script src="https://js.puter.com/v2/"></script>
    <script>
        puter.ai.chat("Explain quantum computing in simple terms", {
            model: "deepseek/deepseek-chat"
        }).then(response => {
            document.body.innerHTML = response.message.content;
        });
    </script>
</body>
</html>
More Models from DeepSeek
DeepSeek: DeepSeek V3.2
DeepSeek V3.2 is the December 2025 flagship model featuring DeepSeek Sparse Attention for efficiency...
DeepSeek: DeepSeek V3.2 Speciale
DeepSeek V3.2-Speciale is a high-compute variant designed exclusively for maximum reasoning accuracy...
DeepSeek: DeepSeek V3.2 Exp
DeepSeek V3.2-Exp is the September 2025 experimental predecessor to V3.2, introducing DeepSeek Spars...
Frequently Asked Questions
What is the DeepSeek Chat API?
The DeepSeek Chat API gives you access to DeepSeek's chat model through Puter.js. With just a few lines of JavaScript, you can integrate DeepSeek Chat into any web app or Node.js project: no API keys, no backend, and no configuration required.
Who created DeepSeek Chat?
DeepSeek Chat was created by DeepSeek and released on Dec 26, 2024.
What is the context window for DeepSeek Chat?
DeepSeek Chat supports a context window of 128K tokens. For reference, that is roughly equivalent to 256 pages of text.
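As a rough client-side sanity check, you can estimate token counts with the common approximation of about four characters per token. Actual counts depend on the model's tokenizer, so treat this helper as a heuristic, not an exact measure:

```javascript
// Rough check that a prompt fits DeepSeek Chat's 128K-token window.
// Uses the common ~4 characters-per-token approximation; real token
// counts depend on the model's tokenizer.
const CONTEXT_WINDOW = 128 * 1024; // 131,072 tokens

function estimateTokens(text) {
    return Math.ceil(text.length / 4);
}

// Reserve room for the reply (up to 8K output tokens).
function fitsContext(prompt, reservedForOutput = 8 * 1024) {
    return estimateTokens(prompt) + reservedForOutput <= CONTEXT_WINDOW;
}
```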
What is the maximum output length?
DeepSeek Chat can generate up to 8K tokens in a single response.
What is the knowledge cutoff date?
DeepSeek Chat has a knowledge cutoff date of Jul 2024. This means the model was trained on data available up to that date.
What modalities does DeepSeek Chat support?
DeepSeek Chat accepts text input and produces text output.
Does DeepSeek Chat support tool use?
Yes, DeepSeek Chat supports tool use (function calling), allowing it to interact with external tools, APIs, and data sources as part of its response flow.
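A minimal sketch of what a tool-use call can look like, assuming the OpenAI-style function schema for tool definitions. The `get_weather` tool below is hypothetical, and the exact shape of `tool_calls` on the response should be confirmed against the Puter.js documentation:

```javascript
// Sketch of tool use (function calling) with puter.ai.chat.
// The tools array follows the OpenAI-style function schema;
// get_weather is a hypothetical example tool.
const tools = [{
    type: "function",
    function: {
        name: "get_weather",
        description: "Get the current weather for a city",
        parameters: {
            type: "object",
            properties: {
                city: { type: "string", description: "City name" }
            },
            required: ["city"]
        }
    }
}];

async function askWithTools(prompt) {
    const response = await puter.ai.chat(prompt, {
        model: "deepseek/deepseek-chat",
        tools
    });
    // If the model decided to call a tool, the calls appear on the message.
    if (response.message.tool_calls?.length) {
        return response.message.tool_calls.map(call => ({
            name: call.function.name,
            args: JSON.parse(call.function.arguments)
        }));
    }
    return response.message.content;
}
```

When a tool call comes back, your app executes the named function locally and can send the result back to the model in a follow-up message.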
How much does the DeepSeek Chat API cost?

| | Price per 1M tokens |
|---|---|
| Input | $0.56 |
| Output | $1.68 |
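For budgeting, these rates translate into a simple per-request estimate. The helper below is illustrative only, not part of Puter.js:

```javascript
// Back-of-envelope cost estimate from the pricing table above.
// Rates are per million tokens: $0.56 input, $1.68 output.
function estimateCostUSD(inputTokens, outputTokens) {
    const INPUT_PER_M = 0.56;
    const OUTPUT_PER_M = 1.68;
    return (inputTokens / 1e6) * INPUT_PER_M
         + (outputTokens / 1e6) * OUTPUT_PER_M;
}

// e.g. a 10,000-token prompt with a 1,000-token reply:
// 0.01 * $0.56 + 0.001 * $1.68 = $0.0056 + $0.00168 = $0.00728
```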
How do I get started with the DeepSeek Chat API?
You can access the DeepSeek Chat API with just a few lines of JavaScript: no API keys, no backend, and no configuration required. Include the Puter.js library in your project and start making calls right away. For more details, check out our documentation.
Does the DeepSeek Chat API work with my framework?
Yes. The DeepSeek Chat API works with any JavaScript framework, Node.js, or plain HTML through Puter.js. Just include the library and start building. See the documentation for more details.
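Chat UIs usually want incremental output rather than a single blocking reply. A minimal sketch, assuming Puter.js's `stream: true` option where streamed parts expose a `text` field (the `output` element in the usage comment is hypothetical):

```javascript
// Sketch of a streaming call with puter.ai.chat. Assumes the
// stream: true option returns an async iterable whose parts
// carry a text field as tokens arrive.
async function streamReply(prompt, onChunk) {
    const response = await puter.ai.chat(prompt, {
        model: "deepseek/deepseek-chat",
        stream: true
    });
    for await (const part of response) {
        if (part?.text) onChunk(part.text);
    }
}

// Usage in a page (output is a hypothetical DOM element):
// streamReply("Tell me a story", t => output.textContent += t);
```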
Get started with Puter.js
Add DeepSeek Chat to your app without worrying about API keys or setup.
Read the Docs · View Tutorials