deepseek/deepseek-chat
Model Card
DeepSeek Chat is the general-purpose conversational alias that points to the latest DeepSeek V3 chat model, a 671B-parameter Mixture-of-Experts LLM optimized for everyday conversation, coding assistance, and general tasks. It supports a 128K-token context window and returns fast, direct responses without explicit reasoning chains.
Context Window: 128K tokens
Max Output: 8,000 tokens
Input Cost: $0.56 per million tokens
Output Cost: $1.68 per million tokens
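A rough per-request cost follows directly from these rates. The sketch below is illustrative only: the token counts are hypothetical values you supply yourself, and actual metering and billing are handled by Puter.

// Rough cost estimate from the published per-million-token rates.
// Token counts here are hypothetical inputs for illustration.
const INPUT_RATE = 0.56 / 1_000_000;   // dollars per input token
const OUTPUT_RATE = 1.68 / 1_000_000;  // dollars per output token

function estimateCost(inputTokens, outputTokens) {
  return inputTokens * INPUT_RATE + outputTokens * OUTPUT_RATE;
}

// Example: a 2,000-token prompt with a 500-token reply
console.log(estimateCost(2000, 500).toFixed(6)); // ≈ $0.001960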
API Usage Example
Add DeepSeek Chat to your app with just a few lines of code.
No API keys, no backend, no configuration required.
<html>
<body>
  <script src="https://js.puter.com/v2/"></script>
  <script>
    // Ask DeepSeek Chat a question and render the reply on the page.
    puter.ai.chat("Explain quantum computing in simple terms", {
      model: "deepseek/deepseek-chat"
    }).then(response => {
      document.body.innerHTML = response.message.content;
    });
  </script>
</body>
</html>
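If you want to render the reply incrementally instead of waiting for the full response, Puter.js also offers a streaming mode. The variation below is a minimal sketch that assumes the stream: true option and per-chunk text fields described in the Puter.js documentation; verify the exact response shape against the current docs.

<html>
<body>
  <script src="https://js.puter.com/v2/"></script>
  <script>
    // Streaming variation: append each chunk of the reply as it arrives.
    // Assumes { stream: true } returns an async iterable of parts with a .text field.
    (async () => {
      const stream = await puter.ai.chat(
        "Explain quantum computing in simple terms",
        { model: "deepseek/deepseek-chat", stream: true }
      );
      for await (const part of stream) {
        if (part?.text) document.body.append(part.text);
      }
    })();
  </script>
</body>
</html>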
Get started with Puter.js
Add DeepSeek Chat to your app without worrying about API keys or setup.