Alibaba: Tongyi DeepResearch 30B A3B
alibaba/tongyi-deepresearch-30b-a3b
Access Tongyi DeepResearch 30B A3B from Alibaba using the Puter.js AI API.
Get Started

// npm install @heyputer/puter.js
import { puter } from '@heyputer/puter.js';

puter.ai.chat("Explain quantum computing in simple terms", {
    model: "alibaba/tongyi-deepresearch-30b-a3b"
}).then(response => {
    document.body.innerHTML = response.message.content;
});
<!DOCTYPE html>
<html>
<body>
    <script src="https://js.puter.com/v2/"></script>
    <script>
        puter.ai.chat("Explain quantum computing in simple terms", {
            model: "alibaba/tongyi-deepresearch-30b-a3b"
        }).then(response => {
            document.body.innerHTML = response.message.content;
        });
    </script>
</body>
</html>
# pip install openai
from openai import OpenAI

client = OpenAI(
    base_url="https://api.puter.com/puterai/openai/v1/",
    api_key="YOUR_PUTER_AUTH_TOKEN",
)

response = client.chat.completions.create(
    model="alibaba/tongyi-deepresearch-30b-a3b",
    messages=[
        {"role": "user", "content": "Explain quantum computing in simple terms"}
    ],
)

print(response.choices[0].message.content)
curl https://api.puter.com/puterai/openai/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer YOUR_PUTER_AUTH_TOKEN" \
  -d '{
    "model": "alibaba/tongyi-deepresearch-30b-a3b",
    "messages": [
      {"role": "user", "content": "Explain quantum computing in simple terms"}
    ]
  }'
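The same OpenAI-compatible endpoint can also be called from JavaScript with `fetch`. A minimal sketch: the `buildChatBody` helper below is hypothetical, but the URL, headers, and payload shape mirror the cURL example above.

```javascript
// Hypothetical helper that builds the same JSON payload as the cURL example.
function buildChatBody(model, userText) {
  return JSON.stringify({
    model,
    messages: [{ role: "user", content: userText }],
  });
}

// Usage (requires a real Puter auth token, so not run here):
// fetch("https://api.puter.com/puterai/openai/v1/chat/completions", {
//   method: "POST",
//   headers: {
//     "Content-Type": "application/json",
//     "Authorization": "Bearer YOUR_PUTER_AUTH_TOKEN",
//   },
//   body: buildChatBody("alibaba/tongyi-deepresearch-30b-a3b",
//                       "Explain quantum computing in simple terms"),
// }).then(r => r.json())
//   .then(data => console.log(data.choices[0].message.content));
```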
Model Card
Tongyi DeepResearch 30B A3B is an agentic large language model from Alibaba's Tongyi Lab, purpose-built for long-horizon, multi-step information-seeking and web research tasks. It uses a Mixture-of-Experts architecture with 30.5B total parameters but only 3.3B activated per token, keeping inference costs low.
The model achieves state-of-the-art results across agentic research benchmarks, scoring 32.9 on Humanity's Last Exam, 43.4 on BrowseComp, 70.9 on GAIA, 75.0 on xbench-DeepSearch, and 90.6 on FRAMES — outperforming OpenAI o3 and DeepSeek-V3.1 on most of these tasks.
It supports a 131K-token context window and two inference modes: a standard ReAct mode and a heavier iterative research mode for maximum performance. It is best suited for developers building autonomous research agents, deep fact-finding pipelines, or complex multi-source synthesis workflows, especially where cost efficiency matters.
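Long-horizon research answers can take a while to generate, so streaming the response is often preferable. A sketch under assumptions: Puter.js accepts a `stream: true` option and yields parts as an async iterable, and each streamed part is assumed to carry a `text` field. Only the `collectParts` accumulator is concrete here.

```javascript
// Accumulate streamed chunks into one string. Works with any async iterable
// of { text } parts; the per-chunk `text` field is an assumption about the
// Puter.js streaming interface.
async function collectParts(parts) {
  let answer = "";
  for await (const part of parts) {
    if (part && part.text) answer += part.text; // append each chunk in order
  }
  return answer;
}

// Sketch of streaming a long research answer (not run here):
// const parts = await puter.ai.chat("Survey recent quantum error-correction results", {
//   model: "alibaba/tongyi-deepresearch-30b-a3b",
//   stream: true,
// });
// const answer = await collectParts(parts);
```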
Context Window: 131K tokens
Max Output: 131K tokens
Input Cost: $0.09 per million tokens
Output Cost: $0.45 per million tokens
Release Date: Mar 24, 2025
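At the listed rates, estimating per-request cost is simple arithmetic. A quick sketch; the example token counts are made up for illustration.

```javascript
// Rates from the pricing figures above (USD per million tokens).
const INPUT_RATE = 0.09;
const OUTPUT_RATE = 0.45;

function estimateCostUSD(inputTokens, outputTokens) {
  // Each side is billed independently, then summed.
  return (inputTokens * INPUT_RATE + outputTokens * OUTPUT_RATE) / 1e6;
}

// Example: a 10,000-token research prompt with a 4,000-token answer
console.log(estimateCostUSD(10_000, 4_000).toFixed(4)); // "0.0027"
```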
Model Playground
Try Tongyi DeepResearch 30B A3B instantly in your browser.
This playground uses the Puter.js AI API — no API keys or setup required.
Frequently Asked Questions
How do I access Tongyi DeepResearch 30B A3B?
You can access Tongyi DeepResearch 30B A3B by Alibaba through the Puter.js AI API. Include the library in your web app or Node.js project and start making calls with just a few lines of JavaScript, with no backend and no configuration required. You can also use it with Python or cURL via Puter's OpenAI-compatible API.
Is Tongyi DeepResearch 30B A3B free to use?
Yes, if you access it through Puter.js. With the User-Pays Model, you can add Tongyi DeepResearch 30B A3B to your app at no cost to you: your users pay for their own AI usage directly, making it completely free for you as a developer.
How much does Tongyi DeepResearch 30B A3B cost?

| Token type | Price per 1M tokens |
|---|---|
| Input | $0.09 |
| Output | $0.45 |
Who created Tongyi DeepResearch 30B A3B?
Tongyi DeepResearch 30B A3B was created by Alibaba and released on Mar 24, 2025.
What context window does Tongyi DeepResearch 30B A3B support?
Tongyi DeepResearch 30B A3B supports a context window of 131K tokens. For reference, that is roughly equivalent to 262 pages of text.
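The page figure is a rough conversion. One common rule of thumb, used as an assumption here rather than an official metric, is about 500 tokens per page of text:

```javascript
// Back-of-envelope behind the "262 pages" figure.
const contextTokens = 131_000;
const tokensPerPage = 500; // assumption: rough density of a typical text page
console.log(Math.round(contextTokens / tokensPerPage)); // 262
```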
How long can its responses be?
Tongyi DeepResearch 30B A3B can generate up to 131K tokens in a single response.
Can I use Tongyi DeepResearch 30B A3B with JavaScript frameworks?
Yes: the Tongyi DeepResearch 30B A3B API works with any JavaScript framework, Node.js, or plain HTML through Puter.js. Just include the library and start building. See the documentation for more details.
Get started with Puter.js
Add Tongyi DeepResearch 30B A3B to your app without worrying about API keys or setup.
Read the Docs · View Tutorials