Free, Unlimited Alibaba AI API
This tutorial will show you how to use Puter.js to access Alibaba's Tongyi DeepResearch model for free, without API keys, a backend, or any server-side setup. Tongyi DeepResearch is an agentic large language model from Alibaba's Tongyi Lab, purpose-built for long-horizon, multi-step information-seeking and web research tasks.
Puter is the pioneer of the "User-Pays" model, which allows developers to incorporate AI capabilities into their applications while each user covers their own usage costs. This model enables developers to offer advanced AI capabilities to users at no cost to themselves, without any API keys or server-side setup.
Getting Started
To use Puter.js, import our NPM library in your project:
// npm install @heyputer/puter.js
import { puter } from '@heyputer/puter.js';
Alternatively, if you are working directly with HTML, add our script via CDN to the <head> or <body> section of your page:
<script src="https://js.puter.com/v2/"></script>
Nothing else is required to start using Puter.js for free access to Alibaba's Tongyi DeepResearch model.
Example 1: Deep research with Tongyi DeepResearch
To run a deep research query using Tongyi DeepResearch, use the puter.ai.chat() function:
puter.ai.chat(
    "Research the current state of solid-state battery technology. Cover the leading companies, key technical breakthroughs from the past two years, and the main commercialization barriers.",
    { model: "alibaba/tongyi-deepresearch-30b-a3b" }
).then(response => {
    puter.print(response);
});
Full code example:
<html>
<body>
    <script src="https://js.puter.com/v2/"></script>
    <script>
        puter.ai.chat(
            "Research the current state of solid-state battery technology. Cover the leading companies, key technical breakthroughs from the past two years, and the main commercialization barriers.",
            { model: "alibaba/tongyi-deepresearch-30b-a3b" }
        ).then(response => {
            puter.print(response);
        });
    </script>
</body>
</html>
Tongyi DeepResearch shines at exactly this kind of open-ended fact-finding: it scores 43.4 on BrowseComp and 90.6 on FRAMES, outperforming OpenAI o3 and DeepSeek-V3.1 on most agentic research benchmarks.
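If you prefer async/await, the promise-based call above can be wrapped with basic error handling so a network failure doesn't go unnoticed. A minimal sketch; the `runResearch` name and the injectable `chat` parameter are our own additions for illustration, not part of Puter.js:

```javascript
// Run a deep-research prompt and return the response, or null on failure.
// `chat` defaults to puter.ai.chat so this works as-is in the browser,
// but it can be swapped out (e.g. for a mock) when testing the wrapper.
async function runResearch(prompt, chat = (p, opts) => puter.ai.chat(p, opts)) {
    try {
        return await chat(prompt, {
            model: "alibaba/tongyi-deepresearch-30b-a3b"
        });
    } catch (err) {
        // Network hiccups or model-side errors surface here.
        console.error("Research request failed:", err);
        return null;
    }
}
```

You would then call it as `runResearch("Summarize recent solid-state battery milestones").then(puter.print);`, with the null return letting you show a fallback message instead of a blank page.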
Example 2: Multi-step analysis and synthesis
Tongyi DeepResearch is built for long-horizon reasoning across multiple sources. Use it to break down a complex question into sub-questions and synthesize a structured answer:
<html>
<body>
    <script src="https://js.puter.com/v2/"></script>
    <script>
        puter.ai.chat(
            "A SaaS startup is deciding between AWS, GCP, and a bare-metal provider for hosting an AI inference workload. Walk through the key cost drivers, latency trade-offs, and operational considerations for each option, and recommend a decision framework.",
            { model: "alibaba/tongyi-deepresearch-30b-a3b" }
        ).then(response => {
            puter.print(response);
        });
    </script>
</body>
</html>
This kind of multi-source synthesis is exactly what the model's Mixture-of-Experts architecture (30.5B total parameters with only 3.3B activated per token) is optimized for — strong reasoning at low inference cost.
Example 3: Streaming long research responses
Research-style answers tend to be lengthy, so streaming the response back to the user keeps the UI responsive instead of waiting for the full reply:
<html>
<body>
    <div id="response"></div>
    <script src="https://js.puter.com/v2/"></script>
    <script>
        async function streamResponse() {
            const outputDiv = document.getElementById('response');
            const response = await puter.ai.chat(
                "Compare the AI strategies of the major US, Chinese, and European tech companies. Identify their differentiators, regulatory exposure, and long-term moats.",
                { model: "alibaba/tongyi-deepresearch-30b-a3b", stream: true }
            );
            for await (const part of response) {
                if (part?.text) {
                    outputDiv.innerHTML += part.text;
                }
            }
        }

        streamResponse();
    </script>
</body>
</html>
Streaming makes a real difference for long-form research output: users see progress immediately, and you can render the answer as it arrives.
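For longer sessions you often want the complete text as well as the live chunks, for example to let the user save or copy the finished answer. A small helper can accumulate the stream while still forwarding each chunk to the UI. This is a sketch under one assumption from the example above, namely that the stream yields parts shaped like `{ text }`; the `collectStream` name and its `onChunk` callback are our own, not part of Puter.js:

```javascript
// Drain a streamed response (an async iterable of { text } parts, as in
// the { stream: true } example above) into a single string, invoking
// onChunk for each piece so the UI can update as text arrives.
async function collectStream(stream, onChunk = () => {}) {
    let full = '';
    for await (const part of stream) {
        if (part?.text) {
            full += part.text;
            onChunk(part.text);
        }
    }
    return full;
}
```

In the example above you would write `const fullText = await collectStream(response, chunk => outputDiv.innerHTML += chunk);` and then reuse `fullText` however you like once the stream ends.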
List of supported models
The following Alibaba model is supported by Puter.js under the alibaba/ namespace:
alibaba/tongyi-deepresearch-30b-a3b
Alibaba is also the creator of the Qwen family of models, which is published separately under the qwen/ namespace. If you're looking for general-purpose chat, coding, or vision models from Alibaba, check out the Free, Unlimited Qwen API tutorial — it covers the full Qwen lineup including Qwen3.6 Plus, Qwen3.6 Max Preview, Qwen3 Coder, and the Qwen-VL vision models.
That's it! You now have free access to Alibaba's Tongyi DeepResearch model using Puter.js. This makes it simple to build research agents, fact-finding tools, and multi-step analysis workflows in the browser — no API keys, backend infrastructure, or billing setup required.
Related
- Free, Unlimited Qwen API
- Free, Unlimited DeepSeek API
- Free, Unlimited Kimi K2.6 API
- Free, Unlimited Moonshot AI API
- Free, Unlimited Z.AI GLM API
- Free, Unlimited MiniMax API
- Free, Unlimited ByteDance Seed API
- Free, Unlimited Perplexity AI API
- Free, Unlimited OpenAI API
- Free, Unlimited Claude API
- Free, Unlimited AI API
- Free, Unlimited OpenRouter API