Tencent AI

Tencent AI API

Access Tencent AI instantly with Puter.js, and add AI to any app in a few lines of code, with no backend and no API keys.

// npm install @heyputer/puter.js
import { puter } from '@heyputer/puter.js';

puter.ai.chat("Explain AI like I'm five!", {
    model: "tencent/hunyuan-a13b-instruct"
}).then(response => {
    console.log(response);
});

<!-- Alternatively, load Puter.js directly in the browser: -->
<html>
<body>
    <script src="https://js.puter.com/v2/"></script>
    <script>
        puter.ai.chat("Explain AI like I'm five!", {
            model: "tencent/hunyuan-a13b-instruct"
        }).then(response => {
            console.log(response);
        });
    </script>
</body>
</html>
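When the model streams its reply, you typically want to accumulate (or print) the chunks as they arrive. The sketch below assumes the streaming shape described in the Puter.js docs, where passing `stream: true` to `puter.ai.chat` yields an async iterable of parts with a `text` field; verify the exact shape against the current Puter.js reference before relying on it.

```javascript
// Collect a streamed puter.ai.chat response into one string.
// Works with any async iterable of { text } parts.
async function collectStream(stream) {
    let text = "";
    for await (const part of stream) {
        // In a real app you would render part.text incrementally here.
        if (part?.text) text += part.text;
    }
    return text;
}

// Usage (in a page that loads https://js.puter.com/v2/):
// const stream = await puter.ai.chat("Explain AI like I'm five!", {
//     model: "tencent/hunyuan-a13b-instruct",
//     stream: true
// });
// console.log(await collectStream(stream));
```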

List of Tencent AI Models

Chat

Hy 3 Preview

tencent/hy3-preview

Tencent Hy3 is a 295B-parameter Mixture-of-Experts reasoning model developed by Tencent's Hunyuan team, with only 21B parameters active per query. It supports a 256K-token context window and configurable reasoning levels (disabled, low, high), letting you trade off latency and depth per request. Hy3 is particularly strong on coding and agentic tasks. It scores 74.4% on SWE-bench Verified for real-world bug fixing and 67.1% on BrowseComp for complex web research. Its MoE architecture delivers competitive performance against much larger models — matching Kimi-K2.5 (1T+ parameters) on agent benchmarks at a fraction of the compute cost. Best suited for developers building agentic workflows, code generation pipelines, and multi-step reasoning applications where cost-efficiency matters.
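Since Hy3's reasoning level is configurable per request, a small helper can validate the level and build the request options in one place. Note the `reasoning` option name below is an assumption for illustration; check the Puter.js documentation for the exact parameter Puter exposes for this model.

```javascript
// Build puter.ai.chat options for Hy3 with a validated reasoning level.
// NOTE: "reasoning" is a hypothetical option name, not a confirmed
// Puter.js parameter; only the level values come from the model card.
function hy3Options(reasoning = "low") {
    const allowed = ["disabled", "low", "high"];
    if (!allowed.includes(reasoning)) {
        throw new Error(`reasoning must be one of: ${allowed.join(", ")}`);
    }
    return { model: "tencent/hy3-preview", reasoning };
}

// Usage (in a page that loads https://js.puter.com/v2/):
// puter.ai.chat("Plan a refactor of this module", hy3Options("high"))
//     .then(response => console.log(response));
```

Keeping the level selection in a helper makes the latency/depth trade-off an explicit per-request decision rather than a hardcoded default.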

Chat

Hunyuan A13B Instruct

tencent/hunyuan-a13b-instruct

Hunyuan A13B Instruct is an open-source large language model from Tencent built on a fine-grained Mixture-of-Experts (MoE) architecture, with 80B total parameters and 13B active during inference. It natively supports a 256K-token context window. It performs competitively with OpenAI o1 and DeepSeek R1 across math, science, and reasoning benchmarks, scoring 87.3 on AIME 2024, 89.1 on BBH, and 84.7 on ZebraLogic. Hunyuan A13B particularly excels at agentic tasks and tool use, leading on benchmarks like BFCL-v3 (78.3) and ComplexFuncBench (61.2). It's a strong choice for developers building agent workflows, long-context applications, or cost-sensitive reasoning pipelines.
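Because Hunyuan A13B's strength is tool use, a typical pattern is to pass an OpenAI-style `tools` array and dispatch whichever tool call comes back. The sketch below assumes the OpenAI-style tool schema; the `get_weather` tool is a hypothetical example, and you should verify the `tools` option and the response's tool-call shape against the Puter.js reference.

```javascript
// Hypothetical example tool, described in the OpenAI-style schema
// commonly used for function calling.
const tools = [{
    type: "function",
    function: {
        name: "get_weather",
        description: "Get the current weather for a city",
        parameters: {
            type: "object",
            properties: { city: { type: "string" } },
            required: ["city"]
        }
    }
}];

// Run whichever tool the model asked for, using local implementations.
// `call` is assumed to have the shape { function: { name, arguments } }.
function runToolCall(call, impls) {
    const args = JSON.parse(call.function.arguments || "{}");
    const impl = impls[call.function.name];
    if (!impl) throw new Error(`unknown tool: ${call.function.name}`);
    return impl(args);
}

// Usage (in a page that loads https://js.puter.com/v2/):
// const resp = await puter.ai.chat("What's the weather in Paris?", {
//     model: "tencent/hunyuan-a13b-instruct",
//     tools
// });
// // If the model returned tool calls, execute them and send results back.
```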

Frequently Asked Questions

What is this Tencent AI API about?

The Tencent AI API gives you access to Tencent's AI chat models. Through Puter.js, you can start using them instantly with zero setup or configuration.

Which Tencent AI models can I use?

Puter.js supports a variety of Tencent AI models, including Hy 3 Preview and Hunyuan A13B Instruct. Find all AI models supported by Puter.js in the AI model list.

How much does it cost?

With the User-Pays model, users cover their own AI costs through their Puter account. This means you can build apps without worrying about infrastructure expenses.

What is Puter.js?

Puter.js is a JavaScript library that provides access to AI, storage, and other cloud services directly from a single API. It handles authentication, infrastructure, and scaling so you can focus on building your app.

Does this work with React / Vue / Vanilla JS / Node / etc.?

Yes — the Tencent AI API through Puter.js works with any JavaScript framework, Node.js, or plain HTML. Just include the library and start building. See the documentation for more details.