Xiaomi MiMo API
Access Xiaomi MiMo instantly with Puter.js and add AI to any app in a few lines of code, with no backend and no API keys.
// npm install @heyputer/puter.js
import { puter } from '@heyputer/puter.js';

puter.ai.chat("Explain AI like I'm five!", {
    model: "xiaomi/mimo-v2-flash"
}).then(response => {
    console.log(response);
});
<html>
<body>
    <script src="https://js.puter.com/v2/"></script>
    <script>
        puter.ai.chat("Explain AI like I'm five!", {
            model: "xiaomi/mimo-v2-flash"
        }).then(response => {
            console.log(response);
        });
    </script>
</body>
</html>
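For longer responses you can also stream the output as it is generated by passing `stream: true` to `puter.ai.chat`. The sketch below assumes the global `puter` object provided by the script tag above; the `streamMiMo` helper name is illustrative, not part of the library.

```javascript
// Sketch: stream a MiMo response chunk-by-chunk as it arrives.
// Assumes the global `puter` object from https://js.puter.com/v2/;
// `streamMiMo` is a hypothetical helper name for this example.
async function streamMiMo(prompt, model = "xiaomi/mimo-v2-flash") {
    const response = await puter.ai.chat(prompt, { model, stream: true });
    let full = "";
    for await (const part of response) {
        if (part?.text) {
            full += part.text;       // accumulate the full reply
            console.log(part.text);  // print each chunk as it streams in
        }
    }
    return full;
}
```

Streaming keeps the UI responsive: you can render each chunk as it arrives instead of waiting for the complete reply.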
List of Xiaomi MiMo Models
MiMo-V2-Omni
xiaomi/mimo-v2-omni
MiMo V2 Omni is Xiaomi's omni-modal foundation model that natively processes text, image, video, and audio within a unified architecture, combining multimodal perception with agentic capabilities like visual grounding, multi-step planning, and tool use. It supports over 10 hours of continuous audio understanding and a 256K context window. It outperformed Gemini 3 Pro and GPT-5.2 on several benchmarks.
MiMo-V2-Pro
xiaomi/mimo-v2-pro
MiMo V2 Pro is Xiaomi's flagship text-only reasoning model built for the 'agent era,' featuring over 1T total parameters (42B active) with a 1M-token context window, deeply optimized for agentic workflows like coding, tool calling, and task orchestration. Previously tested anonymously as 'Hunter Alpha' on OpenRouter where it topped daily API call charts, it ranks 8th globally and 2nd among Chinese LLMs on the Artificial Analysis Intelligence Index. Its agent performance approaches Claude Opus 4.6 at roughly one-fifth the cost.
MiMo-V2-Flash
xiaomi/mimo-v2-flash
MiMo-V2-Flash is Xiaomi's open-source Mixture-of-Experts language model with 309B total parameters (15B active), designed for high-speed reasoning, coding, and agentic workflows. It uses a hybrid attention architecture with Multi-Token Prediction to achieve up to 150 tokens/second inference while keeping costs extremely low. The model excels at software engineering benchmarks and supports a 256K context window.
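Since all three models share the same `puter.ai.chat` interface and differ only in their model id, switching between them is a one-string change. A small sketch (the `MIMO_MODELS` map and `mimoModel` helper are hypothetical names; the ids are the ones listed above):

```javascript
// Hypothetical lookup of the MiMo model ids listed above.
const MIMO_MODELS = {
    omni:  "xiaomi/mimo-v2-omni",  // multimodal: text, image, video, audio
    pro:   "xiaomi/mimo-v2-pro",   // flagship agentic reasoning, 1M-token context
    flash: "xiaomi/mimo-v2-flash", // fast, low-cost MoE, 256K context
};

// Pick a model id by need, falling back to the cheapest option.
function mimoModel(need) {
    return MIMO_MODELS[need] ?? MIMO_MODELS.flash;
}
```

The chosen id is then passed as the `model` option, e.g. `puter.ai.chat(prompt, { model: mimoModel("pro") })`.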
Frequently Asked Questions
What is the Xiaomi MiMo API?
The Xiaomi MiMo API gives you access to Xiaomi's MiMo models for AI chat. Through Puter.js, you can start using Xiaomi MiMo models instantly, with zero setup or configuration.
Which Xiaomi MiMo models does Puter.js support?
Puter.js supports a variety of Xiaomi MiMo models, including MiMo-V2-Omni, MiMo-V2-Pro, and MiMo-V2-Flash. You can find every AI model supported by Puter.js in the AI model list.
Who pays for the AI usage?
With the User-Pays model, each user covers their own AI costs through their Puter account. This means you can build and ship apps without worrying about infrastructure expenses.
What is Puter.js?
Puter.js is a JavaScript library that provides access to AI, storage, and other cloud services through a single API. It handles authentication, infrastructure, and scaling so you can focus on building your app.
Can I use the Xiaomi MiMo API with my existing JavaScript setup?
Yes. The Xiaomi MiMo API through Puter.js works with any JavaScript framework, with Node.js, or in plain HTML. Just include the library and start building; see the documentation for more details.