Arcee AI API
Access Arcee AI instantly with Puter.js and add AI to any app in a few lines of code, with no backend and no API keys.
Using npm:

// npm install @heyputer/puter.js
import { puter } from '@heyputer/puter.js';

// Ask an Arcee AI model a question; the promise resolves with the model's reply
puter.ai.chat("Explain AI like I'm five!", {
    model: "arcee-ai/coder-large"
}).then(response => {
    console.log(response);
});
Or load Puter.js directly in the browser with a script tag:

<html>
<body>
    <script src="https://js.puter.com/v2/"></script>
    <script>
        // Ask an Arcee AI model a question; the promise resolves with the model's reply
        puter.ai.chat("Explain AI like I'm five!", {
            model: "arcee-ai/coder-large"
        }).then(response => {
            console.log(response);
        });
    </script>
</body>
</html>
List of Arcee AI Models
Arcee AI: Trinity Large Preview
arcee-ai/trinity-large-preview:free
Trinity Large Preview is a 400B-parameter open-weight sparse Mixture-of-Experts model from Arcee AI with 13B active parameters per token, trained on 17+ trillion tokens. It excels at creative writing, multi-turn conversations, tool use, and agentic coding tasks with support for up to 128K context.
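Because Trinity Large Preview is highlighted for multi-turn conversation, here is a minimal sketch of continuing a dialogue by passing a message history to puter.ai.chat (the array-of-messages form is one of its documented call styles; the sample turns shown are illustrative, and the reply is assumed to be available on response.message.content as in other Puter.js chat examples):

// Pass prior turns as an array of messages to continue a conversation
puter.ai.chat([
    { role: "user", content: "Write the opening line of a mystery novel." },
    { role: "assistant", content: "The lighthouse had been dark for three nights." },
    { role: "user", content: "Continue the story in two sentences." }
], { model: "arcee-ai/trinity-large-preview:free" }).then(response => {
    console.log(response.message.content);
});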
Arcee AI: Trinity Mini
arcee-ai/trinity-mini
Arcee Trinity Mini is a 26B parameter sparse mixture-of-experts (MoE) model with only 3B active parameters per token, trained end-to-end in the U.S. on 10T tokens. It features 128 experts with 8 active per token, a 128k context window, and is optimized for multi-turn reasoning, function calling, and agent workflows. Released under Apache 2.0, it offers strong performance at extremely cost-efficient pricing.
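The function calling mentioned above can be exercised through the tools option of puter.ai.chat, which takes OpenAI-style tool definitions. This is a minimal sketch: the get_weather tool is a made-up example, not part of any API, and the code assumes it runs in a module where top-level await is available.

// Describe a tool the model may call (get_weather is a hypothetical example)
const tools = [{
    type: "function",
    function: {
        name: "get_weather",
        description: "Get the current weather for a city",
        parameters: {
            type: "object",
            properties: { city: { type: "string" } },
            required: ["city"]
        }
    }
}];

const response = await puter.ai.chat("What's the weather in Paris?", {
    model: "arcee-ai/trinity-mini",
    tools
});

// If the model decided to call the tool, the call appears on the response message
console.log(response.message?.tool_calls ?? response);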
Arcee AI: Virtuoso Large
arcee-ai/virtuoso-large
Arcee Virtuoso Large is a 72B parameter general-purpose model based on Qwen 2.5-72B, trained using DistillKit and MergeKit with DeepSeek R1 distillation techniques. It retains a 128k context window for ingesting large documents, codebases, or financial filings, excelling at cross-domain reasoning, creative writing, and enterprise QA. The model serves as the fallback brain in Arcee Conductor pipelines when smaller SLMs flag low confidence.
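One way to use the long context for document ingestion is to read a file from Puter's cloud storage and pass its text as the prompt. A minimal sketch, assuming a report.txt file already exists in your Puter file system (the filename is just an example):

// Read a text file from Puter cloud storage (puter.fs.read returns a Blob)
const blob = await puter.fs.read("report.txt");
const text = await blob.text();

// Ask the long-context model to summarize the document
const summary = await puter.ai.chat(
    `Summarize the key points of this document:\n\n${text}`,
    { model: "arcee-ai/virtuoso-large" }
);
console.log(summary);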
Arcee AI: Spotlight
arcee-ai/spotlight
Arcee Spotlight is a 7B parameter vision-language model derived from Qwen 2.5-VL, fine-tuned for image-text grounding tasks like captioning, visual question-answering, and diagram analysis. It offers a 32k-128k token context window enabling multimodal conversations combining documents with images. The model matches or outperforms larger VLMs like LLaVA-1.6 13B while running efficiently on consumer GPUs.
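Since Spotlight is a vision-language model, an image can be passed alongside the prompt using the image-URL form of puter.ai.chat. A minimal sketch, with a placeholder image URL:

// puter.ai.chat(prompt, imageURL, testMode, options): pass an image URL with the prompt
puter.ai.chat(
    "What is shown in this image?",
    "https://assets.puter.site/doge.jpeg",   // example image URL
    false,                                   // testMode
    { model: "arcee-ai/spotlight" }
).then(response => {
    console.log(response);
});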
Arcee AI: Coder Large
arcee-ai/coder-large
Arcee Coder Large is a 32B parameter code-specialized model based on Qwen 2.5-Instruct, fine-tuned on GitHub, CodeSearchNet, and synthetic bug-fix data for code generation and debugging. It supports 30+ programming languages with a 32k context window and shows 5-8 point gains over CodeLlama-34B-Python on HumanEval benchmarks. The model excels at producing compilable code with structured explanations, making it ideal for both educational tooling and production copilot scenarios.
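For longer outputs such as generated code, the reply can be streamed chunk by chunk by setting stream: true, following the standard Puter.js streaming pattern. A minimal sketch:

// Stream the reply as it is generated instead of waiting for the full response
const stream = await puter.ai.chat(
    "Write a JavaScript function that checks whether a string is a palindrome.",
    { model: "arcee-ai/coder-large", stream: true }
);

for await (const part of stream) {
    if (part?.text) {
        console.log(part.text);   // print each chunk as it arrives
    }
}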
Arcee AI: Maestro Reasoning
arcee-ai/maestro-reasoning
Arcee Maestro Reasoning is a 32B parameter flagship analysis model derived from Qwen 2.5-32B, tuned with DPO and chain-of-thought reinforcement learning for step-by-step logical reasoning. It features a 128k context window and doubles pass-rates on MATH and GSM-8K benchmarks compared to its 7B predecessor. The model is designed for complex problem-solving, abstract reasoning, and scenario modeling with transparent reasoning traces suited for audit-focused industries.
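A step-by-step reasoning request looks like any other chat call, only with the reasoning model selected; the reasoning trace is returned as part of the reply text. A minimal sketch:

// Ask for explicit step-by-step working on a word problem
puter.ai.chat(
    "A train travels 180 km in 2 hours, then 120 km in 1.5 hours. What is its average speed? Show your reasoning step by step.",
    { model: "arcee-ai/maestro-reasoning" }
).then(response => {
    console.log(response);
});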
Frequently Asked Questions
What is the Arcee AI API?
The Arcee AI API gives you access to Arcee AI's chat models. Through Puter.js, you can start using Arcee AI models instantly with zero setup or configuration.
Which Arcee AI models does Puter.js support?
Puter.js supports a variety of Arcee AI models, including Arcee AI: Trinity Large Preview, Arcee AI: Trinity Mini, Arcee AI: Virtuoso Large, and more. You can find all AI models supported by Puter.js in the AI model list.
Who pays for the AI usage?
With the User-Pays model, users cover their own AI costs through their Puter account. This means you can build apps without worrying about infrastructure expenses.
What is Puter.js?
Puter.js is a JavaScript library that provides access to AI, storage, and other cloud services directly from a single API. It handles authentication, infrastructure, and scaling so you can focus on building your app.
Can I use the Arcee AI API with any JavaScript framework?
Yes. The Arcee AI API through Puter.js works with any JavaScript framework, with Node.js, or in plain HTML. Just include the library and start building. See the documentation for more details.