LFM2.5-1.2B-Thinking API

Access LFM2.5-1.2B-Thinking from Liquid AI using Puter.js AI API.

Model Card

LFM2.5-1.2B-Thinking is a reasoning-enhanced variant of Liquid AI's edge-optimized model. It uses chain-of-thought reasoning while requiring fewer output tokens than comparable thinking models, and it is designed for on-device deployment with fast CPU inference. It is well suited to agentic tasks, data extraction, and RAG, but not recommended for knowledge-intensive tasks or programming.

Context Window: N/A tokens
Max Output: N/A tokens
Input Cost: $0 per million tokens
Output Cost: $0 per million tokens
Release Date: Jan 5, 2026

API Usage Example

Add LFM2.5-1.2B-Thinking to your app with just a few lines of code.
No API keys, no backend, no configuration required.

Using npm:

// npm install @heyputer/puter.js
import { puter } from '@heyputer/puter.js';

puter.ai.chat("Explain quantum computing in simple terms", {
    model: "liquid/lfm-2.5-1.2b-thinking:free"
}).then(response => {
    document.body.innerHTML = response.message.content;
});

Or directly in HTML via the script tag:

<html>
<body>
    <script src="https://js.puter.com/v2/"></script>
    <script>
        puter.ai.chat("Explain quantum computing in simple terms", {
            model: "liquid/lfm-2.5-1.2b-thinking:free"
        }).then(response => {
            document.body.innerHTML = response.message.content;
        });
    </script>
</body>
</html>
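For longer replies you may prefer to stream the output as it is generated rather than waiting for the full message. The sketch below assumes Puter.js's streaming interface, where passing `stream: true` makes `puter.ai.chat` return an async iterable of chunks with a `text` field; treat those names as assumptions and check the Puter.js docs for your version. The helper itself is framework-agnostic:

```javascript
// Append each streamed chunk's text to a target (a DOM element, or any
// object with an append method). `response` is an async iterable of
// chunks shaped like { text: "..." }, as Puter.js yields in streaming mode.
async function streamToElement(response, el) {
    for await (const part of response) {
        if (part?.text) {
            el.append(part.text);
        }
    }
}

// Usage with Puter.js (hypothetical prompt; `stream: true` enables streaming):
// const response = await puter.ai.chat("Explain quantum computing in simple terms", {
//     model: "liquid/lfm-2.5-1.2b-thinking:free",
//     stream: true
// });
// await streamToElement(response, document.body);
```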

View full documentation →

Frequently Asked Questions

What is this LFM2.5-1.2B-Thinking API about?

The LFM2.5-1.2B-Thinking API gives you access to Liquid AI's chat model through Puter.js. With just a few lines of JavaScript, you can integrate LFM2.5-1.2B-Thinking into any web app or Node.js project — no API keys, no backend, and no configuration required.

Who created LFM2.5-1.2B-Thinking?

LFM2.5-1.2B-Thinking was created by Liquid AI and released on Jan 5, 2026.

How much does it cost?

The LFM2.5-1.2B-Thinking API is available through the User-Pays Model. As a developer, you can add the LFM2.5-1.2B-Thinking API to your app for free — your users pay for their own AI costs directly.

Price per 1M tokens: Input $0, Output $0

How do I access the LFM2.5-1.2B-Thinking API?

You can access the LFM2.5-1.2B-Thinking API with just a few lines of JavaScript — no API keys, no backend, and no configuration required. Include the Puter.js library in your project and start making calls right away. For more details, check out our documentation.

Does the LFM2.5-1.2B-Thinking API work with React / Vue / Vanilla JS / Node / etc.?

Yes — the LFM2.5-1.2B-Thinking API works with any JavaScript framework, Node.js, or plain HTML through Puter.js. Just include the library and start building. See the documentation for more details.
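Because the call is plain JavaScript, one pattern that works across React, Vue, and vanilla JS is to wrap it in a small framework-agnostic helper and hand the result to whatever state mechanism your UI uses. A minimal sketch (the injectable `chatFn` parameter is our own addition for testability, not part of Puter.js):

```javascript
// Framework-agnostic wrapper around the chat call. `chatFn` defaults to
// puter.ai.chat but can be swapped out (e.g. for a mock in unit tests).
async function askModel(prompt, chatFn = (p, o) => puter.ai.chat(p, o)) {
    const response = await chatFn(prompt, {
        model: "liquid/lfm-2.5-1.2b-thinking:free"
    });
    return response.message.content;
}

// React usage sketch:
//   const [answer, setAnswer] = useState('');
//   askModel("Explain quantum computing in simple terms").then(setAnswer);
// Vanilla JS usage sketch:
//   askModel("Explain quantum computing in simple terms")
//       .then(text => { document.body.textContent = text; });
```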

Get started with Puter.js

Add LFM2.5-1.2B-Thinking to your app without worrying about API keys or setup.

Read the Docs · View Tutorials