Liquid AI LFM 2.5 Is Now Available in Puter.js
Puter.js now supports LFM 2.5, Liquid AI's next-generation family of edge-optimized models designed to run real AI agents on-device.
What is LFM 2.5?
LFM 2.5 is Liquid AI's most capable release for edge AI deployment. Built on their hybrid architecture, which combines grouped query attention with short convolutional layers, these models deliver best-in-class results for their size while maintaining blazing-fast inference.
Two variants are available:
- LFM 2.5 Instruct: General-purpose model optimized for agentic tasks, data extraction, and RAG
- LFM 2.5 Thinking: Reasoning-enhanced variant that uses chain-of-thought while requiring fewer output tokens than comparable thinking models
Examples
Basic Chat
puter.ai.chat("Explain the benefits of edge computing for AI applications", {
    model: "liquid/lfm-2.5-1.2b-instruct:free"
}).then(response => puter.print(response));
Chain-of-Thought Reasoning
puter.ai.chat(
    "A store sells apples for $2 each and oranges for $3 each. If I buy 5 fruits and spend exactly $12, how many of each fruit did I buy?",
    { model: "liquid/lfm-2.5-1.2b-thinking:free" }
).then(response => puter.print(response));
Data Extraction
puter.ai.chat(`Extract the product name, price, and availability from this text:
"The new XPS 15 laptop is now available for $1,299. Currently in stock with free shipping."`,
    { model: "liquid/lfm-2.5-1.2b-instruct:free" }
).then(response => puter.print(response));
Get Started Now
Just add one script tag to your HTML:
<script src="https://js.puter.com/v2/"></script>
No API keys or account needed. Start building with LFM 2.5 immediately.
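Putting it all together, a minimal page might look like the sketch below. It follows the same pattern as the examples above; `puter.print` simply writes the model's reply into the page.

```html
<html>
<body>
    <!-- Load Puter.js -->
    <script src="https://js.puter.com/v2/"></script>
    <script>
        // Ask LFM 2.5 a question and print the reply to the page
        puter.ai.chat("What makes edge-optimized models useful?", {
            model: "liquid/lfm-2.5-1.2b-instruct:free"
        }).then(response => puter.print(response));
    </script>
</body>
</html>
```

Open the file in any browser and the response appears once the model replies.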
Learn more:
Free, Serverless AI and Cloud