OpenAI: GPT-5.2 Codex

openai/gpt-5.2-codex

Access GPT-5.2 Codex from OpenAI using Puter.js AI API.

Get Started
JavaScript (via npm):

// npm install @heyputer/puter.js
import { puter } from '@heyputer/puter.js';

puter.ai.chat("Explain quantum computing in simple terms", {
    model: "openai/gpt-5.2-codex"
}).then(response => {
    document.body.innerHTML = response.message.content;
});
HTML (script tag):

<html>
<body>
    <script src="https://js.puter.com/v2/"></script>
    <script>
        puter.ai.chat("Explain quantum computing in simple terms", {
            model: "openai/gpt-5.2-codex"
        }).then(response => {
            document.body.innerHTML = response.message.content;
        });
    </script>
</body>
</html>
Python (OpenAI-compatible API):

# pip install openai
from openai import OpenAI

client = OpenAI(
    base_url="https://api.puter.com/puterai/openai/v1/",
    api_key="YOUR_PUTER_AUTH_TOKEN",
)

response = client.chat.completions.create(
    model="openai/gpt-5.2-codex",
    messages=[
        {"role": "user", "content": "Explain quantum computing in simple terms"}
    ],
)

print(response.choices[0].message.content)
cURL:

curl https://api.puter.com/puterai/openai/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer YOUR_PUTER_AUTH_TOKEN" \
  -d '{
    "model": "openai/gpt-5.2-codex",
    "messages": [
      {"role": "user", "content": "Explain quantum computing in simple terms"}
    ]
  }'

Model Card

GPT-5.2 Codex is OpenAI's most advanced agentic coding model for professional software engineering and defensive cybersecurity. It achieves state-of-the-art results on SWE-Bench Pro and improves long-horizon work through context compaction.

Context Window: 128K tokens
Max Output: 128K tokens
Input Cost: $1.75 per million tokens
Output Cost: $14 per million tokens
Input Modalities: text, image, pdf
Tool Use: Yes
Knowledge Cutoff: Aug 31, 2025
Release Date: Dec 11, 2025
Output Speed: 111 tokens / sec
Latency: 7.39s (time to first token)

Model Playground

Try GPT-5.2 Codex instantly in your browser.
This playground uses the Puter.js AI API — no API keys or setup required.


Benchmarks

How GPT-5.2 Codex performs on standard evaluations.

Artificial Analysis Intelligence Index: 49.0 (better than 96% of tracked models)
Artificial Analysis Coding Index: 43.0 (better than 93% of tracked models)

Benchmark                                       Score
GPQA Diamond (graduate-level science Q&A)       89.9%
Humanity's Last Exam (cross-domain reasoning)   33.5%
SciCode (scientific programming)                54.6%
IFBench (instruction following)                 77.6%
LCR (long-context reasoning)                    75.7%
Terminal-Bench Hard (agentic terminal tasks)    37.1%
τ²-Bench (tool use / agents)                    92.1%

Scores sourced from Artificial Analysis.

Frequently Asked Questions

How do I use GPT-5.2 Codex?

You can access GPT-5.2 Codex by OpenAI through the Puter.js AI API. Include the library in your web app or Node.js project and start making calls with just a few lines of JavaScript — no backend and no configuration required. You can also use it with Python or cURL via Puter's OpenAI-compatible API.

Is GPT-5.2 Codex free?

Yes, it is free if you're using it through Puter.js. With the User-Pays Model, you can add GPT-5.2 Codex to your app at no cost — your users pay for their own AI usage directly, making it completely free for you as a developer.

What is the pricing for GPT-5.2 Codex?

Pricing for GPT-5.2 Codex is based on the number of input and output tokens used per request.

Price per 1M tokens:
Input: $1.75
Output: $14
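
Using the rates above, the cost of a single request can be estimated from its token counts. A minimal sketch (the token counts in the example are placeholders, not measurements):

```python
# Per-million-token rates for GPT-5.2 Codex, from the pricing table above.
INPUT_RATE = 1.75    # USD per 1M input tokens
OUTPUT_RATE = 14.00  # USD per 1M output tokens

def estimate_cost(input_tokens: int, output_tokens: int) -> float:
    """Estimate the USD cost of one request from its token counts."""
    return (input_tokens / 1_000_000 * INPUT_RATE
            + output_tokens / 1_000_000 * OUTPUT_RATE)

# Example: a request with 2,000 input tokens and 500 output tokens.
print(estimate_cost(2_000, 500))  # -> 0.0105
```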
Who created GPT-5.2 Codex?

GPT-5.2 Codex was created by OpenAI and released on Dec 11, 2025.

What is the context window of GPT-5.2 Codex?

GPT-5.2 Codex supports a context window of 128K tokens. For reference, that is roughly equivalent to 256 pages of text.
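
The page estimate above can be reproduced with a rough rule of thumb. The words-per-token and words-per-page figures below are common approximations, not exact values:

```python
CONTEXT_TOKENS = 128_000  # GPT-5.2 Codex context window

# Rough heuristics: ~0.75 English words per token, ~375 words per page.
WORDS_PER_TOKEN = 0.75
WORDS_PER_PAGE = 375

words = CONTEXT_TOKENS * WORDS_PER_TOKEN  # ~96,000 words
pages = words / WORDS_PER_PAGE            # ~256 pages
print(int(pages))  # -> 256
```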

What is the max output length of GPT-5.2 Codex?

GPT-5.2 Codex can generate up to 128K tokens in a single response.

What is the knowledge cutoff of GPT-5.2 Codex?

GPT-5.2 Codex has a knowledge cutoff date of Aug 31, 2025. This means the model was trained on data available up to that date.

What types of input can GPT-5.2 Codex process?

GPT-5.2 Codex accepts the following input types: text, images, and PDF documents. It produces text output.
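
Because the model accepts images alongside text, a multimodal request through the OpenAI-compatible endpoint can use the standard content-array message format. A sketch, assuming the endpoint accepts `image_url` content parts as in the standard Chat Completions schema (the URL is a placeholder):

```python
def build_image_message(prompt: str, image_url: str) -> dict:
    """Build an OpenAI-style multimodal user message mixing text and an image."""
    return {
        "role": "user",
        "content": [
            {"type": "text", "text": prompt},
            {"type": "image_url", "image_url": {"url": image_url}},
        ],
    }

message = build_image_message(
    "Describe this diagram", "https://example.com/diagram.png"  # placeholder URL
)

# The message plugs into the same client shown in Get Started, e.g.:
# client.chat.completions.create(model="openai/gpt-5.2-codex", messages=[message])
print(message["content"][0]["type"], message["content"][1]["type"])
```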

Does GPT-5.2 Codex support tool use (function calling)?

Yes, GPT-5.2 Codex supports tool use (function calling), allowing it to interact with external tools, APIs, and data sources as part of its response flow.
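
Tool use through the OpenAI-compatible endpoint follows the standard function-calling flow: declare tools, let the model emit a tool call, execute it locally, and send the result back. A minimal sketch of the local side (`get_weather` is a hypothetical example tool; the model round trip is shown only in comments):

```python
import json

# Hypothetical tool declaration in the OpenAI function-calling schema.
TOOLS = [{
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Get the current weather for a city.",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}]

def get_weather(city: str) -> str:
    # Placeholder implementation; a real tool would call a weather API.
    return f"Sunny in {city}"

def dispatch(tool_name: str, arguments_json: str) -> str:
    """Execute a tool call the model requested and return its result."""
    args = json.loads(arguments_json)
    if tool_name == "get_weather":
        return get_weather(**args)
    raise ValueError(f"Unknown tool: {tool_name}")

# A real round trip would pass TOOLS to the API, e.g.:
# resp = client.chat.completions.create(model="openai/gpt-5.2-codex",
#                                       messages=[...], tools=TOOLS)
# call = resp.choices[0].message.tool_calls[0]
# result = dispatch(call.function.name, call.function.arguments)

print(dispatch("get_weather", '{"city": "Paris"}'))  # -> Sunny in Paris
```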

Does it work with React / Vue / Vanilla JS / Node / etc.?

Yes — the GPT-5.2 Codex API works with any JavaScript framework, Node.js, or plain HTML through Puter.js. Just include the library and start building. See the documentation for more details.

Get started with Puter.js

Add GPT-5.2 Codex to your app without worrying about API keys or setup.

Read the Docs View Tutorials