The Chat Completions API is a compatibility surface: it preserves messages[], choices[0].message, and chat-compatible SDK behavior for existing integrations. For new LLM integrations, prefer the Responses API. If you are starting fresh, use the Responses API instead.
Best Fit
- existing OpenAI SDK integrations
- chat UIs already built around messages[]
- systems that expect tool_calls, response_format, and chunked chat streaming
When to stay on Chat Completions
- your app already uses OpenAI chat-shaped requests everywhere
- your client code expects choices[0].message
- the migration cost is higher than the near-term benefit
Request Model
Common fields include: model, messages, tools, response_format, stream, stream_options, reasoning_effort, and web_search_options.
Quick Example
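As a minimal sketch of the request body and how a client reads the reply: the model id and message text below are illustrative placeholders, and the response dict stands in for what an OpenAI-compatible SDK or HTTP client would return.

```python
# Sketch of a Chat Completions request body; model id is a placeholder.
request = {
    "model": "example-model",
    "messages": [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Say hello."},
    ],
    "stream": False,
}

# A successful non-streaming response follows the OpenAI chat shape:
response = {
    "choices": [
        {
            "index": 0,
            "message": {"role": "assistant", "content": "Hello!"},
            "finish_reason": "stop",
        }
    ],
}

# Clients read the reply from choices[0].message.
print(response["choices"][0]["message"]["content"])  # -> Hello!
```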
Response Model
Successful non-streaming responses follow the OpenAI chat shape.
Common mistakes
- starting a new integration on Chat Completions when Responses would be simpler
- assuming every capability maps 1:1 to Responses item semantics
- forgetting that this is a compatibility layer over the same platform catalog
Learn The API In Detail
Streaming
Stream chat chunks and handle terminal chunks correctly.
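A hedged sketch of that chunk handling, using hand-built dicts in place of real stream events; the delta and finish_reason field names follow the OpenAI chat-chunk shape.

```python
# Simulated chat chunks; a real stream yields these incrementally.
chunks = [
    {"choices": [{"delta": {"role": "assistant", "content": "Hel"},
                  "finish_reason": None}]},
    {"choices": [{"delta": {"content": "lo!"}, "finish_reason": None}]},
    {"choices": [{"delta": {}, "finish_reason": "stop"}]},  # terminal chunk
]

parts = []
for chunk in chunks:
    choice = chunk["choices"][0]
    # Deltas carry partial content; the terminal chunk carries finish_reason.
    text = choice["delta"].get("content")
    if text:
        parts.append(text)
    if choice["finish_reason"] == "stop":
        break

print("".join(parts))  # -> Hello!
```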
Tool Calling
Keep OpenAI-style chat tool workflows.
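A sketch of one round trip in that workflow, assuming a hypothetical get_weather tool; the tool definition, tool_calls message, and tool-role reply follow the OpenAI chat shape.

```python
import json

# Hypothetical tool definition in the OpenAI chat tools shape.
tools = [{
    "type": "function",
    "function": {
        "name": "get_weather",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}]

# Simulated assistant message requesting a tool call.
message = {
    "role": "assistant",
    "tool_calls": [{
        "id": "call_1",
        "type": "function",
        "function": {"name": "get_weather", "arguments": '{"city": "Oslo"}'},
    }],
}

def get_weather(city):  # stand-in for a real tool implementation
    return f"sunny in {city}"

call = message["tool_calls"][0]
args = json.loads(call["function"]["arguments"])
result = get_weather(**args)

# The tool result goes back as a tool-role message tied to the call id.
tool_message = {"role": "tool", "tool_call_id": call["id"], "content": result}
print(tool_message["content"])  # -> sunny in Oslo
```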
Reasoning
Control reasoning behavior on reasoning-capable models.
Structured Outputs
Enforce JSON or schema-shaped outputs.
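A hedged sketch of schema enforcement: a json_schema response_format in the OpenAI structured-outputs shape, and parsing of a simulated schema-conformant reply. The person schema is illustrative.

```python
import json

# Illustrative json_schema response_format payload.
response_format = {
    "type": "json_schema",
    "json_schema": {
        "name": "person",
        "schema": {
            "type": "object",
            "properties": {
                "name": {"type": "string"},
                "age": {"type": "integer"},
            },
            "required": ["name", "age"],
        },
    },
}

# Simulated assistant content conforming to the schema.
content = '{"name": "Ada", "age": 36}'
person = json.loads(content)
print(person["name"], person["age"])  # -> Ada 36
```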
Multimodal Content
Send text, images, files, and audio in chat content blocks.
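As a sketch of those content blocks, here is a user message mixing a text block and an image_url block (block types per the OpenAI chat shape; the URL is a placeholder):

```python
# Multimodal user message built from chat content blocks.
message = {
    "role": "user",
    "content": [
        {"type": "text", "text": "What is in this picture?"},
        {"type": "image_url",
         "image_url": {"url": "https://example.com/cat.png"}},  # placeholder
    ],
}

block_types = [block["type"] for block in message["content"]]
print(block_types)  # -> ['text', 'image_url']
```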
Web Search
Enable grounded answers with search-aware models.
Migration to Responses
Move from chat-shaped requests to the newer Responses model.