Support Matrix
| API | How to enable it | Notes |
|---|---|---|
| Responses | `tools: [{"type": "web_search"}]` | Best starting point for new search-enabled LLM work |
| Chat Completions | `web_search_options` | Use when you already need the chat protocol |
| Messages | Versioned server tool such as `{"type": "web_search_20250305", "name": "web_search"}` | Use when you already need Anthropic compatibility |
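The matrix above can be sketched as request bodies. The field placement mirrors the table; the model names and prompt text are illustrative assumptions, not a definitive reference.

```python
# Illustrative request bodies for enabling web search in each API.
# Model names ("gpt-4.1", "claude-sonnet-4") are assumptions for the sketch.

# Responses: search is a tool entry.
responses_body = {
    "model": "gpt-4.1",
    "tools": [{"type": "web_search"}],
    "input": "What changed in the latest release?",
}

# Chat Completions: search is enabled via its own top-level field, not tools.
chat_completions_body = {
    "model": "gpt-4.1",
    "web_search_options": {},
    "messages": [{"role": "user", "content": "What changed in the latest release?"}],
}

# Messages: search is a versioned server tool.
messages_body = {
    "model": "claude-sonnet-4",
    "max_tokens": 1024,
    "tools": [{"type": "web_search_20250305", "name": "web_search"}],
    "messages": [{"role": "user", "content": "What changed in the latest release?"}],
}
```

Note that only the enablement shape differs; the surrounding message structure follows each API's usual conventions.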
When To Use It
- fresh or time-sensitive answers
- grounded answers with citations
- workflows where the model should retrieve context instead of relying only on training data
Enablement Shapes
- Responses
- Chat Completions
- Messages
New search-enabled LLM workflows should usually start with the Responses API.
Recommended Example
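A minimal sketch of a search-enabled Responses API call, using only the standard library so the request shape is explicit. The endpoint path and payload follow the matrix above; the model name is an assumption, and in practice the official SDK is usually the better choice.

```python
import json
import os
import urllib.request

def build_search_request(question: str) -> urllib.request.Request:
    """Build (but do not send) a Responses API request with web search enabled."""
    body = {
        "model": "gpt-4.1",  # assumed model name
        "tools": [{"type": "web_search"}],
        "input": question,
    }
    return urllib.request.Request(
        "https://api.openai.com/v1/responses",
        data=json.dumps(body).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {os.environ.get('OPENAI_API_KEY', '')}",
        },
    )

req = build_search_request("What happened in AI news today?")
# urllib.request.urlopen(req) would send it; omitted to keep the sketch offline.
```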
Citations And Annotations
When search is active, citations often appear as annotations rather than only inside the generated text, so inspect the structured response, not just the final prose. For Chat Completions, search is enabled through the separate web_search_options field rather than through chat tools.
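Inspecting the structured response might look like the sketch below. The annotation field names ("url_citation", "title", "url") are assumptions modeled on Responses-style output; the exact shape varies by API, so check the response you actually receive.

```python
# A sketch of pulling citations out of a search-enabled response.
# The annotation shape here is an assumption, not a guaranteed schema.

def extract_citations(output_items):
    """Collect (title, url) pairs from message annotations."""
    citations = []
    for item in output_items:
        if item.get("type") != "message":
            continue
        for part in item.get("content", []):
            for ann in part.get("annotations", []):
                if ann.get("type") == "url_citation":
                    citations.append((ann.get("title"), ann.get("url")))
    return citations

# Hand-built sample mimicking a structured response with one citation.
sample = [
    {
        "type": "message",
        "content": [
            {
                "type": "output_text",
                "text": "Example answer.",
                "annotations": [
                    {"type": "url_citation", "title": "Example", "url": "https://example.com"}
                ],
            }
        ],
    }
]
print(extract_citations(sample))
```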
Practical Advice
- use web search for fresh or time-sensitive questions, not for everything
- verify that your chosen model actually supports search-enabled behavior
- inspect annotations or citations if your app needs to display sources
- keep a non-search fallback if your product can still answer from stored context
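The last bullet can be sketched as a simple try/fallback wrapper. Both answer functions below are hypothetical stand-ins for real model calls, with the search path simulated as unavailable.

```python
# A sketch of keeping a non-search fallback. Both helpers are hypothetical.

def answer_with_search(question: str) -> str:
    raise RuntimeError("search backend unavailable")  # simulate an outage

def answer_from_stored_context(question: str) -> str:
    return f"[stored-context answer to: {question}]"

def answer(question: str) -> str:
    """Prefer the search-enabled path, but degrade to stored context."""
    try:
        return answer_with_search(question)
    except Exception:
        return answer_from_stored_context(question)

print(answer("What is our refund policy?"))
```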