OpenAI Chat Completion to Zod

Paste an OpenAI Chat Completions JSON response and get a Zod schema for choices, usage, message, tool_calls, and the rest. Validate LLM responses at runtime — useful for any /v1/chat/completions or OpenAI-compatible endpoint.


About this conversion

LLM responses are the most untrusted JSON your backend handles: schema drift across model versions, refusal fields appearing only sometimes, tool_calls and logprobs filled or null, and OpenAI-compatible providers (Groq, Together, OpenRouter, Ollama) adding their own fields. A Zod schema turns the response into a parse-or-fail boundary — when a provider ships a new field you don't expect, you get a clear validation error at the edge of your code instead of a runtime crash twelve function calls deep.

The generated schema is a runnable z.object using only standard Zod primitives (z.string, z.number, z.array, z.union, .optional). Drop it into your LLM client wrapper, call schema.parse(await response.json()), and the rest of your code consumes a strongly typed response. For tool calling, paste a response with tool_calls populated; the converter turns the nested tool_call shape into its own schema you can narrow on type === 'function'.
