Providers

Provider-specific setup for OpenAI, Vercel AI SDK, and LangGraph.

Choose your configuration based on the stream format and message shape your backend emits, not just the provider name.

This page maps common provider and backend patterns to the matching streamProtocol and messageFormat configuration.

For the core connection concepts, see Connecting to LLM.

Common mappings

Backend pattern | streamProtocol | messageFormat | Use this when...
OpenUI Protocol | none | none | Your backend already emits the default OpenUI stream and accepts OpenUI messages
Raw OpenAI Chat Completions SSE | openAIAdapter() | openAIMessageFormat when needed | You forward raw data: SSE chunks from Chat Completions
OpenAI SDK toReadableStream() / NDJSON | openAIReadableStreamAdapter() | openAIMessageFormat when needed | You return response.toReadableStream() from the OpenAI SDK
OpenAI Responses API | openAIResponsesAdapter() | openAIConversationMessageFormat when needed | Your backend uses openai.responses.create()

Start with the backend output format. Then add messageFormat only if the request or stored-history message shape also differs from the OpenUI default.
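If your backend falls in the first row of the table, no adapters are needed at all. A minimal sketch of the default setup (same FullScreen component as the examples below):

```typescript
// Default OpenUI Protocol setup: no streamProtocol or messageFormat props.
// Assumes /api/chat already speaks the OpenUI stream natively.
import { FullScreen } from "@openuidev/react-ui";

<FullScreen apiUrl="/api/chat" agentName="Assistant" />;
```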

OpenAI Chat Completions

There are two common OpenAI Chat Completions patterns.

Raw SSE

Use openAIAdapter() if your server forwards raw Chat Completions SSE events.

import { openAIAdapter, openAIMessageFormat } from "@openuidev/react-headless";
import { FullScreen } from "@openuidev/react-ui";

<FullScreen
  apiUrl="/api/chat"
  streamProtocol={openAIAdapter()}
  messageFormat={openAIMessageFormat}
  agentName="Assistant"
/>;
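openAIAdapter() only covers the client side. As a hedged sketch of the matching server side, a route can forward the upstream SSE body unchanged; the route path, model, and header names here are assumptions, not part of the OpenUI API:

```typescript
// Hypothetical server route: proxy OpenAI's raw Chat Completions SSE body
// through untouched, so the client-side openAIAdapter() can parse the
// data: chunks directly.
export async function POST(req: Request): Promise<Response> {
  const { messages } = await req.json();

  const upstream = await fetch("https://api.openai.com/v1/chat/completions", {
    method: "POST",
    headers: {
      Authorization: `Bearer ${process.env.OPENAI_API_KEY}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({ model: "gpt-4o-mini", stream: true, messages }),
  });

  // Pass the SSE stream through without re-encoding it.
  return new Response(upstream.body, {
    headers: { "Content-Type": "text/event-stream" },
  });
}
```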

OpenAI SDK toReadableStream()

Use openAIReadableStreamAdapter() if your route returns response.toReadableStream().

import { openAIMessageFormat, openAIReadableStreamAdapter } from "@openuidev/react-headless";
import { FullScreen } from "@openuidev/react-ui";

<FullScreen
  apiUrl="/api/chat"
  streamProtocol={openAIReadableStreamAdapter()}
  messageFormat={openAIMessageFormat}
  agentName="Assistant"
/>;
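On the server, the official openai SDK exposes toReadableStream() on its streaming result; a minimal route sketch (route shape and model are assumptions):

```typescript
// Hypothetical route using the official openai SDK. toReadableStream()
// emits newline-delimited JSON chunks, which is the format
// openAIReadableStreamAdapter() consumes on the client.
import OpenAI from "openai";

const openai = new OpenAI();

export async function POST(req: Request): Promise<Response> {
  const { messages } = await req.json();
  const stream = await openai.chat.completions.create({
    model: "gpt-4o-mini",
    stream: true,
    messages,
  });
  return new Response(stream.toReadableStream());
}
```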

OpenAI Responses API

Use openAIResponsesAdapter() for the Responses API event stream.

Add openAIConversationMessageFormat only if your backend also expects or stores Responses conversation items instead of the default OpenUI message shape.


import { openAIConversationMessageFormat, openAIResponsesAdapter } from "@openuidev/react-headless";
import { FullScreen } from "@openuidev/react-ui";

<FullScreen
  apiUrl="/api/chat"
  streamProtocol={openAIResponsesAdapter()}
  messageFormat={openAIConversationMessageFormat}
  agentName="Assistant"
/>;
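A matching server-side sketch, assuming the official openai SDK and streaming via toReadableStream() (route shape, model, and body fields are assumptions):

```typescript
// Hypothetical route for the Responses API: openai.responses.create() with
// stream: true yields the Responses event stream that openAIResponsesAdapter()
// parses on the client.
import OpenAI from "openai";

const openai = new OpenAI();

export async function POST(req: Request): Promise<Response> {
  const { input } = await req.json();
  const stream = await openai.responses.create({
    model: "gpt-4o-mini",
    input,
    stream: true,
  });
  return new Response(stream.toReadableStream());
}
```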

Vercel AI SDK

Ignore the SDK name at first and inspect what your route actually returns.

  • If the route already speaks the OpenUI Protocol, apiUrl is usually enough.
  • If it returns a different stream format, keep apiUrl or switch to processMessage, then add the matching streamProtocol.
  • If the route expects a custom request body, use processMessage.
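As a sketch of the custom-body case, here is what processMessage might look like; the callback signature shown is an assumption, not the documented API (see Connecting to LLM for the real contract):

```typescript
// Sketch only: processMessage's argument shape and return type are assumed.
// The point is that you build the request yourself when an AI SDK route
// expects a body other than the OpenUI default.
import { FullScreen } from "@openuidev/react-ui";

<FullScreen
  processMessage={async ({ messages }) => {
    // Hypothetical AI SDK route that expects { messages } plus extra fields.
    return fetch("/api/chat", {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ messages, temperature: 0.2 }),
    });
  }}
  agentName="Assistant"
/>;
```

If the route's stream format also differs from the OpenUI Protocol, add the matching streamProtocol adapter alongside processMessage.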

LangGraph

Use the same decision rules:

  • Start with apiUrl when the endpoint already matches the request and stream shape your frontend expects
  • Switch to processMessage when you need auth headers, a custom body, dynamic routing, or provider-specific metadata
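The auth-header case from the list above might look like this; the processMessage signature, endpoint URL, header names, and body fields are all placeholders, not documented LangGraph or OpenUI APIs:

```typescript
// Sketch only: illustrates attaching auth headers and a custom body for a
// LangGraph-style endpoint. Every name below is a placeholder.
import { FullScreen } from "@openuidev/react-ui";

// Hypothetical auth helper: fetch a token from your own auth provider.
async function getToken(): Promise<string> {
  return "placeholder-token";
}

<FullScreen
  processMessage={async ({ messages }) => {
    return fetch("https://example.com/langgraph/threads/run", {
      method: "POST",
      headers: {
        Authorization: `Bearer ${await getToken()}`,
        "Content-Type": "application/json",
      },
      body: JSON.stringify({ input: { messages } }),
    });
  }}
  agentName="Assistant"
/>;
```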
