# Connecting to LLM

Configure `apiUrl`, `streamProtocol` adapters, and authentication.
Every chat layout needs a backend connection, but there are a few separate pieces involved:
- how the frontend sends the request
- how the backend streams the response
- what message shape the backend expects
This page introduces each one first, then shows how to choose the right combination for your backend.
## apiUrl

`apiUrl` is the simplest connection option. Use it when your frontend can call one backend endpoint directly and you do not need custom request logic on the client.
```tsx
import { FullScreen } from "@openuidev/react-ui";

<FullScreen apiUrl="/api/chat" agentName="Assistant" />;
```

With `apiUrl`, OpenUI sends the message history to your endpoint for you. If your backend expects a different message format, configure `messageFormat`. If you need custom headers, extra fields, or a different request body, use `processMessage` instead.
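On the backend side, the endpoint receives that message history as a POST request. The sketch below is a hypothetical handler, assuming the request body carries the history under a `messages` key; verify the exact OpenUI request shape against the library docs before relying on it.

```typescript
// Hypothetical request handler sketch. Assumption: the default request
// body contains the message history under a `messages` key.
async function handleChatRequest(req: Request): Promise<Response> {
  const { messages } = (await req.json()) as {
    messages: { role: string; content: string }[];
  };
  // Echo the last user message back; a real handler would call an LLM
  // and stream the OpenUI Protocol instead.
  const last = messages[messages.length - 1];
  return new Response(JSON.stringify({ reply: last?.content ?? "" }), {
    headers: { "Content-Type": "application/json" },
  });
}
```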
## processMessage

`processMessage` gives you full control over the request. Use it when you need to:
- add auth headers
- build a dynamic URL
- include extra request fields
- convert `messages` before sending them
```tsx
import { openAIMessageFormat, openAIReadableStreamAdapter } from "@openuidev/react-headless";
import { FullScreen } from "@openuidev/react-ui";
import { openuiLibrary } from "@openuidev/react-ui/genui-lib";

<FullScreen
  processMessage={async ({ messages, abortController }) => {
    return fetch("/api/chat", {
      method: "POST",
      headers: {
        "Content-Type": "application/json",
        Authorization: `Bearer ${getToken()}`,
      },
      body: JSON.stringify({
        messages: openAIMessageFormat.toApi(messages),
      }),
      signal: abortController.signal,
    });
  }}
  streamProtocol={openAIReadableStreamAdapter()}
  componentLibrary={openuiLibrary}
  agentName="Assistant"
/>;
```

`processMessage` receives `threadId`, `messages`, and `abortController`, and must return a standard `Response` from your backend call.
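That contract can be sketched in plain TypeScript, independent of the component. The types below are illustrative assumptions, not the library's actual exports, and the function builds a `Response` locally instead of calling `fetch` so the contract is easy to see in isolation.

```typescript
// Hypothetical argument shapes, for illustration only -- check the
// library's actual types before relying on them.
type ChatMessage = { role: "user" | "assistant" | "system"; content: string };

type ProcessMessageArgs = {
  threadId: string;
  messages: ChatMessage[];
  abortController: AbortController;
};

// The only hard requirement: return a standard Response.
async function processMessage({
  threadId,
  messages,
}: ProcessMessageArgs): Promise<Response> {
  const body = JSON.stringify({ threadId, messages });
  return new Response(body, {
    status: 200,
    headers: { "Content-Type": "application/json" },
  });
}
```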
## streamProtocol

`streamProtocol` tells OpenUI how to parse the response stream. By default, OpenUI expects the OpenUI Protocol, so only set this when your backend streams a different format.
| Backend output | Frontend config |
|---|---|
| OpenUI Protocol | No adapter required |
| Raw OpenAI Chat Completions SSE | `streamProtocol={openAIAdapter()}` |
| OpenAI SDK `toReadableStream()` / NDJSON | `streamProtocol={openAIReadableStreamAdapter()}` |
| OpenAI Responses API | `streamProtocol={openAIResponsesAdapter()}` |
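To make the NDJSON row concrete: in an NDJSON stream each line is a standalone JSON value, unlike SSE's `data:`-prefixed event framing, which is why the two formats need different adapters. A minimal parsing sketch (not the adapter's actual implementation):

```typescript
// Parse an NDJSON chunk: one complete JSON object per non-empty line.
// An SSE parser would instead strip "data: " prefixes and split on
// blank lines between events.
function parseNdjson(chunk: string): unknown[] {
  return chunk
    .split("\n")
    .filter((line) => line.trim().length > 0)
    .map((line) => JSON.parse(line));
}
```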
```tsx
import { openAIReadableStreamAdapter } from "@openuidev/react-headless";

<FullScreen
  apiUrl="/api/chat"
  streamProtocol={openAIReadableStreamAdapter()}
  agentName="Assistant"
/>;
```

## messageFormat
`messageFormat` controls the shape of the `messages` array sent to your backend and the shape expected when loading thread history.
| Backend message shape | Frontend config |
|---|---|
| AG-UI message shape | No converter required |
| OpenAI chat completions messages | `messageFormat={openAIMessageFormat}` |
| OpenAI Responses conversation items | `messageFormat={openAIConversationMessageFormat}` |
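The converters in this table can be pictured as a `toApi`/`fromApi` pair: one direction strips frontend-only fields before sending, the other restores them when loading stored history. The sketch below uses made-up message shapes purely to illustrate that pattern; check the library's real types before relying on them.

```typescript
// Hypothetical message shapes -- illustrative, not the library's types.
type AppMessage = { id: string; role: string; content: string };
type ApiMessage = { role: string; content: string };

const exampleMessageFormat = {
  // Drop frontend-only fields (here, `id`) before sending to the backend.
  toApi(messages: AppMessage[]): ApiMessage[] {
    return messages.map(({ role, content }) => ({ role, content }));
  },
  // Re-attach ids when loading stored thread history.
  fromApi(messages: ApiMessage[]): AppMessage[] {
    return messages.map((m, i) => ({ id: String(i), ...m }));
  },
};
```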
```tsx
import { openAIMessageFormat, openAIReadableStreamAdapter } from "@openuidev/react-headless";
import { FullScreen } from "@openuidev/react-ui";

<FullScreen
  apiUrl="/api/chat"
  streamProtocol={openAIReadableStreamAdapter()}
  messageFormat={openAIMessageFormat}
  agentName="Assistant"
/>;
```

Use `messageFormat` whenever your backend expects or returns a non-default message shape. This is especially important if you store messages for thread history.
## How to choose
Once you know what each prop does, the decision becomes:
- Start with `apiUrl`.
- Switch to `processMessage` only if you need auth, extra fields, dynamic URLs, or request conversion.
- Add `streamProtocol` only if your backend does not stream the default OpenUI Protocol.
- Add `messageFormat` only if your backend expects or returns a non-default message shape.
## Rules summary

- `apiUrl` is the simplest path when one endpoint can handle the request as-is.
- `processMessage` is the right choice when you need auth, extra fields, or payload conversion.
- `streamProtocol` parses the response stream.
- `messageFormat` converts request messages and loaded thread history.