# Connecting to LLM

Configure `apiUrl`, `streamProtocol` adapters, and authentication.
Every chat layout needs to know where to send messages. Use either:
- `apiUrl` for standard endpoint usage
- `processMessage` for full control (custom auth, URLs, payload transforms)
## Standard setup
```tsx
import { Copilot } from "@openuidev/react-ui";

<Copilot apiUrl="/api/chat" agentName="Assistant" />;
```

## Custom request logic
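The `processMessage` example below passes `Bearer ${getToken()}` in the `Authorization` header, but the token helper itself is application code, not part of the library. A minimal sketch, assuming your login flow stores the token in memory (`setToken` and `getToken` are hypothetical names):

```typescript
// Hypothetical auth helper for the processMessage example below.
// Assumes your login flow calls setToken() once authentication succeeds.
let token = "";

function setToken(value: string): void {
  token = value;
}

function getToken(): string {
  if (!token) {
    throw new Error("Not authenticated: call setToken() after login");
  }
  return token;
}
```

In a real app you would likely read the token from your auth provider or secure storage instead of a module-level variable.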
```tsx
<Copilot
  processMessage={async ({ threadId, messages, abortController }) => {
    return fetch(`/api/threads/${threadId}/chat`, {
      method: "POST",
      headers: {
        "Content-Type": "application/json",
        Authorization: `Bearer ${getToken()}`,
      },
      body: JSON.stringify({ messages }),
      signal: abortController.signal,
    });
  }}
/>
```

## Stream adapters
Use `streamProtocol` when your backend's stream format differs from the OpenUI default.
```tsx
import { openAIAdapter } from "@openuidev/react-headless";

<Copilot apiUrl="/api/chat" streamProtocol={openAIAdapter()} />;
```

See full provider setup in Providers.
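To give a sense of the kind of translation an adapter like `openAIAdapter` performs, here is a standalone sketch that parses OpenAI-style SSE lines into text deltas. The actual adapter interface is defined by `@openuidev/react-headless`; this pure function and the assumed line format are illustrative only:

```typescript
// Illustrative only: turn one OpenAI-style SSE line into a text delta.
// Returns null for comment lines, blank lines, the [DONE] sentinel,
// and malformed JSON payloads.
function parseSSELine(line: string): string | null {
  if (!line.startsWith("data:")) return null;
  const payload = line.slice("data:".length).trim();
  if (payload === "[DONE]") return null;
  try {
    const event = JSON.parse(payload);
    return event.choices?.[0]?.delta?.content ?? null;
  } catch {
    return null;
  }
}
```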