Shadcn Chat
An example of connecting OpenUI Lang to a custom component library, using shadcn/ui as the design system.
OpenUI Lang is not tied to any component library. The LLM generates abstract UI structure; your library definition decides how that structure renders. This example wires OpenUI Lang to shadcn/ui — swapping in a custom shadcnChatLibrary in place of the built-in libraries — to show that any design system can sit behind the same protocol.
Bringing your own component library
The bridge between OpenUI Lang and shadcn/ui is built with two primitives: defineComponent and createLibrary from @openuidev/react-lang.
defineComponent maps a single OpenUI Lang node to a React component. Here's the root Card component from shadcn-genui/index.tsx:
```tsx
const ChatCard = defineComponent({
  name: "Card",
  props: z.object({
    children: z.array(ChatCardChildUnion),
  }),
  description:
    "Vertical container for all content in a chat response. Children stack top to bottom automatically.",
  component: ({ props, renderNode }) => (
    <Card>
      <CardContent className="p-0 space-y-3">{renderNode(props.children)}</CardContent>
    </Card>
  ),
});
```

- name — the OpenUI Lang node name the LLM will emit
- props — a Zod schema that validates the node's props as they stream in
- description — included in the generated system prompt so the LLM knows when and how to use the component
- component — the React component that renders it; renderNode recursively renders children
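The recursion behind renderNode can be sketched in isolation. This is a toy string renderer, not the library's React implementation — the node shape and component signature here are invented for illustration — but the pattern is the same: each component receives renderNode and may call it again on its own children.

```typescript
// Conceptual sketch only: the real renderNode comes from @openuidev/react-lang
// and produces React elements. This toy version renders to a string.
type UINode = { name: string; props: { text?: string; children?: UINode[] } };

type Renderer = (node: UINode, renderNode: (nodes: UINode[]) => string) => string;

const components: Record<string, Renderer> = {
  // Container component: delegates its children back to renderNode.
  Card: (node, renderNode) => `<card>${renderNode(node.props.children ?? [])}</card>`,
  // Leaf component: renders its text prop directly.
  TextContent: (node) => `<p>${node.props.text ?? ""}</p>`,
};

function renderNode(nodes: UINode[]): string {
  // Recursion happens here: each node looks up its component, which may
  // invoke renderNode again on its own children.
  return nodes.map((n) => components[n.name](n, renderNode)).join("");
}

const tree: UINode[] = [
  { name: "Card", props: { children: [{ name: "TextContent", props: { text: "Hello" } }] } },
];
console.log(renderNode(tree)); // "<card><p>Hello</p></card>"
```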
Once you've defined your components, createLibrary assembles them into a library you pass to the renderer:
```tsx
export const shadcnChatLibrary = createLibrary({
  root: "Card",
  componentGroups: shadcnComponentGroups,
  components: [ChatCard, CardHeader, TextContent, Alert /* ... */],
});
```

Swap shadcnChatLibrary for a library built on any other design system — Material UI, Radix, your own primitives — and the rest of the stack stays the same.
See Defining Components for the full defineComponent API.
Architecture
```
Browser (FullScreen)  --  POST /api/chat  -->  Next.js route  -->  OpenAI
                      <--   SSE stream    --   (OpenUI Lang + tool calls)
```

The client sends a conversation to /api/chat. The API route loads a generated system-prompt.txt, forwards the messages to the LLM with streaming and tool definitions, and returns SSE events. On the client, openAIAdapter() parses the SSE stream and shadcnChatLibrary maps each OpenUI Lang node to a shadcn/ui component that renders progressively as tokens arrive.
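openAIAdapter() handles the stream parsing for you. As a rough, self-contained sketch of what consuming an SSE stream involves (the event payload below is invented; only the "data: ..." framing and the "[DONE]" sentinel follow the common convention):

```typescript
// Minimal SSE frame parser: frames are "data: <payload>" lines separated by
// blank lines; a "[DONE]" payload marks the end of the stream.
function parseSSE(chunk: string): unknown[] {
  return chunk
    .split("\n\n")
    .map((frame) => frame.trim())
    .filter((frame) => frame.startsWith("data: "))
    .map((frame) => frame.slice("data: ".length))
    .filter((data) => data !== "[DONE]")
    .map((data) => JSON.parse(data));
}

// Hypothetical event shape for illustration only.
const stream = 'data: {"type":"node","name":"Card"}\n\ndata: [DONE]\n\n';
console.log(parseSSE(stream)); // [{ type: "node", name: "Card" }]
```

A real adapter also has to handle frames split across network chunks, which is why you'd normally rely on the provided one rather than hand-rolling this.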
The key decoupling: the server streams OpenUI Lang — an abstract description of UI structure. The client's shadcnChatLibrary decides how each node renders. That's where shadcn/ui enters, and that's also where you'd plug in any other design system.
The API route also supports server-side tool execution. When the model invokes a tool (weather, stock price, calculator, or web search), the route runs it and feeds the result back into the completion loop before streaming the final UI response.
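The tool names come from this example (weather, stock price, calculator, web search), but the dispatcher below is a hypothetical sketch: the signatures are assumptions, the implementations are stubs, and the real route's tools would be async calls to external APIs whose results feed back into the completion loop.

```typescript
// Hypothetical tool dispatcher; not the example's actual route code.
type ToolCall = { name: string; args: Record<string, unknown> };

function runTool(call: ToolCall): string {
  switch (call.name) {
    case "calculator": {
      // Deterministic local tool: evaluate a simple "a op b" expression.
      const { a, b, op } = call.args as { a: number; b: number; op: string };
      const result = op === "+" ? a + b : op === "*" ? a * b : NaN;
      return String(result);
    }
    case "weather":
    case "stock_price":
    case "web_search":
      // In the real route these would hit external APIs; stubbed here.
      return JSON.stringify({ tool: call.name, status: "stubbed" });
    default:
      throw new Error(`Unknown tool: ${call.name}`);
  }
}

console.log(runTool({ name: "calculator", args: { a: 6, b: 7, op: "*" } })); // "42"
```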
Project layout
```
examples/shadcn-chat/
|- src/app/               # Next.js app (layout, page, API route)
|- src/hooks/             # Theme detection and context
|- src/components/ui/     # Base shadcn/ui primitives
|- src/lib/shadcn-genui/  # Generative UI component library (40+ components)
|- src/generated/         # Generated system prompt
```

Run the example
Run these commands from examples/shadcn-chat.
- Install dependencies:

```bash
cd examples/shadcn-chat
pnpm install
```

- Create a .env.local file with your API key:

```bash
OPENAI_API_KEY=sk-...
```

- Start the dev server:

```bash
pnpm dev
```