React Native
stream LLM-generated UI to an Expo app that renders real native components.
OpenUI Lang runs on React Native. This example streams LLM-generated UI to an Expo app, where `<Renderer />` turns it into real native components: not a webview, not markdown, but native Text, View, and SVG charts. The stack pairs an Expo app with a Next.js backend that handles the OpenAI call.
What changes on native
Three things differ from a web OpenUI Lang integration:
1. Native primitives only
There is no DOM, no HTML, no CSS. Components use `Text`, `View`, and `StyleSheet` from `react-native`, and `react-native-svg` for charts:

```tsx
component: ({ props }) => (
  <RNText style={styles.textBody}>{props.content}</RNText>
)
```

2. Two mirror libraries
`@openuidev/cli` runs in Node.js to generate the system prompt, but it can't import React Native (a future version of the CLI may lift this limitation). The workaround is two files with identical schemas: one with real renderers for the app, one with null renderers for the backend:
| File | Purpose |
|---|---|
| `chat-app/library.tsx` | Real React Native renderers, what runs on device |
| `backend/src/library.ts` | Null renderers (`() => null`), used only to generate `system-prompt.txt` |
Keep these in sync: same component names, same prop schemas, same root component.
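As a sketch of what the backend mirror might look like: the entry shape below (`description` / `props` / `component`) is an assumption inferred from the renderer snippet above, not the actual `@openuidev/cli` schema, and both component names are made up for illustration.

```typescript
// backend/src/library.ts (hypothetical shape) -- only the names and schemas
// matter here; the renderers are never called on the backend, so every
// `component` returns null. The real renderers live in chat-app/library.tsx.
type LibraryEntry = {
  description: string;                                      // fed into the system prompt
  props: Record<string, string>;                            // prop name -> type description
  component: (args: { props: Record<string, unknown> }) => null;
};

export const library: Record<string, LibraryEntry> = {
  Text: {
    description: "A paragraph of body text",
    props: { content: "string" },
    component: () => null, // mirrored by a real <RNText> renderer on device
  },
  BarChart: {
    description: "A simple bar chart drawn with react-native-svg on device",
    props: { labels: "string[]", values: "number[]" },
    component: () => null,
  },
};
```

Because only the schemas feed the prompt, the null version never needs to import `react-native`, so it stays importable from Node.js.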
Architecture
```
Expo app -- POST /api/chat --> Next.js backend --> OpenAI
         <-- text/plain stream -- (OpenUI Lang)
```

The backend loads a pre-generated `system-prompt.txt`, forwards the conversation to OpenAI with streaming enabled, and returns raw `text/plain` chunks. The Expo app accumulates those chunks and passes the growing string into `<Renderer />`, which progressively parses and renders native UI as the response arrives.
Project layout
```
examples/openui-react-native/
|- backend/   # Next.js API that talks to OpenAI
\- chat-app/  # Expo app that renders streamed OpenUI Lang
```

Run the example
Run these commands from `examples/openui-react-native`.

- Install dependencies:

```sh
cd examples/openui-react-native
pnpm install
```

- Configure the backend:

```sh
cp backend/env.example backend/.env.local
```

Then add your OpenAI key to `backend/.env.local`:

```sh
OPENAI_API_KEY=sk-...
```

- Generate the prompt file used by the backend:

```sh
pnpm generate:prompt
```

This generates `backend/src/system-prompt.txt` from `backend/src/library.ts`. Re-run it any time you change component names, schemas, descriptions, or prompt rules.

- Start the Next.js backend:

```sh
pnpm dev:backend
```

- Start the Expo app in a second terminal:

```sh
pnpm dev:mobile
```

By default, `chat-app/metro.config.js` auto-detects your local IP address and sets `EXPO_PUBLIC_BACKEND_URL` to `http://<your-ip>:3000/api/chat`. If you need to point the app somewhere else, set `EXPO_PUBLIC_BACKEND_URL` yourself before starting Expo.
If you are testing on a physical device, make sure the phone and your development machine are on the same network.
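On the app side, the chunk accumulation described earlier can be sketched as a small helper. This is a hypothetical helper (the name `streamIntoRenderer` and the callback shape are invented, not the example's real API), and it assumes a streaming-capable `fetch` response, which on device typically means the `fetch` exported from `expo/fetch` in recent Expo SDKs.

```typescript
// Sketch: read a streamed text/plain body, keep a growing accumulated string,
// and hand each snapshot to a callback. In the app the callback would be a
// React state setter, so every snapshot re-renders <Renderer /> with the
// partial OpenUI Lang document.
export async function streamIntoRenderer(
  response: Response,
  onUpdate: (partial: string) => void,
): Promise<string> {
  const reader = response.body!.getReader();
  const decoder = new TextDecoder();
  let accumulated = "";
  for (;;) {
    const { done, value } = await reader.read();
    if (done) break;
    // stream: true keeps multi-byte characters split across chunks intact.
    accumulated += decoder.decode(value, { stream: true });
    onUpdate(accumulated); // e.g. setDocument(accumulated)
  }
  return accumulated;
}
```

Usage would look like `streamIntoRenderer(await fetch(backendUrl, { method: "POST", ... }), setDocument)`, with `backendUrl` read from `EXPO_PUBLIC_BACKEND_URL`.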