The `@assistant-ui/react-data-stream` package provides integration with data stream protocol endpoints, enabling streaming AI responses with tool support and state management.
## Overview
The data stream protocol is a standardized format for streaming AI responses that supports:
- Streaming text responses with real-time updates
- Tool calling with structured parameters and results
- State management for conversation context
- Error handling and cancellation support
- Attachment support for multimodal interactions
## Installation
<InstallCommand npm={["@assistant-ui/react", "@assistant-ui/react-data-stream"]} />
## Basic Usage
### Set up the Runtime

Use `useDataStreamRuntime` to connect to your data stream endpoint:

```tsx
"use client";

import { useDataStreamRuntime } from "@assistant-ui/react-data-stream";
import { AssistantRuntimeProvider } from "@assistant-ui/react";
import { Thread } from "@/components/assistant-ui/thread";

export default function ChatPage() {
  const runtime = useDataStreamRuntime({
    api: "/api/chat",
  });

  return (
    <AssistantRuntimeProvider runtime={runtime}>
      <Thread />
    </AssistantRuntimeProvider>
  );
}
```
### Create Backend Endpoint

Your backend endpoint should accept POST requests and return data stream responses:

```ts
import { createAssistantStreamResponse } from "assistant-stream";

export async function POST(request: Request) {
  const { messages, tools, system, threadId } = await request.json();

  return createAssistantStreamResponse(async (controller) => {
    // Process the request with your AI provider
    const stream = await processWithAI({
      messages,
      tools,
      system,
    });

    // Stream the response
    for await (const chunk of stream) {
      controller.appendText(chunk.text);
    }
  });
}
```
The request body includes:
- `messages`: The conversation history
- `tools`: Available tool definitions
- `system`: System prompt (if configured)
- `threadId`: The current thread/conversation identifier
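For reference, the fields above can be modeled as a TypeScript type. This is an illustrative sketch: the type name and guard below are hypothetical helpers for your own server code, not exports of the package, and the part/tool shapes are simplified.

```typescript
// Illustrative shape of the request body posted by the runtime.
// Field names follow the list above; nested shapes are simplified.
type ChatRequestBody = {
  messages: { role: string; content: unknown[] }[];
  tools?: Record<string, { description?: string; parameters: object }>;
  system?: string;
  threadId?: string;
};

// Hypothetical guard for validating a parsed JSON body on the server.
function isChatRequestBody(body: unknown): body is ChatRequestBody {
  return (
    typeof body === "object" &&
    body !== null &&
    Array.isArray((body as { messages?: unknown }).messages)
  );
}
```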
## Advanced Configuration

### Custom Headers and Authentication

```tsx
const runtime = useDataStreamRuntime({
  api: "/api/chat",
  headers: {
    Authorization: "Bearer " + token,
    "X-Custom-Header": "value",
  },
  credentials: "include",
});
```
### Dynamic Headers

```tsx
const runtime = useDataStreamRuntime({
  api: "/api/chat",
  headers: async () => {
    const token = await getAuthToken();
    return {
      Authorization: "Bearer " + token,
    };
  },
});
```
### Dynamic Body

```tsx
const runtime = useDataStreamRuntime({
  api: "/api/chat",
  headers: async () => ({
    Authorization: `Bearer ${await getAuthToken()}`,
  }),
  body: async () => ({
    requestId: crypto.randomUUID(),
    timestamp: Date.now(),
    signature: await computeSignature(),
  }),
});
```
### Event Callbacks

```tsx
const runtime = useDataStreamRuntime({
  api: "/api/chat",
  onResponse: (response) => {
    console.log("Response received:", response.status);
  },
  onFinish: (message) => {
    console.log("Message completed:", message);
  },
  onError: (error) => {
    console.error("Error occurred:", error);
  },
  onCancel: () => {
    console.log("Request cancelled");
  },
});
```
## Tool Integration

Human-in-the-loop tools (using `human()` for tool interrupts) are not supported in the data stream runtime. If you need human approval workflows or interactive tool UIs, consider using [LocalRuntime](/docs/runtimes/custom/local) or [Assistant Cloud](/docs/cloud) instead.

### Frontend Tools

Use `toToolsJSONSchema` to serialize client-side tools:
```tsx
import { tool } from "@assistant-ui/react";
import { toToolsJSONSchema } from "assistant-stream";
import { z } from "zod";

const myTools = {
  get_weather: tool({
    description: "Get current weather",
    parameters: z.object({
      location: z.string(),
    }),
    execute: async ({ location }) => {
      const weather = await fetchWeather(location);
      return `Weather in ${location}: ${weather}`;
    },
  }),
};

const runtime = useDataStreamRuntime({
  api: "/api/chat",
  body: {
    tools: toToolsJSONSchema(myTools),
  },
});
```
### Backend Tool Processing

Your backend should handle tool calls and return results:

```ts
// Tools are automatically forwarded to your endpoint
const { tools } = await request.json();

// Process tools with your AI provider
const response = await ai.generateText({
  messages,
  tools,
  // Tool results are streamed back automatically
});
```
## Assistant Cloud Integration

For Assistant Cloud deployments, use `useCloudRuntime`:

```tsx
import { useCloudRuntime } from "@assistant-ui/react-data-stream";

const runtime = useCloudRuntime({
  cloud: assistantCloud,
  assistantId: "my-assistant-id",
});
```
## Message Conversion

### Framework-Agnostic Conversion (Recommended)

For custom integrations, use the framework-agnostic utilities from `assistant-stream`:

```ts
import { toGenericMessages, toToolsJSONSchema } from "assistant-stream";

// Convert messages to a generic format
const genericMessages = toGenericMessages(messages);

// Convert tools to JSON Schema format
const toolSchemas = toToolsJSONSchema(tools);
```
The `GenericMessage` format can be easily converted to any LLM provider format:

```ts
import type { GenericMessage } from "assistant-stream";

// GenericMessage is a union of:
// - { role: "system"; content: string }
// - { role: "user"; content: (GenericTextPart | GenericFilePart)[] }
// - { role: "assistant"; content: (GenericTextPart | GenericToolCallPart)[] }
// - { role: "tool"; content: GenericToolResultPart[] }
```
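As an illustration of such a conversion, the sketch below flattens messages into a plain `{ role, content: string }` chat format. The types here are local simplifications written for the example (covering only system, user, and assistant text parts), not imports from `assistant-stream`:

```typescript
// Simplified local stand-ins for the generic message part types.
type TextPart = { type: "text"; text: string };
type SimpleMessage =
  | { role: "system"; content: string }
  | { role: "user" | "assistant"; content: TextPart[] };

// Flatten text parts into plain strings, e.g. for a provider
// that expects { role, content: string } pairs.
function toPlainChat(
  messages: SimpleMessage[],
): { role: string; content: string }[] {
  return messages.map((m) =>
    m.role === "system"
      ? { role: m.role, content: m.content }
      : { role: m.role, content: m.content.map((p) => p.text).join("") },
  );
}
```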
### AI SDK Specific Conversion

For AI SDK integration, use `toLanguageModelMessages`:

```ts
import { toLanguageModelMessages } from "@assistant-ui/react-data-stream";

// Convert to AI SDK LanguageModelV2Message format
const languageModelMessages = toLanguageModelMessages(messages, {
  unstable_includeId: true, // Include message IDs
});
```
## Error Handling
The runtime automatically handles common error scenarios:
- Network errors: Automatically retried with exponential backoff
- Stream interruptions: Gracefully handled with partial content preservation
- Tool execution errors: Displayed in the UI with error states
- Cancellation: Clean abort signal handling
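To illustrate the retry behavior described above, here is a standalone exponential-backoff helper. This is a sketch of the general pattern, not the runtime's internal implementation:

```typescript
// Retry an async operation with exponential backoff.
// The delay doubles on each failed attempt: baseMs, 2*baseMs, 4*baseMs, ...
async function withRetry<T>(
  fn: () => Promise<T>,
  attempts = 3,
  baseMs = 100,
): Promise<T> {
  let lastError: unknown;
  for (let i = 0; i < attempts; i++) {
    try {
      return await fn();
    } catch (err) {
      lastError = err;
      if (i < attempts - 1) {
        await new Promise((resolve) => setTimeout(resolve, baseMs * 2 ** i));
      }
    }
  }
  throw lastError;
}
```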
## Best Practices

### Performance Optimization

```tsx
// Use React.memo for expensive components
const OptimizedThread = React.memo(Thread);

// Memoize runtime configuration
const runtimeConfig = useMemo(
  () => ({
    api: "/api/chat",
    headers: { Authorization: `Bearer ${token}` },
  }),
  [token],
);

const runtime = useDataStreamRuntime(runtimeConfig);
```
### Error Boundaries

```tsx
import { ErrorBoundary } from "react-error-boundary";

function ChatErrorFallback({ error, resetErrorBoundary }) {
  return (
    <div role="alert">
      <h2>Something went wrong:</h2>
      <pre>{error.message}</pre>
      <button onClick={resetErrorBoundary}>Try again</button>
    </div>
  );
}

export default function App() {
  return (
    <ErrorBoundary FallbackComponent={ChatErrorFallback}>
      <AssistantRuntimeProvider runtime={runtime}>
        <Thread />
      </AssistantRuntimeProvider>
    </ErrorBoundary>
  );
}
```
### State Persistence

```tsx
const runtime = useDataStreamRuntime({
  api: "/api/chat",
  body: {
    // Include conversation state
    state: conversationState,
  },
  onFinish: (message) => {
    // Save state after each message
    saveConversationState(message.metadata.unstable_state);
  },
});
```
## Examples
Explore our examples repository for implementation references.
## LocalRuntimeOptions

`useDataStreamRuntime` accepts all options from `LocalRuntimeOptions` in addition to its own options. These control the underlying local runtime behavior.
### maxSteps

The maximum number of agentic steps (tool call rounds) allowed per run. Defaults to unlimited.

```tsx
const runtime = useDataStreamRuntime({
  api: "/api/chat",
  maxSteps: 5,
});
```
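Conceptually, `maxSteps` caps the tool-call loop: the model keeps taking steps while it requests tools, and stops either when it answers without tools or when the cap is hit. A framework-neutral sketch of that loop (the names here are hypothetical, not the runtime's internals):

```typescript
type StepResult = { toolCalls: number };

// Run model steps until the model stops calling tools or the cap is hit.
// `step` stands in for one model invocation and reports its tool calls.
function runWithMaxSteps(
  step: (stepIndex: number) => StepResult,
  maxSteps: number,
): number {
  let steps = 0;
  while (steps < maxSteps) {
    const result = step(steps);
    steps++;
    if (result.toolCalls === 0) break; // finished without requesting tools
  }
  return steps; // number of steps actually taken
}
```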
### initialMessages

Pre-populate the thread with messages on first render. Useful for continuing an existing conversation.

```tsx
const runtime = useDataStreamRuntime({
  api: "/api/chat",
  initialMessages: [
    { role: "user", content: [{ type: "text", text: "Hello" }] },
    {
      role: "assistant",
      content: [{ type: "text", text: "Hi! How can I help?" }],
    },
  ],
});
```
### adapters

Extend the runtime with optional capability adapters. The `chatModel` adapter is handled internally by `useDataStreamRuntime` and cannot be overridden here.

```tsx
const runtime = useDataStreamRuntime({
  api: "/api/chat",
  adapters: {
    attachments: myAttachmentAdapter,
    history: myHistoryAdapter,
    speech: mySpeechAdapter,
    dictation: myDictationAdapter,
    feedback: myFeedbackAdapter,
    suggestion: mySuggestionAdapter,
  },
});
```
See the LocalRuntime adapters documentation for details on implementing each adapter.
### cloud

Connect to Assistant Cloud for managed multi-thread support, persistence, and thread management.

```tsx
import { AssistantCloud } from "assistant-cloud";

const cloud = new AssistantCloud({ /* ... */ });

const runtime = useDataStreamRuntime({
  api: "/api/chat",
  cloud,
});
```
### unstable_humanToolNames

Names of tools that should pause execution and wait for human or external approval before proceeding.

Human-in-the-loop tool interrupts (`unstable_humanToolNames`) are not supported in the data stream runtime. Use [LocalRuntime](/docs/runtimes/custom/local) if you need this feature.

## API Reference
For detailed API documentation, see the `@assistant-ui/react-data-stream` API Reference.