ChatContainer
ChatContainer renders the full chat experience—message list, input surface, model switcher, thread controls, and assistant actions—while managing AI state internally via useAIChat.
Set `ui.showTimestamps` to `true` if you want per-message timestamps rendered using the `metadata.timestamp` field (the hook automatically populates this for all messages the client creates).
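As a minimal sketch of what the timestamp rendering involves, assuming `metadata.timestamp` holds epoch milliseconds (the `formatTimestamp` helper below is hypothetical, not a library export):

```typescript
// Hypothetical helper, not part of ai-chat-bootstrap. Assumes the hook
// stores metadata.timestamp as epoch milliseconds.
interface MessageMetadata {
  timestamp?: number;
}

function formatTimestamp(metadata: MessageMetadata): string {
  return metadata.timestamp !== undefined
    ? new Date(metadata.timestamp).toISOString()
    : "";
}

// Enabling the feature itself is just a ui flag:
const ui = { showTimestamps: true };
console.log(ui.showTimestamps, formatTimestamp({ timestamp: 0 }));
```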
Props
// Core types such as UIMessage and AssistantAction are exported from the library.
interface ChatContainerProps extends UseAIChatOptions {
/** Opt-in to the unstyled variant (removes default borders/background). */
"data-acb-unstyled"?: "" | boolean;
/** Convenience toggle for assistant branching (aliases `features.branching`). */
enableBranching?: boolean;
header?: {
title?: string;
subtitle?: string;
avatar?: React.ReactNode;
badge?: React.ReactNode;
actions?: React.ReactNode;
className?: string;
};
ui?: {
placeholder?: string;
className?: string;
classes?: {
header?: string;
messages?: string;
message?: string;
input?: string;
assistantActions?: string;
toolbar?: string;
};
showTimestamps?: boolean;
emptyState?: React.ReactNode;
style?: React.CSSProperties;
assistantAvatar?: string | React.ReactNode;
userAvatar?: string | React.ReactNode;
};
suggestions?: {
enabled?: boolean;
prompt?: string;
count?: number;
api?: string;
strategy?: 'assistant-finish' | 'eager' | 'manual';
debounceMs?: number;
};
commands?: {
enabled?: boolean;
};
threads?: {
enabled?: boolean;
};
assistantActions?: {
copy?: boolean;
regenerate?: boolean;
debug?: boolean;
feedback?: {
onThumbsUp: (message: UIMessage) => void | Promise<void>;
onThumbsDown: (message: UIMessage) => void | Promise<void>;
};
custom?: AssistantAction[];
};
}

All AI configuration (API endpoint, system prompt, models, MCP, threading, compression, suggestions, etc.) is supplied via the `UseAIChatOptions` keys: `transport`, `messages`, `threads`, `features`, `mcp`, `models`, `compression`, and `suggestions`.
Usage
import { ChatContainer } from "ai-chat-bootstrap";
export function MyChat() {
return (
<ChatContainer
transport={{ api: "/api/chat" }}
messages={{ systemPrompt: "You are a helpful assistant." }}
models={{
available: [
{ id: "gpt-4o-mini", label: "GPT-4o mini" },
{ id: "gpt-4", label: "GPT-4" },
],
initial: "gpt-4o-mini",
}}
header={{ title: "AI Assistant", subtitle: "Connected" }}
ui={{ placeholder: "Ask me anything..." }}
suggestions={{ enabled: true, count: 3 }}
commands={{ enabled: true }}
threads={{ enabled: true }}
/>
);
}

Customizing the transport request
`transport.prepareSendMessagesRequest` lets you enrich the payload before it reaches your `/api/chat` route. When adding per-request data, prefer storing it in `metadata` instead of mutating the outgoing messages:
<ChatContainer
transport={{
api: "/api/chat",
prepareSendMessagesRequest: async (options) => {
const body = options.body ?? {};
const metadata =
typeof body.metadata === "object" && body.metadata !== null
? body.metadata
: {};
return {
body: {
...body,
metadata: {
...metadata,
transportDemo: {
signatureActive: true,
signature: "Sent via transport hook ✨",
preparedAt: new Date().toISOString(),
},
},
},
};
},
}}
/>

See the demo route at `packages/ai-chat-bootstrap-demo/src/app/transport/page.tsx` for a full UI that previews the prepared payload.
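On the server side, the enriched metadata can be read straight off the request body. A minimal sketch, assuming the body shape shown in the example above (the `ChatRequestBody` interface and `readSignature` helper are illustrative, not library types; route wiring is omitted):

```typescript
// Hypothetical shape of the POST body your /api/chat route receives
// after prepareSendMessagesRequest has run (an assumption for this sketch).
interface ChatRequestBody {
  messages: unknown[];
  metadata?: Record<string, unknown>;
}

function readSignature(body: ChatRequestBody): string | null {
  // Pull the per-request data out of the metadata bag rather than
  // expecting it to be spliced into the messages themselves.
  const demo = body.metadata?.transportDemo as
    | { signature?: string }
    | undefined;
  return demo?.signature ?? null;
}

console.log(
  readSignature({
    messages: [],
    metadata: { transportDemo: { signature: "Sent via transport hook ✨" } },
  })
);
```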
Custom thread persistence
`useAIChat` persists conversation history through a pluggable `ChatThreadPersistence` adapter. The package ships IndexedDB helpers (`createIndexedDBChatThreadPersistence`, `getDefaultChatThreadPersistence`) and also exports `normalizeMessagesMetadata` so custom stores can guarantee the required metadata bag. Initializing your own persistence layer is as simple as:
import {
ChatThreadPersistence,
normalizeMessagesMetadata,
useChatThreadsStore,
} from "ai-chat-bootstrap";
const sqlitePersistence: ChatThreadPersistence = {
async loadAll(scopeKey) {
return db.getThreads(scopeKey);
},
async save(thread) {
const { messages } = normalizeMessagesMetadata(thread.messages);
await db.saveThread({ ...thread, messages });
},
async delete(id) {
await db.deleteThread(id);
},
};
useChatThreadsStore.getState().initializePersistent(sqlitePersistence);

Header Actions
Supply React nodes via `header.actions` to surface custom controls next to the built-in title/subtitle block:
<ChatContainer
transport={{ api: "/api/chat" }}
header={{
title: "AI Assistant",
subtitle: "Connected",
actions: (
<div className="flex items-center gap-2">
<button
type="button"
className="rounded-md border px-2 py-1 text-xs"
onClick={() => exportThread()}
>
Export
</button>
<button
type="button"
className="rounded-md border px-2 py-1 text-xs"
onClick={() => clearChat()}
>
Clear
</button>
</div>
),
}}
/>

Features
- Prompt Actions Toolbar: Model selector, attachment button, compression usage, and suggestions share a single toolbar (customize placement via variants or `ui.classes.toolbar`).
- Model Selection: Dropdown automatically appears when multiple models are configured via `models.available`.
- Chain of Thought: Reasoning blocks render when `features.chainOfThought` is enabled and the assistant returns reasoning parts.
- Compression: Token usage indicator, pinned message UI, and artifact review sheet when `compression.enabled` is true.
- Assistant Actions: Built-in copy/regenerate/debug/feedback actions with support for custom buttons.
- Suggestions: Optional AI-generated follow-up suggestions displayed above the input.
- Commands: Slash-command palette that integrates with `useAIChatCommand` and frontend tools.
- Timestamps: Opt-in `ui.showTimestamps` surfaces per-message timestamps sourced from `metadata.timestamp`.
- Threads: Built-in thread picker (`threads.enabled`) that works with scoped persistence.
- MCP Support: When `mcp.enabled` is true, the header surfaces the MCP server manager.
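Putting several of these toggles together, a feature-heavy configuration might look like the following prop bag (a sketch shown as plain data rather than JSX; the key names come from the options documented above, the values are illustrative):

```typescript
// Sketch of a combined configuration. Keys mirror the options documented
// on this page; values are examples only.
const chatProps = {
  transport: { api: "/api/chat" },
  features: { chainOfThought: true },
  compression: { enabled: true, api: "/api/compression" },
  threads: { enabled: true },
  mcp: { enabled: true },
  suggestions: { enabled: true, count: 3 },
  commands: { enabled: true },
  ui: { showTimestamps: true },
};

console.log(Object.keys(chatProps).join(", "));
```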
Assistant Actions
import { Share2Icon } from "lucide-react";
// Built-in buttons
<ChatContainer
transport={{ api: "/api/chat" }}
messages={{ systemPrompt: "You are a helpful assistant." }}
assistantActions={{
copy: true,
regenerate: true,
debug: true,
feedback: {
onThumbsUp: (message) => console.log("thumbs up", message.id),
onThumbsDown: (message) => console.log("thumbs down", message.id),
},
}}
/>
// Mixing built-ins with custom actions
<ChatContainer
transport={{ api: "/api/chat" }}
messages={{ systemPrompt: "You are a helpful assistant." }}
assistantActions={{
copy: true,
regenerate: true,
custom: [
{
id: "share",
icon: Share2Icon,
label: "Share",
tooltip: "Copy sharable link",
onClick: (message) => console.log("share", message.id),
},
],
}}
/>

Built-in actions automatically hide on older messages when `onlyOnMostRecent` is set (e.g. regenerate). Custom actions can use `onlyOnMostRecent`, `visible`, or `disabled` predicates for granular control.
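A sketch of a custom action wired up with those predicates. The `ActionSketch` and `Message` interfaces below are local approximations for illustration only; consult the exported `AssistantAction` and `UIMessage` types for the real signatures:

```typescript
// Local approximations of the action and message shapes -- assumptions
// for this sketch, not the library's exported types.
interface Message {
  id: string;
  role: "user" | "assistant";
}

interface ActionSketch {
  id: string;
  label: string;
  onlyOnMostRecent?: boolean;
  visible?: (message: Message) => boolean;
  disabled?: (message: Message) => boolean;
  onClick: (message: Message) => void;
}

const retryAction: ActionSketch = {
  id: "retry",
  label: "Retry",
  // Hide this button on everything but the latest assistant message.
  onlyOnMostRecent: true,
  // Only show on assistant messages...
  visible: (message) => message.role === "assistant",
  // ...and disable it for a hypothetical still-streaming placeholder.
  disabled: (message) => message.id === "pending",
  onClick: (message) => console.log("retry", message.id),
};

console.log(retryAction.visible?.({ id: "m1", role: "assistant" }));
```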
Compression
<ChatContainer
transport={{ api: "/api/chat" }}
compression={{
enabled: true,
api: "/api/compression",
maxTokenBudget: 16000,
pinnedMessageLimit: 4,
}}
/>

When compression is enabled the prompt toolbar surfaces current usage, exposes a “Review compression” sheet (artifacts, snapshots, survivors), and respects pinned messages per thread. Pair it with `createCompressionHandler` so the backend can summarize transcripts and return updated artifacts.
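To make the budget semantics concrete, here is a library-agnostic sketch of the trimming idea: drop the oldest unpinned messages until the transcript fits `maxTokenBudget`, while pinned messages always survive. This is purely conceptual and is not how `createCompressionHandler` is implemented:

```typescript
// Conceptual sketch only: not the library's compression algorithm.
interface CountedMessage {
  id: string;
  tokens: number;
  pinned?: boolean;
}

function trimToBudget(
  messages: CountedMessage[],
  maxTokenBudget: number
): CountedMessage[] {
  const survivors = [...messages];
  let total = survivors.reduce((sum, m) => sum + m.tokens, 0);
  // Walk from the oldest message forward, dropping unpinned ones
  // until the running total fits the budget.
  for (let i = 0; i < survivors.length && total > maxTokenBudget; ) {
    if (!survivors[i].pinned) {
      total -= survivors[i].tokens;
      survivors.splice(i, 1);
    } else {
      i++;
    }
  }
  return survivors;
}

const kept = trimToBudget(
  [
    { id: "a", tokens: 10 },
    { id: "b", tokens: 10, pinned: true },
    { id: "c", tokens: 10 },
  ],
  20
);
console.log(kept.map((m) => m.id)); // oldest unpinned message "a" is dropped
```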
See Also
- `useAIChat` – Hook used internally by `ChatContainer`
- `ChatPopout` – Popout shell that wraps `ChatContainer`