API Reference
Complete API documentation for all hooks, components, and TypeScript interfaces in the AI Chat Bootstrap library.
Hooks
Chat Management
- useAIChat – Advanced hook for full chat state + transport orchestration (used internally by ChatContainer)
- useAIChatCompression – Access the compression controller, pinned messages, artifacts, and runCompression
- useMCPServer – Discover and sync Model Context Protocol (MCP) servers from the client
- useAIChatCommand – Register slash commands that call frontend tools or remote actions
- useUIChatCommand – Present command palettes bound to useAIChatCommand
Context & Focus
- useAIContext – Share React component state with the AI without causing re-renders
- useAIFocus – Enable users to explicitly mark which items should be prioritized in AI conversations
Tools
- useAIFrontendTool – Register tools that the AI can execute in your React components for direct UI interaction
Components
Chat Interface Components
- ChatContainer – Main chat interface component with header, threads, compression, and actions toolbar
- ChatPopout – Overlay/inline popout shell that composes ChatContainer
- ChatMessage – Individual message component with support for different message types
- ChatInput – Standalone input component with model selector, suggestions, compression indicators, and toolbar variants
Utilities
Message & Thread Management
- Message Normalization – Ensure messages have valid metadata and timestamps with ensureMessageMetadata and normalizeMessagesMetadata
- Threading & Persistence – Types and utilities for implementing custom thread persistence backends
Core Concepts
Message Format
The library uses the Vercel AI SDK UIMessage format for all chat interactions:
```ts
interface UIMessage {
  id: string;
  role: 'system' | 'user' | 'assistant';
  metadata?: unknown;
  parts: Array<UIMessagePart>;
}
```

Context vs Focus
- Context (useAIContext): Automatic, ambient information that’s always available to the AI
- Focus (useAIFocus): User-controlled, explicit items that are temporarily prioritized
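Conceptually, both feeds end up as sections of the enriched system prompt sent to your endpoint. The sketch below shows one plausible way such an assembly could work; the function name and section formatting are illustrative assumptions, not the library's exact algorithm.

```typescript
// Illustrative sketch: merge context and focus items into prompt sections,
// ordering context by priority (higher first). Shapes mirror the ContextItem
// and FocusItem interfaces documented below, trimmed to the relevant fields.
interface ContextItem { id: string; description?: string; priority: number; data: Record<string, unknown>; }
interface FocusItem { id: string; label?: string; data?: Record<string, unknown>; }

function buildEnrichedPrompt(base: string, context: ContextItem[], focus: FocusItem[]): string {
  const sections = [base];
  if (context.length > 0) {
    const items = [...context]
      .sort((a, b) => b.priority - a.priority)
      .map((c) => `- ${c.description ?? c.id}: ${JSON.stringify(c.data)}`);
    sections.push("Context:\n" + items.join("\n"));
  }
  if (focus.length > 0) {
    sections.push("Focus:\n" + focus.map((f) => `- ${f.label ?? f.id}`).join("\n"));
  }
  return sections.join("\n\n");
}
```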
Tools Integration
Frontend tools execute in the browser and allow the AI to:
- Modify React component state
- Trigger UI actions
- Fetch data from APIs
- Update forms and user interfaces
Getting Started
- Install peers + library:

  ```sh
  npm install ai-chat-bootstrap react react-dom ai @ai-sdk/react @ai-sdk/openai zod
  # or pnpm add ... (preferred)
  ```

- Import styles (choose ONE mode). Zero-config:

  ```css
  @import "ai-chat-bootstrap/tokens.css";
  @import "ai-chat-bootstrap/ai-chat.css";
  ```

  Tailwind-native: add the preset in your Tailwind config.

- Render the chat container:

  ```tsx
  import { ChatContainer } from 'ai-chat-bootstrap';

  export function AppChat() {
    return (
      <ChatContainer
        transport={{ api: "/api/chat" }}
        messages={{ systemPrompt: "You are a helpful assistant." }}
      />
    );
  }
  ```
Quick Examples
Basic Chat
```tsx
import { ChatContainer } from 'ai-chat-bootstrap';

function MyChat() {
  return (
    <ChatContainer
      transport={{ api: "/api/chat" }}
      messages={{ systemPrompt: "You are a helpful assistant." }}
      header={{ title: 'My Chat' }}
      ui={{ placeholder: 'Ask me anything…' }}
    />
  );
}
```

Model Selection & Chain of Thought
```tsx
import { ChatContainer } from 'ai-chat-bootstrap';

function ChatWithFeatures() {
  return (
    <ChatContainer
      transport={{ api: "/api/chat" }}
      models={{
        available: [
          { id: "gpt-4", label: "GPT-4" },
          { id: "gpt-4o-mini", label: "GPT-4o mini" },
        ],
        initial: "gpt-4o-mini",
      }}
      features={{ chainOfThought: true }}
    />
  );
}
```

Sharing Context
```tsx
import { ChatContainer, useAIContext } from 'ai-chat-bootstrap';

function ChatWithContext() {
  const user = { name: 'Alice', role: 'admin' };
  useAIContext({ description: 'User Profile', value: user, priority: 100 });

  return (
    <ChatContainer
      transport={{ api: "/api/chat" }}
      messages={{ systemPrompt: "You are a helpful assistant." }}
    />
  );
}
```

Frontend Tools
```tsx
import React, { useState } from 'react';
import { ChatContainer, useAIFrontendTool } from 'ai-chat-bootstrap';
import { z } from 'zod';

function ChatWithTools() {
  const [count, setCount] = useState(0);

  useAIFrontendTool({
    name: 'increment',
    description: 'Increment the counter',
    parameters: z.object({
      amount: z.number().default(1),
    }),
    execute: async ({ amount }) => {
      // Compute the result inside the functional update so we never
      // return a stale `count` captured by the closure.
      let newValue = 0;
      setCount((prev) => {
        newValue = prev + amount;
        return newValue;
      });
      return { newValue };
    },
  });

  return (
    <div className="space-y-2">
      <div>Count: {count}</div>
      <ChatContainer transport={{ api: "/api/chat" }} />
    </div>
  );
}
```

Enabling Compression
```tsx
import { ChatContainer } from "ai-chat-bootstrap";

function ChatWithCompression() {
  return (
    <ChatContainer
      transport={{ api: "/api/chat" }}
      compression={{
        enabled: true,
        api: "/api/compression",
        maxTokenBudget: 16000,
        pinnedMessageLimit: 4,
      }}
    />
  );
}
```

When compression is configured, the prompt actions toolbar surfaces token usage, compression artifacts, and manual controls. Pair it with createCompressionHandler on the backend.
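To build intuition for what a token budget does, here is a deliberately simplified, self-contained sketch of one possible client-side policy: estimate tokens crudely, keep pinned messages, and drop the oldest unpinned messages until the transcript fits. This is illustrative only; the library's actual compression summarizes history via createCompressionHandler rather than silently dropping messages.

```typescript
// Illustrative compression policy sketch (not the library's algorithm).
interface Msg { id: string; text: string; pinned?: boolean; }

// Crude heuristic: roughly 4 characters per token.
const estimateTokens = (text: string) => Math.ceil(text.length / 4);

function fitToBudget(messages: Msg[], maxTokenBudget: number): Msg[] {
  const kept = [...messages];
  let total = kept.reduce((sum, m) => sum + estimateTokens(m.text), 0);
  // Drop oldest unpinned messages until the transcript fits the budget.
  for (let i = 0; i < kept.length && total > maxTokenBudget; ) {
    if (kept[i].pinned) { i++; continue; }
    total -= estimateTokens(kept[i].text);
    kept.splice(i, 1);
  }
  return kept;
}
```

Pinned messages (capped by pinnedMessageLimit in the real API) always survive, which is why pinning matters for anything the model must not forget.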
Component Reference
ChatContainer
Main chat interface component with built-in useAIChat state. It accepts all UseAIChatOptions (via transport, messages, threads, features, models, mcp, compression) plus UI configuration groups such as header, ui, suggestions, commands, threads, and assistantActions. Always prefer the structured configuration groups when wiring new code.
See the dedicated ChatContainer API for the complete prop interface and examples.
ChatMessage
Individual message component with support for different message types.
```ts
interface ChatMessageProps {
  message: UIMessage;
  className?: string;
}
```

ChatInput
Standalone input component for chat interfaces. Provides the prompt toolbar, model selector, suggestions trigger, compression indicators, and focus chips. See the ChatInput API for the full prop interface and examples.
TypeScript Interfaces
Core Types
```ts
// Focus item structure
interface FocusItem {
  id: string;
  label?: string;
  description?: string;
  data?: Record<string, unknown>;
}

// Context item structure
interface ContextItem {
  id: string;
  label?: string;
  description?: string;
  scope: "session" | "conversation" | "message";
  priority: number;
  data: Record<string, unknown>;
}

// Frontend tool definition
interface FrontendTool {
  name: string;
  description: string;
  parameters: ZodSchema;
  execute: (params: any) => Promise<any> | any;
}
```

Message Types
```ts
// Message part types
type UIMessagePart =
  | { type: 'text'; text: string; state?: 'streaming' | 'done' }
  | { type: 'reasoning'; text: string; state?: 'streaming' | 'done' }
  | { type: 'file'; mediaType: string; filename?: string; url: string }
  | { type: 'source-url'; sourceId: string; url: string; title?: string }
  | { type: 'tool-*'; toolCallId: string; /* ... */ }
  | { type: 'data-*'; id?: string; data: any };
```

Backend Integration
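When rendering or post-processing messages, you typically walk the parts array and pick out the variants you care about. The helper below is a small self-contained sketch (the function name is mine, and only a subset of the part union is reproduced) showing how to collect the user-visible text parts:

```typescript
// Sketch: extract the visible text from a message's parts. Only 'text'
// parts count as visible output; 'reasoning' is the model's chain of
// thought and 'file' parts carry attachments, not prose.
type Part =
  | { type: "text"; text: string; state?: "streaming" | "done" }
  | { type: "reasoning"; text: string; state?: "streaming" | "done" }
  | { type: "file"; mediaType: string; filename?: string; url: string };

function visibleText(parts: Part[]): string {
  return parts
    .filter((p): p is Extract<Part, { type: "text" }> => p.type === "text")
    .map((p) => p.text)
    .join("");
}
```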
The library automatically sends structured payloads to your API endpoint:
```ts
interface ChatRequestPayload {
  messages: UIMessage[];
  /**
   * Original system prompt provided by the app (optional).
   * Its content is appended inside the enrichedSystemPrompt automatically.
   */
  systemPrompt?: string;
  /**
   * Automatically generated enriched system prompt (always sent unless explicitly overridden).
   * Contains standardized preamble + conditional Tools / Context / Focus sections + appended original systemPrompt.
   */
  enrichedSystemPrompt: string;
  tools?: Record<string, ToolDefinition>;
  context?: ContextItem[];
  focus?: FocusItem[];
}
```

Server Templates
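If you write a custom endpoint instead of using the server templates below, it can be useful to guard the incoming body at runtime. This is a minimal illustrative check of the required fields of the payload above (in practice you might validate with a zod schema instead); the function name is mine:

```typescript
// Minimal runtime guard for the documented ChatRequestPayload shape,
// checking only the two required fields.
interface ChatRequestPayloadLite {
  messages: unknown[];
  enrichedSystemPrompt: string;
  systemPrompt?: string;
}

function isChatPayload(value: unknown): value is ChatRequestPayloadLite {
  if (typeof value !== "object" || value === null) return false;
  const v = value as Record<string, unknown>;
  return Array.isArray(v.messages) && typeof v.enrichedSystemPrompt === "string";
}
```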
New helper functions make deploying API endpoints easier:
```ts
// app/api/chat/route.ts
import { createAIChatHandler } from "ai-chat-bootstrap/server";
import { openai } from "@ai-sdk/openai";

const handler = createAIChatHandler({
  model: openai("gpt-4"),
  streamOptions: { temperature: 0.7 },
});

export { handler as POST };
```

Available Handlers
```ts
import {
  createAIChatHandler,
  createCompressionHandler,
  createSuggestionsHandler,
  createThreadTitleHandler,
  createMcpToolsHandler,
} from "ai-chat-bootstrap/server";

// Each export below belongs in its own route file; a single module
// cannot export POST more than once.

// Main chat streaming
export const POST = createAIChatHandler({ model });

// AI-generated suggestions
export const POST = createSuggestionsHandler({ model });

// Auto thread titles
export const POST = createThreadTitleHandler({ model });

// Compression summaries
export const POST = createCompressionHandler({ model });

// MCP server tools
export const POST = createMcpToolsHandler();
```

Each handler accepts model configuration and error-handling options.
Migration from Other Libraries
The AI Chat Bootstrap library is designed to work seamlessly with the Vercel AI SDK while adding powerful context and tool management capabilities. If you’re migrating from other chat libraries, the useAIChat hook provides a familiar interface while adding automatic integration with context, focus, and tools.
Examples & Tutorials
- Basic Chat - Simple chat implementation
- Chat with Tools - Adding frontend tools
- Sharing Context - Using ambient context
- Focus Items - User-controlled focus
- Compression - Keeping transcripts within token budgets
Support
For issues, feature requests, or questions:
- GitHub: ai-chat-bootstrap repository
- Documentation: Complete guides and examples available in this documentation