AI Chat Bootstrap
AI Chat Bootstrap provides React components, hooks, and design tokens to build AI chat UIs quickly. It layers cleanly on top of the Vercel AI SDK and AI SDK Elements, adding opinionated primitives for messages, commands, contextual tools, compression, actions toolbar controls, and theming.
If you just want to see code first, jump to the Basic Chat example.
Install
```sh
pnpm add ai-chat-bootstrap react react-dom ai @ai-sdk/react @ai-sdk/openai zod
# or
npm install ai-chat-bootstrap react react-dom ai @ai-sdk/react @ai-sdk/openai zod
```

Required peers: react, react-dom, ai, @ai-sdk/react. Add one provider package (e.g. @ai-sdk/openai, @ai-sdk/azure). zod is strongly recommended for tools and commands.
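If you want to verify the peers programmatically (for example in a setup script), a small check like the following works. This helper is purely illustrative and not part of ai-chat-bootstrap:

```typescript
// Illustrative helper (not part of ai-chat-bootstrap): report which of the
// required peer dependencies are missing from a parsed package.json object.
const REQUIRED_PEERS = ["react", "react-dom", "ai", "@ai-sdk/react"];

function missingPeers(pkg: { dependencies?: Record<string, string> }): string[] {
  const deps = pkg.dependencies ?? {};
  return REQUIRED_PEERS.filter((name) => !(name in deps));
}

console.log(missingPeers({ dependencies: { react: "^18.3.0" } }));
// → ["react-dom", "ai", "@ai-sdk/react"]
```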
Styling
The library requires shadcn/ui-compatible CSS variables (light and dark palettes) to be present. Choose one setup mode to supply tokens and component styles:
Zero‑config (recommended)
```css
/* globals.css */
@import "tw-animate-css";
@import "ai-chat-bootstrap/tokens.css";
@import "ai-chat-bootstrap/ai-chat.css";
```

tokens.css defines the design tokens plus minimal globals; ai-chat.css adds the component styles.
Install tw-animate-css once (the CLI adds it automatically) so shared animation keyframes stay in sync with our components.
To theme, override the shadcn-compatible CSS custom properties, which live outside of any @layer block. Provide the full light and dark palettes, since the components reference every variable directly:
```css
:root {
  --radius: 0.75rem;
  --primary: oklch(0.58 0.2 264);
  --background: oklch(1 0 0);
  --foreground: oklch(0.145 0 0);
}
```

Copy the full shadcn token set (--background, --card, --primary, etc.) from the CLI scaffold or the Theme System guide and adjust the values for your brand.
Tailwind‑native (advanced)
```ts
// tailwind.config.ts
import preset from "ai-chat-bootstrap/tailwind.preset";

export default {
  presets: [preset],
  content: ["./src/**/*.{js,ts,jsx,tsx}"],
};
```

Don't use both modes together.
Quick Start
```tsx
import React from "react";
import { ChatContainer } from "ai-chat-bootstrap";

export function App() {
  return (
    <ChatContainer
      transport={{ api: "/api/chat" }}
      messages={{ systemPrompt: "You are a helpful AI assistant." }}
      header={{ title: "AI Assistant", subtitle: "Connected to AI" }}
      ui={{ placeholder: "Ask me anything..." }}
    />
  );
}
```

For a full end-to-end example (including the API route and streaming), see Basic Chat.
Demo reference
- Next.js showcase using ChatPopout, tools, compression, and MCP: packages/ai-chat-bootstrap-demo/src/app/page.tsx
- Minimal chat route that mirrors this quick start: packages/ai-chat-bootstrap-demo/src/app/basic/page.tsx
- Streaming API handler used by the demo: packages/ai-chat-bootstrap-demo/src/app/api/chat/route.ts
Run it locally
```sh
git clone https://github.com/knifeyspoony/ai-chat-bootstrap
cd ai-chat-bootstrap
pnpm install
pnpm --filter ai-chat-bootstrap-demo dev
```

This launches the Next.js demo alongside the library watcher, so code edits are reflected immediately.
Server Templates
Server helper functions make it easy to deploy the API endpoints the components expect:
```ts
// app/api/chat/route.ts
import { createAIChatHandler } from "ai-chat-bootstrap/server";
import { openai } from "@ai-sdk/openai";

const handler = createAIChatHandler({
  model: openai("gpt-4"),
  streamOptions: { temperature: 0.7 },
});

export { handler as POST };
```

Also available: createCompressionHandler, createSuggestionsHandler, createThreadTitleHandler, createMcpToolsHandler.
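Conceptually, the compression pipeline that createCompressionHandler serves keeps pinned messages and trims older history to fit a token budget. Here is a dependency-free sketch of that idea (illustrative only; not the library's actual implementation, which also produces artifacts for review):

```typescript
interface ChatMessage {
  id: string;
  pinned?: boolean;
  tokens: number; // assumed pre-computed token count per message
  text: string;
}

// Keep every pinned message, then admit the most recent unpinned messages
// until the token budget is exhausted, preserving original order.
function compressHistory(messages: ChatMessage[], budget: number): ChatMessage[] {
  const pinned = messages.filter((m) => m.pinned);
  let used = pinned.reduce((sum, m) => sum + m.tokens, 0);
  const kept = new Set(pinned.map((m) => m.id));
  // Walk from newest to oldest so recent turns win the remaining budget.
  for (let i = messages.length - 1; i >= 0; i--) {
    const m = messages[i];
    if (kept.has(m.id)) continue;
    if (used + m.tokens > budget) continue;
    used += m.tokens;
    kept.add(m.id);
  }
  return messages.filter((m) => kept.has(m.id));
}
```

A real pipeline would also summarize what it drops; this sketch simply omits over-budget messages.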
Features
- Chat container + composable message primitives with popout support
- Prompt actions toolbar (models, suggestions, attachments, compression status)
- Automatic compression pipelines with pinned messages and artifact review UI
- Slash command system with parameter schemas (zod) and tool execution
- Context sharing + focus tracking hooks for dynamic prompt enrichment
- MCP and frontend tool integration (merged in enriched system prompt)
- Threading, auto titling, and AI suggestion queue components (see Threading & Persistence)
- Tailwind + shadcn/ui compatible base components and design tokens
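The "enriched system prompt" mentioned above (shared context plus MCP and frontend tools) can be pictured as a pure merge step. A minimal sketch, assuming simple shapes for context items and tool summaries that are not the library's actual types:

```typescript
// Illustrative sketch of merging a base system prompt with shared context
// items and tool descriptions into one enriched prompt. This is not
// ai-chat-bootstrap's implementation, just the concept.
interface ContextItem {
  label: string;
  value: string;
}

interface ToolSummary {
  name: string;
  description: string;
}

function buildEnrichedSystemPrompt(
  base: string,
  context: ContextItem[],
  tools: ToolSummary[]
): string {
  const parts = [base];
  if (context.length > 0) {
    parts.push(
      "Shared context:\n" +
        context.map((c) => `- ${c.label}: ${c.value}`).join("\n")
    );
  }
  if (tools.length > 0) {
    parts.push(
      "Available tools:\n" +
        tools.map((t) => `- ${t.name}: ${t.description}`).join("\n")
    );
  }
  return parts.join("\n\n");
}

const prompt = buildEnrichedSystemPrompt(
  "You are a helpful AI assistant.",
  [{ label: "focused item", value: "invoice #123" }],
  [{ name: "lookupInvoice", description: "Fetch invoice details" }]
);
console.log(prompt.includes("invoice #123")); // true
```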
CSS & Theming
- ai-chat-bootstrap/tokens.css: design tokens + minimal globals
- ai-chat-bootstrap/ai-chat.css: chat component styles without Tailwind preflight
- tw-animate-css: shared keyframes/animations used by menu, dialog, and tooltip primitives
- Tailwind preset: maps tokens into the Tailwind theme when compiling yourself
- Theme by overriding CSS variables (--background, --foreground, --primary, etc.)
Choosing an approach
| Need | Choose |
|---|---|
| Fast drop‑in, minimal config | Zero‑config + ai-chat-bootstrap/tokens.css + ai-chat-bootstrap/ai-chat.css |
| Centralized Tailwind control | Tailwind‑native preset |
| Lowest duplicate CSS across many libs | Tailwind‑native |
| Safest isolation from app utilities | Zero‑config |
Tree‑shaking
In Tailwind‑native mode, unused utilities are removed by your Tailwind build. In Zero‑config mode, the shipped CSS is already a pre‑shaken subset containing only what the components require.
Peer Dependencies
Provide react and react-dom (version 18+). The Vercel AI SDK and a model provider package are required only for live AI streaming/inference features, not for purely static UI rendering.
Repository & Issues
Source & issues live at: https://github.com/knifeyspoony/ai-chat-bootstrap
Changelog
See the monorepo root CHANGELOG for release notes.
License
MIT © Contributors
Questions or missing details? Open an issue in the repo.