# DeepCitation + Next.js App Router
In your Next.js App Router app, DeepCitation runs entirely server-side until the final render step. This guide answers the architectural questions up front, then walks through three complete integration patterns.
> **CSS setup required.** Add `@import "deepcitation/tailwind.css"` to your `globals.css` (Tailwind v4), or `import "deepcitation/styles.css"` in your root layout (non-Tailwind). See Styling.
## `"use client"` Boundary Map

This is the question most developers hit first. Here's the complete split:
The `deepcitation/react` entry point now ships with a `"use client"` directive at the top of the module. This means React components imported from `deepcitation/react` are automatically marked as client components — you no longer need to add your own `"use client"` boundary in wrapper files that only re-export DeepCitation components. You still need `"use client"` in your own page and component files that use React hooks or browser APIs.
| Import | Where it runs | Why |
|---|---|---|
| `DeepCitation` (class) | Server only | Makes network calls to DC API; never expose API key to client |
| `prepareAttachments()` | Server only (Route Handler or Server Action) | Uploads files to DC API |
| `wrapCitationPrompt()` | Server only | Runs during prompt construction, same process as the LLM call |
| `getAllCitationsFromLlmOutput()` | Server only (Route Handler) | Called after streaming completes |
| `verifyAttachment()` | Server only (Route Handler) | Makes DC API call |
| `validateUploadFile()` | Server only | File validation before upload |
| `CitationComponent` | Client only (auto via `"use client"`) | Uses React state/hooks, renders interactive popover |
| `CitationDrawer` | Client only (auto via `"use client"`) | Interactive drawer with animation |
| `CitationDrawerTrigger` | Client only (auto via `"use client"`) | Trigger button for the drawer |
| `DeepCitationTheme` | Client only (auto via `"use client"`) | CSS variable provider for theming |
| `parseCitationResponse()` | Either | Pure function, no side effects |
| `getCitationKey()` | Either | Pure function, no side effects (import from `"deepcitation"`) |
> **Warning:** `DeepCitation` uses your `DEEPCITATION_API_KEY`. Never import it in a file with `"use client"` — the key would be bundled into the browser.
## Pattern 1: Route Handlers (Recommended)

The most common pattern: three Route Handlers handle upload, streaming, and verification. The client uses `useChat` from `@ai-sdk/react` to stream and triggers verification when streaming ends. This pattern is used in the `nextjs-ai-sdk` example.

### File Structure
```
app/
├── api/
│   ├── upload/route.ts      ← prepareAttachments() lives here
│   ├── chat/route.ts        ← wrapCitationPrompt() + streamText() live here
│   └── verify/route.ts      ← getAllCitationsFromLlmOutput() + verifyAttachment() live here
├── page.tsx                 ← "use client" — useChat, state, CitationComponent
└── components/
    └── ChatMessage.tsx      ← "use client" — CitationComponent rendering
```
### `/app/api/upload/route.ts`
```ts
import { DeepCitation, validateUploadFile } from "deepcitation";
import { type NextRequest, NextResponse } from "next/server";

const dc = new DeepCitation({ apiKey: process.env.DEEPCITATION_API_KEY! });

export async function POST(req: NextRequest) {
  const formData = await req.formData();
  const file = formData.get("file") as File;
  if (!file) {
    return NextResponse.json({ error: "No file provided" }, { status: 400 });
  }

  const arrayBuffer = await file.arrayBuffer();
  const buffer = Buffer.from(arrayBuffer);

  // Validate file size, MIME type, and magic bytes before uploading
  const uploadError = validateUploadFile(file.size, file.type, new Uint8Array(arrayBuffer));
  if (uploadError) {
    return NextResponse.json({ error: uploadError }, { status: 400 });
  }

  const { fileDataParts, deepTextPages } = await dc.prepareAttachments([
    { file: buffer, filename: file.name },
  ]);

  // Return both — client stores fileDataPart for verification tracking,
  // deepTextPages for injecting into subsequent LLM prompts.
  return NextResponse.json({
    fileDataPart: fileDataParts[0],
    deepTextPages,
  });
}
```
### `/app/api/chat/route.ts`
```ts
import { openai } from "@ai-sdk/openai";
import { wrapCitationPrompt } from "deepcitation/prompts";
import { convertToModelMessages, streamText, type UIMessage } from "ai";

export const maxDuration = 60;

export async function POST(req: Request) {
  const { messages, deepTextPagesByAttachmentId = {} } = await req.json();
  const uiMessages = messages as UIMessage[];

  const lastUserMessage = uiMessages.findLast(m => m.role === "user");
  const lastUserContent =
    lastUserMessage?.parts
      ?.filter((p): p is { type: "text"; text: string } => p.type === "text")
      .map(p => p.text)
      .join("") ?? "";

  const hasDocuments = Object.keys(deepTextPagesByAttachmentId).length > 0;

  const { enhancedSystemPrompt, enhancedUserPrompt } = hasDocuments
    ? wrapCitationPrompt({
        systemPrompt: "You are a helpful assistant that cites sources.",
        userPrompt: lastUserContent,
        deepTextPagesByAttachmentId,
      })
    : {
        enhancedSystemPrompt: "You are a helpful assistant.",
        enhancedUserPrompt: lastUserContent,
      };

  // Inject the enhanced user message before converting to model format
  const enhancedMessages = uiMessages.map((m, i) => {
    if (i === uiMessages.length - 1 && m.role === "user" && hasDocuments) {
      return { ...m, parts: [{ type: "text" as const, text: enhancedUserPrompt }] };
    }
    return m;
  });

  const modelMessages = await convertToModelMessages(enhancedMessages);

  const result = streamText({
    model: openai("gpt-4o-mini"),
    system: enhancedSystemPrompt,
    messages: modelMessages,
  });

  return result.toTextStreamResponse();
}
```
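The injection step above only rewrites the final user message. As a standalone sketch of that step (with simplified stand-in types for the AI SDK's `UIMessage`, the `hasDocuments` check omitted, and `injectEnhancedPrompt` a hypothetical helper, not part of DeepCitation):

```typescript
// Simplified stand-ins for the AI SDK message shapes used in the route above.
type Part = { type: "text"; text: string };
type Msg = { role: "user" | "assistant"; parts: Part[] };

// Replace the text parts of the final user message with the
// citation-wrapped prompt, leaving every other message untouched.
function injectEnhancedPrompt(messages: Msg[], enhancedUserPrompt: string): Msg[] {
  return messages.map((m, i) =>
    i === messages.length - 1 && m.role === "user"
      ? { ...m, parts: [{ type: "text", text: enhancedUserPrompt }] }
      : m,
  );
}
```

Note that `map` returns new message objects, so the original `messages` array is never mutated; this matters because `useChat` owns that array on the client.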
### `/app/api/verify/route.ts`
```ts
import { DeepCitation, getAllCitationsFromLlmOutput } from "deepcitation";
import { type NextRequest, NextResponse } from "next/server";

const dc = new DeepCitation({ apiKey: process.env.DEEPCITATION_API_KEY! });

export async function POST(req: NextRequest) {
  const { llmOutput, attachmentId } = await req.json();

  // Citation extraction happens server-side — client sends raw LLM output
  const citations = getAllCitationsFromLlmOutput(llmOutput);
  if (Object.keys(citations).length === 0) {
    return NextResponse.json({ citations: {}, verifications: {} });
  }

  const { verifications } = await dc.verifyAttachment(attachmentId, citations, {
    outputImageFormat: "avif",
  });

  return NextResponse.json({ citations, verifications });
}
```
### `/app/page.tsx` — Streaming + Verification Wiring
```tsx
"use client";
import { useChat } from "@ai-sdk/react";
import type { Citation, FileDataPart, Verification } from "deepcitation";
import { useEffect, useEffectEvent, useRef, useState } from "react";

export default function Home() {
  const [fileDataParts, setFileDataParts] = useState<FileDataPart[]>([]);
  const [deepTextPagesByAttachmentId, setDeepTextPagesByAttachmentId] =
    useState<Record<string, string[]>>({});
  const [messageVerifications, setMessageVerifications] = useState<
    Record<string, { citations: Record<string, Citation>; verifications: Record<string, Verification> }>
  >({});
  const [isVerifying, setIsVerifying] = useState(false);

  const { messages, input, handleInputChange, handleSubmit, isLoading } = useChat({
    streamProtocol: "text",
    body: { deepTextPagesByAttachmentId },
  });

  // Stable event handler — not affected by stale closure over isLoading/messages
  const onVerifyMessage = useEffectEvent((messageId: string, content: string) => {
    if (!content || fileDataParts.length === 0) return;
    setIsVerifying(true);
    fetch("/api/verify", {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({
        llmOutput: content,
        attachmentId: fileDataParts[0].attachmentId,
      }),
    })
      .then(res => res.json())
      .then(data => setMessageVerifications(prev => ({ ...prev, [messageId]: data })))
      .finally(() => setIsVerifying(false));
  });

  // Trigger verification when streaming transitions from loading → done
  const prevIsLoadingRef = useRef(false);
  useEffect(() => {
    if (prevIsLoadingRef.current && !isLoading) {
      const lastMsg = messages[messages.length - 1];
      if (lastMsg?.role === "assistant" && !messageVerifications[lastMsg.id]) {
        const content =
          lastMsg.content ||
          lastMsg.parts
            ?.filter((p): p is { type: "text"; text: string } => p.type === "text")
            .map(p => p.text)
            .join("") ||
          "";
        onVerifyMessage(lastMsg.id, content);
      }
    }
    prevIsLoadingRef.current = isLoading;
  }, [isLoading, messages, messageVerifications]);

  const handleFileUpload = async (file: File) => {
    const formData = new FormData();
    formData.append("file", file);
    const res = await fetch("/api/upload", { method: "POST", body: formData });
    const data = await res.json();
    if (res.ok) {
      setFileDataParts(prev => [...prev, data.fileDataPart]);
      setDeepTextPagesByAttachmentId(prev => ({
        ...prev,
        [data.fileDataPart.attachmentId]: data.deepTextPages,
      }));
    }
  };

  // ... render messages with CitationComponent (see Pattern 3 below)
}
```
## Pattern 2: Streaming + When to Verify

A common question: does `verifyAttachment()` work on partial (mid-stream) responses?

No — verification requires the complete LLM output. The LLM appends its `<<<CITATION_DATA>>>` block at the very end of the response. Calling `getAllCitationsFromLlmOutput()` on a partial response will miss the citation data if that block hasn't arrived yet.
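Because the sentinel block arrives last, a client can cheaply check completeness before calling `/api/verify`. A minimal sketch, assuming the `<<<CITATION_DATA>>>` marker mentioned above appears verbatim in the output (`isOutputComplete` is a hypothetical helper, not part of DeepCitation; the authoritative check remains `getAllCitationsFromLlmOutput()` on the server):

```typescript
// Hypothetical completeness guard: the citation block is emitted at the
// very end of the response, so its presence signals a finished output.
const CITATION_SENTINEL = "<<<CITATION_DATA>>>";

function isOutputComplete(llmOutput: string): boolean {
  return llmOutput.includes(CITATION_SENTINEL);
}
```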
The correct pattern (used in the example above) is:
```
Stream begins (isLoading: true)
  → tokens arrive, render incrementally
Stream ends (isLoading: false)
  → collect full message content
  → call /api/verify with complete text
  → citations appear once verified
```
In `useChat`, detect the transition with a ref:
```ts
const prevIsLoadingRef = useRef(false);

useEffect(() => {
  if (prevIsLoadingRef.current && !isLoading) {
    // Streaming just finished — safe to verify
    const lastMsg = messages[messages.length - 1];
    if (lastMsg?.role === "assistant") {
      triggerVerification(lastMsg);
    }
  }
  prevIsLoadingRef.current = isLoading;
}, [isLoading, messages]);
```
Use `useEffectEvent` for the verification handler so it captures the latest `fileDataParts` without becoming a `useEffect` dependency. This avoids re-triggering verification when `fileDataParts` changes.
## Pattern 3: Rendering Citations Client-Side

`CitationComponent` renders an interactive inline citation with a popover showing verification status and a proof image. It's client-only.

In your chat message component:
```tsx
"use client";
import { parseCitationResponse, type Citation, type Verification } from "deepcitation";
import { CitationComponent } from "deepcitation/react";
import ReactMarkdown from "react-markdown";
import remarkGfm from "remark-gfm";
import { CONTINUE, visit } from "unist-util-visit";

// Remark plugin — replaces [N] in text nodes with custom AST nodes,
// keeping markdown formatting (bold, lists, etc.) intact.
// See INTEGRATION.md Recipe 3 for details.
const MARKER_RE = /(\[\d+\])/g;

function remarkCitationMarkers() {
  return (tree: any) => {
    visit(tree, "text", (node: any, index: any, parent: any) => {
      if (index == null || !parent || !node.value) return;
      const parts = node.value.split(MARKER_RE);
      if (parts.length <= 1) return;
      const newNodes = parts.filter(Boolean).map((part: string) => {
        const m = part.match(/^\[(\d+)\]$/);
        if (m) {
          return {
            type: "citation-marker",
            data: { hName: "citation-marker", hProperties: { n: m[1] } },
          };
        }
        return { type: "text", value: part };
      });
      parent.children.splice(index, 1, ...newNodes);
      return [CONTINUE, index + newNodes.length];
    });
  };
}

// Replace [N] citation markers in LLM output with CitationComponent
function RenderWithCitations({
  content,
  citations,
  verifications,
}: {
  content: string;
  citations: Record<string, Citation>;
  verifications: Record<string, Verification>;
}) {
  const result = parseCitationResponse(content);
  return (
    <ReactMarkdown
      remarkPlugins={[remarkGfm, remarkCitationMarkers]}
      components={{
        "citation-marker": ({ n }: { n: string }) => {
          const key = result.markerMap[Number(n)];
          const citation = key ? (citations[key] ?? result.citations[key]) : null;
          if (!key || !citation) return <sup>[{n}]</sup>;
          return <CitationComponent citation={citation} verification={verifications[key]} />;
        },
      }}
    >
      {result.visibleText}
    </ReactMarkdown>
  );
}
```
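The plugin above leans on a `String.prototype.split` behavior: when the pattern contains a capturing group, the matched separators are kept in the result array instead of being discarded. A standalone sketch of just that step (`splitOnMarkers` is a hypothetical helper mirroring `MARKER_RE`):

```typescript
// Capturing group means split() keeps the [N] markers as their own entries.
const MARKER_RE = /(\[\d+\])/g;

function splitOnMarkers(text: string): string[] {
  // filter(Boolean) drops the empty strings split() produces when a
  // marker sits at the start or end of the text.
  return text.split(MARKER_RE).filter(Boolean);
}
```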
## Pattern 4: Static / SSG Pre-Verified Citations

For published articles or documentation where citations are pre-verified at build time, run `verifyAttachment()` during the build and embed the results in your page props.
```tsx
// app/articles/[slug]/page.tsx ← Server Component (no "use client")
import { DeepCitation, getAllCitationsFromLlmOutput } from "deepcitation";

const dc = new DeepCitation({ apiKey: process.env.DEEPCITATION_API_KEY! });

export default async function ArticlePage({ params }: { params: { slug: string } }) {
  const article = await getArticle(params.slug);

  // Citations verified at render time (SSR) or build time (SSG with generateStaticParams)
  const citations = getAllCitationsFromLlmOutput(article.content);
  const { verifications } = await dc.verifyAttachment(article.attachmentId, citations);

  // Pass pre-verified data to the client component
  return <ArticleContent content={article.content} verifications={verifications} />;
}
```

```tsx
// components/ArticleContent.tsx ← "use client" for CitationComponent
"use client";
import type { Verification } from "deepcitation";
import { CitationComponent } from "deepcitation/react";

export function ArticleContent({
  content,
  verifications,
}: {
  content: string;
  verifications: Record<string, Verification>;
}) {
  // Render content with inline citations — verifications already computed server-side
  return <div>{renderWithCitations(content, {}, verifications)}</div>;
}
```
Can `CitationComponent` be server-rendered for SEO? Not with its interactive popover — the component uses React state and browser APIs. For SEO, render the citation text in the Server Component and hydrate the interactive `CitationComponent` on the client.
## Theming with DeepCitationTheme

`DeepCitationTheme` is a provider component that sets CSS custom properties (design tokens) for all nested DeepCitation components. Wrap it around your citation rendering area — typically in your root layout or chat container:
```tsx
import { DeepCitationTheme } from "deepcitation/react";

// In your layout or chat wrapper:
<DeepCitationTheme
  theme={{
    // your --dc-* token overrides — see Styling for the supported keys
  }}
>
  {children}
</DeepCitationTheme>
```
`DeepCitationTheme` renders a `<div>` that applies `--dc-*` CSS variables. Any `CitationComponent` or `CitationDrawer` nested inside will inherit these values. See Styling for the full list of supported tokens.
## Environment Variables

```bash
# .env.local
DEEPCITATION_API_KEY=dc_live_YOUR_API_KEY   # Server-only — never prefix with NEXT_PUBLIC_
OPENAI_API_KEY=sk-your-key                  # Server-only
```
Do not prefix the key with `NEXT_PUBLIC_`. DeepCitation calls run server-side only; exposing the key to the browser would allow anyone to make API calls at your cost.
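To catch misconfiguration early, you could fail fast when the server boots. A minimal sketch under the assumption that you add your own check (`assertServerKey` is a hypothetical helper, not part of DeepCitation):

```typescript
// Hypothetical startup guard: throw if the key is missing, or if a
// NEXT_PUBLIC_-prefixed copy exists that would leak into the browser bundle.
function assertServerKey(env: Record<string, string | undefined>): string {
  const key = env.DEEPCITATION_API_KEY;
  if (!key) {
    throw new Error("DEEPCITATION_API_KEY is not set");
  }
  if (env.NEXT_PUBLIC_DEEPCITATION_API_KEY) {
    throw new Error("Remove NEXT_PUBLIC_DEEPCITATION_API_KEY — the key must stay server-side");
  }
  return key;
}
```

Calling it once in a server-only module (for example where you construct `DeepCitation`) turns a silent misconfiguration into an immediate, readable error.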
## Scaffold This Integration

Clone the working example directly:
```bash
npx degit DeepCitation/deepcitation/examples/nextjs-ai-sdk my-citation-app
cd my-citation-app
cp .env.example .env.local
# Add your keys to .env.local
npm install && npm run dev
```
## Next Steps

- Vercel AI SDK guide — `streamText` and middleware patterns
- Components — full `CitationComponent` and `CitationDrawer` API
- Styling — CSS customization and theming