Assistant
About the Assistant
The TockDocs assistant is a docs-grounded chat UI built on top of:
- AI SDK for chat and model orchestration
- MCP for documentation tools
- FlexSearch for primary full-text retrieval
- Fuse.js for fuzzy fallback retrieval
It is designed to answer questions from your documentation, not from generic model memory alone.
When users ask questions, the assistant:
- searches and retrieves relevant content through the built-in MCP server
- searches full page content, not just titles
- uses fuzzy fallback retrieval when exact search is weak
- cites sources with links back to the docs
- can generate code examples grounded in the retrieved docs
How it Works
The runtime flow is tool-grounded:
- The client sends chat messages to /__tockdocs__/assistant.
- The server resolves the active provider and model.
- When the request comes from a docs page, TockDocs scopes the request to the current knowledge base and locale when possible.
- The assistant exposes MCP tools to the model.
- search-pages performs retrieval with FlexSearch first and a Fuse.js fallback when needed.
- The model streams a grounded answer back to the UI.
By default, the assistant connects to your built-in MCP server at /mcp, which exposes search-pages, list-pages, and get-page.
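The FlexSearch-first, Fuse.js-fallback strategy can be sketched as plain control flow. In this sketch, exactSearch and fuzzySearch are illustrative stand-ins (a naive substring match and an in-order character match) for FlexSearch and Fuse.js; they are not the actual TockDocs implementation:

```javascript
// Sketch of the retrieval strategy: try exact full-text search first,
// fall back to fuzzy matching when the exact pass comes up empty.
// exactSearch/fuzzySearch are stand-ins for FlexSearch and Fuse.js.
const pages = [
  { id: 'install', title: 'Installation', body: 'How to install TockDocs with npm' },
  { id: 'theme', title: 'Theming', body: 'Customize colors and fonts' },
]

// Stand-in for FlexSearch: naive substring match on the page body.
function exactSearch(query) {
  const q = query.toLowerCase()
  return pages.filter(p => p.body.toLowerCase().includes(q))
}

// Stand-in for Fuse.js: match if most query characters appear in order.
function fuzzySearch(query) {
  const q = query.toLowerCase()
  return pages.filter((p) => {
    let i = 0
    for (const ch of p.body.toLowerCase()) {
      if (ch === q[i]) i++
    }
    return i >= Math.ceil(q.length * 0.8)
  })
}

// Exact results win; fuzzy retrieval only runs when exact search is weak.
function searchPages(query) {
  const exact = exactSearch(query)
  return exact.length > 0 ? exact : fuzzySearch(query)
}
```

The key property is that the fuzzy pass only runs when the exact pass returns nothing, so precise matches are never diluted by fuzzy noise.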
Quick Start
1. Configure a model provider
TockDocs supports Vercel AI Gateway, plus:
- OpenRouter
- DeepSeek
- Nvidia
- Hugging Face
- Groq
- GitHub Models
- Gemini
- Cloudflare Workers AI
Vercel AI Gateway
```bash
AI_PROVIDER=vercel
AI_MODEL=google/gemini-3-flash
# Use AI_GATEWAY_API_KEY for manual keys; Vercel reserves the VERCEL_* prefix.
AI_GATEWAY_API_KEY=your-api-key
```
OIDC (Vercel deployments only): VERCEL_OIDC_TOKEN is injected automatically, so there is nothing to add in production. For local development, run vercel env pull on a linked project.
Example with another provider
```bash
AI_PROVIDER=deepseek
DEEPSEEK_API_KEY=your-api-key
DEEPSEEK_MODEL=deepseek-chat
```
If AI_PROVIDER is unset, TockDocs auto-detects the first configured provider from your available credentials.
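Auto-detection can be sketched as a first-match scan over known credential variables. The candidate list and its order below are illustrative, not TockDocs' exact logic, and GROQ_API_KEY is an assumed variable name (only AI_GATEWAY_API_KEY and DEEPSEEK_API_KEY appear on this page):

```javascript
// Sketch: pick the first provider whose credential env var is set.
// The candidate list and order are illustrative assumptions.
const CANDIDATES = [
  { provider: 'vercel', envKey: 'AI_GATEWAY_API_KEY' },
  { provider: 'deepseek', envKey: 'DEEPSEEK_API_KEY' },
  { provider: 'groq', envKey: 'GROQ_API_KEY' }, // assumed variable name
]

function detectProvider(env) {
  // An explicit AI_PROVIDER always wins over auto-detection.
  if (env.AI_PROVIDER) return env.AI_PROVIDER
  const match = CANDIDATES.find(c => env[c.envKey])
  return match ? match.provider : undefined
}
```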
2. Run or deploy your site
In development, the assistant UI is enabled automatically.
In production, it becomes available when either:
- NUXT_PUBLIC_ASSISTANT_ENABLED=true is set, or
- supported provider credentials are present
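The enablement rule can be expressed as a small predicate. This is a sketch: hasProviderCredentials is a hypothetical helper standing in for TockDocs' internal credential check, and the precedence of an explicit false flag over dev mode is an assumption:

```javascript
// Sketch of the assistant enablement rule:
// dev: always on; prod: explicit flag or detected credentials.
// The opt-out precedence is an assumption, not documented behavior.
function isAssistantEnabled({ dev, env, hasProviderCredentials }) {
  // Assumed: an explicit opt-out wins everywhere.
  if (env.NUXT_PUBLIC_ASSISTANT_ENABLED === 'false') return false
  if (dev) return true
  if (env.NUXT_PUBLIC_ASSISTANT_ENABLED === 'true') return true
  return hasProviderCredentials
}
```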
Using the Assistant
Floating Input
On documentation pages, a floating input appears at the bottom of the screen. Users can type their questions directly and press Enter to get answers.
Explain with AI
Each documentation page includes an Explain with AI button in the table of contents sidebar. Clicking it opens the assistant with the current page as context.
Slideover Chat
When a conversation starts, a slideover panel opens on the right side of the screen and keeps the conversation state visible.
UI Configuration
Configure the assistant UI through app.config.ts:
```ts
export default defineAppConfig({
  assistant: {
    floatingInput: true,
    explainWithAi: true,
    faqQuestions: [],
    shortcuts: {
      focusInput: 'meta_i',
    },
    icons: {
      trigger: 'i-lucide-sparkles',
      explain: 'i-lucide-brain',
    },
  },
})
```
FAQ Questions
Display suggested questions when the chat is empty.
Simple format
```ts
export default defineAppConfig({
  assistant: {
    faqQuestions: [
      'How do I install TockDocs?',
      'How do I customize the theme?',
      'How do I add components to my pages?',
    ],
  },
})
```
Category format
```ts
export default defineAppConfig({
  assistant: {
    faqQuestions: [
      {
        category: 'Getting Started',
        items: ['How do I install TockDocs?', 'What is the project structure?'],
      },
      {
        category: 'Customization',
        items: ['How do I change the theme colors?', 'How do I add a custom logo?'],
      },
    ],
  },
})
```
Localized format
```ts
export default defineAppConfig({
  assistant: {
    faqQuestions: {
      en: [{ category: 'Getting Started', items: ['How do I install?'] }],
      zh: [{ category: '快速入门', items: ['如何安装?'] }],
    },
  },
})
```
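All three formats reduce to the category shape for a given locale. A minimal normalizer sketch (illustrative only; TockDocs' own normalization and locale fallback may differ):

```javascript
// Sketch: normalize any faqQuestions format to [{ category, items }]
// for a given locale. Not the TockDocs internals.
function normalizeFaq(faqQuestions, locale = 'en') {
  // Localized format: a map of locale -> category list.
  const value = Array.isArray(faqQuestions)
    ? faqQuestions
    : (faqQuestions[locale] ?? [])
  // Simple format: a flat list of strings becomes one unnamed category.
  if (value.every(item => typeof item === 'string')) {
    return value.length ? [{ category: '', items: value }] : []
  }
  // Category format: already the target shape.
  return value
}
```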
Keyboard Shortcuts
```ts
export default defineAppConfig({
  assistant: {
    shortcuts: {
      focusInput: 'meta_k',
    },
  },
})
```
Common examples:
- meta_i
- meta_k
- ctrl_shift_p
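Shortcut strings are underscore-separated key names: modifiers first, then the final key. A hypothetical parser illustrating the format (TockDocs handles this internally):

```javascript
// Sketch: split a shortcut string like 'ctrl_shift_p' into
// its modifiers and the final key. Illustrative, not the internal parser.
const MODIFIERS = new Set(['meta', 'ctrl', 'alt', 'shift'])

function parseShortcut(shortcut) {
  const parts = shortcut.split('_')
  const key = parts[parts.length - 1]
  const modifiers = parts.slice(0, -1).filter(p => MODIFIERS.has(p))
  return { modifiers, key }
}
```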
Advanced Configuration
Configure advanced runtime options in nuxt.config.ts under tockdocs.assistant:
```ts
export default defineNuxtConfig({
  tockdocs: {
    assistant: {
      provider: 'vercel',
      model: 'google/gemini-3-flash',
      mcpServer: '/mcp',
      apiPath: '/__tockdocs__/assistant',
    },
  },
})
```
The top-level assistant config is still read for compatibility, but it is deprecated. Prefer tockdocs.assistant.
MCP server selection
You can:
- use the built-in MCP server at /mcp
- point the assistant at an external MCP server URL
```ts
export default defineNuxtConfig({
  tockdocs: {
    assistant: {
      mcpServer: 'https://other-docs.example.com/mcp',
    },
  },
})
```
Provider and model overrides
You can explicitly pin a provider:
```ts
export default defineNuxtConfig({
  tockdocs: {
    assistant: {
      provider: 'deepseek',
    },
  },
})
```
And optionally override the model used by that provider:
```ts
export default defineNuxtConfig({
  tockdocs: {
    assistant: {
      model: 'anthropic/claude-opus-4.5',
    },
  },
})
```
Disable Features
Disable the floating input
```ts
export default defineAppConfig({
  assistant: {
    floatingInput: false,
  },
})
```
Disable “Explain with AI”
```ts
export default defineAppConfig({
  assistant: {
    explainWithAi: false,
  },
})
```
Disable the assistant entirely
The assistant UI is disabled when no supported provider credentials are available and NUXT_PUBLIC_ASSISTANT_ENABLED is not set to true. To force it off even when credentials are present, set:

```bash
NUXT_PUBLIC_ASSISTANT_ENABLED=false
```
Programmatic Access
Use the useAssistant composable to control the assistant programmatically:
```vue
<script setup>
const { isEnabled, isOpen, open, close, toggle } = useAssistant()

function askQuestion() {
  open('How do I configure the theme?', true)
}
</script>

<template>
  <UButton v-if="isEnabled" @click="askQuestion">
    Ask about themes
  </UButton>
</template>
```
Composable API
| Property | Type | Description |
|---|---|---|
| isEnabled | ComputedRef<boolean> | Whether the assistant UI is enabled in the current runtime |
| isOpen | Ref<boolean> | Whether the slideover is open |
| open(message?, clearPrevious?) | Function | Open the assistant, optionally with a message |
| close() | Function | Close the assistant slideover |
| toggle() | Function | Toggle the assistant open/closed |
| clearMessages() | Function | Clear the conversation history |