Chat

Build AI chat interfaces with streaming, reasoning, and tool calling.

Nuxt UI provides a set of components designed to build AI-powered chat interfaces. They integrate seamlessly with the Vercel AI SDK for streaming responses, reasoning, tool calling, and more.

Check out the Nuxt and Vue AI Chat templates on GitHub for production-ready implementations.

Components

Component         Description
ChatMessages      Scrollable message list with auto-scroll and loading indicator.
ChatMessage       Individual message bubble with avatar, actions, and slots.
ChatPrompt        Enhanced textarea for submitting prompts.
ChatPromptSubmit  Submit button with automatic status handling.
ChatReasoning     Collapsible block for the AI reasoning / thinking process.
ChatTool          Collapsible block for AI tool invocation status.
ChatShimmer       Text shimmer animation for streaming states.
ChatPalette       Layout wrapper for embedding chat in modals or drawers.
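
At a glance, the pieces compose into a message list plus a prompt. Here is a bare skeleton; the full wiring, including rendering message parts, is shown in Client Setup below:

<template>
  <!-- `chat`, `input` and `onSubmit` come from the AI SDK's Chat class, see Client Setup -->
  <UChatMessages :messages="chat.messages" :status="chat.status" />

  <UChatPrompt v-model="input" @submit="onSubmit">
    <UChatPromptSubmit :status="chat.status" />
  </UChatPrompt>
</template>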

Installation

The Chat components are designed to be used with the Vercel AI SDK, specifically the Chat class for managing chat state and streaming responses.

Install the required dependencies:

For Nuxt:

pnpm add ai @ai-sdk/gateway @ai-sdk/vue @comark/nuxt

Add @comark/nuxt to your modules:

nuxt.config.ts
export default defineNuxtConfig({
  modules: [
    '@nuxt/ui',
    '@comark/nuxt'
  ]
})
@comark/nuxt provides the Comark component used to render AI responses as streaming Markdown: it incrementally renders tokens as they arrive, avoiding the flicker and re-parsing that traditional Markdown renderers cause. It also automatically enables Nuxt UI's prose components so your content is styled to match your theme.

For Vue:

pnpm add ai @ai-sdk/gateway @ai-sdk/vue @comark/vue

@comark/vue provides the same Comark component with incremental streaming rendering.

To use Nuxt UI's prose components with Comark, enable the prose option in your vite.config.ts:
vite.config.ts
import { defineConfig } from 'vite'
import vue from '@vitejs/plugin-vue'
import ui from '@nuxt/ui/vite'

export default defineConfig({
  plugins: [
    vue(),
    ui({
      prose: true
    })
  ]
})
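
Once installed, Comark renders a Markdown string directly; the two props this guide relies on are markdown and streaming. A minimal sketch, using the explicit Vue import (in Nuxt the component is auto-imported):

<script setup lang="ts">
import { ref } from 'vue'
import { Comark } from '@comark/vue'

const text = ref('**Streaming** markdown renders token by token…')
</script>

<template>
  <Comark :markdown="text" :streaming="true" />
</template>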

Server Setup

Create a server API endpoint to handle chat requests using streamText. You can use the Vercel AI Gateway to access AI models through a centralized endpoint:

server/api/chat.post.ts
import { streamText, convertToModelMessages } from 'ai'
import { gateway } from '@ai-sdk/gateway'

export default defineEventHandler(async (event) => {
  const { messages } = await readBody(event)

  return streamText({
    model: gateway('anthropic/claude-sonnet-4.6'),
    maxOutputTokens: 10000,
    system: 'You are a helpful assistant.',
    messages: await convertToModelMessages(messages)
  }).toUIMessageStreamResponse()
})
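
The gateway provider authenticates through the environment. Assuming you use an API key rather than Vercel OIDC, set the AI_GATEWAY_API_KEY variable the provider reads by default:

.env
AI_GATEWAY_API_KEY=your-api-key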

Reasoning

To enable reasoning, configure providerOptions for your provider (Anthropic, Google, OpenAI):

server/api/chat.post.ts
import { streamText, convertToModelMessages } from 'ai'
import { gateway } from '@ai-sdk/gateway'

export default defineEventHandler(async (event) => {
  const { messages } = await readBody(event)

  return streamText({
    model: gateway('anthropic/claude-sonnet-4.6'),
    maxOutputTokens: 10000,
    system: 'You are a helpful assistant.',
    messages: await convertToModelMessages(messages),
    providerOptions: {
      anthropic: {
        thinking: {
          type: 'adaptive'
        },
        effort: 'low'
      },
      google: {
        thinkingConfig: {
          includeThoughts: true,
          thinkingLevel: 'low'
        }
      },
      openai: {
        reasoningEffort: 'low',
        reasoningSummary: 'detailed'
      }
    }
  }).toUIMessageStreamResponse()
})

Web Search

Some providers offer built-in web search tools: Anthropic, Google, OpenAI. The example below uses Anthropic's web search tool, which additionally requires the @ai-sdk/anthropic package (pnpm add @ai-sdk/anthropic):

server/api/chat.post.ts
import { streamText, convertToModelMessages } from 'ai'
import { anthropic } from '@ai-sdk/anthropic'
import { gateway } from '@ai-sdk/gateway'

export default defineEventHandler(async (event) => {
  const { messages } = await readBody(event)

  return streamText({
    model: gateway('anthropic/claude-sonnet-4.6'),
    system: 'You are a helpful assistant.',
    messages: await convertToModelMessages(messages),
    tools: {
      web_search: anthropic.tools.webSearch_20250305({})
    }
  }).toUIMessageStreamResponse()
})

MCP Client

Empower your chatbot with advanced tool-calling features using the Model Context Protocol (MCP) from @ai-sdk/mcp. MCP enables your AI to perform dynamic actions, such as searching your documentation or executing custom tasks, to provide more relevant and accurate responses.

To get started, install the MCP package:

pnpm add @ai-sdk/mcp

Then, configure your server endpoint to use MCP tools:

server/api/chat.post.ts
import { streamText, convertToModelMessages, stepCountIs } from 'ai'
import { createMCPClient } from '@ai-sdk/mcp'
import { gateway } from '@ai-sdk/gateway'

export default defineEventHandler(async (event) => {
  const { messages } = await readBody(event)

  const httpClient = await createMCPClient({
    transport: { type: 'http', url: 'https://your-app.com/mcp' }
  })
  const tools = await httpClient.tools()

  return streamText({
    model: gateway('anthropic/claude-sonnet-4.6'),
    maxOutputTokens: 10000,
    system: 'You are a helpful assistant. Use your tools to search for relevant information before answering questions.',
    messages: await convertToModelMessages(messages),
    stopWhen: stepCountIs(6),
    tools,
    onFinish: async () => {
      await httpClient.close()
    },
    onError: async ({ error }) => {
      console.error(error)
      await httpClient.close()
    }
  }).toUIMessageStreamResponse()
})
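
httpClient.tools() returns every tool the MCP server exposes, keyed by tool name. If you only want the model to see some of them, forward a subset instead. A sketch, where search_docs is a hypothetical tool name:

// inside the event handler, instead of forwarding everything:
const allTools = await httpClient.tools()
// hypothetical tool name: only expose the docs-search tool to the model
const tools = { search_docs: allTools.search_docs }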

Client Setup

Use the Chat class from @ai-sdk/vue to manage chat state and connect to your server endpoint.

For Nuxt:

<script setup lang="ts">
import { isReasoningUIPart, isTextUIPart, isToolUIPart, getToolName } from 'ai'
import { Chat } from '@ai-sdk/vue'
import { isPartStreaming, isToolStreaming } from '@nuxt/ui/utils/ai'
import highlight from '@comark/nuxt/plugins/highlight'

const input = ref('')

const chat = new Chat({
  onError(error) {
    console.error(error)
  }
})

function onSubmit() {
  chat.sendMessage({ text: input.value })

  input.value = ''
}
</script>

<template>
  <UChatMessages
    :messages="chat.messages"
    :status="chat.status"
  >
    <template #content="{ message }">
      <template
        v-for="(part, index) in message.parts"
        :key="`${message.id}-${part.type}-${index}`"
      >
        <UChatReasoning
          v-if="isReasoningUIPart(part)"
          :text="part.text"
          :streaming="isPartStreaming(part)"
        >
          <Comark
            :markdown="part.text"
            :streaming="isPartStreaming(part)"
            :plugins="[highlight()]"
            class="*:first:mt-0 *:last:mb-0"
          />
        </UChatReasoning>

        <UChatTool
          v-else-if="isToolUIPart(part)"
          :text="getToolName(part)"
          :streaming="isToolStreaming(part)"
        />

        <template v-else-if="isTextUIPart(part)">
          <Comark
            v-if="message.role === 'assistant'"
            :markdown="part.text"
            :streaming="isPartStreaming(part)"
            :plugins="[highlight()]"
            class="*:first:mt-0 *:last:mb-0"
          />
          <p v-else-if="message.role === 'user'" class="whitespace-pre-wrap">
            {{ part.text }}
          </p>
        </template>
      </template>
    </template>
  </UChatMessages>

  <UChatPrompt
    v-model="input"
    :error="chat.error"
    @submit="onSubmit"
  >
    <UChatPromptSubmit
      :status="chat.status"
      @stop="chat.stop()"
      @reload="chat.regenerate()"
    />
  </UChatPrompt>
</template>
For Vue, the setup is identical apart from the explicit imports:

<script setup lang="ts">
import { ref } from 'vue'
import { isReasoningUIPart, isTextUIPart, isToolUIPart, getToolName } from 'ai'
import { Chat } from '@ai-sdk/vue'
import { isPartStreaming, isToolStreaming } from '@nuxt/ui/utils/ai'
import { Comark } from '@comark/vue'
import highlight from '@comark/vue/plugins/highlight'

const input = ref('')

const chat = new Chat({
  onError(error) {
    console.error(error)
  }
})

function onSubmit() {
  chat.sendMessage({ text: input.value })

  input.value = ''
}
</script>

The <template> is identical to the Nuxt version above.
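
By default, Chat posts to /api/chat, which is exactly the route created in Server Setup. If your endpoint lives elsewhere, point the client at it through a transport. A sketch using the AI SDK's DefaultChatTransport; the /api/v2/chat path is hypothetical:

import { Chat } from '@ai-sdk/vue'
import { DefaultChatTransport } from 'ai'

const chat = new Chat({
  // hypothetical custom route; without a transport, Chat posts to '/api/chat'
  transport: new DefaultChatTransport({ api: '/api/v2/chat' })
})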
For reusable Comark configuration (plugins, class, etc.), use defineComarkComponent to create a custom component instead of passing props inline each time.
For Nuxt:

components/chat/Comark.ts
import highlight from '@comark/nuxt/plugins/highlight'

export default defineComarkComponent({
  name: 'ChatComark',
  plugins: [highlight()],
  class: '*:first:mt-0 *:last:mb-0'
})
For Vue:

components/chat/Comark.ts
import { defineComarkComponent } from '@comark/vue'
import highlight from '@comark/vue/plugins/highlight'

export default defineComarkComponent({
  name: 'ChatComark',
  plugins: [highlight()],
  class: '*:first:mt-0 *:last:mb-0'
})
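The exported component can then replace the inline-configured Comark from Client Setup (usage sketch; in Nuxt the file under components/ is auto-registered, in Vue import it where needed):

<ChatComark
  :markdown="part.text"
  :streaming="isPartStreaming(part)"
/>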
When using the highlight plugin, add the following CSS to your stylesheet to support dark mode:
main.css
html.dark .shiki span {
  color: var(--shiki-dark) !important;
  background-color: var(--shiki-dark-bg) !important;
  font-style: var(--shiki-dark-font-style) !important;
  font-weight: var(--shiki-dark-font-weight) !important;
  text-decoration: var(--shiki-dark-text-decoration) !important;
}
Read the full Build an AI Chatbot tutorial for a step-by-step guide.