Introduction
AI capability keeps getting cheaper, and the barrier to integrating it keeps dropping. This article documents how to add an AI chat feature to a UniApp project, using TypeScript and the script-setup syntax; the code can be copied and used directly.
1. Project Setup
1.1 Creating the UniApp Project
Create the project from the Vite template, with Vue 3 and TypeScript support:
npx degit dcloudio/uni-preset-vue#vite-ts my-ai-app
cd my-ai-app
pnpm install
1.2 Installing Dependencies
pnpm add axios
pnpm add -D @types/node
2. The AI Service Layer
2.1 The AIService Class
Create service/ai-service.ts and put the core streaming-response logic in one class:
import axios, { AxiosInstance, AxiosError } from 'axios'

export interface AIMessage {
  role: 'user' | 'assistant' | 'system'
  content: string
}

export interface AIStreamChunk {
  id: string
  choices: {
    delta: { content?: string }
    index: number
    finish_reason?: string
  }[]
}

export type MessageHandler = (text: string, done: boolean) => void
export type ErrorHandler = (error: Error) => void

class AIService {
  private client: AxiosInstance
  private abortController: AbortController | null = null

  constructor(baseURL: string, apiKey: string) {
    this.client = axios.create({
      baseURL,
      timeout: 60000,
      // axios >= 1.7: the fetch adapter is needed for responseType
      // 'stream' to yield a ReadableStream in browser/H5 environments
      adapter: 'fetch',
      headers: {
        'Content-Type': 'application/json',
        'Authorization': `Bearer ${apiKey}`
      }
    })
  }

  async sendMessage(
    messages: AIMessage[],
    onMessage: MessageHandler,
    onError?: ErrorHandler
  ): Promise<void> {
    this.abortController = new AbortController()
    try {
      const response = await this.client.post('/chat/completions', {
        model: 'gpt-3.5-turbo',
        messages,
        stream: true
      }, {
        responseType: 'stream',
        signal: this.abortController.signal
      })
      const reader = (response.data as ReadableStream<Uint8Array>).getReader()
      const decoder = new TextDecoder()
      let buffer = ''
      while (true) {
        const { done, value } = await reader.read()
        if (done) break
        buffer += decoder.decode(value, { stream: true })
        // A network chunk may end mid-line; keep the trailing
        // partial line in the buffer for the next iteration
        const lines = buffer.split('\n')
        buffer = lines.pop() || ''
        for (const line of lines) {
          if (line.startsWith('data: ')) {
            const data = line.slice(6)
            if (data === '[DONE]') {
              onMessage('', true)
              return
            }
            try {
              const chunk: AIStreamChunk = JSON.parse(data)
              const content = chunk.choices[0]?.delta?.content
              if (content) {
                onMessage(content, false)
              }
            } catch {
              // ignore malformed SSE payloads
            }
          }
        }
      }
      onMessage('', true)
    } catch (error) {
      if (axios.isAxiosError(error)) {
        const axiosError = error as AxiosError
        if (axiosError.code === 'ECONNABORTED') {
          onError?.(new Error('Request timed out, please check your network'))
        } else {
          onError?.(new Error(axiosError.message))
        }
      } else {
        onError?.(error as Error)
      }
    }
  }

  stop(): void {
    this.abortController?.abort()
  }
}

export default AIService
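The buffer handling in the read loop above matters because a network chunk can end in the middle of an SSE line. The same logic, extracted as a stand-alone helper for illustration (splitSSE is a name introduced here, not part of the service class):

```typescript
// Stand-alone version of the read loop's line buffering: feed it raw
// SSE text in arbitrary chunks; it returns the complete `data:` payloads
// and the leftover partial line to carry into the next call.
export function splitSSE(buffer: string): { payloads: string[]; rest: string } {
  const lines = buffer.split('\n')
  const rest = lines.pop() || '' // last element may be an incomplete line
  const payloads = lines
    .filter((line) => line.startsWith('data: '))
    .map((line) => line.slice(6))
  return { payloads, rest }
}
```

Feeding it a chunk that ends mid-payload leaves the fragment in rest, so no token is ever dropped or parsed twice.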
2.2 Service Configuration
Export a pre-configured instance per provider, so switching vendors is a one-line change:
import AIService from './ai-service'

// Vite only exposes env variables prefixed with VITE_ to client code;
// process.env is not available in the browser bundle

// OpenAI
export const openAIService = new AIService(
  'https://api.openai.com/v1',
  import.meta.env.VITE_OPENAI_API_KEY || ''
)

// Alibaba Cloud Bailian (DashScope)
export const bailianService = new AIService(
  'https://dashscope.aliyuncs.com/compatible-mode/v1',
  import.meta.env.VITE_DASHSCOPE_API_KEY || ''
)

// Zhipu AI
export const zhipuService = new AIService(
  'https://open.bigmodel.cn/api/paas/v4',
  import.meta.env.VITE_ZHIPU_API_KEY || ''
)
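If you prefer an actual factory over fixed instances, a small sketch keyed on provider name (the endpoint table mirrors the URLs above; aiServiceArgs is a name introduced here):

```typescript
// OpenAI-compatible endpoints keyed by provider name.
const PROVIDER_ENDPOINTS = {
  openai: 'https://api.openai.com/v1',
  bailian: 'https://dashscope.aliyuncs.com/compatible-mode/v1',
  zhipu: 'https://open.bigmodel.cn/api/paas/v4',
} as const

export type ProviderName = keyof typeof PROVIDER_ENDPOINTS

// Factory: map a provider name to the constructor arguments for AIService,
// so call sites read `new AIService(...aiServiceArgs('zhipu', key))`.
export function aiServiceArgs(
  provider: ProviderName,
  apiKey: string
): [string, string] {
  return [PROVIDER_ENDPOINTS[provider], apiKey]
}
```

The union type on provider means a typo in the provider name fails at compile time rather than at request time.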
3. A Composable Wrapper
3.1 useAIChat
Wrap the chat logic in a Vue 3 Composition API composable so any page can reuse it:
import { ref, reactive } from 'vue'
import AIService, { AIMessage, MessageHandler, ErrorHandler } from '@/service/ai-service'

export interface UseAIChatOptions {
  baseURL?: string
  apiKey: string
  systemPrompt?: string
}

// Module-level state, shared by every component that calls useAIChat
const isLoading = ref(false)
const messages = reactive<AIMessage[]>([])
const currentResponse = ref('')
const error = ref<string | null>(null)
let aiService: AIService | null = null

export function useAIChat(options: UseAIChatOptions) {
  const {
    baseURL = 'https://api.openai.com/v1',
    apiKey,
    systemPrompt
  } = options

  if (!aiService) {
    aiService = new AIService(baseURL, apiKey)
  }
  if (systemPrompt && messages.length === 0) {
    messages.push({ role: 'system', content: systemPrompt })
  }

  const handleMessage: MessageHandler = (text, done) => {
    if (done) {
      if (currentResponse.value) {
        messages.push({ role: 'assistant', content: currentResponse.value })
        currentResponse.value = ''
      }
    } else {
      currentResponse.value += text
    }
  }

  const handleError: ErrorHandler = (err) => {
    error.value = err.message
    isLoading.value = false
  }

  const sendMessage = async (content: string): Promise<void> => {
    if (isLoading.value) return
    error.value = null
    isLoading.value = true
    currentResponse.value = ''
    messages.push({ role: 'user', content })
    try {
      await aiService!.sendMessage([...messages], handleMessage, handleError)
    } finally {
      isLoading.value = false
    }
  }

  const clearMessages = (): void => {
    messages.splice(0, messages.length)
    if (systemPrompt) {
      messages.push({ role: 'system', content: systemPrompt })
    }
    error.value = null
  }

  const stopGeneration = (): void => {
    aiService?.stop()
    isLoading.value = false
    if (currentResponse.value) {
      messages.push({ role: 'assistant', content: currentResponse.value })
      currentResponse.value = ''
    }
  }

  return {
    isLoading,
    messages,
    currentResponse,
    error,
    sendMessage,
    clearMessages,
    stopGeneration
  }
}
4. The Page Component
4.1 The AI Chat Page
A complete chat screen with a message list, streamed output, and a stop button:
<template>
  <view class="chat-container">
    <scroll-view
      class="message-list"
      scroll-y
      :scroll-into-view="scrollIntoView"
    >
      <view
        v-for="(msg, index) in messages"
        :key="index"
        :id="'msg-' + index"
        :class="['message-item', msg.role]"
      >
        <view class="avatar">
          {{ msg.role === 'user' ? 'Me' : 'AI' }}
        </view>
        <view class="message-content">
          <rich-text :nodes="formatMessage(msg.content)"></rich-text>
        </view>
      </view>
      <view v-if="isLoading" class="message-item assistant">
        <view class="avatar">AI</view>
        <view class="message-content typing">
          <text>{{ currentResponse }}</text>
          <text class="cursor">|</text>
        </view>
      </view>
      <view v-if="error" class="error-tip">
        <text>Error: {{ error }}</text>
      </view>
    </scroll-view>
    <view class="input-area">
      <button v-if="isLoading" class="stop-btn" @tap="stopGeneration">Stop</button>
      <input
        v-model="inputText"
        class="message-input"
        placeholder="Ask me anything..."
        confirm-type="send"
        @confirm="handleSend"
      />
      <button
        class="send-btn"
        :disabled="!inputText.trim() || isLoading"
        @tap="handleSend"
      >Send</button>
    </view>
  </view>
</template>
<script setup lang="ts">
import { ref, nextTick } from 'vue'
import { useAIChat } from '@/composables/useAIChat'

const inputText = ref('')
const scrollIntoView = ref('')

const {
  isLoading,
  messages,
  currentResponse,
  error,
  sendMessage,
  clearMessages,
  stopGeneration
} = useAIChat({
  apiKey: import.meta.env.VITE_AI_API_KEY || '',
  baseURL: import.meta.env.VITE_AI_BASE_URL || 'https://api.openai.com/v1',
  systemPrompt: 'You are a professional technical assistant. Answer questions concisely and clearly.'
})

const handleSend = async () => {
  const text = inputText.value.trim()
  if (!text) return
  inputText.value = ''
  await sendMessage(text)
  await nextTick()
  scrollIntoView.value = 'msg-' + (messages.length - 1)
}

// Minimal inline-markdown rendering: line breaks, inline code, bold
const formatMessage = (content: string): string => {
  return content
    .replace(/\n/g, '<br/>')
    .replace(/`([^`]+)`/g, '<code style="background:#f0f0f0;padding:2rpx 8rpx;border-radius:4rpx;">$1</code>')
    .replace(/\*\*([^*]+)\*\*/g, '<strong>$1</strong>')
}
</script>
<style lang="scss" scoped>
.chat-container {
  display: flex;
  flex-direction: column;
  height: 100vh;
  background: #f5f5f5;
}
.message-list {
  flex: 1;
  padding: 20rpx;
}
.message-item {
  display: flex;
  margin-bottom: 30rpx;
  align-items: flex-start;
  &.user {
    flex-direction: row-reverse;
    .message-content {
      background: linear-gradient(135deg, #007aff, #5856d6);
      color: #fff;
      margin-right: 20rpx;
    }
  }
  &.assistant .message-content {
    background: #fff;
    color: #333;
    margin-left: 20rpx;
  }
}
.avatar {
  width: 80rpx;
  height: 80rpx;
  border-radius: 50%;
  background: linear-gradient(135deg, #667eea 0%, #764ba2 100%);
  color: #fff;
  display: flex;
  align-items: center;
  justify-content: center;
  font-size: 24rpx;
  font-weight: bold;
  flex-shrink: 0;
}
.user .avatar {
  background: linear-gradient(135deg, #11998e 0%, #38ef7d 100%);
}
.message-content {
  max-width: 70%;
  padding: 24rpx;
  border-radius: 24rpx;
  line-height: 1.8;
  word-break: break-word;
  box-shadow: 0 4rpx 12rpx rgba(0, 0, 0, 0.08);
  &.typing .cursor {
    animation: blink 1s infinite;
  }
}
@keyframes blink {
  0%, 100% { opacity: 1; }
  50% { opacity: 0; }
}
.error-tip {
  padding: 20rpx;
  background: #fff0f0;
  color: #f56c6c;
  border-radius: 12rpx;
  text-align: center;
  margin: 20rpx 0;
}
.input-area {
  display: flex;
  align-items: center;
  padding: 20rpx;
  background: #fff;
  border-top: 1px solid #eee;
  gap: 20rpx;
}
.message-input {
  flex: 1;
  height: 80rpx;
  padding: 0 30rpx;
  background: #f5f5f5;
  border-radius: 40rpx;
  font-size: 28rpx;
}
.send-btn, .stop-btn {
  height: 80rpx;
  padding: 0 40rpx;
  background: linear-gradient(135deg, #667eea 0%, #764ba2 100%);
  color: #fff;
  border-radius: 40rpx;
  font-size: 28rpx;
  border: none;
  line-height: 80rpx;
}
.stop-btn {
  background: linear-gradient(135deg, #f093fb 0%, #f5576c 100%);
}
.send-btn[disabled] {
  opacity: 0.5;
}
</style>
5. Environment Variables
In .env.development:
VITE_AI_API_KEY=sk-your-api-key-here
VITE_AI_BASE_URL=https://api.openai.com/v1
In .env.production:
VITE_AI_API_KEY=sk-prod-api-key-here
VITE_AI_BASE_URL=https://api.openai.com/v1
Note: in production, route AI requests through a server-side proxy that holds the API key; never write the key into front-end code.
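A minimal sketch of such a relay, using Node's built-in http server and the global fetch (Node 18+). The port, route, and upstream URL are assumptions, and production code would add its own auth and rate limiting:

```typescript
import http from 'node:http'

// Headers sent upstream; the browser client never sees the real key.
export function upstreamHeaders(apiKey: string): Record<string, string> {
  return {
    'Content-Type': 'application/json',
    Authorization: `Bearer ${apiKey}`,
  }
}

// Start a relay on `port`: the app posts to /chat/completions here,
// and only this server process holds the vendor API key.
export function startProxy(port: number, apiKey: string): http.Server {
  const UPSTREAM = 'https://api.openai.com/v1/chat/completions'
  return http
    .createServer(async (req, res) => {
      if (req.method !== 'POST' || req.url !== '/chat/completions') {
        res.writeHead(404).end()
        return
      }
      // Collect the request body, then forward it with the key injected
      const body: Buffer[] = []
      for await (const chunk of req) body.push(chunk as Buffer)
      const upstream = await fetch(UPSTREAM, {
        method: 'POST',
        headers: upstreamHeaders(apiKey),
        body: Buffer.concat(body),
      })
      res.writeHead(upstream.status, { 'Content-Type': 'text/event-stream' })
      // Relay the SSE stream to the client chunk by chunk
      if (upstream.body) {
        for await (const chunk of upstream.body) res.write(chunk)
      }
      res.end()
    })
    .listen(port)
}
```

The front end then points VITE_AI_BASE_URL at this server and sends no Authorization header at all.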
6. Suggested Improvements
6.1 Retry with Exponential Backoff
const MAX_RETRIES = 3
const RETRY_DELAY = 1000

async function sendWithRetry(
  messages: AIMessage[],
  retries = 0
): Promise<void> {
  try {
    return await sendMessage(messages)
  } catch (error) {
    if (retries < MAX_RETRIES) {
      // Exponential backoff: 1s, 2s, 4s
      await new Promise(r =>
        setTimeout(r, RETRY_DELAY * Math.pow(2, retries))
      )
      return sendWithRetry(messages, retries + 1)
    }
    throw error
  }
}
6.2 Persisting Chat History Locally
import { onMounted } from 'vue'
import { onUnload } from '@dcloudio/uni-app'

const saveHistory = () => {
  uni.setStorageSync('chat_history', JSON.stringify(messages))
}

const loadHistory = () => {
  try {
    const history = uni.getStorageSync('chat_history')
    if (history) {
      messages.push(...JSON.parse(history))
    }
  } catch (e) {
    console.error('Failed to load chat history', e)
  }
}

onMounted(() => loadHistory())
onUnload(() => saveHistory())
7. FAQ
Q1: Streaming does not work on the App side?
UniApp's support for streaming responses on the App side is limited. Options:
- Fall back to the non-streaming API (slower to first answer, but stable)
- Use uni.request's data-received event
- Switch to WebSocket communication
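Of the options above, the non-streaming fallback is the simplest. A sketch (the endpoint and model are placeholders; chatOnce and extractReply are names introduced here, and only extractReply is pure logic that runs outside the UniApp runtime):

```typescript
// `uni` is provided globally by the UniApp runtime
declare const uni: any

interface AIMessage {
  role: 'user' | 'assistant' | 'system'
  content: string
}

// Shape of a non-streaming chat completion response
interface CompletionResponse {
  choices: { message: { role: string; content: string } }[]
}

// Pure helper: pull the assistant reply out of a completion payload
export function extractReply(resp: CompletionResponse): string {
  return resp.choices[0]?.message?.content ?? ''
}

// Non-streaming fallback: one request, one full answer
export function chatOnce(messages: AIMessage[], apiKey: string): Promise<string> {
  return new Promise((resolve, reject) => {
    uni.request({
      url: 'https://api.openai.com/v1/chat/completions',
      method: 'POST',
      header: {
        'Content-Type': 'application/json',
        Authorization: `Bearer ${apiKey}`,
      },
      data: { model: 'gpt-3.5-turbo', messages, stream: false },
      success: (res: any) => resolve(extractReply(res.data as CompletionResponse)),
      fail: (err: any) => reject(new Error(err.errMsg)),
    })
  })
}
```

The user waits for the whole answer instead of watching it stream in, so pair this with a loading indicator.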
Q2: Need content moderation?
Add a moderation layer on the server side. Candidate services:
- Alibaba Cloud Content Moderation
- Tencent Cloud Text Moderation
- Baidu Text Censoring
Q3: How do I get multi-turn conversations?
Send the message history along with each request; the API treats the whole array as context:
const allMessages = [
  { role: 'system', content: 'You are a helpful assistant' },
  { role: 'user', content: 'Hello' },
  { role: 'assistant', content: 'Hi, how can I help you?' },
  { role: 'user', content: 'Go on' }
]
8. Summary
The approach in this article covers:
- Streaming response handling on top of Axios
- A composable built with the Vue 3 Composition API
- A complete chat UI
- Environment variable setup and key-safety advice
- Error handling and local persistence
The core idea: keep the AI service layer self-contained so pages only render UI. Copy the code, point the environment variables at your provider, and it runs.