Spamming function calls on streamed OpenAI responses

Hi, I was looking at the convex-ai-chat repo and found this: https://github.com/get-convex/convex-ai-chat/blob/main/convex/serve.ts#L70
const stream = await openai.chat.completions.create({
  model: OPENAI_MODEL,
  stream: true,
  messages: [
    {
      role: "system",
      content:
        "Answer the user question based on the provided documents " +
        "or report that the question cannot be answered based on " +
        "these documents. Keep the answer informative but brief, " +
        "do not enumerate all possibilities.",
    },
    ...(relevantDocuments.map(({ text }) => ({
      role: "system",
      content: "Relevant document:\n\n" + text,
    })) as ChatCompletionMessageParam[]),
    ...(messages.map(({ isViewer, text }) => ({
      role: isViewer ? "user" : "assistant",
      content: text,
    })) as ChatCompletionMessageParam[]),
  ],
});
let text = "";
for await (const { choices } of stream) {
  const replyDelta = choices[0].delta.content;
  if (typeof replyDelta === "string" && replyDelta.length > 0) {
    text += replyDelta;
    await ctx.runMutation(internal.serve.updateBotMessage, {
      messageId,
      text,
    });
  }
}
Isn't a mutation triggered every time a new token gets streamed?
lee · 4mo ago
Yep, that does happen. So we have a slightly modified pattern, https://stack.convex.dev/ai-chat-with-http-streaming, that reduces function calls and bandwidth.
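For anyone who wants to stay on the mutation-per-update pattern in the meantime, a common mitigation is to throttle the writes: buffer the streamed deltas and run the mutation at most once per interval, plus a final flush. Here's a minimal sketch of that idea; the names `streamWithThrottle`, `persist`, and `fakeStream` are made up for illustration, and `persist` stands in for `ctx.runMutation(internal.serve.updateBotMessage, ...)` — this is not the repo's actual code.

```typescript
// Buffer streamed deltas and persist at most once per `intervalMs`,
// instead of once per token. `persist` is a stand-in for the Convex
// mutation that updates the bot message.
async function streamWithThrottle(
  deltas: AsyncIterable<string>,
  persist: (text: string) => Promise<void>,
  intervalMs: number
): Promise<string> {
  let text = "";
  let lastFlush = Date.now();
  for await (const delta of deltas) {
    text += delta;
    if (Date.now() - lastFlush >= intervalMs) {
      await persist(text); // intermediate write: readers see a partial reply
      lastFlush = Date.now();
    }
  }
  await persist(text); // final write so the complete reply is stored
  return text;
}

// Fake token stream standing in for the OpenAI chat-completions stream.
async function* fakeStream(tokens: string[], delayMs: number) {
  for (const t of tokens) {
    await new Promise((resolve) => setTimeout(resolve, delayMs));
    yield t;
  }
}
```

With, say, a 250 ms interval, a reply that streams a few hundred tokens turns into a handful of mutation calls instead of hundreds, at the cost of slightly chunkier updates for viewers. The HTTP-streaming pattern linked above goes further by streaming to the requesting client directly and persisting periodically for everyone else.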
