Spamming function calls on stream OpenAI responses
Hi, I was looking at the convex-ai-chat repo and found this
https://github.com/get-convex/convex-ai-chat/blob/main/convex/serve.ts#L70
Isn't a function triggered every time a new token gets streamed?
Yep, that does happen. That's why we have a slightly modified pattern, https://stack.convex.dev/ai-chat-with-http-streaming, which reduces both function calls and bandwidth.
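The core idea of that pattern can be sketched in isolation: stream every token to the client over HTTP as it arrives, but only persist to the database in occasional batches, so one response costs a handful of mutations instead of one per token. Here's a minimal, self-contained sketch of that batching logic; `fakeTokenStream`, `saveToDb`, `send`, and `BATCH` are all stand-ins I made up for illustration (the real implementation lives in the linked article and uses Convex HTTP actions and the OpenAI streaming API).

```typescript
// Stand-in for the OpenAI token stream.
async function* fakeTokenStream(): AsyncGenerator<string> {
  const tokens = "Convex streams replies token by token".split(" ");
  for (const t of tokens) yield t + " ";
}

// Stand-in for a Convex mutation; counts writes so the savings are visible.
let dbWrites = 0;
async function saveToDb(_text: string): Promise<void> {
  dbWrites += 1;
}

const BATCH = 10; // persist at most once per BATCH tokens

async function streamReply(
  send: (chunk: string) => void, // e.g. writes to the HTTP response stream
): Promise<string> {
  let text = "";
  let sinceFlush = 0;
  for await (const token of fakeTokenStream()) {
    send(token); // the client still sees every token immediately
    text += token;
    sinceFlush += 1;
    if (sinceFlush >= BATCH) {
      await saveToDb(text); // occasional checkpoint, not one write per token
      sinceFlush = 0;
    }
  }
  await saveToDb(text); // final write with the complete reply
  return text;
}
```

The trade-off: clients connected over the HTTP stream get character-level latency, while clients reading from the database (e.g. other tabs or late joiners) see the reply advance in coarser batches.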