If I'm making an app like ChatGPT / t3.chat, does Convex support long-running LLM queries?

Like, if my LLM response takes 3-5 minutes to complete, will something time out before I get a complete response back? How well suited is Convex to this use case? Thanks!
Jonathan Leung (OP) · 2w ago
I want to be able to stream responses back over a period of up to about 5 minutes.
Matt Luo · 2w ago
I haven't tried it myself, but I think the agent component handles this: https://stack.convex.dev/ai-agents
AI Agents with Built-in Memory
With this new backend component, augment Agents to automatically save and search message history per-thread, providing realtime results across multipl...
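For context, the usual way to survive a multi-minute LLM response is to run the generation in a long-lived background job and persist chunks to the database as they arrive, so subscribed clients see partial results in realtime and no single HTTP request has to stay open for the full 3-5 minutes. Below is a minimal, self-contained sketch of that pattern; it is not the agent component's actual API. The `persist` callback and `slowLlm` generator are hypothetical stand-ins (in a real Convex app, persisting would be a mutation appending to a messages table).

```typescript
// Simulates a slow LLM that yields tokens over a long period.
// In production each await could be seconds apart; the pattern is the same.
async function* slowLlm(tokens: string[]): AsyncGenerator<string> {
  for (const t of tokens) {
    await new Promise((resolve) => setTimeout(resolve, 1));
    yield t;
  }
}

// Long-running worker: consume the stream and persist each delta
// incrementally, so readers of the store see partial output as it arrives.
async function streamToStore(
  stream: AsyncGenerator<string>,
  persist: (delta: string) => void,
): Promise<string> {
  let full = "";
  for await (const delta of stream) {
    full += delta;
    persist(delta); // hypothetical stand-in for a database write per chunk
  }
  return full; // complete response, assembled server-side
}

async function main() {
  // In-memory stand-in for a database row holding the message body.
  const store = { body: "" };
  const result = await streamToStore(
    slowLlm(["Hello", ", ", "world", "!"]),
    (delta) => {
      store.body += delta;
    },
  );
  console.log(store.body); // "Hello, world!"
  console.log(result === store.body); // true
}

main();
```

Because each chunk is committed as it arrives, a timeout or crash mid-generation loses only the tail of the response, and the client never depends on one connection staying open for the whole run.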
Jonathan Leung (OP) · 2w ago
Thanks Matt!
