can multiple workers safely update different rows in a table concurrently
Hi team – quick question: can multiple workers safely update different rows in a table concurrently? I’m currently seeing table locking issues when multiple workers try to update different rows of the same table.
Would it be better to insert new rows instead of updating an existing one? Is that the more appropriate pattern for handling streaming workloads, rather than trying to maintain a single record per stream?
Thanks for posting in <#1088161997662724167>.
Reminder: If you have a Convex Pro account, use the Convex Dashboard to file support tickets.
- Provide context: What are you trying to achieve, what is the end-user interaction, what are you seeing? (full error message, command output, etc.)
- Use search.convex.dev to search Docs, Stack, and Discord all at once.
- Additionally, you can post your questions in the Convex Community's <#1228095053885476985> channel to receive a response from AI.
- Avoid tagging staff unless specifically instructed.
Thank you!
hi. no problem updating lots of things concurrently. when you say "table locking", what symptom are you seeing that leads you to believe that?
convex doesn't actually lock anything. its concurrency control is optimistic, so there are no pessimistic locks
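to illustrate the idea, here's a minimal in-memory sketch of optimistic concurrency control: each record carries a version, a write only commits if the version it read is still current, and a conflicting write is retried. all names here are illustrative, not Convex internals — Convex handles the retry loop for you.

```typescript
type Row = { value: string; version: number };

class OccStore {
  private rows = new Map<string, Row>();

  read(id: string): Row {
    return this.rows.get(id) ?? { value: "", version: 0 };
  }

  // Commit succeeds only if no other writer bumped the version since `read`.
  tryWrite(id: string, expectedVersion: number, value: string): boolean {
    const current = this.read(id);
    if (current.version !== expectedVersion) return false; // conflict, no lock held
    this.rows.set(id, { value, version: expectedVersion + 1 });
    return true;
  }

  // Retry loop: re-read and re-apply on conflict.
  update(id: string, fn: (v: string) => string): void {
    for (;;) {
      const { value, version } = this.read(id);
      if (this.tryWrite(id, version, fn(value))) return;
    }
  }
}
```

note that two workers touching *different* ids never conflict at all — the retries only show up when writes race on the same row.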
I’m streaming an AI response and updating the same row in the database every ~50ms. I’m wondering if that’s the right approach for a streaming workload.
Would it be more appropriate to insert new rows for each stream update instead of continuously updating the same record? I’m concerned that this would cause the table to grow quickly, but I could periodically move older rows to an archive table via a batch job if that’s the better pattern.
I understand Convex uses optimistic concurrency control, not locks. Still, I’m seeing symptoms like retries and delays – see screenshot. Any tips on how to better handle concurrent updates in this scenario?
Thanks!

> I’m streaming an AI response and updating the same row in the database every ~50ms. I’m wondering if that’s the right approach for a streaming workload.
gotcha. are you running these updates in parallel?
btw, here's a component that does this streaming workload in convex. it uses chunks: https://www.convex.dev/components/persistent-text-streaming
> Persistent Text Streaming – Stream text like AI chat to the browser in real-time while also efficiently storing it to the database.
Perfect!
So the optimal option is to stream and batch-update the database
yep
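as a rough sketch of that batching pattern: buffer the incoming stream chunks in memory and flush them to the database at a coarser interval (say every few hundred ms) instead of writing every ~50ms token. the `flush` callback and class name here are assumptions for illustration, not Convex's API — the persistent-text-streaming component does this for you with chunks.

```typescript
// Buffers stream chunks; each flushNow() is one DB write for many stream events.
class StreamBatcher {
  private buffer = "";

  // `flush` would be your mutation that appends the chunk to the row.
  constructor(private flush: (chunk: string) => void) {}

  push(chunk: string): void {
    this.buffer += chunk; // cheap in-memory append, no DB traffic
  }

  flushNow(): void {
    if (this.buffer.length === 0) return; // nothing new, skip the write
    const chunk = this.buffer;
    this.buffer = "";
    this.flush(chunk);
  }
}
```

in practice you'd call `flushNow()` from a timer (and once more when the stream ends), so write frequency is decoupled from token frequency.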
Thank you for helping me with this. Hope to have something for #show-and-tell in a couple weeks!