Andy · 3w ago

Large record update strategy

I have about 10k rows of data that change daily for my company's core product. I'm looking to import them as part of a nightly cron job for an internal tool I'm building in Convex to demonstrate its potential. I expect 70% of the data to remain the same, 10% to be removed, and 20% to be added. I have an internal action that fetches the data nightly, but I'm unclear on the next steps, given that there are no multi-row functions in the Convex API. Do I really have to delete every row, or do a lookup based on my internal ID and then run a mutation for every record? That seems a bit inefficient.
2 Replies
Convex Bot · 3w ago
Thanks for posting in <#1088161997662724167>. Reminder: if you have a Convex Pro account, use the Convex Dashboard to file support tickets.
- Provide context: what are you trying to achieve, what is the end-user interaction, what are you seeing? (full error message, command output, etc.)
- Use search.convex.dev to search Docs, Stack, and Discord all at once.
- You can also post your questions in the Convex Community's <#1228095053885476985> channel to receive a response from AI.
- Avoid tagging staff unless specifically instructed.
Thank you!
lee · 3w ago
Convex does have multi-row operations: a single mutation can read and modify several hundred rows, so you can split your 10k rows into batches and run each batch in its own mutation. Detecting deletions can be tricky, but as long as both the existing rows and the incoming data are sorted by your internal ID, a merge-style comparison should work. There are other patterns if you want the import to be atomic; let us know if you're interested.
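
To make the batching concrete, here is a rough sketch (not official Convex guidance) of what the nightly import could look like. It assumes a hypothetical `products` table with an `externalId` field holding your internal ID and a `syncedAt` timestamp, plus `by_externalId` and `by_syncedAt` indexes defined in `convex/schema.ts`. Instead of the sorted merge-comparison described above, this variant detects deletions by stamping every upserted row with the run's timestamp and then sweeping away rows with a stale stamp:

```ts
// convex/sync.ts — a sketch under the assumptions above.
import { v } from "convex/values";
import { internalMutation } from "./_generated/server";

// Upsert one batch of a few hundred rows in a single transaction.
export const upsertBatch = internalMutation({
  args: {
    syncedAt: v.number(),
    rows: v.array(v.object({ externalId: v.string(), name: v.string() })),
  },
  handler: async (ctx, { syncedAt, rows }) => {
    for (const row of rows) {
      const existing = await ctx.db
        .query("products")
        .withIndex("by_externalId", (q) => q.eq("externalId", row.externalId))
        .unique();
      if (existing === null) {
        await ctx.db.insert("products", { ...row, syncedAt });
      } else {
        // Unchanged rows simply get a fresh syncedAt stamp.
        await ctx.db.replace(existing._id, { ...row, syncedAt });
      }
    }
  },
});

// Rows the latest run never touched were removed from the source data;
// delete them in batches until none remain.
export const deleteStale = internalMutation({
  args: { syncedAt: v.number() },
  handler: async (ctx, { syncedAt }) => {
    const stale = await ctx.db
      .query("products")
      .withIndex("by_syncedAt", (q) => q.lt("syncedAt", syncedAt))
      .take(200);
    await Promise.all(stale.map((doc) => ctx.db.delete(doc._id)));
    return stale.length; // caller keeps sweeping while this is > 0
  },
});
```

The existing nightly action can then drive the batches, something like:

```ts
// Inside the internal action, after fetching the ~10k rows.
// ("internal" comes from ./_generated/api.)
const syncedAt = Date.now();
for (let i = 0; i < rows.length; i += 200) {
  await ctx.runMutation(internal.sync.upsertBatch, {
    syncedAt,
    rows: rows.slice(i, i + 200),
  });
}
while ((await ctx.runMutation(internal.sync.deleteStale, { syncedAt })) > 0) {
  // keep going until no stale rows remain
}
```

Note that this import isn't atomic across batches: between mutations, readers can briefly see a mix of old and new rows, which is presumably what the "other patterns" for an atomic import would address.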
