Handling Updates to Large Numbers of Rows
If I need to update thousands of rows, what is the best approach so I don't hit mutation limits?
Thanks for posting in <#1088161997662724167>.
Reminder: If you have a Convex Pro account, use the Convex Dashboard to file support tickets.
- Provide context: What are you trying to achieve, what is the end-user interaction, what are you seeing? (full error message, command output, etc.)
- Use search.convex.dev to search Docs, Stack, and Discord all at once.
- Additionally, you can post your questions in the Convex Community's <#1228095053885476985> channel to receive a response from AI.
- Avoid tagging staff unless specifically instructed.
Thank you!
This sounds like a migration - if so, the easiest way is to use the Migrations component: https://www.convex.dev/components/migrations
But the basic answer is to use pagination to limit the number of reads/writes per function call.
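A minimal sketch of that pagination pattern, using an in-memory array as a stand-in for the database (in real Convex code this would be `ctx.db.query(...).paginate()` inside a mutation, with the driver loop handled by an action or the Migrations component; the `table`, `updateBatch`, and cursor shape here are illustrative assumptions, not Convex's API):

```typescript
// Stand-in document type and "table"; in Convex this would be ctx.db.
type Doc = { id: number; status: string };

const table: Doc[] = Array.from({ length: 2500 }, (_, i) => ({
  id: i,
  status: "old",
}));

// One "mutation" call: patch at most `batchSize` rows starting at `cursor`,
// and return the cursor for the next call (null when the table is done).
// Keeping each call small is what keeps you under per-mutation limits.
function updateBatch(cursor: number, batchSize: number): number | null {
  const page = table.slice(cursor, cursor + batchSize);
  for (const doc of page) {
    doc.status = "new"; // the per-row update
  }
  const next = cursor + page.length;
  return next >= table.length ? null : next;
}

// Driver loop: in Convex, an action (or the Migrations component) would
// schedule one mutation per page instead of looping synchronously.
let cursor: number | null = 0;
let calls = 0;
while (cursor !== null) {
  cursor = updateBatch(cursor, 500);
  calls++;
}
console.log(calls); // 5 calls of 500 rows each
console.log(table.every((d) => d.status === "new")); // true
```

The key design point is that each call reads and writes only one page, so no single transaction ever touches all 2500 rows.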
Yeah, it's basically a migration, and I'm keeping it a live migration as data changes in the legacy system.
I tried to use Airbyte, but it isn't really going to work because of some change-detection issues on my SQL Server, and I'm thinking I should clean up some schema stuff as I go anyway.
Ah, you're pulling from an external system.
Yes, but I realized I was asking this for another reason too. I'm trying to figure out how deep a dive to do this weekend; family is out of town, lol.
Well, in this case I was actually thinking of tagging rows in a table for search-type things, where I'd be filtering over more rows than the normal limits allow. My search query could just filter based on a tag, and if the search requirements change, I'd update all the rows carrying those tags.
But that may be expensive, too.
The approach makes sense
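A rough sketch of the tag idea, again with an in-memory array standing in for the table (in Convex the tag lookup would be an index query via `.withIndex()`; the `Row` shape, `searchByTag`, and `retag` names are assumptions for illustration):

```typescript
// Rows carry a tag column that pre-classifies them for search.
type Row = { id: number; tag: string; text: string };

const rows: Row[] = [
  { id: 1, tag: "searchable", text: "alpha" },
  { id: 2, tag: "excluded", text: "beta" },
  { id: 3, tag: "searchable", text: "gamma" },
];

// Search reads only rows carrying the tag, rather than evaluating the
// full search criteria against every row at query time.
function searchByTag(tag: string): Row[] {
  return rows.filter((r) => r.tag === tag);
}

// When search requirements change, re-tag in bulk. This is the expensive
// part the thread worries about: it touches every affected row, so in
// Convex it would run as a paginated migration, not one big mutation.
function retag(from: string, to: string): number {
  let changed = 0;
  for (const r of rows) {
    if (r.tag === from) {
      r.tag = to;
      changed++;
    }
  }
  return changed;
}

console.log(searchByTag("searchable").length); // 2
retag("searchable", "v2");
console.log(searchByTag("v2").length); // 2
```

The trade-off: reads become cheap (filter on one indexed field), while changing the search definition costs a full pass over the tagged rows, which is where the pagination pattern above comes back in.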