Help: Large CSV Export Hitting Memory Limits
Goal: Enable app users to export 25,000+ records to CSV for download
Constraints:
- Convex action memory limit: 64 MB
- Convex query read limit: 16 MB
What I've Tried:
1. Workflows ❌
- Hit the 1 MB journal size limit even with batching
- Too much state accumulation
2. Scheduled Actions with Chunking ⚠️
- Successfully processes data in 500-record chunks
- Stores each chunk as separate file in R2
- BUT: Combining the chunks into one final CSV (re-downloading each chunk from R2 and concatenating them into a single file) hits the 64 MB memory limit
- Even incremental combining (download a chunk, append it, re-upload) fails, since the growing combined file must still be held in memory; a sketch of the chunking step follows below
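For reference, here's a minimal sketch of the chunked export step, assuming a hypothetical `records` table with a paginated `internal.records.page` query, R2 credentials in environment variables, and the action living in `convex/export.ts` — none of these names are from the original setup:

```ts
// Hypothetical sketch of the chunking pattern above; all names are illustrative.
import { internalAction } from "./_generated/server";
import { internal } from "./_generated/api";
import { v } from "convex/values";
import { S3Client, PutObjectCommand } from "@aws-sdk/client-s3";

// R2 is S3-compatible, so the standard AWS SDK client works against it.
const r2 = new S3Client({
  region: "auto",
  endpoint: process.env.R2_ENDPOINT, // https://<account-id>.r2.cloudflarestorage.com
  credentials: {
    accessKeyId: process.env.R2_ACCESS_KEY_ID!,
    secretAccessKey: process.env.R2_SECRET_ACCESS_KEY!,
  },
});

export const exportChunk = internalAction({
  args: {
    exportId: v.string(),
    cursor: v.union(v.string(), v.null()),
    chunkIndex: v.number(),
  },
  handler: async (ctx, { exportId, cursor, chunkIndex }) => {
    // Read one 500-record page, keeping each query well under the read limit.
    const { page, isDone, continueCursor } = await ctx.runQuery(
      internal.records.page, // hypothetical paginated query
      { paginationOpts: { cursor, numItems: 500 } }
    );
    // Naive CSV serialization; real code needs proper quoting/escaping.
    const csv = page.map((r: any) => `${r._id},${r.name}\n`).join("");
    await r2.send(
      new PutObjectCommand({
        Bucket: "exports",
        Key: `${exportId}/chunk-${chunkIndex}.csv`,
        Body: csv,
      })
    );
    // Schedule the next chunk rather than looping, so no state accumulates.
    if (!isDone) {
      await ctx.scheduler.runAfter(0, internal.export.exportChunk, {
        exportId,
        cursor: continueCursor,
        chunkIndex: chunkIndex + 1,
      });
    }
  },
});
```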
Question:
Are there other patterns in Convex for handling large-export workloads like this?
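One pattern that sidesteps the combine step entirely: never build the merged CSV inside an action at all. Serve the download from an HTTP action that streams each chunk from R2 in sequence, so only one chunk is ever in memory. This is a sketch under the chunk layout assumed above, and it relies on Convex HTTP actions supporting streaming `Response` bodies — worth confirming against the current docs:

```ts
// Sketch: stream the export to the browser chunk-by-chunk, so the combined
// CSV never has to fit in action memory. All names are illustrative.
import { httpRouter } from "convex/server";
import { httpAction } from "./_generated/server";
import { S3Client, GetObjectCommand } from "@aws-sdk/client-s3";

const r2 = new S3Client({
  region: "auto",
  endpoint: process.env.R2_ENDPOINT,
  credentials: {
    accessKeyId: process.env.R2_ACCESS_KEY_ID!,
    secretAccessKey: process.env.R2_SECRET_ACCESS_KEY!,
  },
});

const http = httpRouter();

http.route({
  path: "/export",
  method: "GET",
  handler: httpAction(async (_ctx, request) => {
    const url = new URL(request.url);
    const exportId = url.searchParams.get("exportId")!;
    const chunkCount = Number(url.searchParams.get("chunks"));

    const stream = new ReadableStream<Uint8Array>({
      async start(controller) {
        // Pull one chunk at a time; only a single chunk is in memory at once.
        for (let i = 0; i < chunkCount; i++) {
          const obj = await r2.send(
            new GetObjectCommand({
              Bucket: "exports",
              Key: `${exportId}/chunk-${i}.csv`,
            })
          );
          controller.enqueue(await obj.Body!.transformToByteArray());
        }
        controller.close();
      },
    });

    return new Response(stream, {
      headers: {
        "Content-Type": "text/csv",
        "Content-Disposition": 'attachment; filename="export.csv"',
      },
    });
  }),
});

export default http;
```

Another option worth checking: if each chunk can be made at least 5 MiB (the S3 minimum part size for all but the last part), a multipart upload with `UploadPartCopyCommand` can stitch the existing R2 objects into one final file server-side, so the bytes never pass through the action at all — verify against R2's S3 compatibility matrix first.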
1 Reply
Thanks for posting in <#1088161997662724167>.
Reminder: If you have a Convex Pro account, use the Convex Dashboard to file support tickets.
- Provide context: What are you trying to achieve, what is the end-user interaction, what are you seeing? (full error message, command output, etc.)
- Use search.convex.dev to search Docs, Stack, and Discord all at once.
- Additionally, you can post your questions in the Convex Community's <#1228095053885476985> channel to receive a response from AI.
- Avoid tagging staff unless specifically instructed.
Thank you!