Dan Kuta
5w ago

Help: Large CSV Export Hitting Memory Limits

Goal: Enable app users to export 25,000+ records to CSV for download.

Constraints:
- Convex action memory limit: 64 MB
- Convex query read limit: 16 MB

What I've Tried:

1. Workflows ❌
   - Hit the 1 MB journal size limit even with batching
   - Too much state accumulation

2. Scheduled Actions with Chunking ⚠️
   - Successfully processes data in 500-record chunks
   - Stores each chunk as a separate file in R2
   - BUT: combining the chunks into the final CSV (re-downloading them from R2 and merging them into one file) hits the 64 MB memory limit
   - Even with incremental combining (download chunk, append, re-upload)

Current Approach:
// Process in chunks (works fine)
for each 500 records:
- Fetch data
- Convert to CSV
- Store as chunk-N.csv in R2

// Combine chunks (fails with OOM)
for each chunk:
- Download chunk from R2
- Append to temp file
- Re-upload temp file
// Memory grows with each iteration
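
For concreteness, here is roughly what the current approach looks like as a Convex internal action. This is a TypeScript sketch: `internal.exports.fetchPage`, `toCsvRows`, and the R2 upload/download helpers are simplified placeholders for my real code, not built-in Convex or R2 APIs.

// Sketch of the current export pipeline as a Convex internal action.
// Placeholder helpers below stand in for my real query and R2 calls.
import { internalAction } from "./_generated/server";
import { internal } from "./_generated/api";
import { v } from "convex/values";

// Placeholders: my real code talks to R2 via its S3-compatible API.
async function uploadChunkToR2(key: string, body: string): Promise<void> {
  /* PUT `body` to R2 at `key` */
}
async function downloadChunkFromR2(key: string): Promise<string> {
  /* GET the object at `key` from R2 */
  return "";
}
function toCsvRows(rows: Record<string, unknown>[]): string {
  // Naive conversion; my real code handles headers and escaping.
  return rows.map((r) => Object.values(r).join(",")).join("\n") + "\n";
}

export const exportCsv = internalAction({
  args: { exportId: v.string() },
  handler: async (ctx, { exportId }) => {
    // Step 1: process in 500-record chunks (this part works fine).
    let cursor: string | null = null;
    let chunkCount = 0;
    do {
      const page = await ctx.runQuery(internal.exports.fetchPage, {
        cursor,
        pageSize: 500,
      });
      const csv = toCsvRows(page.rows);
      await uploadChunkToR2(`exports/${exportId}/chunk-${chunkCount}.csv`, csv);
      cursor = page.continueCursor;
      chunkCount++;
    } while (cursor !== null);

    // Step 2: combine chunks (this is where it OOMs).
    let combined = "";
    for (let i = 0; i < chunkCount; i++) {
      combined += await downloadChunkFromR2(`exports/${exportId}/chunk-${i}.csv`);
      // Re-upload the partial file each iteration; `combined` still grows
      // until the entire 25k+ row CSV is held in memory, which is what
      // pushes the action past the 64 MB limit.
      await uploadChunkToR2(`exports/${exportId}/final.csv`, combined);
    }
  },
});

The chunking loop stays well under the limits; it's only the combine loop, where `combined` eventually holds the entire CSV, that blows past 64 MB.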
Are there any other patterns in Convex for handling workloads like large exports?
1 Reply
Convex Bot
5w ago
Thanks for posting in <#1088161997662724167>. Reminder: If you have a Convex Pro account, use the Convex Dashboard to file support tickets.
- Provide context: What are you trying to achieve, what is the end-user interaction, what are you seeing? (full error message, command output, etc.)
- Use search.convex.dev to search Docs, Stack, and Discord all at once.
- Additionally, you can post your questions in the Convex Community's <#1228095053885476985> channel to receive a response from AI.
- Avoid tagging staff unless specifically instructed.
Thank you!
