Dan Kuta
Convex Community · 4mo ago

Help: Large CSV Export Hitting Memory Limits

Goal: Enable an app user to export 25,000+ records to CSV for download

Constraints:
- Convex action memory limit: 64 MB
- Convex query read limit: 16 MB

What I've Tried:

1. Workflows ❌
- Hit the 1 MB journal size limit even with batching
- Too much state accumulates in the journal across steps

2. Scheduled Actions with Chunking ⚠️
- Successfully processes data in 500-record chunks
- Stores each chunk as a separate file in R2
- BUT: combining the chunks into the final CSV (re-downloading them from R2 and concatenating into one file) hits the 64 MB memory limit
- This happens even with incremental combining (download chunk, append, re-upload)

Current Approach:

```
// Process in chunks (works fine)
for each batch of 500 records:
  - Fetch data
  - Convert to CSV
  - Store as chunk-N.csv in R2

// Combine chunks (fails with OOM)
for each chunk:
  - Download chunk from R2
  - Append to temp file
  - Re-upload temp file
// Memory grows with each iteration
```
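
For concreteness, here's roughly what the chunking step looks like as a Convex internal action. This is a simplified sketch: `pageOfRecords`, `toCsvRow`, and the env var names are stand-ins for my real paginated query, serializer, and config.

```ts
// convex/export.ts — "use node" so the action can use the AWS SDK
"use node";
import { v } from "convex/values";
import { internalAction } from "./_generated/server";
import { internal } from "./_generated/api";
import {
  S3Client,
  PutObjectCommand,
  GetObjectCommand,
} from "@aws-sdk/client-s3";

const r2 = new S3Client({
  region: "auto",
  endpoint: process.env.R2_ENDPOINT, // https://<account-id>.r2.cloudflarestorage.com
  credentials: {
    accessKeyId: process.env.R2_ACCESS_KEY_ID!,
    secretAccessKey: process.env.R2_SECRET_ACCESS_KEY!,
  },
});

// Stand-in serializer; real code needs proper CSV quoting/escaping.
const toCsvRow = (r: Record<string, unknown>) => Object.values(r).join(",");

export const exportChunk = internalAction({
  args: { cursor: v.union(v.string(), v.null()), chunkIndex: v.number() },
  handler: async (ctx, { cursor, chunkIndex }) => {
    // pageOfRecords is a stand-in paginated query returning
    // { page, isDone, continueCursor }.
    const { page, isDone, continueCursor } = await ctx.runQuery(
      internal.records.pageOfRecords,
      { cursor, numItems: 500 }
    );
    await r2.send(
      new PutObjectCommand({
        Bucket: process.env.R2_BUCKET!,
        Key: `exports/chunk-${chunkIndex}.csv`,
        Body: page.map(toCsvRow).join("\n"),
        ContentType: "text/csv",
      })
    );
    if (!isDone) {
      // Each action schedules the next, so no single action holds much data.
      await ctx.scheduler.runAfter(0, internal.export.exportChunk, {
        cursor: continueCursor,
        chunkIndex: chunkIndex + 1,
      });
    }
  },
});
```

This part works fine and stays well under the memory limit.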

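And here's the shape of the incremental combine that fails (same file, reusing the `r2` client above; I've simplified the temp-file variant to a string accumulator). Each action only touches one new chunk, but the running `final.csv` still has to pass through memory in full:

```ts
// Incremental combine: one chunk per action, but the running file
// still approaches the full CSV size, so later iterations OOM.
export const appendChunk = internalAction({
  args: { chunkIndex: v.number(), chunkCount: v.number() },
  handler: async (ctx, { chunkIndex, chunkCount }) => {
    const get = async (key: string) => {
      const obj = await r2.send(
        new GetObjectCommand({ Bucket: process.env.R2_BUCKET!, Key: key })
      );
      return obj.Body!.transformToString();
    };
    // Download the combined file so far plus the next chunk...
    const soFar = chunkIndex === 0 ? "" : await get("exports/final.csv");
    const chunk = await get(`exports/chunk-${chunkIndex}.csv`);
    // ...and re-upload. `soFar` grows toward the full export size,
    // which is what blows past the 64 MB action limit.
    await r2.send(
      new PutObjectCommand({
        Bucket: process.env.R2_BUCKET!,
        Key: "exports/final.csv",
        Body: soFar + chunk + "\n",
        ContentType: "text/csv",
      })
    );
    if (chunkIndex + 1 < chunkCount) {
      await ctx.scheduler.runAfter(0, internal.export.appendChunk, {
        chunkIndex: chunkIndex + 1,
        chunkCount,
      });
    }
  },
});
```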

Are there other patterns in Convex for handling workloads like large exports?