Using a 5MB Excel file from Convex file storage: "Your request couldn't be completed. Try again later"
I keep getting this error randomly when manipulating a 5MB Excel file within an internal action. Is there a way to understand the underlying cause behind this error message so I could fix it? Also, should Convex be able to handle a file of this size?
Thanks for posting in <#1088161997662724167>.
Reminder: If you have a Convex Pro account, use the Convex Dashboard to file support tickets.
- Provide context: What are you trying to achieve, what is the end-user interaction, what are you seeing? (full error message, command output, etc.)
- Use search.convex.dev to search Docs, Stack, and Discord all at once.
- Additionally, you can post your questions in the Convex Community's <#1228095053885476985> channel to receive a response from AI.
- Avoid tagging staff unless specifically instructed.
Thank you!
@sbkl can you try using the Node runtime rather than the Convex runtime for this action?
Just move the action to a new file in the convex directory and add the
'use node' directive at the top
(if you have other functions in this file)
The Convex runtime has a 64MB memory limit, while the Node runtime has a 512MB memory limit
Reference to convex docs here:
https://docs.convex.dev/functions/actions#limits
Actions | Convex Developer Hub
Actions can call third party services to do things such as processing a payment
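A minimal sketch of what that file layout looks like, per the Convex docs (the action name here is illustrative, not from the thread):

```typescript
"use node"; // must be the very first statement in the file: it switches every
// function in this file to the Node.js runtime (512MB memory limit) instead of
// the default Convex runtime (64MB).

// In a real Convex project this file lives in the convex/ directory and would
// continue with something like:
//
//   import { internalAction } from "./_generated/server";
//
//   export const parseWorkbook = internalAction({
//     handler: async (ctx) => {
//       // heavy work (e.g. parsing a large Excel file) runs here
//     },
//   });
```

Because the directive applies to the whole file, any functions that should stay on the default Convex runtime need to live in a different file.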
Yes, this is what I am doing. I am also using the workpool component and various optimisations to stream the Excel workbook, and I'm hitting a wall. The most rows I was able to upsert, enqueuing an action for each batch with various batch sizes (500 or 1,000 records), is 200k rows. Then at some point I get this error. Any direction on this would be appreciated!
Use case: 8,000 articles across 80 stores for their quantity allocations, so roughly 700k rows to upsert. I'm staying within the transaction limits, making sure I don't exceed the 600-second action limit or the byte limit for function arguments.
Also, I am storing the file with Convex file storage, passing the storage ID to the actions, and then retrieving the workbook with the XLSX package.
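The thread doesn't show the actual batching code, so here is a standalone sketch of just the slicing step (names are illustrative). In the real Node action, each batch would then be passed to a mutation via `ctx.runMutation` or enqueued on the workpool:

```typescript
// Split parsed workbook rows into fixed-size batches so that no single
// mutation call exceeds Convex's function-argument size limits. Each batch
// would then be handed to ctx.runMutation (or enqueued on the workpool).
export function batchRows<T>(rows: T[], batchSize: number): T[][] {
  if (batchSize <= 0) throw new Error("batchSize must be positive");
  const batches: T[][] = [];
  for (let i = 0; i < rows.length; i += batchSize) {
    batches.push(rows.slice(i, i + batchSize));
  }
  return batches;
}
```

With 700k rows and a batch size of 500, this yields 1,400 mutation calls, which is why spreading them across enqueued actions matters for the time limit.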
Can't you use migrations or workflows to avoid the time limit?
I am interested in seeing solutions for this kind of problem as I need to upload large files also.
@Eternal Mori this is what I am doing. I am using "use node" for memory reasons, streaming the contents of the Excel file so as not to hit the memory limit too hard, being very careful not to exceed the argument limits for actions, queries, and mutations, using workflows and batching to ensure no action hits the time limit, and indexing and paginating to avoid reading too many bytes from the database, etc. The error message "Your request couldn't be completed. Try again later." is not very helpful for understanding what is wrong, and I would like to know if there is a way to investigate the root cause of the error further.
It sounds like Convex is not as easy to use as the YouTube videos suggest, if you have real-world complex data tables.
Just tried the streaming import API and uploaded 100k rows as a test. While it took 2 minutes or so with regular queries (for database consistency checks) and mutations, it takes 8 minutes with this import API, which is supposed to be faster and more efficient at uploading big datasets than the regular function API.
Convex HTTP API | Convex Developer Hub
Connecting to Convex directly with HTTP
I might create another post on this, as even though it's related, it is a separate issue...
I don't know that streaming import is supposed to be faster or more efficient; the point of it is to support large dataset imports, which are difficult to do reliably with regular functions, as you've seen. Apart from the time it takes, is everything working with the import?
Yes, everything works. From other posts, I understood it was supposed to be faster. The difference is quite significant.
Ah okay. I may be wrong on the performance expectation, just hadn't heard that before.