sbkl
sbkl2w ago

Using a 5MB Excel file from Convex file storage: "Your request couldn't be completed. Try again later"

I keep getting this error randomly when manipulating a 5MB Excel file within an internal action. Is there a way to understand the underlying cause behind this error message so I can fix it? Also, should Convex be able to handle a file of this size?
12 Replies
Convex Bot
Convex Bot2w ago
Thanks for posting in <#1088161997662724167>. Reminder: If you have a Convex Pro account, use the Convex Dashboard to file support tickets.
- Provide context: What are you trying to achieve, what is the end-user interaction, what are you seeing? (full error message, command output, etc.)
- Use search.convex.dev to search Docs, Stack, and Discord all at once.
- Additionally, you can post your questions in the Convex Community's <#1228095053885476985> channel to receive a response from AI.
- Avoid tagging staff unless specifically instructed. Thank you!
Hmza
Hmza2w ago
@sbkl can you try using the Node runtime rather than the Convex runtime for this action? Just move the action to a new file in the convex directory and put the "use node" directive at the top (if you have other functions in the current file). The Convex runtime has a 64MB memory limit, while the Node runtime has a 512MB limit (see the sketch below the docs link). Reference in the Convex docs: https://docs.convex.dev/functions/actions#limits
Actions | Convex Developer Hub
Actions can call third party services to do things such as processing a payment
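A minimal sketch of that setup, assuming the heavy Excel work lives in its own file (the file and function names here are illustrative):

// convex/processExcel.ts: "use node" puts every function in this file on the Node runtime.
"use node";

import { internalAction } from "./_generated/server";
import { v } from "convex/values";

export const processWorkbook = internalAction({
  args: { storageId: v.id("_storage") },
  handler: async (ctx, { storageId }) => {
    // Heavy XLSX parsing / upserting goes here, with ~512MB of memory
    // instead of the 64MB available in the default Convex runtime.
  },
});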
sbkl
sbklOP2w ago
Yes, this is what I am doing. I am also using the workpool component and various optimisations to stream the Excel workbook, and I'm hitting a wall. The most rows I have been able to upsert, enqueuing an action for each batch and trying various batch sizes (500 and 1,000 records), is about 200k. Then at some point I get this error. Any direction on this would be appreciated!

Use case: 8,000 articles across 80 stores for their quantity allocations, so roughly 700k rows to upsert. I'm not hitting the transaction argument limit, and I make sure I stay under the 600-second action limit and the byte limit for function args. I am also storing the file with Convex file storage, passing the storage ID to the actions, and retrieving the workbook with the XLSX package.
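Roughly, that retrieval step looks like this (a minimal sketch, assuming a "use node" internal action; the function name is illustrative and only the first sheet is read):

"use node";

import { internalAction } from "./_generated/server";
import { v } from "convex/values";
import * as XLSX from "xlsx";

export const parseWorkbook = internalAction({
  args: { storageId: v.id("_storage") },
  handler: async (ctx, { storageId }) => {
    // Load the uploaded Excel file back out of Convex file storage.
    const blob = await ctx.storage.get(storageId);
    if (blob === null) throw new Error("Workbook not found in storage");

    // Parse the workbook in memory and flatten the first sheet to row objects.
    const buffer = new Uint8Array(await blob.arrayBuffer());
    const workbook = XLSX.read(buffer, { type: "array" });
    const sheet = workbook.Sheets[workbook.SheetNames[0]];
    const rows = XLSX.utils.sheet_to_json(sheet);

    // ...enqueue batched upsert mutations from here.
  },
});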
Eternal Mori
Eternal Mori2w ago
Can't you use migrations or a workflow to get around the time limit?
bobjoneswins
bobjoneswins2w ago
I am interested in seeing solutions for this kind of problem as I need to upload large files also.
sbkl
sbklOP7d ago
@Eternal Mori this is what I am doing. I am using "use node" for the memory headroom, streaming the content of the Excel file so as not to hit memory too hard, being very careful not to exceed the argument limits for actions, queries, and mutations, using workflows and batching so no action hits the time limit, and indexing and paginating so I don't read too many bytes from the database, etc. The error message "Your request couldn't be completed. Try again later." is not very helpful for understanding what is wrong, and I would like to know if there is a way to investigate the root cause further.
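For context, the batching itself boils down to something like this (a simplified sketch; the real version enqueues the work through the workpool/workflow components, and `internal.allocations.upsertBatch` is a hypothetical mutation name):

// Inside the "use node" action, after `rows` has been parsed from the workbook.
// Assumes `import { internal } from "./_generated/api";` at the top of the file.
// Each mutation call stays well under the function-argument size limit.
const BATCH_SIZE = 1000;
for (let i = 0; i < rows.length; i += BATCH_SIZE) {
  const batch = rows.slice(i, i + BATCH_SIZE);
  await ctx.runMutation(internal.allocations.upsertBatch, { rows: batch });
}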
bobjoneswins
bobjoneswins7d ago
It sounds like Convex is not as easy to use as the YouTube videos suggest, if you have real-world, complex data tables.
sbkl
sbklOP6d ago
Just tried the streaming import API and uploaded 100k rows as a test. While it took around 2 minutes with regular queries (for database consistency checks) and mutations, it takes 8 minutes with the import API, which is supposed to be faster and more efficient at uploading big datasets than the regular function API.
Convex HTTP API | Convex Developer Hub
Connecting to Convex directly with HTTP
sbkl
sbklOP6d ago
// The index/primary key was added with /api/streaming_import/add_primary_key_indexes
// and confirmed active with /api/streaming_import/primary_key_indexes_ready.
const primaryKey = [
  ["organisationId"],
  ["collectionId"],
  ["regionId"],
  ["marketId"],
  ["storeId"],
  ["materialExternalId"],
  ["articleExternalId"],
];

// Send the rows in chunks of 10,000 records per request.
for (let i = 0; i < batches.length; i += 10000) {
  const batch = batches.slice(i, i + 10000);
  const data = {
    tables: {
      collectionAllocations: {
        primaryKey,
        jsonSchema: {
          type: "object",
          properties: {
            organisationId: { type: "id", tableName: "organisations" },
            collectionId: { type: "id", tableName: "collections" },
            regionId: { type: "id", tableName: "regions" },
            marketId: { type: "id", tableName: "markets" },
            storeId: { type: "id", tableName: "stores" },
            materialExternalId: { type: "string" },
            articleExternalId: { type: "string" },
            units: {
              type: "union",
              value: [
                { type: "number" },
                { type: "null" },
                { type: "literal", value: "x" },
              ],
            },
          },
        },
      },
    },
    messages: batch.map((row) => ({
      tableName: "collectionAllocations",
      data: row,
    })),
  };

  const response = await fetch(
    `${env.CONVEX_URL}/api/streaming_import/import_airbyte_records`,
    {
      method: "POST",
      headers: {
        Accept: "application/json",
        "Content-Type": "application/json",
        "Convex-Client": "streaming-import-0.1.0",
        Authorization: `Convex ${env.DEPLOY_KEY}`,
      },
      body: JSON.stringify(data),
    },
  );

  // Surface failed chunks instead of silently continuing.
  if (!response.ok) {
    throw new Error(
      `Streaming import failed for chunk starting at row ${i}: ${response.status} ${await response.text()}`,
    );
  }
}
I might create another post on this as even if related, it is another issue...
erquhart
erquhart5d ago
I don't know if streaming import is supposed to be faster or more efficient; the point of it is to support large dataset imports, which is difficult to do reliably with regular functions, as you've seen. Apart from the time it takes, is everything working with the import?
sbkl
sbklOP4d ago
Yes, everything works. From other posts, I understood it was supposed to be faster. The difference is quite significant.
erquhart
erquhart4d ago
Ah okay. I may be wrong on the performance expectation, just hadn't heard that before.
