whoami
whoami2y ago

Too many open queries

Unhandled Runtime Error
Error: [CONVEX Q(chats.js:get)] Uncaught Error: Too many open queries (64 > maximum 64)
at performSyscall (../../node_modules/convex/src/server/impl/syscall.ts:21:20)
at startQuery [as startQuery] (../../node_modules/convex/src/server/impl/query_impl.ts:188:25)
at [Symbol.asyncIterator] (../../node_modules/convex/src/server/impl/query_impl.ts:235:4)
at collect [as collect] (../../node_modules/convex/src/server/impl/query_impl.ts:297:19)
at <anonymous> (../convex/chats.ts:63:15)
at map [as map]
Where did I go wrong? Here is my query that causes the error (I think). My code might be messy, so any advice not related to the issue but related to my Convex code in general is appreciated as well.
export const get = query({
  args: {
    id: v.id('chats'),
  },

  handler: async ({ db, auth }, { id }) => {
    const identity = await auth.getUserIdentity()
    if (!identity) {
      throw new Error('Unauthenticated call to query')
    }

    const chat = await db.get(id)
    if (!chat) {
      return {
        error: 'Chat not found',
      }
    }

    const tools = await Promise.all(chat.tools.map((t) => db.get(t)))
    const messages = await db
      .query('messages')
      .withIndex('by_chat', (q) => q.eq('chat', id))
      .order('asc')
      .collect()

    const messagesWithSteps = await Promise.all(
      messages.map(async (m) => {
        const steps = await db
          .query('steps')
          .withIndex('by_message', (q) => q.eq('message', m._id))
          .order('asc')
          .collect()

        return {
          ...m,
          steps,
        }
      }),
    )

    return {
      chat: {
        id: chat._id,
        title: chat.title,
        // get the last 100 messages
        messages: messagesWithSteps.splice(
          messages.length - 100,
          messages.length,
        ),
        tools,
        status: chat.status,
      },
    }
  },
})
10 Replies
lee
lee2y ago
hi @whoami ! the error is saying you can't run more than 64 queries (db.query(...)... ) in parallel. however, i agree this error doesn't make much sense anymore, so i'm removing it today and you shouldn't need to change your code. thanks for reporting!
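for reference, a minimal sketch of a sequential variant (same messages/steps tables and by_message index as your code, not an official recommendation) -- awaiting each collect() in a plain for..of loop keeps only one query open at a time instead of one per message:

// sketch: sequential per-message steps lookup, so at most one query is open at once
const messagesWithSteps = []
for (const m of messages) {
  const steps = await db
    .query('steps')
    .withIndex('by_message', (q) => q.eq('message', m._id))
    .order('asc')
    .collect()
  messagesWithSteps.push({ ...m, steps })
}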
RJ
RJ2y ago
Hey @lee—I've run into this a number of times as well. Does this mean you're going to remove the concurrent queries limit altogether, or raise it to a larger number?
lee
lee2y ago
we're removing the concurrent queries limit altogether. there will still be limits on the number of queries & number of rows & number of bytes read within a single function, to make sure the function stays efficient. but concurrency-wise, you can go wild 🙂
RJ
RJ2y ago
Great, thanks!
whoami
whoamiOP2y ago
I ended up denormalizing the schema and it works now. I hope I can get these notifications earlier. Also, what does removing the limit mean? I assume it was there for some reason, right?
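Roughly what the denormalization looks like (just a sketch with illustrative field names, not my exact schema): the steps now live inline on each message document, so get no longer needs one steps query per message.

// Sketch of a denormalized schema -- field names are illustrative
import { defineSchema, defineTable } from 'convex/server'
import { v } from 'convex/values'

export default defineSchema({
  chats: defineTable({
    title: v.string(),
    status: v.string(),
    tools: v.array(v.id('tools')),
  }),
  tools: defineTable({
    name: v.string(),
  }),
  messages: defineTable({
    chat: v.id('chats'),
    content: v.string(),
    // steps embedded directly on the message instead of a separate `steps` table
    steps: v.array(
      v.object({
        name: v.string(),
        output: v.string(),
      }),
    ),
  }).index('by_chat', ['chat']),
})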
lee
lee2y ago
sorry about that. i deployed the change a few hours ago, so it should be possible to do the pattern you were doing before. denormalization has its benefits, though. the limit was in place before we added limits on total queries & reads (to prevent out-of-memory errors), and before we did queries async at all. with the new patterns opened up and new limits in place to prevent poor performance, the limit on concurrency can be lifted
whoami
whoamiOP2y ago
Ok, I ran into the limits on total queries previously and it got resolved via indexing; then I hit the concurrent queries limit. I really hope I can get some heads-up on the resource limitations. What other limitations should I be aware of before it's too late?
presley
presley2y ago
We can add a document with all the limits. They are: number of queries, number of rows read, and number of bytes read. There are also limits on the number of bytes and rows written.
lee
lee2y ago
great point. we're working on making the limits more discoverable and less surprising. we have some ideas, but please let us know how you would like to discover them. the limits center around building a fast, efficient API for a frontend app. so there are limits on reads and writes, and a limit on time spent in javascript -- mostly to prevent infinite loops.
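as a concrete example of staying under the read limits (just a sketch, reusing the query from the top of this thread): instead of collecting every message and slicing afterwards, you can bound what's read by taking only the newest rows --

// sketch: read at most 100 message rows instead of the whole chat history
const latest = await db
  .query('messages')
  .withIndex('by_chat', (q) => q.eq('chat', id))
  .order('desc')
  .take(100)
const messages = latest.reverse() // back to ascending order for display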
whoami
whoamiOP2y ago
I would say the best place to discover them is, of course, in the code editor; not sure if a VS Code plugin would be possible for this.
