Tom Redman
Convex Community · 3mo ago · 3 replies

"Too many bytes read" when using an index + filter, but no issues when removing the filter

Hey all,

My understanding is that I should use an index to minimize the number of returned results, and that a filter applied on top of that only filters within the indexed subset of documents (rather than doing a whole table scan).

I'm seeing slightly different behavior than expected.

This query throws a "too many bytes" error (and the dashboard log shows it reads ~9k rows, even though my paginate call is limited to just 1 item in this case):

const novemberFirst = 1761955200000;
const { page, isDone, continueCursor } = await ctx.db
  .query("emailEvents")
  .withIndex("by_creation_time", (q) => q.gt("_creationTime", novemberFirst))
  .filter((q) =>
    q.and(
      q.neq(q.field("executionId"), undefined),
      q.or(
        q.eq(q.field("templateId"), undefined),
        q.eq(q.field("templateScheduledFor"), undefined),
        q.eq(q.field("campaignType"), undefined),
        q.eq(q.field("campaignTypeId"), undefined),
        q.eq(q.field("contactId"), undefined)
      )
    )
  )
  .paginate({
    cursor: null,
    numItems: 1,
  });


But this query does not:

const novemberFirst = 1761955200000;
const { page, isDone, continueCursor } = await ctx.db
  .query("emailEvents")
  .withIndex("by_creation_time", (q) => q.gt("_creationTime", novemberFirst))
  .paginate({
    cursor: null,
    numItems: 1,
  });


Notice that I'm also using paginate, which I'd expect to limit the number of results regardless, correct?

I guess I can drop the filter() and manually filter the pages of events, but I'm just wondering where my gap in understanding might be.
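
For reference, here's roughly what I mean by the workaround: paginate on the index alone and post-filter the page in plain JS. This is just an untested sketch using the same fields and cutoff as above; a page could of course come back with fewer than numItems matching documents.

const novemberFirst = 1761955200000;
const { page, isDone, continueCursor } = await ctx.db
  .query("emailEvents")
  .withIndex("by_creation_time", (q) => q.gt("_creationTime", novemberFirst))
  .paginate({
    cursor: null,
    numItems: 1,
  });

// Apply the same conditions as the original .filter(), but in application code.
const filtered = page.filter(
  (doc) =>
    doc.executionId !== undefined &&
    (doc.templateId === undefined ||
      doc.templateScheduledFor === undefined ||
      doc.campaignType === undefined ||
      doc.campaignTypeId === undefined ||
      doc.contactId === undefined)
);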