Jcampuza · 2mo ago

You'd use an index and pagination to avoid that. I can't really think of a use case outside of maybe bulk exports where you'd need 32000+ documents in one query.
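To illustrate the idea, here is a minimal plain-TypeScript sketch of cursor pagination (all names here are hypothetical, not the Convex API): each request reads one page starting at a cursor instead of collecting every document.

```typescript
// A document type and a "table" pretend-sorted by an index on `id`.
type Doc = { id: number; title: string };

const table: Doc[] = Array.from({ length: 10 }, (_, i) => ({
  id: i,
  title: `doc ${i}`,
}));

// Fetch one page starting at `cursor`; return the page and the next cursor.
// Only `pageSize` documents are read per call, never the whole table.
function paginate(cursor: number | null, pageSize: number) {
  const start = cursor ?? 0;
  const page = table.slice(start, start + pageSize);
  const nextCursor =
    start + page.length < table.length ? start + page.length : null;
  return { page, nextCursor };
}

const first = paginate(null, 4); // docs 0..3
const second = paginate(first.nextCursor, 4); // docs 4..7
```

The client keeps passing `nextCursor` back until it is `null`, so even a table with tens of thousands of documents is read a page at a time.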
Maxz · 2mo ago
thanks
Maxz · 2mo ago
Thanks again. Can you help me understand: would this be a problem?
(image attachment, no description)
Maxz · 2mo ago
And these limits only apply to Convex Cloud, right? If I self-host, none of these limits exist?
RJ · 2mo ago
The trouble with .filter() is that it's roughly equivalent to doing .collect() and then filtering in JavaScript, which means you're still reading all of those documents from the database. Using an index is much more efficient, so you want to do all of your filtering in .withIndex, basically always. I don't think there's an "or" combinator for indexes, so you'd probably just want to run two queries. You could, for example, create indexes by_user_tags and by_user_desc and run a query against each of them. Oh, but you want to paginate as well.
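A minimal plain-TypeScript sketch of that two-query "or" idea (the index names by_user_tags and by_user_desc come from the message above; everything else here is hypothetical stand-in code, not the Convex API). The key step is deduplicating by document id, since a document can match both queries.

```typescript
type Doc = { _id: string; user: string; tag?: string; desc?: string };

const docs: Doc[] = [
  { _id: "a", user: "u1", tag: "convex", desc: "intro" },
  { _id: "b", user: "u1", tag: "other", desc: "convex tips" },
  { _id: "c", user: "u1", tag: "convex", desc: "convex deep dive" },
  { _id: "d", user: "u2", tag: "convex", desc: "not u1" },
];

// Stand-ins for the two indexed queries: each narrows by user first,
// then by one field (like by_user_tags and by_user_desc would).
const byUserTag = (user: string, tag: string) =>
  docs.filter((d) => d.user === user && d.tag === tag);
const byUserDesc = (user: string, word: string) =>
  docs.filter((d) => d.user === user && d.desc?.includes(word));

// "OR" = run both queries, then dedupe by _id.
function orQuery(user: string, tag: string, word: string): Doc[] {
  const seen = new Map<string, Doc>();
  for (const d of [...byUserTag(user, tag), ...byUserDesc(user, word)]) {
    seen.set(d._id, d);
  }
  return [...seen.values()];
}

const result = orQuery("u1", "convex", "convex");
```

Here doc "c" matches both queries but appears once in the result, and "d" is excluded because both queries narrow by user first.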
RJ · 2mo ago
In that case I think something like this would be the way to go: https://stack.convex.dev/merging-streams-of-convex-data
Merging Streams of Convex data
New convex-helpers are available now for fetching streams of documents, merging them together, filtering them out, and paginating the results. Wi...
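The core idea behind merging streams can be sketched in plain TypeScript (hypothetical names, not the convex-helpers API): two result sets, each already sorted by the same index key, are merged lazily into one sorted sequence that a paginator can take pages from.

```typescript
// Lazily merge two iterables that are each sorted by `key`.
function* mergeSorted<T>(
  a: Iterable<T>,
  b: Iterable<T>,
  key: (x: T) => number
): Generator<T> {
  const ia = a[Symbol.iterator]();
  const ib = b[Symbol.iterator]();
  let na = ia.next();
  let nb = ib.next();
  // Always yield the smaller head, so the output stays sorted.
  while (!na.done && !nb.done) {
    if (key(na.value) <= key(nb.value)) {
      yield na.value;
      na = ia.next();
    } else {
      yield nb.value;
      nb = ib.next();
    }
  }
  // Drain whichever stream is left.
  while (!na.done) { yield na.value; na = ia.next(); }
  while (!nb.done) { yield nb.value; nb = ib.next(); }
}

// Two id-sorted result sets, e.g. from the two indexed queries:
const merged = [...mergeSorted([1, 4, 6], [2, 3, 5], (x) => x)];
```

Because the merge is a generator, a paginator on top of it only pulls as many items as one page needs.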
RJ · 2mo ago
Also, I believe these limits exist in the self-hosted version too
Maxz · 2mo ago
Thanks, I'm glad it's at least doable when I really need it.
