kabloom
kabloom2mo ago
I'm sure this has been asked here a billion times, but I have not seen any answers that directly answer my question. Is there a way to select only certain columns to query on Convex? Say I have users with id, first_name, last_name, email_address, and I want to read only id and first_name. Is there a way to do this on Convex? Is that what indexes are for as a data structure?
9 Replies
Sara
Sara2mo ago
yes, you could index the table like this:
// in your schema definition, assuming a users table
.index("by_id", ["id", "name"])
kabloom
kabloomOP2mo ago
wait, sorry, I can't see the code you sent. Will that be efficient for pagination and querying?
Sara
Sara2mo ago
I'm half asleep editing this haha
kabloom
kabloomOP2mo ago
also, is there a way to exclude rather than include table columns?
Sara
Sara2mo ago
yes, in your query you could do:
ctx.db.query("someTable").withIndex("by_id", q => q.eq("id", id).eq("name", name))
// anything else you might wanna add
not that I'm aware of, I just omit them from the object
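A minimal sketch of that "omit it from the object" approach before returning query results (the `toPublicUser` helper and the user fields are illustrative, not part of the Convex API):

```typescript
// Illustrative helper: strip a field from each document before returning it
// from a query, since Convex reads whole documents regardless.
type UserDoc = {
  _id: string;
  first_name: string;
  last_name: string;
  email_address: string;
};

function toPublicUser(doc: UserDoc) {
  // destructure out the field to exclude, keep the rest
  const { email_address, ...rest } = doc;
  return rest;
}
```

Note this only shrinks what's sent to the client; the full document is still read on the server, so it won't help with the bytes-read limit.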
kabloom
kabloomOP2mo ago
i don't intend to filter anything, i just want to avoid reading a certain column.
My main issue is actually this warning, which I still get even after adding pagination:
"Many bytes read in a single function execution (actual: 13956006 bytes, limit: 16777216 bytes). Consider using smaller limits in your queries, paginating your queries, or using indexed queries with a selective index range expression."
i could've used a better table design but we're in prod now so we ball
Clever Tagline
Clever Tagline2mo ago
To my knowledge there's no way to limit the columns read in a query. The only way to reduce the size of a query result is to force the query to collect fewer documents. Indices, pagination, and query limits are all ways to do this, but they still return all columns.
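For reference, the three document-reducing options mentioned look roughly like this inside a query handler (a sketch with a hypothetical `messages` table and `by_channel` index, not tested against a real deployment):

```typescript
// inside a Convex query handler — illustrative only

// 1. hard limit on documents read
const recent = await ctx.db.query("messages").take(100);

// 2. pagination (paginationOpts typically comes from the client
//    via usePaginatedQuery)
const page = await ctx.db.query("messages").paginate(paginationOpts);

// 3. selective index range: only read one channel's documents
const channelMsgs = await ctx.db
  .query("messages")
  .withIndex("by_channel", (q) => q.eq("channelId", channelId))
  .collect();
```

All three reduce how many documents are read, but each document still comes back with all of its fields.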
erquhart
erquhart2mo ago
Even in prod you can still migrate to split up the table; that's the general approach here. Whatever larger columns are blowing up your queries, split them out into one or more related tables. This is similar to storing vectors, which tend to be sizeable: they're usually kept in a separate table from the data they relate to for this same reason.
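A sketch of what that split might look like in `convex/schema.ts` (table and field names are made up; `profileBlob` stands in for whatever large column is causing the heavy reads):

```typescript
// convex/schema.ts — illustrative sketch of splitting out a heavy column
import { defineSchema, defineTable } from "convex/server";
import { v } from "convex/values";

export default defineSchema({
  // lean table: the fields most queries actually need
  users: defineTable({
    first_name: v.string(),
    last_name: v.string(),
    email_address: v.string(),
  }),
  // heavy data moved out, linked back by userId
  userProfiles: defineTable({
    userId: v.id("users"),
    profileBlob: v.string(), // the large column that was blowing up reads
  }).index("by_userId", ["userId"]),
});
```

Queries that don't need the heavy data only ever touch `users`, and the blob is fetched from `userProfiles` via the `by_userId` index only when needed.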
kabloom
kabloomOP2mo ago
I see thank you