Zedd · 2mo ago

Running into extremely high RAM use on self-hosted version

I shouldn't be doing anything particularly memory intensive. What could be causing this? I'm using the latest version of Convex as of posting.
(screenshot attached)
3 Replies
Zedd (OP) · 2mo ago
Is there a way to customize caching behaviour / limit the amount of RAM Convex will use? It loads a very large amount of data into RAM on container startup, so my only guess is that it's the indexes?
Zedd (OP) · 2mo ago
It really shouldn't be using this much RAM though.
(screenshot attached)
Zedd (OP) · 2mo ago
// convex/schema.ts
// userDoc, entityDoc, etc. are document validators defined elsewhere (not shown here).
import { defineSchema, defineTable } from "convex/server";

export default defineSchema({
  users: defineTable(userDoc)
    .index("by_access_token", ["access_token"]),
  entities: defineTable(entityDoc)
    .index("by_template", ["template"])
    .index("by_name", ["id", "template"])
    .searchIndex("search_entity_id", {
      searchField: "id",
      filterFields: ["template"],
    }),
  templates: defineTable(templateDoc)
    .index("by_path", ["path"]),
  plans: defineTable(planDoc),
  scrapedData: defineTable(scrapedDataDoc)
    .index("by_self_id", ["id"]),
  runs: defineTable(runDoc),
  renders: defineTable(renderDoc)
    .index("by_run", ["run"])
    .index("by_name", ["name"])
    .index("by_entity", ["entity"])
    .index("by_template", ["template"])
    .index("by_plan", ["plan"]),
  providers: defineTable(providerDoc)
    .index("by_identifier", ["id"]),
});
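For context, indexes like these are read through Convex's withIndex / withSearchIndex query builders. The sketch below is only illustrative: the table, index, and search index names come from the schema above, but the file name, function names, and argument validators are assumptions (entityDoc's field types aren't shown in the thread).

// convex/exampleQueries.ts (hypothetical) - how the indexes above would typically be used.
import { query } from "./_generated/server";
import { v } from "convex/values";

export const entitiesForTemplate = query({
  // "template" may actually be v.id("templates") depending on entityDoc.
  args: { template: v.string() },
  handler: async (ctx, { template }) => {
    // Index range scan over by_template, not a full table scan.
    return await ctx.db
      .query("entities")
      .withIndex("by_template", (q) => q.eq("template", template))
      .collect();
  },
});

export const searchEntities = query({
  args: { term: v.string(), template: v.string() },
  handler: async (ctx, { term, template }) => {
    // Full-text search on the "id" field, filtered to one template.
    return await ctx.db
      .query("entities")
      .withSearchIndex("search_entity_id", (q) =>
        q.search("id", term).eq("template", template)
      )
      .take(10);
  },
});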
There are maybe a few hundred entities at most, all of them small. I also can't find a way to reset / clear the cache or indexes for Convex, or a good method to diagnose the usage. I have under a hundred documents and I'm using SQLite3, so this is such a strange issue.

When trying to export all Convex data I run into this as well: they're all also tiny documents, which should never be more than a kilobyte each, so I'm very confused.

Looking inside the Docker container, the entire convex/data folder only has 1.6G worth of data, and that's including modules, exports, and the entire .sqlite3 database. So it using 6-8 GB of RAM on load seems insane.
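Not an answer from the thread, but one way to sanity-check how much document data is actually stored per table is a throwaway Convex query like the sketch below. The table names come from the schema above; the file name, function name, and the idea of using JSON length as a size proxy are assumptions for illustration only.

// convex/debugSizes.ts (hypothetical) - rough per-table count and size report.
// It reads every document in each table, so only use it on small tables like these.
import { query } from "./_generated/server";

const tables = [
  "users", "entities", "templates", "plans",
  "scrapedData", "runs", "renders", "providers",
] as const;

export const tableSizes = query({
  args: {},
  handler: async (ctx) => {
    const report: Record<string, { count: number; approxBytes: number }> = {};
    for (const table of tables) {
      const docs = await ctx.db.query(table).collect();
      // JSON length is only a rough proxy for on-disk or in-memory size.
      const approxBytes = docs.reduce(
        (sum, doc) => sum + JSON.stringify(doc).length,
        0
      );
      report[table] = { count: docs.length, approxBytes };
    }
    return report;
  },
});

Comparing that report against the container's resident memory (e.g. from docker stats) would at least confirm whether the RAM use is wildly out of proportion to the stored data.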
