zlanich
Convex Community · 2mo ago
2 replies
zlanich

Fan-out / Concurrency / Workpools - Limits?

Hey there! I'm redesigning my parking payment system with the intention of scaling it to more cities. We have times of day when a lot of people are creating new sessions, and aside from the request itself, there is a bunch of background work that needs to be done per parking session -- e.g. external integrations like enforcement providers need to be notified, notifications need to be sent, etc.

If I scale this to more cities, the number of concurrent requests is going to go up, and some of these tasks (enforcement especially) need to be handled immediately to prevent people from getting tickets. However, I try not to include the actual HTTP call to the provider within the parking request itself, because we like to handle provider outages gracefully and let people keep parking. There are also time-sensitive notifications that need to go out quickly enough to be useful (expiry notices, etc.). The rough shape of that decoupling is sketched below.
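To make that concrete, here's roughly the pattern I mean (heavily simplified; the table, file, and function names are just placeholders, not our real code): the mutation records the session, then hands the provider call off to a scheduled action so the request doesn't depend on the provider being up.

```ts
// convex/sessions.ts (simplified sketch; names are placeholders)
import { mutation, internalAction } from "./_generated/server";
import { internal } from "./_generated/api";
import { v } from "convex/values";

export const createSession = mutation({
  args: { plate: v.string(), zoneId: v.string(), minutes: v.number() },
  handler: async (ctx, args) => {
    // Record the session first so the driver is covered even if a provider is down.
    const sessionId = await ctx.db.insert("sessions", {
      plate: args.plate,
      zoneId: args.zoneId,
      expiresAt: Date.now() + args.minutes * 60_000,
    });
    // Hand the enforcement call off to a scheduled action so the external
    // HTTP request never blocks (or fails) the user-facing parking request.
    await ctx.scheduler.runAfter(0, internal.sessions.notifyEnforcement, {
      sessionId,
    });
    return sessionId;
  },
});

export const notifyEnforcement = internalAction({
  args: { sessionId: v.id("sessions") },
  handler: async (ctx, { sessionId }) => {
    // The actual fetch() to the enforcement provider (plus retry/outage
    // handling) lives here, outside the request path.
  },
});
```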

Other jobs that handle aggregation, reporting, etc. can tolerate a bit more delay.

However, I keep seeing things in the docs, like this from Workpool: "maxParallelism: How many actions/mutations can run at once within this pool. Avoid exceeding 100 on Pro, 20 on the free plan, across all workpools and workflows." That makes me feel like we're going to hit a concurrency ceiling really fast. It didn't make much sense to me when something like a single AWS Lambda function can handle 1k concurrent requests. For context, the kind of pool split I was imagining is below.
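This is just how I read the Workpool component's setup, not our actual config; the pool names and the 60/10 split are made up to stay under the documented 100 cap on Pro:

```ts
// convex/convex.config.ts — register one pool per class of work
import { defineApp } from "convex/server";
import workpool from "@convex-dev/workpool/convex.config";

const app = defineApp();
app.use(workpool, { name: "enforcementPool" }); // time-sensitive provider calls
app.use(workpool, { name: "reportingPool" });   // aggregation/reporting, can lag
export default app;
```

```ts
// convex/pools.ts — split the shared parallelism budget across pools
import { Workpool } from "@convex-dev/workpool";
import { components } from "./_generated/api";

// Together these should stay under the ~100 limit on Pro from the docs.
export const enforcementPool = new Workpool(components.enforcementPool, {
  maxParallelism: 60,
});
export const reportingPool = new Workpool(components.reportingPool, {
  maxParallelism: 10,
});
```

Even with a split like that, if every new session enqueues an enforcement call plus a couple of notifications, 100 concurrent executions across everything feels tight at peak, which is what prompted the question.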

Where exactly are the ceilings here? I understand that writing to hotspots in the database can cause concurrency conflicts, but in terms of just invoking concurrent actions, it seems like 100 is way low, and I can't find anything in the docs that clearly conveys where else we might see limits like this.

Any advice is appreciated!