I am seeing a lot of these this afternoon. Are there any slowdowns at Convex right now, or is our app growing out of bounds?

Right now it eventually works, but it takes a lot of retries:

(in the first example, it gave up and exited)
Checking on this now @Mikael Lirbank. We did hit some bumps while updating customer instances ~20 minutes ago, but I would have expected those to be resolved by now.
I'm seeing that your backend did restart at 16:48 (and had momentary blips in the ~10 minute period leading up to that), but has been stable since.
I wouldn't expect you to still be seeing issues at 17:02 (timestamp from your screenshot). Have further pushes succeeded, or are you still seeing retries now?
Gotcha. I am still seeing it unfortunately.
Still working

Okay, I'll get more people on this. It might be unrelated to the issues from earlier.
@Mikael Lirbank is this the same convex dev command running uninterrupted, or have you killed it and started over since?
I've probably killed it a few times. But mostly I let it run...
Ehh, I just killed it a minute ago.
I just started it up and added a comment to one of the Convex function files, and it happened right away.
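(For context, the file in question is an ordinary Convex function module. The sketch below is purely illustrative, with made-up table and index names rather than the actual project code; the point is that saving any change to such a file, even a comment, makes the running convex dev process rebuild and push the functions again.)

```ts
// convex/comments.ts (illustrative sketch, not the actual project code)
import { query } from "./_generated/server";
import { v } from "convex/values";

// Saving any edit to a file like this, even just adding a comment,
// causes the watching `convex dev` process to re-bundle and push.
export const listForPost = query({
  args: { postId: v.id("posts") },
  handler: async (ctx, { postId }) => {
    return await ctx.db
      .query("comments")
      .withIndex("by_post", (q) => q.eq("postId", postId))
      .collect();
  },
});
```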

Hey @Mikael Lirbank! Are you using Node actions with any large imported dependencies, by any chance? I'm wondering if this is a large source code size issue paired with network hiccups.
We do use them, but not in this feature.
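(For readers following along: a Node action is a file in the convex/ directory that starts with a "use node" directive and runs in the Node.js runtime. Everything such a file imports is bundled and uploaded with each push, which is why a heavy dependency can inflate the request. The file name, function, and Stripe usage below are a hypothetical sketch, not this project's code.)

```ts
// convex/billing.ts (hypothetical example of a Node action)
"use node";

import { action } from "./_generated/server";
import { v } from "convex/values";
// Any npm package imported here is bundled into the push payload,
// so a large dependency can grow the upload considerably.
import Stripe from "stripe";

export const createCheckout = action({
  args: { priceId: v.string() },
  handler: async (_ctx, { priceId }) => {
    const stripe = new Stripe(process.env.STRIPE_SECRET_KEY!);
    const session = await stripe.checkout.sessions.create({
      mode: "subscription",
      line_items: [{ price: priceId, quantity: 1 }],
      success_url: "https://example.com/billing/success",
      cancel_url: "https://example.com/billing/cancel",
    });
    return session.url;
  },
});
```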
Thanks for the info. While this does seem to be a large request, we don't see any limits explicitly being hit.
We're still digging into this; your backend itself is reporting that the client sent incomplete data and your CLI client is reporting that the server stopped accepting data before the request finished 😓
We're trying to see if there's a load balancing layer in between that's unexpectedly terminating/resetting the connection at some point during the request.
There's also a very small chance that this is caused by your network connection, so if you have a chance to test on a different network and/or machine that'd be helpful. Hoping to have an update for you tomorrow!