best practices/examples to use httpClient with unstable_cache
For some rather static parts of my app I want to use next.js server-side rendering, making use of the data cache to reduce the number of queries hitting the convex layer. So I'm using the ConvexHttpClient and wrap it in unstable_cache from next/cache.
A working example:
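Roughly along these lines (a minimal sketch; the api.bikeModels.getById function name, the "bikeModels" table, and the env var name are assumptions, not from the original thread):

```typescript
import { ConvexHttpClient } from "convex/browser";
import { unstable_cache } from "next/cache";
import { api } from "@/convex/_generated/api";
import type { Id } from "@/convex/_generated/dataModel";

// One client instance for server-side rendering.
const convex = new ConvexHttpClient(process.env.NEXT_PUBLIC_CONVEX_URL!);

// Wrap the Convex query so its result is stored in the Next.js Data Cache.
export const getCachedBikeModel = unstable_cache(
  async (bikeModelId: Id<"bikeModels">) =>
    convex.query(api.bikeModels.getById, { bikeModelId }),
  ["bikeModel"], // key prefix; the call arguments are added to the cache key automatically
  { revalidate: 300 } // serve cached data, re-query Convex at most every 5 minutes
);
```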
In the component I can then use it with const bikeModel = await getCachedBikeModel(bikeModelId)
This works as expected, but it's a lot of boilerplate to manually create wrappers. Is there any example/guide for how I can automatically generate these from the api?
I'm thinking there should be a way to create something like a CachedHttpClient which can be used as a wrapper around the Convex HttpClient and transparently proxies all queries through unstable_cache. But this seems way above my TypeScript skills 🙂

Thanks for reaching out @TripleSpeeder! We're actually working on a tighter Next.js integration at the moment. Are you using Pages Router or App Router? I imagine App Router would cache the underlying fetch call by default and you wouldn't need unstable_cache.
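For what it's worth, the CachedHttpClient idea above can be sketched with a Proxy; here a plain in-memory Map stands in for unstable_cache so the pattern is visible (all names are illustrative, and in a real app you would swap cached() for a call to unstable_cache):

```typescript
// Sketch of a client whose query methods are all transparently cached.
type AnyFn = (...args: unknown[]) => Promise<unknown>;

// Map-backed memoizer standing in for unstable_cache from next/cache.
function cached(fn: AnyFn): AnyFn {
  const store = new Map<string, Promise<unknown>>();
  return (...args) => {
    const key = JSON.stringify(args); // crude cache key; fine for JSON-serializable args
    if (!store.has(key)) store.set(key, fn(...args));
    return store.get(key)!;
  };
}

// Wrap any client object so every function-valued property goes through cached().
export function cachedClient<T extends object>(client: T): T {
  const wrappers = new Map<PropertyKey, AnyFn>();
  return new Proxy(client, {
    get(target, prop, receiver) {
      const value = Reflect.get(target, prop, receiver);
      if (typeof value !== "function") return value;
      if (!wrappers.has(prop)) {
        wrappers.set(prop, cached((value as AnyFn).bind(target)));
      }
      return wrappers.get(prop);
    },
  });
}
```

One caveat: JSON.stringify keys won't reliably distinguish Convex function references, so a real version would need to include the function path in the cache key as well.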
I think we also need to figure out how to better price cached Convex queries. You shouldn't really need to put another cache in front of them (unless you want to get stale results explicitly - is that the case here?).
yep, eventually, you shouldn't need to worry about caching
we'll have all your values edge cached and we'll have pricing plans with appropriate pricing for cache hits. But we're not there yet. Hopefully later this year.
Yes, I'm using the new app router. I'm still learning how all the pieces work together there with server-side rendering. Caching especially is complex, as there are 4 different cache layers involved (see https://nextjs.org/docs/app/building-your-application/caching). Right now, if I use the ConvexHttpClient for server-side rendering of dynamic routes (e.g. /blog/<blogId>), the app will hit the convex endpoint on every request lifecycle to retrieve the blog entry. I want to leverage the Data Cache, which is persistent. So only the very first request will hit the convex db; all following requests for the same data will get the response directly from the cache, without a network roundtrip to convex db.
Next.js uses this data cache automatically with their patched version of fetch. Unfortunately the patched fetch only caches GET requests, and ConvexHttpClient uses POST everywhere 🤷‍♂️. That's the reason I'm trying to go the manual way with unstable_cache.
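One upside of the manual unstable_cache route is that it composes with tag-based revalidation, so a mutation can explicitly bust the cached entries. A hedged sketch, assuming hypothetical api.blog.get / api.blog.update functions and the env var name:

```typescript
import { ConvexHttpClient } from "convex/browser";
import { unstable_cache, revalidateTag } from "next/cache";
import { api } from "@/convex/_generated/api";

const convex = new ConvexHttpClient(process.env.NEXT_PUBLIC_CONVEX_URL!);

// Cache blog reads across requests, grouped under a shared tag.
export const getCachedBlogEntry = unstable_cache(
  async (blogId: string) => convex.query(api.blog.get, { blogId }),
  ["blogEntry"],
  { tags: ["blog"] }
);

// After writing through Convex, invalidate every cached blog read at once.
export async function updateBlogEntry(blogId: string, body: string) {
  await convex.mutation(api.blog.update, { blogId, body });
  revalidateTag("blog");
}
```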
This sounds awesome. With the current setup I anticipate that the Next-native data cache will be faster than getting a cached response from convex db, especially for users located in Europe or Asia. They will get the data from a local edge cache (Vercel), while convex db will always require a full roundtrip to the US.

yep, that's correct
right now things will have to round trip to caches in the eastern US
I'm probably way too early thinking about caches and speed optimization, as I'm nowhere close to being release-ready, so this is not really urgent. But I want to have the best possible user experience, so these kind of questions are always somewhere on my mind...
This page: https://nextjs.org/docs/app/building-your-application/data-fetching/fetching-caching-and-revalidating#caching-data
Says: "fetch requests that use the POST method are also automatically cached"
Wow, this is maybe a recent change? Or I'm just stupid 🙂 I'm sure that I read some time ago that only GET requests are cached. Will have to test this again 🙂
It's confusing, but the part you quoted is the request memoization during a single render. That only applies to GET requests, is only stored in memory, gets emptied at the end of the render, and is a React feature.
But Next.js has a data cache on top of it, which is used across requests and renders.
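Concretely, the distinction shows up in fetch options inside a Server Component; a sketch (the URL is just a placeholder):

```typescript
// Sketch of a Server Component's data loading; example.com is a placeholder.
async function Page() {
  // Request memoization: identical GET fetches within the SAME render are
  // deduplicated by React. This says nothing about caching across requests.
  const a = await fetch("https://example.com/api/data");
  const b = await fetch("https://example.com/api/data"); // served from the render's memo

  // Data Cache: persists across requests and renders, controlled per fetch.
  // (Options object kept in a variable so plain DOM typings accept the
  // Next-specific `next` property.)
  const revalidateOpts = { next: { revalidate: 60 } }; // refresh after 60 seconds
  const cachedAcrossRequests = await fetch("https://example.com/api/data", revalidateOpts);

  const fresh = await fetch("https://example.com/api/data", {
    cache: "no-store", // opt out of the Data Cache entirely
  });
}
```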
Ah, that makes sense.
I did some more experiments on this topic today. I have a server-rendered component that uses the convex httpclient to query. I added logging to the convex function, so I see whenever the convex layer is hit.
Reloading the page in the browser yields the following logs on the server side:
When using the code snippet I posted above, manually using unstable_cache yields this result:
So it looks like by default the data cache is not used... This is probably more a question for Next then. I'm assuming that either for some reason ConvexHttpClient is not using the patched fetch from next.js, or there are some properties in the fetch request that make next.js think that it should not be cached (maybe some auth headers?).
Anyway, just wanted to let you know this. It's not really an issue for me at the moment. When I have more time for this topic I'll restart investigation 🙂
And some anecdotal data about page loading time, refreshing the same page ~10 times each:
- Without data cache: 1.45-2.2 seconds
- With data cache: 1.2-1.5 seconds
This was all measured running next dev locally, so I'm not sure how this translates to a prod environment. But the trend still seems clear. Since I'm located in Germany, the additional roundtrip to the convex db in the US seems quite significant.

Would definitely be curious whether the data cache is correctly used by the prod build / on Vercel vs locally.
I would still expect some speed boost from the cache, since, as Jamie pointed out, we don't have edge caching yet.