My Convex web app is working flawlessly offline! (I’m 40K ft in the sky)
1. Forked the streaming-chat-gpt Convex template
2. Added endpoints for local LLM support (rough sketch after this list)
3. Set up the Convex backend to run locally, so ./convex-local-backend starts the server
4. Tinkered with package.json to add a script so “npm run dev-local” avoids the Convex cloud initialization and plays nice with my own backend (example below)
5. Added some “magic strings” so I can invoke @gpt to perform actions, like removing the last message exchange or clearing the table (sketch below). (It’s a small PoC, but when it worked, even the dude to my left on the airplane joined to celebrate!)
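
For step 2, here’s roughly the shape of one of those endpoints. A minimal sketch, assuming an Ollama-style OpenAI-compatible server on localhost; the file name, model tag, and the internal.messages.update mutation are placeholders, not the template’s actual code:

```ts
// convex/localLlm.ts (illustrative sketch; the names here are placeholders)
import { internalAction } from "./_generated/server";
import { internal } from "./_generated/api";
import { v } from "convex/values";

export const chat = internalAction({
  args: { messageId: v.id("messages"), prompt: v.string() },
  handler: async (ctx, { messageId, prompt }) => {
    // The backend and the model server both run on this laptop,
    // so this fetch needs zero internet.
    const res = await fetch("http://127.0.0.1:11434/v1/chat/completions", {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({
        model: "llama3", // placeholder model tag
        messages: [{ role: "user", content: prompt }],
      }),
    });
    const json = await res.json();
    // Fill the placeholder chat message with the model's reply.
    await ctx.runMutation(internal.messages.update, {
      messageId,
      body: json.choices[0].message.content,
    });
  },
});
```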
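And the package.json tweak from step 4, approximately. Heads up: the --url/--admin-key flags and the default 3210 port are from memory of the local-backend workflow, so verify against npx convex dev --help and the convex-backend README:

```json
{
  "scripts": {
    "dev-local": "convex dev --url http://127.0.0.1:3210 --admin-key <key-from-your-local-backend>"
  }
}
```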
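The magic strings from step 5 are nothing fancy, just prefix checks at the top of the send mutation. Something in this spirit (simplified; the exact commands and schema are illustrative):

```ts
// convex/messages.ts (simplified sketch of the @gpt command dispatch)
import { mutation } from "./_generated/server";
import { v } from "convex/values";

export const send = mutation({
  args: { body: v.string(), author: v.string() },
  handler: async (ctx, { body, author }) => {
    if (body.startsWith("@gpt clear")) {
      // Wipe the whole messages table.
      for (const msg of await ctx.db.query("messages").collect()) {
        await ctx.db.delete(msg._id);
      }
      return;
    }
    if (body.startsWith("@gpt undo")) {
      // Drop the last exchange, i.e. the two newest rows.
      const latest = await ctx.db.query("messages").order("desc").take(2);
      for (const msg of latest) {
        await ctx.db.delete(msg._id);
      }
      return;
    }
    // Normal path: just store the message.
    await ctx.db.insert("messages", { body, author });
  },
});
```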
I've been having fun adding features in-flight! Big shoutout to the open-source Continue extension for my offline VSCode Copilot! (StarCoder 3B for tab completions, Llama 3 and nomic-embed for the fancier troubleshooting; config sketch below)
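
For anyone who wants the same offline-copilot setup, my Continue config.json looks roughly like this. It assumes all three models are served by Ollama, and the exact model tags are from memory, so adjust to whatever you've pulled:

```json
{
  "models": [
    { "title": "Llama 3", "provider": "ollama", "model": "llama3" }
  ],
  "tabAutocompleteModel": {
    "title": "StarCoder 3B",
    "provider": "ollama",
    "model": "starcoder2:3b"
  },
  "embeddingsProvider": {
    "provider": "ollama",
    "model": "nomic-embed-text"
  }
}
```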
I’m planning on using this web app to display a chatroom-like interaction between multiple agents running in a separate project, and it’s looking super promising!
Attached is a (likely blurry) photo from the plane :P

