How does Convex access a local large language model API (such as GLM3)?
I have trained a GLM3 model locally and exposed it through an API. How can I integrate it with Convex? I would be immensely grateful if someone could help me.👀
4 Replies
You can run something that exposes your local API to the Convex backend. This project outlines how to use cloudflared: https://github.com/recursal/ai-town-rwkv-proxy#step-3---deploy-the-ai-town-proxy-via-cloudflared
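As a concrete sketch of that approach, a cloudflared "quick tunnel" can expose a local model server with no account setup. The port 8000 and the environment variable name LOCAL_LLM_URL below are assumptions, not from the linked project; substitute whatever your GLM3 server and Convex functions actually use.

```shell
# Start a quick tunnel to the local GLM3 API server
# (assumes the server listens on port 8000 -- adjust to yours).
cloudflared tunnel --url http://localhost:8000

# cloudflared prints a public https://<random>.trycloudflare.com URL.
# Store it in a Convex environment variable (LOCAL_LLM_URL is a
# hypothetical name) so your backend functions can reach it:
npx convex env set LOCAL_LLM_URL https://<your-subdomain>.trycloudflare.com
```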
I've also used ngrok and want to try https://tunnel.dev
The trick is to give the Convex backend a URL that points to your local machine by using a service that tunnels your connection.
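Once the tunnel URL is in place, a Convex action can call the local model over HTTP. This is a sketch only: it assumes the GLM3 server speaks the OpenAI-style `/v1/chat/completions` protocol, that the tunnel URL is stored in an environment variable named `LOCAL_LLM_URL`, and that the model is registered as `glm3` — all three are assumptions, not confirmed by this thread.

```typescript
// Builds the request target and body for an OpenAI-compatible chat
// endpoint. Kept as a pure function so it is easy to test in isolation.
export function buildChatRequest(baseUrl: string, prompt: string) {
  return {
    url: `${baseUrl.replace(/\/$/, "")}/v1/chat/completions`,
    body: {
      model: "glm3", // hypothetical model name; match your server's config
      messages: [{ role: "user", content: prompt }],
    },
  };
}

// Inside a Convex action (e.g. convex/chat.ts), the call could look like:
//
// import { action } from "./_generated/server";
// import { v } from "convex/values";
//
// export const ask = action({
//   args: { prompt: v.string() },
//   handler: async (_ctx, { prompt }) => {
//     const { url, body } = buildChatRequest(process.env.LOCAL_LLM_URL!, prompt);
//     const res = await fetch(url, {
//       method: "POST",
//       headers: { "Content-Type": "application/json" },
//       body: JSON.stringify(body),
//     });
//     const json = await res.json();
//     return json.choices?.[0]?.message?.content ?? "";
//   },
// });
```

Actions (rather than queries or mutations) are the right place for this in Convex, since they are the function type that may perform outbound `fetch` calls.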
Thank you very much! I appreciate your response. I will now try to follow your instructions and proceed.😆
Thank you very much. Just this morning I saw your update on GitHub. I have implemented the local LLM deployment of AI Town using Ollama and ngrok.
