punn · 11mo ago

Websocket endpoint

Hi all, is it possible to expose a websocket endpoint in Convex? Here's the Express code snippet we're trying to emulate:
import { RawData, WebSocket } from "ws";
import express, { Request } from "express";
import expressWs from "express-ws";

const { app } = expressWs(express());
const port = 3000;

// Your other API endpoints
app.get("/", (req, res) => {
  res.send("Hello World!");
});

app.ws(
  "/llm-websocket/:call_id",
  async (ws: WebSocket, req: Request) => {
    // callId is a unique identifier of a call, containing all information about it
    const callId = req.params.call_id;

    ws.on("error", (err) => {
      console.error("Error received in LLM websocket client: ", err);
    });
    ws.on("message", async (data: RawData, isBinary: boolean) => {
      // Retell server will send transcript from caller along with other information
      // You will be adding code to process and respond here
    });
  },
);

app.listen(port, () => {
  console.log(`Example app listening on port ${port}`);
});
7 Replies
punn (OP) · 11mo ago
@ballingt I noticed another thread mentioning a new protocol you were working on
Michal Srb · 11mo ago
It's not possible to expose a custom websocket from a Convex backend. Does the LLM provider not have a webhook as an option?
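
For reference, Convex can expose plain HTTP endpoints via HTTP actions, which is what a webhook receiver would use. A minimal sketch, assuming a POST webhook with a JSON payload (the path and payload shape are assumptions):

// convex/http.ts
import { httpRouter } from "convex/server";
import { httpAction } from "./_generated/server";

const http = httpRouter();

http.route({
  path: "/llm-webhook",
  method: "POST",
  handler: httpAction(async (ctx, request) => {
    // Parse the provider's event; a real handler would hand it off to a
    // mutation, e.g. await ctx.runMutation(api.calls.handleEvent, { body })
    const body = await request.json();
    console.log("webhook event", body);
    return new Response(null, { status: 200 });
  }),
});

export default http;

Endpoints registered this way are served from the deployment's .convex.site domain.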
punn (OP) · 11mo ago
We're implementing voice calls, so we'd like to respond back with our LLM decision quickly. Do you suggest another provider that you've seen people use?
Indy · 11mo ago
To be clear: Convex does not currently let you expose a custom websocket endpoint from something like happy-animal-123.convex.site the way we do for HTTP. Tom is simply working on replacing the Python client's internals with the Rust client, which uses websockets underneath, so that may have felt related but is unrelated. Some other folks have had luck using fly.io to host a websocket server.
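
For anyone following that route, here's a minimal sketch of what the bridge could look like: a standalone ws server (deployable to fly.io) that forwards incoming messages into Convex over HTTP. The mutation name api.calls.addTranscript is hypothetical:

// server.ts -- standalone websocket server, not part of the Convex deployment
import { WebSocketServer } from "ws";
import { ConvexHttpClient } from "convex/browser";
import { api } from "./convex/_generated/api";

const convex = new ConvexHttpClient(process.env.CONVEX_URL!);
const wss = new WebSocketServer({ port: 3000 });

wss.on("connection", (ws, req) => {
  // e.g. /llm-websocket/:call_id -- take the last path segment as the call id
  const callId = req.url?.split("/").pop() ?? "unknown";

  ws.on("message", async (data) => {
    // Persist the incoming transcript chunk via the (hypothetical) mutation
    await convex.mutation(api.calls.addTranscript, {
      callId,
      payload: data.toString(),
    });
  });
});

The Convex Python client exposes the same query/mutation calls, so the same bridge could be written in Python instead.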
punn (OP) · 11mo ago
Sounds good, we'll explore that and connect to our DB with the Python client.
ian · 11mo ago
> We're implementing voice calls, so we'd like to respond back with our LLM decision quickly
Have you found HTTP webhook latency to be too slow here? Especially if there's an API to respond with the reply, I haven't had issues. E.g., Twilio's call API for phone calls uses HTTP and I've had good luck with it.
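
To illustrate that webhook approach for voice specifically: Twilio's voice webhooks POST form-encoded parameters and expect a TwiML (XML) reply in the response body, so the LLM's answer can be returned inline. A rough sketch as a Convex HTTP action (routed like the earlier example; the SpeechResult parameter assumes a <Gather input="speech"> flow):

// convex/twilio.ts -- hypothetical handler, hedged sketch
import { httpAction } from "./_generated/server";

export const voiceWebhook = httpAction(async (ctx, request) => {
  const form = await request.formData(); // Twilio posts form-encoded params
  const speech = form.get("SpeechResult");
  // An LLM call could go here; the sketch just echoes the caller
  const twiml = `<?xml version="1.0" encoding="UTF-8"?>
<Response><Say>You said: ${speech ?? "nothing"}</Say></Response>`;
  return new Response(twiml, {
    status: 200,
    headers: { "Content-Type": "text/xml" },
  });
});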
punn (OP) · 10mo ago
We'll give that a shot too
