sbkl
Convex Community · 14mo ago
6 replies
sbkl

Using Gemini with OpenAI library working in action but not HTTP action

I followed this guide to integrate AI streaming in my Convex app. But instead of OpenAI, I want to use Gemini because of its larger context window (1M tokens for gemini-1.5-flash). I get a bad request response just using the basic example from the Google docs here. However, the exact same code works in a regular action, which I assume is because HTTP actions cannot run Node packages? What I don't get is why it is supposed to work with OpenAI (I cannot try it, as I am not located in a country where OpenAI is available) but it doesn't work for Gemini, yet it still works in actions. Any idea if there is a way to make it work?
Here is the code as an example (Gemini API key created via Google AI Studio):

  import OpenAI from "openai";

  // Point the OpenAI SDK at Gemini's OpenAI-compatible endpoint
  const openai = new OpenAI({
    apiKey: process.env.GEMINI_API_KEY,
    baseURL: "https://generativelanguage.googleapis.com/v1beta/openai/",
  });
  const response = await openai.chat.completions.create({
    model: "gemini-1.5-flash",
    messages: [
      { role: "system", content: "You are a helpful assistant." },
      {
        role: "user",
        content: "Explain to me how AI works",
      },
    ],
  });
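For context, here is a minimal sketch of the streaming HTTP action version I'm attempting, following the guide's pattern. The export name chatStream is just a placeholder, and I'm assuming the openai package works in the Convex runtime since it is fetch-based:

  import { httpAction } from "./_generated/server";
  import OpenAI from "openai";

  // Placeholder export name; route it from convex/http.ts as in the guide
  export const chatStream = httpAction(async (ctx, request) => {
    const openai = new OpenAI({
      apiKey: process.env.GEMINI_API_KEY,
      baseURL: "https://generativelanguage.googleapis.com/v1beta/openai/",
    });

    // Ask for a streaming completion; the SDK returns an async iterable of chunks
    const stream = await openai.chat.completions.create({
      model: "gemini-1.5-flash",
      stream: true,
      messages: [
        { role: "system", content: "You are a helpful assistant." },
        { role: "user", content: "Explain to me how AI works" },
      ],
    });

    // Forward each delta to the client as it arrives
    const encoder = new TextEncoder();
    const body = new ReadableStream({
      async start(controller) {
        for await (const chunk of stream) {
          const text = chunk.choices[0]?.delta?.content ?? "";
          if (text) controller.enqueue(encoder.encode(text));
        }
        controller.close();
      },
    });

    return new Response(body, {
      headers: { "Content-Type": "text/plain; charset=utf-8" },
    });
  });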
Referenced guide: AI Chat with HTTP Streaming