Private LLM - Agent Components
I am exploring how to use the Convex Agent component to build an AI agent, as described in your documentation. Specifically, I would like to clarify whether the languageModel parameter of the Agent constructor can accept a custom language model, such as a privately hosted LLM (for example, Ollama deployed in a VPC).
Does the Agent component support only predefined providers such as OpenAI and Grok, or can it also be configured to integrate with a custom chat model hosted on our own infrastructure? If custom integration is supported, could you provide some guidance or an example of how to implement it?
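For context, here is a minimal sketch of what I am hoping to do. I am assuming that the Agent component accepts any AI SDK-compatible language model, and that Ollama's OpenAI-compatible endpoint can be wired in via an OpenAI-compatible provider; the endpoint URL and model name below are placeholders for our setup, and I have not verified this actually works:

```typescript
// Hypothetical sketch (assumed API, not verified against the docs):
import { Agent } from "@convex-dev/agent";
import { createOpenAICompatible } from "@ai-sdk/openai-compatible";
import { components } from "./_generated/api";

// Ollama serves an OpenAI-compatible API under /v1, so pointing an
// OpenAI-compatible AI SDK provider at the private VPC endpoint seems
// like the natural approach. The baseURL below is a placeholder.
const ollama = createOpenAICompatible({
  name: "ollama",
  baseURL: "http://ollama.internal.example:11434/v1",
});

// Assumption: languageModel accepts any AI SDK language model instance,
// not just ones from the predefined providers.
const agent = new Agent(components.agent, {
  name: "private-agent",
  languageModel: ollama("llama3.1"), // model name as served by Ollama
  instructions: "You are a helpful assistant.",
});
```

If this is roughly the right shape, a confirmation would be great; if the component expects something different for self-hosted models, a pointer to the correct pattern would be much appreciated.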
Thank you so much for your help.
