Ismayl · 2h ago

Private LLM - Agent Components

Hi, I am exploring how to use the Convex Agent component for building an AI agent, as outlined in your documentation. Specifically, I want to clarify whether it is possible to pass a custom language model to the languageModel parameter of the Agent constructor, for example a privately hosted model served via Ollama inside a VPC. Does the Agent component support only predefined providers like OpenAI and Grok, or can it be configured to integrate with a custom chat model hosted on our own infrastructure? If custom integration is supported, could you give me some guidance or an example of how to implement it? Thank you so much for your help.
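
For context, here is roughly what I have in mind: a minimal sketch assuming the Agent component accepts any AI SDK-compatible language model, and that my Ollama deployment exposes its OpenAI-compatible /v1 endpoint to the Convex deployment. The base URL, model name, and option names below are placeholders from my side, not something I have confirmed against the component's API:

```ts
// convex/agent.ts — sketch only; names and URLs are assumptions, not verified.
import { createOpenAI } from "@ai-sdk/openai";
import { Agent } from "@convex-dev/agent";
import { components } from "./_generated/api";

// Point the AI SDK's OpenAI provider at the private Ollama endpoint in our VPC.
// Ollama serves an OpenAI-compatible API under /v1; the apiKey is ignored by
// Ollama but required by the provider.
const ollama = createOpenAI({
  baseURL: "http://ollama.internal.example:11434/v1", // hypothetical VPC hostname
  apiKey: "ollama",
});

// Construct the agent with the custom model. I'm assuming the languageModel
// option (as named in the docs) takes any AI SDK language model instance.
export const privateAgent = new Agent(components.agent, {
  name: "private-llm-agent",
  languageModel: ollama.chat("llama3.1"), // any model pulled on the Ollama server
  instructions: "You are a helpful assistant.",
});
```

Is this roughly the intended integration path, or does the component restrict which providers can be used?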
1 Reply
Convex Bot · 2h ago
Thanks for posting in <#1088161997662724167>. Reminder: If you have a Convex Pro account, use the Convex Dashboard to file support tickets.
- Provide context: What are you trying to achieve, what is the end-user interaction, what are you seeing? (full error message, command output, etc.)
- Use search.convex.dev to search Docs, Stack, and Discord all at once.
- Additionally, you can post your questions in the Convex Community's <#1228095053885476985> channel to receive a response from AI.
- Avoid tagging staff unless specifically instructed.
Thank you!