Sorin
Sorin · 3mo ago

[components/agent] set chat model per request

Hello, is there a way to set the model per request, as part of the action that sends the user prompt? I tried a few things, but I can't find a way to propagate it down to the actual agent.
2 Replies
Convex Bot
Convex Bot · 3mo ago
Thanks for posting in <#1088161997662724167>. Reminder: If you have a Convex Pro account, use the Convex Dashboard to file support tickets. - Provide context: What are you trying to achieve, what is the end-user interaction, what are you seeing? (full error message, command output, etc.) - Use search.convex.dev to search Docs, Stack, and Discord all at once. - Additionally, you can post your questions in the Convex Community's <#1228095053885476985> channel to receive a response from AI. - Avoid tagging staff unless specifically instructed. Thank you!
Sorin
Sorin (OP) · 3mo ago
found it under streamText:
const result = await supportAgent.streamText(
  ctx,
  { threadId },
  { promptMessageId, model },
  {
    saveStreamDeltas: {
      chunking: "line",
    },
  }
);
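Since the model in that call typically comes from the client request, a minimal sketch of validating a client-supplied model id against an allowlist before passing it to streamText might look like this. The model ids, the `ALLOWED_MODELS` set, and the `resolveModelId` helper are all illustrative assumptions, not part of the Agent API:

```typescript
// Hypothetical helper: map a client-supplied model id to one we allow,
// falling back to a default. Ids here are placeholders, not an endorsement
// of any particular provider's model names.
const ALLOWED_MODELS = new Set(["gpt-4o-mini", "gpt-4o"]);
const DEFAULT_MODEL = "gpt-4o-mini";

export function resolveModelId(requested?: string): string {
  // Only honor the request if it names a model we explicitly allow.
  if (requested !== undefined && ALLOWED_MODELS.has(requested)) {
    return requested;
  }
  return DEFAULT_MODEL;
}
```

The resolved id would then be turned into a model instance (via whatever provider setup the agent uses) and passed as `model` in the second options argument of `streamText`, as in the snippet above.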