llama farm
All this work is coordinated using Convex as a kind of job queue. This lets the LLM cluster scale up or down, and no machine needs to expose ports to the Internet to participate.
Pretty neat idea!
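To make the pattern concrete, here is a minimal in-memory sketch of a pull-based job queue in plain TypeScript. This is not the actual Convex code from the repo; the names (`JobQueue`, `enqueue`, `claim`, `complete`) are illustrative, and the real app stores jobs in Convex tables instead of an array. The key idea is the same: workers *pull* jobs, so they only need outbound connectivity.

```typescript
// Illustrative sketch only -- the real llama-farm uses Convex tables,
// not an in-memory array. Names here are hypothetical.
type Job = {
  id: number;
  prompt: string;
  status: "pending" | "claimed" | "done";
  result?: string;
};

class JobQueue {
  private jobs: Job[] = [];
  private nextId = 1;

  // Client side: enqueue a prompt for some worker to pick up.
  enqueue(prompt: string): number {
    const id = this.nextId++;
    this.jobs.push({ id, prompt, status: "pending" });
    return id;
  }

  // Worker side: poll for work. Because workers pull jobs over an
  // outbound connection, they never need to expose a port.
  claim(): Job | undefined {
    const job = this.jobs.find((j) => j.status === "pending");
    if (job) job.status = "claimed";
    return job;
  }

  // Worker reports the LLM's answer back.
  complete(id: number, result: string): void {
    const job = this.jobs.find((j) => j.id === id);
    if (job) {
      job.status = "done";
      job.result = result;
    }
  }

  get(id: number): Job | undefined {
    return this.jobs.find((j) => j.id === id);
  }
}

// Usage: one client enqueues, one worker claims and completes.
const queue = new JobQueue();
const id = queue.enqueue("Why do llamas hum?");
const job = queue.claim(); // worker pulls the pending job
if (job) queue.complete(job.id, "Nobody knows.");
console.log(queue.get(id)?.status); // "done"
```

Scaling the cluster just means running more workers that call `claim` in a loop; marking a job `"claimed"` before processing keeps two workers from grabbing the same one.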
Tweet: https://twitter.com/ianmacartney/status/1787663174394876000 (follow @ian!)
Hosted: https://labs.convex.dev/llama-farm
The code: https://github.com/get-convex/llama-farm-chat
- By default it's just you & the LLM in the chat.
- Share the URL for others to join in.
- Streams to all users at once.
I'll add logins, rate limiting, etc. if it gets traffic.
5/7/24, 1:58 AM
