Why no prebuilt convex-local-backend for Windows?
The AI Town project finally supporting a fully local setup is amazing and has so much potential to go viral, but at the moment it's practically impossible to run it on Windows (unless you jump through all kinds of hoops to run it in WSL, which most people won't do). Is there a reason why a Windows prebuilt version is not provided?
14 Replies
Just to reiterate how meaningful I think this is: when I first came across the AI Town project, I didn't even bother because I didn't want to waste money paying for the OpenAI API, and I'm sure most people felt the same and only talked about it instead of actually trying it out. But with Llama 3 support, the entire thing is 100% free to experiment with, which is why I think there's huge potential here. It has to support ALL OSes out of the box, though, so everyone can play with it (a lot of AI enthusiasts are Windows users, not Mac users). I would really love to get this working on Windows without hassle, and the only thing blocking that is the lack of a prebuilt binary.
Have you had trouble building it from source on Windows? I believe you could build a Windows binary for whatever versions you want to support and distribute it. It should be portable because it's Rust, but I don't have much Windows experience.
Someone in #open-source has built it on Windows, so it's possible; the README has build instructions. Other than a long Rust build, the steps are quick.
I'm curious about your Pinokio project: do you distribute packaged-up binaries from there? You could bundle up Ollama and a Convex binary you build, I imagine?
If someone has built it, they should share it. I did try to build it, but it's not easy, because some dependencies fail to compile with Microsoft Visual C++. You have to use a very specific configuration, otherwise it fails, and I tried all kinds of different configurations but gave up in the end because it's too much work; I'm just a user. I'm just wondering why the Convex team isn't distributing this. Windows is not some obscure platform; in fact, a huge chunk of the AI ecosystem uses Windows because they have to use Nvidia GPUs (way more than Macs). So the reason I was asking is that, IF this is possible at all, all that's needed is ONE person who successfully builds a binary for Windows x86-64, and then everyone can use that binary.
I'm just curious why this hasn't happened.
I actually stumbled upon this while building a 1-click launcher for AI Town within Pinokio. I already have it running on my Mac.
I believe one reason is that we can only compile the Rust from Unix, which is what our infrastructure runs on. We'd need to manage a Windows server just to do builds, I believe.
But then I tried to get it to work on Windows, spent like a whole day on it, and gave up.
Sorry to hear that! The #open-source channel is a good spot to ask the community for support
The way Pinokio works is, it does NOT package everything into one binary. It's an automation engine that can automate anything on your computer using a script, which can be triggered through a user-friendly UI in the browser. So for AI Town, for example, I wrote a script that downloads the Convex local server binary, runs npm install, creates a conda environment so it runs Node 18 (it looks like Convex doesn't support newer Node versions for some reason), and so on.
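Just to make that concrete, here's a rough sketch of those steps as a plain Node script rather than Pinokio's actual script format. The download URL, conda env name, and ai-town directory are placeholders I made up, and the Node 18 / conda details are just my reading of the steps above:

```js
// Sketch only: plain Node, not Pinokio's script format.
// The URL, env name, and project dir are placeholders.
const { execSync } = require("node:child_process");
const fs = require("node:fs");
const https = require("node:https");

// Download a file, following a single redirect (release assets often redirect).
function download(url, dest) {
  return new Promise((resolve, reject) => {
    https
      .get(url, (res) => {
        if (res.statusCode >= 300 && res.headers.location) {
          res.resume();
          resolve(download(res.headers.location, dest));
          return;
        }
        const file = fs.createWriteStream(dest);
        res.pipe(file);
        file.on("finish", () => file.close(resolve));
        file.on("error", reject);
      })
      .on("error", reject);
  });
}

async function main() {
  // 1. Grab a prebuilt convex-local-backend binary (placeholder URL).
  await download(
    "https://example.com/convex-local-backend-x86_64.zip",
    "convex-local-backend.zip"
  );

  // 2. Create a conda env pinned to Node 18, since Convex seems to want Node 18.
  execSync("conda create -y -n aitown -c conda-forge nodejs=18", {
    stdio: "inherit",
  });

  // 3. Install AI Town's JS dependencies inside that env.
  execSync("conda run -n aitown npm install", {
    stdio: "inherit",
    cwd: "ai-town", // placeholder: wherever the AI Town checkout lives
  });
}

main().catch((err) => {
  console.error(err);
  process.exit(1);
});
```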
Anyway, I really think the Convex team should look into whether it's possible to build at least one prebuilt version that works on Windows. Of course it would be better to have an automated build pipeline for every change, but right now you're missing out on a really huge opportunity by not supporting Windows, even though AI Town could be a really cool project to showcase the tech.
I say this as someone who has just played around with it for a while on my Mac, where it's running fine. The AI Town project is really cool, and with Llama 3 support it's a completely different game; it shouldn't be treated as just a dev tool, but as something anyone can use.
I will probably release the AI Town launcher one way or another (the way it works, anyone who can click a button can instantly install and run AI Town in just a minute, without messing with the terminal and so on), but I just thought it would have the biggest impact if it works on all platforms out of the box (especially since generative AI people use Windows more than Mac). Hope this makes sense.
I totally agree Windows should be an easy platform to work on. Quick question for your use case:
If you had a Windows binary from anyone who's built it on Windows, could you just upload it somewhere and have your script download it from there?
One alternative idea is to just put the Linux binary in a Docker container and run that on Windows. Does your script allow for running Docker containers?
> If you had a windows binary from anyone who's built it on windows, could you just upload it somewhere and have your script download it from there?
I already do this for mac/linux: https://github.com/cocktailpeanutlabs/aitown/blob/main/install.js#L24-L36
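To illustrate what I mean: if a Windows build existed, the install script could pick the right asset per platform with something like the snippet below. This isn't the real install.js code; the URLs are made up, and the win32 entry is exactly the asset that's missing today.

```js
const os = require("node:os");

// Made-up asset URLs for illustration; the Windows one is the missing piece.
const BINARIES = {
  "darwin-arm64": "https://example.com/convex-local-backend-aarch64-apple-darwin.zip",
  "darwin-x64": "https://example.com/convex-local-backend-x86_64-apple-darwin.zip",
  "linux-x64": "https://example.com/convex-local-backend-x86_64-unknown-linux-gnu.zip",
  "win32-x64": "https://example.com/convex-local-backend-x86_64-pc-windows-msvc.zip",
};

const key = `${os.platform()}-${os.arch()}`;
const url = BINARIES[key];
if (!url) {
  throw new Error(`no prebuilt convex-local-backend for ${key}`);
}
console.log(`would download ${url}`);
```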
IMO Docker is too heavy
both tech-wise and UX-wise
Gotcha. I'll discuss with the team more on Monday about what we can do for Windows. Thanks for your patience!
Awesome, thank you!
Want to add your 1-click install to the ai-town README btw? Feel free to submit a PR
Oh thank you! I was going to do that, but it looks like someone already added it 🙂