With the recent AI craze, it’s fun to play with local models. I like ollama, which makes it easy to download and run various models locally. Thanks to this recently merged PR, it’s now trivial to run your own ollama service on NixOS:
```nix
services.ollama.enable = true;
```
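Depending on your nixpkgs version, the module exposes a few more knobs beyond `enable`. A minimal sketch, assuming the `acceleration` option is available in your channel:

```nix
services.ollama = {
  enable = true;
  # Offload inference to the GPU: "rocm" for AMD, "cuda" for NVIDIA.
  # Check the module options for your nixpkgs version before relying on this.
  acceleration = "cuda";
};
```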
Adding this to your host config and rebuilding is all it takes to start the service; you can then interact with it via the CLI or REST interface. For instance, to talk to the mistral model, you can do:
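```bash
ollama run mistral
```

Or over the REST API, assuming the service is listening on the default port 11434:

```bash
curl http://localhost:11434/api/generate -d '{
  "model": "mistral",
  "prompt": "Why is NixOS reproducible?"
}'
```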
On first run it will automatically download the model, and then you can interact with it. Ollama supports a bunch of models, so have fun exploring! If you are of that inclination, there’s also a neovim plugin.
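If you’d rather fetch models ahead of time instead of waiting on first run, the CLI has `pull` and `list` subcommands:

```bash
# Download a model without starting a chat session
ollama pull llama2

# Show the models available locally
ollama list
```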