Ellama, an Emacs AI helper.


Installing Ollama.

In my case I installed Ollama from my package manager, but you can also install it with the official install script:

curl -fsSL https://ollama.com/install.sh | sh
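
Either way, you can confirm afterwards that the binary is on your PATH:

ollama --version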

Once Ollama is installed, you should add it to your session startup. In my case I do it from my Hyprland config, but how you do it will depend on whether you use .xinitrc or something else. The command to launch is:

ollama serve
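
As a reference, here is a minimal sketch of what that autostart entry could look like; the exact file and syntax depend on your setup, and I am assuming the default config locations here:

# ~/.config/hypr/hyprland.conf (Hyprland)
exec-once = ollama serve

# ~/.xinitrc (X11), before exec'ing your window manager
ollama serve &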

Installing a model.

To use Ollama we need to download a model. A model is basically what you will be talking to when you make a query. In the case of ellama, the documentation suggests zephyr as the model, so let's install it:

ollama pull zephyr
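
Once the download finishes, you can check that the model is available and give it a quick test from the terminal:

ollama list
ollama run zephyr "Hello, who are you?"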

Installing ellama.

Once Ollama and zephyr are installed, we can move on to installing and configuring ellama in Emacs.

My ellama configuration.

(use-package ellama
  :init
  ;; General options: language, user nickname and keybinding prefix.
  (setopt ellama-language "English")
  (setopt ellama-user-nick "jpacheco.xyz")
  (setopt ellama-keymap-prefix "C-c e")
  ;; Use the local Ollama server with the zephyr model as the provider.
  (require 'llm-ollama)
  (setopt ellama-provider
          (make-llm-ollama
           :chat-model "zephyr"
           :embedding-model "zephyr")))

As you can see, it's not very complicated. The ellama options are quite intuitive, like the language and the keybinding prefix. If you have any questions, feel free to comment and we can follow up.
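
To try it out, start a chat from Emacs with the ellama-chat command; the rest of ellama's commands should be available under the C-c e prefix we configured above:

M-x ellama-chat

This opens a chat buffer that talks to the zephyr model through the local Ollama server.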