yay -S llama.cpp
I just installed llama.cpp on CachyOS after reading this article. In my testing it's noticeably faster than Ollama, and I prefer it overall.