Hacker News

denismi · today at 7:42 AM · 2 replies

Hmm..

  pacman -Ss ollama | wc -l
  16
  pacman -Ss llama.cpp | wc -l
  0
  pacman -Ss lmstudio | wc -l
  0
Maybe some day.

Replies

mongrelion · today at 8:06 AM

llama.cpp moves too quickly to be packaged in the stable repos. Instead, you can get it directly from the AUR: https://aur.archlinux.org/packages?O=0&K=llama.cpp

There are packages for Vulkan, ROCm and CUDA. They all work.
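For reference, a minimal sketch of installing one of those AUR builds with an AUR helper such as yay. The exact variant package names below are assumptions based on the AUR listing linked above; check the search results and pick the one matching your GPU stack:

```shell
# Search the AUR for llama.cpp builds (names vary; verify on the AUR page)
yay -Ss llama.cpp

# CPU-only build
yay -S llama.cpp

# Or a GPU-accelerated variant (assumed names, e.g.):
#   yay -S llama.cpp-vulkan   # Vulkan backend
#   yay -S llama.cpp-cuda     # NVIDIA CUDA backend
#   yay -S llama.cpp-hip      # AMD ROCm/HIP backend

# Smoke test: the package ships the llama-cli binary
llama-cli --version
```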

FlyingSnake · today at 8:41 AM

  yay -S llama.cpp

I just installed llama.cpp on CachyOS after reading this article. It’s much faster and better than Ollama.