Hacker News

seemaze · yesterday at 3:24 PM · 1 reply

I realize it does not address the OP's security concerns, but I'm having success running rocm containers[0] on Alpine Linux specifically for llama.cpp. I also got vLLM to run in a rocm container, but I didn't have time to diagnose perf problems, and llama.cpp is working well for my needs.

[0] https://github.com/kyuz0/amd-strix-halo-toolboxes


Replies

WhyNotHugo · yesterday at 10:08 PM

FWIW, Alpine now has native packages for llama.cpp (using Vulkan).