
DeathArrow · today at 7:34 AM · 1 reply

I see no mention of vLLM in the article.


Replies

StrauXX · today at 8:12 AM

vLLM isn't suitable for people running LLMs side by side with regular applications on their PC; it is designed for hosting LLMs in production on dedicated servers. For that production use case, ollama/llama.cpp are practically useless (which is fine, since that was never those projects' goal).
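For context, "hosting for production" with vLLM usually means launching its OpenAI-compatible HTTP server on a dedicated GPU box. A minimal sketch (the model name here is just an example; any Hugging Face model vLLM supports works):

```shell
# Install vLLM (typically requires a CUDA-capable GPU)
pip install vllm

# Launch the OpenAI-compatible API server on port 8000
# (model name is illustrative, not a recommendation)
vllm serve Qwen/Qwen2.5-0.5B-Instruct --port 8000

# Any OpenAI-compatible client can then talk to it, e.g.:
curl http://localhost:8000/v1/completions \
  -H "Content-Type: application/json" \
  -d '{"model": "Qwen/Qwen2.5-0.5B-Instruct", "prompt": "Hello", "max_tokens": 16}'
```

This server-centric design (continuous batching across many concurrent requests) is what makes vLLM strong for production and overkill for a single user running a model next to their desktop apps.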