Hacker News

array_key_first · yesterday at 4:38 PM · 1 reply

Well, it depends on the task. For agentic coding, more is more, but for the tasks normal consumers use them for, there really is a ceiling. OCR, text-to-speech, that type of thing doesn't really improve when you step up to a SOTA model, so you'd just be wasting your money. I think local LLMs have more value than software engineers give them credit for.


Replies

tuananh · today at 3:03 AM

Totally agree with that. A local LLM doesn't need to match SOTA performance in order to be useful.