Hacker News

Greed · yesterday at 3:38 PM · 4 replies

If $40k is the barrier to entry for impressive results, that doesn't really sell the use case of local LLMs very well.

For the same price in API calls, you could fund AI-driven development across a small team for quite a long while.

Whether that remains the case once those models are no longer subsidized, TBD. But as of today the comparison isn't even close.
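The comparison can be made concrete with back-of-envelope arithmetic. The per-developer monthly API spend and the team size below are illustrative assumptions, not figures from the thread:

```python
# Back-of-envelope: how long $40k funds API-driven development instead
# of buying local hardware. All figures are assumed for illustration.
hardware_cost = 40_000      # cost of the local rig (USD)
api_spend_per_dev = 200     # assumed monthly API bill per developer (USD)
team_size = 5               # assumed "small team"

monthly_burn = api_spend_per_dev * team_size   # total team API spend per month
runway_months = hardware_cost / monthly_burn   # months the same budget lasts

print(f"${hardware_cost:,} covers {runway_months:.0f} months "
      f"(~{runway_months / 12:.1f} years) of API use")
```

Under these assumptions the $40k stretches to 40 months of API calls, which is the sense in which "the comparison isn't even close" for a small team.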


Replies

jazzyjackson · yesterday at 4:14 PM

It’s what a small business might have paid for an on-prem web server a couple of decades ago, before clouds caught on. I figure if a legal or medical practice saw value in LLMs, it wouldn’t be a big deal to shove $50k into a closet.

blackqueeriroh · today at 2:58 AM

Sure, but now double the team size. Double it again.

Suddenly that $40k is quite reasonable, because you’ll never pay another dollar for at least 2-3 years.
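The doubling argument rests on the fact that the local box is a one-time cost while API spend scales with head count. A minimal sketch, reusing the same assumed $200/month-per-developer figure (hypothetical, not from the thread):

```python
# Fixed hardware cost vs. per-seat API spend as the team doubles.
# The per-developer spend and time horizon are assumed figures.
hardware_cost = 40_000    # one-time local rig cost (USD)
api_spend_per_dev = 200   # assumed monthly API bill per developer (USD)
years = 2.5               # midpoint of the "2-3 years" claim above

for team in (5, 10, 20):  # small team, doubled, doubled again
    api_total = api_spend_per_dev * 12 * years * team
    cheaper = "local wins" if api_total > hardware_cost else "API wins"
    print(f"team={team:2d}: API cost over {years} years = ${api_total:,.0f} ({cheaper})")
```

Under these assumptions the crossover sits around a 10-person team: at that size the cumulative API bill over 2.5 years already exceeds the one-time hardware cost.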

ttoinou · yesterday at 3:48 PM

With an M3 Max with 64GB of unified RAM you can code with a local LLM, so the bar is much lower.

spacedcowboy · yesterday at 6:42 PM

It's not just about the price. I've got a single one of those 512GB machines, and it's pretty damn impressive for a local model.
