Hacker News

EagnaIonat · yesterday at 8:36 AM

https://ollama.com

Although I'm starting to like LM Studio more, as it has features that Ollama is missing.

https://lmstudio.ai

You can then get Claude to create an MCP server to talk to either. Then add a CLAUDE.md that tells it to read the models you have downloaded, determine what each is good for, and when to offload to them. Claude will write all of that for you as well.
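To give a flavour of what such an MCP server would wrap: the core of it is just two calls against Ollama's REST API (GET /api/tags to list downloaded models, POST /api/generate to run one). A minimal stdlib-only sketch, assuming Ollama is running on its default port; the MCP tool-registration scaffolding is omitted, and the function names here are illustrative, not from any existing server:

```python
import json
import urllib.request

# Default Ollama endpoint; adjust if you run it elsewhere (assumption)
OLLAMA_URL = "http://localhost:11434"

def build_generate_payload(model: str, prompt: str) -> dict:
    # Ollama's /api/generate body; stream=False returns one JSON object
    return {"model": model, "prompt": prompt, "stream": False}

def list_models() -> list[str]:
    # GET /api/tags -> {"models": [{"name": "llama3:latest", ...}, ...]}
    with urllib.request.urlopen(f"{OLLAMA_URL}/api/tags") as resp:
        return [m["name"] for m in json.load(resp)["models"]]

def generate(model: str, prompt: str) -> str:
    # POST the prompt and return the completed response text
    req = urllib.request.Request(
        f"{OLLAMA_URL}/api/generate",
        data=json.dumps(build_generate_payload(model, prompt)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["response"]
```

An MCP server would expose `list_models` and `generate` as tools, and the CLAUDE.md would tell Claude to call the first to see what's available before picking a model for the second.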


Replies

shen · yesterday at 4:39 PM

Which local models are you using on the 32 GB MacBooks?
