Hacker News

song · today at 8:38 AM

So, on a Mac, what good alternative to Ollama supports MLX for acceleration? My main use case is that I have an old M1 Max MacBook Pro with 64 GB of RAM that I use as a model server.


Replies

wrxd · today at 9:12 AM

I've read good things about https://omlx.ai, but I don't know the ecosystem well enough to say whether there are better options.

If someone has opinions please let us know!
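For what it's worth, one commonly mentioned option is Apple's mlx-lm package, which includes an OpenAI-compatible HTTP server. A minimal sketch, assuming a recent mlx-lm version (the model name is just an example from the mlx-community Hugging Face org, and `<server-ip>` is a placeholder; check the flags against the current mlx-lm docs):

```shell
# Install the MLX LM tooling (runs on Apple Silicon only)
pip install mlx-lm

# Start an OpenAI-compatible server; the model is fetched on first use.
# Binding to 0.0.0.0 exposes it to other machines on the LAN.
python -m mlx_lm.server \
    --model mlx-community/Meta-Llama-3.1-8B-Instruct-4bit \
    --host 0.0.0.0 --port 8080

# Query it from another machine on the network
curl http://<server-ip>:8080/v1/chat/completions \
    -H "Content-Type: application/json" \
    -d '{"messages": [{"role": "user", "content": "Hello"}]}'
```

Since the server speaks the OpenAI chat-completions protocol, any OpenAI-compatible client should work against it by pointing the base URL at the Mac.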