Hacker News

mogoman · last Wednesday at 8:40 PM · 3 replies

Can you recommend a setup with Ollama and a CLI tool? Do you know if I need a licence for Claude if I only use my own local LLM?


Replies

w4yai · yesterday at 6:33 AM

You must try GLM-4.7 and Kimi K2.5!

I also highly suggest OpenCode. You'll get the same Claude Code vibe.

If your computer isn't beefy enough to run them locally, Synthetic is a blessing for access to these models: their team is responsive, and there's been no downtime or other issues for the last six months.

Full list of models provided: https://dev.synthetic.new/docs/api/models

Referral link if you're interested in trying it for free, with a discount for the first month: https://synthetic.new/?referral=kwjqga9QYoUgpZV

alexhans · last Wednesday at 8:57 PM

What are your needs/constraints? (Hardware constraints are definitely a big one.)

The one I mentioned, continue.dev [1], is easy to try out to see if it meets your needs.

Hitting local models with it should be very easy (it just calls an API on a specified local port).

[1] - https://github.com/continuedev/continue
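To illustrate the port-based setup: Continue can point at a locally running Ollama server with a small config entry. A minimal sketch, assuming Ollama is serving on its default port 11434 — the model tag is illustrative, and field names may differ between Continue versions, so check Continue's config reference for the current schema:

```yaml
# ~/.continue/config.yaml (sketch — verify against Continue's docs)
models:
  - name: Local Ollama          # display name shown in the Continue UI
    provider: ollama            # tells Continue to speak Ollama's local API
    model: glm-4.7-flash        # any model you've already pulled with `ollama pull`
    apiBase: http://localhost:11434   # Ollama's default listen address
```

With something like this in place, Continue talks to whatever model the local server has loaded, and no hosted-API key is needed.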

drifkin · last Wednesday at 9:42 PM

We recently added a `launch` command to Ollama, so you can set up tools like Claude Code easily: https://ollama.com/blog/launch

tl;dr: `ollama launch claude`

glm-4.7-flash is a nice local model for this sort of thing, if you have a machine that can run it.
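Putting the two suggestions together, a session might look roughly like this — a sketch assuming a recent Ollama install with the `launch` command, and that the model tag above is available in the Ollama library:

```shell
ollama pull glm-4.7-flash   # fetch the model mentioned above (tag is illustrative)
ollama launch claude        # walks you through pointing Claude Code at local models
```

Since everything runs against the local server, this answers the original licensing question in practice: no hosted Claude API key is involved.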
