Hacker News

jakemanger · today at 4:56 AM · 2 replies

What are the GPU VRAM requirements for this thing?

Awesome to have an open model that can compete, but damn, it would be so much better if you could run it locally. Otherwise it's so difficult to run (e.g. self-host) that it's just way more convenient to pay OpenAI, Claude, etc.


Replies

DeathArrow · today at 5:05 AM

>Otherwise, it's almost so difficult to run (e.g. self host) that it's just way more convenient to pay OpenAI, Claude, etc

Getting a coding plan from Kimi.com will make coding 20x cheaper than using Anthropic.

BTW, I am using it with Claude Code.
