Hacker News

dvt · yesterday at 2:38 AM

So weird/cool/interesting/cyberpunk that we have stuff like this in the year of our Lord 2026:

   ├── MEMORY.md            # Long-term knowledge (auto-loaded each session)
   ├── HEARTBEAT.md         # Autonomous task queue
   ├── SOUL.md              # Personality and behavioral guidance
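A minimal sketch of how a layout like this might be wired up at session start (a hypothetical loader; only the filenames come from the listing above, everything else is assumption):

```python
from pathlib import Path

# Hypothetical session bootstrap for the file layout shown above:
# concatenate whichever of the agent's persistent files exist into a
# single system prompt, labelled by filename. Missing files are skipped.
AGENT_FILES = ["SOUL.md", "MEMORY.md", "HEARTBEAT.md"]

def build_system_prompt(root: Path) -> str:
    sections = []
    for name in AGENT_FILES:
        path = root / name
        if path.is_file():
            sections.append(f"## {name}\n{path.read_text()}")
    return "\n\n".join(sections)
```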
Say what you will, but AI really does feel like living in the future. As far as the project is concerned, pretty neat, but I'm not really sure about calling it "local-first" as it's still reliant on an `ANTHROPIC_API_KEY`.

I do think local-first will end up being the future long-term, though. I built something similar last year (unreleased), also in Rust, but running the model locally (you can see how slow/fast it is here[1], keeping in mind I have a 3080 Ti and was running Mistral-Instruct).

I need to revisit this project and release it, but building in the context of the OS is pretty mind-blowing, so kudos to you. I think the paradigm of how we interact with our devices will fundamentally shift in the next 5-10 years.

[1] https://www.youtube.com/watch?v=tRrKQl0kzvQ


Replies

backscratches · yesterday at 8:08 AM

Yes, this is not local-first; the name is bad.

halJordan · yesterday at 3:12 AM

You absolutely do not have to use a third-party LLM. You can point it at any OpenAI- or Anthropic-compatible endpoint. It can even be on localhost.
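In practice this is just a base-URL swap: an OpenAI-compatible request is the same JSON whether it goes to a cloud API or a local server. A sketch using only the standard library (the port is Ollama's default; the model name is an assumption):

```python
import json
import urllib.request

def chat_request(base_url: str, model: str, prompt: str) -> urllib.request.Request:
    """Build a /v1/chat/completions request for any OpenAI-compatible endpoint."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        f"{base_url}/v1/chat/completions",
        data=json.dumps(payload).encode(),
        headers={
            "Content-Type": "application/json",
            # Local servers typically ignore the key, but the header is expected.
            "Authorization": "Bearer not-needed-locally",
        },
    )

# Same call shape, no cloud key: just point base_url at localhost.
req = chat_request("http://localhost:11434", "qwen3-coder", "hello")
# Sending it would be urllib.request.urlopen(req), assuming a server is running.
```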

atmanactive · yesterday at 3:10 AM

> but I'm not really sure about calling it "local-first" as it's still reliant on an `ANTHROPIC_API_KEY`.

See here:

https://github.com/localgpt-app/localgpt/blob/main/src%2Fage...

__mharrison__ · yesterday at 7:19 AM

I'm playing with a local-first setup of openclaw and Qwen3 Coder Next running on my LAN. Just starting out, but it looks promising.

fy20 · yesterday at 5:49 AM

> Say what you will, but AI really does feel like living in the future.

Love it or hate it, the amount of money being put into AI really is our generation's equivalent of the Apollo program. Over the next few years, more than 100 gigawatt-scale data centres are planned to come online.

At least it's a better use than money going into the military industry.

mycall · yesterday at 6:22 PM

What does Anthropic bring to this project that a local LLM cannot, e.g. Qwen3 Coder Next?

jazzyjackson · yesterday at 6:28 AM

IMHO it doesn't make sense, financially and resource-wise, to run local, given the five-figure upfront cost to get an LLM running slower than what I can get for 20 USD/month.

If I'm running a business with some number of employees to make use of it, and confidentiality is worth something, sure. But am I really going to rely on anything less than the frontier models for automating critical tasks? Or roll my own on-prem IT to support it when Amazon Bedrock will do it for me?
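The back-of-the-envelope here can be made concrete (the hardware figure is an assumed placeholder at the low end of "five figures"; electricity and the multi-user case the comment mentions are ignored):

```python
# Rough single-user break-even: local hardware vs. a hosted subscription.
hardware_cost = 10_000  # assumed low end of "5 figures", USD
subscription = 20       # USD per month

breakeven_months = hardware_cost / subscription  # 500 months
breakeven_years = breakeven_months / 12          # roughly 41.7 years
```

Amortized over a team, or once power and upgrade cycles are priced in on the hosted side, the picture shifts, which is the business case the comment alludes to.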

croes · yesterday at 12:44 PM

> but AI really does feel like living in the future.

Got the same feeling when I put on the HoloLens for the first time, but look what we have now.

IhateAI · yesterday at 4:57 AM

[dead]