Hacker News

simonw · last Saturday at 1:20 AM

These work well with local LLMs that are powerful enough to run a coding agent environment with a decent amount of context over longer loops.

The OpenAI GPT OSS models can drive Codex CLI, so they should be able to do this.
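For anyone wanting to try that locally: one approach (a sketch only — I haven't verified the exact key names against the current Codex CLI docs, and they may differ by version) is to serve the model through an OpenAI-compatible local endpoint such as Ollama, then point Codex at it in its config file:

```toml
# ~/.codex/config.toml — hypothetical sketch; check the Codex CLI
# documentation for the exact provider keys in your version.
model = "gpt-oss:20b"
model_provider = "ollama"

[model_providers.ollama]
name = "Ollama"
# Ollama exposes an OpenAI-compatible API on this port by default.
base_url = "http://localhost:11434/v1"
```

With something like this in place, `codex` runs its agent loop against the local model instead of OpenAI's hosted ones.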

I have high hopes for Mistral's Devstral 2 but I've not run that locally yet.


Replies

bilekas · last Saturday at 1:52 AM

> These work well with local LLMs that are powerful enough to run a coding agent environment with a decent amount of context over longer loops.

That's actually super interesting, and maybe something I'll investigate to find the minimum requirements, because as cool as they seem, personalized 'skills' might be a more useful application of AI overall.

Nice article, and thanks for answering.

Edit: My thinking is that consumer-grade hardware could be good enough to run this soon.

ipaddr · last Saturday at 2:19 AM

Something that powerful requires some rewiring of the house.

Local LLMs are better for long batch jobs, not for things you want immediately, or your flow gets killed.