
qingcharles · yesterday at 5:42 AM

Claude Desktop and Code are built for synchronous, human-in-the-loop interactions. If you're scraping 3,000 janky municipal websites, you need a "fire-and-forget" background worker. Claw lets you kick off a massive job and just get a ping when it's done.

I'd also instantly hit Claude Desktop's rate limits with this, I reckon. Since Claw uses APIs, you bypass those limits and can route the messy scraping to cheap models, saving the expensive ones for the actual analysis. It also handles Playwright integration and state persistence out of the box, so a crash doesn't wipe your progress.
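The state-persistence part doesn't have to be exotic, by the way. A minimal sketch of crash-safe progress tracking (file name and helper functions are hypothetical, not Claw's actual mechanism) might look like:

```python
import json
import os

CHECKPOINT = "scrape_progress.json"  # hypothetical checkpoint file

def load_done() -> set:
    """Return the set of URLs already scraped, or empty on first run."""
    if os.path.exists(CHECKPOINT):
        with open(CHECKPOINT) as f:
            return set(json.load(f))
    return set()

def mark_done(done: set, url: str) -> None:
    """Record url as finished and flush to disk so a crash loses nothing."""
    done.add(url)
    with open(CHECKPOINT, "w") as f:
        json.dump(sorted(done), f)

def run(sites, scrape):
    """Scrape each site once, skipping anything finished before a restart."""
    done = load_done()
    for url in sites:
        if url in done:
            continue  # already scraped in a previous run
        scrape(url)
        mark_done(done, url)
```

On restart, `run` re-reads the checkpoint and picks up where it left off instead of re-scraping everything.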

If I'm wrong, I'm open to learning. I'm as new to this as everyone :)


Replies

dgb23 · yesterday at 8:11 AM

I would first automate everything with scripts, and only use an agent for the parts that require it.

For example, you mentioned Playwright: that can be scripted directly. It doesn't need to be a free-form tool the agent uses at will.

If that means the scripts need to be adapted to changes, then that's a separate, controlled workflow.

This approach can save you a ton of tokens, increase reliability and observability, and save compute as well.

Sometimes it's useful to let the agent do things fully agentically at first, so you can then iteratively extract the deterministic parts.
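One way to make that iteration explicit is a dispatch step (all names here are hypothetical): sites with a known parser go through code, the rest fall back to the agent, and the `via` field in the result tells you where to write the next parser:

```python
def scrape_site(url, fetch, parsers, agent_scrape):
    """Try known deterministic parsers first; fall back to the agent.

    parsers: list of (matches, parse) pairs, where matches(url) -> bool
    and parse(html) -> extracted data. agent_scrape(html) is the
    expensive free-form fallback.
    """
    html = fetch(url)
    for matches, parse in parsers:
        if matches(url):
            return {"url": url, "data": parse(html), "via": "script"}
    # No deterministic parser yet: hand the raw page to the agent,
    # and tag the result so this site shows up as a candidate for one.
    return {"url": url, "data": agent_scrape(html), "via": "agent"}
```

Each run, you count how often `via == "agent"` per site and promote the frequent offenders to deterministic parsers, shrinking the agent's share of the work over time.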