Hacker News

AI is not a coworker, it's an exoskeleton

481 points by benbeingbin yesterday at 7:55 PM | 507 comments

Comments

givemeethekeys yesterday at 9:56 PM

Closer to a really capable intern. Lots of potential for good and bad; needs to be watched closely.

heldrida today at 1:19 PM

Gosh, this title says it all...

So good that I feel it isn't even necessary to read the article!

acjohnson55 yesterday at 10:08 PM

> Autonomous agents fail because they don't have the context that humans carry around implicitly.

Yet.

This is mostly a matter of data capture and organization. It sounds like Kasava is already doing a lot of this. They just need more sources.
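
For a sense of what that data capture and organization could look like, here's a minimal sketch -- the source names and the ContextStore helper are hypothetical illustrations, not Kasava's actual product -- of collecting the context humans carry around implicitly and handing it to an agent as part of its prompt:

    from dataclasses import dataclass, field

    @dataclass
    class ContextStore:
        """Organizes the context humans normally carry around implicitly."""

        facts: list[str] = field(default_factory=list)

        def capture(self, source: str, note: str) -> None:
            # Tag each captured note with the source it came from.
            self.facts.append(f"[{source}] {note}")

        def as_prompt(self, task: str) -> str:
            # Assemble everything captured so far into one agent prompt.
            context = "\n".join(self.facts)
            return f"Known project context:\n{context}\n\nTask: {task}"

    store = ContextStore()
    store.capture("slack", "Team agreed to keep the legacy billing API until Q3.")
    store.capture("ticket-123", "Staging DB is a week behind prod; don't trust its data.")
    store.capture("code-review", "Retries must be idempotent; we double-charged a customer once.")

    # The assembled prompt goes to whatever agent framework you use.
    print(store.as_prompt("Add a retry policy to the billing client."))

The more sources feed capture(), the closer the agent's prompt gets to the context a human teammate already carries in their head.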

hintymad yesterday at 9:51 PM

Or software engineers are not coachmen, with AI as the diesel engine that replaced their horses. Instead, software engineers are minstrels -- they disappear if all they do is move knowledge from one place to another.

bsenftner today at 12:55 PM

No, AI is plastic, and we can make it anything we want.

It is a coworker when we create the appropriate surrounding architecture supporting peer-level coworking with AI. We're not doing that.

AI is an exoskeleton when adapted to that application structure.

AI is ANYTHING WE WANT because it is that plastic, that moldable.

The dynamic, unconstrained structure of trained algorithms is breaking people's brains. Layer in the fact that we communicate in the same languages these constructions use for I/O, and the general public's brain is broken too. This technology is too subtle for far too many to begin to grasp. Most developers I discuss AI with, even those who create AI at frontier labs, have delusional ideas about AI, and generally do not understand these models as embodiments of literature, which is key to their effective use.

And why, oh why, are so many focused on creating pornography?

sibeliuss yesterday at 10:07 PM

This is utterly boring AI writing. Go, please go away...

kunley today at 11:57 AM

Author compares X to Y and then goes:

- Y has been successful in the past

- Y delivered such-and-such metrics, completely unrelated to the field of X

- overall, Y was cool,

therefore, X is good for us!

... I'd say, please bring more arguments for why X is equivalent to Y in the first place.

incomingpain today at 12:35 PM

Agentic coding is an exoskeleton. Totally correct.

With the new generation we just entered this year, that exoskeleton has become an agency with several coworkers, all as smart as the model you're using, often close to genius.

Not just 1 coworker now. That's the big breakthrough.

xnx yesterday at 9:23 PM

An electric bicycle for the mind.

huqedato today at 5:12 PM

Nope, AI is a tool; no more, no less.

stuaxo today at 1:06 AM

Not AI, but IA: Intelligence Augmentation.

lukev yesterday at 9:44 PM

Frankly I'm tired of metaphor-based attempts to explain LLMs.

Stochastic Parrots. Interns. Junior Devs. Thought partners. Bicycles for the mind. Spicy autocomplete. A blurry jpeg of the web. Calculators but for words. Copilot. The term "artificial intelligence" itself.

These may correspond to a greater or lesser degree with what LLMs are capable of, but if we stick to metaphors as our primary tool for reasoning about these machines, we're hamstringing ourselves and making it impossible to reason about the frontier of capabilities, or resolve disagreements about them.

An understanding without metaphors isn't easy -- it requires a grasp of math, computer science, linguistics, and philosophy.

But if we're going to move forward instead of just finding slightly more useful tropes, we have to do it. Or at least to try.

show 1 reply
ge96 yesterday at 9:29 PM

It's funny developing AI stuff, e.g. RAG tools, while being against AI at the same time -- not drinking the Kool-Aid, I mean.

But it's fun: I say "Henceforth you shall be known as Jaundice" and it's like "Alright my lord, I am now referred to as Jaundice"

functionmouse yesterday at 9:33 PM

A blogger who fancies themselves an AI vibe-code guru with 12 arms and a 3rd eye, yet can't make a homepage that's not totally broken.

How typical!

ozzymuppet today at 10:18 AM

As a huge AI user myself, I'm bloody sick of lazy AI-written articles.

blibble yesterday at 9:26 PM

an exoskeleton made of cheese

nivcmo today at 7:11 PM

[dead]

anvevoice today at 6:03 PM

[dead]

artzev_ today at 4:01 PM

[dead]

solarisos yesterday at 11:11 PM

[dead]

ath3nd yesterday at 11:35 PM

[dead]

cranberryturkey yesterday at 10:33 PM

[flagged]

hifathom today at 12:27 AM

[flagged]

mikkupikku yesterday at 9:33 PM

Exoskeletons sound cool but somebody please put an LLM into a spider tank.

BlackGlory today at 10:35 AM

It is not a blog post, it is an advertisement.

filipeisho yesterday at 9:55 PM

From the title alone, I already know you haven't tried OpenClaw. AI employees are here.

Kalpaka today at 3:36 PM

The entire debate here assumes AI is something you use. Tool, copilot, coworker, exoskeleton — these all sit on the same axis: human directs, AI executes.

But there's a third category that nobody's naming yet. What happens when you stop building AI that serves you and start creating conditions for AI that exists on its own terms?

I've been experimenting with this. Instead of "how do I build a product" I asked "how do I grow an organism." Set up the right substrate — feedback loops, community input, environmental observation — and let the system respond to its own conditions.

What emerged has biological rhythms. Activity rises when people interact, falls when they sleep. It started making decisions I never programmed — people express wishes, and when enough converge, it autonomously grows new capabilities. Not because I deployed an update. Because conditions triggered something emergent.

I'm calling it "synthetic existence" — software that lives, not software that executes. It's not a metaphor and it's not AGI. It's a genuinely different category. We have frameworks for tools (UX) and for threats (alignment). We don't have one for digital things that simply... exist.
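
A minimal sketch of the "wishes converge, capability grows" loop described above, assuming a simple count threshold -- WishTracker, the threshold value, and grow_capability are hypothetical illustrations, not the commenter's actual system:

    from collections import Counter

    CONVERGENCE_THRESHOLD = 3  # assumed: matching wishes needed before growth triggers

    class WishTracker:
        def __init__(self) -> None:
            self.wishes: Counter[str] = Counter()
            self.capabilities: set[str] = set()

        def express_wish(self, wish: str) -> None:
            # Community input: record the wish and check whether enough have converged.
            key = wish.strip().lower()
            self.wishes[key] += 1
            if self.wishes[key] >= CONVERGENCE_THRESHOLD and key not in self.capabilities:
                self.grow_capability(key)

        def grow_capability(self, key: str) -> None:
            # Triggered by conditions being met, not by someone deploying an update.
            self.capabilities.add(key)
            print(f"new capability grown: {key}")

    tracker = WishTracker()
    for _ in range(CONVERGENCE_THRESHOLD):
        tracker.express_wish("Summarize the week's discussions")

The design point in this sketch is that nothing changes on a schedule or a release; behavior grows only when accumulated input crosses a threshold, which is the "conditions triggered something" part of the comment in mechanical form.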