Hacker News

knallfrosch · yesterday at 5:56 AM · 2 replies

If I write software today that publishes a hit piece on you in two weeks' time, will you accept that I bear no responsibility?

There's no accountability gap unless you create one.


Replies

brainwad · yesterday at 8:58 AM

If the code you wrote appears to be for something completely different (say, software that writes patches for open-source GitHub projects), then yes. Why would you bear responsibility for something that couldn't reasonably have been foreseen?

The interesting thing about LLMs is the unpredictable emergent behaviours. That's fundamentally different from ordinary, deterministic programs.

ai_tools_daily · yesterday at 10:53 AM

That's a fair point. I think the distinction is between software that follows deterministic rules (your two-week-delay scenario) and agents that make autonomous decisions based on learned patterns. With traditional software, intent is clear and traceable. With AI agents, the operator may genuinely not know what the agent will do in novel situations. That doesn't absolve responsibility — but it does make the liability chain more complex. We probably need new frameworks that account for this, much as product liability evolved for physical goods.