Hacker News

lifty, last Monday at 8:53 PM (4 replies)

Perhaps a meta-evolution: they become experts at writing harnesses and prompts for discovering and patching vulnerabilities in existing code and software. My main interest is whether, now that we have LLMs, the software industry will move to adopting techniques like formal verification, and other perhaps more lax approaches, that massively increase the quality of software.
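The "writing harnesses" idea can be made concrete with a hand-rolled sketch: a loop that throws randomized inputs at a target function and records anything that raises an unexpected exception. Everything here is hypothetical illustration (`checked_div`, `fuzz` are made-up names); real tooling like libFuzzer or AFL does this far more cleverly with coverage feedback.

```python
import random

def checked_div(a: int, b: int) -> int:
    """Toy target with a lurking bug: no guard for b == 0."""
    return a // b

def fuzz(target, iterations=1000, seed=42):
    """Minimal random-input harness: call the target with random
    arguments and record any inputs that raise an exception."""
    rng = random.Random(seed)
    crashes = []
    for _ in range(iterations):
        a, b = rng.randint(-5, 5), rng.randint(-5, 5)
        try:
            target(a, b)
        except Exception as exc:
            crashes.append(((a, b), type(exc).__name__))
    return crashes

found = fuzz(checked_div)
print(f"{len(found)} crashing inputs, e.g. {found[0]}")
```

An LLM's plausible niche is generating the target-specific parts of such a harness (input grammar, argument ranges, which exceptions are expected rejections vs. real bugs), which is exactly the tedious glue work the comment describes.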


Replies

lelanthran, yesterday at 12:00 PM

> Perhaps a meta evolution, they become experts at writing harnesses and prompts

Harnesses, maybe, but prompts?

There's still this belief amongst AI coders that they can command a premium for development because they can write a prompt better than Bob from HR, or Sally from Accounting.

When all you're writing are prompts, your value is less than it was before, because the number of people who can write a prompt is substantially larger than the number of people who could program.

sputknick, yesterday at 1:22 PM

I agree with this take. Nothing changes, everything just evolves. Been happening for 60 years, will (likely) continue to happen for the next 60 years.

nickpsecurity, yesterday at 2:49 AM

Also, synthetic data and templates to help them discover new vulnerabilities or make agents work on things they're bad at. They differentiate with their prompts or specialist models.

Also, like ForAllSecure's Mayhem, I think they can differentiate on automatic patching that's reliable and secure. Maybe test generation, too, aiming for full coverage. They become drive-by verification and validation specialists who also fix your stuff for you.

habinero, yesterday at 12:36 AM

Testing exists.

> formal verification

Outside of limited specific circumstances, formal verification gives you nothing that tests don't give you, and it makes development slow and iteration a chore. People know about it, and it's not used for a lot of reasons.
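The trade-off behind this claim can be sketched in a few lines: a randomized property test samples the input space cheaply and catches most bugs, whereas formal verification would have to prove the property for every input. All names here are illustrative; a real setup would use a property-testing library like Hypothesis rather than a hand-rolled loop.

```python
import random

def interval_overlap(a_start, a_end, b_start, b_end):
    """Function under test: do two closed integer intervals overlap?"""
    return a_start <= b_end and b_start <= a_end

def brute_force_overlap(a_start, a_end, b_start, b_end):
    """Slow but obviously-correct oracle over a small integer domain."""
    return any(a_start <= x <= a_end and b_start <= x <= b_end
               for x in range(-10, 11))

# Property test: check the fast implementation against the oracle
# on thousands of random cases (a sample, not a proof).
rng = random.Random(0)
for _ in range(5000):
    a = sorted(rng.randint(-10, 10) for _ in range(2))
    b = sorted(rng.randint(-10, 10) for _ in range(2))
    assert interval_overlap(a[0], a[1], b[0], b[1]) == \
        brute_force_overlap(a[0], a[1], b[0], b[1])
print("5000 random cases agree with the oracle")
```

This is the pragmatic argument in code form: a few seconds of random testing delivers most of the assurance, while a formal proof of the same property costs far more effort for the residual gap.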