Hacker News

antonvs · yesterday at 9:03 PM · 2 replies

> Why do they often make completely unintuitive decisions

Most likely because you haven't constrained their behavior in your prompt. You're making the assumption that they "understand" that using best practices is what you want. You have to tell them that, and tell them which practices they should use.
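To make the point concrete, here is a minimal sketch of what "tell them which practices to use" can look like in practice. The constraint list, function name, and prompt wording are all invented for illustration; the idea is simply that the practices are stated explicitly rather than assumed:

```python
# Hypothetical sketch: encode the coding practices you want directly in
# the prompt instead of assuming the model "understands" them.

CONSTRAINTS = [
    "Do not mock domain objects in tests; construct them directly.",
    "Follow PEP 8 naming conventions.",
    "Prefer composition over inheritance for new classes.",
]

def build_system_prompt(task: str) -> str:
    """Combine the task with an explicit, non-negotiable practice list."""
    rules = "\n".join(f"- {c}" for c in CONSTRAINTS)
    return f"{task}\n\nFollow these practices, without exception:\n{rules}"

print(build_system_prompt("Refactor the billing module."))
```

The specific rules don't matter; what matters is that each expectation is spelled out as an instruction the model can be held to.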


Replies

datsci_est_2015 · yesterday at 9:36 PM

They already fail to consistently follow very simple, concrete instructions like “Please do not ever mock this object; always construct it properly in your tests,” so I’m not sure how they’re going to adhere to vaguer, more conceptual architectural paradigms. This is a problem with generative AI in general: image generation has similar limitations.
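For readers unfamiliar with the instruction being violated, here is a hedged sketch of what it asks for. The `Invoice` class and test are invented for illustration; the point is that constructing the real object runs its validation, which a mock would silently skip:

```python
# Hypothetical example of the instruction discussed above: construct the
# real object in tests instead of mocking it. `Invoice` is invented here.

class Invoice:
    def __init__(self, customer: str, amount_cents: int):
        # Constructor validation only runs if the real object is built.
        if amount_cents < 0:
            raise ValueError("amount_cents must be non-negative")
        self.customer = customer
        self.amount_cents = amount_cents

    def total(self) -> float:
        return self.amount_cents / 100


def test_total_with_real_object():
    # What the instruction asks for: a properly constructed Invoice.
    invoice = Invoice("acme", 1999)
    assert invoice.total() == 19.99
    # A Mock(spec=Invoice) here would bypass the constructor's
    # validation entirely, which is exactly what the instruction forbids.


test_total_with_real_object()
```

The complaint is that even a rule this mechanical gets ignored, so softer rules ("keep layers decoupled") seem harder still.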

antihipocrat · yesterday at 10:01 PM

Senior developers know what behavior to constrain.

If incorrect LLM output is a prompting issue, then demand for experienced developers will remain, and may actually increase over time.