Hacker News

Nition · yesterday at 8:25 PM

And in real life you'd get them to clarify a weird question like this before you answered. I wonder if LLMs have just been trained too heavily to always try to answer right away. Even for programming tasks, more clarifying questions would often be useful before diving in ("planning mode" does seem designed to help with this, but it wouldn't be needed with a human partner).


Replies

cracki · yesterday at 11:45 PM

Absolutely!

I've been wondering for years how to make whatever LLM ask me stuff instead of just filling holes with assumptions and sprinting off.

User-configurable agent instructions haven't worked consistently. The system prompts might actually contain instructions not to ask questions.
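
What I keep trying is roughly this (a minimal sketch in Python; the prompt wording, the gpt-4o model name, and whether the model actually honors the instruction are all assumptions on my part, not a proven recipe):

    # Sketch: a system prompt that tells the model to ask clarifying questions
    # before answering. Whether any given model honors it is another matter.
    from openai import OpenAI

    client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

    CLARIFY_FIRST = (
        "Before answering, decide whether the request is ambiguous or "
        "under-specified. If it is, reply only with one or two clarifying "
        "questions and wait for my answer. Do not guess at missing details."
    )

    response = client.chat.completions.create(
        model="gpt-4o",  # placeholder model name
        messages=[
            {"role": "system", "content": CLARIFY_FIRST},
            {"role": "user", "content": "Make the login page better."},
        ],
    )
    print(response.choices[0].message.content)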

Sure, there's a practical limit to how much clarification it ought to request, but never asking at all is just annoying.
