
reliabilityguy · yesterday at 10:13 PM

Your objective has an explicit instruction that the car has to be present for a wash. Quite a difference from the original phrasing, where the model has to figure that out on its own.


Replies

J_cst · yesterday at 10:48 PM

That's the answer from his LLM, which obviously decomposed the question and built the answer following the OP's prompt. I think you missed the point.

bwat49 · yesterday at 10:43 PM

> Your objective has explicit instruction that car has to be present for a wash.

Which is exactly how you're supposed to prompt an LLM. Is it really surprising that a vague prompt gives poor results?
