Hacker News

hn_acc1 · today at 7:12 PM · 4 replies

Sure, but... I've been coding for 40 years and I don't know everything. To me, a LOT depends on what the plumber asked ChatGPT about. For example: building codes in that city, to figure out what his options are - like, is he allowed to just put in any old toilet, or is there a GPF restriction? What's the replacement part number for faucet XYZ's gasket? Those seem reasonable.

"How do I fix a clogged toilet?" would be bad.


Replies

sidrag22 · today at 7:41 PM

I cling a bit to a prompt I sent a while ago about just tossing a chopped pepper into a recipe for baked ziti. I had a recipe that I followed fairly tightly, with slight changes to see how they would work out each time. Instead of prompting "when should I add chopped bell pepper?", the small change to "what are my options for when to add chopped bell pepper?" opened up a variety of different methods I could try when returning to that recipe, and let me decide what I like best based on the outcome.

The first prompt style is, I think, a way society incidentally drifts towards a less interesting one, with less variety in solutions. The second, I think, lets people still exercise their potential to try a variety of things and keep that variety.

SirMaster · today at 7:37 PM

>like, is he allowed to just put in any old toilet, or is there a GPF restriction?

And if the LLM gets that wrong? It's his job to know the codes, or to know how to consult a reliable resource to find the correct ones.

alpinisme · today at 7:26 PM

Presumably he should know what official resources to consult in his jurisdiction. But the point that it depends on his question is definitely fair.

theappsecguy · today at 7:17 PM

[dead]