Hacker News

bigfishrunning · yesterday at 8:02 PM · 1 reply

> Meanwhile.. just.. ask an LLM if you can mix certain cleaning chemicals safely.

The cost of a wrong answer to this question is so incredibly high that I hope nobody is sincerely asking an LLM for this information. The things people trust to a "machine that gives convincing answers that are correct 90% of the time" continue to shock me.


Replies

themafia · yesterday at 8:57 PM

> is so incredibly high that I hope nobody is sincerely asking an LLM for this information

Google tops the search results with its LLM box. There's only one reason to do that: they know their audience is not exercising discretion.

> The things people trust to "machine that gives convincing answers that are correct 90% of the time" continue to shock me

People are having intimate relationships with chatbots. There's a deeper sociological problem here.