Hacker News

keiferski · yesterday at 5:13 PM

Right now, 30 seconds ago, I asked ChatGPT to tell me about a book I found that was written in the 60s.

It made up the entire description. When I pointed this out, it apologized and then made up another description.

The idea that this is going to lead to superintelligence in a few years is absolutely nonsense.


Replies

i_think_so · yesterday at 7:52 PM

Is that because this book is obscure and no human has yet written a description that could be scraped?

hirvi74 · yesterday at 5:36 PM

The other day I asked Claude Opus 4.6 one of my favorite trivia pieces:

What plural English word for an animal shares no letters with its singular form? Collective nouns (flock, herd, school, etc.) don't count.

Claude responded with:

"The answer is geese -- the plural of cow."

Though, to be fair, in the next paragraph of the response, Claude stated the correct answer. So, it went off the rails a bit, but self-corrected at least. Nevertheless, I got a bit of a chuckle out of its confidence in its first answer.

I asked GPT 5.2 the same question and it nailed the answer flawlessly. I wouldn't extrapolate much about model quality from a single answer, but I thought it was still interesting.

(For those curious, the answer is 'kine', the archaic plural of 'cow'.)
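The no-shared-letters claim is easy to verify mechanically. A minimal Python sketch (the function name is just for illustration):

```python
def shares_letters(singular, plural):
    """Return True if the two words have any letter in common."""
    return bool(set(singular) & set(plural))

# 'cow' = {c, o, w} and 'kine' = {k, i, n, e} are disjoint.
print(shares_letters("cow", "kine"))     # False
# Most plurals overlap with their singular, e.g. goose/geese.
print(shares_letters("goose", "geese"))  # True
```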
