Hacker News

simonask · yesterday at 2:10 PM · 3 replies

This is magical thinking.

LLMs are physically incapable of generating something “well thought out”, because they are physically incapable of thinking.


Replies

mikkupikku · yesterday at 5:32 PM

I don't care if the machine has a soul, I only care what the machine can produce. With good prompting, the machine produces more "thoughtful" results. As an engineer, that's all I care about.

Marha01 · yesterday at 4:46 PM

It is magical thinking to claim that LLMs are definitely physically incapable of thinking. You don't know that. No one knows that, since such large neural networks are opaque black boxes that resist interpretation; we don't really know how they function internally.

You are just repeating that because you read it somewhere else. Like a stochastic parrot. Quite ironic. ;)
