
drw85 · yesterday at 5:52 AM

Nice ChatGPT answer. Put some real thought and data into it, too.


Replies

crazy5sheep · yesterday at 4:40 PM

The whole point is that LLMs, and especially the attention mechanism in transformers, have already paved the road to AGI. The main gap is the training data and its quality. Humans have generations of distilled knowledge: books, language, culture passed down over centuries. And on top of that we have the physical world: we watched birds fly, saw apples drop, touched hot things. Maybe we should train the base model on physical-world data first, and then fine-tune it on the distilled knowledge.
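
For concreteness, here is a minimal sketch of the scaled dot-product attention that comment refers to, in plain NumPy. The shapes and random inputs are purely illustrative, not taken from any real model:

    import numpy as np

    def softmax(x, axis=-1):
        # Subtract the row max for numerical stability before exponentiating.
        x = x - x.max(axis=axis, keepdims=True)
        e = np.exp(x)
        return e / e.sum(axis=axis, keepdims=True)

    def attention(Q, K, V):
        # Scaled dot-product attention: softmax(Q K^T / sqrt(d_k)) V.
        # Each output row is a weighted average of the value vectors,
        # weighted by how well the query matches each key.
        d_k = K.shape[-1]
        scores = Q @ K.T / np.sqrt(d_k)
        return softmax(scores) @ V

    # Toy self-attention over 3 tokens with 4-dim embeddings (Q = K = V = x).
    rng = np.random.default_rng(0)
    x = rng.normal(size=(3, 4))
    print(attention(x, x, x))

Nothing in this mechanism cares whether the tokens encode text, video frames, or sensor readings, which is what makes the "pretrain on physical-world data, fine-tune on distilled knowledge" ordering at least conceivable.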
