In 2000 I learned about this old technology called "neural networks".
AI progress really consists of long winters and rare breakthroughs. Deep neural networks were the most recent breakthrough.
The iterations you currently see are just adding more storage, but the fundamental neural network structure doesn't change.
I'm confident AGI will not be achieved by the LLM architecture, and when the next AI breakthrough will come is anyone's guess. But if you take history into account, it will take a while.
Yes, same. From the late 90s through the early aughts I was taught over and over and over again that neural networks were a dead-end concept and would never amount to anything.
Just like all the preceding AI booms, this one will peak, the hype train will fizzle, the best parts will just become "normal", and then a couple of decades later something new will come along to push the boundary again.