Hacker News

mitthrowaway2 · today at 1:41 AM

Think hard about this. Does that seem to you like it's likely to be a physical law?

First of all, it's not necessary for one person to build that super-intelligence all by themselves, or to understand it fully. It can be developed by a team, each of whom understands only a small part of the whole.

Secondly, it doesn't necessarily even require anybody to understand it. The way AI models are built today is by pressing "go" on a giant optimizer. We understand the inputs (data) and the optimizer machine (very expensive linear algebra) and the connective structure of the solution (transformer) but nobody fully understands the loss-minimizing solution that emerges from this process. We study these solutions empirically and are surprised by how they succeed and fail.

We may find we can keep improving the optimization machine, and tweaking the architecture, and eventually hit something with the capacity to grow beyond our own intelligence, and it's not a requirement that anyone understands how the resulting model works.

We also have many instances in nature and history of processes that follow this pattern, where one might expect to find a similar "law". Mammals can give birth to children that grow bigger than their parents. We can make metals purer than the crucible we melted them in. We can make machines more precise than the machines that made their parts. Evolution itself created human intelligence from the repeated application of very simple rules.


Replies

slopinthebag · today at 2:51 AM

> Think hard about this. Does that seem to you like it's likely to be a physical law?

Yes, it seems likely to me.

It seems like the ultimate hubris to assume we are capable of creating something with capabilities we ourselves lack.
