Hacker News

The Singularity will occur on a Tuesday

825 points by ecto yesterday at 5:04 PM | 470 comments

Comments

vagrantstreet yesterday at 6:35 PM

Was expecting some mention of the Universal Approximation Theorem.

I really don't care much if this is semi-satire, as someone else pointed out; the idea that AI will ever get "sentient" or explode into a singularity has to die out, pretty please. Just make some nice Titanfall-style robots or something, a pure tool with one purpose. No more parasocial sycophantic nonsense, please.

daveguy yesterday at 9:31 PM

What I want to know is how Bitcoin going full tulip and OpenAI going bankrupt will affect the projection. Can they extrapolate that? Extrapolating those two event dates would be sufficient, regardless of their effect on a potential singularity.

bradgessler yesterday at 9:13 PM

What time?

singularfutur yesterday at 9:33 PM

The singularity is always scheduled for right after the current funding round closes but before the VCs need liquidity. Funny how that works.

nurettin yesterday at 9:04 PM

With this kind of scientific rigour, the author could also prove that his aunt is a green parakeet.

ck2 yesterday at 9:03 PM

Does "tokens per dollar" have a "moore's law" of doubling?

Because while machine-learning is not actually "AI" an exponential increase in tokens per dollar would indeed change the world like smartphones once did
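
A quick back-of-the-envelope sketch of what such a doubling would imply; the $10-per-million-tokens starting price and the 12-month halving period below are illustrative assumptions, not measured figures:

  # Hypothetical Moore's-law-style decline in inference cost; both
  # constants below are made-up, illustrative assumptions.
  start_cost = 10.0    # assumed $ per million tokens today
  halving_months = 12  # assumed time for tokens-per-dollar to double

  for years in range(0, 11, 2):
      cost = start_cost * 0.5 ** (years * 12 / halving_months)
      print(f"year {years:2d}: ${cost:.4f} per million tokens")

Under those assumptions, ten halvings in a decade is a ~1000x drop in cost, which is the scale of change the smartphone comparison implies.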

bitwize yesterday at 8:28 PM

Thus will speak our machine overlord: "For you, the day AI came alive was the most important day of your life... but for me, it was Tuesday."

brador yesterday at 8:13 PM

100% an AI wrote this. Possibly specifically to get to the top spot on HN.

Those short sentences are the most obvious clue. It’s too well written to be human.

Night_Thastus yesterday at 8:22 PM

This'll be a fun re-read in ~5 years when most of this has ended up being a nothingburger. (Minus one or two OK use cases for LLMs.)

boca_honey yesterday at 6:47 PM

Friendly reminder:

Scaling LLMs will not lead to AGI.

cubefox yesterday at 6:41 PM

A similar idea occurred to the Austrian-American cyberneticist Heinz von Foerster in a 1960 paper, titled:

  Doomsday: Friday, 13 November, A.D. 2026
There is an excellent blog post about it by Scott Alexander:

"1960: The Year The Singularity Was Cancelled" https://slatestarcodex.com/2019/04/22/1960-the-year-the-sing...

CGMthrowaway yesterday at 9:52 PM

> 95% CI: Jan 2030–Jan 2041

u8rghuxehui yesterday at 10:03 PM

hi

hhh yesterday at 10:41 PM

This just feels like AI psychosis slop, man.

pickleRick243 yesterday at 9:09 PM

LLM slop article.

api yesterday at 7:40 PM

This really looks like it's describing a bubble, a mania. The tech is improving linearly, and most of the time such things asymptote. It'll hit a point of diminishing returns eventually. We're just not sure when.

The accelerating mania is bubble behavior. It'd be really interesting to have run this kind of model in, say, 1996, a few years before the dot-com bubble peaked, and see if it would have predicted the collapse.
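
A toy illustration of the asymptote point, with entirely made-up numbers: an exponential and a logistic curve that share a starting value and a similar initial growth rate are hard to tell apart early on, which is why extrapolating from the mania phase is treacherous:

  import math

  # Both curves start at 1 with growth rate ~0.5; the logistic
  # saturates at an assumed capacity of 20, the exponential never does.
  def exponential(t):
      return math.exp(0.5 * t)

  def logistic(t):
      return 20 / (1 + 19 * math.exp(-0.5 * t))

  for t in range(0, 13, 2):
      print(t, round(exponential(t), 1), round(logistic(t), 1))

The two track each other for the first few steps, then the logistic flattens toward its ceiling while the exponential keeps compounding; with only early data, a curve fit can't tell you which world you're in.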

What this is predicting is a huge wave of social change associated with AI, not just because of AI itself but perhaps more so as a result of anticipation of, and fears about, AI.

I find this scarier than unpredictable sentient machines, because we have data on what this will do. When humans are subjected to these kinds of pressures, they have a tendency to lose their shit and freak the fuck out: elect lunatics, commit mass murder, riot, commit genocides, create religious cults, etc. Give me Skynet over that crap.

AndrewKemendo yesterday at 6:26 PM

Y’all are hilarious

The singularity is not something that’s going to be disputable

it’s going to be like a meteor slamming into society and nobody’s gonna have any concept of what to do - even though we’ve had literal decades and centuries of possible preparation

I’ve completely abandoned the idea that there is a world where humans and ASI exist peacefully

Everybody needs to be preparing for the world where it's:

human plus machine

versus

human groups by themselves

across all possible categories of competition and collaboration

Nobody is going to do anything about it, and if you are one of the people complaining about vibecoding, you're already out of the race

Oh, and by the way, it's not gonna be with LLMs; it's coming to you from RL + robotics

zackmorris yesterday at 9:49 PM

Just wanted to leave a note here that the Singularity is inevitable on this timeline (we've already passed the event horizon), so the only thing that can stop it now is to jump timelines.

In other words, there may be a geopolitical crisis in the works, similar to how the Dot Bomb, Bush v. Gore, 9/11, etc. popped the Internet Bubble and shifted investment funds toward endless war, McMansions, and SUVs to appease the illuminati. Someone might sabotage the birth of AGI like the religious zealot in Contact. Global climate change might drain public and private coffers as coastal areas become uninhabitable, coinciding with the death of the last coral reefs and the collapse of fisheries, leading to a mass exodus and WWIII. We just don't know.

My feeling is that the future plays out differently than any prediction, so something will happen that negates the concept of the Singularity. Maybe we'll merge with AGI and time will no longer exist (oops that's the definition). Maybe we'll meet aliens (same thing). Or maybe the k-shaped economy will lead to most people surviving as rebels while empire metastasizes, so we take droids for granted but live a subsistence feudal lifestyle. That anticlimactic conclusion is probably the safest bet, given what we know of history and trying to extrapolate from this point along the journey.