Hacker News

king_phil · yesterday at 8:17 PM · 9 replies

Dark forest makes no sense to me. Why would a civilization eradicate another, spending huge amounts of resources (time, energy, material) when the universe has such an enormous scale that you cannot even get to each other in a timescale that makes much sense...


Replies

cbau · yesterday at 8:40 PM

To quote from the book:

> “First: Survival is the primary need of civilization. Second: Civilization continuously grows and expands, but the total matter in the universe remains constant. One more thing: To derive a basic picture of cosmic sociology from these two axioms, you need two other important concepts: chains of suspicion and the technological explosion.”

1. you can never know the intentions of other entities, and they cannot know yours (chain of suspicion)

2. technology level grows unpredictably (technological explosion)

3. the goal of civilization is survival

4. resources are finite but growth is infinite

As soon as you identify another entity in the forest, even if they cannot annihilate you at present and signal peace, both could change without warning. Therefore, the only rational move is to eradicate the other immediately. (Especially if you believe the other will deduce the same.)
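The deduction above is essentially a dominance argument, and it can be sketched as a toy payoff matrix. The specific payoff numbers here are invented for illustration; the book gives no explicit matrix, only the axioms.

```python
# Toy payoff model of the dark-forest argument. The numbers are made up
# for illustration; only their ordering matters for the conclusion.
# Each side chooses "strike" or "wait" once it detects the other.

PAYOFFS = {
    # (my_move, their_move): my payoff
    ("strike", "strike"): -1,   # mutual destruction risk, but I had a chance
    ("strike", "wait"):    1,   # I survive; the threat is eliminated
    ("wait",   "strike"): -10,  # annihilated without warning
    ("wait",   "wait"):    0,   # fragile peace; chains of suspicion remain
}

def best_response(their_move: str) -> str:
    """Return the move maximizing my payoff against a fixed opponent move."""
    return max(("strike", "wait"), key=lambda m: PAYOFFS[(m, their_move)])

# "strike" is dominant: it is the best response whatever the other side does,
# which is why "the other will deduce the same" locks both sides in.
assert best_response("strike") == "strike"
assert best_response("wait") == "strike"
```

The chain-of-suspicion and technological-explosion axioms are what justify the large negative payoff on `("wait", "strike")`: you cannot verify intentions, and a harmless neighbor may become lethal without warning.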

Elimination in the book is basically sending a nuke, not a costly invasion force.

Not sure it's actually true, but that's the argument in the book.

nate · yesterday at 8:30 PM

Are you asking about the 3 body problem version of this? Spoiler alert: The folks doing the eradicating aren't spending much time/energy/anything on eradicating. It's one large missile through space.

I think the gist is: sure, we humans can't conceive of getting to anyone else in the universe in any timescale, but if we can keep ourselves from destroying ourselves, we'll eventually figure it out. And we'll spread. And we'll kill everything that isn't us in the process as we've done as explorers on this planet.

So really in 3BP: it's inexpensive to eradicate. But insanely expensive to possibly get the intention wrong of any other civilization you encounter. They might kill you.

(again, this is just my interpretation of what 3BP said)

piker · yesterday at 8:20 PM

Makes some sense to me, as the prisoner's dilemma dictates that at least some fraction will try to kill you. So you've got to go first.

Reminds me of the Dan Carlin take on aircraft carriers in World War II: if your carrier spotted an opposing carrier and you didn't send everything you had before it spotted you, you were dead. The only move was to go all in every time.
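The "some fraction will try to kill you" logic can be made concrete with a hypothetical expected-value sketch; the probabilities and costs below are invented for illustration, not taken from the book or from Carlin.

```python
# Hypothetical expected-value sketch: even if only a small fraction of
# civilizations are hostile, waiting can be worse than striking first
# because the downside of being struck is so much larger than the cost
# of striking. All numbers here are assumed for illustration.

def expected_payoff_of_waiting(p_hostile: float,
                               loss_if_struck: float = 100.0,
                               gain_of_peace: float = 1.0) -> float:
    """Expected payoff of staying quiet when a fraction p_hostile strikes first."""
    return -p_hostile * loss_if_struck + (1 - p_hostile) * gain_of_peace

def expected_payoff_of_striking(strike_cost: float = 5.0) -> float:
    """Striking first costs a fixed amount but removes the risk entirely."""
    return -strike_cost

# With these numbers, waiting loses once roughly 6% of civilizations
# are hostile: -0.06 * 100 + 0.94 * 1 = -5.06 < -5.
p = 0.06
assert expected_payoff_of_waiting(p) < expected_payoff_of_striking()
```

The carrier analogy corresponds to a large `loss_if_struck` relative to `strike_cost`: when being spotted second is fatal, the break-even hostile fraction is small.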

Phemist · yesterday at 8:27 PM

The dark forest is conditional on two assumptions: that eradicating another civilization does not require huge amounts of resources, and that (over time) the universe turns out not to be of a scale enormous enough to keep civilizations apart (and in the book there are agents actively working to make it smaller).

Bringing it back to the dark forest of idea space, it is an interesting question whether the space of feasibly executable ideas is inherently small (as this essay assumes), or whether it only appears small because of our inability to navigate/travel it very well.

If the former, then yes it probably is/will be a dark forest. If the latter, then I would think the jury is still out.

lifeformed · yesterday at 9:19 PM

"Timescales that makes sense" may be a human reasoning but not necessarily the reasoning of inconceivably advanced timeless civilizations. Sure, that planet of fish may be harmless now, but what about in a quick three billion years when they have FTL and AGI and Von Neuman probes and Dyson spheres and antimatter bombs? Easier to click the delete button now to save the trouble later.

sebastianconcpt · yesterday at 9:03 PM

Agreed; it's a fiction based on accepting the premise of a zero-sum game.

It denies that more advanced civilizations might have better models of the universe in which they know this isn't an issue, and that we're just stupid teenagers in the neighborhood playing dangerous games, with them merely taking a look every now and then to see whether we'll prove we can survive ourselves.

0x3f · yesterday at 8:44 PM

Competition kills margins (profits, security, QoL), so the budget for eradication should be quite high; but generally speaking the idea is to destroy even fledgling upstarts, back while the cost is still low.

viccis · today at 12:32 AM

It's a silly concept IMO, because it assumes that civilizations with the ability to do interstellar travel or communication choose not to because they know of an interstellar force that destroys any civilization that does. But any civilization that became aware of such a force would presumably have been destroyed by it, so how would all of these surviving ones know of the danger? Actual dark forests are quiet because of a mix of the animals' instincts and visible signs of danger.

While it's possible that some civilizations could hypothetically observe what happened to others and keep quiet, they would all have to do so to resolve the contradictions of the Fermi paradox.

Hikikomori · yesterday at 8:53 PM

A space war is not needed; they could just send a few missiles to take anyone out.

I have my own theory of the dark forest and AGIs: that there's some collection of AGIs out there letting evolution develop intelligence wherever it happens, taking a civilization out once it produces an AGI, and performing a reset if it doesn't. They have literally all the time available to them, and can easily travel the vast distances if needed.