The final recursion point is the most honest part: you can't warn about the forest without feeding it. But I'd push back slightly on the inevitability. The forest needs novelty to absorb, which means the edge always exists; it just keeps moving. The question isn't whether to hide but whether the speed of individual innovation can outpace the speed of absorption. So far, it barely still can.
As a work of persuasive writing, this is unfocused and seems mostly generated.
One thing I would have expected from someone who knows their history: forget LLMs, this is how startups have worked for decades. You're only as good as your idea, your ability to execute, and your moat. And the small fish get eaten.
> The original Dark Forest assumes civilizations hide from hunters - other civilizations that might destroy them. But in the cognitive dark forest, the most dangerous actor is not your peer. It’s the forest itself.
Note the needless undercutting of the metaphor for the sake of the limp rhetorical flourish.
> I wrote this knowing it feeds the thing I’m warning you about. That’s not a contradiction. That’s the condition. You can’t step outside the forest to warn people about the forest. There is no outside.
Quite dramatic!
Except literally going outside and just talking to people? Using whiteboards?
Also, you fed it when you used a model to write this blog post. You didn't have to do that.
The LLMisms in the "thinkpad" section caused me to close the tab
>You are creating your cool streaming platform in your bedroom. Nobody is stopping you, but if you succeed, if you get the signal out, if you are being noticed, the large platform with loads of cash can incorporate your specific innovations simply by throwing compute and capital at the problem. They can generate a variation of your innovation every few days, eventually they will be able to absorb your uniqueness. It’s just cash, and they have more of it than you.
That's not exactly a new phenomenon, and it doesn't require AI. If anything, it was worse in the 90s, with Microsoft starving out pretty much any would-be competitor it could find.
And it wasn't just Microsoft: https://en.wikipedia.org/wiki/Sherlock_(software)#Sherlocked...
Like Cixin Liu's "Dark Forest" which inspired the author, this is science fiction.
LLMs do not have, and cannot obtain, the capabilities the author is hand-wringing about, and the current much-hyped apparent productivity will pop with the bubble once corps have to start paying full price for chatbot access.
if the idea can just be obliterated by an LLM, there was never a moat to begin with
Relatively grandiose post that sweeps in all kinds of claims about the universe merely to warn that a big corporation can copy your idea easily.
This maps nicely to Cybermen in Dr Who
I would rename "the dark forest" to "the interesting horizon"
> meat doesn’t scale
great oneliner
Big up for the reference
> I wrote this knowing it feeds the thing I’m warning you about. That’s not a contradiction. That’s the condition.
HN needs a better AI slop filter.
Or maybe I do. Maybe I can vibe code a browser extension that preloads TFA links and auto-hides anything that isn't sufficiently human-authored.
The view here shows vast powers of technocapital consuming all else, stealing every idea.
My hope is the opposite. Integrative, resonant computing (https://resonantcomputing.org/ https://news.ycombinator.com/item?id=46659456 although I have some qualms with its focus on privacy), with open social protocols baked in, seems like it could eat away at some of the vicious, consumptive technocapital, in a way that capital's orientation prevents it from effectively competing with. MCP is already blowing up the old rules, tearing down strong gates, making systems more fluid / interface-y / intertwingular again, after a long interregnum of everything closing its APIs / borders.
People seem so tired and exhausted, so aware of how predatory the technosystems around us are. But it's still unclear whether people will move, shift, much less fund and support the better world. The AT proto Atmosphereconf is happening right now, and there's been a long mantra of "we can just build things"; it's finding adoption, but also doing what conference organizer Boris said yesterday: "maybe we can just pay for things," support the projects doing amazing work. That's a huge unknown that is essential to actually steering us out of the dark technology, where none of us get to see, or get any say in, how the software-eaten world around us runs, where mankind, for the first time in tens or hundreds of thousands of years, has been cut off from the world's OS, removed from the gods' enlightenment, from our Homo erectus mankind-the-toolmaker natural-scientist role.
I think the answer to the Dark Forest fear is building together. To be a radiant civilization, together. To energize ourselves and lead ourselves towards better systems, where we all can do things, make things, grow things, in integrative, socially empowering ways.
Liu Cixin's Dark Forest theory is a pretty dumb take, honestly. Just look at Earth — different species don't constantly try to wipe each other out. Sure, it happens sometimes, but it's actually relatively rare, and a lot of the time extinction isn't even intentional. Like, a huge chunk of Native American deaths came from disease, not deliberate extermination.
At the end of the day, Liu Cixin is basically a social darwinist who's got a thing for authoritarianism, and it bleeds through pretty heavily into his work. Dude is massively overrated imo.
While I agree with the sentiment, and even had the same fears, I think about it differently now…
The existing megacorps have huge swaths of infrastructure, expenses, and requirements that demand massive amounts of capex to maintain. Even if performatively, Meta, Google, OpenAI, Anthropic, et al. cannot simply lay off their entire engineering, accounting, HR, sales, and support infrastructure. Those orgs are large for "good" (historically necessary) reasons.
Now fast-forward to today, and this is where I differ in opinion: it is our megacorps that are the civilizations who should be scared of being discovered. Minus infrastructure providers, they are the large, advanced entities that can be annihilated by someone with a decent budget and a good local model.
For ~$30k-$50k (primarily buying RTX 6000 Pro GPUs and a CPU with enough PCIe lanes), "anyone" can build a system using open-weight models that, and let me truly emphasize this, can autonomously create functionality to compete. Previously it would take me months, or years, of immense dedication to show up after work and produce something of value. Now I can do it using excess compute on my existing workstation. No existing corporation can afford to undercut every possible idea. If I only gain 1,000, 10,000, or 100,000 users, they cannot compete. That may, and I believe it will, provide more than enough capital to attack megacorp X or Y. If I'm making $100k a month, I can afford multiple autonomous systems per month. After that initial capex, I can then hire other people to help manage them. At no point will a company with billions upon billions of dollars in quarterly capex be able to compete.
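The economics sketched above can be put to rough numbers. A back-of-envelope payback calculation, where every figure (the capex range, the $100k/month revenue, and the assumed operating overhead) is the commenter's hypothetical, not measured data:

```python
# Back-of-envelope payback math for the scenario above.
# ALL figures are the commenter's hypothetical assumptions, not real data.

capex_low, capex_high = 30_000, 50_000   # workstation build (GPUs + CPU/PCIe)
monthly_revenue = 100_000                # hypothetical revenue once launched
monthly_opex = 5_000                     # assumed power/hosting/misc overhead

net_monthly = monthly_revenue - monthly_opex  # cash available to recoup capex


def payback_months(capex: float, net: float) -> float:
    """Months needed to recoup the initial hardware spend."""
    return capex / net


print(f"Payback (low capex):  {payback_months(capex_low, net_monthly):.1f} months")
print(f"Payback (high capex): {payback_months(capex_high, net_monthly):.1f} months")
```

Under these (very optimistic) assumptions, the hardware pays for itself in well under a month; the argument's real load-bearing claim is the $100k/month revenue figure, not the capex.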
Maybe they can compete with one, two, ten, or a hundred, but they cannot compete with an absolute onslaught across thousands of possible frontlines. They can cut costs by reducing their workforce, but they'll only be increasing their competition to save their earnings report.
And yes, I realize that the open-weight models are created via obscene amounts of capital, but we're lucky that competing nation-states and cultures, like China, have immense incentive to do so. Good enough is still good enough.
The forest may be dark, but it won’t be for much longer.
tl;dr: call an ambulance, but not for me. It's going to be for the existing power structure.
Yep AI has made it X times easier to successfully make millions copying someone's idea.
X=1.0000001
This is one of those naive takes from a human who thinks he is even 1/1000th the intelligence of ASI which is just on the horizon.
I find the selective framing here very telling.
When there's higher violence and lower property values in a Black neighborhood, people like OP are quick to blame Black culture. But when the "Cognitive Dark Forest" emerges from a community that shares its own common characteristics, suddenly collective accountability no longer applies.
When discussing violence in the Black community, it's "cultural." But when the subject turns to financial crimes or exploitation — where the per-capita ratios tell their own story — proportionality and population-to-crime-rate analysis mysteriously stop mattering.
It's difficult to take the "Cognitive Dark Forest" seriously as an existential concern when the people raising the alarm are so selectively offended. The crisis only becomes real when their innovations, their livelihoods, and their moats are threatened. Everyone else was supposed to just adapt.
The "Cognitive Dark Forest" is, and will continue to be, perpetuated by "them," and if you really cared about the issue you would have addressed them.
This essay is not written by an LLM. An LLM might not be creative but it would be able to make a coherent thesis.